Cramer's rule
==Proof==
The proof for Cramer's rule uses the following [[Determinant#Properties_of_the_determinant|properties of determinants]]: linearity with respect to any given column, and the fact that the determinant is zero whenever two columns are equal, which is implied by the property that the sign of the determinant flips if you switch two columns.

Fix the index {{math|''j''}} of a column, and consider that the entries of the other columns have fixed values. This makes the determinant a function of the entries of the {{mvar|j}}th column. Linearity with respect to this column means that this function has the form
:<math>D_j(a_{1,j}, \ldots, a_{n,j})= C_{1,j}a_{1,j}+\cdots+C_{n,j}a_{n,j},</math>
where the <math>C_{i,j}</math> are coefficients that depend on the entries of {{mvar|A}} that are not in column {{mvar|j}}. So, one has
:<math>\det(A)=D_j(a_{1,j}, \ldots, a_{n,j})=C_{1,j}a_{1,j}+\cdots+C_{n,j}a_{n,j}</math>
([[Laplace expansion]] provides a formula for computing the <math>C_{i,j}</math>, but their expression is not important here.)

If the function <math>D_j</math> is applied to any ''other'' column {{math|''k''}} of {{mvar|A}}, then the result is the determinant of the matrix obtained from {{mvar|A}} by replacing column {{math|''j''}} with a copy of column {{math|''k''}}, so the resulting determinant is 0 (the case of two equal columns).

Now consider a system of {{mvar|n}} linear equations in {{mvar|n}} unknowns <math>x_1, \ldots,x_n</math>, whose coefficient matrix is {{mvar|A}}, with det(''A'') assumed to be nonzero:
:<math>\begin{matrix}
a_{11}x_1+a_{12}x_2+\cdots+a_{1n}x_n&=&b_1\\
a_{21}x_1+a_{22}x_2+\cdots+a_{2n}x_n&=&b_2\\
&\vdots&\\
a_{n1}x_1+a_{n2}x_2+\cdots+a_{nn}x_n&=&b_n.
\end{matrix}</math>

If one combines these equations by taking {{math|''C''<sub>1,''j''</sub>}} times the first equation, plus {{math|''C''<sub>2,''j''</sub>}} times the second, and so forth until {{math|''C''<sub>''n'',''j''</sub>}} times the last, then for every {{mvar|k}} the resulting coefficient of {{mvar|x<sub>k</sub>}} becomes
:<math>D_j(a_{1,k},\ldots,a_{n,k}).</math>
So, all coefficients become zero, except the coefficient of <math>x_j</math>, which becomes <math>\det(A).</math> Similarly, the constant coefficient becomes <math>D_j(b_1,\ldots,b_n),</math> and the resulting equation is thus
:<math>\det(A)x_j=D_j(b_1,\ldots, b_n),</math>
which gives the value of <math>x_j</math> as
:<math>x_j=\frac1{\det(A)}D_j(b_1,\ldots, b_n).</math>
As, by construction, the numerator is the determinant of the matrix obtained from {{mvar|A}} by replacing column {{math|''j''}} with {{math|'''b'''}}, we get the expression of Cramer's rule as a necessary condition for a solution.

It remains to prove that these values for the unknowns do indeed form a solution. Let {{mvar|M}} be the {{math|''n'' × ''n''}} matrix that has the coefficients of <math>D_j</math> as its {{mvar|j}}th row, for <math>j=1,\ldots,n</math> (this is the [[adjugate matrix]] of {{mvar|A}}). Expressed in matrix terms, we thus have to prove that
:<math>\mathbf x = \frac1{\det(A)}M\mathbf b</math>
is a solution; that is, that
:<math>A\left(\frac1{\det(A)}M\right)\mathbf b=\mathbf b.</math>
For that, it suffices to prove that
:<math>A\,\left(\frac1{\det(A)}M\right)=I_n,</math>
where <math>I_n</math> is the [[identity matrix]]. The above properties of the functions <math>D_j</math> show that one has {{math|''MA'' {{=}} det(''A'')''I<sub>n</sub>''}}, and therefore
:<math>\left(\frac1{\det(A)}M\right)\,A=I_n.</math>
This completes the proof, since a [[inverse element|left inverse]] of a square matrix is also a right inverse (see [[Invertible matrix theorem]]). For other proofs, see [[#Other proofs|below]].
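The two halves of the argument can be checked numerically on a small example. The following Python sketch (not part of the article; the names <code>det</code>, <code>cofactor</code>, and the test matrix are illustrative choices) builds the coefficients <math>C_{i,j}</math> by Laplace expansion, verifies the key identity {{math|''MA'' {{=}} det(''A'')''I<sub>n</sub>''}} for the adjugate {{mvar|M}}, and solves a system by Cramer's rule using exact rational arithmetic:

```python
from fractions import Fraction

def det(A):
    """Determinant by Laplace expansion along the first column."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** i * row[0] * det([r[1:] for k, r in enumerate(A) if k != i])
               for i, row in enumerate(A))

def cofactor(A, i, j):
    """C_{i,j}: signed determinant of A with row i and column j removed."""
    minor = [r[:j] + r[j + 1:] for k, r in enumerate(A) if k != i]
    return (-1) ** (i + j) * det(minor)

A = [[2, 1, 1],
     [1, 3, 2],
     [1, 0, 0]]
b = [4, 5, 6]
n = len(A)
d = det(A)

# M has the coefficients C_{1,j}, ..., C_{n,j} of D_j as its jth row
# (the adjugate of A); the proof hinges on M A = det(A) I_n.
M = [[cofactor(A, i, j) for i in range(n)] for j in range(n)]
for j in range(n):
    for i in range(n):
        assert sum(M[j][k] * A[k][i] for k in range(n)) == (d if i == j else 0)

# Cramer's rule: x_j = det(A with column j replaced by b) / det(A)
x = [Fraction(det([[b[i] if k == j else A[i][k] for k in range(n)]
                   for i in range(n)]), d)
     for j in range(n)]

# The computed x really solves A x = b.
assert all(sum(A[i][k] * x[k] for k in range(n)) == b[i] for i in range(n))
```

Using <code>Fraction</code> keeps the check exact; with floating point, the equalities above would only hold up to rounding error. The recursive expansion is O(''n''!), so this is a verification aid, not a practical solver.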