=== A proof using polynomials with matrix coefficients ===

This proof is similar to the first one, but tries to give meaning to the notion of polynomial with matrix coefficients that was suggested by the expressions occurring in that proof. This requires considerable care, since it is somewhat unusual to consider polynomials with coefficients in a non-commutative ring, and not all reasoning that is valid for commutative polynomials can be applied in this setting. Notably, while arithmetic of polynomials over a commutative ring models the arithmetic of [[polynomial function]]s, this is not the case over a non-commutative ring; in fact there is no obvious notion of polynomial function in this case that is closed under multiplication. Addition, by contrast, still behaves as expected, since <math display="block">(f+g)(x) = \sum_i \left (f_i+g_i \right )x^i = \sum_i{f_i x^i} + \sum_i{g_i x^i} = f(x) + g(x),</math> but the analogous identity for products fails. So when considering polynomials in {{mvar|t}} with matrix coefficients, the variable {{mvar|t}} must not be thought of as an "unknown", but as a formal symbol that is to be manipulated according to given rules; in particular one cannot just set {{mvar|t}} to a specific value.

Let <math>M(n,R)</math> be the ring of {{math|''n'' × ''n''}} matrices with entries in some ring ''R'' (such as the real or complex numbers) that has {{mvar|A}} as an element. Matrices whose coefficients are polynomials in {{mvar|t}}, such as <math>t I_n - A</math> or its adjugate ''B'' in the first proof, are elements of <math>M(n,R[t])</math>. By collecting like powers of {{mvar|t}}, such matrices can be written as "polynomials" in {{mvar|t}} with constant matrices as coefficients; write <math>M(n,R)[t]</math> for the set of such polynomials.
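As an illustration (not part of the article's proof), the ring <math>M(n,R)[t]</math> can be modelled concretely as lists of coefficient matrices. A minimal numpy sketch, with the product rule keeping the coefficient order <math>M_i N_j</math>:

```python
import numpy as np

def poly_mul(f, g):
    """Multiply two polynomials with matrix coefficients.

    f and g are lists of n-by-n numpy arrays; f[i] is the coefficient of t**i.
    The product keeps the coefficient order f[i] @ g[j], matching the rule
    (sum_i M_i t^i)(sum_j N_j t^j) = sum_{i,j} (M_i N_j) t^{i+j}.
    """
    n = f[0].shape[0]
    out = [np.zeros((n, n)) for _ in range(len(f) + len(g) - 1)]
    for i, Mi in enumerate(f):
        for j, Nj in enumerate(g):
            out[i + j] = out[i + j] + Mi @ Nj
    return out

# Two constant "polynomials" with non-commuting matrix coefficients:
M = np.array([[0., 1.], [0., 0.]])
N = np.array([[0., 0.], [1., 0.]])
fg = poly_mul([M], [N])
gf = poly_mul([N], [M])
print(np.array_equal(fg[0], gf[0]))  # False: the multiplication is non-commutative
```

Already for constant polynomials the two products differ, confirming that <math>M(n,R)[t]</math> is a non-commutative ring.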
Since this set is in [[bijection]] with <math>M(n,R[t])</math>, one defines arithmetic operations on it correspondingly; in particular, multiplication is given by <math display="block">\left( \sum_i M_i t^i \right) \!\!\left( \sum_j N_j t^j \right) = \sum_{i,j} (M_i N_j) t^{i+j},</math> respecting the order of the coefficient matrices from the two operands; this multiplication is evidently non-commutative. Thus, the identity <math display="block">(t I_n - A)B = p(t) I_n</math> from the first proof can be viewed as one involving a multiplication of elements in <math>M(n,R)[t]</math>. At this point, it is tempting to simply set {{mvar|t}} equal to the matrix {{mvar|A}}, which makes the first factor on the left equal to the zero matrix, and the right-hand side equal to {{math|''p''(''A'')}}; however, this is not an allowed operation when the coefficients do not commute. It is possible to define a "right-evaluation map" {{math|ev<sub>''A''</sub> : '''M'''[''t'' ] → '''M'''}}, which replaces each {{math|''t''<sup> ''i''</sup>}} by the matrix power {{math|''A''<sup>''i''</sup>}} of {{mvar|A}}, where one stipulates that the power is always to be multiplied on the right of the corresponding coefficient. But this map is not a [[ring homomorphism]]: the right-evaluation of a product differs in general from the product of the right-evaluations. This is because multiplication of polynomials with matrix coefficients does not model multiplication of expressions containing unknowns: a product <math>Mt^i Nt^j = (M\cdot N) t^{i+j}</math> is defined assuming that {{mvar|t}} commutes with {{mvar|N}}, but this may fail if {{mvar|t}} is replaced by the matrix {{mvar|A}}.
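The failure of multiplicativity can be exhibited concretely. The following sketch (an illustration, not part of the proof) implements the right-evaluation map and checks it on the product of {{math|''f'' {{=}} ''I t''}} and a constant polynomial {{mvar|g}} whose coefficient does not commute with {{mvar|A}}:

```python
import numpy as np

def ev_right(f, A):
    """Right-evaluation: replace t^i by A^i, multiplied on the right of f[i]."""
    n = A.shape[0]
    out = np.zeros((n, n))
    P = np.eye(n)            # current power A^i, starting with A^0
    for Mi in f:
        out = out + Mi @ P
        P = P @ A
    return out

A = np.array([[0., 1.], [0., 0.]])
N = np.array([[0., 0.], [1., 0.]])   # N does not commute with A

f = [np.zeros((2, 2)), np.eye(2)]    # f = I t
g = [N]                              # g = N (constant)
prod = [np.zeros((2, 2)), N]         # f g = (I N) t = N t in M(n,R)[t]

lhs = ev_right(prod, A)                  # ev_A(fg) = N A
rhs = ev_right(f, A) @ ev_right(g, A)    # ev_A(f) ev_A(g) = A N
print(np.array_equal(lhs, rhs))  # False: ev_A is not multiplicative in general
```

Here <math>NA \neq AN</math>, so the right-evaluation of the product differs from the product of the right-evaluations, exactly as the text describes.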
One can work around this difficulty in the particular situation at hand, since the above right-evaluation map does become a ring homomorphism if the matrix {{mvar|A}} is in the [[center (ring theory)|center]] of the ring of coefficients, so that it commutes with all the coefficients of the polynomials (the argument proving this is straightforward, exactly because commuting {{mvar|t}} with coefficients is now justified after evaluation). Now, {{mvar|A}} is not always in the center of {{math|'''M'''}}, but we may replace {{math|'''M'''}} with a smaller ring provided it contains all the coefficients of the polynomials in question: <math>I_n</math>, {{mvar|A}}, and the coefficients <math>B_i</math> of the polynomial {{math|''B''}}. The obvious choice for such a [[subring]] is the [[centralizer]] {{math|''Z''}} of {{mvar|A}}, the subring of all matrices that commute with {{mvar|A}}; by definition {{mvar|A}} is in the center of {{math|''Z''}}. This centralizer obviously contains <math>I_n</math> and {{mvar|A}}, but one must show that it contains the matrices <math>B_i</math>. To do this, one combines the two fundamental relations for adjugates, writing out the adjugate {{math|''B''}} as a polynomial: <math display="block">\begin{align} \left(\sum_{i = 0}^m B_i t^i\right)\!(t I_n - A) &= (tI_n - A) \sum_{i = 0}^m B_i t^i \\ \sum_{i = 0}^m B_i t^{i + 1} - \sum_{i = 0}^m B_i A t^i &= \sum_{i = 0}^m B_i t^{i + 1} - \sum_{i = 0}^m A B_i t^i \\ \sum_{i = 0}^m B_i A t^i &= \sum_{i = 0}^m A B_i t^i . \end{align}</math> [[Equating the coefficients]] shows that for each {{math|''i''}}, we have {{math|1=''AB''<sub>''i''</sub> = ''B''<sub>''i'' </sub>''A''}} as desired.
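For a concrete check (an illustration specialized to the {{math|2 × 2}} case, not part of the proof), one can use the identity {{math|1=adj(''M'') = tr(''M'')''I'' − ''M''}} valid for {{math|2 × 2}} matrices to write out the coefficients <math>B_i</math> of <math>B = \operatorname{adj}(tI_n - A)</math> explicitly and verify that they commute with {{mvar|A}}:

```python
import numpy as np

A = np.array([[2., 1.], [3., 4.]])
I = np.eye(2)

# For a 2-by-2 matrix M, adj(M) = tr(M) I - M, hence
# B = adj(t I - A) = (2t - tr(A)) I - (t I - A) = I t + (A - tr(A) I),
# so B_1 = I and B_0 = A - tr(A) I.
B1 = I
B0 = A - np.trace(A) * I

# Each coefficient lies in the centralizer Z of A:
print(np.allclose(A @ B0, B0 @ A))  # True
print(np.allclose(A @ B1, B1 @ A))  # True
```

Here commutativity is immediate, since <math>B_0</math> and <math>B_1</math> are themselves polynomials in {{mvar|A}}; the derivation in the text establishes the same fact for arbitrary {{mvar|n}}.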
Having found the proper setting in which {{math|ev<sub>''A''</sub>}} is indeed a [[homomorphism]] of rings, one can complete the proof as suggested above: <math display="block">\begin{align} \operatorname{ev}_A\left(p(t)I_n\right) &= \operatorname{ev}_A((tI_n-A)B) \\[5pt] p(A) &= \operatorname{ev}_A(tI_n - A)\cdot \operatorname{ev}_A(B) \\[5pt] p(A) &= (AI_n-A) \cdot \operatorname{ev}_A(B) = O\cdot\operatorname{ev}_A(B)=O. \end{align}</math> This completes the proof.
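The conclusion {{math|1=''p''(''A'') = 0}} can also be verified numerically for a concrete matrix. A short sketch (an illustration only; the example matrix is arbitrary) that computes the characteristic polynomial and evaluates it at {{mvar|A}} by Horner's rule:

```python
import numpy as np

A = np.array([[2., 1.], [3., 4.]])

# Coefficients of the characteristic polynomial, highest power first;
# for this A, p(t) = t^2 - 6t + 5.
coeffs = np.poly(A)

# Evaluate p at the matrix A by Horner's rule.
n = A.shape[0]
pA = np.zeros((n, n))
for c in coeffs:
    pA = pA @ A + c * np.eye(n)

print(np.allclose(pA, np.zeros((n, n))))  # True: p(A) = 0
```

Up to floating-point rounding, {{math|''p''(''A'')}} is the zero matrix, as the Cayley–Hamilton theorem asserts.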