==General properties==
Matrix multiplication shares some properties with usual [[multiplication]]. However, matrix multiplication is not defined if the number of columns of the first factor differs from the number of rows of the second factor, and it is [[non-commutative]],<ref name=":2">{{Cite web|last=Weisstein|first=Eric W.|title=Matrix Multiplication|url=https://mathworld.wolfram.com/MatrixMultiplication.html|access-date=2020-09-06|website=mathworld.wolfram.com|language=en}}</ref> even when the product remains defined after changing the order of the factors.<ref>{{cite book|title=Linear Algebra|edition=4th|first1=S.|last1=Lipschutz|first2=M.|last2=Lipson|series=Schaum's Outlines|publisher=McGraw Hill (USA)|chapter=2|date=2009|isbn=978-0-07-154352-1}}</ref><ref>{{cite book|title=Matrix Analysis|last1=Horn|first1=Roger A.|last2=Johnson|first2=Charles R.|edition=2nd|chapter=Chapter 0|publisher=Cambridge University Press|date=2013|isbn=978-0-521-54823-6}}</ref>

===Non-commutativity===
An operation is [[commutative property|commutative]] if, given two elements {{math|'''A'''}} and {{math|'''B'''}} such that the product <math>\mathbf{A}\mathbf{B}</math> is defined, <math>\mathbf{B}\mathbf{A}</math> is also defined, and <math>\mathbf{A}\mathbf{B}=\mathbf{B}\mathbf{A}.</math> If {{math|'''A'''}} and {{math|'''B'''}} are matrices of respective sizes {{tmath|m\times n}} and {{tmath|p\times q}}, then <math>\mathbf{A}\mathbf{B}</math> is defined if {{tmath|1=n=p}}, and <math>\mathbf{B}\mathbf{A}</math> is defined if {{tmath|1=m=q}}. Therefore, one of the products may be defined while the other is not. If {{tmath|1=m=q\neq n=p}}, both products are defined, but they have different sizes and thus cannot be equal. Only if {{tmath|1=m=q=n=p}}, that is, if {{math|'''A'''}} and {{math|'''B'''}} are [[square matrices]] of the same size, are both products defined and of the same size.
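These shape conditions can be spot-checked numerically. The following sketch (assuming Python with NumPy, which the article itself does not use) shows a case where {{math|'''AB'''}} is defined but {{math|'''BA'''}} is not:

```python
import numpy as np

# A is m x n = 2 x 3 and B is p x q = 3 x 4: AB is defined since n = p,
# but BA is not, since q = 4 differs from m = 2.
A = np.arange(6).reshape(2, 3)
B = np.arange(12).reshape(3, 4)

print((A @ B).shape)  # (2, 4)

try:
    B @ A  # dimension mismatch: NumPy raises ValueError
except ValueError:
    print("BA is not defined")
```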
Even in this case, one has in general
:<math>\mathbf{A}\mathbf{B} \neq \mathbf{B}\mathbf{A}.</math>
For example
:<math>\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}=\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix},</math>
but
:<math>\begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}.</math>
This example can be expanded to show that, if {{math|'''A'''}} is an {{tmath|n\times n}} matrix with entries in a [[field (mathematics)|field]] {{mvar|F}}, then <math>\mathbf{A}\mathbf{B} = \mathbf{B}\mathbf{A}</math> for every {{tmath|n\times n}} matrix {{math|'''B'''}} with entries in {{mvar|F}}, [[if and only if]] <math>\mathbf{A}=c\,\mathbf{I}</math>, where {{tmath|c\in F}} and {{math|'''I'''}} is the {{tmath|n\times n}} [[identity matrix]]. If the entries belong to a [[ring (mathematics)|ring]] instead of a field, one must add the condition that {{mvar|c}} belongs to the [[center (ring theory)|center]] of the ring.

One special case where commutativity does occur is when {{math|'''D'''}} and {{math|'''E'''}} are two (square) [[diagonal matrices]] of the same size; then {{math|1='''DE''' = '''ED'''}}.<ref name=":2" /> Again, if the matrices are over a general ring rather than a field, the corresponding entries must also commute with each other for this to hold.

===Distributivity===
The matrix product is [[distributive property|distributive]] with respect to [[matrix addition]].
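The concrete non-commutativity example above, and the commuting diagonal case, can be verified directly; a minimal sketch assuming Python with NumPy (not part of the article):

```python
import numpy as np

# The 2x2 example from the text: both products are defined but differ.
A = np.array([[0, 1],
              [0, 0]])
B = np.array([[0, 0],
              [1, 0]])

assert (A @ B == [[1, 0], [0, 0]]).all()
assert (B @ A == [[0, 0], [0, 1]]).all()

# Diagonal matrices of the same size do commute.
D = np.diag([1, 2])
E = np.diag([3, 4])
assert (D @ E == E @ D).all()
print("AB != BA, but DE == ED")
```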
That is, if {{math|'''A''', '''B''', '''C''', '''D'''}} are matrices of respective sizes {{math|''m'' × ''n''}}, {{math|''n'' × ''p''}}, {{math|''n'' × ''p''}}, and {{math|''p'' × ''q''}}, one has (left distributivity)
:<math>\mathbf{A}(\mathbf{B} + \mathbf{C}) = \mathbf{AB} + \mathbf{AC},</math>
and (right distributivity)
:<math>(\mathbf{B} + \mathbf{C} )\mathbf{D} = \mathbf{BD} + \mathbf{CD}.</math><ref name=":2" />
This results from the distributivity of the product of coefficients over their addition:
:<math>\sum_k a_{ik}(b_{kj} + c_{kj}) = \sum_k a_{ik}b_{kj} + \sum_k a_{ik}c_{kj},</math>
:<math>\sum_k (b_{ik} + c_{ik}) d_{kj} = \sum_k b_{ik}d_{kj} + \sum_k c_{ik}d_{kj}.</math>

===Product with a scalar===
If {{math|'''A'''}} is a matrix and {{mvar|c}} a scalar, then the matrices <math>c\mathbf{A}</math> and <math>\mathbf{A}c</math> are obtained by multiplying every entry of {{math|'''A'''}} by {{mvar|c}} on the left or on the right, respectively. If the scalars have the [[commutative property]], then <math>c\mathbf{A} = \mathbf{A}c.</math>

If the product <math>\mathbf{AB}</math> is defined (that is, the number of columns of {{math|'''A'''}} equals the number of rows of {{math|'''B'''}}), then
:<math>c(\mathbf{AB}) = (c \mathbf{A})\mathbf{B}</math> and <math>(\mathbf{A} \mathbf{B})c=\mathbf{A}(\mathbf{B}c).</math>
If the scalars have the commutative property, then all four matrices are equal. More generally, all four are equal if {{math|''c''}} belongs to the [[Center (ring theory)|center]] of a [[ring (mathematics)|ring]] containing the entries of the matrices, because in this case, {{math|''c'''''X''' {{=}} '''X'''''c''}} for all matrices {{math|'''X'''}}.

These properties result from the [[bilinearity]] of the product of scalars:
:<math>c \left(\sum_k a_{ik}b_{kj}\right) = \sum_k (c a_{ik} ) b_{kj},</math>
:<math>\left(\sum_k a_{ik}b_{kj}\right) c = \sum_k a_{ik} ( b_{kj}c).
</math>

===Transpose===
If the scalars have the [[commutative property]], the [[transpose]] of a product of matrices is the product, in the reverse order, of the transposes of the factors. That is,
:<math>(\mathbf{AB})^\mathsf{T} = \mathbf{B}^\mathsf{T}\mathbf{A}^\mathsf{T},</math>
where <sup>T</sup> denotes the transpose, that is, the interchange of rows and columns. This identity does not hold for noncommutative entries, since the order between the entries of {{math|'''A'''}} and {{math|'''B'''}} is reversed when one expands the definition of the matrix product.

===Complex conjugate===
If {{math|'''A'''}} and {{math|'''B'''}} have [[complex number|complex]] entries, then
:<math>(\mathbf{AB})^* = \mathbf{A}^*\mathbf{B}^*,</math>
where {{math|<sup>*</sup>}} denotes the entry-wise [[complex conjugate]] of a matrix. This results from applying to the definition of matrix product the fact that the conjugate of a sum is the sum of the conjugates of the summands, and the conjugate of a product is the product of the conjugates of the factors.

Transposition acts on the indices of the entries, while conjugation acts independently on the entries themselves. It follows that, if {{math|'''A'''}} and {{math|'''B'''}} have complex entries, one has
:<math>(\mathbf{AB})^\dagger = \mathbf{B}^\dagger\mathbf{A}^\dagger,</math>
where {{math|<sup>†</sup>}} denotes the [[conjugate transpose]] (conjugate of the transpose, or equivalently transpose of the conjugate).

===Associativity===
Given three matrices {{math|'''A''', '''B'''}} and {{math|'''C'''}}, the products {{math|('''AB''')'''C'''}} and {{math|'''A'''('''BC''')}} are defined if and only if the number of columns of {{math|'''A'''}} equals the number of rows of {{math|'''B'''}}, and the number of columns of {{math|'''B'''}} equals the number of rows of {{math|'''C'''}} (in particular, if one of the products is defined, then the other is also defined).
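The distributivity, scalar-compatibility, transpose, and conjugate-transpose identities above can all be checked numerically on random matrices; a minimal sketch assuming Python with NumPy (not part of the article), using complex integer entries so every comparison is exact:

```python
import numpy as np

rng = np.random.default_rng(0)

def cmat(shape):
    # Random matrix with small Gaussian-integer entries (exact arithmetic).
    return rng.integers(-3, 3, shape) + 1j * rng.integers(-3, 3, shape)

A, B, C, D = cmat((2, 3)), cmat((3, 4)), cmat((3, 4)), cmat((4, 2))
c = 5

# Distributivity over matrix addition.
assert (A @ (B + C) == A @ B + A @ C).all()
assert ((B + C) @ D == B @ D + C @ D).all()

# Compatibility with scalar multiplication.
assert (c * (A @ B) == (c * A) @ B).all()
assert ((A @ B) * c == A @ (B * c)).all()

# Transpose and conjugate transpose reverse the order of the factors;
# entry-wise conjugation alone does not.
assert ((A @ B).T == B.T @ A.T).all()
assert ((A @ B).conj() == A.conj() @ B.conj()).all()
assert ((A @ B).conj().T == B.conj().T @ A.conj().T).all()
print("all identities verified")
```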
In this case, one has the [[associative property]]
:<math>(\mathbf{AB})\mathbf{C}=\mathbf{A}(\mathbf{BC}).</math>
As for any associative operation, this allows omitting parentheses and writing the above products as {{tmath|\mathbf{ABC}.}}

This extends naturally to the product of any number of matrices, provided that the dimensions match. That is, if {{math|'''A'''<sub>1</sub>, '''A'''<sub>2</sub>, ..., '''A'''<sub>''n''</sub>}} are matrices such that the number of columns of {{math|'''A'''<sub>''i''</sub>}} equals the number of rows of {{math|'''A'''<sub>''i'' + 1</sub>}} for {{math|1=''i'' = 1, ..., ''n'' − 1}}, then the product
:<math>\prod_{i=1}^n \mathbf{A}_i = \mathbf{A}_1\mathbf{A}_2\cdots\mathbf{A}_n</math>
is defined and does not depend on the [[order of operations|order of the multiplications]], if the order of the matrices is kept fixed.

These properties may be proved by straightforward but complicated [[summation]] manipulations. This result also follows from the fact that matrices represent [[linear map]]s. Therefore, the associative property of matrices is simply a specific case of the associative property of [[function composition]].

====Computational complexity depends on parenthesization====
Although the result of a sequence of matrix products does not depend on the [[order of operations]] (provided that the order of the matrices is not changed), the [[computational complexity]] may depend dramatically on this order. For example, if {{math|'''A''', '''B'''}} and {{math|'''C'''}} are matrices of respective sizes {{math|10×30, 30×5, 5×60}}, computing {{math|('''AB''')'''C'''}} needs {{math|1=10×30×5 + 10×5×60 = 4,500}} multiplications, while computing {{math|'''A'''('''BC''')}} needs {{math|1=30×5×60 + 10×30×60 = 27,000}} multiplications. Algorithms have been designed for choosing the best order of products; see [[Matrix chain multiplication]].
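The multiplication counts in the example above come from the fact that the schoolbook product of an {{math|''m'' × ''n''}} matrix with an {{math|''n'' × ''p''}} matrix uses {{math|''mnp''}} scalar multiplications; a short illustrative sketch (assuming Python, not part of the article):

```python
# Schoolbook cost of multiplying an m x n matrix by an n x p matrix.
def cost(m, n, p):
    return m * n * p

# Sizes from the text: A is 10x30, B is 30x5, C is 5x60.
ab_then_c = cost(10, 30, 5) + cost(10, 5, 60)   # (AB)C
bc_then_a = cost(30, 5, 60) + cost(10, 30, 60)  # A(BC)
print(ab_then_c, bc_then_a)  # 4500 27000
```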
When the number {{mvar|n}} of matrices increases, it has been shown that the choice of the best order has a complexity of <math>O(n \log n).</math><ref>{{cite journal|last1=Hu|first1=T. C.|author1-link=T. C. Hu|last2=Shing|first2=M.-T.|title=Computation of Matrix Chain Products, Part I|journal=SIAM Journal on Computing|volume=11|issue=2|pages=362–373|year=1982|url=http://www.cs.ust.hk/mjg_lib/bibs/DPSu/DPSu.Files/0211028.pdf|issn=0097-5397|doi=10.1137/0211028|citeseerx=10.1.1.695.2923}}</ref><ref>{{cite journal|last1=Hu|first1=T. C.|author1-link=T. C. Hu|last2=Shing|first2=M.-T.|title=Computation of Matrix Chain Products, Part II|journal=SIAM Journal on Computing|volume=13|issue=2|pages=228–251|year=1984|url=http://www.cs.ust.hk/mjg_lib/bibs/DPSu/DPSu.Files/0213017.pdf|issn=0097-5397|doi=10.1137/0213017|citeseerx=10.1.1.695.4875}}</ref>

====Application to similarity====
Any [[invertible matrix]] <math>\mathbf{P}</math> defines a [[similar matrix|similarity transformation]] (on square matrices of the same size as <math>\mathbf{P}</math>)
:<math>S_\mathbf{P}(\mathbf{A}) = \mathbf{P}^{-1} \mathbf{A} \mathbf{P}.</math>
Similarity transformations map products to products; that is,
:<math>S_\mathbf{P}(\mathbf{AB}) = S_\mathbf{P}(\mathbf{A})S_\mathbf{P}(\mathbf{B}).</math>
Indeed, one has
:<math>\mathbf{P}^{-1} (\mathbf{AB}) \mathbf{P} = \mathbf{P}^{-1} \mathbf{A}(\mathbf{P}\mathbf{P}^{-1})\mathbf{B} \mathbf{P} =(\mathbf{P}^{-1} \mathbf{A}\mathbf{P})(\mathbf{P}^{-1}\mathbf{B} \mathbf{P}).</math>
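That similarity transformations preserve products can be checked numerically; a minimal sketch assuming Python with NumPy (not part of the article), using a random matrix {{math|'''P'''}} that is invertible with probability 1:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))  # assumed invertible (true almost surely)
P_inv = np.linalg.inv(P)

def similarity(X):
    """The similarity transformation S_P(X) = P^{-1} X P."""
    return P_inv @ X @ P

# S_P(AB) == S_P(A) S_P(B), up to floating-point round-off.
assert np.allclose(similarity(A @ B), similarity(A) @ similarity(B))
print("similarity preserves products")
```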