== Proofs that column rank = row rank ==

===Proof using row reduction===
The fact that the column and row ranks of any matrix are equal is fundamental in linear algebra. Many proofs have been given. One of the most elementary ones has been sketched in {{slink||Rank from row echelon forms}}. Here is a variant of this proof:

It is straightforward to show that neither the row rank nor the column rank is changed by an [[elementary row operation]]. As [[Gaussian elimination]] proceeds by elementary row operations, the [[reduced row echelon form]] of a matrix has the same row rank and the same column rank as the original matrix. Further elementary column operations allow putting the matrix in the form of an [[identity matrix]] possibly bordered by rows and columns of zeros. Again, this changes neither the row rank nor the column rank. It is immediate that both the row and column ranks of this resulting matrix equal the number of its nonzero entries.

We present two other proofs of this result. The first uses only basic properties of [[linear combination]]s of vectors, and is valid over any [[field (mathematics)|field]]. The proof is based upon Wardlaw (2005).<ref name="wardlaw">{{Citation| last=Wardlaw| first=William P.| title=Row Rank Equals Column Rank| year=2005| journal=[[Mathematics Magazine]]| volume=78| issue=4| pages=316–318| doi=10.1080/0025570X.2005.11953349| s2cid=218542661}}</ref> The second uses [[orthogonality]] and is valid for matrices over the [[real numbers]]; it is based upon Mackiw (1995).<ref name="mackiw" /> Both proofs can be found in the book by Banerjee and Roy (2014).<ref name="banerjee-roy">{{Citation | last1 = Banerjee | first1 = Sudipto | last2 = Roy | first2 = Anindya | date = 2014 | title = Linear Algebra and Matrix Analysis for Statistics | series = Texts in Statistical Science | publisher = Chapman and Hall/CRC | edition = 1st | isbn = 978-1420095388}}</ref>

===Proof using linear combinations===
Let {{mvar|A}} be an {{math|''m'' × ''n''}} matrix. Let the column rank of {{mvar|A}} be {{mvar|r}}, and let {{math|'''c'''<sub>1</sub>, ..., '''c'''<sub>''r''</sub>}} be any basis for the column space of {{mvar|A}}. Place these as the columns of an {{math|''m'' × ''r''}} matrix {{mvar|C}}. Every column of {{mvar|A}} can be expressed as a linear combination of the {{mvar|r}} columns in {{mvar|C}}. This means that there is an {{math|''r'' × ''n''}} matrix {{mvar|R}} such that {{math|1=''A'' = ''CR''}}. Here {{mvar|R}} is the matrix whose {{mvar|i}}th column is formed from the coefficients giving the {{mvar|i}}th column of {{mvar|A}} as a linear combination of the {{mvar|r}} columns of {{mvar|C}}. In other words, {{mvar|R}} records the coefficients that express each column of {{mvar|A}} in terms of the basis columns collected in {{mvar|C}}.

Now, each row of {{mvar|A}} is given by a linear combination of the {{mvar|r}} rows of {{mvar|R}}: by {{math|1=''A'' = ''CR''}}, the {{mvar|i}}th row of {{mvar|A}} is the combination of the rows of {{mvar|R}} whose coefficients form the {{mvar|i}}th row of {{mvar|C}}. Therefore, the rows of {{mvar|R}} form a spanning set of the row space of {{mvar|A}} and, by the [[Steinitz exchange lemma]], the row rank of {{mvar|A}} cannot exceed {{mvar|r}}. This proves that the row rank of {{mvar|A}} is less than or equal to the column rank of {{mvar|A}}. This result can be applied to any matrix, so apply the result to the transpose of {{mvar|A}}. Since the row rank of the transpose of {{mvar|A}} is the column rank of {{mvar|A}} and the column rank of the transpose of {{mvar|A}} is the row rank of {{mvar|A}}, this establishes the reverse inequality and we obtain the equality of the row rank and the column rank of {{mvar|A}}. (Also see [[Rank factorization]].)
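As a concrete illustration of this factorization (an example supplied here, not drawn from the cited sources), consider the {{math|3 × 3}} matrix below. Its third column is the sum of the first two, so the first two columns form a basis of the column space and the column rank is 2:
<math display="block">A = \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \\ 1 & 1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix} = CR.</math>
Each row of {{mvar|A}} is then visibly a linear combination of the two rows of {{mvar|R}}; for instance, the third row of {{mvar|A}} is their sum, with coefficients read off the third row of {{mvar|C}}. Hence the row rank of {{mvar|A}} is at most 2.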
===Proof using orthogonality===
Let {{mvar|A}} be an {{math|''m'' × ''n''}} matrix with entries in the [[real number]]s whose row rank is {{mvar|r}}. Therefore, the dimension of the row space of {{mvar|A}} is {{mvar|r}}. Let {{math|'''x'''<sub>1</sub>, '''x'''<sub>2</sub>, …, '''x'''<sub>''r''</sub>}} be a [[basis (linear algebra)|basis]] of the row space of {{mvar|A}}. We claim that the vectors {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} are [[linearly independent]]. To see why, consider a linear homogeneous relation involving these vectors with scalar coefficients {{math|''c''<sub>1</sub>, ''c''<sub>2</sub>, …, ''c<sub>r</sub>''}}:
<math display="block">0 = c_1 A\mathbf{x}_1 + c_2 A\mathbf{x}_2 + \cdots + c_r A\mathbf{x}_r = A(c_1 \mathbf{x}_1 + c_2 \mathbf{x}_2 + \cdots + c_r \mathbf{x}_r) = A\mathbf{v},</math>
where {{math|1='''v''' = ''c''<sub>1</sub>'''x'''<sub>1</sub> + ''c''<sub>2</sub>'''x'''<sub>2</sub> + ⋯ + ''c<sub>r</sub>'''''x'''<sub>''r''</sub>}}. We make two observations: (a) {{math|'''v'''}} is a linear combination of vectors in the row space of {{mvar|A}}, which implies that {{math|'''v'''}} belongs to the row space of {{mvar|A}}, and (b) since {{math|1=''A'''''v''' = 0}}, the vector {{math|'''v'''}} is [[orthogonal]] to every row vector of {{mvar|A}} and, hence, is orthogonal to every vector in the row space of {{mvar|A}}. The facts (a) and (b) together imply that {{math|'''v'''}} is orthogonal to itself, which proves that {{math|1='''v''' = 0}} or, by the definition of {{math|'''v'''}},
<math display="block">c_1\mathbf{x}_1 + c_2\mathbf{x}_2 + \cdots + c_r \mathbf{x}_r = 0.</math>
But recall that the {{math|'''x'''<sub>''i''</sub>}} were chosen as a basis of the row space of {{mvar|A}} and so are linearly independent. This implies that {{math|1=''c''<sub>1</sub> = ''c''<sub>2</sub> = ⋯ = ''c<sub>r</sub>'' = 0}}. It follows that {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} are linearly independent.

Now, each {{math|''A'''''x'''<sub>''i''</sub>}} is obviously a vector in the column space of {{mvar|A}}. So, {{math|''A'''''x'''<sub>1</sub>, ''A'''''x'''<sub>2</sub>, …, ''A'''''x'''<sub>''r''</sub>}} is a set of {{mvar|r}} linearly independent vectors in the column space of {{mvar|A}} and, hence, the dimension of the column space of {{mvar|A}} (i.e., the column rank of {{mvar|A}}) must be at least {{mvar|r}}. This proves that the row rank of {{mvar|A}} is no larger than the column rank of {{mvar|A}}. Now apply this result to the transpose of {{mvar|A}} to get the reverse inequality and conclude as in the previous proof.
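This argument can be illustrated with the matrix {{mvar|A}} from the example in the previous proof. There the rows {{math|1='''x'''<sub>1</sub> = (1, 0, 1)}} and {{math|1='''x'''<sub>2</sub> = (0, 1, 1)}} form a basis of the row space of {{mvar|A}} (the third row is their sum), and indeed
<math display="block">A\mathbf{x}_1 = \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix}, \qquad A\mathbf{x}_2 = \begin{pmatrix} 1 \\ 2 \\ 3 \end{pmatrix}</math>
are linearly independent vectors in the column space of {{mvar|A}}, so the column rank of {{mvar|A}} is at least 2, matching its row rank.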