Editing Singular value decomposition (section)
== SVD and spectral decomposition ==

=== Singular values, singular vectors, and their relation to the SVD ===

A non-negative real number {{tmath|\sigma}} is a '''[[singular value]]''' for {{tmath|\mathbf M}} if and only if there exist unit-length vectors {{tmath|\mathbf u}} in {{tmath|K^m}} and {{tmath|\mathbf v}} in {{tmath|K^n}} such that
<math display=block>\begin{align}
\mathbf{M v} &= \sigma \mathbf{u}, \\[3mu]
\mathbf M^*\mathbf u &= \sigma \mathbf{v}.
\end{align}</math>
The vectors {{tmath|\mathbf u}} and {{tmath|\mathbf v}} are called '''left-singular''' and '''right-singular vectors''' for {{tmath|\sigma,}} respectively.

In any singular value decomposition
<math display=block> \mathbf M = \mathbf U \mathbf \Sigma \mathbf V^* </math>
the diagonal entries of {{tmath|\mathbf \Sigma}} are equal to the singular values of {{tmath|\mathbf M.}} The first {{tmath|p {{=}} \min(m,n)}} columns of {{tmath|\mathbf U}} and {{tmath|\mathbf V}} are, respectively, left- and right-singular vectors for the corresponding singular values. Consequently, the above theorem implies that:
* An {{tmath|m \times n}} matrix {{tmath|\mathbf M}} has at most {{tmath|p}} distinct singular values.
* It is always possible to find a [[orthogonal basis|unitary basis]] {{tmath|\mathbf U}} for {{tmath|K^m}} with a subset of basis vectors spanning the left-singular vectors of each singular value of {{tmath|\mathbf M.}}
* It is always possible to find a unitary basis {{tmath|\mathbf V}} for {{tmath|K^n}} with a subset of basis vectors spanning the right-singular vectors of each singular value of {{tmath|\mathbf M.}}

A singular value for which we can find two left (or right) singular vectors that are linearly independent is called ''degenerate''. If {{tmath|\mathbf u_1}} and {{tmath|\mathbf u_2}} are two left-singular vectors which both correspond to the singular value {{tmath|\sigma,}} then any normalized linear combination of the two vectors is also a left-singular vector corresponding to the singular value {{tmath|\sigma.}}
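The defining relations {{tmath|\mathbf{Mv} {{=}} \sigma\mathbf u}} and {{tmath|\mathbf M^*\mathbf u {{=}} \sigma\mathbf v}} can be verified numerically. A minimal NumPy sketch (the random test matrix and seed are illustrative choices, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 3))  # arbitrary real 4x3 example matrix

# Full SVD: U is 4x4, s holds the p = min(4, 3) = 3 singular values, Vh = V^T.
U, s, Vh = np.linalg.svd(M)

# For each singular value sigma with left vector u and right vector v:
#   M v = sigma u   and   M^T u = sigma v   (M^* = M^T for real matrices)
for i, sigma in enumerate(s):
    u, v = U[:, i], Vh[i, :]
    assert np.allclose(M @ v, sigma * u)
    assert np.allclose(M.T @ u, sigma * v)
```

Note that NumPy returns {{tmath|\mathbf V^*}} (here `Vh`) rather than {{tmath|\mathbf V,}} so the right-singular vectors are its rows.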
A similar statement holds for right-singular vectors. The number of independent left and right-singular vectors coincides, and these singular vectors appear in the same columns of {{tmath|\mathbf U}} and {{tmath|\mathbf V}} corresponding to diagonal elements of {{tmath|\mathbf \Sigma}} all with the same value {{tmath|\sigma.}} As an exception, the left and right-singular vectors of singular value 0 comprise all unit vectors in the [[cokernel]] and [[Kernel (linear algebra)|kernel]], respectively, of {{tmath|\mathbf M,}} which by the [[rank–nullity theorem]] cannot have the same dimension if {{tmath|m \neq n.}} Even if all singular values are nonzero, if {{tmath|m > n}} then the cokernel is nontrivial, in which case {{tmath|\mathbf U}} is padded with {{tmath|m - n}} orthogonal vectors from the cokernel. Conversely, if {{tmath|m < n,}} then {{tmath|\mathbf V}} is padded by {{tmath|n - m}} orthogonal vectors from the kernel. However, if {{tmath|0}} occurs as a singular value, the extra columns of {{tmath|\mathbf U}} or {{tmath|\mathbf V}} already appear as left or right-singular vectors.

Non-degenerate singular values always have unique left- and right-singular vectors, up to multiplication by a unit-phase factor {{tmath|e^{i\varphi} }} (for the real case, up to a sign). Consequently, if all singular values of a square matrix {{tmath|\mathbf M}} are non-degenerate and non-zero, then its singular value decomposition is unique, up to multiplication of a column of {{tmath|\mathbf U}} by a unit-phase factor and simultaneous multiplication of the corresponding column of {{tmath|\mathbf V}} by the same unit-phase factor.
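The padding of {{tmath|\mathbf U}} with cokernel vectors when {{tmath|m > n}} is visible in the difference between a full and a reduced ("thin") SVD. A short NumPy illustration (the `full_matrices` flag is NumPy's mechanism for choosing between the two; the test matrix is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 3))  # m > n: the cokernel is nontrivial

# Full SVD pads U with m - n = 2 extra orthonormal columns;
# the thin SVD keeps only the n columns tied to singular values.
U_full, s, Vh = np.linalg.svd(M, full_matrices=True)   # U_full: 5x5
U_thin, _, _ = np.linalg.svd(M, full_matrices=False)   # U_thin: 5x3

# The extra columns of U_full span the cokernel of M:
# applying M^T to them gives zero.
extra = U_full[:, 3:]
assert np.allclose(M.T @ extra, 0)
```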
In general, the SVD is unique up to arbitrary unitary transformations applied uniformly to the column vectors of both {{tmath|\mathbf U}} and {{tmath|\mathbf V}} spanning the subspaces of each singular value, and up to arbitrary unitary transformations on vectors of {{tmath|\mathbf U}} and {{tmath|\mathbf V}} spanning the kernel and cokernel, respectively, of {{tmath|\mathbf M.}}

=== Relation to eigenvalue decomposition ===

The singular value decomposition is very general in the sense that it can be applied to any {{tmath|m \times n}} matrix, whereas [[eigenvalue decomposition]] can only be applied to square [[Diagonalizable matrix|diagonalizable matrices]]. Nevertheless, the two decompositions are related. If {{tmath|\mathbf M}} has SVD {{tmath|\mathbf M {{=}} \mathbf U \mathbf \Sigma \mathbf V^*,}} the following two relations hold:
<math display=block>\begin{align}
\mathbf{M}^* \mathbf{M} &= \mathbf{V} \mathbf \Sigma^* \mathbf{U}^*\, \mathbf{U} \mathbf \Sigma \mathbf{V}^* = \mathbf{V} (\mathbf \Sigma^* \mathbf \Sigma) \mathbf{V}^*, \\[3mu]
\mathbf{M} \mathbf{M}^* &= \mathbf{U} \mathbf \Sigma \mathbf{V}^*\, \mathbf{V} \mathbf \Sigma^* \mathbf{U}^* = \mathbf{U} (\mathbf \Sigma \mathbf \Sigma^*) \mathbf{U}^*.
\end{align}</math>
The right-hand sides of these relations describe the eigenvalue decompositions of the left-hand sides.
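These two relations can be checked directly: the eigenvalues of {{tmath|\mathbf M^* \mathbf M}} are the squared singular values, and the right-singular vectors are its eigenvectors. A NumPy sketch (random example matrix, real case so {{tmath|\mathbf M^* {{=}} \mathbf M^\mathsf{T} }}):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 3))

U, s, Vh = np.linalg.svd(M)

# Eigenvalues of the symmetric matrix M^T M are the squared singular values.
eigvals = np.linalg.eigvalsh(M.T @ M)  # returned in ascending order
assert np.allclose(np.sort(s**2), eigvals)

# The columns of V are eigenvectors of M^T M:  (M^T M) V = V diag(s^2).
V = Vh.T
assert np.allclose(M.T @ M @ V, V * s**2)  # V * s**2 scales column j by s[j]^2
```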
Consequently:
* The columns of {{tmath|\mathbf V}} (referred to as right-singular vectors) are [[eigenvectors]] of {{tmath|\mathbf M^* \mathbf M.}}
* The columns of {{tmath|\mathbf U}} (referred to as left-singular vectors) are eigenvectors of {{tmath|\mathbf M \mathbf M^*.}}
* The non-zero elements of {{tmath|\mathbf \Sigma}} (non-zero singular values) are the square roots of the non-zero [[eigenvalues]] of {{tmath|\mathbf M^* \mathbf M}} or {{tmath|\mathbf M \mathbf M^*.}}

In the special case of {{tmath|\mathbf M}} being a [[normal matrix]], and thus also square, the [[Spectral theorem#Finite-dimensional case|spectral theorem]] ensures that it can be [[Unitary transform|unitarily]] [[Diagonalizable matrix|diagonalized]] using a basis of [[eigenvector]]s, and thus decomposed as {{tmath|\mathbf M {{=}} \mathbf U\mathbf D\mathbf U^*}} for some unitary matrix {{tmath|\mathbf U}} and diagonal matrix {{tmath|\mathbf D}} with complex elements {{tmath|\sigma_i}} along the diagonal. When {{tmath|\mathbf M}} is [[Positive-definite matrix|positive semi-definite]], {{tmath|\sigma_i}} will be non-negative real numbers, so that the decomposition {{tmath|\mathbf M {{=}} \mathbf U \mathbf D \mathbf U^*}} is also a singular value decomposition. Otherwise, it can be recast as an SVD by moving the phase {{tmath|e^{i\varphi} }} of each {{tmath|\sigma_i}} to either its corresponding {{tmath|\mathbf V_i}} or {{tmath|\mathbf U_i.}} The natural connection of the SVD to non-normal matrices is through the [[polar decomposition]] theorem: {{tmath|\mathbf M {{=}} \mathbf S \mathbf R,}} where {{tmath|\mathbf S {{=}} \mathbf U \mathbf\Sigma \mathbf U^*}} is positive semidefinite and normal, and {{tmath|\mathbf R {{=}} \mathbf U \mathbf V^*}} is unitary.
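The polar decomposition can be assembled directly from an SVD, as the identity {{tmath|\mathbf S\mathbf R {{=}} \mathbf U\mathbf\Sigma\mathbf U^* \mathbf U\mathbf V^* {{=}} \mathbf U\mathbf\Sigma\mathbf V^*}} suggests. A NumPy sketch for a real square matrix (the example matrix is an arbitrary random choice):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))  # square so that S and R are both 3x3

U, s, Vh = np.linalg.svd(M)

# Polar decomposition M = S R built from the SVD:
#   S = U Sigma U^T  (symmetric positive semidefinite, hence normal)
#   R = U V^T        (orthogonal; unitary in the complex case)
S = U @ np.diag(s) @ U.T
R = U @ Vh

assert np.allclose(S, S.T)                      # S is symmetric
assert np.all(np.linalg.eigvalsh(S) >= -1e-12)  # S is positive semidefinite
assert np.allclose(R @ R.T, np.eye(3))          # R is orthogonal
assert np.allclose(M, S @ R)                    # M = S R
```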
Thus, except for positive semi-definite matrices, the eigenvalue decomposition and SVD of {{tmath|\mathbf M,}} while related, differ: the eigenvalue decomposition is {{tmath|1= \mathbf M = \mathbf U \mathbf D \mathbf U^{-1},}} where {{tmath|\mathbf U}} is not necessarily unitary and {{tmath|\mathbf D}} is not necessarily positive semi-definite, while the SVD is {{tmath|1= \mathbf M = \mathbf U \mathbf \Sigma \mathbf V^*,}} where {{tmath|\mathbf \Sigma}} is diagonal and positive semi-definite, and {{tmath|\mathbf U}} and {{tmath|\mathbf V}} are unitary matrices that are not necessarily related except through the matrix {{tmath|\mathbf M.}} While only [[defective matrix|non-defective]] square matrices have an eigenvalue decomposition, any {{tmath|m \times n}} matrix has an SVD.
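The contrast is concrete for a small non-normal matrix: the eigenvector basis need not be orthonormal, and the singular values need not equal the moduli of the eigenvalues. A NumPy illustration (the 2×2 matrix is a hand-picked non-normal example):

```python
import numpy as np

# A non-normal but diagonalizable matrix: its two decompositions differ.
M = np.array([[2.0, 1.0],
              [0.0, 1.0]])
assert not np.allclose(M @ M.T, M.T @ M)  # M is not normal

# Eigendecomposition M = X D X^{-1}: X need not be unitary.
eigvals, X = np.linalg.eig(M)             # eigenvalues 2 and 1
assert np.allclose(M, X @ np.diag(eigvals) @ np.linalg.inv(X))
assert not np.allclose(X.T @ X, np.eye(2))  # eigenvectors are not orthogonal

# SVD M = U Sigma V^T: U, V orthogonal, Sigma >= 0,
# but the singular values differ from |eigenvalues|.
U, s, Vh = np.linalg.svd(M)
assert np.allclose(M, U @ np.diag(s) @ Vh)
assert not np.allclose(np.sort(s), np.sort(np.abs(eigvals)))
```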