== Calculating the SVD ==

=== One-sided Jacobi algorithm ===

The one-sided Jacobi algorithm is an iterative algorithm,<ref>{{cite journal|first1=P.P.M. de|last1=Rijk|title=A one-sided Jacobi algorithm for computing the singular value decomposition on a vector computer|journal=SIAM J. Sci. Stat. Comput.|volume=10|pages=359–371|year=1989|issue=2|doi=10.1137/0910023}}</ref> in which a matrix is iteratively transformed into a matrix with orthogonal columns. The elementary iteration is given as a [[Jacobi rotation]],
<math display=block>
M \leftarrow M J(p, q, \theta),
</math>
where the angle <math>\theta</math> of the Jacobi rotation matrix <math>J(p,q,\theta)</math> is chosen such that after the rotation the columns with numbers <math>p</math> and <math>q</math> become orthogonal. The indices <math>(p,q)</math> are swept cyclically, <math>(p=1\dots m, q=p+1\dots m)</math>, where <math>m</math> is the number of columns. After the algorithm has converged, the singular value decomposition <math>M=USV^T</math> is recovered as follows: the matrix <math>V</math> is the accumulation of the Jacobi rotation matrices, the matrix <math>U</math> is given by [[norm_(mathematics)|normalising]] the columns of the transformed matrix <math>M</math>, and the singular values are given by the norms of those columns.
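The sweep described above translates almost line for line into code. The following is a minimal NumPy sketch of the one-sided Jacobi iteration for a real matrix with full column rank; the function name, tolerance, and sweep limit are illustrative choices rather than part of any library.

<syntaxhighlight lang="python">
import numpy as np

def one_sided_jacobi_svd(M, tol=1e-12, max_sweeps=30):
    """Illustrative one-sided Jacobi SVD sketch (assumes full column rank)."""
    A = M.astype(float).copy()          # A will become U*S, i.e. M*V
    m = A.shape[1]                      # number of columns, as in the text
    V = np.eye(m)
    for _ in range(max_sweeps):
        off = 0.0                       # largest cosine between column pairs
        for p in range(m - 1):
            for q in range(p + 1, m):
                alpha = A[:, p] @ A[:, p]
                beta = A[:, q] @ A[:, q]
                gamma = A[:, p] @ A[:, q]
                off = max(off, abs(gamma) / np.sqrt(alpha * beta))
                if abs(gamma) <= tol * np.sqrt(alpha * beta):
                    continue            # columns p and q already orthogonal
                # Choose the rotation angle that orthogonalizes columns p, q
                zeta = (beta - alpha) / (2.0 * gamma)
                t = np.copysign(1.0, zeta) / (abs(zeta) + np.hypot(1.0, zeta))
                c = 1.0 / np.hypot(1.0, t)
                s = c * t
                for X in (A, V):        # A <- A J(p,q,theta), V <- V J(p,q,theta)
                    Xp, Xq = X[:, p].copy(), X[:, q].copy()
                    X[:, p] = c * Xp - s * Xq
                    X[:, q] = s * Xp + c * Xq
        if off <= tol:
            break
    sigma = np.linalg.norm(A, axis=0)   # singular values = column norms
    return A / sigma, sigma, V          # M = U diag(sigma) V^T
</syntaxhighlight>

The returned factors satisfy <math>M = USV^T</math> up to the usual sign and ordering ambiguities; production implementations add safeguards for rank-deficient columns and more careful convergence tests.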
=== Two-sided Jacobi algorithm ===

The two-sided Jacobi SVD algorithm—a generalization of the [[Jacobi eigenvalue algorithm]]—is an iterative algorithm in which a square matrix is iteratively transformed into a diagonal matrix. If the matrix is not square, a [[QR decomposition]] is performed first and the algorithm is then applied to the <math>R</math> matrix. The elementary iteration zeroes a pair of off-diagonal elements: first a [[Givens rotation]] is applied to symmetrize the pair, and then a [[Jacobi transformation]] is applied to zero them,
<math display=block>
M \leftarrow J^T G M J,
</math>
where <math>G</math> is the Givens rotation matrix with the angle chosen such that the given pair of off-diagonal elements become equal after the rotation, and where <math>J</math> is the Jacobi transformation matrix that zeroes these off-diagonal elements. The iteration proceeds exactly as in the Jacobi eigenvalue algorithm: by cyclic sweeps over all off-diagonal elements. After the algorithm has converged, the resulting diagonal matrix contains the singular values. The matrices <math>U</math> and <math>V</math> are accumulated as follows: <math>U \leftarrow U G^T J</math> and <math>V \leftarrow V J</math>.

=== Numerical approach ===

The singular value decomposition can be computed using the following observations:
* The left-singular vectors of {{tmath|\mathbf M}} are a set of [[orthonormal]] [[eigenvectors]] of {{tmath|\mathbf M \mathbf M^*}}.
* The right-singular vectors of {{tmath|\mathbf M}} are a set of orthonormal eigenvectors of {{tmath|\mathbf M^* \mathbf M}}.
* The non-zero singular values of {{tmath|\mathbf M}} (found on the diagonal entries of <math>\mathbf \Sigma</math>) are the square roots of the non-zero [[eigenvalues]] of both {{tmath|\mathbf M^* \mathbf M}} and {{tmath|\mathbf M \mathbf M^*}}.

The SVD of a matrix {{tmath|\mathbf M}} is typically computed by a two-step procedure. In the first step, the matrix is reduced to a [[bidiagonal matrix]]. This takes [[big O notation|order]] {{tmath|O(mn^2)}} floating-point operations (flops), assuming that {{tmath|m \geq n.}} The second step is to compute the SVD of the bidiagonal matrix. This step can only be done with an [[iterative method]] (as with [[eigenvalue algorithm]]s). However, in practice it suffices to compute the SVD up to a certain precision, such as the [[machine epsilon]]. If this precision is considered constant, then the second step takes {{tmath|O(n)}} iterations, each costing {{tmath|O(n)}} flops. Thus, the first step is more expensive, and the overall cost is {{tmath|O(mn^2)}} flops {{harv|Trefethen|Bau III|1997|loc=Lecture 31}}.

The first step can be done using [[Householder reflection]]s for a cost of {{tmath|4mn^2 - 4n^3/3}} flops, assuming that only the singular values are needed and not the singular vectors. If {{tmath|m}} is much larger than {{tmath|n}}, it is advantageous to first reduce the matrix {{tmath|\mathbf M}} to a triangular matrix with the [[QR decomposition]] and then use Householder reflections to further reduce the matrix to bidiagonal form; the combined cost is {{tmath|2mn^2 + 2n^3}} flops {{harv|Trefethen|Bau III|1997|loc=Lecture 31}}.

The second step can be done by a variant of the [[QR algorithm]] for the computation of eigenvalues, which was first described by {{harvtxt|Golub|Kahan|1965}}. The [[LAPACK]] subroutine DBDSQR<ref>[http://www.netlib.org/lapack/double/dbdsqr.f Netlib.org]</ref> implements this iterative method, with some modifications to cover the case where the singular values are very small {{harv|Demmel|Kahan|1990}}. Together with a first step using Householder reflections and, if appropriate, QR decomposition, this forms the DGESVD<ref>[http://www.netlib.org/lapack/double/dgesvd.f Netlib.org]</ref> routine for the computation of the singular value decomposition.

The same algorithm is implemented in the [[GNU Scientific Library]] (GSL). The GSL also offers an alternative method that uses a one-sided [[Jacobi orthogonalization]] in step 2 {{harv|GSL Team|2007}}. This method computes the SVD of the bidiagonal matrix by solving a sequence of {{tmath|2 \times 2}} SVD problems, similar to how the [[Jacobi eigenvalue algorithm]] solves a sequence of {{tmath|2 \times 2}} eigenvalue problems {{harv|Golub|Van Loan|1996|loc=§8.6.3}}. Yet another method for step 2 uses the idea of [[divide-and-conquer eigenvalue algorithm]]s {{harv|Trefethen|Bau III|1997|loc=Lecture 31}}.

There is an alternative way that does not explicitly use the eigenvalue decomposition.<ref>[http://www.mathworks.co.kr/matlabcentral/fileexchange/12674-simple-svd mathworks.co.kr/matlabcentral/fileexchange/12674-simple-svd]</ref> Usually the singular value problem of a matrix {{tmath|\mathbf M}} is converted into an equivalent symmetric eigenvalue problem such as {{tmath|\mathbf M \mathbf M^*,}} {{tmath|\mathbf M^* \mathbf M,}} or
<math display=block>
\begin{bmatrix} \mathbf{0} & \mathbf{M} \\ \mathbf{M}^* & \mathbf{0} \end{bmatrix}.
</math>
The approaches based on eigenvalue decompositions rely on the [[QR algorithm]], which is well developed to be stable and fast. Note that the singular values are real, and the right- and left-singular vectors are not required to form similarity transformations. One can instead iteratively alternate between the [[QR decomposition]] and the [[LQ decomposition]] to find the real diagonal [[Hermitian matrix|Hermitian matrices]]. The [[QR decomposition]] gives {{tmath|\mathbf M \Rightarrow \mathbf Q \mathbf R}} and the [[LQ decomposition]] of {{tmath|\mathbf R}} gives {{tmath|\mathbf R \Rightarrow \mathbf L \mathbf P^*.}} Thus, at every iteration, we have {{tmath|\mathbf M \Rightarrow \mathbf Q \mathbf L \mathbf P^*,}} update {{tmath|\mathbf M \Leftarrow \mathbf L,}} and repeat the orthogonalizations. Eventually,{{clarify|date=April 2021}} this alternation between [[QR decomposition]] and [[LQ decomposition]] produces left- and right-unitary singular matrices. This approach cannot readily be accelerated, as the QR algorithm can with spectral shifts or deflation, because the shift method is not easily defined without using similarity transformations. However, the iteration is very simple to implement (see the sketch below), so it is a good choice when speed does not matter. It also provides insight into how purely orthogonal/unitary transformations can obtain the SVD.
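Because the alternation is stated entirely in terms of standard decompositions, it fits in a few lines of code. Below is a minimal NumPy sketch for a real square matrix, run for a fixed number of unaccelerated sweeps; the function name and sweep count are illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

def qr_lq_svd(M, sweeps=500):
    """Illustrative QR/LQ alternation sketch (assumes a real square M)."""
    A = M.astype(float).copy()
    U = np.eye(A.shape[0])
    V = np.eye(A.shape[1])
    for _ in range(sweeps):
        Q, R = np.linalg.qr(A)          # M => Q R
        U = U @ Q
        P, Lt = np.linalg.qr(R.T)       # R^T = P L^T, i.e. R => L P^T (the LQ step)
        V = V @ P
        A = Lt.T                        # update M <= L and repeat
    s = np.diag(A)                      # A is now (numerically) diagonal
    signs = np.where(s < 0, -1.0, 1.0)  # absorb signs into U so that sigma >= 0
    return U * signs, np.abs(s), V      # M = U diag(sigma) V^T
</syntaxhighlight>

Each pass preserves the invariant <math>M = UAV^T</math>, and the diagonal of the final iterate holds the singular values, possibly in need of sorting. As noted above, the lack of shifts makes convergence slow, so the sweep count here is deliberately generous.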
=== Analytic result of 2 × 2 SVD ===

The singular values of a {{tmath|2 \times 2}} matrix can be found analytically. Let the matrix be
<math display=block>
\mathbf{M} = z_0\mathbf{I} + z_1\sigma_1 + z_2\sigma_2 + z_3\sigma_3,
</math>
where <math>z_i \in \mathbb{C}</math> are complex numbers that parameterize the matrix, {{tmath|\mathbf I}} is the identity matrix, and <math>\sigma_i</math> denote the [[Pauli matrices]]. Then its two singular values are given by
<math display=block>\begin{align}
\sigma_\pm &= \sqrt{|z_0|^2 + |z_1|^2 + |z_2|^2 + |z_3|^2 \pm \sqrt{\bigl(|z_0|^2 + |z_1|^2 + |z_2|^2 + |z_3|^2\bigr)^2 - |z_0^2 - z_1^2 - z_2^2 - z_3^2|^2}} \\
&= \sqrt{|z_0|^2 + |z_1|^2 + |z_2|^2 + |z_3|^2 \pm 2\sqrt{(\operatorname{Re}z_0z_1^*)^2 + (\operatorname{Re}z_0z_2^*)^2 + (\operatorname{Re}z_0z_3^*)^2 + (\operatorname{Im}z_1z_2^*)^2 + (\operatorname{Im}z_2z_3^*)^2 + (\operatorname{Im}z_3z_1^*)^2}}
\end{align}</math>
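To make the formula concrete, the following NumPy sketch reads off the coefficients <math>z_i</math> from the entries of a <math>2 \times 2</math> complex matrix and evaluates the first expression above; the function name and the test matrix are illustrative, and the call to <code>numpy.linalg.svd</code> is only a sanity check.

<syntaxhighlight lang="python">
import numpy as np

def svdvals_2x2(M):
    """Singular values of a 2x2 complex matrix via the Pauli parameterization."""
    z0 = (M[0, 0] + M[1, 1]) / 2        # M = z0 I + z1 s1 + z2 s2 + z3 s3
    z3 = (M[0, 0] - M[1, 1]) / 2
    z1 = (M[0, 1] + M[1, 0]) / 2
    z2 = (M[1, 0] - M[0, 1]) / 2j
    t = abs(z0)**2 + abs(z1)**2 + abs(z2)**2 + abs(z3)**2
    d = abs(z0**2 - z1**2 - z2**2 - z3**2)    # equals |det M|
    inner = np.sqrt(max(t * t - d * d, 0.0))  # clip tiny negatives from roundoff
    return np.sqrt(t + inner), np.sqrt(t - inner)

M = np.array([[1 + 2j, 3.0], [0.5j, 4 - 1j]])
print(svdvals_2x2(M))                         # analytic sigma+ and sigma-
print(np.linalg.svd(M, compute_uv=False))     # numerical cross-check
</syntaxhighlight>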