Singular Value Decomposition (SVD)¶
Any matrix \(A \in \mathbb{R}^{n\times m}\) can be decomposed into the product of three new matrices:

\[A = U \Sigma V^T\]

where,
- \(A\) = input \(n \times m\) matrix
- \(U\) = orthogonal matrix (\(n \times n\)), columns are the left singular vectors
- \(\Sigma\) = diagonal matrix of singular values (\(n \times m\))
- \(V\) = orthogonal matrix (\(m \times m\)), columns are the right singular vectors
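As a quick check of these shapes, here is a minimal NumPy sketch (the matrix \(A\) below is an arbitrary example) that computes the full SVD and verifies that the factors multiply back to \(A\):

```python
import numpy as np

# Arbitrary example matrix A of size n x m (here 4 x 3).
A = np.array([[3.0, 1.0, 2.0],
              [1.0, 0.0, 4.0],
              [2.0, 5.0, 1.0],
              [0.0, 2.0, 3.0]])
n, m = A.shape

# full_matrices=True returns U (n x n) and V^T (m x m);
# s holds the singular values in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)
print(U.shape, s.shape, Vt.shape)   # (4, 4) (3,) (3, 3)

# Rebuild the n x m diagonal matrix Sigma from the singular values.
Sigma = np.zeros((n, m))
Sigma[:m, :m] = np.diag(s)

# A is recovered (up to floating-point error) as U @ Sigma @ V^T.
print(np.allclose(A, U @ Sigma @ Vt))  # True
```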
Orthogonal Matrix¶
A matrix is orthogonal if its rows and columns are orthonormal vectors.
A set of vectors is orthonormal if each vector has a magnitude of 1 and the dot product (\(\cdot\)) of any two distinct vectors is 0.
Example of a rotation matrix (the standard 2D rotation by an angle \(\theta\)):

\[R = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}\]
The rotation matrix is orthogonal because \(R R^T = I\) and \(R^T = R^{-1}\).
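A minimal numerical check of these properties (the angle \(\theta\) below is arbitrary, chosen only for illustration):

```python
import numpy as np

theta = 0.7  # arbitrary angle in radians
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# For an orthogonal matrix, R R^T = I and R^T equals R^{-1}.
print(np.allclose(R @ R.T, np.eye(2)))        # True
print(np.allclose(R.T, np.linalg.inv(R)))     # True

# Each column has unit length and distinct columns have dot product 0.
print(np.linalg.norm(R[:, 0]), R[:, 0] @ R[:, 1])  # 1.0  ~0.0
```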
Singular Values¶
Singular values are the square roots of the eigenvalues (\(\lambda_1, \lambda_2, \lambda_3, ..., \lambda_m\)) of \(A^TA\), so the singular values are:

\[\sigma_i = \sqrt{\lambda_i}\]

The matrix \(A^TA\) is symmetric and of size \(m \times m\). The diagonal matrix \(\Sigma\) contains the singular values \(\sigma_i\) arranged in descending order, and all of them are \(\geq 0\).
Non-commutativity of Matrix Multiplication
\(AB \neq BA\) in general, so in the context of SVD \(A^TA \neq AA^T\) (when \(n \neq m\) they do not even have the same size: \(m \times m\) versus \(n \times n\)).
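To see the relationship between singular values and the eigenvalues of \(A^TA\) numerically, here is a small sketch using an arbitrary example matrix (it also shows the size difference between \(A^TA\) and \(AA^T\)):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0],
              [0.0, 4.0]])  # arbitrary 3 x 2 example, so A^T A is 2 x 2

# Eigenvalues of the symmetric matrix A^T A, sorted in descending order.
eigvals = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]

# Singular values of A, also returned in descending order.
sigma = np.linalg.svd(A, compute_uv=False)

print(np.sqrt(eigvals))                      # square roots of the eigenvalues ...
print(sigma)                                 # ... match the singular values
print(np.allclose(np.sqrt(eigvals), sigma))  # True

# A^T A and A A^T differ (here they are not even the same size).
print((A.T @ A).shape, (A @ A.T).shape)      # (2, 2) (3, 3)
```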
Matrix Rank¶
The rank of a matrix is the number of linearly independent rows or columns.
Example (an illustrative \(3 \times 3\) matrix):

\[A = \begin{bmatrix} 1 & 2 & 3 \\ 2 & 4 & 6 \\ 1 & 1 & 1 \end{bmatrix}\]

The second row is twice the first, so after row reduction only two non-zero rows remain. Thus, the rank of matrix \(A\) is 2 (the number of linearly independent rows or columns).
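A quick check in NumPy (using the illustrative matrix above):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [2, 4, 6],   # 2 * row 1, so it adds no new direction
              [1, 1, 1]])

# matrix_rank counts the singular values above a small tolerance, which
# equals the number of linearly independent rows/columns.
print(np.linalg.matrix_rank(A))  # 2
```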
How To¶
- Compute \(A^TA\) (which is symmetric, so its eigenvectors are orthogonal). The normalized eigenvectors of \(A^TA\), stacked as columns, form the matrix \(V\); its transpose is \(V^T\).
- Compute the eigenvalues (\(\lambda\)) by solving the equation \(\text{det}(A^TA - \lambda I) = 0\), which gives the eigenvalues \(\lambda_1, \lambda_2, ..., \lambda_m\).
- Compute the eigenvectors (\(v\)) corresponding to each eigenvalue \(\lambda_i\). To normalize an eigenvector, divide each of its elements by its length \(\sqrt{v_1^2 + v_2^2 + \dots + v_m^2}\).
- Compute the singular values \(\sigma_i = \sqrt{\lambda_i}\).
- Repeat steps 1-3 with \(AA^T\) to obtain the matrix \(U\) (see the worked sketch below).
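Putting the steps together, here is a minimal NumPy sketch using a small arbitrary example matrix. One practical detail the list glosses over: eigenvectors are only determined up to sign, so rather than taking the eigenvectors of \(AA^T\) independently (their signs may not match \(V\)), the sketch recovers the columns of \(U\) as \(u_i = A v_i / \sigma_i\), which yields the same vectors with consistent signs:

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [2.0, 3.0],
              [2.0, -2.0]])  # arbitrary 3 x 2 example matrix
n, m = A.shape

# Steps 1-3: eigen-decompose the symmetric matrix A^T A (m x m).
# eigh returns normalized eigenvectors; eigenvalues come out in ascending
# order, so reorder everything to descending.
eigvals, eigvecs = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], eigvecs[:, order]

# Step 4: singular values are the square roots of the eigenvalues.
sigma = np.sqrt(np.clip(eigvals, 0.0, None))   # here: [5, 3]

# Step 5: the columns of U are eigenvectors of A A^T.  Computing them as
# u_i = A v_i / sigma_i gives those eigenvectors with signs consistent
# with V (this yields the first m columns of U, i.e. the reduced SVD,
# which is enough to reconstruct A).
U = (A @ V) / sigma

# Check: A = U Sigma V^T and the singular values agree with NumPy's SVD.
print(np.allclose(A, U @ np.diag(sigma) @ V.T))                # True
print(np.allclose(sigma, np.linalg.svd(A, compute_uv=False)))  # True
```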