
Matrix Decompositions

References:

  1. Multivariate Statistics lecture notes by Shahin Tavakoli
  2. Cholesky decomposition - blog post - notes
  3. Some very nice linear algebra blog posts

Spectral Decomposition and SVD

Singular Value Decomposition (SVD)

Any real $n \times p$ matrix $X$ admits the following decomposition:

\[\begin{align*} X = ULV^T \end{align*}\]

Where $U$ is an $n \times n$ orthogonal matrix, $L$ is an $n \times p$ rectangular diagonal matrix with non-negative entries (singular values), and $V$ is a $p \times p$ orthogonal matrix.

This is saying that:

“Every matrix is diagonal provided one uses the proper bases for the domain and range spaces.”

And the geometric interpretation of this is that the linear transformation represented by any matrix is: a rotation, then a scaling of the axes by the singular values, and then another rotation of the axes. Figure 4 in this blog post visualises this.
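The decomposition above can be checked numerically. A minimal sketch with NumPy, using a small random matrix purely for illustration (note that `np.linalg.svd` returns the singular values as a vector, so the rectangular diagonal matrix $L$ has to be assembled from it):

```python
import numpy as np

# A hypothetical 4x3 matrix, just for illustration.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 3))

# full_matrices=True gives U (n x n) and V^T (p x p); s holds the singular values.
U, s, Vt = np.linalg.svd(X, full_matrices=True)

# Build the rectangular diagonal matrix L from the singular value vector.
L = np.zeros_like(X)
np.fill_diagonal(L, s)

# Verify the reconstruction X = U L V^T.
assert np.allclose(X, U @ L @ Vt)
```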

This also helps to explain why we take such an interest in eigenvalues/singular values: they tell us by how much a linear transformation distorts the space. It then becomes clear why matrices with a zero eigenvalue are not invertible, as they collapse the space, and why orthogonal matrices, which do not distort the space, have all singular values equal to 1 (their eigenvalues all have modulus 1).
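Both claims are easy to verify on small examples. A sketch, with the matrices chosen purely for illustration:

```python
import numpy as np

# A rotation (orthogonal) matrix: all singular values are 1, no distortion.
theta = np.pi / 4
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(np.linalg.svd(Q, compute_uv=False), [1.0, 1.0])

# A rank-deficient matrix: a zero singular value means the space is
# collapsed onto a line, so the matrix is not invertible (determinant 0).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])  # second row is a multiple of the first
s = np.linalg.svd(A, compute_uv=False)
assert np.isclose(s[-1], 0.0)
assert np.isclose(np.linalg.det(A), 0.0)
```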


Spectral Decomposition

Let $A$ be a symmetric matrix, then $A$ admits the following decomposition:

\[\begin{align*} A = EDE^T \end{align*}\]

Where $E$ is the orthogonal matrix whose columns are the eigenvectors of $A$, and $D$ is the diagonal matrix with the eigenvalues of $A$ on its diagonal.

This has the same geometric interpretation, except that now the second rotation undoes the first: the map rotates by $E^T$, scales along the new axes by $D$, and rotates back by $E$. This video provides a visualisation.
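A minimal numerical check of the decomposition, using NumPy's `eigh` (intended for symmetric matrices) on a small symmetric matrix chosen for illustration:

```python
import numpy as np

# A small symmetric matrix, assumed purely for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigh returns the eigenvalues and an orthogonal matrix of eigenvectors.
eigvals, E = np.linalg.eigh(A)
D = np.diag(eigvals)

# Verify the reconstruction A = E D E^T.
assert np.allclose(A, E @ D @ E.T)

# E is orthogonal: E^T E = I.
assert np.allclose(E.T @ E, np.eye(2))
```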

Relationship between the two

https://gregorygundersen.com/blog/2018/12/20/svd-proof/#1-gram-matrices-as-positive-semi-definite

http://theanalysisofdata.com/probability/C_5.html#:~:text=The%20following%20corollary%20shows%20that,are%20the%20eigenvectors%20of%20A.
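The key link discussed in those references: the right singular vectors of $X$ are the eigenvectors of the Gram matrix $X^T X$, and the eigenvalues of $X^T X$ are the squared singular values of $X$. A sketch of the eigenvalue part, with a random matrix used purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((5, 3))

# Singular values of X.
s = np.linalg.svd(X, compute_uv=False)

# Eigenvalues of the Gram matrix X^T X (symmetric positive semi-definite).
eigvals = np.linalg.eigvalsh(X.T @ X)

# The eigenvalues of X^T X are the squared singular values of X.
# (eigvalsh returns ascending order, svd descending, so sort before comparing.)
assert np.allclose(np.sort(eigvals)[::-1], s**2)
```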

Cholesky decomposition

\[A = R^TR\]

Where $A$ is a symmetric positive definite matrix, and $R$ is an upper triangular matrix.

  • A unique Cholesky decomposition exists $\iff$ $A$ is symmetric and positive definite.
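A quick check with NumPy, using a small positive definite matrix assumed for illustration. Note that `np.linalg.cholesky` returns the lower triangular factor, so the upper triangular $R$ above is its transpose:

```python
import numpy as np

# A symmetric positive definite matrix, assumed for illustration.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# NumPy returns the lower triangular factor; R = L^T is upper triangular.
L_chol = np.linalg.cholesky(A)
R = L_chol.T

# Verify A = R^T R and that R is upper triangular.
assert np.allclose(A, R.T @ R)
assert np.allclose(R, np.triu(R))
```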

QR decomposition

Any real $r \times c$ matrix $A$ can be decomposed into the product of an orthogonal matrix $Q$ and a rectangular upper triangular matrix $R$.

Which can then be simplified to:

\(\begin{align*} A = QR = \begin{bmatrix} Q_1 & Q_2 \end{bmatrix} \begin{bmatrix} R_1 \\ 0 \end{bmatrix} = Q_1R_1 \end{align*}\)

Where $Q_1$ is an $r \times c$ matrix with orthonormal columns, and $R_1$ is a $c \times c$ upper triangular matrix.
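The thin factorisation $A = Q_1 R_1$ is what NumPy returns by default (`mode='reduced'`). A sketch, with a random $5 \times 3$ matrix used purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 3))  # r = 5 rows, c = 3 columns

# 'reduced' mode returns the thin factors Q1 (r x c) and R1 (c x c) directly.
Q1, R1 = np.linalg.qr(A, mode='reduced')

assert Q1.shape == (5, 3) and R1.shape == (3, 3)
assert np.allclose(A, Q1 @ R1)
assert np.allclose(Q1.T @ Q1, np.eye(3))  # orthonormal columns
assert np.allclose(R1, np.triu(R1))       # upper triangular
```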

This post is licensed under CC BY 4.0 by the author.