
Eigenvectors

Linear Algebra - Eigenvectors

Eigenvalues and eigenvectors

If $\mathbf{v}_i$ is an eigenvector of a matrix $A$ with eigenvalue $\lambda_i$ and $\alpha \neq 0$, then $\alpha \mathbf{v}_i$ is also an eigenvector of $A$ with eigenvalue $\lambda_i$. Hence, without loss of generality, eigenvectors are often assumed to be normalised. But even then they are not uniquely defined: for example, they are only determined up to sign.
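As a quick numerical illustration (a minimal sketch with NumPy; the $2 \times 2$ matrix below is an arbitrary example), scaling an eigenvector by any non-zero $\alpha$ leaves the eigenvalue unchanged, and the eigenvectors returned by `np.linalg.eigh` are unit-norm but only determined up to sign:

```python
import numpy as np

# Arbitrary symmetric example matrix (symmetric, so the eigendecomposition is real).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

eigvals, eigvecs = np.linalg.eigh(A)   # columns of eigvecs are unit-norm eigenvectors
lam, v = eigvals[0], eigvecs[:, 0]

# Any non-zero multiple of v is still an eigenvector with the same eigenvalue.
alpha = -5.0
print(np.allclose(A @ (alpha * v), lam * (alpha * v)))   # True

# The returned eigenvector is normalised, but only determined up to sign:
print(np.linalg.norm(v))   # 1.0 (and -v would be an equally valid answer)
```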

Singular values

The singular values of a matrix $A$ are the square roots of the eigenvalues of $A^TA$.
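This relationship is easy to check numerically (a minimal sketch with NumPy; the rectangular matrix below is an arbitrary example):

```python
import numpy as np

# Arbitrary 3x2 example matrix.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

singular_values = np.linalg.svd(A, compute_uv=False)   # descending order
eigvals_AtA = np.linalg.eigvalsh(A.T @ A)              # eigenvalues of A^T A, ascending

print(np.sort(singular_values))   # ascending, for comparison
print(np.sqrt(eigvals_AtA))       # matches the singular values
```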

If $A$ is symmetric, the eigenvalues of $A^TA$ are the squares of the eigenvalues of $A$.

Proof
\[\begin{align*} Ax &= \lambda x \\ A^TAx &= AAx = A(\lambda x) = \lambda Ax = \lambda^2 x \end{align*}\]

The first equality in the second line uses $A^T = A$; since a symmetric matrix has a full (orthonormal) set of eigenvectors, this accounts for all the eigenvalues of $A^TA$.

Therefore the singular values of a symmetric matrix are the absolute values of its eigenvalues.
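The symmetric case can be verified the same way (a sketch; the matrix below is an arbitrary symmetric example with one negative eigenvalue):

```python
import numpy as np

# Arbitrary symmetric example with one negative eigenvalue.
S = np.array([[1.0,  2.0],
              [2.0, -3.0]])

eigvals = np.linalg.eigvalsh(S)
singular_values = np.linalg.svd(S, compute_uv=False)

print(np.sort(np.abs(eigvals)))   # absolute eigenvalues, ascending
print(np.sort(singular_values))   # identical to the line above
```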

Positive definiteness

  • An $n \times n$ matrix $M$ is positive definite if $x^T M x > 0$ for all non-zero vectors $x \in \mathbb{R}^n$.
  • An $n \times n$ matrix $M$ is positive semi-definite if $x^T M x \geq 0$ for all non-zero vectors $x \in \mathbb{R}^n$.

  • Covariance matrices are positive semi-definite: for a random vector $X$ with mean $\mu$ and covariance $\Sigma = E[(X-\mu)(X-\mu)^T]$, we have $x^T \Sigma x = E[(x^T(X-\mu))^2] \geq 0$.
    The quadratic form is equal to zero exactly when $\exists x \in \mathbb{R}^n$, $x \neq 0$, such that $x^T (X - \mu) = 0$ a.s. In other words, one of the random variables is (almost surely) an affine combination of the others; see the numerical sketch below.
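Below is a minimal numerical sketch of these points, using synthetic data (the variables, sample size, and seed are arbitrary): the sample covariance of independent variables has strictly positive eigenvalues, while adding a column that is an exact linear combination of the others produces a (numerically) zero eigenvalue, so the matrix is only positive semi-definite.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))       # three independent variables, 1000 samples

cov = np.cov(X, rowvar=False)
print(np.linalg.eigvalsh(cov))       # all strictly positive -> positive definite

# Add a variable that is an exact linear combination of the others.
X_dep = np.column_stack([X, X[:, 0] + 2.0 * X[:, 1]])
cov_dep = np.cov(X_dep, rowvar=False)
print(np.linalg.eigvalsh(cov_dep))   # smallest eigenvalue ~ 0 -> only semi-definite
```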