Eigendecomposition (Spectral Decomposition), Singular Value Decomposition (SVD)

    Eigendecomposition (Spectral Decomposition)

    Fundamental theory of eigenvalues and eigenvectors

    Eigenvalues and eigenvectors are defined by the following equation:

    $$ A \boldsymbol{v} = \lambda \boldsymbol{v} $$

    where a non-zero vector $\boldsymbol{v}$ is an eigenvector of a square $N \times N$ matrix $A$ and $\lambda$ is a scalar, termed the eigenvalue corresponding to $\boldsymbol{v}$. That is, the eigenvectors are the vectors that the linear transformation $A$ merely elongates or shrinks, and the amount that they elongate/shrink by is the eigenvalue. The above equation is called the eigenvalue equation or the eigenvalue problem.

    Rearranging the eigenvalue equation yields:

    $$ A \boldsymbol{v} - \lambda \boldsymbol{v} = 0 $$

    $$ \rightarrow (A - \lambda I) \boldsymbol{v} = 0 $$

    Since $\boldsymbol{v}$ must be a non-zero vector, the matrix $(A - \lambda I)$ must be singular, i.e., its determinant must vanish:

    $$ p(\lambda) = \det(A - \lambda I) = 0 $$

    $p(\lambda)$ is called the characteristic polynomial, and $p(\lambda) = 0$ is the characteristic equation. Its solutions are the eigenvalues of $A$; the set of all eigenvalues is called the spectrum of $A$.
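
    As a concrete example, for

    $$ A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \quad p(\lambda) = \det \begin{pmatrix} 2-\lambda & 1 \\ 1 & 2-\lambda \end{pmatrix} = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3), $$

    so the eigenvalues are $\lambda_1 = 3$ and $\lambda_2 = 1$.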

    For each eigenvalue $\lambda_i$, we have a specific eigenvalue equation:

    $$ (A - \lambda_i I) \boldsymbol{v} = 0 $$

    Each eigenvalue equation has at least one, and at most $N$, linearly independent non-trivial solutions. These solutions are the eigenvectors corresponding to the eigenvalue $\lambda_i$.
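
    The eigenvalue problem can also be checked numerically. The following is a minimal sketch using the $2 \times 2$ example above, assuming NumPy (any linear algebra library would work):

    ```python
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # np.linalg.eig returns the eigenvalues and a matrix whose i-th
    # column is the eigenvector paired with the i-th eigenvalue.
    vals, vecs = np.linalg.eig(A)
    print(vals)   # [3. 1.] up to ordering
    print(vecs)   # columns are unit-norm eigenvectors

    # Verify A v = lambda v for every pair.
    for lam, v in zip(vals, vecs.T):
        assert np.allclose(A @ v, lam * v)
    ```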

    Eigendecomposition of a matrix

    The decomposition can be derived from the fundamental property of eigenvectors:

    $$ A \boldsymbol{v} = \lambda \boldsymbol{v} $$

    Collecting the $n$ eigenvalue equations $A q_i = \lambda_i q_i$ into the columns of a single matrix equation gives

    $$ A Q = Q \Lambda $$

    where $Q$ is the square $n \times n$ matrix whose i-th column is the eigenvector $q_i$ (where $i=1, \cdots, n$) of $A$, and $\Lambda$ is the diagonal matrix whose diagonal elements are the corresponding eigenvalues, $\Lambda_{ii} = \lambda_{i}$. Finally,

    $$ A = Q \Lambda Q^{-1} $$

    can be derived; this is termed the eigendecomposition of the matrix $A$. Note that the last step requires $Q$ to be invertible, which holds exactly when $A$ has $n$ linearly independent eigenvectors, i.e., when $A$ is diagonalizable.
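
    The decomposition $A = Q \Lambda Q^{-1}$ can likewise be verified numerically; a minimal NumPy sketch, continuing the example matrix from above:

    ```python
    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    vals, Q = np.linalg.eig(A)   # columns of Q are the eigenvectors
    Lam = np.diag(vals)          # Lambda: eigenvalues on the diagonal

    # A Q = Q Lambda, hence A = Q Lambda Q^{-1} when Q is invertible.
    assert np.allclose(A @ Q, Q @ Lam)
    assert np.allclose(A, Q @ Lam @ np.linalg.inv(Q))
    ```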

    Singular Value Decomposition

    The singular value decomposition (SVD) is a factorization of a real or complex matrix that generalizes the eigendecomposition of a square normal matrix to any $m \times n$ matrix. A normal matrix is a square matrix that commutes with its transpose, i.e., $A A^T = A^T A$ (for complex matrices, $A A^* = A^* A$). The equation of the SVD is as follows:

    $$ M = U \Sigma V^T$$

    where $U$ is an $m \times m$ real or complex unitary matrix, $\Sigma$ is an $m \times n$ rectangular diagonal matrix with non-negative real numbers on the diagonal, and $V$ is an $n \times n$ real or complex unitary matrix (in the complex case, $V^T$ is replaced by the conjugate transpose $V^*$). If $M$ is real, $U$ and $V$ are real orthogonal matrices (a real unitary matrix is an orthogonal matrix). The diagonal entries $\sigma_i = \Sigma_{ii}$ of $\Sigma$ are known as the singular values of $M$. The columns of $U$ and the columns of $V$ are called the left-singular vectors and right-singular vectors of $M$, respectively.
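
    As a concrete illustration, the factorization can be computed with np.linalg.svd; the matrix below is an arbitrary example:

    ```python
    import numpy as np

    M = np.array([[3.0, 0.0],
                  [4.0, 5.0]])

    # s holds the singular values in descending order;
    # Vt is V^T, so the SVD reads M = U @ Sigma @ Vt.
    U, s, Vt = np.linalg.svd(M, full_matrices=True)

    Sigma = np.zeros(M.shape)
    np.fill_diagonal(Sigma, s)

    assert np.allclose(M, U @ Sigma @ Vt)      # M = U Sigma V^T
    assert np.allclose(U.T @ U, np.eye(2))     # U is orthogonal
    assert np.allclose(Vt @ Vt.T, np.eye(2))   # V is orthogonal
    ```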

    The SVD can be interpreted geometrically, as in the following illustration:

    [Figure: Illustration of the singular value decomposition $U \Sigma V^T$ of a real $2 \times 2$ matrix $M$. Here $V^* = V^T$ since $V$ is real, and $\sigma$ denotes a singular value.]

    In the figure:

    • Top: The action of $M$, indicated by its effect on the unit disc $D$ and the two unit vectors $\boldsymbol{e_1}$ and $\boldsymbol{e_2}$.
    • Left: The action of $V^T$, a rotation, on $D, \boldsymbol{e_1}, \boldsymbol{e_2}$.
    • Bottom: The action of $\Sigma$, a scaling by the singular values $\sigma_1$ horizontally and $\sigma_2$ vertically.
    • Right: The action of $U$, another rotation.
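
    Since the figure is not reproduced here, the same rotate, scale, rotate pipeline can be traced numerically; a small sketch reusing the example matrix $M$ from above:

    ```python
    import numpy as np

    M = np.array([[3.0, 0.0],
                  [4.0, 5.0]])
    U, s, Vt = np.linalg.svd(M)

    # Applying V^T, then Sigma, then U to a unit vector reproduces
    # the direct action of M, matching the figure's four panels.
    for e in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
        rotated = Vt @ e                # rotation by V^T
        scaled = np.diag(s) @ rotated   # scaling by sigma_1, sigma_2
        out = U @ scaled                # rotation by U
        assert np.allclose(out, M @ e)
    ```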

    The SVD equation can also be expressed as a product of three matrices:

    [Figure: Visualization of the matrix multiplication in the singular value decomposition]

    Relation to Eigendecomposition

    The singular value decomposition is very general in the sense that it can be applied to any $m \times n$ matrix, whereas eigenvalue decomposition can only be applied to diagonalizable matrices. Nevertheless, the two decompositions are related.

    Given an SVD of $M$, as described above, the following two relations hold:

    $$ M^T M = V \Sigma^T U^T U \Sigma V^T = V (\Sigma^T \Sigma) V^T $$

    $$ M M^T = U \Sigma V^T V \Sigma^T U^T = U (\Sigma \Sigma^T) U^T $$

    The right-hand sides of these relations describe the eigenvalue decompositions of the left-hand sides. Consequently:

    • The columns of $V$ (right-singular vectors) are eigenvectors of $M^T M$.
    • The columns of $U$ (left-singular vectors) are eigenvectors of $M M^T$.
    • The non-zero elements of $\Sigma$ (non-zero singular values) are the square roots of the non-zero eigenvalues of $M^T M$ or $M M^T$.
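
    These three consequences can be confirmed numerically; a minimal NumPy sketch with the same example matrix:

    ```python
    import numpy as np

    M = np.array([[3.0, 0.0],
                  [4.0, 5.0]])
    U, s, Vt = np.linalg.svd(M)

    # Squared singular values equal the eigenvalues of M^T M
    # (np.linalg.eigvalsh returns them in ascending order).
    assert np.allclose(np.sort(s**2), np.linalg.eigvalsh(M.T @ M))

    # Rows of Vt (columns of V) are eigenvectors of M^T M;
    # columns of U are eigenvectors of M M^T.
    for sigma, v in zip(s, Vt):
        assert np.allclose(M.T @ M @ v, sigma**2 * v)
    for sigma, u in zip(s, U.T):
        assert np.allclose(M @ M.T @ u, sigma**2 * u)
    ```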

    While the eigenvalue decomposition and the SVD of $M$ are related, they differ: the eigenvalue decomposition is $M = Q D Q^{-1}$, where $Q$ is not necessarily unitary and $D$ is not necessarily positive semi-definite, while the SVD is $M = U \Sigma V^T$, where $\Sigma$ is diagonal and positive semi-definite, and $U$ and $V$ are unitary matrices that are not necessarily related except through the matrix $M$.

    Source: Wikipedia
