    Multivariate Gaussian Distribution
    Mathematics 2021. 1. 7. 17:05

    First, the one-dimensional Gaussian distribution is introduced, followed by the multivariate Gaussian distribution.

    One-dimensional Gaussian Distribution

    $N(\mu, \sigma^2) = \frac{1}{\sqrt{2 \pi \sigma^2}} \exp{(-\frac{1}{2 \sigma^2} (x - \mu)^2)}$

    The corresponding mean and variance are:

    $\mu = \frac{1}{N} \sum_i{x_i}$

    $\sigma^2 = \frac{1}{N} \sum_i{(x_i - \mu)^2}$
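
    As a quick sanity check, the density and the estimators above can be evaluated numerically. The sketch below is illustrative only (the function name gaussian_pdf and the sample parameters are not from the original post); it assumes NumPy is available:

        import numpy as np

        def gaussian_pdf(x, mu, sigma2):
            # One-dimensional Gaussian density N(mu, sigma2) evaluated at x
            return np.exp(-0.5 * (x - mu) ** 2 / sigma2) / np.sqrt(2.0 * np.pi * sigma2)

        rng = np.random.default_rng(0)
        samples = rng.normal(loc=1.0, scale=2.0, size=100_000)  # true mu = 1, sigma = 2

        mu_hat = samples.mean()                        # (1/N) * sum_i x_i
        sigma2_hat = ((samples - mu_hat) ** 2).mean()  # (1/N) * sum_i (x_i - mu)^2

        print(mu_hat, sigma2_hat)                      # close to 1.0 and 4.0
        print(gaussian_pdf(1.0, mu_hat, sigma2_hat))   # density near the mean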

    Multivariate Gaussian Distribution

    $ N(\boldsymbol{\mu}, \boldsymbol{\Sigma}) = \frac{1}{(2 \pi)^{d/2}} |\boldsymbol{\Sigma}|^{-1/2} \exp\{ -\frac{1}{2} (\boldsymbol{x}-\boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1} (\boldsymbol{x}-\boldsymbol{\mu}) \} $

    $\boldsymbol{\mu}$ denotes the mean vector $[\mu_1 \: \mu_2 \: \cdots \: \mu_d]^T$, $\boldsymbol{\Sigma}$ denotes the $d \times d$ covariance matrix, $|\boldsymbol{\Sigma}|$ denotes the determinant of $\boldsymbol{\Sigma}$, and $\boldsymbol{x}$ denotes $[x_1 \: x_2 \: \cdots \: x_d]^T$, where each $x_i$ is a value of the random variable $X_i$.
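
    The formula above can be implemented directly. The following sketch (the helper name multivariate_gaussian_pdf is ours; NumPy and SciPy are assumed) evaluates the density and checks it against scipy.stats.multivariate_normal:

        import numpy as np
        from scipy.stats import multivariate_normal

        def multivariate_gaussian_pdf(x, mu, Sigma):
            # Density of N(mu, Sigma) at x; x and mu are length-d 1-D arrays
            d = mu.shape[0]
            diff = x - mu
            norm_const = (2.0 * np.pi) ** (d / 2) * np.sqrt(np.linalg.det(Sigma))
            quad = diff @ np.linalg.solve(Sigma, diff)  # (x - mu)^T Sigma^{-1} (x - mu)
            return np.exp(-0.5 * quad) / norm_const

        mu = np.array([0.0, 1.0])
        Sigma = np.array([[2.0, 0.3],
                          [0.3, 1.0]])
        x = np.array([0.5, 0.5])

        print(multivariate_gaussian_pdf(x, mu, Sigma))
        print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # should agree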

    [Figure: a two-dimensional Gaussian distribution]

    Independent Gaussian Models

    We can derive the equation of the multivariate Gaussian distribution. To keep things simple, we consider the two-dimensional case. If a random variable $X_1$ follows $N(\mu_1, \sigma_1^2)$ and another random variable $X_2$ follows $N(\mu_2, \sigma_2^2)$,

    $$X_1 \sim N(\mu_1, \sigma_1^2)$$

    $$X_2 \sim N(\mu_2, \sigma_2^2)$$

    If $X_1$ and $X_2$ are independent, their joint density factorizes as $p(x_1, x_2) = p(x_1) \, p(x_2)$. Then,

    $$p(x_1, x_2) = N(\mu_1, \sigma_1^2) \, N(\mu_2, \sigma_2^2)$$

    If we write the one-dimensional Gaussian distribution as $N(\mu, \sigma^2) = \frac{1}{Z} \exp{(-\frac{1}{2 \sigma^2} (x - \mu)^2)}$, where $Z = \sqrt{2 \pi \sigma^2}$ is the normalization constant, then

    $$N(\mu_1, \sigma_1^2) N(\mu_2, \sigma_2^2) = \{ \frac{1}{Z_1} \exp{(-\frac{1}{2 \sigma_1^2} (x_1 - \mu_1)^2)} \} \{ \frac{1}{Z_2} \exp{(-\frac{1}{2 \sigma_2^2} (x_2 - \mu_2)^2)} \} $$

    $$ = \frac{1}{Z_1 Z_2} \exp{ \{-\frac{1}{2} (\boldsymbol{x} - \boldsymbol{\mu})^T \boldsymbol{\Sigma}^{-1} (\boldsymbol{x} - \boldsymbol{\mu}) \} } $$

    where $\boldsymbol{x} = [x_1 \: x_2]^T$, $\boldsymbol{\mu} = [\mu_1 \: \mu_2]^T$, and $\boldsymbol{\Sigma} = \mathrm{diag}(\sigma_1^2, \sigma_2^2) = \begin{bmatrix} \sigma_1^2&0 \\ 0&\sigma_2^2 \end{bmatrix}$. Since $Z_1 Z_2 = \sqrt{2 \pi \sigma_1^2} \sqrt{2 \pi \sigma_2^2} = (2 \pi)^{2/2} |\boldsymbol{\Sigma}|^{1/2}$, this product is exactly the multivariate Gaussian density $N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ with $d = 2$.
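
    The factorization is also easy to verify numerically: the product of the two one-dimensional densities matches the two-dimensional Gaussian with diagonal covariance. A small check using SciPy (the specific parameter values below are arbitrary):

        import numpy as np
        from scipy.stats import norm, multivariate_normal

        mu1, sigma1 = 0.0, 1.0
        mu2, sigma2 = 2.0, 0.5
        x1, x2 = 0.3, 1.8

        # Product of the independent one-dimensional densities
        product = norm(mu1, sigma1).pdf(x1) * norm(mu2, sigma2).pdf(x2)

        # Two-dimensional Gaussian with mu = [mu1, mu2], Sigma = diag(sigma1^2, sigma2^2)
        joint = multivariate_normal(mean=[mu1, mu2],
                                    cov=np.diag([sigma1 ** 2, sigma2 ** 2])).pdf([x1, x2])

        print(product, joint)  # the two values agree up to floating-point error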

     

    Source: Multivariate Gaussian distributions, YouTube
