# Gaussian or Normal Distribution

The most prevalent distribution appearing in countless fields is the Gaussian or normal distribution.

$$
X \sim \text{Normal}(\mu, \sigma^2)
$$

$$
f(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}}
$$

Standard Normal Distribution : The standard normal distribution is $N(0,1)$, and it motivates why we standardize random variables. For a Gaussian random variable $X$,

$$ Z = \frac{X-\mu_X}{\sigma_X} \iff Z \sim N(0,1)$$

Where $Z$ has the PDF known as the **standard normal distribution**,

$$
\phi(z) = \frac{1}{\sqrt{2\pi}}e^{-\frac{1}{2}z^2}
$$
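As a quick numeric sketch (using only Python's standard library; the seed and sample size are arbitrary choices), standardizing draws from a non-standard Gaussian should produce a sample with mean near 0 and standard deviation near 1:

```python
import random
import statistics

# Illustrative parameters: X ~ Normal(mu = 5, sigma = 2)
random.seed(0)
mu, sigma = 5.0, 2.0
xs = [random.gauss(mu, sigma) for _ in range(100_000)]

# Standardize: Z = (X - mu) / sigma should be approximately N(0, 1)
zs = [(x - mu) / sigma for x in xs]

z_mean = statistics.fmean(zs)   # near 0
z_std = statistics.pstdev(zs)   # near 1
```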

Cumulative Distribution Function : The CDF of the normal distribution is,

$$
F(x) = \int_{-\infty}^{x} \frac{1}{\sigma}\,\phi\!\left(\frac{t-\mu}{\sigma}\right) dt = \Phi\!\left(\frac{x-\mu}{\sigma}\right)
$$

where $\Phi$ denotes the CDF of the standard normal.
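The CDF has no closed form in elementary functions, but it can be evaluated through the error function, $\Phi(z) = \frac{1}{2}\left(1 + \operatorname{erf}\!\left(z/\sqrt{2}\right)\right)$. A minimal sketch using Python's `math.erf`:

```python
import math

def phi_cdf(z: float) -> float:
    # Standard normal CDF: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

p_zero = phi_cdf(0.0)    # 0.5 by symmetry
p_196 = phi_cdf(1.96)    # ~0.975, the familiar two-sided 95% quantile
```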

Expectation : The expected value of the normal distribution is famously μ,

$$
\text{E}(X) = \mu
$$

Variance : The variance of the normal distribution is famously $\sigma^2$,

$$
\text{Var}(X) = \sigma^2
$$

Sum : The sum of multiple independent normal random variables is also normal, with mean and variance

$$
\begin{gather*}
\sum_{k=1}^n X_k \sim \text{Normal}(\mu, \sigma^2)\\
\mu = \sum_{k=1}^n{\mu_k}\\
\sigma^2 = \sum_{k=1}^n{\sigma_k^2}
\end{gather*}
$$
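A Monte Carlo sanity check of this additivity (a stdlib-only sketch with arbitrary parameters): summing independent draws from $N(1, 2^2)$ and $N(-3, 1^2)$ should give a sample with mean near $-2$ and variance near $5$:

```python
import random
import statistics

random.seed(1)
n = 100_000
# X1 ~ Normal(1, 2^2), X2 ~ Normal(-3, 1^2), independent
sums = [random.gauss(1, 2) + random.gauss(-3, 1) for _ in range(n)]

sum_mean = statistics.fmean(sums)      # near 1 + (-3) = -2
sum_var = statistics.pvariance(sums)   # near 2^2 + 1^2 = 5
```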

Independent Joint Probability (Rotational Invariance) : The joint density of two iid Gaussians is rotationally invariant: it remains Gaussian however you rotate the random variable axes. In particular, the sum of two independent Gaussians is again Gaussian, with the means and variances summed,

$$
X + Y \sim \text{N}(\mu_X + \mu_Y,\ \sigma_X^2 + \sigma_Y^2)
$$

MGF : For the standard normal random variable Z,

$$
M_Z(t) = e^{t^2/2}
$$

We can apply a linear transformation to find the MGF of the general normal random variable $X = \sigma Z + \mu$,

$$
M_{\sigma Z + \mu}(t) = e^{\mu t + \sigma^2 t^2/2}
$$

Notably, any distribution whose MGF is the exponential of a degree-2 polynomial in $t$ is a normal distribution.
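A quick empirical check of the MGF formula (a sketch with arbitrary choices of $\mu$, $\sigma$, and $t$): the sample average of $e^{tX}$ should approach $e^{\mu t + \sigma^2 t^2/2}$:

```python
import math
import random
import statistics

random.seed(2)
mu, sigma, t = 0.5, 1.0, 0.3
samples = [random.gauss(mu, sigma) for _ in range(200_000)]

# E[e^{tX}] estimated by the sample mean of e^{tX}
mgf_empirical = statistics.fmean(math.exp(t * x) for x in samples)
mgf_theoretical = math.exp(mu * t + sigma**2 * t**2 / 2)
```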

Characteristic Function : $\tilde{p}(k) = \exp\!\left[ik\mu - \frac{k^2\sigma^2}{2}\right]$

Cumulants and Moments : $\langle x \rangle_c = \mu,\ \langle x^2 \rangle_c = \sigma^2,\ \langle x^n \rangle_c = 0 \text{ for } n \geq 3$; $\langle x \rangle = \mu,\ \langle x^2 \rangle = \sigma^2 + \mu^2,\ \langle x^3 \rangle = 3\sigma^2\mu + \mu^3,\ \ldots$

## Multivariate Normal

Let $X$ have the multivariate normal distribution with mean vector $\mu$ and covariance matrix $\Sigma$, and let $x$ denote a value of $X$,

$$
f(x) = \left([2\pi]^n \det\Sigma\right)^{-1/2} \exp\!\left[-\frac{1}{2}(x-\mu)^\top \Sigma^{-1}(x-\mu)\right]
$$

- $\Sigma$ : The covariance matrix

More compactly, treat $(x-\mu)^\top\Sigma^{-1}(x-\mu)$ as the squared length of the vector $\Sigma^{-1/2}\Delta$, where $\Delta = x - \mu$ is known as the deviation vector. This compact form is given as,

$$
P(\Delta) = \left([2\pi]^n \det\Sigma\right)^{-1/2} \exp\!\left[-\frac{1}{2}\left|\Sigma^{-1/2}\Delta\right|^2\right]
$$
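The density formula translates directly to code; a minimal sketch (assuming numpy, with an arbitrary 2-dimensional example), using `np.linalg.solve` rather than forming $\Sigma^{-1}$ explicitly:

```python
import numpy as np

def mvn_pdf(x, mu, Sigma):
    # Multivariate normal density: ((2*pi)^n det(Sigma))^{-1/2} exp(-quad/2)
    x, mu = np.asarray(x, float), np.asarray(mu, float)
    n = len(mu)
    delta = x - mu
    norm_const = ((2 * np.pi) ** n * np.linalg.det(Sigma)) ** -0.5
    quad = delta @ np.linalg.solve(Sigma, delta)  # (x-mu)^T Sigma^{-1} (x-mu)
    return norm_const * np.exp(-0.5 * quad)

Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
p = mvn_pdf([0.3, -0.2], mu=[0.0, 0.0], Sigma=Sigma)
```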

Multivariate Normal Are Made of Normal Random Variables : All multivariate normal distributions describe a joint distribution of normal random variables. In other words, the marginal distribution of each component of a multivariate normal is normal,

$$
X_k \sim \text{Normal}(\mu_k, \Sigma_{kk})
$$

However, the reverse is not always true. The joint distribution of normal random variables need not be multivariate normal.

Joint Distribution of Linear Combinations of Normal : Linear combinations $AX + b$ of a multivariate normal $X$ are again multivariate normal,

$$
AX + b \sim \text{Normal}(A\mu + b,\ A\Sigma A^\top)
$$

Multivariate Standard Normal Transformation : All multivariate normal can be expressed as a linear transformation of the multivariate standard normal,

$$
X = AZ + b
$$

By inspecting $Z$ as a function of $X$ (i.e., the preimage), we find that

$$
\Sigma = AA^\top
$$

$$
\mu = b
$$
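These identities suggest the standard sampling recipe; a sketch (assuming numpy, with an arbitrary target $\Sigma$ and $b$) that takes $A$ as a Cholesky factor of $\Sigma$, so $AA^\top = \Sigma$ holds by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])  # arbitrary target covariance
b = np.array([1.0, -1.0])                   # target mean

A = np.linalg.cholesky(Sigma)           # A @ A.T == Sigma
Z = rng.standard_normal((2, 100_000))   # columns of iid standard normals
X = A @ Z + b[:, None]                  # X = A Z + b

sample_mean = X.mean(axis=1)   # near b
sample_cov = np.cov(X)         # near Sigma
```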

## Covariance Matrix

The covariance matrix $\Sigma$ is a positive semidefinite (symmetric) matrix.

Independence and Diagonal Covariance Matrix : We have a special case: multivariate normal random variables are independent if and only if they are uncorrelated. Equivalently, the covariance matrix is a diagonal matrix, and the joint density factors,

$$ P(\vec{x}) = \prod{P(X_i)} $$
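This factorization can be checked numerically; a sketch (assuming numpy, with arbitrary diagonal entries) comparing the joint density against the product of the univariate marginals:

```python
import math
import numpy as np

def normal_pdf(x, mu, var):
    # Univariate normal density
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

mu = np.array([1.0, -2.0])
Sigma = np.diag([4.0, 0.25])   # diagonal => components independent
x = np.array([0.5, -1.5])

delta = x - mu
joint = ((2 * np.pi) ** 2 * np.linalg.det(Sigma)) ** -0.5 \
    * np.exp(-0.5 * (delta @ np.linalg.solve(Sigma, delta)))
product = normal_pdf(x[0], mu[0], 4.0) * normal_pdf(x[1], mu[1], 0.25)
# joint and product agree to floating-point precision
```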

## Independence out of Bivariate Normal

The multivariate normal has interesting independence properties that emerge from a collection of dependent normal random variables.

Consider two independent standard normal random variables $X$ and $Z$. The joint density surface is rotationally symmetric: its contours are perfect circles. We may always define another normal random variable dependent on $X$ and/or $Z$ by taking the following linear combination,

$$
Y = \rho X + \sqrt{1-\rho^2}\, Z
$$

such that $\mu_Y = 0$, $\sigma_Y = 1$, $\text{Cov}(X,Y) = \rho$, and $r(X,Y) = \rho$.

Therefore, every bivariate normal distribution can express its normal random variables as sums of two independent normal random variables.

## Relation to Rayleigh Distribution

Given two independent random variables $X, Y$ with standard normal distributions, let

$$
R = \sqrt{X^2 + Y^2}
$$

Then $R$ follows the Rayleigh distribution with scale $\sigma = 1$,

$$
f_R(r) = r\, e^{-\frac{1}{2}r^2}, \quad \text{for } r > 0
$$
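A quick empirical check (a stdlib-only sketch): the Rayleigh(1) CDF is $F(r) = 1 - e^{-r^2/2}$, so the fraction of simulated radii below $r = 1$ should be near $1 - e^{-1/2} \approx 0.3935$:

```python
import math
import random

random.seed(4)
n = 100_000
# R = sqrt(X^2 + Y^2) for iid standard normals X, Y
rs = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

r0 = 1.0
empirical = sum(r <= r0 for r in rs) / n
theoretical = 1 - math.exp(-r0**2 / 2)
```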

## Relation to Chi-Square Distribution

See Chi-Square.