Expectation values and characteristic functions
# Expectation value
Definition: Expectation value
The expectation value $\text{E}(X)$ of a random variable $X$ is defined as $$\text{E}(X) = \int_{-\infty}^{\infty}\text{d}x\, x\, p_X(x).$$ More generally, the expectation value of some function $g(X)$ is defined as $$\text{E}(g(X)) = \int_{-\infty}^{\infty}\text{d}x\, g(x)\, p_X(x).$$
Particularly important are the moments of order $m$: $$\text{E}(X^m) = \int_{-\infty}^{\infty}\text{d}x\, x^m p_X(x).$$
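As a quick numerical check, the sketch below estimates $\text{E}(X)$ and two higher moments by Monte Carlo sampling. The standard normal distribution is an assumption chosen for illustration, since its moments are known in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(200_000)  # samples of X ~ N(0, 1); distribution chosen for illustration

# Monte Carlo estimates: E(g(X)) ~ (1/N) * sum_n g(x_n)
print(np.mean(x))     # E(X)   ~ 0
print(np.mean(x**2))  # E(X^2) ~ 1  (second moment)
print(np.mean(x**4))  # E(X^4) ~ 3  (fourth moment of a standard normal)
```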
# Variance
Definition: Variance
The variance of a random variable $X$ is defined as $$\text{Var}(X) = \text{E}\left[(X-\text{E}(X))^2\right] = \text{E}(X^2) - \text{E}(X)^2.$$ It measures the deviation of the realizations from the expectation value.
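The two expressions above are algebraically equivalent; a minimal sketch, assuming an exponential distribution with mean 2 (so $\text{Var}(X) = 4$), confirms this numerically:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(scale=2.0, size=200_000)  # E(X) = 2, Var(X) = 4 for this choice

var_direct = np.mean((x - np.mean(x))**2)    # E[(X - E(X))^2]
var_moments = np.mean(x**2) - np.mean(x)**2  # E(X^2) - E(X)^2
print(var_direct, var_moments)               # both ~ 4
```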
Definition: Covariance matrix
For a multivariate random variable $X=(X_1, \dots, X_d)$ one defines the covariance matrix $$\text{Cov}(X_i, X_j)=\text{E}[(X_i-\text{E}(X_i))(X_j-\text{E}(X_j))], \quad i, j = 1, \dots, d.$$ The diagonal elements are the variances $\text{Var}(X_i)$; statistical independence leads to vanishing off-diagonal elements.
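A minimal NumPy sketch, assuming a hypothetical setup of two independent standard normals and a third variable $X_3 = X_1 + 0.5\,X_2$ built from them:

```python
import numpy as np

rng = np.random.default_rng(2)
x1 = rng.standard_normal(200_000)
x2 = rng.standard_normal(200_000)  # independent of x1
x3 = x1 + 0.5 * x2                 # linearly dependent on both

# np.cov takes variables as rows and returns the matrix Cov(X_i, X_j)
print(np.cov([x1, x2, x3]))
# off-diagonal Cov(x1, x2) ~ 0 (independence),
# Cov(x1, x3) ~ 1 and Cov(x2, x3) ~ 0.5 (linear dependence)
```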
Definition: Correlation coefficient
The correlation between two random variables can be measured by the correlation coefficient $$\text{Cor}(X_1, X_2) = \frac{\text{Cov}(X_1,X_2)}{\sqrt{\text{Var}(X_1)}\sqrt{\text{Var}(X_2)}}.$$ If $\text{Cor}(X_1, X_2) = 1$, the random variables are linearly dependent: $X_2 = aX_1 + b$ with $a > 0$ (and $a < 0$ for $\text{Cor}(X_1, X_2) = -1$).
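This normalization makes $\text{Cor}$ dimensionless and bounded by $\pm 1$. The sketch below, under the same assumed standard-normal setup, contrasts an exact linear relation with a noisy one:

```python
import numpy as np

rng = np.random.default_rng(3)
x1 = rng.standard_normal(200_000)
x2 = 3.0 * x1 - 1.0                     # exactly linear: X2 = a*X1 + b
x3 = x1 + rng.standard_normal(200_000)  # linear trend plus independent noise

print(np.corrcoef(x1, x2)[0, 1])  # 1.0 up to round-off: perfect linear dependence
print(np.corrcoef(x1, x3)[0, 1])  # ~ 1/sqrt(2) ≈ 0.71: partial correlation
```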
Definition: Characteristic function
The characteristic function $G(k)$ is defined as the Fourier transform of the probability density, $$G(k) = \text{E}(\exp[ikX]) = \int_{-\infty}^{\infty} \text{d}x\, p_X(x) \exp(ikx).$$ The characteristic function uniquely determines the probability density, and vice versa. The $m$-th derivative of the characteristic function corresponds to the $m$-th moment, $$\text{E}(X^m) = \frac{1}{i^m}\frac{\text{d}^m}{\text{d}k^m}G(k)\Big|_{k=0},$$ which is why it is also called the generating function (of the moments).
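As a sketch under the same standard-normal assumption (for which $G(k) = \exp(-k^2/2)$ is known exactly), one can estimate $G(k)$ by Monte Carlo and recover the second moment from a finite-difference second derivative at $k = 0$:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(200_000)  # X ~ N(0, 1), for which G(k) = exp(-k**2 / 2)

def G(k):
    """Monte Carlo estimate of the characteristic function E(exp(ikX))."""
    return np.mean(np.exp(1j * k * x))

k = 1.0
print(G(k), np.exp(-k**2 / 2))  # estimate vs. exact value

# E(X^2) = (1/i^2) * d^2 G / dk^2 at k = 0, via a central finite difference
h = 1e-3
m2 = (G(h) - 2 * G(0.0) + G(-h)) / h**2 / 1j**2
print(m2.real)  # ~ 1, the second moment of N(0, 1)
```

Using the same samples for $G(h)$, $G(0)$, and $G(-h)$ keeps the finite difference stable, since the Monte Carlo noise cancels between the three estimates.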