# Random variables
- Formally, a random variable is defined as a measurable map from the sample space of a probability space to the real numbers.
Definition: Random variable
A map from the sample space to the real numbers, $$X: \Omega \rightarrow \Reals.$$ It assigns a number $x = X(\omega)$, called the realization, to every outcome $\omega$. The function $X$ must be measurable, which ensures that the probability of $X^{-1}(B)$ is defined for every Borel set $B$.
- Random variables are denoted by capitals $X$ and their realizations by lowercase $x$.
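A minimal sketch of this definition in Python, assuming a finite toy probability space (two fair coin flips) with the number of heads as the random variable; the names `omega_space`, `mu`, and `num_heads` are illustrative only:
```python
# A minimal sketch of a random variable as a map X: Omega -> R,
# using a finite sample space (two fair coin flips) in place of the general case.
# The sample space, measure, and the function num_heads are all illustrative.

from fractions import Fraction
from itertools import product

# Sample space Omega: all outcomes of two fair coin flips; the measure mu
# assigns probability 1/4 to each outcome, completing the probability space.
omega_space = list(product("HT", repeat=2))
mu = {omega: Fraction(1, 4) for omega in omega_space}

def num_heads(omega):
    # The random variable assigns the number x = X(omega) (here: the number of
    # heads) to every outcome omega; x is the realization.
    return sum(1 for flip in omega if flip == "H")

for omega in omega_space:
    print(omega, "->", num_heads(omega))
```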
# Probability distributions
Definition: Probability distribution
The probability distribution of a random variable $X$ on a probability space with measure $\mu$ is defined as $$P_X(B) = \mu(X^{-1}(B)).$$ It therefore assigns a probability to every Borel set $B$ of the real numbers.
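A sketch of this pushforward construction on the same kind of finite toy example (two fair coin flips, $X$ = number of heads); the names `X` and `P_X` are illustrative:
```python
# Sketch of the pushforward distribution P_X(B) = mu(X^{-1}(B)) on a finite toy
# probability space (two fair coin flips, X = number of heads). B is modelled as
# an ordinary Python set of realizations; all names here are illustrative.

from fractions import Fraction
from itertools import product

omega_space = list(product("HT", repeat=2))
mu = {omega: Fraction(1, 4) for omega in omega_space}

def X(omega):
    return sum(1 for flip in omega if flip == "H")

def P_X(B):
    # mu(X^{-1}(B)): total measure of all outcomes whose realization lies in B.
    return sum(p for omega, p in mu.items() if X(omega) in B)

print(P_X({1}))        # 1/2: exactly one head
print(P_X({0, 1, 2}))  # 1: the full range of X
```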
From the particular Borel sets $(-\infty,x],\, x\in\Reals$ one obtains the cumulative distribution function of $X$.
Definition: Cumulative distribution function The cumulative distribution function $F_X(x)$ gives the probability that the random variable takes a value less than or equal to $x$ $$F_X(x) = \mu(X\leq x)$$
Properties:
- $F_X(x)$ increases monotonically
- $\lim_{x\to-\infty} F_X(x) = 0, \quad \lim_{x\to\infty} F_X(x)=1$
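These two properties can be checked numerically for a concrete case; the sketch below assumes the standard normal CDF from `scipy.stats`:
```python
# Numerical illustration of the two CDF properties, using the standard normal
# CDF from scipy.stats as an assumed concrete example.

import numpy as np
from scipy.stats import norm

x = np.linspace(-10.0, 10.0, 1001)
F = norm.cdf(x)

print(np.all(np.diff(F) >= 0))   # True: F_X increases monotonically
print(F[0], F[-1])               # approximately 0 and 1 (the two limits)
```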
If $F_X(x)$ is differentiable, we obtain the probability density
Definition: Probability density The probability density $p_X(x)$ is defined as $$p_X(x) = \frac{\text{d}F_X(x)}{\text{d}x}$$
Usually a probability distribution is represented by its density. The only difference is that the probability distribution is defined over all Borel sets, whereas the density is a function of the realizations $x$. One obtains the probability distribution for a Borel set $B$ by integrating $$P_X(B) = \int_B \text{d}x\, p_X(x).$$
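A quick numerical sanity check of this relation, assuming a standard normal density and the interval $B=[a,b]$ purely for illustration:
```python
# Sanity check of P_X(B) = integral over B of p_X(x) dx for B = [a, b]:
# the integral of the density should equal F_X(b) - F_X(a). The standard normal
# distribution and the interval below are assumed purely for illustration.

from scipy.stats import norm
from scipy.integrate import quad

a, b = -1.0, 2.0
prob_from_density, _ = quad(norm.pdf, a, b)   # integrate p_X over B = [a, b]
prob_from_cdf = norm.cdf(b) - norm.cdf(a)     # F_X(b) - F_X(a)

print(prob_from_density, prob_from_cdf)       # agree up to integration error
```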
One can also study a collection of random variables on the same probability space, $X=(X_1,X_2,\dots,X_d)$. Two random variables are statistically independent if $$\mu(X_1 \leq x_1, X_2 \leq x_2) = \mu(X_1 \leq x_1)\cdot\mu(X_2 \leq x_2) \quad \forall x_1,x_2.$$ Here $\mu(\cdot\,,\cdot)$ denotes the probability that both events occur (a logical "and").
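A Monte Carlo sketch of this factorization, assuming two independent standard normal variables (the distribution, test point, and sample size are illustrative):
```python
# Monte Carlo sketch of the independence criterion: for independent X1, X2 the
# joint probability mu(X1 <= x1, X2 <= x2) factorizes into the product of the
# marginals. Two independent standard normals are an assumed example.

import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1_samples = rng.standard_normal(n)
x2_samples = rng.standard_normal(n)

x1, x2 = 0.5, -0.3   # an arbitrary point at which to test the factorization
joint = np.mean((x1_samples <= x1) & (x2_samples <= x2))
product = np.mean(x1_samples <= x1) * np.mean(x2_samples <= x2)

print(joint, product)   # close to each other, up to sampling error
```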
# Transformation of random variables
We can transform random variables with suitable transformation functions $$g: \Reals^d \to \Reals^f.$$
For a given random variable $X$, $Y = g(X)$ defines a new $f$-dimensional random variable $Y$.
The new probability distribution is given by $$P_Y(B) = P_X(g^{-1}(B)).$$ The density $p_Y$ is in turn obtained from the $d$-dimensional integral $$p_Y(y)=\int\text{d}^dx\, \delta^{(f)}(y-g(x))\, p_X(x).$$
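A Monte Carlo sketch of the distribution rule $P_Y(B) = P_X(g^{-1}(B))$, assuming $X$ standard normal and $g(x) = e^x$ (so that $Y$ is lognormal); both choices and all names are illustrative:
```python
# Monte Carlo sketch of the transformation rule P_Y(B) = P_X(g^{-1}(B)):
# sample X, apply g, and compare the empirical probability that Y = g(X) falls
# in an interval B with the analytic probability of the transformed distribution.
# Assumed example: X standard normal, g(x) = exp(x), so Y is lognormal (s = 1).

import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(1)
x = rng.standard_normal(500_000)      # samples of X
y = np.exp(x)                         # realizations of Y = g(X)

a, b = 0.5, 2.0                       # the Borel set B = [a, b]
empirical = np.mean((y >= a) & (y <= b))
analytic = lognorm.cdf(b, s=1) - lognorm.cdf(a, s=1)

print(empirical, analytic)            # agree up to Monte Carlo error
```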
Example: Suppose $p_X(x_1, x_2) = p_{X_1}(x_1)\cdot p_{X_2}(x_2)$ and we want to transform to $Y = X_1 + X_2$. With $g(x_1,x_2) = x_1 + x_2$, the integral over the delta function gives $$p_Y(y) = \int \text{d}x_1\, p_{X_1}(x_1)\, p_{X_2}(y-x_1),$$ i.e. the density of the sum is the convolution of the two densities.
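A numerical check of this convolution formula, assuming two independent standard normal densities, for which the exact result is a normal density with mean 0 and variance 2:
```python
# Numerical check of the convolution formula p_Y(y) = int dx1 p_X1(x1) p_X2(y - x1)
# for Y = X1 + X2, assuming two independent standard normal densities; the exact
# result is then a normal density with mean 0 and variance 2.

import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def p_Y(y):
    # Convolution integral over x1.
    value, _ = quad(lambda x1: norm.pdf(x1) * norm.pdf(y - x1), -np.inf, np.inf)
    return value

for y in (-1.0, 0.0, 2.5):
    print(p_Y(y), norm.pdf(y, scale=np.sqrt(2)))   # the two values agree
```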