Joint Probability
The joint probability distribution describes the probability that multiple random variables simultaneously take particular values. The joint distribution is denoted as,
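\[ P(X=x, Y=y) \]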
Equivalently, the comma can be replaced with an intersection \(\cap\) or a “logical and” \(\land\),
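\[ P(X=x, Y=y) = P(X=x \cap Y=y) = P(X=x \land Y=y) \]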
There are many different ways to write joint events:
Random variables taking single values,
\[ P(X=x, Y=y) \]
Equality and inequality of random variables,
\[ P(g(X) \ge h(Y)) \]
In general, any joint event can be described by the set of all value pairs that make the event true,
\[ \begin{gather} P((X,Y) \in S), \quad S = \{(x_1,y_1), (x_2,y_2), \ldots\} \end{gather} \]
Derived from Conditional Probability
Take two random variables \(X\) and \(Y\). The joint probability for some values \(x, y\) can be determined from its relationship with the conditional probability,
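\[ \begin{gather} P(X=x, Y=y) = P(X=x \mid Y=y)\,P(Y=y) \\ P(X=x, Y=y) = P(Y=y \mid X=x)\,P(X=x) \end{gather} \]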
The equality of the two lines comes from the division rule, or Bayes’ rule:
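\[ P(X=x \mid Y=y) = \frac{P(Y=y \mid X=x)\,P(X=x)}{P(Y=y)} \]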
In general, the joint probability of several random variables is given by the chain rule,
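\[ P(X_1, X_2, \ldots, X_n) = P(X_1)\,P(X_2 \mid X_1)\,P(X_3 \mid X_1, X_2) \cdots P(X_n \mid X_1, \ldots, X_{n-1}) \]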
Independent Joint Probability
Random variables are considered independent if their probability distributions are separable. For instance, for \(X\) and \(Y\),
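\[ P(X=x, Y=y) = P(X=x)\,P(Y=y) \]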
A more compact notation,
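\[ P(X, Y) = P(X)\,P(Y) \]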
Alternatively, two random variables are independent if conditioning on one leaves the probability of the other unchanged,
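\[ P(X=x \mid Y=y) = P(X=x) \]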
Marginal Probability
The marginal probability is obtained by summing the joint probability over every value of the other variable. For instance, given that \(X\) is the random variable of interest (the posterior) and \(Y\) is the variable being summed out (the prior),
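\[ P(X=x) = \sum_{y} P(X=x, Y=y) \]
For a discrete joint distribution stored as a table, marginalization is just a sum over the axis of the variable being removed. A minimal sketch in Python (the `joint` table below is made-up numbers purely for illustration):

```python
import numpy as np

# Hypothetical joint distribution P(X=x, Y=y):
# rows index x in {0, 1}, columns index y in {0, 1, 2}.
joint = np.array([
    [0.10, 0.20, 0.10],
    [0.25, 0.15, 0.20],
])
assert np.isclose(joint.sum(), 1.0)  # a valid joint distribution sums to 1

# Marginal of X: sum the joint over all values of y (columns).
p_x = joint.sum(axis=1)  # [0.40, 0.60]

# Marginal of Y: sum the joint over all values of x (rows).
p_y = joint.sum(axis=0)  # [0.35, 0.35, 0.30]

print(p_x, p_y)
```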
Expectation Value
The expectation value taken over a joint distribution is linear, such that,
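\[ \mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y] \]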
Union Probability
Inclusion-Exclusion Formula
The inclusion-exclusion formula gives the probability of the union of two or more events.
Let’s start with the simplest case, the union of two events, which states:
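\[ P(A \cup B) = P(A) + P(B) - P(A \cap B) \]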
In general,
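\[ P\left(\bigcup_{i=1}^{n} A_i\right) = \sum_{i} P(A_i) - \sum_{i<j} P(A_i \cap A_j) + \sum_{i<j<k} P(A_i \cap A_j \cap A_k) - \cdots + (-1)^{n+1}\,P(A_1 \cap A_2 \cap \cdots \cap A_n) \]
As a quick numerical check, the formula can be verified on a small finite sample space. The sketch below uses a hypothetical sample space of 20 equally likely outcomes and three arbitrary events, comparing the directly computed union probability against the alternating inclusion-exclusion sum:

```python
from itertools import combinations

# Hypothetical sample space: 20 equally likely outcomes; events are sets of outcomes.
omega = set(range(20))
events = [set(range(0, 8)), set(range(5, 12)), set(range(10, 16))]

def prob(s):
    """Probability of an event under the uniform distribution on omega."""
    return len(s) / len(omega)

# Left-hand side: probability of the union, computed directly.
lhs = prob(set.union(*events))

# Right-hand side: inclusion-exclusion, alternating sums over k-wise intersections.
rhs = 0.0
for k in range(1, len(events) + 1):
    sign = (-1) ** (k + 1)
    for combo in combinations(events, k):
        rhs += sign * prob(set.intersection(*combo))

print(lhs, rhs)  # both evaluate to 0.8 for these example events
assert abs(lhs - rhs) < 1e-12
```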
Boole’s Inequality
Boole’s Inequality provides an upper bound for the probability of the union of two or more events. It simply says that the probability of the union of a collection of events is no greater than the sum of their individual probabilities,
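\[ P\left(\bigcup_{i} A_i\right) \le \sum_{i} P(A_i) \]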