Markov’s Inequality#
Markov’s inequality, at its simplest, deals with the expected value (or first moment) of a distribution. For a non-negative random variable \(X\) with a finite expected value \(\mathbb E[X]\), the probability that it is at least a constant \(c > 0\) is bounded by,
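\[
\Pr(X \ge c) \le \frac{\mathbb E[X]}{c}.
\]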
In terms of the expected value, setting \(c = a\,\mathbb E[X]\) for \(a > 0\),
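\[
\Pr\big(X \ge a\,\mathbb E[X]\big) \le \frac{1}{a}.
\]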
Proof: Let \(I(X \ge c)\) be the indicator that \(X \ge c\). The expected value of an indicator is exactly the probability of its event,
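\[
\mathbb E[I(X \ge c)] = \Pr(X \ge c).
\]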
Consider the function \(f(X) = X/c\), which is always at least \(I(X \ge c)\): if \(X \ge c\) then \(X/c \ge 1\), and otherwise \(X/c \ge 0\) since \(X\) is non-negative. Because expectation is monotone, the larger function has the larger expected value, and by linearity \(\mathbb E[X/c] = \mathbb E[X]/c\),
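\[
\Pr(X \ge c) = \mathbb E[I(X \ge c)] \le \mathbb E\!\left[\frac{X}{c}\right] = \frac{\mathbb E[X]}{c}.
\]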
Thus we’ve proven Markov’s inequality,
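\[
\Pr(X \ge c) \le \frac{\mathbb E[X]}{c}.
\]

As a quick sanity check, here is a minimal simulation sketch (assuming Python with NumPy, which is not part of the original notes) comparing the empirical tail probability of an exponential random variable against the Markov bound.

```python
import numpy as np

# Minimal sketch: empirically check Markov's inequality for a non-negative
# random variable, here X ~ Exponential(1) so that E[X] = 1.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

for c in [1.0, 2.0, 4.0]:
    empirical = np.mean(x >= c)    # estimated Pr(X >= c)
    bound = np.mean(x) / c         # Markov bound E[X] / c
    print(f"c={c}: Pr(X >= c) ~ {empirical:.4f} <= bound {bound:.4f}")
```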
Markov’s Inequality - Monotonically Increasing Function#
To generalize the simple Markov’s inequality, we allow the first moment to be of any monotonically increasing, non-negative function \(\varphi(X)\). Since \(X \ge c\) implies \(\varphi(X) \ge \varphi(c)\), applying Markov’s inequality to \(\varphi(X)\) gives (for \(\varphi(c) > 0\)),
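\[
\Pr(X \ge c) \le \Pr\big(\varphi(X) \ge \varphi(c)\big) \le \frac{\mathbb E[\varphi(X)]}{\varphi(c)}.
\]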
General Moment Markov Inequality#
Raising to the \(k\)-th power, \(x \mapsto x^k\), is also a monotonically increasing function on the non-negative reals, thus we can generalize the moment to any \(k\),
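\[
\Pr(X \ge c) \le \frac{\mathbb E[X^k]}{c^k}.
\]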
This gives a bound on the probability that a random variable deviates from some value.
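As an illustration (a minimal sketch, assuming Python with NumPy, which is not part of the original notes), comparing the \(k = 1\) and \(k = 2\) bounds for the same exponential random variable shows how a higher moment can tighten the tail bound for large \(c\).

```python
import numpy as np

# Minimal sketch: compare the first- and second-moment Markov bounds
# for X ~ Exponential(1), where E[X] = 1 and E[X^2] = 2.
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

c = 5.0
bound_k1 = np.mean(x) / c          # E[X]   / c
bound_k2 = np.mean(x**2) / c**2    # E[X^2] / c^2
empirical = np.mean(x >= c)        # estimated Pr(X >= c)
print(empirical, bound_k1, bound_k2)  # the k = 2 bound is tighter here
```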