Markov's inequality
Introduction
If \(X\) is a nonnegative random variable and \(a>0\), then the probability that \(X\) is at least \(a\) is at most the expectation of \(X\) divided by \(a\):

\[
P(X \ge a) \le \frac{E[X]}{a}.
\]
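As a quick numerical sanity check of the bound \(P(X \ge a) \le E[X]/a\), here is a minimal Python sketch. It applies Markov's inequality to the empirical distribution of simulated dice rolls, so the inequality holds exactly (the empirical distribution is itself a distribution); the choice of random variable and the thresholds are illustrative assumptions.

```python
import random

# Illustrative check of Markov's inequality on an empirical distribution.
# X = (sum of two fair dice) - 2, a nonnegative random variable with EX = 5.
random.seed(0)
n = 100_000
samples = [random.randint(1, 6) + random.randint(1, 6) - 2 for _ in range(n)]

mean = sum(samples) / n  # empirical expectation of X
for a in (5, 8, 10):
    p = sum(1 for x in samples if x >= a) / n  # empirical P(X >= a)
    bound = mean / a
    # Markov's inequality applied to the empirical distribution:
    assert p <= bound
    print(f"a={a:2d}  P(X>=a)={p:.4f}  E[X]/a={bound:.4f}")
```

The bound is loose for small \(a\) (for \(a \le E[X]\) it exceeds 1 and says nothing), which is exactly why Chebyshev's refinement below is useful.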
In the language of measure theory, Markov's inequality states that if \((X,\Sigma,\mu)\) is a measure space, \(f\) is a measurable extended real-valued function, and \(\epsilon>0\), then

\[
\mu(\{x \in X : |f(x)| \ge \epsilon\}) \le \frac{1}{\epsilon}\int_X |f|\,d\mu.
\]
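The measure-theoretic form can be checked concretely on a finite measure space, where the integral is a weighted sum. The following Python sketch uses an assumed five-point space with a weighted counting measure; the weights, the function values, and the \(\epsilon\) thresholds are all illustrative.

```python
# Illustrative check of the measure-theoretic Markov inequality on a
# finite measure space X = {0,...,4} with a weighted counting measure mu.
X = [0, 1, 2, 3, 4]
mu = {0: 0.5, 1: 1.0, 2: 2.0, 3: 0.25, 4: 1.25}   # assumed weights
f = {0: -3.0, 1: 0.5, 2: 1.5, 3: -0.2, 4: 2.0}     # assumed function values

# Integral of |f| with respect to mu is a weighted sum on a finite space.
integral_abs_f = sum(abs(f[x]) * mu[x] for x in X)

for eps in (0.5, 1.0, 2.0):
    # mu of the superlevel set {x : |f(x)| >= eps}
    measure_of_set = sum(mu[x] for x in X if abs(f[x]) >= eps)
    assert measure_of_set <= integral_abs_f / eps
    print(f"eps={eps}  mu(|f|>=eps)={measure_of_set}  bound={integral_abs_f/eps:.3f}")
```

Taking \(\mu\) to be a probability measure and \(f = X\) a nonnegative random variable recovers the probabilistic statement above.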
This measure-theoretic definition is sometimes referred to as Chebyshev's inequality.
Chebyshev's Inequality
Let \(X\) be any random variable. If we define \(Y=(X-EX)^2\), then \(Y\) is a nonnegative random variable, so we can apply Markov's inequality to \(Y\). In particular, for any positive real number \(b\), we have

\[
P(Y \ge b^2) \le \frac{E[Y]}{b^2} = \frac{E[(X-EX)^2]}{b^2}.
\]
Note that

\[
P(Y \ge b^2) = P\big((X-EX)^2 \ge b^2\big) = P(|X-EX| \ge b),
\]

and that \(E[(X-EX)^2] = \mathrm{Var}(X)\).
Thus, we conclude that

\[
P(|X-EX| \ge b) \le \frac{\mathrm{Var}(X)}{b^2}.
\]
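The same empirical-distribution trick verifies Chebyshev's inequality numerically: applied to a sample's own mean and (biased) variance, the bound holds exactly. In this Python sketch, the Gaussian source distribution and the thresholds \(b\) are illustrative assumptions.

```python
import random

# Illustrative check of Chebyshev's inequality on an empirical distribution:
# P(|X - EX| >= b) <= Var(X) / b^2.
random.seed(1)
n = 100_000
samples = [random.gauss(0.0, 2.0) for _ in range(n)]  # assumed N(0, 4) source

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n  # empirical (biased) variance
for b in (2.0, 4.0, 6.0):
    p = sum(1 for x in samples if abs(x - mean) >= b) / n
    assert p <= var / b ** 2  # Chebyshev on the empirical distribution
    print(f"b={b}  P(|X-EX|>=b)={p:.4f}  Var/b^2={var / b ** 2:.4f}")
```

Unlike Markov's bound, Chebyshev's decays quadratically in \(b\), which is why it is the standard tool for crude concentration around the mean.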
Derivation of Markov's Inequality

For a nonnegative random variable \(X\) and \(a>0\), observe that \(a\,\mathbf{1}_{\{X \ge a\}} \le X\) pointwise: when \(X \ge a\) the left-hand side equals \(a \le X\), and otherwise it equals \(0 \le X\). Taking expectations of both sides gives \(a\,P(X \ge a) \le E[X]\), and dividing by \(a\) yields Markov's inequality.
References
https://en.wikipedia.org/wiki/Markov's_inequality
https://www.probabilitycourse.com/chapter6/6_2_2_markov_chebyshev_inequalities.php