This exercise will lead you through a proof of Chebyshev's inequality. Let X be a continuous random variable with probability density function f(x). Suppose that P(X<0) = 0, so f(x) = 0 for x ≤ 0.

a. Show that μ_X = ∫₀^∞ x f(x) dx.

b. Let k > 0 be a constant. Show that μ_X ≥ k P(X ≥ k).

c. Use part (b) to show that P(X ≥ k) ≤ μ_X/k. This is called Markov's inequality. It is true for discrete as well as for continuous random variables.
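Markov's inequality can be checked numerically. The sketch below (not part of the original exercise) assumes X follows an Exponential(1) distribution, for which μ_X = 1 and the tail probability P(X ≥ k) = e^(−k) is known exactly.

```python
import math

# Assumed example distribution: X ~ Exponential(1), so mu_X = 1
# and P(X >= k) = exp(-k) exactly.
mu_X = 1.0
for k in [0.5, 1.0, 2.0, 5.0]:
    tail = math.exp(-k)   # exact P(X >= k)
    bound = mu_X / k      # Markov's bound mu_X / k
    print(f"k={k}: P(X>=k)={tail:.4f} <= mu/k={bound:.4f}: {tail <= bound}")
```

For every k the exact tail stays below the bound, though the bound is loose (for k = 5 it gives 0.2 while the true tail is about 0.0067).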

d. Let Y be any random variable with mean μ_Y and variance σ_Y². Let X = (Y − μ_Y)². Show that μ_X = σ_Y².

e. Let k > 0 be a constant. Show that P(|Y − μ_Y| ≥ kσ_Y) = P(X ≥ k²σ_Y²).

f. Use part (e) along with Markov's inequality to prove Chebyshev's inequality: P(|Y − μ_Y| ≥ kσ_Y) ≤ 1/k².
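Chebyshev's inequality can also be illustrated numerically. This sketch (an assumption, not part of the exercise) takes Y to be standard normal, so μ_Y = 0, σ_Y = 1, and the exact two-sided tail P(|Y| ≥ k) equals erfc(k/√2).

```python
import math

# Assumed example: Y ~ Normal(0, 1), so sigma_Y = 1 and
# P(|Y - mu_Y| >= k*sigma_Y) = erfc(k / sqrt(2)) exactly.
for k in [1.5, 2.0, 3.0]:
    tail = math.erfc(k / math.sqrt(2))  # exact two-sided tail
    bound = 1.0 / k**2                  # Chebyshev's bound 1/k^2
    print(f"k={k}: P(|Y|>=k)={tail:.4f} <= 1/k^2={bound:.4f}: {tail <= bound}")
```

Chebyshev's bound holds for any distribution with finite variance, which is why it is much looser here than the exact normal tail (for k = 2, the bound is 0.25 while the true probability is about 0.046).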

Answer

Step 1 of 6

a) Let X be a continuous random variable with probability density function f(x), where f(x) = 0 for x ≤ 0.

Here we have to show that μ_X = ∫₀^∞ x f(x) dx.

By the definition of the mean of a continuous random variable,

μ_X = E(X) = ∫_{−∞}^{∞} x f(x) dx

= ∫_{−∞}^{0} x f(x) dx + ∫₀^∞ x f(x) dx

= 0 + ∫₀^∞ x f(x) dx     (since f(x) = 0 for x ≤ 0)

= ∫₀^∞ x f(x) dx.

Hence it is proven that μ_X = ∫₀^∞ x f(x) dx.
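As a sanity check on part (a), the integral ∫₀^∞ x f(x) dx can be approximated numerically. The sketch below assumes X ~ Exponential(1), i.e. f(x) = e^(−x) for x > 0 and f(x) = 0 otherwise, so the result should be close to μ_X = 1.

```python
import math

# Assumed density: f(x) = exp(-x) for x > 0, f(x) = 0 otherwise.
# Left Riemann sum of x*f(x) over [0, 50]; the tail beyond 50 is negligible.
dx = 0.001
mu = sum(x * math.exp(-x) * dx for x in (i * dx for i in range(1, 50000)))
print(round(mu, 3))  # close to 1.0, the mean of Exponential(1)
```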