Chapter 2, Problem 33SE
QUESTION:

This exercise will lead you through a proof of Chebyshev's inequality. Let \(X\) be a continuous random variable with probability density function \(f(x)\). Suppose that \(P(X<0)=0\), so \(f(x)=0\) for \(x \leq 0\).

a. Show that \(\mu_{X}=\int_{0}^{\infty} x f(x) d x\).

b. Let \(k>0\) be a constant. Show that \(\mu_{X} \geq \int_{k}^{\infty} k f(x) d x=k P(X \geq k)\).

c. Use part (b) to show that \(P(X \geq k) \leq \mu_{X} / k\). This is called Markov's inequality. It is true for discrete as well as for continuous random variables.

d. Let \(Y\) be any random variable with mean \(\mu_{Y}\) and variance \(\sigma_{Y}^{2}\). Let \(X=\left(Y-\mu_{Y}\right)^{2}\). Show that \(\mu_{X}=\sigma_{Y}^{2}\).

e. Let \(k>0\) be a constant. Show that \(P\left(\left|Y-\mu_{Y}\right| \geq k \sigma_{Y}\right)=P\left(X \geq k^{2} \sigma_{Y}^{2}\right)\).

f. Use part (e) along with Markov's inequality to prove Chebyshev's inequality: \(P\left(\left|Y-\mu_{Y}\right| \geq k \sigma_{Y}\right) \leq 1 / k^{2}\).
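Before working through the proof, it can help to see the inequality hold numerically. The sketch below is an illustration only, not part of the exercise: it draws samples from an exponential distribution with scale 2 (an arbitrary choice, for which \(\mu_{Y}=\sigma_{Y}=2\)) and compares the empirical tail probability with the \(1/k^{2}\) bound.

```python
import numpy as np

# Numerical sanity check of Chebyshev's inequality (illustrative only; the
# exercise asks for a proof, not a simulation). The Exponential(scale=2)
# distribution is an arbitrary choice with mean 2 and standard deviation 2.
rng = np.random.default_rng(0)
y = rng.exponential(scale=2.0, size=1_000_000)
mu, sigma = y.mean(), y.std()

for k in (1.5, 2.0, 3.0):
    tail = np.mean(np.abs(y - mu) >= k * sigma)  # P(|Y - mu_Y| >= k sigma_Y)
    print(f"k={k}: empirical tail = {tail:.4f}, Chebyshev bound = {1/k**2:.4f}")
```

For each \(k\) the empirical tail probability should come out well below \(1/k^{2}\); Chebyshev's bound is valid for any distribution with finite variance, which is exactly what parts (a)-(f) establish.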



ANSWER:

Step 1 of 6

a) Let \(X\) be a continuous random variable with probability density function \(f(x)\), where \(f(x)=0\) for \(x \leq 0\).

Here we have to show that \(\mu_{X}=\int_{0}^{\infty} x f(x) d x\).

By the definition of the mean of a continuous random variable,

\(\mu_{X}=E(X)=\int_{-\infty}^{\infty} x f(x) d x\)

\(=\int_{-\infty}^{0} x f(x) d x+\int_{0}^{\infty} x f(x) d x\)

\(=0+\int_{0}^{\infty} x f(x) d x\)

since \(f(x)=0\) for \(x \leq 0\), so the integral over \((-\infty, 0]\) vanishes.

Hence \(\mu_{X}=\int_{0}^{\infty} x f(x) d x\), as required.
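The identity in part (a) can also be checked numerically for a concrete nonnegative density. The choice below is an assumption for illustration: \(f(x)=e^{-x}\) for \(x \geq 0\) (the Exponential(1) density), whose mean is known to be 1. The integral \(\int_{0}^{\infty} x f(x) d x\) is approximated by the trapezoid rule on a truncated grid.

```python
import numpy as np

# Numerical illustration of part (a), not part of the proof. Assumed density:
# f(x) = e^{-x} for x >= 0 (Exponential(1)), with known mean 1. We evaluate
# integral_0^infinity x f(x) dx by the trapezoid rule, truncating at x = 50
# (the tail beyond 50 contributes a negligible amount).
x = np.linspace(0.0, 50.0, 500_001)   # fine grid; [0, 50] stands in for [0, inf)
g = x * np.exp(-x)                    # integrand x * f(x)
mu_x = np.sum((g[:-1] + g[1:]) / 2.0 * np.diff(x))
print(mu_x)                           # close to 1.0, the known mean
```

The trapezoid sum agrees with the known mean to within the truncation and discretization error, consistent with \(\mu_{X}=\int_{0}^{\infty} x f(x) d x\).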

