Chapter 9, Problem 66E

The likelihood function \(L\left(y_{1}, y_{2}, \ldots, y_{n} \mid \theta\right)\) takes on different values depending on the arguments \(\left(y_{1}, y_{2}, \ldots, y_{n}\right)\). A method developed by Lehmann and Scheffé for deriving a minimal sufficient statistic uses the ratio of the likelihoods evaluated at two points, \(\left(x_{1}, x_{2}, \ldots, x_{n}\right)\) and \(\left(y_{1}, y_{2}, \ldots, y_{n}\right)\):

                                                                              \(\frac{L\left(x_{1}, x_{2}, \ldots, x_{n} \mid \theta\right)}{L\left(y_{1}, y_{2}, \ldots, y_{n} \mid \theta\right)}\)

Many times it is possible to find a function \(g\left(x_{1}, x_{2}, \ldots, x_{n}\right)\) such that this ratio is free of the unknown parameter \(\theta\) if and only if \(g\left(x_{1}, x_{2}, \ldots, x_{n}\right)=g\left(y_{1}, y_{2}, \ldots, y_{n}\right)\). If such a function \(g\) can be found, then \(g\left(Y_{1}, Y_{2}, \ldots, Y_{n}\right)\) is a minimal sufficient statistic for \(\theta\).
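
Restating the criterion in display form (our paraphrase, not an addition to the method): a function \(g\) qualifies precisely when

                    \(\frac{L\left(x_{1}, x_{2}, \ldots, x_{n} \mid \theta\right)}{L\left(y_{1}, y_{2}, \ldots, y_{n} \mid \theta\right)} \text{ is free of } \theta \iff g\left(x_{1}, x_{2}, \ldots, x_{n}\right)=g\left(y_{1}, y_{2}, \ldots, y_{n}\right).\)

Both directions matter: the ratio must be constant in \(\theta\) whenever the \(g\) values agree, and it must genuinely depend on \(\theta\) whenever they differ.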

a Let \(Y_{1}, Y_{2}, \ldots, Y_{n}\) be a random sample from a Bernoulli distribution (see Example 9.6 and Exercise 9.65) with \(p\) unknown.

i Show that

                                                                                 \(\frac{L\left(x_{1}, x_{2}, \ldots, x_{n} \mid p\right)}{L\left(y_{1}, y_{2}, \ldots, y_{n} \mid p\right)}=\left(\frac{p}{1-p}\right)^{\sum_{i=1}^{n} x_{i}-\sum_{i=1}^{n} y_{i}}\)
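
A sketch of the algebra behind this identity (using the Bernoulli likelihood from Example 9.6, \(L\left(y_{1}, \ldots, y_{n} \mid p\right)=p^{\sum y_{i}}(1-p)^{n-\sum y_{i}}\)):

                    \(\frac{L\left(x_{1}, \ldots, x_{n} \mid p\right)}{L\left(y_{1}, \ldots, y_{n} \mid p\right)}=\frac{p^{\sum x_{i}}(1-p)^{n-\sum x_{i}}}{p^{\sum y_{i}}(1-p)^{n-\sum y_{i}}}=p^{\sum x_{i}-\sum y_{i}}(1-p)^{\sum y_{i}-\sum x_{i}}=\left(\frac{p}{1-p}\right)^{\sum x_{i}-\sum y_{i}}\)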

ii Argue that for this ratio to be independent of \(p\), we must have

                                                                                  \(\sum_{i=1}^{n} x_{i}-\sum_{i=1}^{n} y_{i}=0 \text { or } \sum_{i=1}^{n} x_{i}=\sum_{i=1}^{n} y_{i}\)
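
One way to argue this (our sketch, not the text's worked solution): as \(p\) ranges over \((0,1)\), the base \(p /(1-p)\) takes every value in \((0, \infty)\), so \(\left(\frac{p}{1-p}\right)^{k}\) is constant in \(p\) only when the exponent \(k=\sum_{i=1}^{n} x_{i}-\sum_{i=1}^{n} y_{i}\) is zero.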

iii Using the method of Lehmann and Scheffé, what is a minimal sufficient statistic for \(p\)? How does this sufficient statistic compare to the sufficient statistic derived in Example 9.6 by using the factorization criterion?
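
A sketch of where the criterion leads (our outline, not the text's solution): taking \(g\left(y_{1}, y_{2}, \ldots, y_{n}\right)=\sum_{i=1}^{n} y_{i}\) makes the ratio free of \(p\) exactly when \(g\) agrees at the two points, so \(\sum_{i=1}^{n} Y_{i}\) is minimal sufficient for \(p\); this is the same statistic that the factorization criterion produces in Example 9.6.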

b Consider the Weibull density discussed in Example 9.7.

i Show that

                                                                                 \(\frac{L\left(x_{1}, x_{2}, \ldots, x_{n} \mid \theta\right)}{L\left(y_{1}, y_{2}, \ldots, y_{n} \mid \theta\right)}=\left(\frac{x_{1} x_{2} \cdots x_{n}}{y_{1} y_{2} \cdots y_{n}}\right) \exp \left[-\frac{1}{\theta}\left(\sum_{i=1}^{n} x_{i}^{2}-\sum_{i=1}^{n} y_{i}^{2}\right)\right]\)
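
For the algebra behind this ratio, a sketch assuming the Weibull density of Example 9.7, \(f(y \mid \theta)=\left(\frac{2 y}{\theta}\right) e^{-y^{2} / \theta}\) for \(y>0\): the likelihood of a sample is

                    \(L\left(y_{1}, \ldots, y_{n} \mid \theta\right)=\left(\frac{2}{\theta}\right)^{n}\left(\prod_{i=1}^{n} y_{i}\right) \exp \left(-\frac{1}{\theta} \sum_{i=1}^{n} y_{i}^{2}\right),\)

so the factor \((2 / \theta)^{n}\) cancels in the ratio, leaving exactly the product and exponential terms displayed above.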

                                                   

ii Argue that \(\sum_{i=1}^{n} Y_{i}^{2}\) is a minimal sufficient statistic for \(\theta\).
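
As an illustrative numerical check (not part of the exercise, and again assuming the Example 9.7 density), the ratio in part i stays fixed as \(\theta\) varies whenever \(\sum x_{i}^{2}=\sum y_{i}^{2}\), even though the two samples differ:

    import math

    def likelihood(sample, theta):
        # Likelihood under f(y | theta) = (2y / theta) * exp(-y^2 / theta), y > 0
        # (the Weibull density of Example 9.7).
        prod = 1.0
        for y in sample:
            prod *= 2.0 * y / theta
        return prod * math.exp(-sum(y * y for y in sample) / theta)

    x = [1.0, 2.0, 2.0]             # sum of squares = 9
    y = [0.5, 0.5, math.sqrt(8.5)]  # sum of squares = 9 as well

    # Since sum(x_i^2) == sum(y_i^2), the printed ratio is the same for every theta.
    for theta in (0.5, 1.0, 3.0):
        print(theta, likelihood(x, theta) / likelihood(y, theta))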

