Chapter 9, Problem 27
We wish to estimate an unknown parameter θ, based on an r.v. X we will get to observe. As in the Bayesian perspective, assume that X and θ have a joint distribution. Let θ̂ be the estimator (which is a function of X). Then θ̂ is said to be unbiased if E(θ̂ | θ) = θ, and θ̂ is said to be the Bayes procedure if E(θ | X) = θ̂.

(a) Let θ̂ be unbiased. Find E(θ̂ − θ)² (the average squared difference between the estimator and the true value of θ), in terms of marginal moments of θ̂ and θ. Hint: Condition on θ.

(b) Repeat (a), except in this part suppose that θ̂ is the Bayes procedure rather than assuming that it is unbiased. Hint: Condition on X.

(c) Show that it is impossible for θ̂ to be both the Bayes procedure and unbiased, except in silly problems where we get to know θ perfectly by observing X. Hint: If Y is a nonnegative r.v. with mean 0, then P(Y = 0) = 1.
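The identities asked for in (a) and (b) turn out to be E(θ̂ − θ)² = E(θ̂²) − E(θ²) for an unbiased estimator, and E(θ̂ − θ)² = E(θ²) − E(θ̂²) for the Bayes procedure. As a sanity check (not part of the problem statement), here is a Monte Carlo sketch in a conjugate normal-normal model chosen for illustration: θ ~ N(0, 1) and X | θ ~ N(θ, 1), where θ̂ = X is unbiased and the Bayes procedure is θ̂ = E(θ | X) = X/2.

```python
import random

random.seed(0)
n = 200_000

# Hypothetical model (my choice, not from the problem):
# theta ~ N(0, 1), and X | theta ~ N(theta, 1).
theta = [random.gauss(0, 1) for _ in range(n)]
X = [t + random.gauss(0, 1) for t in theta]

def mean(xs):
    return sum(xs) / len(xs)

# (a) Unbiased estimator theta_hat = X, since E(X | theta) = theta.
# Claimed identity: E[(theta_hat - theta)^2] = E[theta_hat^2] - E[theta^2].
lhs_a = mean([(x - t) ** 2 for x, t in zip(X, theta)])
rhs_a = mean([x * x for x in X]) - mean([t * t for t in theta])

# (b) Bayes procedure theta_hat = E(theta | X) = X / 2 in this model.
# Claimed identity: E[(theta_hat - theta)^2] = E[theta^2] - E[theta_hat^2].
lhs_b = mean([(x / 2 - t) ** 2 for x, t in zip(X, theta)])
rhs_b = mean([t * t for t in theta]) - mean([(x / 2) ** 2 for x in X])

print(lhs_a, rhs_a)  # both near 1.0 in this model
print(lhs_b, rhs_b)  # both near 0.5 in this model
```

Note how the two identities have opposite signs: for an unbiased estimator the estimator's second moment is the larger one, while for the Bayes procedure it is the smaller one. Part (c) follows because an estimator satisfying both would force E(θ̂ − θ)² = 0.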