
Chapter, Problem 18


Unbiasedness and consistency in linear regression. In a probabilistic framework for regression, let us assume that $Y_i = \theta_0 + \theta_1 x_i + W_i$, $i = 1, \ldots, n$, where $W_1, \ldots, W_n$ are i.i.d. normal random variables with mean zero and variance $\sigma^2$. Then, given the $x_i$ and the realized values $y_i$ of $Y_i$, $i = 1, \ldots, n$, the ML estimates of $\theta_0$ and $\theta_1$ are given by the linear regression formulas discussed in the text.

(a) Show that the ML estimators $\hat\Theta_0$ and $\hat\Theta_1$ are unbiased.

(b) Show that the variances of the estimators $\hat\Theta_0$ and $\hat\Theta_1$ are
$$\mathrm{var}(\hat\Theta_0) = \frac{\sigma^2 \sum_{i=1}^n x_i^2}{n \sum_{i=1}^n (x_i - \bar{x})^2}, \qquad \mathrm{var}(\hat\Theta_1) = \frac{\sigma^2}{\sum_{i=1}^n (x_i - \bar{x})^2},$$
respectively, and their covariance is
$$\mathrm{cov}(\hat\Theta_0, \hat\Theta_1) = -\frac{\sigma^2 \bar{x}}{\sum_{i=1}^n (x_i - \bar{x})^2}.$$

(c) Show that if $\sum_{i=1}^n (x_i - \bar{x})^2 \to \infty$ and $\bar{x}^2$ is bounded by a constant as $n \to \infty$, we have $\mathrm{var}(\hat\Theta_0) \to 0$ and $\mathrm{var}(\hat\Theta_1) \to 0$. (This, together with Chebyshev's inequality, implies that the estimators $\hat\Theta_0$ and $\hat\Theta_1$ are consistent.)

Note: Although the assumption that the $W_i$ are normal is needed for our estimators to be ML estimators, the argument below shows that these estimators remain unbiased and consistent without this assumption.

Solution. (a) Let the true values of $\theta_0$ and $\theta_1$ be $\theta_0^*$ and $\theta_1^*$, respectively. We have
$$\hat\Theta_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})(Y_i - \bar{Y})}{\sum_{i=1}^n (x_i - \bar{x})^2},$$
where $\bar{Y} = \big(\sum_{i=1}^n Y_i\big)/n$, and where we treat $x_1, \ldots, x_n$ as constants. Denoting $\bar{W} = \big(\sum_{i=1}^n W_i\big)/n$, we have $\bar{Y} = \theta_0^* + \theta_1^* \bar{x} + \bar{W}$, and
$$Y_i - \bar{Y} = \theta_1^* (x_i - \bar{x}) + W_i - \bar{W}.$$
Thus,
$$\hat\Theta_1 = \frac{\sum_{i=1}^n (x_i - \bar{x})\big(\theta_1^* (x_i - \bar{x}) + W_i - \bar{W}\big)}{\sum_{i=1}^n (x_i - \bar{x})^2} = \theta_1^* + \frac{\sum_{i=1}^n (x_i - \bar{x}) W_i}{\sum_{i=1}^n (x_i - \bar{x})^2},$$
where we have used the fact $\sum_{i=1}^n (x_i - \bar{x}) = 0$. Since $E[W_i] = 0$, it follows that
$$E[\hat\Theta_1] = \theta_1^*.$$
Also,
$$\hat\Theta_0 = \bar{Y} - \hat\Theta_1 \bar{x} = \theta_0^* + \theta_1^* \bar{x} + \bar{W} - \hat\Theta_1 \bar{x},$$
and using the facts $E[\hat\Theta_1] = \theta_1^*$ and $E[\bar{W}] = 0$, we obtain
$$E[\hat\Theta_0] = \theta_0^*.$$
Thus, the estimators $\hat\Theta_0$ and $\hat\Theta_1$ are unbiased.

(b) We now calculate the variances of the estimators. Using the formula for $\hat\Theta_1$ derived in part (a) and the independence of the $W_i$, we have
$$\mathrm{var}(\hat\Theta_1) = \frac{\sum_{i=1}^n (x_i - \bar{x})^2 \,\mathrm{var}(W_i)}{\Big(\sum_{i=1}^n (x_i - \bar{x})^2\Big)^2} = \frac{\sigma^2}{\sum_{i=1}^n (x_i - \bar{x})^2}.$$
Similarly, using the formula for $\hat\Theta_0$ derived in part (a),
$$\mathrm{var}(\hat\Theta_0) = \mathrm{var}(\bar{W} - \hat\Theta_1 \bar{x}) = \mathrm{var}(\bar{W}) + \bar{x}^2 \,\mathrm{var}(\hat\Theta_1) - 2 \bar{x} \,\mathrm{cov}(\bar{W}, \hat\Theta_1).$$
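As a sanity check on the unbiasedness shown in part (a), here is a short Monte Carlo sketch. The true parameter values, the design points $x_i$, the noise level, and the trial count are all arbitrary choices for illustration, not values from the text; averaging the estimators over many independent noise realizations should land close to the true parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true parameters and fixed design points (our choice, not from the text).
theta0, theta1, sigma = 2.0, -1.5, 0.8
x = np.linspace(0.0, 5.0, 20)
xbar = x.mean()
Sxx = np.sum((x - xbar) ** 2)

def ml_estimates(y):
    """Linear-regression (ML under normal noise) estimates of theta0, theta1."""
    t1 = np.sum((x - xbar) * (y - y.mean())) / Sxx
    t0 = y.mean() - t1 * xbar
    return t0, t1

# Average the estimators over many independent noise realizations W_1, ..., W_n.
trials = 20000
est = np.array([
    ml_estimates(theta0 + theta1 * x + sigma * rng.standard_normal(x.size))
    for _ in range(trials)
])
print(est.mean(axis=0))  # should be close to (theta0, theta1)
```

Note that $x_1, \ldots, x_n$ stay fixed across trials, matching the derivation's convention of treating them as constants; only the noise $W_i$ is resampled.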
Since $\sum_{i=1}^n (x_i - \bar{x}) = 0$ and $E[\bar{W} W_i] = \sigma^2/n$ for all $i$, we obtain
$$\mathrm{cov}(\bar{W}, \hat\Theta_1) = \frac{E\Big[\bar{W} \sum_{i=1}^n (x_i - \bar{x}) W_i\Big]}{\sum_{i=1}^n (x_i - \bar{x})^2} = \frac{\dfrac{\sigma^2}{n} \sum_{i=1}^n (x_i - \bar{x})}{\sum_{i=1}^n (x_i - \bar{x})^2} = 0.$$
Combining the last three equations, we obtain
$$\mathrm{var}(\hat\Theta_0) = \frac{\sigma^2}{n} + \bar{x}^2 \cdot \frac{\sigma^2}{\sum_{i=1}^n (x_i - \bar{x})^2} = \frac{\sigma^2 \Big(\sum_{i=1}^n (x_i - \bar{x})^2 + n \bar{x}^2\Big)}{n \sum_{i=1}^n (x_i - \bar{x})^2}.$$
By expanding the quadratic forms $(x_i - \bar{x})^2$, we also have
$$\sum_{i=1}^n (x_i - \bar{x})^2 + n \bar{x}^2 = \sum_{i=1}^n x_i^2.$$
By combining the preceding two equations,
$$\mathrm{var}(\hat\Theta_0) = \frac{\sigma^2 \sum_{i=1}^n x_i^2}{n \sum_{i=1}^n (x_i - \bar{x})^2}.$$
We finally calculate the covariance of $\hat\Theta_0$ and $\hat\Theta_1$. We have
$$\mathrm{cov}(\hat\Theta_0, \hat\Theta_1) = \mathrm{cov}(\bar{W} - \hat\Theta_1 \bar{x}, \hat\Theta_1) = -\bar{x} \,\mathrm{var}(\hat\Theta_1) + \mathrm{cov}(\bar{W}, \hat\Theta_1).$$
Since, as shown earlier, $\mathrm{cov}(\bar{W}, \hat\Theta_1) = 0$, we finally obtain
$$\mathrm{cov}(\hat\Theta_0, \hat\Theta_1) = -\frac{\bar{x} \sigma^2}{\sum_{i=1}^n (x_i - \bar{x})^2}.$$

(c) If $\sum_{i=1}^n (x_i - \bar{x})^2 \to \infty$, the expression for $\mathrm{var}(\hat\Theta_1)$ derived in part (b) goes to zero. Then the formula
$$\mathrm{var}(\hat\Theta_0) = \mathrm{var}(\bar{W}) + \bar{x}^2 \,\mathrm{var}(\hat\Theta_1),$$
from part (b), together with the assumption that $\bar{x}^2$ is bounded by a constant (and the fact that $\mathrm{var}(\bar{W}) = \sigma^2/n \to 0$), implies that $\mathrm{var}(\hat\Theta_0) \to 0$.
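The closed-form variances and covariance from part (b) can likewise be checked against simulation. The sketch below (again with arbitrary, illustrative parameters and design points, not values from the text) compares the formulas to the empirical covariance of the estimators across many noise realizations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup (our choice): fixed design points, i.i.d. normal noise.
theta0, theta1, sigma = 2.0, -1.5, 0.8
x = np.linspace(0.0, 5.0, 20)
n, xbar = x.size, x.mean()
Sxx = np.sum((x - xbar) ** 2)

# Closed-form variances and covariance from part (b).
var0 = sigma**2 * np.sum(x**2) / (n * Sxx)
var1 = sigma**2 / Sxx
cov01 = -sigma**2 * xbar / Sxx

# Empirical counterparts: estimate (theta0, theta1) in each trial, then
# take the sample covariance matrix of the estimates across trials.
trials = 50000
W = sigma * rng.standard_normal((trials, n))
Y = theta0 + theta1 * x + W                     # each row is one realization
t1 = (Y - Y.mean(axis=1, keepdims=True)) @ (x - xbar) / Sxx
t0 = Y.mean(axis=1) - t1 * xbar
emp = np.cov(t0, t1)                            # 2x2 sample covariance matrix

print("formula:  ", var0, var1, cov01)
print("empirical:", emp[0, 0], emp[1, 1], emp[0, 1])
```

With $\bar{x} > 0$ here, the covariance comes out negative, as the formula predicts: overestimating the slope pulls the intercept estimate down.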
