Chapter , Problem 114MEE

Weighted Least Squares. Suppose that we are fitting the line \(Y=\beta_{0}+\beta_{1} x+\epsilon\), but the variance of \(Y\) depends on the level of \(x\); that is,

\(V\left(Y_{i} \mid x_{i}\right)=\sigma_{i}^{2}=\frac{\sigma^{2}}{w_{i}} \quad i=1,2, \ldots, n\)

where the \(w_{i}\) are constants, often called weights. Show that for an objective function in which each squared residual is multiplied by the reciprocal of the variance of the corresponding observation, the resulting weighted least squares normal equations are

\(\widehat{\beta}_{0} \sum_{i=1}^{n} w_{i}+\widehat{\beta}_{1} \sum_{i=1}^{n} w_{i} x_{i}=\sum_{i=1}^{n} w_{i} y_{i}\)

\(\widehat{\beta}_{0} \sum_{i=1}^{n} w_{i} x_{i}+\widehat{\beta}_{1} \sum_{i=1}^{n} w_{i} x_{i}^{2}=\sum_{i=1}^{n} w_{i} x_{i} y_{i}\)
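
A brief sketch of the derivation (standard calculus, offered as an outline rather than a reproduction of the book's answer): weighting each squared residual by \(1/\sigma_{i}^{2}=w_{i}/\sigma^{2}\) and dropping the constant factor \(1/\sigma^{2}\) gives the objective

\(S\left(\beta_{0}, \beta_{1}\right)=\sum_{i=1}^{n} w_{i}\left(y_{i}-\beta_{0}-\beta_{1} x_{i}\right)^{2}\)

Its partial derivatives are

\(\frac{\partial S}{\partial \beta_{0}}=-2 \sum_{i=1}^{n} w_{i}\left(y_{i}-\beta_{0}-\beta_{1} x_{i}\right) \quad \text{and} \quad \frac{\partial S}{\partial \beta_{1}}=-2 \sum_{i=1}^{n} w_{i} x_{i}\left(y_{i}-\beta_{0}-\beta_{1} x_{i}\right)\)

Setting both to zero at \(\left(\widehat{\beta}_{0}, \widehat{\beta}_{1}\right)\) and rearranging gives exactly the two normal equations above.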

Find the solution to these normal equations. The solutions are weighted least squares estimators of \(\beta_{0}\) and \(\beta_{1}\).
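
One way to solve the \(2 \times 2\) system (standard algebra, sketched here rather than quoted from the text): solving the first normal equation for \(\widehat{\beta}_{0}\) and substituting into the second gives

\(\widehat{\beta}_{1}=\frac{\left(\sum_{i=1}^{n} w_{i}\right)\left(\sum_{i=1}^{n} w_{i} x_{i} y_{i}\right)-\left(\sum_{i=1}^{n} w_{i} x_{i}\right)\left(\sum_{i=1}^{n} w_{i} y_{i}\right)}{\left(\sum_{i=1}^{n} w_{i}\right)\left(\sum_{i=1}^{n} w_{i} x_{i}^{2}\right)-\left(\sum_{i=1}^{n} w_{i} x_{i}\right)^{2}}\)

\(\widehat{\beta}_{0}=\frac{\sum_{i=1}^{n} w_{i} y_{i}-\widehat{\beta}_{1} \sum_{i=1}^{n} w_{i} x_{i}}{\sum_{i=1}^{n} w_{i}}=\bar{y}_{w}-\widehat{\beta}_{1} \bar{x}_{w}\)

where \(\bar{x}_{w}=\sum_{i=1}^{n} w_{i} x_{i} / \sum_{i=1}^{n} w_{i}\) and \(\bar{y}_{w}=\sum_{i=1}^{n} w_{i} y_{i} / \sum_{i=1}^{n} w_{i}\) are the weighted sample means.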

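As a numerical cross-check, here is a short Python sketch that solves the two normal equations directly. This is an illustration of my own: the helper name wls_fit and the data values are made up, not taken from the textbook.

import numpy as np

def wls_fit(x, y, w):
    """Solve the 2x2 weighted least squares normal equations for
    (beta0_hat, beta1_hat), where Var(Y_i | x_i) = sigma^2 / w_i."""
    x, y, w = np.asarray(x, float), np.asarray(y, float), np.asarray(w, float)
    sw, swx = w.sum(), (w * x).sum()
    swy, swxx, swxy = (w * y).sum(), (w * x * x).sum(), (w * x * y).sum()
    # Normal equations in matrix form:
    #   [ sw   swx  ] [b0]   [ swy  ]
    #   [ swx  swxx ] [b1] = [ swxy ]
    b1 = (sw * swxy - swx * swy) / (sw * swxx - swx ** 2)
    b0 = (swy - b1 * swx) / sw
    return b0, b1

# Illustrative usage with made-up data: a larger w_i means a smaller
# variance, so those points pull the fitted line harder.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1.2, 1.9, 3.2, 3.8, 5.1]
w = [1.0, 1.0, 4.0, 4.0, 1.0]
b0, b1 = wls_fit(x, y, w)
print(f"beta0_hat = {b0:.4f}, beta1_hat = {b1:.4f}")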
