ENGR APPLIC OF STATISTIC (ESE302)
COURSE: ESE302 | PROF.: T.Smith | TYPE: Class Notes | PAGES: 13

These 13-page class notes were uploaded by Dorris Borer on Monday, September 28, 2015. They belong to ESE302 at the University of Pennsylvania, taught by T.Smith in Fall. Since upload, they have received 16 views. For similar materials see /class/215446/ese302-university-of-pennsylvania in Electrical Engineering at the University of Pennsylvania.

Date Created: 09/28/15
ESE 302 — Tony E. Smith

NOTES ON SIMPLE LINEAR REGRESSION

1. INTRODUCTION

The purpose of these notes is to supplement the mathematical development of linear regression in Devore (2008). This development also draws on the treatment in Johnston (1963) and Larsen and Marx (1986). We begin with the basic least squares estimation problem, and next develop the moments of the estimators. Finally, the fundamental optimality property of these estimators is established in terms of the Gauss-Markov Theorem.

2. LINEAR LEAST SQUARES ESTIMATION

The basic linear model assumes the existence of a linear relationship between two variables, $x$ and $y$, which is disturbed by some random error $\varepsilon$. Hence for each value of $x$, the corresponding $y$ value is a random variable of the form

(2.1)  $Y = \beta_0 + \beta_1 x + \varepsilon$

where $\beta_0$ and $\beta_1$ are designated respectively as the intercept parameter and the slope parameter of the linear function $\beta_0 + \beta_1 x$. If $n$ values $x_i$, $i = 1,\ldots,n$, of $x$ are observed with corresponding errors $\varepsilon_i$, $i = 1,\ldots,n$, then the resulting random variables $Y_i$, $i = 1,\ldots,n$, are given by

(2.2)  $Y_i = \beta_0 + \beta_1 x_i + \varepsilon_i\,, \quad i = 1,\ldots,n$

In this context it is assumed that the random errors $\varepsilon_i$, $i = 1,\ldots,n$, are independently and identically distributed (iid) with mean zero and variance $\sigma^2$, so that

(2.3)  $E(\varepsilon_i) = 0\,, \quad i = 1,\ldots,n$

(2.4)  $\operatorname{var}(\varepsilon_i) = \sigma^2\,, \quad i = 1,\ldots,n$

If the values of $y$ corresponding to the $x_i$, $i = 1,\ldots,n$, are also observed and are denoted by $y_i$, $i = 1,\ldots,n$, then the least squares estimation problem is to find estimates $\hat\beta_0$ and $\hat\beta_1$ of the unknown parameter values $\beta_0$ and $\beta_1$ which minimize the sum of squared residuals [designated as $f(b_0, b_1)$ in Devore, p. 455]:

(2.5)  $S(\beta_0, \beta_1) = \sum_{i=1}^n (y_i - \beta_0 - \beta_1 x_i)^2$

This function is easily seen to be convex and differentiable in $\beta_0$ and $\beta_1$, so that the unique solution $(\hat\beta_0, \hat\beta_1)$ is given by the first-order conditions

(2.6)  $0 = \dfrac{\partial}{\partial \beta_0} S(\hat\beta_0, \hat\beta_1) = -2\sum_i (y_i - \hat\beta_0 - \hat\beta_1 x_i)$

(2.7)  $0 = \dfrac{\partial}{\partial \beta_1} S(\hat\beta_0, \hat\beta_1) = -2\sum_i (y_i - \hat\beta_0 - \hat\beta_1 x_i)\, x_i$

If we let $\bar x = \frac{1}{n}\sum_i x_i$ and $\bar y = \frac{1}{n}\sum_i y_i$, then by (2.6),

(2.8)  $\sum_i (y_i - \hat\beta_0 - \hat\beta_1 x_i) = 0 \;\Rightarrow\; \bar y = \hat\beta_0 + \hat\beta_1 \bar x$

and by (2.7),

(2.9)  $\sum_i (y_i - \hat\beta_0 - \hat\beta_1 x_i)\, x_i = 0$

To simplify (2.9), let the estimated $y$ value corresponding to $x_i$ be defined by

(2.10)  $\hat y_i = \hat\beta_0 + \hat\beta_1 x_i\,, \quad i = 1,\ldots,n$

and rewrite (2.9) as

(2.11)  $\sum_i (y_i - \hat y_i)\, x_i = 0$

Note also from (2.8) that

(2.12)  $\sum_i (y_i - \hat y_i) = \sum_i y_i - \sum_i (\hat\beta_0 + \hat\beta_1 x_i) = n\bar y - n(\hat\beta_0 + \hat\beta_1 \bar x) = 0$

To solve for $\hat\beta_1$, we first observe by subtracting (2.8) from (2.10) that

(2.13)  $y_i - \hat y_i = (y_i - \bar y) - \hat\beta_1 (x_i - \bar x)\,, \quad i = 1,\ldots,n$

Hence, multiplying both sides by $x_i - \bar x$ and summing over $i$, we obtain

(2.14)  $\sum_i (y_i - \hat y_i)(x_i - \bar x) = \sum_i (y_i - \bar y)(x_i - \bar x) - \hat\beta_1 \sum_i (x_i - \bar x)^2$

But since (2.11) and (2.12) imply

(2.15)  $\sum_i (y_i - \hat y_i)(x_i - \bar x) = \sum_i (y_i - \hat y_i)\, x_i - \bar x \sum_i (y_i - \hat y_i) = 0$

we may conclude from (2.14) that

(2.16)  $\hat\beta_1 = \dfrac{\sum_i (y_i - \bar y)(x_i - \bar x)}{\sum_i (x_i - \bar x)^2}$

[See expression (12.2) in Devore, p. 456.] Finally, by employing (2.8), we may solve for $\hat\beta_0$ in terms of $\hat\beta_1$ as

(2.17)  $\hat\beta_0 = \bar y - \hat\beta_1 \bar x$

[See expression (12.3) in Devore, p. 456.]

3. MOMENTS OF THE ESTIMATORS

The estimators in (2.16) and (2.17) depend on the values of the random variables $Y_i$, $i = 1,\ldots,n$, and hence are themselves random variables. In particular, if the sample mean of the $Y_i$'s is denoted by

(3.1)  $\bar Y = \frac{1}{n}\sum_{i=1}^n Y_i$

then it follows at once from (2.16) that $\hat\beta_1$ is a random variable of the form

(3.2)  $\hat\beta_1 = \dfrac{\sum_i (Y_i - \bar Y)(x_i - \bar x)}{\sum_i (x_i - \bar x)^2}$

and similarly, from (2.17), that $\hat\beta_0$ is a random variable of the form

(3.3)  $\hat\beta_0 = \bar Y - \hat\beta_1 \bar x$

To compute the moments of the slope estimator $\hat\beta_1$, it is convenient to simplify expression (3.2) as follows. By breaking (3.2) into two terms,

(3.4)  $\hat\beta_1 = \dfrac{\sum_i Y_i (x_i - \bar x)}{\sum_j (x_j - \bar x)^2} - \bar Y\,\dfrac{\sum_i (x_i - \bar x)}{\sum_j (x_j - \bar x)^2}$

and observing that

(3.5)  $\sum_i (x_i - \bar x) = \sum_i x_i - n\bar x = n\bar x - n\bar x = 0$

we see that the second term vanishes, and hence that the estimator can be written as a linear combination of the $Y_i$'s:

(3.6)  $\hat\beta_1 = \sum_i w_i Y_i$

where the coefficients $w_i$ are of the form

(3.7)  $w_i = \dfrac{x_i - \bar x}{\sum_j (x_j - \bar x)^2}\,, \quad i = 1,\ldots,n$

and hence are non-random (i.e., depend only on the given values of the $x_i$'s). To analyze (3.6), we begin with several observations about the coefficient values in (3.7). First observe from (3.5) that

(3.8)  $\sum_i w_i = 0$

and moreover that

(3.9)  $\sum_i w_i (x_i - \bar x) = \dfrac{\sum_i (x_i - \bar x)^2}{\sum_j (x_j - \bar x)^2} = 1$

which together with (3.8) also implies

(3.10)  $\sum_i w_i x_i = \sum_i w_i (x_i - \bar x) + \bar x \sum_i w_i = 1$

To compute the mean of $\hat\beta_1$, observe from (2.2) and (2.3) that

(3.11)  $E(Y_i) = \beta_0 + \beta_1 x_i + E(\varepsilon_i) = \beta_0 + \beta_1 x_i$

so that by (3.6) together with (3.8) and (3.10),

(3.12)  $E(\hat\beta_1) = \sum_i w_i E(Y_i) = \beta_0 \sum_i w_i + \beta_1 \sum_i w_i x_i = \beta_1$

Thus $\hat\beta_1$ is an unbiased estimator of $\beta_1$. Moreover, since (3.1) and (2.3) imply that

(3.13)  $E(\bar Y) = \beta_0 + \beta_1 \bar x + \frac{1}{n}\sum_i E(\varepsilon_i) = \beta_0 + \beta_1 \bar x$

it follows from (3.3) together with (3.13) that

(3.14)  $E(\hat\beta_0) = E(\bar Y) - \bar x\, E(\hat\beta_1) = \beta_0 + \beta_1 \bar x - \beta_1 \bar x = \beta_0$

and thus that $\hat\beta_0$ is also an unbiased estimator of $\beta_0$. To compute the variance of $\hat\beta_1$, we again observe from (3.6) that

(3.15)  $\hat\beta_1 = \sum_i w_i (\beta_0 + \beta_1 x_i + \varepsilon_i) = \beta_0 \sum_i w_i + \beta_1 \sum_i w_i x_i + \sum_i w_i \varepsilon_i = \text{const} + \sum_i w_i \varepsilon_i$

and hence, from the independence of the $\varepsilon_i$'s, that

(3.16)  $\operatorname{var}(\hat\beta_1) = \operatorname{var}\Bigl(\sum_i w_i \varepsilon_i\Bigr) = \sum_i w_i^2 \operatorname{var}(\varepsilon_i)$

Hence we may conclude from (2.4) and (3.7) that

(3.17)  $\operatorname{var}(\hat\beta_1) = \sigma^2 \sum_i w_i^2 = \sigma^2\, \dfrac{\sum_i (x_i - \bar x)^2}{\bigl[\sum_j (x_j - \bar x)^2\bigr]^2} = \dfrac{\sigma^2}{\sum_i (x_i - \bar x)^2}$

[See expression (12.4) in Devore, p. 470.] Similarly, to determine the variance of $\hat\beta_0$, we observe from the above relations that

(3.18)  $\hat\beta_0 = \bar Y - \hat\beta_1 \bar x = \frac{1}{n}\sum_i Y_i - \bar x \sum_i w_i Y_i = \sum_i \bigl(\tfrac{1}{n} - \bar x w_i\bigr) Y_i = \text{const} + \sum_i \bigl(\tfrac{1}{n} - \bar x w_i\bigr)\varepsilon_i$

and hence that

(3.19)  $\operatorname{var}(\hat\beta_0) = \sum_i \bigl(\tfrac{1}{n} - \bar x w_i\bigr)^2 \operatorname{var}(\varepsilon_i) = \sigma^2 \sum_i \Bigl(\tfrac{1}{n^2} - \tfrac{2\bar x}{n}\, w_i + \bar x^2 w_i^2\Bigr) = \sigma^2 \Bigl(\tfrac{1}{n} + \tfrac{\bar x^2}{\sum_i (x_i - \bar x)^2}\Bigr) = \dfrac{\sigma^2 \sum_i x_i^2}{n \sum_i (x_i - \bar x)^2}$

where the last equality follows from the identity $\sum_i (x_i - \bar x)^2 = \sum_i x_i^2 - n\bar x^2$.

4. GAUSS-MARKOV THEOREM

Finally, we establish the fundamental optimality property of the above estimators. To do so, recall that for an independent random sample $Y_1,\ldots,Y_n$ from a population with mean $\mu = E(Y)$, the sample mean $\bar Y_n$ was shown to be a best linear unbiased (BLU) estimator of $\mu$. This optimality property turns out to be shared by the least-squares estimators $\hat\beta_0, \hat\beta_1$ above. This result, known as the Gauss-Markov Theorem, provides the single strongest justification for linear least-squares estimation, and can be stated as follows.

GAUSS-MARKOV THEOREM. For any linear function $L = a_0 \beta_0 + a_1 \beta_1$ of $\beta_0, \beta_1$, the least squares estimator $\hat L = a_0 \hat\beta_0 + a_1 \hat\beta_1$ has minimum variance among all linear unbiased estimators of $L$.

Proof: We shall prove this assertion only for the linear function with coefficients $a_0 = 0$, $a_1 = 1$, i.e., for the estimator $\hat\beta_1$ of the slope parameter $\beta_1$, which is by far the more important of the two individual parameters. The argument for any linear function of $\beta_0$ and $\beta_1$ is essentially the same. To begin with, observe from (3.6) that $\hat\beta_1$ is indeed a linear estimator, i.e., is a linear function of the random variables $Y_i$, $i = 1,\ldots,n$. Moreover, it was shown in (3.12) that $\hat\beta_1$ is also an unbiased estimator of $\beta_1$. Hence it remains only to show that the variance of $\hat\beta_1$ never exceeds that of any other linear unbiased estimator. To do so, consider any other linear estimator, say

(4.1)  $\tilde\beta_1 = \sum_i c_i Y_i$

and suppose that $\tilde\beta_1$ is also an unbiased estimator. Then by (3.11) we must have

(4.2)  $\beta_1 = E(\tilde\beta_1) = \sum_i c_i E(Y_i) = \sum_i c_i (\beta_0 + \beta_1 x_i) = \beta_0 \sum_i c_i + \beta_1 \sum_i c_i x_i$

But since unbiasedness requires that (4.2) hold for all values of the unknown parameters $\beta_0$ and $\beta_1$, it follows by setting $\beta_0 = 1$ and $\beta_1 = 0$ that

(4.3)  $\sum_i c_i = 0$

and, in turn, by setting $\beta_1 = 1$ that

(4.4)  $\sum_i c_i x_i = 1$

Hence, in a manner identical with (3.15), these two conditions are seen to imply that

(4.5)  $\tilde\beta_1 = \sum_i c_i Y_i = \beta_1 + \sum_i c_i \varepsilon_i$

and thus that the variance of $\tilde\beta_1$ is given by

(4.6)  $\operatorname{var}(\tilde\beta_1) = \sum_i c_i^2 \operatorname{var}(\varepsilon_i) = \sigma^2 \sum_i c_i^2$

To compare this with $\operatorname{var}(\hat\beta_1)$, observe first that if the differences between the coefficients of $\tilde\beta_1$ and $\hat\beta_1$ in (4.1) and (3.6) are denoted by $d_i = c_i - w_i$, $i = 1,\ldots,n$, then (4.6) can be rewritten as

(4.7)  $\operatorname{var}(\tilde\beta_1) = \sigma^2 \sum_i (w_i + d_i)^2 = \sigma^2 \Bigl[\sum_i w_i^2 + 2\sum_i d_i w_i + \sum_i d_i^2\Bigr]$

But by (4.3) and (4.4) together with (3.8) and (3.10), we must have

(4.8)  $0 = \sum_i c_i = \sum_i (w_i + d_i) = \sum_i w_i + \sum_i d_i = \sum_i d_i$

(4.9)  $1 = \sum_i c_i x_i = \sum_i (w_i + d_i) x_i = \sum_i w_i x_i + \sum_i d_i x_i = 1 + \sum_i d_i x_i \;\Rightarrow\; \sum_i d_i x_i = 0$

which together imply that

(4.10)  $\sum_i d_i w_i = \dfrac{\sum_i d_i (x_i - \bar x)}{\sum_j (x_j - \bar x)^2} = \dfrac{\sum_i d_i x_i - \bar x \sum_i d_i}{\sum_j (x_j - \bar x)^2} = 0$

Hence, recalling (3.7), we see that (4.7) reduces to

(4.11)  $\operatorname{var}(\tilde\beta_1) = \sigma^2 \sum_i w_i^2 + \sigma^2 \sum_i d_i^2 = \operatorname{var}(\hat\beta_1) + \sigma^2 \sum_i d_i^2$

and we may conclude from the nonnegativity of $\sigma^2 \sum_i d_i^2$ that

(4.12)  $\operatorname{var}(\tilde\beta_1) \ge \operatorname{var}(\hat\beta_1)$

Thus $\hat\beta_1$ has minimum variance among all linear unbiased estimators, and the result is established.

5. REFERENCES

Devore, J. L. (2008) Probability and Statistics for Engineering and the Sciences, Seventh Edition. Duxbury Press, Belmont, California.

Larsen, R. and M. L. Marx (1986) An Introduction to Mathematical Statistics and its Applications, Second Edition. Prentice-Hall, Englewood Cliffs, NJ.

Johnston, J. (1963) Econometric Methods. McGraw-Hill, NY.


SYSTEMS 302 - LECTURE 23

- PARTIAL RESIDUAL PLOTS
- HETEROSCEDASTICITY
  - Variance-Stabilizing Transformations
  - Weighted Least Squares
- AUTOCORRELATION PROBLEM
  - Durbin-Watson Test
  - Two-Stage Regression Approach
  - Auxiliary-Variable Approach
- For next time: Regression Notes

PARTIAL RESIDUAL PLOTS

Given the multivariate linear model

$y_i = \beta_0 + \sum_{j=1}^k \beta_j x_{ij} + \varepsilon_i\,, \quad i = 1,\ldots,n\,, \qquad \varepsilon_i \sim N(0, \sigma^2)$ iid

STEP 1: Do a multiple regression and plot the residuals. If there appears to be some pattern of nonlinearity, save the residuals $\hat\varepsilon_i$ and save the Prediction Formula.

STEP 2: Plot the residuals against each variable $x_j$, $j = 1,\ldots,k$.

STEP 3: If there is an observable nonlinear relation for a given variable $x_j$, then construct the partial-residual variable $\text{res}_j$ given by

$\text{res}_{ij} = \hat\varepsilon_i + \hat\beta_j x_{ij}\,, \quad i = 1,\ldots,n$

STEP 4: Use Fit Y by X to plot $\text{res}_j$ against $x_j$, and use the Fit options to find a reasonable polynomial or logarithmic fit to this relation.

STEP 5: Add a variable of this form to the multiple regression. For example, if Step 4 yields a good quadratic fit to $\text{res}_j$ for $x_j$, then add the variable $x_j^2$. Similarly, if a good logarithmic fit is obtained, add $\log(x_j)$.

ENERGY CONSUMPTION PROBLEM

Energy consumption has long been regarded as one of the best indicators of real standard of living. This can be tested by regressing Energy Consumption on Real GNP for a selection of countries.

DATA: A 1965 study of n = 109 countries included data on annual per capita energy consumption $EC_i$, $i = 1,\ldots,n$, and annual per capita GNP, $GNP_i$, $i = 1,\ldots,n$. Note that per capita data is used in order to avoid the obvious size effects of big versus small countries.

ANALYSIS: Consider the two regression models:

$EC_i = \beta_0 + \beta_1\, GNP_i + \varepsilon_i\,, \quad i = 1,\ldots,n$

$\ln(EC_i) = \beta_0 + \beta_1 \ln(GNP_i) + \varepsilon_i\,, \quad i = 1,\ldots,n$

WEIGHTED LEAST SQUARES

If regression results for $Y$ on $X_1,\ldots,X_k$ yield a residual plot showing heteroscedasticity:

1. Save the residuals as RES, make a new column of squared residuals RES^2, and regress each variable $X_j$ against RES^2.
2. If $X_{j_0}$ is the variable with the highest RSquare, and if its P-value is significant, make weights $w = 1/X_{j_0}$.
3. Set $Y^* = wY$, $\text{cons}^* = w$, and $X_j^* = wX_j$, $j = 1,\ldots,k$.
4. Regress $Y^*$ on $\text{cons}^*, X_1^*, \ldots, X_k^*$ using No Intercept.
5. Check the new residual plot for homoscedasticity, and also check for normality using the Normal Quantile plot.
6. If the residuals are OK, then set the new intercept estimate $\hat\beta_0$ equal to the coefficient for $\text{cons}^*$, and set each new slope estimate $\hat\beta_j$ equal to the coefficient for $X_j^*$, $j = 1,\ldots,k$.

SALES FORECASTING PROBLEM

One of the oldest economic laws is that increased income leads to increased expenditures. This time-honored relation can be tested for the US by regressing retail sales against per capita income for a number of years.

DATA: In a study of T = 15 years, data was collected from 1965 to 1980 on per capita retail sales $SALES_t$, $t = 1,\ldots,T$, per capita income $PCI_t$, $t = 1,\ldots,T$, and the unemployment rate $UR_t$, $t = 1,\ldots,T$, in the US.

ANALYSIS: Consider the two regression models:

$SALES_t = \beta_0 + \beta_1\, PCI_t + \varepsilon_t\,, \quad t = 1,\ldots,T$

$SALES_t = \beta_0 + \beta_1\, PCI_t + \beta_2\, UR_t + \varepsilon_t\,, \quad t = 1,\ldots,T$
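The closed-form estimates (2.16) and (2.17) from the notes above are easy to check numerically. A minimal sketch in Python with NumPy (the data values are made up for illustration, not taken from the notes):

```python
import numpy as np

# Illustrative data (made up): observed x values and corresponding y values
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

xbar, ybar = x.mean(), y.mean()

# Slope estimate, expression (2.16)
b1 = np.sum((y - ybar) * (x - xbar)) / np.sum((x - xbar) ** 2)
# Intercept estimate, expression (2.17)
b0 = ybar - b1 * xbar

# The fitted values satisfy the first-order conditions (2.11) and (2.12)
yhat = b0 + b1 * x
assert abs(np.sum(y - yhat)) < 1e-10          # (2.12): residuals sum to zero
assert abs(np.sum((y - yhat) * x)) < 1e-10    # (2.11): residuals orthogonal to x
```

The two assertions are exactly the normal equations (2.6)-(2.7) after dividing out the factor of minus two, which is why any correct least-squares fit must pass them.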
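The moment formulas (3.12), (3.14), (3.17), and (3.19) can also be checked by simulation. A sketch assuming NumPy, with arbitrary illustrative parameter values: repeatedly draw error vectors, recompute the estimates, and compare their sample moments with the theoretical ones.

```python
import numpy as np

# Monte Carlo check of unbiasedness and of the variance formulas.
# The true parameter values below are arbitrary choices for illustration.
rng = np.random.default_rng(0)
b0_true, b1_true, sigma = 2.0, 0.5, 1.0
x = np.linspace(1.0, 10.0, 20)
xbar, Sxx = x.mean(), np.sum((x - x.mean()) ** 2)
n, reps = len(x), 20000

b0_draws, b1_draws = [], []
for _ in range(reps):
    y = b0_true + b1_true * x + rng.normal(0.0, sigma, n)
    b1_hat = np.sum((y - y.mean()) * (x - xbar)) / Sxx   # (2.16)
    b0_hat = y.mean() - b1_hat * xbar                    # (2.17)
    b1_draws.append(b1_hat)
    b0_draws.append(b0_hat)

# Unbiasedness, (3.12) and (3.14): sample means approach the true values
print(np.mean(b0_draws), np.mean(b1_draws))

# Theoretical variances (3.17) and (3.19) vs. Monte Carlo variances
var_b1 = sigma ** 2 / Sxx
var_b0 = sigma ** 2 * np.sum(x ** 2) / (n * Sxx)
print(np.var(b1_draws), var_b1)
print(np.var(b0_draws), var_b0)
```

With 20,000 replications, the sample means and variances typically agree with the formulas to within about one percent, which is the usual Monte Carlo accuracy at this sample size.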
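The variance decomposition (4.11) in the Gauss-Markov proof is constructive, so it can be verified directly for a concrete competitor. A sketch (NumPy; the design points and the "endpoint" slope estimator $(Y_n - Y_1)/(x_n - x_1)$ are illustrative choices, not from the notes):

```python
import numpy as np

# The endpoint estimator is linear and unbiased: its coefficients c_i
# satisfy (4.3) and (4.4). Yet its variance exceeds that of beta1-hat.
x = np.linspace(0.0, 10.0, 11)       # illustrative design points
Sxx = np.sum((x - x.mean()) ** 2)

w = (x - x.mean()) / Sxx             # least-squares coefficients, (3.7)
c = np.zeros_like(x)                 # endpoint-estimator coefficients
c[0], c[-1] = -1.0 / (x[-1] - x[0]), 1.0 / (x[-1] - x[0])

# Both coefficient sets satisfy the unbiasedness conditions (4.3)-(4.4)
assert abs(c.sum()) < 1e-12 and abs((c * x).sum() - 1.0) < 1e-12
assert abs(w.sum()) < 1e-12 and abs((w * x).sum() - 1.0) < 1e-12

# Variances per unit sigma^2, decomposed as in (4.11): sum c^2 = sum w^2 + sum d^2
d = c - w
print(np.sum(c ** 2), np.sum(w ** 2) + np.sum(d ** 2))   # equal, per (4.11)
print(np.sum(c ** 2) >= np.sum(w ** 2))                  # per (4.12)
```

Here $\sum_i c_i^2 = 2/(x_n - x_1)^2 = 0.02$, while $\sum_i w_i^2 = 1/S_{xx} = 1/110$, so the least-squares slope is more than twice as precise as the endpoint estimator on this design.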
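The weighted least squares steps above are phrased for JMP; a rough Python translation of Steps 2-6 is sketched below, assuming NumPy. The data are simulated so that the weight variable is known in advance, standing in for the diagnostic regression of Steps 1-2; the tolerance-free two-step recipe here is an illustration, not the notes' exact procedure.

```python
import numpy as np

# Simulated heteroscedastic data (made up): the error standard deviation
# grows proportionally with x, so the Step 2 weights are w = 1/x.
rng = np.random.default_rng(1)
n = 500
x = rng.uniform(1.0, 10.0, n)
y = 3.0 + 2.0 * x + rng.normal(0.0, 1.0, n) * x

w = 1.0 / x              # Step 2: weights from the offending variable
Y_star = w * y           # Step 3: transformed response Y* = wY
cons_star = w            # Step 3: transformed intercept column cons* = w
X_star = w * x           # Step 3: transformed regressor X* = wX

# Step 4: no-intercept regression of Y* on (cons*, X*)
A = np.column_stack([cons_star, X_star])
coef, *_ = np.linalg.lstsq(A, Y_star, rcond=None)

# Step 6: the coefficient on cons* is the intercept estimate,
# the coefficient on X* is the slope estimate
b0_hat, b1_hat = coef
print(b0_hat, b1_hat)
```

The point of the transformation is that the rescaled errors $w_i \varepsilon_i$ have constant variance, so the ordinary (homoscedastic) least-squares theory of the notes applies to the starred variables.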
