Note 6 for ECON 520 at UA
This 8 page Class Notes was uploaded by an elite notetaker on Friday February 6, 2015. The Class Notes belongs to a course at University of Arizona taught by a professor in Fall. Since its upload, it has received 16 views.
Economics 522A, Spring 2007
Lecture Note 6: Multivariate and Conditional Normal Model

1 Bivariate Normal Model

Suppose we want to develop a statistical model for the CPS data examined in the previous lecture note. To simplify things, let us start by modeling the joint distribution of $(y_i, x_i)$, where $y$ is the log of wage and $x$ is experience. Here is a scatterplot:

[Figure: scatterplot of log wage against experience, 0 to 60 years]

Taking the log of wage has made the wage distribution a little more symmetric, and apart from the low-experience observations the data points look roughly elliptical. So, although the plot doesn't perfectly justify it, we might start by assuming that $(y_i, x_i)$ are jointly multivariate normal:

$$\begin{pmatrix} y_i \\ x_i \end{pmatrix} \overset{i.i.d.}{\sim} N(\mu, \Sigma), \qquad i = 1, \dots, n.$$

Here $n$ is the sample size; in the CPS data, $n = 1289$. The mean vector $\mu$ and variance matrix $\Sigma$ can be written as

$$\mu = \begin{pmatrix} \mu_y \\ \mu_x \end{pmatrix}, \qquad \Sigma = \begin{pmatrix} \sigma_{yy} & \sigma_{xy} \\ \sigma_{xy} & \sigma_{xx} \end{pmatrix},$$

where $\mu_y$ is the marginal mean of $y_i$, $\sigma_{yy}$ is the marginal variance of $y_i$, $\sigma_{xy}$ is the covariance between $y_i$ and $x_i$, etc. Please remind yourself of the properties of the multivariate normal model given in the LN7 Addendum from Econ 520 and Ruud Ch. 10.5.1.

The density of $z_i = (y_i, x_i)'$ is

$$f(z_i) = \det(2\pi\Sigma)^{-1/2} \exp\left( -\tfrac{1}{2} (z_i - \mu)' \Sigma^{-1} (z_i - \mu) \right).$$

So the joint likelihood is

$$f(z_1, \dots, z_n) = \prod_{i=1}^{n} f(z_i) = \prod_{i=1}^{n} \det(2\pi\Sigma)^{-1/2} \exp\left( -\tfrac{1}{2} (z_i - \mu)' \Sigma^{-1} (z_i - \mu) \right).$$

The MLE can be solved analytically to get

$$\hat{\mu} = \begin{pmatrix} \frac{1}{n}\sum_i y_i \\ \frac{1}{n}\sum_i x_i \end{pmatrix}, \qquad \hat{\Sigma} = \begin{pmatrix} \frac{1}{n}\sum_i (y_i - \bar{y})^2 & \frac{1}{n}\sum_i (y_i - \bar{y})(x_i - \bar{x}) \\ \frac{1}{n}\sum_i (y_i - \bar{y})(x_i - \bar{x}) & \frac{1}{n}\sum_i (x_i - \bar{x})^2 \end{pmatrix}.$$

This is also the method of moments estimator, since we are basically just replacing population expectations with sample averages. Using the CPS data we get

$$\hat{\mu} = \begin{pmatrix} 2.34 \\ 18.79 \end{pmatrix}, \qquad \hat{\Sigma} = \begin{pmatrix} 0.34 & 1.32 \\ 1.32 & 135.92 \end{pmatrix}.$$

2 Conditional Mean of $y_i$

Having estimated the joint distribution of $(x_i, y_i)$, we might want to focus on certain aspects of that joint distribution, such as the conditional mean of $y_i$ given $x_i$. Using standard results for the
multivariate normal distribution,

$$E[y_i \mid x_i] = \mu_y + \frac{\sigma_{xy}}{\sigma_{xx}} (x_i - \mu_x), \qquad V[y_i \mid x_i] = \sigma_{yy} - \sigma_{xy}^2 \sigma_{xx}^{-1}.$$

Let

$$\beta_2 = \frac{\sigma_{xy}}{\sigma_{xx}}, \qquad \beta_1 = \mu_y - \frac{\sigma_{xy}}{\sigma_{xx}} \mu_x, \qquad \sigma^2 = \sigma_{yy} - \sigma_{xy}^2 \sigma_{xx}^{-1}.$$

Then we can write

$$y_i \mid x_i \sim N(\beta_1 + \beta_2 x_i, \sigma^2).$$

We see that the conditional mean of $y_i$ given $x_i$ is linear in $x_i$,

$$E[y_i \mid x_i] = \beta_1 + \beta_2 x_i,$$

and the conditional variance is a constant that does not depend on the value of $x_i$:

$$V[y_i \mid x_i] = \sigma^2.$$

Let's focus on the conditional mean parameters $\beta_1$ and $\beta_2$. We can plug in the ML estimates to get corresponding estimates for the $\beta$ parameters:

$$\hat{\beta}_2 = \frac{\hat{\sigma}_{xy}}{\hat{\sigma}_{xx}}, \qquad \hat{\beta}_1 = \hat{\mu}_y - \hat{\beta}_2 \hat{\mu}_x.$$

In our data, $\hat{\beta}_1 = 2.167$ and $\hat{\beta}_2 = 0.0097$. Here is the plot:

[Figure: scatterplot of log wage against experience with fitted lines (line 1, line 2)]

3 OLS

Now consider the OLS coefficients. They solve the following problem:

$$\min_{b_1, b_2} \sum_{i=1}^{n} (y_i - b_1 - b_2 x_i)^2.$$

The first-order conditions for a minimum are

$$-2 \sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) = 0,$$
$$-2 \sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) x_i = 0.$$

So the OLS coefficients $(\hat{\beta}_1, \hat{\beta}_2)$ satisfy what are sometimes called the OLS normal equations:

$$\sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) = 0,$$
$$\sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) x_i = 0.$$

The first equation can be rearranged to get

$$\sum_i y_i - n \hat{\beta}_1 - \hat{\beta}_2 \sum_i x_i = 0 \quad \Longrightarrow \quad \hat{\beta}_1 = \bar{y} - \hat{\beta}_2 \bar{x}.$$

Plug this into the second normal equation:

$$\sum_i (y_i - \bar{y} + \hat{\beta}_2 \bar{x} - \hat{\beta}_2 x_i) x_i = 0 \quad \Longrightarrow \quad \hat{\beta}_2 = \frac{\sum_i x_i (y_i - \bar{y})}{\sum_i x_i (x_i - \bar{x})}.$$

A bit of algebra (using $\sum_i (y_i - \bar{y}) = 0$ and $\sum_i (x_i - \bar{x}) = 0$) shows that this is equal to

$$\hat{\beta}_2 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}.$$

So the OLS coefficients are identical to the MLE estimates.

4 Conditional Modeling

Recall that we started by assuming joint normality for $(y_i, x_i)$. A nice feature of the multivariate normal distribution is that the marginal distribution of $x_i$ is normal, and the conditional distribution of $y_i$ given $x_i$ is normal. We focused on the parameters of the conditional distribution of $y_i$ given $x_i$, and showed that MLE gave the same result as OLS.

Suppose we only make the assumption that $y_i$ is conditionally normally distributed:

$$y_i \mid x_i \sim N(\beta_1 + \beta_2 x_i, \sigma^2). \tag{4}$$

We allow the distribution of $x_i$ to be arbitrary: it could be nonnormal, discrete, even degenerate. In order to be precise about some of our later arguments, we will assume that the above conditional distribution holds conditional on all the $x_i$'s,

$$y_i \mid x_1, \dots, x_n \sim N(\beta_1 + \beta_2 x_i, \sigma^2),$$

and that conditional on
all the $x_i$, the $y_i$ are independent.¹ Then we can write the joint conditional density of the $y$'s given the $x$'s as

$$f(y_1, \dots, y_n \mid x_1, \dots, x_n; \beta_1, \beta_2, \sigma^2) = \prod_{i=1}^{n} f(y_i \mid x_i; \beta_1, \beta_2, \sigma^2) = \prod_{i=1}^{n} (2\pi\sigma^2)^{-1/2} \exp\left( -\frac{1}{2\sigma^2} (y_i - \beta_1 - \beta_2 x_i)^2 \right).$$

¹ Suppose we assume that the $(y_i, x_i)$ are i.i.d. and that the conditional model in (4) holds. Then these further assumptions will hold.

In conditional maximum likelihood, we treat this conditional density as the conditional likelihood and maximize it with respect to the parameters $\beta_1$, $\beta_2$, $\sigma^2$:

$$\max_{\beta_1, \beta_2, \sigma^2} \prod_{i=1}^{n} (2\pi\sigma^2)^{-1/2} \exp\left( -\frac{1}{2\sigma^2} (y_i - \beta_1 - \beta_2 x_i)^2 \right).$$

Equivalently, maximize the log of the conditional likelihood:

$$-\frac{n}{2} \log(2\pi\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \beta_1 - \beta_2 x_i)^2. \tag{5}$$

The first-order conditions simplify a bit to get

$$\sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) = 0,$$
$$\sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i) x_i = 0,$$
$$-\frac{n}{2\hat{\sigma}^2} + \frac{1}{2\hat{\sigma}^4} \sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i)^2 = 0.$$

Notice that the first two equations above are exactly the same as the normal equations in the OLS problem. The third equation can be rearranged to get

$$\hat{\sigma}^2 = \frac{1}{n} \sum_i (y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i)^2.$$

So the solutions for $\hat{\beta}_1$ and $\hat{\beta}_2$ are the same as in OLS, and to get the estimate for $\sigma^2$ we can form the OLS "residuals"

$$\hat{\varepsilon}_i = y_i - \hat{\beta}_1 - \hat{\beta}_2 x_i$$

and then calculate

$$\hat{\sigma}^2 = \frac{1}{n} \sum_i \hat{\varepsilon}_i^2.$$

This last expression should seem sensible: it is like a sample variance.

5 Some Properties of the OLS Coefficients

Since our model is conditional on the $x$'s, let's see if we can derive some simple properties of the OLS coefficients. First, consider the slope coefficient

$$\hat{\beta}_2 = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sum_i (x_i - \bar{x})^2}$$

and its conditional mean $E[\hat{\beta}_2 \mid x_1, \dots, x_n]$. Useful trick: define

$$\varepsilon_i = y_i - \beta_1 - \beta_2 x_i,$$

so that $y_i = \beta_1 + \beta_2 x_i + \varepsilon_i$ and

$$\varepsilon_i \mid x_1, \dots, x_n \sim N(0, \sigma^2).$$

Note that $\varepsilon_i$ is NOT the same as $\hat{\varepsilon}_i$, which is the residual using the OLS coefficients. Then, using $\sum_i (x_i - \bar{x}) = 0$ and $\sum_i (x_i - \bar{x}) x_i = \sum_i (x_i - \bar{x})^2$,

$$\sum_i (x_i - \bar{x})(y_i - \bar{y}) = \sum_i (x_i - \bar{x}) y_i = \sum_i (x_i - \bar{x})(\beta_1 + \beta_2 x_i + \varepsilon_i) = \beta_2 \sum_i (x_i - \bar{x})^2 + \sum_i (x_i - \bar{x}) \varepsilon_i.$$

So

$$\hat{\beta}_2 = \beta_2 + \frac{\sum_i (x_i - \bar{x}) \varepsilon_i}{\sum_i (x_i - \bar{x})^2},$$

and

$$E[\hat{\beta}_2 \mid x_1, \dots, x_n] = \beta_2 + \frac{\sum_i (x_i - \bar{x}) E[\varepsilon_i \mid x_1, \dots, x_n]}{\sum_i (x_i - \bar{x})^2} = \beta_2.$$

Therefore $\hat{\beta}_2$ is conditionally unbiased. Also, by the law of iterated expectations,

$$E[\hat{\beta}_2] = E\big[ E[\hat{\beta}_2 \mid x_1, \dots, x_n] \big] = \beta_2.$$

By similar arguments we can show that $E[\hat{\beta}_1 \mid x_1, \dots, x_n] = \beta_1$.

6 Conditional vs. Joint Modeling

We are going to work with conditional models for a little while, so it is worth stopping to think
about the general relationship between, say, unconditional MLE and conditional MLE. Return to the joint normal model of Section 1. We decomposed that model into a marginal model for $x_i$,

$$x_i \sim N(\mu_x, \sigma_{xx}),$$

and a conditional model for $y_i$ given $x_i$,

$$y_i \mid x_i \sim N(\beta_1 + \beta_2 x_i, \sigma^2).$$

Note that there is a one-to-one mapping between the original parameters $(\mu_x, \mu_y, \sigma_{xx}, \sigma_{xy}, \sigma_{yy})$ and the parameters $(\mu_x, \sigma_{xx}, \beta_1, \beta_2, \sigma^2)$. Under the reparametrization, we have a set of parameters related to the marginal distribution of $x_i$,

$$\theta_1 = (\mu_x, \sigma_{xx}),$$

and a set of parameters for the conditional distribution,

$$\theta_2 = (\beta_1, \beta_2, \sigma^2).$$

So, generalizing a bit, we have a joint model and a marginal/conditional decomposition:

$$f(x_i, y_i; \theta_1, \theta_2) = f(x_i; \theta_1) \, f(y_i \mid x_i; \theta_2).$$

The joint likelihood can be written

$$f(x_1, y_1, \dots, x_n, y_n; \theta_1, \theta_2) = \prod_{i=1}^{n} f(x_i, y_i; \theta_1, \theta_2) = \prod_{i=1}^{n} f(x_i; \theta_1) \times \prod_{i=1}^{n} f(y_i \mid x_i; \theta_2).$$

The joint MLE solves

$$\max_{(\theta_1, \theta_2) \in \Theta} \; \prod_{i=1}^{n} f(x_i; \theta_1) \times \prod_{i=1}^{n} f(y_i \mid x_i; \theta_2),$$

where $\Theta$ is the joint parameter space for $(\theta_1, \theta_2)$. The conditional MLE solves

$$\max_{\theta_2 \in \Theta_2} \; \prod_{i=1}^{n} f(y_i \mid x_i; \theta_2),$$

where $\Theta_2$ is the parameter space for $\theta_2$. If $\theta_1$ only enters the marginal density of $x_i$, $\theta_2$ only enters the conditional density of $y_i$ given $x_i$, and the joint parameter space is a Cartesian product,

$$\Theta = \Theta_1 \times \Theta_2,$$

then conditional MLE will give the same result for $\theta_2$ as unconditional MLE.
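Numerical illustration 1. The equivalence between the joint-normal plug-in estimates and OLS is easy to check numerically. Below is a minimal sketch (not part of the original notes) using NumPy on simulated data; the parameter values are loosely inspired by the CPS estimates but are purely illustrative, and all variable names are my own. It computes the bivariate-normal MLE from sample moments, forms $\hat\beta_2 = \hat\sigma_{xy}/\hat\sigma_{xx}$ and $\hat\beta_1 = \hat\mu_y - \hat\beta_2 \hat\mu_x$, and confirms they match the OLS formulas.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sample: x = experience, y = log wage.
# These values are illustrative, NOT the actual CPS data.
n = 1289
x = rng.normal(18.79, 11.66, size=n)              # sd roughly sqrt(135.92)
y = 2.167 + 0.0097 * x + rng.normal(0.0, 0.5, size=n)

# MLE of (mu, Sigma): sample means and 1/n (co)variances.
mu_y, mu_x = y.mean(), x.mean()
s_xx = np.mean((x - mu_x) ** 2)
s_xy = np.mean((y - mu_y) * (x - mu_x))

# Plug-in estimates of the conditional-mean parameters (Section 2).
b2_mle = s_xy / s_xx
b1_mle = mu_y - b2_mle * mu_x

# OLS via the normal equations (Section 3).
b2_ols = np.sum((x - mu_x) * (y - mu_y)) / np.sum((x - mu_x) ** 2)
b1_ols = mu_y - b2_ols * mu_x

# Identical up to floating-point rounding.
assert np.isclose(b2_mle, b2_ols) and np.isclose(b1_mle, b1_ols)
```

Because both estimators are the same function of the sample moments, the agreement is exact up to rounding, matching the algebra in Section 3.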
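Numerical illustration 2. The conditional-MLE variance estimator of Section 4 can be sketched the same way: $\hat\sigma^2$ is the mean of the squared OLS residuals (dividing by $n$, not $n-2$), and the residuals satisfy the two normal equations exactly. Again a hypothetical simulation with illustrative parameter values, not the CPS data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative draws from the conditional model y_i | x_i ~ N(b1 + b2*x_i, s2).
n, b1, b2, s2 = 500, 2.0, 0.01, 0.25
x = rng.uniform(0, 60, size=n)
y = b1 + b2 * x + rng.normal(0.0, np.sqrt(s2), size=n)

# OLS coefficients = conditional-MLE coefficients.
b2_hat = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b1_hat = y.mean() - b2_hat * x.mean()

# OLS residuals and the conditional MLE of sigma^2 (divides by n, not n-2).
e_hat = y - b1_hat - b2_hat * x
s2_hat = np.mean(e_hat ** 2)

# The first two first-order conditions (normal equations) hold exactly,
# up to floating-point rounding.
print(np.sum(e_hat))         # ~ 0
print(np.sum(e_hat * x))     # ~ 0
```

The estimate `s2_hat` behaves like a sample variance of the residuals, as the notes observe, and is close to the true value 0.25 in this sketch.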
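Numerical illustration 3. The conditional unbiasedness result of Section 5 can be illustrated by holding the $x$'s fixed and redrawing the $\varepsilon$'s many times: the average of the resulting $\hat\beta_2$'s should be close to the true $\beta_2$. A small Monte Carlo sketch with illustrative values of my own choosing, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fix the x's once; only the errors are redrawn across replications,
# mimicking the conditional-on-x thought experiment.
n, b1, b2, sigma = 200, 2.0, 0.01, 0.5
x = rng.uniform(0, 60, size=n)
sxx = np.sum((x - x.mean()) ** 2)

draws = []
for _ in range(2000):
    eps = rng.normal(0.0, sigma, size=n)   # eps_i | x ~ N(0, sigma^2)
    y = b1 + b2 * x + eps
    draws.append(np.sum((x - x.mean()) * (y - y.mean())) / sxx)

print(np.mean(draws))   # close to the true b2 = 0.01
```

Each draw equals $\beta_2 + \sum_i (x_i - \bar{x})\varepsilon_i / \sum_i (x_i - \bar{x})^2$, so averaging over many error draws recovers $\beta_2$, exactly as the conditional-expectation argument predicts.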