Class Notes for PUBHLTH 744 at UMass
Linear Models with Applications in R
PUBHLTH 744 Handout: Introduction to Linear Models
Instructor: Andrea S. Foulkes
Division of Biostatistics and Epidemiology
UMass School of Public Health and Health Sciences
Fall 2007

Solutions to systems of linear equations

Consider the equation Y = Xβ, where Y is n×1, X is n×p, and β is p×1. For a given X and Y (observed data), does there exist a solution β to this equation?

> If p = n (i.e., X is square) and X is nonsingular, then yes, and the unique solution is β = X⁻¹Y. Note that in this case the number of parameters is equal to the number of subjects, and we could not make inference.

> Suppose p ≤ n and Y ∈ C(X). Then yes, though the solution is not necessarily unique. In this case β = X⁻Y is a solution, since Xβ = XX⁻Y = Y for all Y ∈ C(X) by the definition of a generalized inverse. Consider the following two cases:

> If r(X) = p (X full rank), then the columns of X form a basis for C(X), the coordinates of Y relative to that basis are unique (recall notes Section 2.2), and therefore the solution β is unique.

> Suppose r(X) < p. If β̃ is a solution to Y = Xβ, then β̃ + w, w ∈ N(X), is also a solution. So the set of all solutions to the equation is {β = X⁻Y + (I - X⁻X)z : z ∈ Rᵖ}. Note that X(X'X)⁻X' is the orthogonal projection operator onto C(X), and so I - X(X'X)⁻X' is the orthogonal projection operator onto C(X)⊥ = N(X').

> In general, Y ∉ C(X) and no solution exists. In this case we look for the vector in C(X) that is "closest" to Y and solve the equation with this vector in place of Y. This vector is MY, where M = X(X'X)⁻X' is the orthogonal projection operator onto C(X). Now solve MY = Xβ.

> The general solution for r(X) < p is given by β = X⁺MY + (I - X⁻X)z, and again there are infinitely many solutions. Let the SVD of X be given by X = V₁ΛU₁'. We know the MP (Moore-Penrose) generalized inverse of X is X⁺ = U₁Λ⁻¹V₁'. Therefore

β = X⁺MY = X⁺X(X'X)⁻X'Y = U₁Λ⁻¹V₁' V₁ΛU₁' (U₁Λ⁻²U₁') U₁ΛV₁'Y = U₁Λ⁻¹V₁'Y = X⁺Y.

So the general solution is given by β = X⁺Y + (I - X⁻X)z.
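The claims above are easy to check numerically. The following is a minimal sketch (in Python/NumPy rather than R, purely for illustration; the design matrix and coefficients are hypothetical) showing that for a rank-deficient X with Y ∈ C(X), β = X⁺Y solves Y = Xβ but is not unique, since adding any null-space vector gives another solution:

```python
# Sketch: non-uniqueness of solutions to a consistent, rank-deficient system.
# NumPy's pinv computes the Moore-Penrose inverse X+ from the SVD of X.
import numpy as np

# Rank-deficient design: column 3 = column 1 + column 2, so r(X) = 2 < p = 3.
X = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0],
              [2.0, 0.0, 2.0]])

# Choose Y in C(X) so the system Y = X beta is consistent.
Y = X @ np.array([1.0, 2.0, 0.0])

X_plus = np.linalg.pinv(X)   # Moore-Penrose generalized inverse X+
beta = X_plus @ Y            # one particular solution, beta = X+ Y

# (I - X+ X) projects onto N(X); adding its image to beta gives another solution.
null_proj = np.eye(X.shape[1]) - X_plus @ X
beta2 = beta + null_proj @ np.array([3.0, -1.0, 5.0])
```

Both `beta` and `beta2` satisfy Xβ = Y exactly, even though they differ, which is the r(X) < p case described above.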
> Now assume r(X) = p. In this case we have X⁺ = (X'X)⁻¹X', and so

β = X⁺MY = (X'X)⁻¹X' X(X'X)⁻¹X'Y = (X'X)⁻¹X'Y.

Random vectors and matrices

Definition: Let Y = (Y₁, Y₂, ..., Yₙ)' be a random vector with E(Yᵢ) = μᵢ, Var(Yᵢ) = σᵢᵢ, and Cov(Yᵢ, Yⱼ) = σᵢⱼ. The expectation of Y is given by

E(Y) = (E(Y₁), E(Y₂), ..., E(Yₙ))' = (μ₁, μ₂, ..., μₙ)' = μ.

Similarly, the expectation of a matrix is the matrix of expectations of the elements of that matrix.

Definition: Suppose Y is an n×1 vector of random variables. The covariance of Y is given by the n×n matrix

Σ = Cov(Y) = E[(Y - μ)(Y - μ)'] with (i, j) entry σᵢⱼ,

where σᵢⱼ = Cov(Yᵢ, Yⱼ) = E[(Yᵢ - μᵢ)(Yⱼ - μⱼ)] = E(YᵢYⱼ) - E(Yᵢ)E(Yⱼ).

Theorem: Suppose Y is a random n×1 vector with mean E(Y) = μ and covariance Cov(Y) = Σ. Further suppose the elements of A (r×n) and b (r×1) are scalar constants. Then

E(AY + b) = A E(Y) + b = Aμ + b and Cov(AY + b) = A Cov(Y) A' = AΣA'.

Definition: Let Y (s×1) and W (r×1) be random vectors with E(Y) = μ_Y and E(W) = μ_W. The covariance between W and Y is given by

Cov(W, Y) = E[(W - μ_W)(Y - μ_Y)'].

We call this a matrix of covariances (not necessarily square), which is distinct from a covariance matrix.

Theorem: Let Y (s×1) and W (r×1) be random vectors with Cov(Y) = Σ_Y, Cov(W) = Σ_W, Cov(W, Y) = Σ_WY, and Cov(Y, W) = Σ_YW. Further suppose A (t×r) and B (t×s) are matrices of constant scalars. Then

Cov(AW + BY) = AΣ_WA' + BΣ_YB' + AΣ_WYB' + BΣ_YWA'.
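These moment rules can be illustrated with a quick numerical check. In the sketch below (Python/NumPy for illustration; μ, Σ, A, and b are hypothetical values invented for this example), E(AY + b) = Aμ + b and Cov(AY + b) = AΣA' are computed directly:

```python
# Sketch: propagating means and covariances through an affine transformation.
import numpy as np

mu = np.array([1.0, 0.0, 2.0])               # hypothetical E(Y)
Sigma = np.array([[2.0, 0.5, 0.0],           # hypothetical Cov(Y)
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

A = np.array([[1.0, -1.0, 0.0],              # constants
              [0.0,  2.0, 1.0]])
b = np.array([5.0, -3.0])

mean_AYb = A @ mu + b       # E(AY + b) = A mu + b
cov_AYb = A @ Sigma @ A.T   # Cov(AY + b) = A Sigma A'; b shifts only the mean

# Hand check of the (1,1) entry:
# Var(Y1 - Y2) = s11 + s22 - 2*s12 = 2 + 1 - 2(0.5) = 2.
```

Note the constant b drops out of the covariance, as the theorem states, and the result AΣA' is again symmetric and positive semi-definite.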
Theorem: Covariance matrices are always positive semi-definite.

Proof: Let Y (n×1) be a random vector with E(Y) = μ and Σ = Cov(Y) = E[(Y - μ)(Y - μ)']. We need to show that for any x ∈ Rⁿ, x'Σx ≥ 0. Let Z = Y - μ. Then

x'Σx = x'E(ZZ')x = E(x'ZZ'x) (since x is a vector of scalars) = E(w²), where w = Z'x,

and since the expectation of a non-negative random variable is always non-negative, x'Σx ≥ 0. Note that if w = 0 with probability 1 for some x ≠ 0, then Z'x = x₁Z₁ + x₂Z₂ + ... + xₙZₙ = 0. This implies a linear dependency among the elements of Z and hence singularity of the covariance matrix.
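The semi-definiteness argument, including the degenerate case, can be seen concretely. In the sketch below (Python/NumPy for illustration; the degenerate example is hypothetical), suppose Y₂ = 2Y₁ exactly with Var(Y₁) = 1, so Σ = [[1, 2], [2, 4]]: every quadratic form x'Σx is non-negative, but the dependency direction attains exactly zero, making Σ singular:

```python
# Sketch: a covariance matrix from linearly dependent components is PSD
# but singular. Here Y2 = 2*Y1 with Var(Y1) = 1 (hypothetical example).
import numpy as np

Sigma = np.array([[1.0, 2.0],
                  [2.0, 4.0]])

# x' Sigma x >= 0 for every x (checked over many random directions) ...
rng = np.random.default_rng(0)
quad_forms = [x @ Sigma @ x for x in rng.normal(size=(1000, 2))]

# ... and the dependency direction x0 = (2, -1)', i.e. w = 2*Y1 - Y2 = 0,
# attains exactly zero, so Sigma is PSD but not positive definite.
x0 = np.array([2.0, -1.0])
```

This mirrors the end of the proof: w = Z'x₀ is identically zero, so x₀'Σx₀ = 0 and Σ has rank 1 rather than 2.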