Class Note for PUBHLTH 744 at UMass(4)
Linear Models with Applications in R (PUBHLTH 744)
Lecture 4: Matrix Decompositions
Instructor: Andrea S. Foulkes, Division of Biostatistics and Epidemiology, UMass School of Public Health and Health Sciences
Fall 2007

Matrix Decompositions

We consider four matrix decompositions relevant to linear modeling. Both the spectral decomposition and the Cholesky decomposition apply to n × n square matrices; the former requires that the matrix be symmetric, while the latter assumes the matrix is positive definite. The QR factorization and the singular value decomposition (SVD) apply more broadly to n × p matrices.

> Singular Value Decomposition
> Spectral Decomposition
> QR Decomposition
> Cholesky Decomposition

Singular Value Decomposition (SVD)

Theorem 4.1 (Singular Value Decomposition). For an n × p matrix A of rank r, there exist orthogonal matrices U (n × n) and V (p × p) such that

    U'AV = D = ( Δ  0 )
               ( 0  0 )   (n × p),

where Δ = diag(δ1, ..., δr) and δ1 ≥ δ2 ≥ ... ≥ δr > 0 are called the singular values of A.

Recall that for an orthogonal matrix X we have X⁻¹ = X', or equivalently X'X = XX' = I. Since U and V are orthogonal, we know UU' = I and VV' = I, and therefore

    U'AV = D  <=>  UU'AVV' = UDV'  <=>  A = UDV'.

Example. Consider the matrix A = ( 1 1 ; 1 1 ). Note that A is a 2 × 2 symmetric matrix of rank r(A) = 1. The SVD is obtained using the svd() command, as demonstrated below.

# Create the matrix A
> A <- matrix(c(1, 1, 1, 1), 2, 2)
> A
     [,1] [,2]
[1,]    1    1
[2,]    1    1

# Calculate the singular value decomposition of A
> svd(A)
$d
[1] 2 0

$u
           [,1]       [,2]
[1,] -0.7071068 -0.7071068
[2,] -0.7071068  0.7071068

$v
           [,1]       [,2]
[1,] -0.7071068 -0.7071068
[2,] -0.7071068  0.7071068

> We know from the SVD theorem that we can write D = U'AV. Since A is of dimension 2 × 2 and r(A) = 1, we expect only one nonzero singular value, δ1, in the diagonal elements of D.
> We can also reconstruct A from the components of the SVD, as shown below.

# Checking that t(U) %*% A %*% V = Delta
> V <- svd(A)$v
> U <- svd(A)$u
> round(t(U) %*% A %*% V, 2)
     [,1] [,2]
[1,]    2    0
[2,]    0    0
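As a cross-check outside R, the reconstruction A = UDV' can be multiplied out by hand. The following is a minimal pure-Python sketch (the helper names `matmul` and `transpose` are illustrative, not part of the lecture) that uses the U, D, and V printed above and recovers A:

```python
# Sketch: verify A = U D V' for the 2x2 example, and that U is orthogonal.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

s = 0.7071068              # 1/sqrt(2), as printed by svd() in R
U = [[-s, -s], [-s, s]]
V = [[-s, -s], [-s, s]]
D = [[2, 0], [0, 0]]       # the singular values (2, 0) on the diagonal

A = matmul(matmul(U, D), transpose(V))
print([[round(a, 6) for a in row] for row in A])  # [[1.0, 1.0], [1.0, 1.0]]
```

The same helpers confirm t(U) %*% U = I, mirroring the orthogonality check done in R.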
# Generate A from the components of the SVD of A
> Delta <- diag(svd(A)$d)
> U %*% Delta %*% t(V)
     [,1] [,2]
[1,]    1    1
[2,]    1    1

Recall that matrices are spatial operators that rotate, stretch, and shrink vectors. The SVD and the spectral theorem tell us how to break up these operations of a matrix into three steps. Here we consider an example using the SVD, though a similar interpretation can be arrived at using the spectral theorem. Let A = UDV' act upon the vector x = (2, 0)'. It is easily seen that Ax = (2, 2)'. Now let us consider UDV'x.

[Figure: the vector x = (2, 0)' and its image UDV'x = Ax = (2, 2)'.]

> First, we have

    V'x = ( -0.707 -0.707 ) ( 2 )  =  ( -1.414 )
          ( -0.707  0.707 ) ( 0 )     ( -1.414 )

Note V'x ∈ sp{v1}, where v1 is an eigenvector of A'A. That is, pre-multiplying by the matrix V' rotates the vector x to be in the span of the first eigenvector of A'A.

> Since D is a diagonal matrix, pre-multiplying by D has the effect of stretching or shrinking the corresponding axes. In this example the first diagonal element equals 2 and the second equals 0, and so the effect is to stretch the x-axis by 2 and reduce the y-axis to 0:

    D(V'x) = ( 2 0 ) ( -1.414 )  =  ( -2.828 )
             ( 0 0 ) ( -1.414 )     (    0   )

> Note that 2 and 0 are the square roots of the eigenvalues of A'A.
> Finally, pre-multiplying by U gives

    U(DV'x) = ( -0.707 -0.707 ) ( -2.828 )  =  ( 2 )
              ( -0.707  0.707 ) (    0   )     ( 2 )

This has the effect of rotating the vector back to the original space.

> Also of note, the columns of U and V are the eigenvectors corresponding to the nonzero eigenvalues of AA' and A'A, respectively, and for r(A) = r, δ1², ..., δr² are the eigenvalues of A'A.
> Finally, it can be shown that A = UDV' = U1 Δ V1', where U1 is n × r and V1 is p × r, representing the orthonormal columns of U and V, respectively, that correspond to the nonzero singular values.
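The three-step interpretation above can be traced numerically. A minimal pure-Python sketch (the helper name `matvec` is illustrative) applies V', then D, then U to x = (2, 0)' and arrives at Ax = (2, 2)':

```python
# Sketch of the three-step action of A = U D V' on x = (2, 0)':
# rotate (V'), stretch/shrink (D), rotate back (U).

def matvec(X, v):
    return [sum(X[i][j] * v[j] for j in range(len(v))) for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

s = 0.7071068
U = [[-s, -s], [-s, s]]
V = [[-s, -s], [-s, s]]
D = [[2, 0], [0, 0]]

x = [2, 0]
step1 = matvec(transpose(V), x)  # rotate into span of v1: (-1.414, -1.414)
step2 = matvec(D, step1)         # stretch x-axis by 2, collapse y-axis: (-2.828, 0)
step3 = matvec(U, step2)         # rotate back: (2, 2), which equals Ax
print([round(c, 3) for c in step3])  # [2.0, 2.0]
```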
Spectral Decomposition

> If A is a symmetric square matrix, then we know (1) A'A = AA', and (2) the eigenvalues of A'A equal the squares of the eigenvalues of A.

Proof of this second property is straightforward. Suppose v is an eigenvector of A with eigenvalue λ. Then Av = λv, or equivalently,

    A'Av = A'(λv) = λ(A'v) = λ(Av) = λ(λv) = λ²v    (using A' = A),

so λ² is an eigenvalue of A'A. Note that we also have that the unit-length eigenvector of A equals the unit-length eigenvector of A'A.

> The first result, A'A = AA', implies U = V. This follows from the facts that the columns of U and the columns of V are eigenvectors for the same matrices (A'A = AA'), and that the columns of both U and V are of length 1.

These results lead us to the following related theorem regarding symmetric matrices.

Theorem 4.2 (Spectral Decomposition). For an n × n symmetric matrix A, there exists an orthogonal matrix P such that A = PΛP', where Λ = diag(λ1, ..., λn), λ1 ≤ ... ≤ λn, and P is the corresponding orthogonal matrix of eigenvectors.

Note that the spectral decomposition is a special case of the SVD in which U = V = P and δi = λi for i = 1, ..., n. Both of these conditions hold in the case that A is symmetric.

Example. Again consider the matrix A = ( 1 1 ; 1 1 ). Since A is symmetric, we can use the spectral theorem to arrive at the same results. That is, U = V and δ1 = λ1. Furthermore, U and V are orthogonal matrices with columns equal to the eigenvectors of A. That is, U = V = P, where P is defined in the spectral theorem above. These are illustrated below.

# Check that P = U = V and delta = lambda
> round(U, 2); round(V, 2)
      [,1]  [,2]
[1,] -0.71 -0.71
[2,] -0.71  0.71
      [,1]  [,2]
[1,] -0.71 -0.71
[2,] -0.71  0.71

> eigen(A)
$values
[1] 2.000000e+00 9.860761e-32

$vectors
          [,1]       [,2]
[1,] 0.7071068 -0.7071068
[2,] 0.7071068  0.7071068

> eigen(t(A) %*% A)
$values
[1] 4.000000e+00 1.972152e-31

$vectors
          [,1]       [,2]
[1,] 0.7071068 -0.7071068
[2,] 0.7071068  0.7071068

> eigen(A %*% t(A))
$values
[1] 4.000000e+00 1.972152e-31

$vectors
          [,1]       [,2]
[1,] 0.7071068 -0.7071068
[2,] 0.7071068  0.7071068
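The spectral theorem can likewise be checked by direct multiplication. A small pure-Python sketch, assuming the eigenpairs of A = (1 1; 1 1) (eigenvalues 2 and 0, unit eigenvectors (1, 1)'/√2 and (-1, 1)'/√2, consistent with the eigen() output above), verifies that PΛP' reproduces A:

```python
# Sketch: reconstruct A from its eigenvalues and eigenvectors, A = P Lam P'.
import math

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

s = 1 / math.sqrt(2)
P = [[s, -s], [s, s]]     # unit eigenvectors of A as columns
Lam = [[2, 0], [0, 0]]    # eigenvalues 2 and 0 on the diagonal

A = matmul(matmul(P, Lam), transpose(P))
print([[round(a, 6) for a in row] for row in A])  # [[1.0, 1.0], [1.0, 1.0]]
```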
QR Factorization

Theorem 4.3 (QR Factorization). Suppose A is an n × p matrix; then A can be written in the form A = QR, where Q (n × p) has orthonormal columns and R (p × p) is an upper triangular matrix. If the columns of A are linearly independent (i.e., A is non-singular), then this decomposition is unique and R has positive diagonal elements.

Example. Let A = ( 1 1 ; 1 2 ). Since the columns of A are linearly independent, we know by the QR factorization theorem that we can write A = QR, where Q has orthonormal columns and R is an upper triangular matrix with positive diagonal elements. Calculation of these matrices is achieved using the qr(), qr.Q(), and qr.R() functions in R, as shown below. Note that both the qr.Q() and qr.R() functions require an object resulting from a QR decomposition, as from a call to qr().

# Define A
> A <- matrix(c(1, 1, 1, 2), 2, 2)
> A
     [,1] [,2]
[1,]    1    1
[2,]    1    2

# Calculate the Q matrix from the QR factorization of A
> Q <- qr.Q(qr(A))
> Q
          [,1]       [,2]
[1,] 0.7071068 -0.7071068
[2,] 0.7071068  0.7071068

# Calculate the R matrix from the QR factorization of A
> R <- qr.R(qr(A))
> R
         [,1]      [,2]
[1,] 1.414214 2.1213203
[2,] 0.000000 0.7071068

As expected, Q has orthonormal columns and R is an upper triangular matrix with positive diagonal elements. We confirm this, and check that we can reconstruct A using Q and R, with the following code.

# Checking Q is orthogonal
> t(Q) %*% Q
             [,1]         [,2]
[1,] 1.000000e+00 7.852334e-17
[2,] 7.852334e-17 1.000000e+00

# Reconstructing A from Q and R
> Q %*% R
     [,1] [,2]
[1,]    1    1
[2,]    1    2

Notably, the lsfit() function in R also returns an object that can be used with the qr.Q() and qr.R() commands. We return to this later.

The QR decomposition also provides an efficient way of calculating the determinant of a matrix, and hence the product of its eigenvalues. It can be shown that det(A) = det(Q) det(R). Furthermore, |det(Q)| = 1, and so |det(A)| = |det(R)|. But the determinant of R is given by det(R) = ∏ r_ii, the product of the diagonal elements of R, since R is triangular. We saw previously that the determinant of a matrix is equal to the product of its eigenvalues, and so ∏ |r_ii| = ∏ |λ_i|.
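The determinant identity |det(A)| = ∏|r_ii| can be illustrated with a hand-rolled QR. The sketch below implements classical Gram-Schmidt (the function name `gram_schmidt_qr` is illustrative, and this is a teaching sketch rather than the numerically stable Householder algorithm R uses) and confirms that the product of R's diagonal equals det(A) = 1 for A = (1 1; 1 2):

```python
# Sketch: classical Gram-Schmidt QR, then det(A) as the product of diag(R).
import math

def gram_schmidt_qr(A):
    """QR of a matrix (list of rows) with linearly independent columns.
    Returns (Q, R) with R upper triangular and positive on the diagonal."""
    n, p = len(A), len(A[0])
    cols = [[A[i][j] for i in range(n)] for j in range(p)]
    qcols, R = [], [[0.0] * p for _ in range(p)]
    for j, a in enumerate(cols):
        v = a[:]
        for i, q in enumerate(qcols):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # projection coeff
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]  # remove component
        R[j][j] = math.sqrt(sum(vk * vk for vk in v))        # new column's norm
        qcols.append([vk / R[j][j] for vk in v])
    Q = [list(row) for row in zip(*qcols)]  # stack orthonormal columns
    return Q, R

A = [[1, 1], [1, 2]]
Q, R = gram_schmidt_qr(A)
det_via_R = R[0][0] * R[1][1]  # product of R's diagonal elements
print(round(R[0][0], 6), round(R[1][1], 6), round(det_via_R, 6))
```

The diagonal entries come out as 1.414214 and 0.707107, matching the qr.R() output above, and their product is det(A) = 1.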
Finally, the qr() function in R also returns the rank of a matrix. This is demonstrated below.

# Using the qr function to calculate the rank of A
> qr(A)$rank
[1] 2

Cholesky Decomposition

Theorem 4.4 (Cholesky Decomposition). A square and positive definite (definition given below) matrix A can be written as A = LL', where L is a lower triangular matrix with strictly positive diagonal elements.

Example. Define A as in the example above. We know A is symmetric. A is also positive definite, since the eigenvalues of A are all strictly greater than 0.

# Check the eigenvalues of A are greater than 0
> eigen(A)
$values
[1] 2.618034 0.381966

$vectors
          [,1]       [,2]
[1,] 0.5257311 -0.8506508
[2,] 0.8506508  0.5257311

# We could also use the following code to check the eigenvalues are greater than 0
> sum(eigen(A)$values < 1e-8)
[1] 0

# Confirm A is symmetric
> sum(t(A) != A)
[1] 0

As a result, we can apply the Cholesky decomposition.

# Determining the Cholesky decomposition of A
> chol(A)
     [,1] [,2]
[1,]    1    1
[2,]    0    1

Only a single matrix is returned, since the Cholesky decomposition tells us that A is the product of a lower triangular matrix and its transpose, A = LL'. The matrix returned by R is the upper triangular matrix L'. As a result, we can reconstruct A as follows.

# Reconstructing A from the output of chol()
> Lt <- chol(A)
> t(Lt) %*% Lt
     [,1] [,2]
[1,]    1    1
[2,]    1    2
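The Cholesky factor can also be computed directly from its defining equations. The following pure-Python sketch (the function name `cholesky` is illustrative; R's chol() calls compiled LAPACK routines) computes the lower triangular L for A = (1 1; 1 2) and checks that LL' = A, matching the chol() output above up to transposition:

```python
# Sketch: textbook Cholesky factorization A = L L' for a symmetric
# positive definite matrix, solved row by row from the defining equations.
import math

def cholesky(A):
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

A = [[1, 1], [1, 2]]
L = cholesky(A)
print(L)  # [[1.0, 0.0], [1.0, 1.0]] -- chol() in R returns the transpose, L'

# Reconstruct A as L L'
LLt = [[sum(L[i][k] * L[j][k] for k in range(2)) for j in range(2)]
       for i in range(2)]
print(LLt)  # [[1.0, 1.0], [1.0, 2.0]]
```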