CSE 275 Matrix Computation
Ming-Hsuan Yang
Electrical Engineering and Computer Science
University of California at Merced, Merced CA 95344
http://faculty.ucmerced.edu/mhyang

Lecture 4

Overview
• Basic definitions: orthogonality, orthogonal projection, distance between subspaces, matrix inverse
• Matrix decomposition

Reading
• Chapters 2 and 3 of Matrix Computations by Gene Golub and Charles Van Loan

Orthogonality
• A set of vectors $x_1, \dots, x_n$ in $\mathbb{R}^m$ is orthogonal if $x_i^T x_j = 0$ whenever $i \neq j$, and orthonormal if $x_i^T x_j = \delta_{ij}$.
• Orthogonal vectors are maximally independent, for they point in totally different directions.
• A collection of subspaces $S_1, \dots, S_p$ in $\mathbb{R}^m$ is mutually orthogonal if $x^T y = 0$ whenever $x \in S_i$ and $y \in S_j$ for $i \neq j$.
• The orthogonal complement of a subspace $S \subseteq \mathbb{R}^m$ is $S^\perp = \{ y \in \mathbb{R}^m : y^T x = 0 \ \forall x \in S \}$. It can be shown that $\mathrm{ran}(A)^\perp = \mathrm{null}(A^T)$.
• The vectors $v_1, \dots, v_k$ form an orthonormal basis for a subspace $S \subseteq \mathbb{R}^m$ if they are orthonormal and span $S$.

Orthogonality (cont'd)
• A matrix $Q \in \mathbb{R}^{m \times m}$ is said to be orthogonal if $Q^T Q = I$.
• If $Q = [q_1, \dots, q_m]$ is orthogonal, then the $q_i$ form an orthonormal basis for $\mathbb{R}^m$.
• Theorem: If $V_1 \in \mathbb{R}^{n \times r}$ has orthonormal columns, then there exists $V_2 \in \mathbb{R}^{n \times (n-r)}$ such that $V = [V_1 \ V_2] \in \mathbb{R}^{n \times n}$ is orthogonal. Note that $\mathrm{ran}(V_1)^\perp = \mathrm{ran}(V_2)$.
• The vector 2-norm is invariant under an orthogonal transformation $Q$, i.e., $\|Qx\|_2^2 = x^T Q^T Q x = x^T x = \|x\|_2^2$.
• Likewise, the matrix 2-norm and Frobenius norm are invariant with respect to orthogonal transformations $Q$ and $Z$: $\|QAZ\|_F = \|A\|_F$ and $\|QAZ\|_2 = \|A\|_2$.

Singular value decomposition (SVD)
• Theorem: If $A$ is a real $m \times n$ matrix, then there exist orthogonal matrices $U = [u_1, \dots, u_m] \in \mathbb{R}^{m \times m}$ and $V = [v_1, \dots, v_n] \in \mathbb{R}^{n \times n}$ such that
$$U^T A V = \Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_p) \in \mathbb{R}^{m \times n}, \quad p = \min(m, n),$$
where $\sigma_1 \geq \sigma_2 \geq \dots \geq \sigma_p \geq 0$; equivalently, $A = U \Sigma V^T$.
• The $\sigma_i$ are the singular values of $A$, and the vectors $u_i$ and $v_i$ are the $i$-th left singular vector and the $i$-th right singular vector, respectively.
• It follows that $AV = U\Sigma$ and $A^T U = V \Sigma^T$.

Existence of SVD
Proof: Let $x \in \mathbb{R}^n$ and $y \in \mathbb{R}^m$ be unit 2-norm vectors that satisfy $Ax = \sigma y$ with $\sigma = \|A\|_2$. There exist $V_2 \in \mathbb{R}^{n \times (n-1)}$ and $U_2 \in \mathbb{R}^{m \times (m-1)}$ such that $V = [x \ V_2] \in \mathbb{R}^{n \times n}$ and $U = [y \ U_2] \in \mathbb{R}^{m \times m}$ are orthogonal. It follows that
$$U^T A V = \begin{bmatrix} \sigma & w^T \\ 0 & B \end{bmatrix} \equiv A_1$$
for some $w$ and $B$. Since
$$\left\| A_1 \begin{bmatrix} \sigma \\ w \end{bmatrix} \right\|_2^2 \geq (\sigma^2 + w^T w)^2,$$
we have $\|A_1\|_2^2 \geq \sigma^2 + w^T w$. But $\sigma^2 = \|A\|_2^2 = \|A_1\|_2^2$, and thus $w$ must be $0$. By induction on the submatrix $B$, we complete the proof.

Singular vectors
• As $A = U \Sigma V^T$, by comparing columns in the equations $AV = U\Sigma$ and $A^T U = V \Sigma^T$, it is easy to show that
$$A v_i = \sigma_i u_i, \qquad A^T u_i = \sigma_i v_i, \qquad i = 1, \dots, \min(m, n).$$
• The columns of $U$ are eigenvectors of $AA^T \in \mathbb{R}^{m \times m}$.
• $\Sigma$ is a diagonal matrix whose entries are the square roots of the eigenvalues of $AA^T$.
• The columns of $V$ are eigenvectors of $A^T A \in \mathbb{R}^{n \times n}$.
• It can be shown that the singular values $\sigma_i$ are the square roots of the eigenvalues $\lambda_i$ of $A^T A$, i.e., $\sigma_i = \sqrt{\lambda_i}$.

Singular values
• The singular values of a matrix $A$ are precisely the lengths of the semi-axes of the hyperellipsoid $E$ defined by $E = \{ Ax : \|x\|_2 = 1 \}$.
• The semi-axes are described by the singular vectors:
$$\sigma_1 = \max_{\|x\|_2 = 1} \|Ax\|_2 = \|A\|_2, \qquad \sigma_n = \min_{\|x\|_2 = 1} \|Ax\|_2 = \frac{1}{\|A^{-1}\|_2}$$
(the last equality for square nonsingular $A$).
• The SVD reveals the structure of a matrix. Let $r$ be the number of nonzero singular values, $\sigma_1 \geq \dots \geq \sigma_r > \sigma_{r+1} = \dots = \sigma_p = 0$; then
$$\mathrm{rank}(A) = r, \quad \mathrm{null}(A) = \mathrm{span}\{v_{r+1}, \dots, v_n\}, \quad \mathrm{ran}(A) = \mathrm{span}\{u_1, \dots, u_r\}.$$

Geometric interpretation of SVD
$Ax = U \Sigma V^T x$ can be read as three steps:
• Apply a rotation to the unit vector $x$ using the right singular vectors $V$, i.e., $V^T x$.
• Scale with $\Sigma$, i.e., $\Sigma V^T x$.
• Apply a rotation using the left singular vectors $U$: $U \Sigma V^T x$.

Best approximation with r singular vectors in the 2-norm
• Theorem (Eckart-Young, 1936): Let $A = U \Sigma V^T$ and $A_r = U \, \mathrm{diag}(\sigma_1, \dots, \sigma_r, 0, \dots, 0) \, V^T$. Then for any $B$ of rank $r$,
$$\|A - A_r\|_2 \leq \|A - B\|_2, \qquad \|A - A_r\|_2 = \sigma_{r+1}.$$
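To make the properties above concrete, here is a minimal numpy sketch (my own illustration, not part of the lecture) that numerically checks $Av_i = \sigma_i u_i$, the norm identities, the eigenvalue connection $\sigma_i = \sqrt{\lambda_i(A^T A)}$, and orthogonal invariance of the 2-norm on a random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 3
A = rng.standard_normal((m, n))

# full_matrices=True gives orthogonal U (m x m) and V (n x n);
# s holds sigma_1 >= ... >= sigma_p with p = min(m, n)
U, s, Vt = np.linalg.svd(A, full_matrices=True)
V = Vt.T
p = min(m, n)

# A v_i = sigma_i u_i and A^T u_i = sigma_i v_i for i = 1..p
for i in range(p):
    assert np.allclose(A @ V[:, i], s[i] * U[:, i])
    assert np.allclose(A.T @ U[:, i], s[i] * V[:, i])

# ||A||_2 = sigma_1 and ||A||_F^2 = sigma_1^2 + ... + sigma_p^2
assert np.isclose(np.linalg.norm(A, 2), s[0])
assert np.isclose(np.linalg.norm(A, 'fro')**2, np.sum(s**2))

# Singular values are the square roots of the eigenvalues of A^T A
lam = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s, np.sqrt(lam))

# Orthogonal invariance of the vector 2-norm: ||Qx||_2 = ||x||_2
x = rng.standard_normal(m)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```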
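Similarly, a short sketch (again mine, under the same assumptions) of the Eckart-Young statement: truncating the SVD to the $r$ largest singular triplets gives the best rank-$r$ approximation in the 2-norm, with error exactly $\sigma_{r+1}$.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 2
# Truncated SVD: keep only the r largest singular triplets
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# The 2-norm error equals the first discarded singular value
assert np.isclose(np.linalg.norm(A - A_r, 2), s[r])

# Any other rank-r matrix B does at least as badly; try a random one
B = rng.standard_normal((6, r)) @ rng.standard_normal((r, 4))
assert np.linalg.norm(A - B, 2) >= np.linalg.norm(A - A_r, 2)
```

Here `full_matrices=False` computes the thin SVD, which suffices since the truncation only uses the first $r$ columns of $U$ and rows of $V^T$.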
SVD expansion
• We can decompose $A$ in terms of singular values and singular vectors:
$$A = \sum_{i=1}^{r} \sigma_i u_i v_i^T = \sum_{i=1}^{r} \sigma_i (u_i \otimes v_i),$$
where $\otimes$ denotes the Kronecker (outer) product.
• The matrix 2-norm and Frobenius norm properties have connections to the SVD:
$$\|A\|_F^2 = \sigma_1^2 + \dots + \sigma_p^2, \quad p = \min(m, n), \qquad \|A\|_2 = \sigma_1.$$
• Closely related to eigenvalues, eigenvectors, and principal component analysis.

Applications of SVD
• Matrix algebra: pseudo-inverse, solving homogeneous linear equations, least squares minimization, rank, null space, etc.
• Computer vision: denoising, eigenface, eigentexture, eigen-X, structure from motion, etc.
• Pattern recognition: principal component analysis, dimensionality reduction, multivariate Gaussian, etc.
• Applications: image/data analysis, document retrieval, etc.

Covariance matrix
• Let $x \in \mathbb{R}^m$ and $X = [x_1, \dots, x_n]$. The covariance matrix $C \in \mathbb{R}^{m \times m}$ is
$$C = E[(x - E[x])(x - E[x])^T] = E[xx^T] - \mu\mu^T \approx \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu)(x_i - \mu)^T = \frac{1}{n} \bar{Y} \bar{Y}^T,$$
where $\mu = E[x] \approx \frac{1}{n} \sum_{i=1}^{n} x_i$ and $\bar{Y} = X - \mu \mathbf{1}^T$ is the centered data matrix.
• The covariance $C$ is positive semi-definite.
• $C$ captures the second-order statistics of $x$.
• The variation can be compactly modeled with principal component analysis.
• Related to the multivariate Gaussian distribution, principal component analysis, the SVD, and others.
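As a final illustration (variable names are mine, not the lecture's), the following sketch forms the sample covariance of centered data and confirms its connection to the SVD: the eigenvalues of $C = \frac{1}{n} \bar{Y} \bar{Y}^T$ are $\sigma_i^2 / n$, where the $\sigma_i$ are the singular values of $\bar{Y}$, which is why PCA is commonly computed via an SVD of the centered data.

```python
import numpy as np

rng = np.random.default_rng(2)
m, n = 3, 200                        # m-dimensional samples, n of them
X = rng.standard_normal((m, n))

mu = X.mean(axis=1, keepdims=True)   # sample mean, mu ~ E[x]
Y = X - mu                           # centered data, Y = X - mu 1^T

C = (Y @ Y.T) / n                    # sample covariance matrix

evals, evecs = np.linalg.eigh(C)     # eigenvalues in ascending order
U, s, Vt = np.linalg.svd(Y, full_matrices=False)

# Eigenvalues of C are sigma_i^2 / n; left singular vectors of Y
# are the principal directions (up to sign)
assert np.allclose(np.sort(s**2 / n), evals)
assert np.all(evals >= -1e-12)       # C is positive semi-definite
```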