Class Note for MATH 410 with Professor Dostert at UA 2
These 11-page class notes were uploaded by an elite notetaker on Friday, February 6, 2015. The notes belong to a course at the University of Arizona taught by a professor in Fall. Since the upload, they have received 27 views.
THE UNIVERSITY OF ARIZONA
Math 410 Matrix Analysis
Section 8.4 Eigenvalues of Symmetric Matrices; Section 8.5 Singular Values
Paul Dostert, March 27, 2009

Eigenvalues of Symmetric Matrices

Thm: Let A = A^T be a real symmetric n × n matrix. Then
(a) all the eigenvalues of A are real;
(b) eigenvectors corresponding to distinct eigenvalues are orthogonal;
(c) there is an orthonormal basis of R^n consisting of n eigenvectors of A.

Ex: Show each of the above properties for

  A = ( 3 1 ; 1 2 ).

Thm: A symmetric matrix K = K^T is positive definite iff all its eigenvalues are strictly positive.

Note: A matrix is positive definite if x^T K x > 0 for all vectors x ≠ 0. This is in Chapter 4.

Ex: Prove the previous theorem in one direction: if K is SPD, then its eigenvalues are strictly positive. Then try to prove the first theorem using the previous one.

Basis for Symmetric Matrices

Prop: Let A = A^T be a symmetric n × n matrix. Let v_1, ..., v_n be an orthogonal eigenvector basis such that v_1, ..., v_r correspond to nonzero eigenvalues, while v_{r+1}, ..., v_n are null eigenvectors corresponding to the zero eigenvalue. Then r = rank A; v_1, ..., v_r form an orthogonal basis for rng A = corng A, while v_{r+1}, ..., v_n form an orthogonal basis for ker A = coker A.

All this proposition says is that, for a symmetric matrix, the eigenvectors with nonzero eigenvalues always form a basis for rng A, while the null eigenvectors form a basis for ker A.

The Spectral Theorem

Thm: Let A be a real symmetric matrix. Then there exists an orthogonal matrix Q (its columns form an orthonormal basis) such that A = Q Λ Q^T, where Λ is a real diagonal matrix. The eigenvalues of A appear on the diagonal of Λ, while the columns of Q are the corresponding orthonormal eigenvectors.

Recall: We can factor a regular symmetric matrix as A = L D L^T. The spectral factorization A = Q Λ Q^T is NOT the same factorization.

Ex: Write out the spectral factorization of A.

Singular Values

The singular values σ_1, ..., σ_r of an m × n matrix A are the positive square roots σ_i = √λ_i > 0 of the nonzero eigenvalues of K = A^T A, the Gram matrix. The corresponding eigenvectors of K are known as the singular vectors of A.
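The theorems above are easy to check numerically. The sketch below (not part of the original notes) applies NumPy's `eigh`, its dedicated symmetric eigensolver, to the 2 × 2 example A = ( 3 1 ; 1 2 ): the eigenvalues (5 ± √5)/2 are real and strictly positive, so A is also positive definite, and the eigenvectors assemble into an orthogonal Q with A = Q Λ Q^T.

```python
import numpy as np

# The notes' symmetric 2x2 example.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])

# eigh is NumPy's eigensolver for symmetric matrices: it returns real
# eigenvalues in ascending order and orthonormal eigenvectors as the
# columns of Q.
lam, Q = np.linalg.eigh(A)

# (a) the eigenvalues are real
assert np.isrealobj(lam)
# (b)-(c) the eigenvectors form an orthonormal basis: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(2))
# all eigenvalues strictly positive, so A is also positive definite
assert lam.min() > 0

# spectral factorization A = Q Lambda Q^T
Lambda = np.diag(lam)
assert np.allclose(Q @ Lambda @ Q.T, A)
```

Working the same example by hand (characteristic polynomial λ² − 5λ + 5) gives the two eigenvalues quoted above, which is a good way to check the machine output.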
Generally the singular values are listed in decreasing order, with σ_1 ≥ σ_2 ≥ ... ≥ σ_r > 0, so σ_1 will always represent the dominant singular value.

Prop: If A = A^T is a symmetric matrix, its singular values are the absolute values of its nonzero eigenvalues, σ_i = |λ_i| > 0. The singular vectors coincide with the non-null eigenvectors.

SVD

The generalization of the spectral factorization to non-symmetric matrices is called the singular value decomposition (SVD).

Thm: Any nonzero real m × n matrix A of rank r > 0 can be factored as A = P Σ Q^T, with P an m × r matrix with orthonormal columns, Σ = diag(σ_1, ..., σ_r), and Q^T an r × n matrix with orthonormal rows.

Finding an SVD

To find the SVD by hand, we find the singular values and singular vectors of A. We put the singular values, in decreasing order, into Σ. The orthonormal eigenvectors of K = A^T A are then made the columns of Q. To find the columns p_i of P, we calculate p_i = A q_i / σ_i.

Ex: Find the SVD of A.

Prop: Given the SVD A = P Σ Q^T, the columns q_i of Q form an orthonormal basis for corng A, while the columns p_j of P form an orthonormal basis for rng A.

Principal Component Analysis

Technique of Principal Component Analysis (PCA):
- Assume we are given a large data matrix A with n singular values, having a mean of 0, and K = A^T A the covariance matrix.
- We compute the SVD A = P Σ Q^T, then rearrange things so the largest singular values come first.
- We compute the sum of the singular values and decide what percent error would be acceptable for our problem:

  Percent Error = 1 − (σ_1 + ... + σ_k) / (σ_1 + ... + σ_n).

- We now consider data only in the k principal directions; that is, we reduce the basis to only k vectors.
- All computations would then be done for only the k directions, greatly reducing computation time.
- There is quite a bit more statistical information in the PCA than is discussed here.

The Pseudoinverse

The pseudoinverse of a nonzero m × n matrix with SVD A = P Σ Q^T is the n × m matrix A⁺ = Q Σ⁻¹ P^T.

Ex: Show that the pseudoinverse corresponds to the inverse for a nonsingular n × n matrix A.

Lemma: Let A be an m × n matrix of rank n. Then A⁺ = (A^T A)⁻¹ A^T.
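The "Finding an SVD" recipe above can be mirrored step by step in NumPy. This is a sketch rather than part of the notes, and the 3 × 2 matrix here is purely an illustrative choice: eigendecompose K = A^T A, set σ_i = √λ_i, and form p_i = A q_i / σ_i.

```python
import numpy as np

# An illustrative 3x2 matrix (chosen here; not the notes' example).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [1.0, 3.0]])

# Step 1: form the Gram matrix K = A^T A and get its eigenpairs.
K = A.T @ A
lam, V = np.linalg.eigh(K)            # eigenvalues in ascending order

# Step 2: reorder to decreasing and keep the nonzero eigenvalues;
# the singular values are sigma_i = sqrt(lambda_i).
order = np.argsort(lam)[::-1]
lam, V = lam[order], V[:, order]
keep = lam > 1e-12 * lam[0]
sigma = np.sqrt(lam[keep])
Q = V[:, keep]                        # columns q_i of Q

# Step 3: p_i = A q_i / sigma_i gives the orthonormal columns of P.
P = (A @ Q) / sigma

# Check A = P Sigma Q^T and compare with NumPy's own singular values.
Sigma = np.diag(sigma)
assert np.allclose(P @ Sigma @ Q.T, A)
assert np.allclose(sigma, np.linalg.svd(A, compute_uv=False))
```

The final comparison against `np.linalg.svd` only checks the singular values, since the singular vectors are determined just up to sign.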
Ex: Prove the lemma by calculating A^T A in terms of the SVD, taking its inverse, and then multiplying by A^T in terms of the SVD.

Ex: Find the pseudoinverse of A.

The Pseudoinverse and Least Squares

Thm: Consider Ax = b, and let x* = A⁺ b. If ker A = {0}, then x* is the Euclidean least squares solution to Ax = b. If ker A ≠ {0}, then x* ∈ corng A is the least squares solution of Ax = b of minimal Euclidean norm.

Note: We'll talk much more about least squares in Ch. 4.

Ex: Use the pseudoinverse to solve Ax = b in the least squares sense with

  A = ( 1 2 ; 3 4 ; 1 3 ),  b = ( 1 ; 0 ; 1 ).

Condition Number

The condition number of a nonsingular n × n matrix is the ratio between its largest and smallest singular values, κ(A) = σ_1 / σ_n.

Note: In numerical analysis this is only one of many different types of condition numbers. A matrix with a large condition number is called ill-conditioned; such matrices create many numerical problems when solved on a computer.

Ex: Find the condition number of the following:

  A = ( 1 2 ; 2 1 ),  B = ( 2.1000 2.000 ; 1.1000 1.000 ).

Then solve Ax = ( 3 ; 3 ) and Bx = ( 3.3000 ; 3.1000 ) using only four-digit precision.

Matlab Examples

Use the pseudoinverse to solve Ax = b with

  A = ( 1 2 1 ; 3 4 1 ; 1 3 1 ; 2 1 0 ),  b = ( 1 ; 2 ; 1 ; 2 ).

If ker A = {0} we use A⁺ = (A^T A)⁻¹ A^T; otherwise we use the full SVD.

  A = [1 2 1; 3 4 1; 1 3 1; 2 1 0]; b = [1; 2; 1; 2];
  null(A)                  % we find ker A is nonzero, so we need A+ = Q*inv(S)*P'
  [P, Sigma, Q] = svd(A);
  % Sigma has zero diagonal entries, so we eliminate the corresponding
  % columns from P and Q (the 3rd column in this case)
  Pn = P(:, 1:2); Qn = Q(:, 1:2);
  Sn = diag([Sigma(1,1), Sigma(2,2)]);
  Pn*Sn*Qn'                % double-check the SVD: this should equal A
  % now compute the pseudoinverse from the known formula
  Apseudoinv = Qn*inv(Sn)*Pn';
  xSoln = Apseudoinv*b
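For readers without Matlab, the same workflow can be sketched in Python with NumPy. The 4 × 3 matrix below is a stand-in rather than the notes' matrix, chosen so that its kernel is genuinely nonzero (its third column is the sum of the first two, so rank A = 2), which reproduces the rank-deficient situation handled above with the reduced SVD; the last lines also verify κ = 3 for the 2 × 2 matrix from the condition number example.

```python
import numpy as np

# Stand-in 4x3 matrix with nonzero kernel: col3 = col1 + col2, rank 2.
A = np.array([[1.0, 2.0, 3.0],
              [3.0, 4.0, 7.0],
              [1.0, 3.0, 4.0],
              [2.0, 1.0, 3.0]])
b = np.array([1.0, 2.0, 1.0, 2.0])

# Thin SVD, then drop the parts belonging to zero singular values --
# the analogue of deleting columns of P and Q in the Matlab session.
P, sigma, QT = np.linalg.svd(A, full_matrices=False)
r = int(np.sum(sigma > 1e-12 * sigma[0]))   # numerical rank (2 here)
Pn, Sn, Qn = P[:, :r], np.diag(sigma[:r]), QT[:r, :].T

# Double-check the reduced SVD: Pn Sn Qn^T should reproduce A.
assert np.allclose(Pn @ Sn @ Qn.T, A)

# Pseudoinverse A+ = Q S^{-1} P^T and minimal-norm least squares solution.
A_pinv = Qn @ np.linalg.inv(Sn) @ Pn.T
xSoln = A_pinv @ b
assert np.allclose(A @ A_pinv @ A, A)       # a Moore-Penrose identity

# Condition number kappa = sigma_1/sigma_n for the notes' example
# A2 = (1 2; 2 1): eigenvalues 3 and -1, so singular values 3 and 1.
A2 = np.array([[1.0, 2.0], [2.0, 1.0]])
s = np.linalg.svd(A2, compute_uv=False)
assert np.isclose(s[0] / s[-1], 3.0)        # kappa(A2) = 3
```

Since ker A is nonzero here, xSoln is the least squares solution of minimal Euclidean norm, matching the theorem on the pseudoinverse and least squares.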