Outline for CMPSCI 689 at UMass
These 27 pages of class notes were uploaded on Friday, February 6, 2015, for a Fall-semester course at the University of Massachusetts.
CMPSCI 689: Learning in Matrix Vector Spaces
Sridhar Mahadevan (mahadeva@cs.umass.edu)
University of Massachusetts

Outline
- Matrix vector spaces
- Basic matrix theory: eigenvalues and eigenvectors
- The multivariate Gaussian and the Mahalanobis distance
- Singular value decomposition and Google
- Discriminant functions and risk minimization

Applications of Linear Algebra
- Image compression (JPEG, HDTV).
- Face recognition using eigenfaces (Moghaddam and Pentland):
  - Solves the problem of face recognition using linear algebra and multivariate Gaussian density estimation.
  - Eigenvectors produce a reduced representation of faces.
- Information retrieval (Google):
  - Construct a singular value decomposition of the adjacency matrix underlying the web by doing a random walk.
  - Look for dominant eigenvectors.
- Early layers in the visual cortex compute reduced representations of high-dimensional sensory data (Theoretical Neuroscience, Dayan et al., MIT Press).

Face Recognition Using Eigenvectors
- M. Turk and A. Pentland, "Eigenfaces for Recognition", Journal of Cognitive Neuroscience 3(1), 1991.

Matrix Vector Spaces
[Figure: the four fundamental subspaces of an m x n matrix A. In R^n: the row space C(A^T), dimension r, and the nullspace N(A), dimension n - r. In R^m: the column space C(A), dimension r, and the left nullspace N(A^T), dimension m - r. The map x -> Ax = b carries R^n to R^m.]

Column and Row Spaces
- The column space C(A) associated with an m x n matrix A is the set of all linear combinations of the columns of A: Ax = A_1 x_1 + A_2 x_2 + ..., where A_1 is the first column of A, and so on.
- The row space C(A^T) associated with an m x n matrix A is the column space of the transpose A^T.
- The system of equations Ax = b is solvable if and only if b is in the column space of A.

Null Spaces
- The null space N(A) associated with an m x n matrix A is the set of all vectors x such that Ax = 0.
- The left null space N(A^T) associated with A is the set of all vectors x such that A^T x = 0.
- What is the nullspace of A = [1 2; 3 6]? We solve the linear system
  [1 2; 3 6] [x_1; x_2] = [0; 0].
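The answer can be checked numerically. A minimal sketch, assuming NumPy is available (the null space is read off from the singular value decomposition, which these notes develop later):

```python
import numpy as np

# Example matrix from the exercise above.
A = np.array([[1.0, 2.0],
              [3.0, 6.0]])

# The right singular vectors with zero singular value span N(A).
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))   # number of nonzero singular values
null_basis = Vt[rank:].T        # remaining rows of V^T span N(A)

v = null_basis[:, 0]
print(rank)                    # 1: the rows of A are dependent
print(np.allclose(A @ v, 0))   # True: v lies in the null space
```

The recovered v is a unit-length multiple of (-2, 1), matching the hand computation.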
- The nullspace N(A) consists of all scalar multiples of the vector (-2, 1).

Example of Matrix Vector Spaces
- Consider the matrix A = [1 2 3].
- What is the column space C(A)?
- What is the row space C(A^T)?
- What is the null space N(A)?
- What is the left null space N(A^T)?
- Repeat the above for the 3 x 1 matrix A = [1; 2; 3].

Independence
- The columns of a matrix A are linearly independent if the only solution to Ax = 0 is x = 0.
- E.g., (1, 0) and (0, 1) are independent.
- Any set of n vectors in R^m is dependent if n > m.
- A set of vectors spans a space if their linear combinations fill the space.
- A basis for a vector space is a set of linearly independent vectors that span the space.

Subspaces and Dimension
- A subspace of a vector space is a set of vectors that contains the 0 element and is closed under addition and scalar multiplication; e.g., all vectors on a line passing through (0, 0) constitute a subspace of R^2.
- The dimension of a vector space is the number of vectors in every basis; e.g., the dimension of R^n is n.
- The dimension of the row space is r, the number of independent rows.
- The dimension of the column space is r, the number of independent columns.
- If the dimension of the row (column) space equals the number of rows (columns), the matrix A is considered full rank and has an inverse A^{-1}.
- The dimension of the nullspace is n - r.
- The dimension of the left nullspace is m - r.
- Give the dimensions of the vector spaces associated with the matrix A = [1 2 3]; repeat for the 3 x 1 matrix A = [1; 2; 3].

Orthogonal Spaces
- Two subspaces W and V are orthogonal if every w in W is perpendicular to every v in V.
- A subspace W forms an orthogonal complement to another subspace V if for each vector v in V there exists a unique perpendicular vector v_perp in W.
- The nullspace N(A) forms an orthogonal complement to the row space C(A^T).
- The left nullspace N(A^T) forms an orthogonal complement to the column space C(A).

Orthonormal Vectors and Matrices
- A set of vectors q_i is orthonormal if q_i^T q_j = delta_ij, which equals 1 exactly when i = j and 0 otherwise.
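The delta_ij condition is easy to test numerically; a minimal sketch, assuming NumPy, using a 2-D rotation matrix (whose columns are orthonormal) as the example:

```python
import numpy as np

# The Gram matrix Q^T Q has entries q_i^T q_j, so for orthonormal
# columns it should equal the identity.
alpha = 0.7   # an arbitrary rotation angle
Q = np.array([[np.cos(alpha), -np.sin(alpha)],
              [np.sin(alpha),  np.cos(alpha)]])

G = Q.T @ Q
print(np.allclose(G, np.eye(2)))   # True: q_i^T q_j = delta_ij
```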
- So orthonormal vectors are both perpendicular to each other and unit vectors.
- Show that a matrix Q whose columns are orthonormal vectors has the property Q^T Q = I.
- If Q is a square matrix, show that this implies Q^T = Q^{-1}.
- Example of an orthonormal matrix: [cos a  -sin a; sin a  cos a]. This matrix corresponds to a coordinate rotation by the angle a.
- Show that the permutation matrix [0 1 0; 1 0 0; 0 0 1] is orthonormal.

Eigenvalues and Eigenvectors
- Generally, when we multiply a vector x by a matrix A, the resulting vector Ax is not in the same direction as x.
- Occasionally, however, this does happen, and the resulting vector is then called an eigenvector.
- Formally, an eigenvector x of a matrix A is such that Ax = lambda x, where lambda is a scalar called an eigenvalue.
- Eigenvectors lie in the nullspace of A - lambda I, because they satisfy the equation (A - lambda I) x = 0.
- Algorithm: solve the equation det(A - lambda I) = 0 to find the eigenvalues; for each resulting eigenvalue lambda_i, solve (A - lambda_i I) x = 0 to find the corresponding eigenvector.
- Exercise: find the eigenvalues and eigenvectors of a given 2 x 2 matrix.

Eigenspace Properties
- The product of the n eigenvalues gives us the determinant of the matrix, so if the matrix is singular, 0 must be an eigenvalue.
- The sum of the eigenvalues equals the trace, the sum of the diagonal entries of the matrix.
- If a square n x n matrix A has n linearly independent eigenvectors, then S^{-1} A S = Lambda; equivalently, A = S Lambda S^{-1}, where S = [x_1 ... x_n] and Lambda = diag(lambda_1, ..., lambda_n).
- The columns of S are made up of the eigenvectors. To prove this, first show that AS = S Lambda; the result then follows directly.
- Suggest a fast way to compute A^k given its decomposition above.

Eigenvalues of Symmetric Matrices
- If a matrix A is symmetric, that is, A^T = A, then all its eigenvalues are real.
- This leads to a fundamental spectral theorem: every symmetric matrix A has the factorization A = Q Lambda Q^T, where the eigenvector matrix Q is comprised of an orthonormal set of vectors.
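The spectral factorization and the trace/determinant facts above can be checked numerically; a sketch assuming NumPy (the matrix below is an illustrative example, not the slide's exercise matrix):

```python
import numpy as np

# An example symmetric matrix; eigh is NumPy's routine for
# symmetric matrices and returns real eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, Q = np.linalg.eigh(A)   # columns of Q are orthonormal eigenvectors

# Product of eigenvalues = determinant, sum of eigenvalues = trace.
print(np.isclose(np.prod(lam), np.linalg.det(A)))   # True
print(np.isclose(np.sum(lam), np.trace(A)))         # True

# Spectral theorem: A = Q Lambda Q^T with Q^T Q = I.
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))       # True
print(np.allclose(Q.T @ Q, np.eye(2)))              # True

# Fast powers: A^k = Q Lambda^k Q^T, so only scalars get powered.
k = 5
print(np.allclose(Q @ np.diag(lam**k) @ Q.T,
                  np.linalg.matrix_power(A, k)))    # True
```

This also answers the A^k exercise: diagonalize once, then raise the eigenvalues to the kth power.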
- Interesting property: if Q is orthonormal, then Q^T = Q^{-1}.
- Find the orthonormal set of eigenvectors for the previous matrix.
- Show that any symmetric matrix can be written as the sum of projection matrices: A = lambda_1 x_1 x_1^T + ... + lambda_n x_n x_n^T. Each of these matrices is a projection along an eigenvector direction.

Positive Definite Matrices
- A symmetric matrix A is positive definite if x^T A x > 0 for every nonzero vector x; if we allow equality, we get the property of positive semidefiniteness.
- Symmetric matrices with positive eigenvalues are positive definite: if Ax = lambda x, then x^T A x = lambda x^T x > 0 when lambda > 0.
- In two dimensions, x^T A x = [x y] [a b; b c] [x; y] = a x^2 + 2 b x y + c y^2 > 0.
- Note that setting this equal to a constant gives an equation for an ellipse. What we need to determine is the orientation and the parameters of the ellipse.
- Result: given a positive definite matrix A = Q Lambda Q^T, the graph of x^T A x = 1 is an ellipse (or ellipsoid): x^T A x = [x y] Q Lambda Q^T [x; y] = lambda_1 X^2 + lambda_2 Y^2 = 1, where (X, Y) = Q^T (x, y) are the rotated coordinates.
- The half-lengths of the major and minor axes are 1/sqrt(lambda_1) and 1/sqrt(lambda_2).

Multivariate Normal Distribution
[Figure: a two-dimensional normal density.]

Multivariate Normal Distribution
- The multivariate normal distribution in d dimensions is given by
  f(x) = (1 / ((2 pi)^{d/2} |Sigma|^{1/2})) exp(-(1/2) (x - mu)^T Sigma^{-1} (x - mu)).
- mu = E[x] is a d-dimensional mean vector.
- Sigma = E[(x - mu)(x - mu)^T] is a d x d covariance matrix.
- |Sigma| is the determinant of Sigma.
- The covariance matrix in 2 dimensions is
  Sigma = [E(x_1 - mu_1)^2, E(x_1 - mu_1)(x_2 - mu_2); E(x_2 - mu_2)(x_1 - mu_1), E(x_2 - mu_2)^2] = [sigma_1^2, sigma_12; sigma_12, sigma_2^2].
- The covariance matrix is symmetric, which means that the eigenvalues are real. It also means that the eigenvectors form an orthonormal set (independent, unit length).
- So if Sigma u_i = lambda_i u_i, then u_i^T u_j = delta_ij.
- Since Sigma is symmetric, it can be written as Sigma = Q Lambda Q^{-1}, where Lambda is the eigenvalue diagonal matrix and Q is the orthonormal eigenvector matrix. It immediately follows that the inverse covariance matrix can also be written in such a form (why?).
- This implies that the inverse covariance matrix Sigma^{-1} can be expressed as a sum of d projection matrices: Sigma^{-1} = sum_{i=1}^d (1/lambda_i) u_i u_i^T.

Mahalanobis Distance
- The exponent in the multivariate normal distribution, Delta^2 = (x - mu)^T Sigma^{-1} (x - mu), is called the Mahalanobis distance.
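A minimal numerical sketch of the Mahalanobis distance, assuming NumPy (the mean and covariance below are made-up example values), also checking the projection-matrix form of Sigma^{-1} from above:

```python
import numpy as np

# Hypothetical 2-D Gaussian parameters for illustration.
mu    = np.array([0.0, 0.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])
x     = np.array([1.0, 1.0])

# Delta^2 = (x - mu)^T Sigma^{-1} (x - mu), computed directly.
d  = x - mu
D2 = d @ np.linalg.inv(Sigma) @ d

# Equivalent form via the eigendecomposition Sigma = Q Lambda Q^T:
# with y = Q^T (x - mu), Delta^2 = sum_j y_j^2 / lambda_j.
lam, Q = np.linalg.eigh(Sigma)
y = Q.T @ d
print(np.isclose(D2, np.sum(y**2 / lam)))   # True
print(D2 > 0)                               # True: Sigma is positive definite
```

With Sigma equal to the identity, the same formula reduces to the squared Euclidean distance.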
- If the covariance matrix is diagonal, Sigma = sigma^2 I, it reduces to a (scaled) Euclidean distance.
- Consider a change of coordinates from the original x to y = U(x - mu), where the rows of U are the eigenvectors of Sigma.
- Then, using the above expression for Sigma^{-1}, it can be shown that Delta^2 = sum_{j=1}^d y_j^2 / lambda_j. Notice this shows that the inverse covariance matrix is positive semidefinite.

[Figure: the Mahalanobis coordinate change in two dimensions, rotating the (x_1, x_2) axes to the principal axes (y_1, y_2) of the covariance ellipse.]

Transformed Multivariate Gaussian
- Note that in the transformed coordinates y = U(x - mu), the multivariate Gaussian can be written in the following simplified form:
  p(y) = prod_{j=1}^d (1 / sqrt(2 pi lambda_j)) exp(-y_j^2 / (2 lambda_j)).
- This is a product of d independent normal distributions with mean 0 and variance lambda_j, the jth eigenvalue.
- Thus the transformation y = U(x - mu) decorrelates the original components.

Singular Value Decomposition
- So far we have seen how to decompose symmetric matrices into a product of orthogonal matrices (the eigenvector matrix) and a diagonal matrix (the eigenvalue matrix).
- What about nonsymmetric matrices? The key insight behind SVD is to find two orthogonal basis representations U and V such that for any matrix A the following decomposition holds: A = U Sigma V^T, where U and V are orthogonal matrices, so U^T U = I and V^T V = I.
- Sigma is a diagonal matrix whose entries are called singular values, hence the meaning behind the term SVD.

Singular Value Decomposition
- We want to choose the orthogonal matrices in such a way that not only are they orthogonal, but we want the result of applying A to them to also be orthogonal.
- That is, the v_i are an orthonormal basis set (unit vectors), and moreover the Av_i are also orthogonal to each other.
- We can then construct the set u_i = Av_i / sigma_i to give us the second orthonormal basis: Av_1 = sigma_1 u_1, Av_2 = sigma_2 u_2, and so on.
- So we get the following relationships: AV = U Sigma, or U^{-1} A V = Sigma, or U^T A V = Sigma; and AV = U Sigma implies A V V^{-1} = U Sigma V^{-1}, so A = U Sigma V^T.

Orthonormal Basis in SVD
- How do we go about finding U and V?
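In practice a library routine computes U, Sigma, and V directly; a sketch assuming NumPy (the matrix below is an arbitrary nonsymmetric example):

```python
import numpy as np

# An arbitrary 3 x 2 example matrix.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

# Reduced SVD: U is 3 x 2, s holds the singular values, Vt is V^T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

print(np.allclose(U @ np.diag(s) @ Vt, A))   # True: A = U Sigma V^T
print(np.allclose(Vt @ Vt.T, np.eye(2)))     # True: V is orthonormal

# A v_i = sigma_i u_i: applying A to each right singular vector gives
# an orthogonal vector whose length is the corresponding singular value.
for i in range(2):
    print(np.allclose(A @ Vt[i], s[i] * U[:, i]))   # True
```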
- Here is the trick: we can eliminate U from the equation AV = U Sigma by premultiplying A by its transpose:
  A^T A = (U Sigma V^T)^T (U Sigma V^T) = V Sigma^T U^T U Sigma V^T = V Sigma^T Sigma V^T.
- Since A^T A is always symmetric (why?), the above expression gives us exactly the familiar spectral decomposition we have seen before, namely Q Lambda Q^T, except that now V represents the orthonormal eigenvector set of A^T A.
- In a similar fashion, we can eliminate V from the equation AV = U Sigma by postmultiplying A by its transpose:
  A A^T = (U Sigma V^T)(U Sigma V^T)^T = U Sigma V^T V Sigma^T U^T = U Sigma Sigma^T U^T.
- The matrix Sigma^T Sigma = diag(sigma_1^2, sigma_2^2, ...), whose entries are the squares of the singular values.

Examples of SVD
- Find the SVD of the following matrices: A = [2 2; 2 2].

SVD underlies Google
- Google uses SVD to accelerate finding relevant web pages. Here is a brief explanation of the general idea.
- Define a web site as an authority if many sites link to it.
- Define a web site as a hub if it links to many sites.
- We want to compute a ranking x_1, ..., x_N of authorities and y_1, ..., y_M of hubs.
- As a first pass, we can compute the ranking scores as follows: x_i^0 is the number of links pointing to i, and y_i^0 is the number of links going out of i.
- But not all links should be weighted equally. For example, links from authorities or hubs should count more. So we can revise the rankings as follows:
  x_i^1 = sum_{j links to i} y_j^0, i.e., x^1 = A^T y^0, and y_i^1 = sum_{j : i links to j} x_j^0, i.e., y^1 = A x^0.
- Here A is the adjacency matrix, where a_ij = 1 if i links to j.

SVD underlies Google
- Clearly we can keep iterating and compute x^k = (A^T A) x^{k-1} and y^k = (A A^T) y^{k-1}.
- This is basically doing an iterative SVD computation, and as k increases, the largest of the eigenvalues dominates.
- Google is doing an SVD over a matrix A of size 4 x 10^9 by 4 x 10^9.
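The iteration above can be sketched on a toy link graph (the 4-page adjacency matrix below is hypothetical, and this is the hub/authority-style update the slides describe, not Google's production algorithm):

```python
import numpy as np

# Toy adjacency matrix: a_ij = 1 if page i links to page j.
# Page 2 receives the most inlinks, so it should rank as the top authority.
A = np.array([[0, 1, 1, 1],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

x = np.ones(4)   # authority scores
y = np.ones(4)   # hub scores
for _ in range(50):
    x = A.T @ A @ x          # x^k = (A^T A) x^{k-1}
    y = A @ A.T @ y          # y^k = (A A^T) y^{k-1}
    x /= np.linalg.norm(x)   # renormalize so scores do not blow up
    y /= np.linalg.norm(y)

print(int(np.argmax(x)))     # 2: the most linked-to page wins

# The converged x is the dominant eigenvector of A^T A, i.e. the top
# right singular vector of A (up to sign): the SVD connection.
_, _, Vt = np.linalg.svd(A)
print(np.allclose(np.abs(x), np.abs(Vt[0]), atol=1e-4))   # True
```

As k grows the normalized scores stop changing, which is exactly the dominance of the largest eigenvalue mentioned above.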