
Matrix Computation (CSE 275)

Notes by: Abel Lueilwitz
Instructor: Ming-Hsuan Yang
About this Document
These 13-page class notes were uploaded by Abel Lueilwitz on Thursday, October 29, 2015. The notes belong to CSE 275 at the University of California, Merced, taught by Ming-Hsuan Yang in Fall. Since their upload, they have received 37 views. For similar materials see /class/231720/cse-275-university-of-california-merced in Computer Science and Engineering at the University of California, Merced.

Date Created: 10/29/15
CSE 275: Matrix Computation
Ming-Hsuan Yang
Electrical Engineering and Computer Science
University of California at Merced, Merced, CA 95344
http://faculty.ucmerced.edu/mhyang

Lecture 4

Overview

- Basic definitions: orthogonality, orthogonal projection, distance between subspaces, matrix inverse
- Matrix decomposition

Reading

- Chapters 2 and 3 of Matrix Computations by Gene Golub and Charles Van Loan

Orthogonality

- A set of vectors x_1, ..., x_n in R^m is orthogonal if x_i^T x_j = 0 whenever i ≠ j, and orthonormal if x_i^T x_j = δ_ij. Orthogonal vectors are maximally independent, for they point in totally different directions.
- A collection of subspaces S_1, ..., S_p in R^m is mutually orthogonal if x^T y = 0 whenever x ∈ S_i and y ∈ S_j for i ≠ j.
- The orthogonal complement of a subspace S ⊆ R^m is

      S^⊥ = { y ∈ R^m : y^T x = 0 for all x ∈ S }.

  It can be shown that ran(A)^⊥ = null(A^T).
- The vectors v_1, ..., v_k form an orthonormal basis for a subspace S ⊆ R^m if they are orthonormal and span S.

Orthogonality (cont'd)

- A matrix Q ∈ R^{m×m} is said to be orthogonal if Q^T Q = I. If Q = [q_1, ..., q_m] is orthogonal, then the q_i form an orthonormal basis for R^m.
- Theorem: If V_1 ∈ R^{n×r} has orthonormal columns, then there exists V_2 ∈ R^{n×(n−r)} such that V = [V_1 V_2] ∈ R^{n×n} is orthogonal. Note that ran(V_1)^⊥ = ran(V_2).
- The vector 2-norm is invariant under an orthogonal transformation Q, i.e.,

      ||Qx||_2^2 = x^T Q^T Q x = x^T x = ||x||_2^2.

- Likewise, the matrix 2-norm and Frobenius norm are invariant with respect to orthogonal transformations Q and Z:

      ||QAZ||_F = ||A||_F  and  ||QAZ||_2 = ||A||_2.

Singular value decomposition (SVD)

- Theorem: If A is a real m-by-n matrix, then there exist orthogonal matrices U = [u_1, ..., u_m] ∈ R^{m×m} and V = [v_1, ..., v_n] ∈ R^{n×n} such that

      U^T A V = Σ = diag(σ_1, ..., σ_p) ∈ R^{m×n},  p = min(m, n),

  where σ_1 ≥ σ_2 ≥ ... ≥ σ_p ≥ 0; equivalently, A = UΣV^T.
- The σ_i are the singular values of A, and the vectors u_i and v_i are the i-th left singular vector and the i-th right singular vector, respectively.
- It follows that AV = UΣ and A^T U = VΣ^T.

Existence of SVD

Proof: Let x ∈ R^n and y ∈ R^m be unit 2-norm vectors that satisfy Ax = σy with σ = ||A||_2.
There exist V_2 ∈ R^{n×(n−1)} and U_2 ∈ R^{m×(m−1)} so that V = [x V_2] ∈ R^{n×n} and U = [y U_2] ∈ R^{m×m} are orthogonal. It follows that

    U^T A V = [ σ   w^T ]
              [ 0    B  ]  ≡ A_1

for some w and B. Since

    || A_1 [σ; w] ||_2^2 ≥ (σ^2 + w^T w)^2,

we have ||A_1||_2^2 ≥ σ^2 + w^T w. But σ^2 = ||A||_2^2 = ||A_1||_2^2, and thus w must be 0. Applying the same argument to B, induction completes the proof.

Singular vectors

- As A = UΣV^T, by comparing the columns in the equations AV = UΣ and A^T U = VΣ^T, it is easy to show

      A v_i = σ_i u_i  and  A^T u_i = σ_i v_i,  where i = 1, ..., min(m, n).

- The σ_i are the singular values of A, and the vectors u_i and v_i are the i-th left singular vector and i-th right singular vector, respectively.
- The columns of U are eigenvectors of AA^T ∈ R^{m×m}.
- Σ is a diagonal matrix whose entries are the square roots of the eigenvalues of AA^T ∈ R^{m×m}.
- The columns of V are eigenvectors of A^T A ∈ R^{n×n}.
- It can be shown that the singular values σ_i are the square roots of the eigenvalues, i.e., σ_i = sqrt(λ_i).

Singular values

- The singular values of a matrix A are precisely the lengths of the semi-axes of the hyperellipsoid E defined by E = { Ax : ||x||_2 = 1 }.
- The semi-axes are described by the singular vectors:

      σ_max = max_{||x||_2 = 1} ||Ax||_2 = ||A||_2,
      σ_min = min_{||x||_2 = 1} ||Ax||_2 = 1 / ||A^{-1}||_2.

- The SVD reveals the structure of a matrix. If σ_1 ≥ ... ≥ σ_r > σ_{r+1} = ... = σ_p = 0, then

      rank(A) = r,
      null(A) = span{v_{r+1}, ..., v_n},
      ran(A)  = span{u_1, ..., u_r}.

Geometric interpretation of SVD

- Apply a left rotation to the unit vector x using the right singular vectors V, i.e., V^T x.
- Scale with Σ, i.e., Σ V^T x.
- Apply a right rotation using the left singular vectors U, i.e., U Σ V^T x.

- Best approximation with r singular vectors in the 2-norm. Theorem (Eckart-Young, 1936): Let A = UΣV^T and A_r = U diag(σ_1, ..., σ_r, 0, ..., 0) V^T. Then for any B of rank r,

      ||A − A_r||_2 ≤ ||A − B||_2,

  and in fact ||A − A_r||_2 = σ_{r+1}.

SVD expansion

- We can decompose A in terms of singular values and vectors:

      A = Σ_{i=1}^r σ_i u_i v_i^T = Σ_{i=1}^r σ_i (u_i ⊗ v_i),

  where ⊗ is the outer (Kronecker) product.
- The matrix 2-norm and Frobenius norm have connections to the SVD:

      ||A||_F^2 = σ_1^2 + ... + σ_p^2,  p = min(m, n),  and  ||A||_2 = σ_1.

- Closely related to eigenvalues, eigenvectors, and principal component analysis.

Applications of SVD

- Matrix algebra: pseudo-inverse, solving homogeneous linear equations, least-squares minimization, rank, null space, etc.
- Computer vision: denoising, eigenface, eigentexture,
  eigen-X, structure from motion, etc.
- Pattern recognition: principal component analysis, dimensionality reduction, multivariate Gaussian, etc.
- Applications: image/data analysis, document retrieval, etc.

Covariance matrix

- Let x ∈ R^m and X = [x_1, ..., x_n]. The covariance matrix C ∈ R^{m×m} is

      C = E[(x − E[x])(x − E[x])^T]
        ≈ (1/n) Σ_{i=1}^n (x_i − μ)(x_i − μ)^T
        = E[xx^T] − μμ^T
        = (1/n) Y Y^T,

  where μ = E[x] ≈ (1/n) Σ_{i=1}^n x_i and Y = X − μ 1^T.
- The covariance C is positive semi-definite.
- It captures the second-order statistics of x.
- The variation can be compactly modeled with principal component analysis.
- Related to the multivariate Gaussian distribution, principal component analysis, SVD, and others.
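The SVD facts above (orthogonality of U and V, the norm identities, the Eckart-Young approximation error, and the covariance/PCA connection) can be checked numerically. The following is a minimal NumPy sketch, not part of the original notes; the matrices and sizes are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 4
A = rng.standard_normal((m, n))

# Full SVD: A = U @ diag(s) @ Vt, singular values sorted in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# U and V are orthogonal; ||A||_2 = sigma_1 and ||A||_F^2 = sum of sigma_i^2.
assert np.allclose(U.T @ U, np.eye(m))
assert np.allclose(Vt @ Vt.T, np.eye(n))
assert np.isclose(np.linalg.norm(A, 2), s[0])
assert np.isclose(np.linalg.norm(A, 'fro') ** 2, np.sum(s ** 2))

# Eckart-Young: A_r = sum_{i<=r} sigma_i u_i v_i^T is the best rank-r
# approximation in the 2-norm, with error ||A - A_r||_2 = sigma_{r+1}.
r = 2
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]
assert np.isclose(np.linalg.norm(A - A_r, 2), s[r])

# Covariance/PCA connection: for centered data Y, C = (1/n) Y Y^T has the
# left singular vectors of Y as eigenvectors and sigma_i^2 / n as eigenvalues.
X = rng.standard_normal((3, 200))        # m = 3 features, n = 200 samples
Y = X - X.mean(axis=1, keepdims=True)    # center each row (subtract the mean)
Uy, sy, _ = np.linalg.svd(Y, full_matrices=False)
C = Y @ Y.T / Y.shape[1]
evals = np.sort(np.linalg.eigvalsh(C))[::-1]   # eigenvalues, descending
assert np.allclose(evals, sy ** 2 / Y.shape[1])
```

The last check is why PCA is usually computed via an SVD of the centered data matrix rather than by forming C explicitly: C = (1/n) Y Y^T = U_y diag(σ_i^2 / n) U_y^T, so the singular vectors of Y already are the principal components.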

