Numerical Linear Algebra

CS 51500, Purdue University

Class notes by Nick Rowe


This 25-page set of class notes was uploaded by Nick Rowe on Saturday, September 19, 2015. The notes belong to CS 51500 at Purdue University, taught by staff in the fall; since upload they have received 93 views.

Chapter 3: LEAST SQUARES PROBLEMS

One application is geodesy & surveying. Let $z$ denote elevation, and suppose we have the measurements

$$z_A \approx 1,\quad z_B \approx 2,\quad z_C \approx 3,\quad z_B - z_A \approx 1,\quad z_C - z_B \approx 2,\quad z_C - z_A \approx 1.$$

This is overdetermined and inconsistent:

$$\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ -1 & 1 & 0 \\ 0 & -1 & 1 \\ -1 & 0 & 1 \end{bmatrix} \begin{bmatrix} z_A \\ z_B \\ z_C \end{bmatrix} \approx \begin{bmatrix} 1 \\ 2 \\ 3 \\ 1 \\ 2 \\ 1 \end{bmatrix}.$$

Another application is fitting a curve to given data $y_i \approx f(t_i)$.

More generally, $A_{m\times n}x_n \approx b_m$, where $A$ is known exactly, $m \ge n$, and $b$ is subject to independent random errors of equal variance (which can be achieved by scaling the equations). Gauss (1821) proved that the best solution $x$ minimizes $\|b - Ax\|_2$, i.e., the sum of the squares of the residual components. Hence we might write $Ax \cong b$. Even if the 2-norm is inappropriate, it is the easiest to work with.

3.1 The Normal Equations
3.2 QR Factorization
3.3 Householder Reflections
3.4 Givens Rotations
3.5 Gram-Schmidt Orthogonalization
3.6 Singular Value Decomposition

3.1 The Normal Equations

Recall the inner product $x^Ty$ for $x, y \in \mathbf{R}^n$. What is the geometric interpretation of the inner product in $\mathbf{R}^3$? A sequence of vectors $a_1, a_2, \ldots, a_n$ in $\mathbf{R}^m$ is orthogonal if $a_i^Ta_j = 0$ whenever $i \ne j$, and orthonormal if $a_i^Ta_j = \delta_{ij}$. Two subspaces $S$, $T$ are orthogonal if $x \in S,\ y \in T \Rightarrow x^Ty = 0$. If $X = [x_1, x_2, \ldots, x_n]$, then orthonormal means $X^TX = I$.

Exercise. Define what it means for a set (rather than a multiset) of vectors to be orthogonal.

An orthogonal matrix $Q$ satisfies $Q^TQ = I$. In 2D or in 3D it represents a reflection and/or a rotation.

The problem $\min_x \|b - Ax\|_2 = \min_x \|b - x_1a_1 - \cdots - x_na_n\|_2$, where $A = [a_1, \ldots, a_n]$, asks for a linear combination of the columns of $A$ which is nearest to $b$. Here $R(A)$ denotes the column space of $A$. The best approximation is attained when the residual $r = b - Ax$ satisfies $r \perp R(A)$. Hence the best approximation $Ax$ is the orthogonal projection of $b$ onto $R(A)$, i.e., $r \perp a_j$, $j = 1, 2, \ldots, n$:

$$A^Tr = 0 \ \Rightarrow\ A^T(b - Ax) = 0 \ \Rightarrow\ A^TAx = A^Tb$$

(the normal equations). Clearly $x$ is unique if and only if the columns of $A$ are linearly independent; otherwise there are infinitely many solutions. The least squares problem is sometimes written $\min_x \|b - Ax\|_2$.
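The normal equations can be tried out on the surveying example above. The sketch below (pure Python, no libraries; the matrix and measurement values follow the example as reproduced here) forms $A^TA$ and $A^Tb$, solves the small system by Gaussian elimination, and then checks the defining property: the residual $r = b - Ax$ is orthogonal to every column of $A$.

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmat(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def solve(M, rhs):
    # Gaussian elimination with partial pivoting on a small dense system.
    n = len(M)
    aug = [list(M[i]) + [rhs[i]] for i in range(n)]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(aug[i][k]))
        aug[k], aug[p] = aug[p], aug[k]
        for i in range(k + 1, n):
            m = aug[i][k] / aug[k][k]
            for j in range(k, n + 1):
                aug[i][j] -= m * aug[k][j]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (aug[i][n] - sum(aug[i][j] * x[j] for j in range(i + 1, n))) / aug[i][i]
    return x

# Surveying system: direct elevations plus pairwise differences.
A = [[1, 0, 0], [0, 1, 0], [0, 0, 1],
     [-1, 1, 0], [0, -1, 1], [-1, 0, 1]]
b = [1, 2, 3, 1, 2, 1]

At = transpose(A)
AtA = matmat(At, A)            # normal-equations matrix A^T A
Atb = matvec(At, b)
x = solve(AtA, Atb)            # least squares elevations (z_A, z_B, z_C)

r = [bi - axi for bi, axi in zip(b, matvec(A, x))]   # residual b - Ax
orth = matvec(At, r)           # should be ~0: r is orthogonal to R(A)
```

With these values the least squares elevations come out to $z_A = 1.25$, $z_B = 1.75$, $z_C = 3$, and $A^Tr$ vanishes up to roundoff.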
analytical argument: Let rank$(A) = n$; therefore $A^TA$ is nonsingular. Let $x$ satisfy $A^Tr = 0$, where $r = b - Ax$. Then for any other value $x + w$,

$$\|b - A(x+w)\|_2^2 = \|r - Aw\|_2^2 = \|r\|_2^2 - 2w^TA^Tr + w^TA^TAw = \|r\|_2^2 + \|Aw\|_2^2.$$

Hence a solution of the normal equations is a solution of the least squares problem. The solution $x$ is a unique minimum if rank$(A) = n$, because then $\|Aw\|_2 = 0 \Rightarrow w = 0$.

the pseudoinverse: Assume rank$(A) = n$. Then $x = (A^TA)^{-1}A^Tb$. We call $(A^TA)^{-1}A^T = A^+$ the pseudoinverse, and we write $x = A^+b$. (The definition can be extended to the case rank$(A) < n$.) For a column vector $v$, what is the pseudoinverse of $v$? The product $A^+A = I_n$. What about $AA^+$? This is an orthogonal projector.

orthogonal projector: The product $\frac{vv^T}{v^Tv}x$ produces an orthogonal projection of a vector $x$ onto span$\{v\}$ because (1) $\frac{vv^T}{v^Tv}x \in \mathrm{span}\{v\}$, and (2) $x - \frac{vv^T}{v^Tv}x \perp \mathrm{span}\{v\}$. Alternatively, $\frac{vv^T}{v^Tv}x$ is the closest vector to $x$ that is some multiple of $v$. Similarly $AA^+ = A(A^TA)^{-1}A^T$, rank$(A) = n \le m$, produces an orthogonal projection onto $R(A)$, because $AA^+b \in R(A)$ and, for any $b$, $b - AA^+b \perp R(A)$ since $A^T(b - AA^+b) = 0$.

DEF. A matrix $P$ is an orthogonal projector if for any $x$, $(x - Px) \perp R(P)$; that is, $(x - Px)^TP = 0$ for all $x$ $\Leftrightarrow$ $P^TP = P$ $\Leftrightarrow$ $P^T = P$ and $P^2 = P$ (symmetry and idempotence).

If $S$ is a subspace of $\mathbf{R}^n$, then its orthogonal complement is $S^\perp = \{x \mid x^Ty = 0 \text{ for all } y \in S\}$. Every $v \in \mathbf{R}^n$ has a unique decomposition $v = x + y$ with $x \in S$, $y \in S^\perp$.

solving the normal equations: The direct approach to solving $A^TAx = A^Tb$ is to (1) form $A^TA$, (2) use Cholesky to get the factorization $A^TA = GG^T$, and (3) set $x = G^{-T}G^{-1}A^Tb$.

Example. With 4-digit round-to-even floating point arithmetic, forming $A^TA$ for a mildly ill-conditioned $A$ can yield a computed matrix that is not even positive definite. The problem is that it can be shown that $\kappa_2(A^TA) = \kappa_2(A)^2$, where $\kappa_2(A) = \|A^+\|_2\|A\|_2$. Any roundoff made in forming $A^TA$ will have a very significant effect if $A$ is ill conditioned. The remedy is either to do the entire computation in double precision or to look for another algorithm.

Review questions

1. Define a least squares solution to an overdetermined system $Ax \cong b$ using matrix notation.
2. For a least squares solution to an overdetermined system $Ax \cong b$, how should the equations be scaled?
3. Define an orthogonal sequence; an orthonormal sequence.
4. How is a
subspace represented computationally?
5. What does it mean for two subspaces to be orthogonal?
6. What is the orthogonal complement of a subspace?
7. What is the null space of a matrix?
8. Give an alternative expression for $R(A)^\perp$ which is more useful computationally.
9. Give a geometric interpretation of a linear least squares problem $Ax \cong b$.
10. Give a necessary and sufficient condition for existence of a solution to a linear least squares problem $Ax \cong b$; for existence of a unique solution.
11. What are the normal equations for a linear least squares problem $Ax \cong b$?
12. Express the linear least squares problem $Ax \cong b$ as a system of $m + n$ equations in $m + n$ unknowns, where $m$ and $n$ are the dimensions of $A$.
13. If $x$ satisfies the normal equations, show that no other vector can be a better solution to the least squares problem.
14. If $\hat y$ is the orthogonal projection of $y$ onto a subspace $S$, what two conditions does $\hat y$ satisfy?
15. What does $\frac{vv^T}{v^Tv}$ do?
16. Give a formula for the orthogonal projector onto a subspace $S$ in terms of a basis $v_1, v_2, \ldots, v_n$ for $S$.
17. What two simple conditions must a matrix $P$ satisfy for it to be an orthogonal projector?
18. What is an oblique projector?
19. What does idempotent mean?
20. Why is the explicit use of the normal equations undesirable computationally?
21. If we do use the normal equations, what method is used for the matrix factorization?

Exercises

1. What can you say about a nonsingular orthogonal projector?
2. (a) What is the orthogonal complement of $R(A)$ where
$$A = \begin{bmatrix} 1 & 0 & 1 \\ 1 & 0 & 1 \\ 1 & 1 & 1 \end{bmatrix}?$$
(b) What is the projector onto the orthogonal complement of $R(A)$? (c) What is the projector onto $R(A)$?
3. Construct the orthogonal projector onto the plane through $(0,0,0)$, $(1,1,0)$, $(1,0,1)$.
4. Assuming $u$ and $v$ are nonzero column vectors, show that $I + uv^T$ is orthogonal only if $u = -2v/(v^Tv)$.
5. Assume that $A \in \mathbf{R}^{m\times n}$ has linearly independent columns. Hence $A^TA$ has a unique Cholesky factorization. Prove that there exists a factorization $A = Q_1R_1$ where $Q_1 \in \mathbf{R}^{m\times n}$ has columns forming an orthonormal set and $R_1 \in \mathbf{R}^{n\times n}$ is an upper triangular matrix. What is the solution of the least squares problem $Ax \cong b$ in terms of $Q_1$, $R_1$, and $b$?
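The warning in Section 3.1 that $\kappa_2(A^TA) = \kappa_2(A)^2$ can be reproduced in ordinary double precision rather than 4-digit arithmetic. The sketch below uses a Läuchli-style matrix of my own choosing (not the numeric example from the notes): with $\varepsilon = 10^{-8}$, the quantity $1 + \varepsilon^2$ rounds to exactly 1, so the computed $A^TA$ is exactly singular and the Cholesky factorization breaks down on a non-positive pivot.

```python
import math

def cholesky(M):
    """Return G with M = G G^T, or None if a pivot is not positive."""
    n = len(M)
    G = [[0.0] * n for _ in range(n)]
    for j in range(n):
        d = M[j][j] - sum(G[j][k] ** 2 for k in range(j))
        if d <= 0.0:
            return None          # computed matrix is not positive definite
        G[j][j] = math.sqrt(d)
        for i in range(j + 1, n):
            G[i][j] = (M[i][j] - sum(G[i][k] * G[j][k] for k in range(j))) / G[j][j]
    return G

eps = 1.0e-8                      # kappa_2(A) ~ 1/eps, so kappa_2(A^T A) ~ 1/eps^2
A = [[1.0, 1.0], [eps, 0.0], [0.0, eps]]

# Forming A^T A in double precision: 1 + eps**2 rounds to exactly 1.0,
# so the computed matrix [[1, 1], [1, 1]] is singular.
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]

G = cholesky(AtA)                 # breakdown: returns None
```

Working directly with $A$ (via QR, as in the next section) avoids squaring the condition number.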
6. Let $\hat x$ be a least squares solution to an overdetermined system $Ax \cong b$. What is the geometric interpretation of $A\hat x$? When is $A\hat x$ unique? When is $\hat x$ unique?
7. Show that an upper triangular orthogonal matrix is diagonal. What are the diagonal elements?
8. Suppose that the matrix $\begin{bmatrix} A & B \\ 0 & C \end{bmatrix}$ is orthogonal, where $A$ and $C$ are square submatrices. Prove in a logical, deductive fashion that $B = 0$. Hint: there is no need to have a formula for the inverse of a $2 \times 2$ block upper triangular matrix, and there is no need to consider individual elements, columns, or rows of $A$, $B$, or $C$.
9. Show that $\|Qx\|_2 = \|x\|_2$ if $Q$ is orthogonal.
10. Show that if $Q$ is an $m$ by $m$ orthogonal matrix and $A$ is an $m$ by $n$ matrix, then $\|QA\|_2 = \|A\|_2$.
11. Assume $A = \begin{bmatrix} Q & q \\ 0^T & \rho \end{bmatrix}$ is an orthogonal matrix, where $Q$ is square and $\rho$ is a scalar. What can we say about $Q$, $q$, and $\rho$? The answer should be simplified.
12. (a) What is the orthogonal projector for span$\{v_1, v_2\}$, where $v_1$, $v_2$ are linearly independent real vectors? (b) How does this simplify if $v_2$ is orthogonal to $v_1$? In particular, what is the usual way of expressing an orthogonal projector in this case?

3.2 QR Factorization

The least squares problem $\min_x \|b - Ax\|_2$ is simplified if $A$ is reduced to upper triangular form by means of orthogonal transformations: $Q^TA = R$ (right triangular). Then

$$\|b - Ax\|_2 = \|Q^Tb - Q^TAx\|_2 = \|Q^Tb - Rx\|_2$$

(see Exercise 9 of Section 3.1). Partition $Q^Tb = \begin{bmatrix} c \\ d \end{bmatrix}$ and $R = \begin{bmatrix} R_1 \\ 0 \end{bmatrix}$, with $c \in \mathbf{R}^n$ and $R_1$ of order $n$. Then

$$\min_x \left\| \begin{bmatrix} c \\ d \end{bmatrix} - \begin{bmatrix} R_1 \\ 0 \end{bmatrix}x \right\|_2^2 = \min_x \left( \|c - R_1x\|_2^2 + \|d\|_2^2 \right),$$

which is obviously minimized for $x = R_1^{-1}c$, computed by back substitution. A special case is $m = n$, i.e., $A$ square. This method is numerically very stable because there is no growth in the elements of the reduced matrix: $\|R\|_2 = \|Q^TA\|_2 = \|A\|_2$. (Show this for square matrices.) It is twice as much work as Gaussian elimination: about $2n^3/3$ multiplications using Householder reflections.

Review questions

1. What is an orthogonal matrix?
2. What is the effect of multiplication by an orthogonal matrix on the 2-norm of a vector? On the 2-norm of a matrix? Prove it.
3. What is a QR factorization?
4. Show how the QR factorization of a matrix $A$ can be used to solve the linear least squares problem.
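The final step $x = R_1^{-1}c$ of the QR approach in Section 3.2 is a plain back substitution, and the discarded components $d$ give the residual norm for free. A minimal sketch (the triangular factor $R_1$, the transformed right-hand side $c$, and the leftover block $d$ are hypothetical values, not data from the text):

```python
def back_substitute(R, c):
    """Solve the upper triangular system R x = c by back substitution."""
    n = len(c)
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (c[i] - sum(R[i][j] * x[j] for j in range(i + 1, n))) / R[i][i]
    return x

# Hypothetical triangular factor and transformed right-hand side Q^T b = [c; d]:
R1 = [[2.0, 1.0, -1.0],
      [0.0, 3.0, 2.0],
      [0.0, 0.0, 4.0]]
c = [1.0, 5.0, 8.0]
d = [0.5, -0.5]                 # components no x can touch: ||d||_2 is the residual

x = back_substitute(R1, c)
residual_norm = sum(di * di for di in d) ** 0.5
```

Note that the minimizer is found without ever forming $A^TA$, which is the point of the factorization.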
Exercises

1. Consider the problem of solving an overdetermined system $Ax \cong b$ in the least squares sense. Suppose $A$ is such that it is possible to compute an accurate factorization $LU$, where $L$ is a square lower triangular matrix and $U$ is upper triangular with the same dimensions as $A$. Why would this be of little use in solving the least squares problem? That is, what is special about a QR factorization?
2. Assume that $A \in \mathbf{R}^{m\times n}$ has linearly independent columns. Hence $A^TA$ has a unique Cholesky factorization. Prove that there exists a factorization $A = Q_1R_1$ where $Q_1 \in \mathbf{R}^{m\times n}$ has columns forming an orthonormal set and $R_1 \in \mathbf{R}^{n\times n}$ is an upper triangular matrix. What is the solution of the least squares problem $Ax \cong b$ in terms of $Q_1$, $R_1$, and $b$?

3.3 Householder Reflections

Definition (Householder). An elementary matrix has the form $I$ + (rank-one matrix). It is easy to show that an elementary matrix has the form $I + uv^T$, $u \ne 0$, $v \ne 0$. An example of an elementary matrix is a Gauss transformation $M_k = I - m_ke_k^T$. An elementary matrix is efficient computationally; in particular, it is easy to invert. It is not difficult to show that an orthogonal elementary matrix has the form

$$P = I - 2\frac{vv^T}{v^Tv}.$$

$P$ is symmetric and $P^2 = I$. Exercise: prove this.

Recall that $\frac{vv^T}{v^Tv}x$ is the orthogonal projection of $x$ onto $v$. What does $P = I - 2\frac{vv^T}{v^Tv}$ do? It reflects $x$ in the hyperplane through the origin orthogonal to $v$ (a Householder reflection, $x \mapsto Px$). How should one store $P$? How should one multiply by $P$? Note that the length of $v$ is irrelevant; only its direction matters.

In practice we want to determine $v$ so that, for some given $x$, $Px = \alpha e_1$ for some $\alpha \in \mathbf{R}$. Recalling that orthogonal matrices preserve Euclidean length, $\|x\|_2 = \|Px\|_2 = |\alpha|$, so $\alpha = \pm\|x\|_2$. To get $v$ to point in the right direction, choose $v = x - Px$, where either

$$Px = +\mathrm{sign}(x_1)\|x\|_2e_1 \quad \text{(first components of } x \text{ and } Px \text{ have the same sign)}$$

or

$$Px = -\mathrm{sign}(x_1)\|x\|_2e_1 \quad \text{(first components of } x \text{ and } Px \text{ have opposite signs).}$$

Which do we choose? Note that for very acute angles between $x$
and $Px$, the direction of $v$ becomes very sensitive to perturbations in $x$ or $Px$; there is a lot of cancellation in $x - Px$ if they point in similar directions. Therefore

$$v = x + \mathrm{sign}(x_1)\|x\|_2e_1 \quad (\mathrm{sign}(0) := 1), \qquad Px = -\mathrm{sign}(x_1)\|x\|_2e_1.$$

computing $P$: LAPACK normalizes $v$ so that $v_1 = 1$, and hence uses $v_{\mathrm{new}} = v/v_1$ instead. Also, it is careful to avoid underflow/overflow in the computation of $\|x\|_2$. It computes $\beta = 2/(v^Tv)$, so $P = I - \beta vv^T$.

computing $PA$ for $A \in \mathbf{R}^{m\times n}$: We are given $v$ and $\beta$, where $P = I - \beta vv^T$. Forming $P$ and computing the product $PA$ would require $m^2n$ multiplications. However, $A - v(\beta(v^TA))$ requires $mn$ multiplications, followed by $n$ multiplications, followed by $mn$ multiplications. Hence $m^2n$ multiplications vs. about $2mn$ multiplications.

reduction to triangular form: As in Gaussian elimination, we reduce $A$ column by column.

Recursive description. The case $m \ge n = 1$ is an exercise. Consider the case $m \ge n \ge 2$. Determine an orthogonal $P$ (constructed as above) such that

$$PA = \begin{bmatrix} \alpha & a^T \\ 0 & \hat A \end{bmatrix}.$$

Then

$$A = P^T \begin{bmatrix} \alpha & a^T \\ 0 & \hat A \end{bmatrix},$$

and recursion gives $\hat A = \hat Q\hat R$, so that

$$A = QR, \qquad Q = P^T \begin{bmatrix} 1 & 0^T \\ 0 & \hat Q \end{bmatrix}, \qquad R = \begin{bmatrix} \alpha & a^T \\ 0 & \hat R \end{bmatrix}.$$

Nonrecursive description. First determine an orthogonal $P_1$ that zeros out the first column of $A$ below the diagonal; similarly $P_2$ zeros out the second column below the diagonal, and so on. With four columns, for example, $P_4P_3P_2P_1A = R$. Thus $P_4P_3P_2P_1 = Q^T$, but we do not need to form this product; rather, we compute $P_4P_3P_2P_1b$. Note: if $m = n$, then $n - 1$ reductions suffice.

Householder orthogonalization. At the beginning of the $k$th stage, rows $1$ through $k-1$ hold the computed entries $r_{ij}$ of $R$, and the working submatrix holds entries $a_{ij}$, $k \le i \le m$, $k \le j \le n$. We want to find $P_k = I - \beta_kv_kv_k^T$ such that the $k$th column is reduced. The algorithm (Householder orthogonalization):

for $k = 1, 2, \ldots, n$ do
    determine a Householder reflection $P_k = I - \beta_kv_kv_k^T$ such that
        $P_k(a_{kk}, a_{k+1,k}, \ldots, a_{mk})^T = (r_{kk}, 0, \ldots, 0)^T$
    for $j = k+1, k+2, \ldots, n$ do
        $(r_{kj}, a_{k+1,j}, \ldots, a_{mj})^T := P_k(a_{kj}, a_{k+1,j}, \ldots, a_{mj})^T$

The storage scheme is as follows: after the $k$th step, the computed rows of $R$ occupy the upper triangle, and the vectors $v_1, \ldots, v_k$ are stored in the subdiagonal part of the array.
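The reduction loop above can be sketched in a few lines of Python. This follows the text's choice $v = x + \mathrm{sign}(x_1)\|x\|_2e_1$ (so there is no cancellation in forming $v$) and applies each reflection implicitly, never forming $P_k$ or $Q$. The $4 \times 2$ test matrix is my own small example, and the sketch assumes the reduced columns are nonzero.

```python
import math

def house(x):
    # Householder vector for x: v = x + sign(x1) ||x||_2 e1, sign(0) := 1.
    # Assumes x is not the zero vector.
    norm = math.sqrt(sum(t * t for t in x))
    sign = 1.0 if x[0] >= 0.0 else -1.0
    v = list(x)
    v[0] += sign * norm
    beta = 2.0 / sum(t * t for t in v)
    return v, beta

def householder_qr(A):
    """Column-by-column reduction P_n ... P_2 P_1 A = R, keeping only the
    (v_k, beta_k) pairs instead of forming the orthogonal factor Q."""
    m, n = len(A), len(A[0])
    R = [list(row) for row in A]
    vs = []
    for k in range(n):
        v, beta = house([R[i][k] for i in range(k, m)])
        vs.append((v, beta))
        for j in range(k, n):            # apply P_k to the trailing columns:
            coef = beta * sum(v[i - k] * R[i][j] for i in range(k, m))
            for i in range(k, m):        # column := column - coef * v
                R[i][j] -= coef * v[i - k]
    return vs, R

A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
vs, R = householder_qr(A)
subdiag = [R[i][j] for j in range(2) for i in range(j + 1, 4)]   # should be ~0
```

Because each $P_k$ is orthogonal, column norms are preserved: here $|r_{11}| = \|(1,1,1,1)\|_2 = 2$ and $r_{12}^2 + r_{22}^2 = \|(1,2,3,4)\|_2^2 = 30$, which makes a convenient check.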
Review questions

1. What is an elementary matrix? Give an explicit form for an elementary matrix.
2. Give an explicit form for an elementary orthogonal matrix. What do we call such a matrix?
3. Describe geometrically the effect of multiplying a vector $x$ by a matrix $H = I - 2v(v^Tv)^{-1}v^T$.
4. In practice $H$ is constructed so that $y = Hx$, where $x$ and $y$ are given. What condition must $y$ satisfy for $H$ to exist? If all but the first element of $y$ is to vanish, what are the choices for $y$?
5. Assuming the ability to construct a Householder transformation that maps a given vector to another of the same length, give a recursive algorithm for QR factorization using Householder reflections. As always, use partitioned matrices to describe the algorithm.
6. How much storage is required for the matrix $Q$ in a QR factorization using Householder reflections?

Exercises

1. Suppose that $P_2P_1A = R$, where $P_i = I - \frac{2}{v_i^Tv_i}v_iv_i^T$ and $v_1 = (\ldots)$, $v_2 = (\ldots)$, $R = (\ldots)$. Determine the least squares solution of $Ax \cong b$, where $b^T = (\ldots)$. Do an efficient computation and show every step.
2. Determine a reflection $P = I - 2v(v^Tv)^{-1}v^T$ such that $Px$ is a multiple of $y$, for the given vectors $x$ and $y$. Of the two possibilities, which uses for $v$ the sum of two vectors separated by an angle of at most $\pi/2$?
3. Prove or disprove: a Householder reflection is positive definite.
4. Write an algorithm for overwriting $b$ with $P_n \cdots P_2P_1b$, where $P_k = I - \beta_kv_kv_k^T$ and $v_k = (0, \ldots, 0, v_{kk}, \ldots, v_{mk})^T$. Only the following array values should be referenced by your algorithm: $b_i$, $1 \le i \le m$; $v_{ik}$, $k \le i \le m$, $1 \le k \le n$; $\beta_k$, $1 \le k \le n$.
5. Let $A = (\ldots)$. Determine $\beta_1$, $\beta_2$, $v_1$, $v_2$ and a right triangular matrix $R$ such that $P_2P_1A = R$, where $P_i = I - \beta_iv_iv_i^T$. As a check, confirm that $A = P_1P_2R$.
6. Assuming $u$ and $v$ are nonzero column vectors, show that $I + uv^T$ is orthogonal only if $u = -2v/(v^Tv)$.
7. Count the number of multiplications for the Householder orthogonalization algorithm as described in this section.
8. Recall that for a Householder reflection $P = I - 2\frac{vv^T}{v^Tv}$, the vector $Px$ is the mirror reflection of $x$ in the hyperplane through the origin normal to $v$. Let $x$ and $y$ be linearly independent vectors of equal Euclidean length. By means of a pictorial argument, determine a formula for $v$ such that $Px = y$.
9. Apply Householder orthogonalization to compute the QR decomposition of the matrix
$$\begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}.$$
Normalize the Householder vectors $v_k$ so that
their first nonzero element is 1, as LAPACK does, and calculate the $\beta_k$ values. Do not actually form the matrices $P_k$ or, worse yet, the matrix $Q$.
10. The application of a Householder reduction to an $m$ by $n$ matrix $A$, $n < m$, $P_n \cdots P_2P_1A = R$, yields an upper triangular matrix $R$. This can be used to create a reduced QR factorization $A = \hat Q\hat R$, where $\hat Q$ is $m$ by $n$ and $\hat R$ is a square upper triangular matrix. What is $\hat R$ in terms of $P_1, P_2, \ldots, P_n$ and $R$? What is $\hat Q$ in terms of $P_1, P_2, \ldots, P_n$ and $R$?
11. Given on the graph below is a vector $x$. Construct the vector $x + \mathrm{sign}(x_1)\|x\|_2e_1$ on this graph. Also construct on the graph a vector $v$, in terms of $x$, such that $Px = -\mathrm{sign}(x_1)\|x\|_2e_1$, where $P = I - 2\frac{vv^T}{v^Tv}$.
12. Simplify $\|a - \|a\|_2e_1\|_2^2$, where $a \in \mathbf{R}^m$ and $e_1 = (1, 0, \ldots, 0)^T$.

3.4 Givens Rotations

The goal is to use one row of a matrix, containing an element $a$, to zero out an element $b$ in another row by recombining the two rows. By using

$$\begin{bmatrix} c & s \\ -s & c \end{bmatrix}, \qquad c^2 + s^2 = 1,$$

we can accomplish this with an orthogonal matrix. How should we choose $c$ and $s$? So that $-sa + cb = 0$; that is,

$$c = \frac{a}{\sqrt{a^2 + b^2}}, \qquad s = \frac{b}{\sqrt{a^2 + b^2}}.$$

The cost is about $6(n - k)$ operations per eliminated element vs. $4(n - k)$ operations per eliminated element for a Householder reflection, where $k$ is the column index of the eliminated element. Note: the matrix

$$\begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix}$$

denotes a 2-dimensional clockwise rotation by an angle of $\theta$ radians.

Review questions

1. What is the form of a Givens rotation? What is its geometric interpretation?
2. Show how a typical Givens rotation is determined in the course of computing a QR factorization.

Exercises

1. Apply the first 2 out of 5 Givens rotations in the QR decomposition of the matrix
$$\begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}.$$
2. Calculate a Givens rotation that zeros out the $(4,1)$ entry of the matrix $(\ldots)$ without changing the 1st and 3rd rows.
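The $2 \times 2$ recombination of Section 3.4 takes only a few lines; the two rows below are hypothetical values of my own. `math.hypot` is used for $\sqrt{a^2 + b^2}$ because it guards against overflow, matching the kind of care the notes attribute to library code:

```python
import math

def givens(a, b):
    """Return (c, s) so that [[c, s], [-s, c]] applied to (a, b) zeros b."""
    h = math.hypot(a, b)
    return (1.0, 0.0) if h == 0.0 else (a / h, b / h)

def apply_givens(c, s, row_i, row_k):
    """Recombine two full rows with the rotation [[c, s], [-s, c]]."""
    new_i = [c * x + s * y for x, y in zip(row_i, row_k)]
    new_k = [-s * x + c * y for x, y in zip(row_i, row_k)]
    return new_i, new_k

# Zero out the leading entry of r1 using r0 (hypothetical 2 x 3 row pair):
r0 = [3.0, 1.0, 2.0]
r1 = [4.0, -1.0, 0.5]
c, s = givens(r0[0], r1[0])
r0, r1 = apply_givens(c, s, r0, r1)   # r1[0] becomes 0, r0[0] becomes 5
```

Only the two rows involved change, which is exactly why Givens rotations suit the selective elimination asked for in Exercise 2.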
3.5 Gram-Schmidt Orthogonalization

Recall that if $P$ is an orthogonal projector for a subspace $S$, then $y - Py \perp S$. The classical Gram-Schmidt process produces an orthonormal set $q_1, q_2, \ldots, q_n$ from a linearly independent set $a_1, a_2, \ldots, a_n$; in particular, $q_1, q_2, \ldots, q_k$ are formed from $a_1, a_2, \ldots, a_k$, $k = 1, 2, \ldots, n$:

$$\tilde q_1 = a_1, \qquad q_1 = \tilde q_1 / \|\tilde q_1\|_2,$$
$$\tilde q_2 = a_2 - q_1q_1^Ta_2, \qquad q_2 = \tilde q_2 / \|\tilde q_2\|_2,$$
$$\tilde q_3 = a_3 - q_1q_1^Ta_3 - q_2q_2^Ta_3, \qquad q_3 = \tilde q_3 / \|\tilde q_3\|_2.$$

More generally,

$$\tilde q_j = a_j - \sum_{k=1}^{j-1} q_kq_k^Ta_j, \qquad q_j = \tilde q_j / \|\tilde q_j\|_2.$$

Gram-Schmidt orthogonalization computes a reduced QR factorization: with $r_{kj} = q_k^Ta_j$ for $k < j$ and $r_{jj} = \|\tilde q_j\|_2$,

$$a_j = \tilde q_j + \sum_{k=1}^{j-1} q_kr_{kj} = \sum_{k=1}^{j} q_kr_{kj},$$

so

$$[a_1\ a_2\ \cdots\ a_n] = [q_1\ q_2\ \cdots\ q_n] \begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ & r_{22} & \cdots & r_{2n} \\ & & \ddots & \vdots \\ & & & r_{nn} \end{bmatrix}, \qquad A = \hat Q\hat R.$$

This is a reduced QR factorization. The algorithm is

for $k = 1, 2, \ldots, n$ do
    $v_k := a_k$
    for $j = 1, 2, \ldots, k-1$ do
        $r_{jk} := q_j^Ta_k$;  $v_k := v_k - r_{jk}q_j$
    $r_{kk} := \|v_k\|_2$;  $q_k := v_k / r_{kk}$

The classical Gram-Schmidt process is not very satisfactory numerically. A mathematically equivalent but numerically superior alternative is given by the modified Gram-Schmidt process. Instead of

$$v_3 = a_3 - q_1q_1^Ta_3 - q_2q_2^Ta_3,$$

use

$$v_3 = (I - q_2q_2^T)(I - q_1q_1^T)a_3.$$

Computationally:

$$v_3 := a_3; \qquad v_3 := v_3 - q_1(q_1^Tv_3); \qquad v_3 := v_3 - q_2(q_2^Tv_3).$$

An additional change is also desirable: compute the elements of $R$ row by row instead of column by column. A row-oriented version of MGS is preferable because it lends itself readily to the use of column pivoting to deal with possible rank deficiency, whereas the column-oriented version does not. Both versions compute $Q$ column by column, but after computing each new column $q_k$, the row-oriented version immediately orthogonalizes the remaining vectors $v_{k+1}, \ldots, v_n$ against $q_k$:

for $k = 1, 2, \ldots, n$ do $v_k := a_k$
for $k = 1, 2, \ldots, n$ do
    $r_{kk} := \|v_k\|_2$;  $q_k := v_k / r_{kk}$
    for $j = k+1, k+2, \ldots, n$ do
        $r_{kj} := q_k^Tv_j$;  $v_j := v_j - q_kr_{kj}$

Review questions

1. Let $q_1, q_2, \ldots, q_n$ be a sequence obtained from a linearly independent sequence $a_1, a_2, \ldots, a_n$ by Gram-Schmidt orthogonalization. The sequence $q_1, q_2, \ldots, q_n$ is uniquely determined by what two properties? One involves relationships among the terms of the generated sequence, and the other involves relationships between the two sequences. Express the second of these properties as a matrix equation.
2. Reproduce the algorithm for Gram-Schmidt orthogonalization.
3. What is the "defining" idea of the row-oriented version of MGS?
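The numerical superiority of MGS claimed above can be observed directly. In the sketch below, a Läuchli-style set of nearly dependent vectors (my choice of illustration) is run through both processes, and the worst off-diagonal entry of $Q^TQ$ is measured: classical Gram-Schmidt loses orthogonality almost completely, while the modified process stays nearly orthogonal.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cgs(A):
    """Classical Gram-Schmidt: projections use the ORIGINAL vector a."""
    Q = []
    for a in A:
        v = list(a)
        for q in Q:
            c = dot(q, a)
            v = [vi - c * qi for vi, qi in zip(v, q)]
        nv = math.sqrt(dot(v, v))
        Q.append([vi / nv for vi in v])
    return Q

def mgs(A):
    """Modified Gram-Schmidt: projections use the UPDATED vector v."""
    Q = []
    for a in A:
        v = list(a)
        for q in Q:
            c = dot(q, v)
            v = [vi - c * qi for vi, qi in zip(v, q)]
        nv = math.sqrt(dot(v, v))
        Q.append([vi / nv for vi in v])
    return Q

eps = 1.0e-8                                # nearly dependent input vectors
A = [[1.0, eps, 0.0, 0.0],
     [1.0, 0.0, eps, 0.0],
     [1.0, 0.0, 0.0, eps]]

def loss(Q):
    # worst off-diagonal |q_i^T q_j|; zero for perfectly orthonormal Q
    return max(abs(dot(Q[i], Q[j])) for i in range(3) for j in range(i))

cgs_loss, mgs_loss = loss(cgs(A)), loss(mgs(A))
```

For this input, `cgs_loss` comes out around 0.5 (the computed $q_2$ and $q_3$ are far from orthogonal), while `mgs_loss` stays many orders of magnitude smaller.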
Exercises

1. (a) Give a high-level recursive algorithm for the reduced QR factorization $A = \hat Q\hat R$ of an $m$ by $n$ matrix by working in terms of a $1, n-1$ partitioning of the columns of $A$ and $\hat Q$ and a $(1, n-1)$ by $(1, n-1)$ partitioning of $\hat R$. You will have to use the orthogonality of $\hat Q$, not to mention the triangularity of $\hat R$. (b) Give the nonrecursive equivalent of part (a). (c) How is this algorithm related to the classical or modified Gram-Schmidt process?
2. Apply the first 2 out of 5 Givens rotations in the QR decomposition of the matrix
$$\begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 4 \end{bmatrix}.$$

3.6 Singular Value Decomposition

THEOREM (SVD). Let $A \in \mathbf{R}^{m\times n}$, $m \ge n$. There exist orthogonal matrices $U \in \mathbf{R}^{m\times m}$, $V \in \mathbf{R}^{n\times n}$ such that

$$A = U\Sigma V^T, \qquad \Sigma = \begin{bmatrix} \sigma_1 & & \\ & \ddots & \\ & & \sigma_n \\ & 0 & \end{bmatrix} \in \mathbf{R}^{m\times n},$$

where the singular values satisfy $\sigma_1 \ge \sigma_2 \ge \cdots \ge \sigma_n \ge 0$. If $n > m$, one can use the SVD for $A^T$ to get an SVD for $A$.

The SVD provides a geometric interpretation for $A$: $V^T$ performs rotations and/or reflections, $\Sigma$ performs differential scaling along the coordinate axes, and $U$ performs rotations and/or reflections. The unit sphere $B$ is mapped successively to $V^TB$, then to $\Sigma V^TB$ (an ellipsoid with semiaxes $\sigma_1, \sigma_2, \ldots$), then to $U\Sigma V^TB$. Hence $\|A\|_2 = \sigma_1$, and the condition number is $\kappa_2(A) = \sigma_1/\sigma_n$. Computing the SVD is discussed in Section 4.7.

3.6.1 Application to linear least squares problem

$$\|b - Ax\|_2 = \|b - U\Sigma V^Tx\|_2 = \|U^Tb - \Sigma V^Tx\|_2.$$

Let $c = U^Tb$, $y = V^Tx$. The problem is now to minimize

$$\|c - \Sigma y\|_2^2 = (c_1 - \sigma_1y_1)^2 + \cdots + (c_n - \sigma_ny_n)^2 + c_{n+1}^2 + \cdots + c_m^2,$$

with solution $y_i = c_i/\sigma_i$. What if $\sigma_{r+1} = \cdots = \sigma_n = 0$ but $\sigma_r > 0$ (rank deficient)? Then $y_{r+1}, \ldots, y_n$ can be anything, and we have a family of solutions $x = Vy$. Suppose we ask for the $x$ having minimum $\|x\|_2 = \|Vy\|_2 = \|y\|_2$. For the smallest, choose

$$y_i = c_i/\sigma_i, \quad i = 1, 2, \ldots, r; \qquad y_i = 0, \quad i = r+1, r+2, \ldots, n.$$

That is, $y = \Sigma^+c$, where

$$\Sigma^+ = \begin{bmatrix} \Sigma_r^{-1} & 0 \\ 0 & 0 \end{bmatrix}.$$

Moore-Penrose pseudoinverse: $x = A^+b$, where $A^+ = V\Sigma^+U^T$.

3.6.2 Data Compression

THEOREM. Let $A \in \mathbf{R}^{m\times n}$, $n \le m$, and let $B$ be a matrix of rank $k \le n$ for which $\|B - A\|_F$ is smallest. Then

$$B = \sum_{i=1}^{k} \sigma_iu_iv_i^T,$$

where $A = U\Sigma V^T$ is the SVD of $A$. The SVD is a good way to compute the rank of a matrix.
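Once the problem has been rotated into diagonal form, the minimum-norm recipe of Section 3.6.1 is essentially one line. A sketch with hypothetical values of the singular values and of $c = U^Tb$ (rank $r = 2$ out of $n = 3$, plus one extra component of $c$ beyond row $n$ that no choice of $y$ can reduce):

```python
def min_norm_y(sigma, c, tol=1e-12):
    """Minimum 2-norm minimizer of ||c - Sigma y||_2: this is y = Sigma^+ c,
    i.e. y_i = c_i / sigma_i on the nonzero sigmas and y_i = 0 on the
    (rank-deficient) zero sigmas."""
    return [ci / si if si > tol else 0.0 for si, ci in zip(sigma, c)]

# Hypothetical diagonalized data: sigma_3 = 0 makes the problem rank deficient.
sigma = [4.0, 2.0, 0.0]
c = [8.0, 3.0, 5.0, 1.0]           # c_4 lies beyond row n of Sigma

y = min_norm_y(sigma, c[:3])       # minimum-norm choice: [2.0, 1.5, 0.0]
residual_sq = c[2] ** 2 + c[3] ** 2  # unmatched components: sigma_3 = 0 and c_4
```

Mapping back, $x = Vy$ is the minimum-norm least squares solution $A^+b$; the residual here is fixed at $\sqrt{5^2 + 1^2}$ no matter what $y$ is chosen.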
Review questions

1. Under what conditions does a matrix $A \in \mathbf{R}^{m\times n}$, $m \ge n$, have a singular value decomposition?
2. Specify precisely the form of a singular value decomposition of a matrix $A \in \mathbf{R}^{m\times n}$ for $n < m$, for $n = m$, and for $n > m$.
3. What is the 2-norm of a matrix in terms of its SVD?
4. What is the 2-norm condition number of a nonsingular matrix in terms of its SVD?
5. What additional requirement makes the solution of any linear least squares problem unique?
6. What is the Moore-Penrose pseudoinverse of a rank-deficient matrix $A \in \mathbf{R}^{m\times n}$, $n < m$?
7. What is the solution of a linear least squares problem with a rank-deficient coefficient matrix?
8. What is the matrix $B$ of rank $k \le n$ for which $\|B - A\|_F$ is smallest, where $A \in \mathbf{R}^{m\times n}$, $n < m$?

