- 5.3.1: Find the least squares solution of each of the following systems: (...
- 5.3.2: For each of your solutions x in Exercise 1, (a) determine the proje...
- 5.3.3: For each of the following systems Ax = b, find all least squares so...
- 5.3.4: For each of the systems in Exercise 3, determine the projection p o...
- 5.3.5: (a) Find the best least squares fit by a linear function to the dat...
- 5.3.6: Find the best least squares fit to the data in Exercise 5 by a quad...
- 5.3.7: Given a collection of points (x1, y1), (x2, y2),. . . , (xn, yn), l...
- 5.3.8: The point (x, y) is the center of mass for the collection of points...
- 5.3.9: Let A be an m × n matrix of rank n and let P = A(A^T A)^(-1)A^T. (a) Show ...
- 5.3.10: Let A be an 8 × 5 matrix of rank 3, and let b be a nonzero vector in ...
- 5.3.11: Let P = A(A^T A)^(-1)A^T, where A is an m × n matrix of rank n. (a) Show th...
- 5.3.12: Show that if [ A  I ; O  A^T ] [ x ; r ] = [ b ; 0 ], then x is a least squares solution ...
- 5.3.13: Let A ∈ R^(m×n) and let x be a solution of the least squares problem Ax =...
- 5.3.14: Find the equation of the circle that gives the best least squares c...
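The projection matrix P = A(A^T A)^(-1)A^T of Exercises 9-11 is easy to check numerically. A minimal sketch in NumPy (the matrix A and vector b below are made-up illustrations, not data from the exercises):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])            # rank 2, so A^T A is invertible
P = A @ np.linalg.inv(A.T @ A) @ A.T  # projects onto the column space of A

assert np.allclose(P @ P, P)          # P^2 = P: projecting twice changes nothing
assert np.allclose(P.T, P)            # P is symmetric

b = np.array([0.0, 1.0, 3.0])
xhat, *_ = np.linalg.lstsq(A, b, rcond=None)  # least squares solution
assert np.allclose(P @ b, A @ xhat)   # the projection is p = A xhat
```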
Solutions for Chapter 5.3: Least Squares Problems
Full solutions for Linear Algebra with Applications | 8th Edition
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
Cofactor Cij.
Remove row i and column j; multiply the determinant by (-1)^(i+j).
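The rule above can be sketched as a recursive determinant, expanding along row i = 0 (NumPy is an assumed dependency):

```python
import numpy as np

def det_cofactor(A):
    """Determinant by cofactor expansion along row 0."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # remove row 0 and column j, multiply by the sign (-1)^(0+j)
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert abs(det_cofactor(A) - np.linalg.det(A)) < 1e-9
```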
Complete solution x = xp + xn to Ax = b.
(Particular xp) + (xn in nullspace).
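The split x = xp + xn can be demonstrated with an underdetermined system (a sketch: the particular solution comes from `lstsq`, the nullspace basis from the SVD):

```python
import numpy as np

A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 3.0]])       # rank 2, nullspace has dimension 1
b = np.array([3.0, 6.0])

x_p, *_ = np.linalg.lstsq(A, b, rcond=None)   # a particular solution
_, _, Vt = np.linalg.svd(A)
x_n = Vt[2]                                    # basis vector for the nullspace
assert np.allclose(A @ x_n, 0)

# any particular solution plus any nullspace vector still solves Ax = b
x = x_p + 5.0 * x_n
assert np.allclose(A @ x, b)
```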
Diagonalization Λ = S^(-1)AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = SΛ^k S^(-1).
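A numerical check of A = SΛS^(-1) and the power formula A^k = SΛ^k S^(-1), using an arbitrary 2 × 2 example (NumPy sketch):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])            # eigenvalues 5 and 2
lam, S = np.linalg.eig(A)             # Lambda entries and eigenvector matrix S
Lam = np.diag(lam)
Sinv = np.linalg.inv(S)

assert np.allclose(A, S @ Lam @ Sinv)                                # A = S Lambda S^-1
assert np.allclose(np.linalg.matrix_power(A, 3), S @ Lam**3 @ Sinv)  # A^3 = S Lambda^3 S^-1
```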
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A - λI) = 0.
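Both conditions can be verified directly for a small symmetric example (NumPy sketch; `eig` returns eigenvectors as columns):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, X = np.linalg.eig(A)             # eigenvalues 3 and 1 here
for l, x in zip(lam, X.T):            # eigenvectors are the columns of X
    assert np.allclose(A @ x, l * x)                     # Ax = lambda x
    assert abs(np.linalg.det(A - l * np.eye(2))) < 1e-9  # det(A - lambda I) = 0
```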
Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative Ae^(At); e^(At) u(0) solves u' = Au.
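The series definition can be evaluated directly by truncation. A sketch using the skew matrix A = [0 1; -1 0], for which e^(At) is a known rotation, as a check:

```python
import numpy as np

def expm_series(M, terms=40):
    """e^M from the series I + M + M^2/2! + ... (truncated)."""
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k           # accumulates M^k / k!
        E = E + term
    return E

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])           # u' = Au rotates the plane
t = 0.5
E = expm_series(A * t)
# since A^2 = -I, e^{At} = I cos t + A sin t, a rotation matrix
assert np.allclose(E, [[np.cos(t), np.sin(t)], [-np.sin(t), np.cos(t)]])
```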
Fibonacci numbers 0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn-1 + Fn-2 = (λ1^n - λ2^n)/(λ1 - λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
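The closed form and the eigenvalue claim can both be checked numerically (NumPy sketch):

```python
import numpy as np

l1 = (1 + np.sqrt(5)) / 2             # growth rate: largest eigenvalue of [[1, 1], [1, 0]]
l2 = (1 - np.sqrt(5)) / 2

fib = [0, 1]
for _ in range(18):
    fib.append(fib[-1] + fib[-2])
for n, F in enumerate(fib):
    # closed form (l1^n - l2^n)/(l1 - l2) matches the recursion
    assert round((l1**n - l2**n) / (l1 - l2)) == F

lam = np.linalg.eigvals(np.array([[1.0, 1.0], [1.0, 0.0]]))
assert np.isclose(max(lam), l1)       # l1 is an eigenvalue of the Fibonacci matrix
```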
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
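Independence of columns is equivalent to full column rank, which is easy to test (NumPy sketch with made-up vectors):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 0.0])
A = np.column_stack([v1, v2, v3])
# independent columns <=> Ax = 0 only for x = 0 <=> rank equals number of columns
assert np.linalg.matrix_rank(A) == 3

# appending v1 + v2 as a fourth column makes the set dependent: rank stays 3
B = np.column_stack([v1, v2, v3, v1 + v2])
assert np.linalg.matrix_rank(B) == 3
```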
Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^(-1) b by xj with residual b - Axj in this subspace. A good basis for Kj requires only multiplication by A at each step.
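A bare sketch of the idea: build the Krylov basis with repeated multiplications by A, then minimize the residual b - Ax over that subspace (in this 2 × 2 example K2 is all of R^2, so the approximation is exact):

```python
import numpy as np

def krylov_basis(A, b, j):
    """Columns b, Ab, ..., A^(j-1) b; each step needs only one multiplication by A."""
    cols = [b]
    for _ in range(j - 1):
        cols.append(A @ cols[-1])
    return np.column_stack(cols)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])
K = krylov_basis(A, b, 2)

# least squares over the subspace: minimize |b - A(Kc)|
c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
x = K @ c
assert np.allclose(A @ x, b)          # exact here, since K_2 = R^2
```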
Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives Σ ci vi = 0.
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution x̂ to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax̂) = 0.
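This is exactly the computation behind the line-fitting exercises above. A sketch with made-up data points, fitting c0 + c1·x (NumPy assumed):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 4.0, 4.0])
A = np.column_stack([np.ones_like(x), x])   # independent columns, rank 2

xhat = np.linalg.solve(A.T @ A, A.T @ y)    # normal equation A^T A xhat = A^T y
assert np.allclose(xhat, [1.1, 1.1])        # best line is 1.1 + 1.1 x for this data

# the residual is orthogonal to every column of A
assert np.allclose(A.T @ (y - A @ xhat), 0)
```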
Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
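Both Q^T Q = I and the expansion v = Σ (v^T qj) qj can be checked with a random orthonormal basis from QR (NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))   # 4 orthonormal columns
assert np.allclose(Q.T @ Q, np.eye(4))             # Q^T Q = I

v = np.array([1.0, 2.0, 3.0, 4.0])
# expansion in the orthonormal basis: v = sum over j of (v^T qj) qj
recon = sum((v @ q) * q for q in Q.T)
assert np.allclose(recon, v)
```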
Reflection matrix (Householder) Q = I - 2uu^T.
Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^(-1) = Q.
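All three properties are one-liners to verify (NumPy sketch with an arbitrary unit vector):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)                  # u must be a unit vector
Q = np.eye(3) - 2 * np.outer(u, u)         # Householder reflection

assert np.allclose(Q @ u, -u)              # u reflects to -u
x = np.array([2.0, -1.0, 0.0])             # u^T x = 0: x lies in the mirror plane
assert np.allclose(Q @ x, x)               # mirror-plane vectors are unchanged
assert np.allclose(Q.T, Q)                 # Q^T = Q
assert np.allclose(Q @ Q, np.eye(3))       # so Q^-1 = Q as well
```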
Skew-symmetric matrix K.
The transpose is -K, since Kij = -Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
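A quick check of skew-symmetry and the pure-imaginary eigenvalues on an arbitrary 3 × 3 example (NumPy sketch):

```python
import numpy as np

K = np.array([[0.0, 2.0, -1.0],
              [-2.0, 0.0, 3.0],
              [1.0, -3.0, 0.0]])
assert np.allclose(K.T, -K)                        # Kij = -Kji
assert np.allclose(np.linalg.eigvals(K).real, 0)   # eigenvalues are pure imaginary
```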
Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.
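The sign statement (the law of inertia) can be illustrated with a hand-rolled LDL^T on an indefinite matrix. A bare sketch without pivoting, so it assumes the leading pivots are nonzero:

```python
import numpy as np

def ldl(A):
    """A = L D L^T for a symmetric A (no pivoting; a bare sketch)."""
    n = A.shape[0]
    L = np.eye(n)
    d = np.zeros(n)
    for k in range(n):
        d[k] = A[k, k] - L[k, :k] ** 2 @ d[:k]
        for i in range(k + 1, n):
            L[i, k] = (A[i, k] - L[i, :k] * L[k, :k] @ d[:k]) / d[k]
    return L, d

A = np.array([[2.0, 1.0],
              [1.0, -1.0]])            # indefinite: one positive, one negative eigenvalue
L, d = ldl(A)
assert np.allclose(L @ np.diag(d) @ L.T, A)

lam = np.linalg.eigvalsh(A)
# same signs in Lambda as in D
assert (np.sign(np.sort(lam)) == np.sign(np.sort(d))).all()
```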
Symmetric matrix A.
The transpose is A^T = A, and aij = aji. A^(-1) is also symmetric.
Unitary matrix U^H = (conjugate of U)^T = U^(-1).
Orthonormal columns (complex analog of Q).
Wavelets wjk(t).
Stretch and shift the time axis to create wjk(t) = w00(2^j t - k).
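The stretch-and-shift rule can be sketched directly; the Haar wavelet is used here as an assumed concrete choice of the mother wavelet w00 (the glossary entry does not fix one):

```python
import numpy as np

def w00(t):
    """Mother wavelet; the Haar wavelet is an assumed concrete choice."""
    t = np.asarray(t, dtype=float)
    return np.where((0 <= t) & (t < 0.5), 1.0,
                    np.where((0.5 <= t) & (t < 1.0), -1.0, 0.0))

def w(j, k, t):
    """Stretch by 2^j and shift by k: wjk(t) = w00(2^j t - k)."""
    return w00(2.0**j * np.asarray(t, dtype=float) - k)

t = np.array([0.1, 0.3])
assert np.allclose(w(1, 0, t), [1.0, -1.0])   # 2t lands in [0, 0.5) then [0.5, 1)
```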