 7.7.1: Find the solution x to the least squares problem, given that A = QR...
7.7.2: Let A = (D E) = (d1 d2 ... dn e1 e2 ... en) and b = (b1, b2, ..., b2n)^T. U...
7.7.3: Let A = [1 2; 1 3; 1 2; 1 1], b = (3, 10, 3, 6)^T. (a) Use Householder transfor...
7.7.4: Let A = [1 1; ε 0; 0 ε], where ε is a small scalar. (a) Determine the si...
 7.7.5: Show that the pseudoinverse A+ satisfies the four Penrose conditions.
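Before attempting the algebraic proof, the four Penrose conditions can be spot-checked numerically with NumPy's `pinv`; the 3 × 2 matrix below is an arbitrary illustrative choice, not taken from the exercise.

```python
import numpy as np

# Arbitrary 3x2 example matrix (illustrative only, not from the exercise).
A = np.array([[1.0, 2.0],
              [1.0, 3.0],
              [1.0, 2.0]])
Ap = np.linalg.pinv(A)  # A+

assert np.allclose(A @ Ap @ A, A)        # 1. A A+ A = A
assert np.allclose(Ap @ A @ Ap, Ap)      # 2. A+ A A+ = A+
assert np.allclose((A @ Ap).T, A @ Ap)   # 3. (A A+)^T = A A+
assert np.allclose((Ap @ A).T, Ap @ A)   # 4. (A+ A)^T = A+ A
```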
 7.7.6: Let B be any matrix that satisfies Penrose conditions 1 and 3, and ...
7.7.7: If x ∈ R^m, we can think of x as an m × 1 matrix. If x ≠ 0, we can then...
7.7.8: Show that if A is an m × n matrix of rank n, then A+ = (A^T A)^(-1) A^T.
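For a full-column-rank example, the formula (A^T A)^(-1) A^T can be compared against NumPy's `pinv`; the 3 × 2 matrix below is an arbitrary rank-2 choice, not from the text.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                       # m = 3, n = 2, rank 2 = n
normal_eq_inverse = np.linalg.inv(A.T @ A) @ A.T  # (A^T A)^(-1) A^T
assert np.allclose(normal_eq_inverse, np.linalg.pinv(A))
```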
7.7.9: Let A be an m × n matrix and let b ∈ R^m. Show that b ∈ R(A) if and only i...
7.7.10: Let A be an m × n matrix with singular value decomposition UΣV^T, an...
7.7.11: Let A = [1 1; 1 1; 0 0]. Determine A+ and verify that A and A+ satisfy t...
7.7.12: Let A = [1 2; 1 2] and b = (6, 4)^T. (a) Compute the singular value decompos...
7.7.13: Show each of the following: (a) (A+)+ = A (b) (AA+)^2 = AA+ (c) (...
7.7.14: Let A1 = UΣ1V^T and A2 = UΣ2V^T, where Σ1 = diag(σ1, ..., σr1, 0, ..., 0) a...
7.7.15: Let A = XY^T, where X is an m × r matrix, Y^T is an r × n matrix, and X...
Solutions for Chapter 7.7: Least Squares Problems
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
This textbook survival guide covers the book's chapters and their solutions, and was created for Linear Algebra with Applications, 8th edition (ISBN: 9780136009290). Chapter 7.7: Least Squares Problems includes 15 full step-by-step solutions, which more than 5000 students have viewed.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
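For a 2 × 2 matrix the characteristic polynomial is p(λ) = λ^2 − (tr A)λ + det A, so the theorem is easy to spot-check numerically; the example matrix below is an arbitrary choice.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])        # arbitrary 2x2 example
# p(A) = A^2 - (tr A) A + (det A) I should be the zero matrix
pA = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
assert np.allclose(pA, np.zeros((2, 2)))
```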

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ··· + cnλ^(n−1) − λ^n).
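A quick numerical check, with coefficients c1, c2, c3 chosen arbitrarily: every eigenvalue of the companion matrix should be a root of c1 + c2λ + c3λ^2 − λ^3.

```python
import numpy as np

c = [2.0, -3.0, 1.0]               # hypothetical coefficients c1, c2, c3
n = len(c)
A = np.zeros((n, n))
A[:-1, 1:] = np.eye(n - 1)         # n-1 ones just above the main diagonal
A[-1, :] = c                       # c1, ..., cn in row n

# Every eigenvalue lam of A is a root of c1 + c2*lam + c3*lam^2 - lam^3.
for lam in np.linalg.eigvals(A):
    assert abs(c[0] + c[1]*lam + c[2]*lam**2 - lam**3) < 1e-8
```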

Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the xi are independent.

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1)AS = Λ = eigenvalue matrix.
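With two different eigenvalues, diagonalization is automatic; a sketch with an arbitrary 2 × 2 example, using NumPy's `eig` (which returns the eigenvectors in the columns of S):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2, so diagonalizable
lam, S = np.linalg.eig(A)           # eigenvectors in the columns of S
Lam = np.linalg.inv(S) @ A @ S      # S^(-1) A S = Lambda
assert np.allclose(Lam, np.diag(lam))
```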

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
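A minimal sketch of elimination without row exchanges, storing the multipliers ℓij in L so that A = LU; the 3 × 3 example matrix is an arbitrary choice that needs no pivoting.

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])   # arbitrary example, nonzero pivots
n = len(A)
U = A.copy()
L = np.eye(n)
for j in range(n):
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]   # multiplier l_ij
        U[i, :] -= L[i, j] * U[j, :]  # subtract l_ij times row j

assert np.allclose(L @ U, A)          # A = LU
assert np.allclose(U, np.triu(U))     # U is upper triangular
```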

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fnx and Fn^(-1)c can be computed with nℓ/2 multiplications. Revolutionary.
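The fast transform and the dense Fourier matrix give the same result; a check of NumPy's `fft` against an explicitly built Fn (sign convention e^(-2πijk/n), matching NumPy's):

```python
import numpy as np

n = 8
x = np.arange(n, dtype=float)
k, j = np.meshgrid(np.arange(n), np.arange(n))
F = np.exp(-2j * np.pi * j * k / n)       # dense n x n Fourier matrix F_n
assert np.allclose(F @ x, np.fft.fft(x))  # O(n^2) product = O(n log n) FFT
```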

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Hilbert matrix hilb(n).
Entries Hij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
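Both claims, positive definiteness and ill-conditioning, already show up at n = 6:

```python
import numpy as np

n = 6
i, j = np.meshgrid(np.arange(1, n + 1), np.arange(1, n + 1))
H = 1.0 / (i + j - 1)              # H_ij = 1/(i + j - 1), i.e. hilb(6)
eigs = np.linalg.eigvalsh(H)
assert eigs.min() > 0              # positive definite
assert np.linalg.cond(H) > 1e6     # but badly ill-conditioned
```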

Lucas numbers.
Ln = 2, 1, 3, 4, ... satisfy Ln = Ln−1 + Ln−2 = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L0 = 2 with F0 = 0.
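The closed form λ1^n + λ2^n can be checked directly against the recurrence:

```python
import numpy as np

lam1 = (1 + np.sqrt(5)) / 2        # eigenvalues of the Fibonacci matrix
lam2 = (1 - np.sqrt(5)) / 2
L = [2, 1]                         # L0 = 2, L1 = 1
for n in range(2, 12):
    L.append(L[-1] + L[-2])        # L_n = L_{n-1} + L_{n-2}
for n, Ln in enumerate(L):
    assert abs(Ln - (lam1**n + lam2**n)) < 1e-9
```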

Nullspace matrix N.
The columns of N are the n − r special solutions to As = 0.

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
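For n = 3, all six permutation matrices can be enumerated and both claims checked on an arbitrary test matrix:

```python
import numpy as np
from itertools import permutations

n = 3
I = np.eye(n)
A = np.arange(9.0).reshape(3, 3)       # arbitrary test matrix
for order in permutations(range(n)):   # all n! = 6 orders of 0, 1, 2
    P = I[list(order)]                 # rows of I in that order
    assert np.allclose(P @ A, A[list(order)])   # PA reorders the rows of A
    assert round(np.linalg.det(P)) in (1, -1)   # P is even or odd
```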

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0 1] for rand and standard normal distribution for randn.

Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
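Any product R^T R is positive semidefinite, even when R has fewer rows than columns; a randomized sketch (seeded for reproducibility):

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.standard_normal((2, 3))        # 2x3, so A = R^T R is singular
A = R.T @ R                            # positive semidefinite by construction
eigs = np.linalg.eigvalsh(A)
assert all(e >= -1e-9 for e in eigs)   # all eigenvalues >= 0 (up to roundoff)
x = rng.standard_normal(3)
assert x @ A @ x >= -1e-9              # x^T A x = ||Rx||^2 >= 0
```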

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Transpose matrix A^T.
Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^(-1) are B^T A^T and (A^T)^(-1).
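The two transpose identities are easy to confirm numerically on random matrices (seeded, so the example is reproducible):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))        # random, invertible with probability 1
B = rng.standard_normal((3, 3))

assert np.allclose((A @ B).T, B.T @ A.T)        # (AB)^T = B^T A^T
Ainv = np.linalg.inv(A)
assert np.allclose(Ainv.T, np.linalg.inv(A.T))  # (A^-1)^T = (A^T)^-1
```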

Vector v in Rn.
Sequence of n real numbers v = (v1, ..., vn) = point in R^n.

Wavelets wjk(t).
Stretch and shift the time axis to create wjk(t) = w00(2^j t − k).