- 5.3.1: Find the least squares solution of each of the following systems: (...
- 5.3.2: For each of your solutions x in Exercise 1, (a) determine the proje...
- 5.3.3: For each of the following systems Ax = b, find all least squares so...
- 5.3.4: For each of the systems in Exercise 3, determine the projection p o...
- 5.3.5: (a) Find the best least squares fit by a linear function to the dat...
- 5.3.6: Find the best least squares fit to the data in Exercise 5 by a quad...
- 5.3.7: Given a collection of points (x1, y1), (x2, y2), ..., (xn, yn), l...
- 5.3.8: The point (x, y) is the center of mass for the collection of points...
- 5.3.9: Let A be an m × n matrix of rank n and let P = A(AᵀA)⁻¹Aᵀ. (a) Show ...
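A minimal numerical sketch of the projection matrix in Exercises 9 and 11, assuming numpy; the matrix A and vector b below are assumed examples, not the book's data:

```python
import numpy as np

# P = A (A^T A)^{-1} A^T projects onto the column space of A
# when A has full column rank n.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])          # 3x2, rank 2 (assumed example)
P = A @ np.linalg.inv(A.T @ A) @ A.T

# P is symmetric and idempotent: P^T = P and P^2 = P.
assert np.allclose(P, P.T)
assert np.allclose(P @ P, P)

# The residual b - P b is orthogonal to every column of A.
b = np.array([1.0, 2.0, 4.0])
assert np.allclose(A.T @ (b - P @ b), 0)
```

These two checks (symmetry and idempotence) are exactly the properties the exercises ask you to verify algebraically.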
- 5.3.10: Let A be an 8 × 5 matrix of rank 3, and let b be a nonzero vector in ...
- 5.3.11: Let P = A(AᵀA)⁻¹Aᵀ, where A is an m × n matrix of rank n. (a) Show th...
- 5.3.12: Show that if [A I; O Aᵀ][x; r] = [b; 0], then x is a least squares solution ...
- 5.3.13: Let A ∈ ℝᵐˣⁿ and let x be a solution of the least squares problem Ax =...
- 5.3.14: Find the equation of the circle that gives the best least squares c...
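The line-fitting setup behind Exercises 5-7 can be sketched numerically, assuming numpy; the data points below are assumed examples, not the book's:

```python
import numpy as np

# Fit y = c0 + c1*x to data points by solving the normal
# equations A^T A c = A^T y (the method Exercises 5-7 rely on).
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 4.0, 4.0])

A = np.column_stack([np.ones_like(x), x])   # columns: 1 and x
c = np.linalg.solve(A.T @ A, A.T @ y)       # least squares coefficients

# The library solver gives the same answer.
c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
assert np.allclose(c, c_lstsq)              # c = [1.5, 1.0] here
```

For a quadratic fit (Exercise 6) the only change is adding an x² column to A.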
Solutions for Chapter 5.3: Least Squares Problems
Full solutions for Linear Algebra with Applications | 8th Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
Basis for V.
Independent vectors v1, ..., vd whose linear combinations give every vector in V as v = c1v1 + ... + cdvd. V has many bases; each basis gives unique c's.
Cholesky factorization.
A = CᵀC = (L√D)(L√D)ᵀ for positive definite A.
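A minimal sketch of this factorization, assuming numpy; the matrix is an assumed example. numpy returns a lower-triangular L with A = LLᵀ, which is the glossary's L√D form with the diagonal absorbed into L:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])          # positive definite (assumed example)
L = np.linalg.cholesky(A)

assert np.allclose(L @ L.T, A)      # A = L L^T
assert np.allclose(L, np.tril(L))   # L is lower triangular
```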
Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xi are independent.
Distributive law.
A(B + C) = AB + AC. Add then multiply, or multiply then add.
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
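A minimal sketch of building this matrix, assuming numpy; the graph below is an assumed example:

```python
import numpy as np

# Edge-node incidence matrix of a small directed graph:
# edge (i -> j) gets -1 in column i and +1 in column j.
edges = [(0, 1), (1, 2), (0, 2)]    # three edges on three nodes
n_nodes = 3
A = np.zeros((len(edges), n_nodes))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1                  # edge leaves node i
    A[row, j] = 1                   # edge enters node j

# Every row sums to zero, so the all-ones vector is in the null space.
assert np.allclose(A @ np.ones(n_nodes), 0)
```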
A symmetric matrix with eigenvalues of both signs (+ and - ).
Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., Aʲ⁻¹b. Numerical methods approximate A⁻¹b by xj with residual b − Axj in this subspace. A good basis for Kj requires only multiplication by A at each step.
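A minimal sketch of building a Krylov basis, assuming numpy; the matrix and vector are assumed examples:

```python
import numpy as np

# K_j(A, b) is spanned by b, Ab, ..., A^{j-1} b; each new basis
# vector costs only one multiplication by A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # assumed example matrix
b = np.array([1.0, 0.0])

j = 2
K = [b]
for _ in range(j - 1):
    K.append(A @ K[-1])             # next vector: one multiply by A
K = np.column_stack(K)              # columns b, Ab

# Here K_2(A, b) is already all of R^2, so it contains A^{-1} b exactly.
assert np.linalg.matrix_rank(K) == 2
```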
Left inverse A+.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
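A minimal check of this definition, assuming numpy; the matrix is an assumed example:

```python
import numpy as np

# For full column rank A, the left inverse A+ = (A^T A)^{-1} A^T
# satisfies A+ A = I_n, but A A+ is only a projection, not I_m.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # 3x2, rank 2 (assumed example)
A_plus = np.linalg.inv(A.T @ A) @ A.T

assert np.allclose(A_plus @ A, np.eye(2))        # left inverse works
assert not np.allclose(A @ A_plus, np.eye(3))    # not a right inverse
```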
Length II x II.
Square root of x T x (Pythagoras in n dimensions).
Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
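A minimal sketch checking two of these equivalent definitions against each other, assuming numpy; the matrices are assumed examples:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

AB = A @ B

# By columns: column j of AB = A times column j of B.
by_columns = np.column_stack([A @ B[:, j] for j in range(2)])
# Columns times rows: AB = sum over k of (column k of A)(row k of B).
by_outer = sum(np.outer(A[:, k], B[k, :]) for k in range(2))

assert np.allclose(AB, by_columns)
assert np.allclose(AB, by_outer)
```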
Multiplication Ax.
Ax = x1(column 1) + ... + xn(column n) = combination of columns.
Normal equation AT Ax = ATb.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax) = 0.
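A minimal sketch of the normal equations and the orthogonality statement, assuming numpy; A and b are assumed examples:

```python
import numpy as np

# Solve A^T A x = A^T b, then check that the residual b - A x
# is orthogonal to every column of A.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])          # full column rank (assumed example)
b = np.array([1.0, 2.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# (columns of A) . (b - A x_hat) = 0
assert np.allclose(A.T @ (b - A @ x_hat), 0)
```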
Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Right inverse A+.
If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = Iₘ.
Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
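A minimal sketch, assuming numpy: f(x, y) = x² − y² has a saddle at the origin, where the first derivatives vanish and the Hessian is indefinite.

```python
import numpy as np

H = np.array([[2.0, 0.0],           # d^2f/dx^2 = 2
              [0.0, -2.0]])         # d^2f/dy^2 = -2
eigenvalues = np.linalg.eigvalsh(H)

# One positive and one negative eigenvalue: indefinite, so a saddle.
assert eigenvalues.min() < 0 < eigenvalues.max()
```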
Similar matrices A and B.
Every B = M⁻¹AM has the same eigenvalues as A.
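A minimal numerical check of this fact, assuming numpy; A and M are assumed examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])          # eigenvalues 2 and 3 (triangular)
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # any invertible M works
B = np.linalg.inv(M) @ A @ M

# Similar matrices share eigenvalues.
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(B)))
```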
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA where C has spring constants from Hooke's Law and Ax = stretching.
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
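A minimal sketch, assuming numpy; the matrix is an assumed example:

```python
import numpy as np

# The rows of A span a parallelogram (a sheared box) of area |det A|.
A = np.array([[3.0, 0.0],
              [1.0, 2.0]])          # base 3, height 2
volume = abs(np.linalg.det(A))
assert np.isclose(volume, 6.0)
```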