- 6.7.1: For each of the following matrices, compute the determinants of all...
- 6.7.2: Let A be a 3×3 symmetric positive definite matrix and suppose that d...
- 6.7.3: Let A = [2 1 0 0; 1 2 1 0; 0 1 2 1; 0 0 1 2] (a) Compute the LU factoriz...
- 6.7.4: For each of the following, factor the given matrix into a product L...
- 6.7.5: Find the Cholesky decomposition LLᵀ for each of the matrices in Exe...
- 6.7.6: Let A be an n×n symmetric positive definite matrix. For each x, y ∈ R...
- 6.7.7: Let A be a nonsingular n×n matrix, and suppose that A = L1D1U1 = L2...
- 6.7.8: Let A be a symmetric positive definite matrix and let Q be an ortho...
- 6.7.9: Let B be an m×n matrix of rank n. Show that BᵀB is positive definite.
- 6.7.10: Let A be a symmetric n×n matrix. Show that eᴬ is symmetric and posi...
- 6.7.11: Show that if B is a symmetric nonsingular matrix, then B² is positi...
- 6.7.12: Let A = 1 1 2 1 2 1 and B = 1 1 0 1 (a) Show that A is positive def...
- 6.7.13: Let A be an n×n symmetric negative definite matrix. (a) What will t...
- 6.7.14: Let A be a symmetric positive definite n×n matrix. (a) If k < n, the...
Solutions for Chapter 6.7: Positive Definite Matrices
Full solutions for Linear Algebra with Applications | 8th Edition
Cholesky factorization.
A = CᵀC = (L√D)(L√D)ᵀ for positive definite A.
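As a concrete sketch of the factorization above, here is a bare-bones Cholesky routine in plain Python; the 3×3 matrix is an illustrative example, not one taken from the exercises.

```python
# Minimal Cholesky factorization A = L Lᵀ for a symmetric positive
# definite matrix, in plain Python (no libraries). The example matrix
# is made up for illustration.
import math

def cholesky(A):
    """Return lower-triangular L with A = L Lᵀ."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # pivot must be > 0
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0, 0.0],
     [2.0, 5.0, 2.0],
     [0.0, 2.0, 5.0]]
L = cholesky(A)
# Multiply back: L Lᵀ should reproduce A.
n = len(A)
LLT = [[sum(L[i][k] * L[j][k] for k in range(n)) for j in range(n)]
       for i in range(n)]
print(LLT)
```

If A is not positive definite, some pivot A[i][i] − s becomes ≤ 0 and the `sqrt` fails, which is itself a positive-definiteness test.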
Column space C(A).
The space of all combinations of the columns of A.
Covariance matrix Σ.
When random variables xᵢ have mean = average value = 0, their covariances Σᵢⱼ are the averages of xᵢxⱼ. With means x̄ᵢ, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xᵢ are independent.
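A small sketch of the covariance entry in plain Python, using made-up data for two variables; for a 2×2 matrix, positive semidefiniteness amounts to nonnegative diagonal entries and nonnegative determinant.

```python
# Sample covariance matrix Σ = mean of (x − x̄)(x − x̄)ᵀ for two
# variables. The data values below are invented for illustration.
data = [  # rows are observations of (x1, x2)
    (1.0, 2.0), (2.0, 4.1), (3.0, 5.9), (4.0, 8.2),
]
n = len(data)
means = [sum(row[i] for row in data) / n for i in range(2)]
Sigma = [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data) / n
          for j in range(2)] for i in range(2)]
# Σ is symmetric; for 2×2, PSD means Σ[0][0] ≥ 0, Σ[1][1] ≥ 0, det ≥ 0.
det = Sigma[0][0] * Sigma[1][1] - Sigma[0][1] * Sigma[1][0]
print(Sigma, det)
```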
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
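The formula Aᵏ = SΛᵏS⁻¹ can be checked on a 2×2 example whose eigenvectors are known by inspection: A = [[2,1],[1,2]] has λ = 3, 1 with eigenvectors (1,1) and (1,−1). A sketch in plain Python with hand-rolled 2×2 arithmetic:

```python
# Verify A³ = S Λ³ S⁻¹ against direct multiplication A·A·A.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2.0, 1.0], [1.0, 2.0]]
S = [[1.0, 1.0], [1.0, -1.0]]          # eigenvectors in the columns
S_inv = [[0.5, 0.5], [0.5, -0.5]]      # inverse of S, computed by hand
Lam3 = [[27.0, 0.0], [0.0, 1.0]]       # Λ³ = diag(3³, 1³)

A3_diag = matmul(matmul(S, Lam3), S_inv)
A3_direct = matmul(matmul(A, A), A)
print(A3_diag, A3_direct)
```

Both routes give [[14, 13], [13, 14]]; diagonalization turns powers of A into powers of the eigenvalues.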
Free variable Xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).
Jordan form J = M⁻¹AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J₁, ..., Jₛ). The block Jₖ is λₖIₖ + Nₖ, where Nₖ has 1's on diagonal 1. Each block has one eigenvalue λₖ and one eigenvector.
Multiplier ℓᵢⱼ.
The pivot row j is multiplied by ℓᵢⱼ and subtracted from row i to eliminate the i, j entry: ℓᵢⱼ = (entry to eliminate) / (jth pivot).
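Elimination with multipliers can be sketched in a few lines of plain Python; the 3×3 matrix is an arbitrary example, and the multipliers ℓᵢⱼ land in a unit lower-triangular L while the rows reduce to an upper-triangular U.

```python
# Elimination: ℓij = (entry to eliminate)/(jth pivot); pivot row j,
# scaled by ℓij, is subtracted from row i. Example matrix is made up.
A = [[2.0, 1.0, 1.0],
     [4.0, 3.0, 3.0],
     [8.0, 7.0, 9.0]]
n = len(A)
U = [row[:] for row in A]               # becomes upper triangular
L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
for j in range(n):                      # pivot column
    for i in range(j + 1, n):           # rows below the pivot
        lij = U[i][j] / U[j][j]         # the multiplier
        L[i][j] = lij
        for k in range(j, n):
            U[i][k] -= lij * U[j][k]
print(L, U)
```

The same multipliers give the LU factorization: multiplying L times U reproduces A.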
Norm ‖A‖.
The "ℓ² norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖‖x‖, ‖AB‖ ≤ ‖A‖‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖²_F = Σ Σ aᵢⱼ². The ℓ¹ and ℓ∞ norms are the largest column and row sums of |aᵢⱼ|.
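The ℓ¹, ℓ∞, and Frobenius norms are directly computable from the entries; a sketch for an arbitrary 2×2 example:

```python
# ℓ¹ norm = largest absolute column sum, ℓ∞ norm = largest absolute
# row sum, Frobenius norm = sqrt of the sum of squared entries.
import math

A = [[1.0, -2.0],
     [3.0,  4.0]]
l1   = max(sum(abs(A[i][j]) for i in range(2)) for j in range(2))  # columns
linf = max(sum(abs(A[i][j]) for j in range(2)) for i in range(2))  # rows
fro  = math.sqrt(sum(A[i][j] ** 2 for i in range(2) for j in range(2)))
print(l1, linf, fro)
```

The ℓ² norm σ_max has no such entrywise formula; it needs the singular values.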
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
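A bare-bones rref sketch in plain Python: make each pivot 1 and clear the entries above and below it. No pivoting strategy beyond "first nonzero entry", which is fine for the small exact examples here; the test matrix is made up.

```python
def rref(A, tol=1e-12):
    """Reduced row echelon form: pivots = 1, zeros above and below."""
    R = [row[:] for row in A]
    rows, cols = len(R), len(R[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if abs(R[i][c]) > tol), None)
        if pivot is None:
            continue                          # free column: no pivot
        R[r], R[pivot] = R[pivot], R[r]       # swap pivot row up
        R[r] = [x / R[r][c] for x in R[r]]    # scale pivot to 1
        for i in range(rows):
            if i != r and abs(R[i][c]) > tol:
                R[i] = [x - R[i][c] * y for x, y in zip(R[i], R[r])]
        r += 1
    return R

R = rref([[1.0, 2.0, 3.0],
          [2.0, 4.0, 7.0]])
print(R)
```

Here column 2 has no pivot, so x₂ is a free variable, matching the "Free variable" entry above.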
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
Singular Value Decomposition (SVD).
A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular values σᵢ > 0. The last columns are orthonormal bases of the nullspaces of Aᵀ and A.
Spanning set v₁, ..., vₘ for V.
Combinations of v₁, ..., vₘ fill the space. The columns of A span C(A)!
Standard basis for Rn.
Columns of the n by n identity matrix (written i, j, k in R³).
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA, where C has spring constants from Hooke's Law and Ax = stretching.
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
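"Signs in Λ = signs in D" can be checked on a 2×2 symmetric example: A = [[1,2],[2,1]] has eigenvalues 3 and −1 (one of each sign), so elimination should produce one positive and one negative pivot in D. A sketch in plain Python:

```python
# A = LDLᵀ for a 2×2 symmetric matrix: the pivots are the entries of D.
A = [[1.0, 2.0],
     [2.0, 1.0]]
l21 = A[1][0] / A[0][0]                 # the single multiplier ℓ₂₁
d1 = A[0][0]                            # first pivot
d2 = A[1][1] - l21 * A[0][1]            # second pivot after elimination
print(d1, d2)                            # one +, one −, matching λ = 3, −1
```

The pivot signs reveal definiteness without computing eigenvalues: all pivots positive would mean positive definite; here the mix of signs shows A is indefinite.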
Trace of A.
= sum of diagonal entries = sum of eigenvalues of A. Tr(AB) = Tr(BA).
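A quick check of Tr(AB) = Tr(BA) on arbitrary 2×2 matrices; the identity holds even though AB ≠ BA in general.

```python
# Tr(AB) = Tr(BA) for made-up 2×2 matrices A and B.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(X):
    return X[0][0] + X[1][1]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [5.0, -2.0]]
tAB = trace(matmul(A, B))
tBA = trace(matmul(B, A))
print(tAB, tBA)                 # equal, even though AB ≠ BA
```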