Solutions for Chapter 6.7: Singular Value Decomposition (SVD)
Full solutions for Introduction to Linear Algebra | 4th Edition

- 6.7.1–6.7.3: Compute the SVD of a square singular matrix A (a numerical sketch follows this list).
- 6.7.4–6.7.7: Ask for the SVD of matrices of rank 2.
- 6.7.8: A square invertible matrix has A⁻¹ = VΣ⁻¹Uᵀ. This says that the s...
- 6.7.9: Suppose u₁, ..., uₙ and v₁, ..., vₙ are orthonormal bases for Rⁿ. ...
- 6.7.10: Construct the matrix with rank one that has Av = 12u for v = ½(1, 1, ...
- 6.7.11: Suppose A has orthogonal columns w₁, w₂, ..., wₙ of lengths σ₁, ...
- 6.7.12: Suppose A is a 2 by 2 symmetric matrix with unit eigenvectors u₁ and ...
- 6.7.13: If A = QR with an orthogonal matrix Q, the SVD of A is almost the s...
- 6.7.14: Suppose A is invertible (with σ₁ > σ₂ > 0). Change A by as small ...
- 6.7.15: Why doesn't the SVD for A + I just use Σ + I?
- 6.7.16: Run a random walk x(2), ..., x(n) starting from web site x(1) =...
- 6.7.17: The 1, −1 first difference matrix A has AᵀA = second difference ma...
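As a quick numerical companion to problems 1–3, here is a minimal sketch using numpy's SVD on a hypothetical singular matrix (the specific A below is an illustration, not a matrix from the book):

```python
import numpy as np

# Hypothetical singular 2x2 matrix: the second row is twice the first (rank 1).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

U, s, Vt = np.linalg.svd(A)                  # A = U @ diag(s) @ Vt
print(s)                                     # one singular value is (numerically) zero
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True: the factors reproduce A
```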
Key terms and definitions from the textbook:
Circulant matrix C.
Constant diagonals wrap around as in cyclic shift S. Every circulant is c₀I + c₁S + ⋯ + cₙ₋₁Sⁿ⁻¹. Cx = convolution c * x. Eigenvectors in F.
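A minimal sketch of these facts, assuming scipy is available; the vectors c, x and the index k are hypothetical choices:

```python
import numpy as np
from scipy.linalg import circulant

c = np.array([2.0, 1.0, 0.0, 3.0])     # first column of C
x = np.array([1.0, 4.0, 2.0, 5.0])
C = circulant(c)                        # constant diagonals that wrap around

# Cx equals the circular convolution c * x (computed here with the FFT)
print(np.allclose(C @ x, np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))))

# Each column of the Fourier matrix F is an eigenvector of C
k = 1
f = np.exp(2j * np.pi * k * np.arange(4) / 4)    # Fourier vector
print(np.allclose(C @ f, np.fft.fft(c)[k] * f))  # eigenvalue = k-th DFT value of c
```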
Companion matrix.
Put c₁, ..., cₙ in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c₁ + c₂λ + c₃λ² + ⋯ + cₙλⁿ⁻¹ − λⁿ).
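A sketch with hypothetical coefficients c₁, ..., c₄, checking that the eigenvalues of the companion matrix are the roots of that polynomial:

```python
import numpy as np

c = [2.0, -1.0, 0.0, 3.0]            # hypothetical c1..c4 for row n
A = np.diag(np.ones(3), k=1)         # n-1 ones just above the main diagonal
A[-1, :] = c                         # coefficients go in row n

# det(A - λI) = ±(c1 + c2 λ + c3 λ² + c4 λ³ - λ⁴), so eigenvalues = roots
print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.roots([-1.0, 3.0, 0.0, -1.0, 2.0])))  # same values
```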
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
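A numerical sketch with a hypothetical 2 by 2 matrix (eigenvalues 5 and 2, so S is invertible):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, S = np.linalg.eig(A)            # eigenvalues and eigenvector matrix S
Lam = np.diag(lam)

print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))   # Λ = S⁻¹ A S
print(np.allclose(np.linalg.matrix_power(A, 5),
                  S @ np.linalg.matrix_power(Lam, 5) @ np.linalg.inv(S)))  # A⁵ = S Λ⁵ S⁻¹
```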
Distributive law A(B + C) = AB + AC.
Add then multiply, or multiply then add.
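A one-line check on hypothetical random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
print(np.allclose(A @ (B + C), A @ B + A @ C))   # add then multiply = multiply then add
```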
Ellipse (or ellipsoid) xᵀAx = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)
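A sketch of the axis statement for a hypothetical positive definite A (eigenvalues 1 and 9): each eigenvector scaled to length 1/√λ lands exactly on the ellipse.

```python
import numpy as np

A = np.array([[5.0, 4.0],
              [4.0, 5.0]])              # hypothetical positive definite matrix
lam, Q = np.linalg.eigh(A)              # eigenvalues [1, 9], orthonormal eigenvectors

for l, q in zip(lam, Q.T):
    axis = q / np.sqrt(l)               # semi-axis of the ellipse
    print(axis @ A @ axis)              # 1.0 — the endpoint satisfies xᵀAx = 1
```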
Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓᵢⱼ (and ℓᵢᵢ = 1) brings U back to A.
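A minimal sketch assuming scipy; the matrix is hypothetical and chosen so partial pivoting makes no row exchanges (P = I):

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[4.0, 2.0],
              [1.0, 3.0]])
P, L, U = lu(A)                  # scipy factors A = P L U
print(P)                         # identity here: no row exchanges
print(L)                         # unit lower triangular, multiplier l21 = 0.25
print(np.allclose(A, L @ U))     # True: L brings U back to A
```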
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
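A sketch assuming scipy, which reduces any square matrix to Hessenberg form by an orthogonal similarity:

```python
import numpy as np
from scipy.linalg import hessenberg

A = np.random.default_rng(1).standard_normal((5, 5))   # hypothetical matrix
H, Q = hessenberg(A, calc_q=True)                      # A = Q H Qᵀ

print(np.allclose(np.tril(H, k=-2), 0))    # zero below the one extra subdiagonal
print(np.allclose(A, Q @ H @ Q.T))         # True
```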
Hilbert matrix hilb(n).
Entries Hᵢⱼ = 1/(i + j − 1) = ∫₀¹ xⁱ⁻¹xʲ⁻¹ dx. Positive definite but extremely small λₘᵢₙ and large condition number: H is ill-conditioned.
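A quick illustration with scipy's built-in Hilbert matrix (n = 8 is a hypothetical size):

```python
import numpy as np
from scipy.linalg import hilbert

H = hilbert(8)
lam = np.linalg.eigvalsh(H)
print(lam.min() > 0)           # True: positive definite
print(lam.min())               # extremely small λₘᵢₙ (around 1e-10)
print(np.linalg.cond(H))       # enormous condition number: ill-conditioned
```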
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
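A sketch building the incidence matrix of a hypothetical 4-node directed graph:

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (2, 3)]   # hypothetical directed edges (i -> j)
A = np.zeros((len(edges), 4))              # m by n: one row per edge
for row, (i, j) in enumerate(edges):
    A[row, i] = -1                         # edge leaves node i
    A[row, j] = +1                         # edge enters node j
print(A)
```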
Determinant |A|.
|A⁻¹| = 1/|A| and |Aᵀ| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
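A numerical check of the two identities on a hypothetical random matrix:

```python
import numpy as np

A = np.random.default_rng(2).standard_normal((4, 4))
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))  # |A⁻¹| = 1/|A|
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                   # |Aᵀ| = |A|
```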
Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ aᵢₖbₖⱼ. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
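A sketch of the columns-times-rows view on hypothetical matrices; each summand np.outer(...) is exactly the rank-one outer product (column k)(row k) defined next:

```python
import numpy as np

rng = np.random.default_rng(3)
A, B = rng.standard_normal((3, 4)), rng.standard_normal((4, 2))

# AB = sum over k of (column k of A)(row k of B)
print(np.allclose(A @ B, sum(np.outer(A[:, k], B[k, :]) for k in range(4))))

# Column j of AB = A times column j of B
print(np.allclose((A @ B)[:, 1], A @ B[:, 1]))
```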
Outer product uvᵀ.
Column times row = rank one matrix.
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S, error e = b − Pb is perpendicular to S. P² = P = Pᵀ, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If columns of A = basis for S then P = A(AᵀA)⁻¹Aᵀ.
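A sketch with a hypothetical 3 by 2 matrix A whose columns span S:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                       # columns = basis for S
P = A @ np.linalg.inv(A.T @ A) @ A.T             # P = A (AᵀA)⁻¹ Aᵀ

print(np.allclose(P @ P, P), np.allclose(P, P.T))   # P² = P = Pᵀ
b = np.array([1.0, 3.0, 2.0])
p = P @ b                                        # closest point to b in S
print(np.allclose(A.T @ (b - p), 0))             # e = b - p is perpendicular to S
```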
Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λₘᵢₙ ≤ q(x) ≤ λₘₐₓ.
Those extremes are reached at the eigenvectors x for λₘᵢₙ(A) and λₘₐₓ(A).
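A check on a hypothetical symmetric matrix and a random x:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])                  # hypothetical symmetric matrix
lam = np.linalg.eigvalsh(A)                 # ascending eigenvalues

x = np.random.default_rng(4).standard_normal(2)
q = (x @ A @ x) / (x @ x)                   # Rayleigh quotient q(x)
print(lam[0] <= q <= lam[-1])               # True: between λₘᵢₙ and λₘₐₓ
```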
Row space C(Aᵀ) = all combinations of rows of A.
Column vectors by convention.
Schur complement S = D − CA⁻¹B.
Appears in block elimination on [A B; C D].
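A sketch of block elimination on hypothetical random blocks, showing where S appears:

```python
import numpy as np

rng = np.random.default_rng(5)
A, B, C, D = (rng.standard_normal((2, 2)) for _ in range(4))
M = np.block([[A, B], [C, D]])

# Eliminate the C block: subtract C A⁻¹ times the first block row.
E = np.block([[np.eye(2), np.zeros((2, 2))],
              [-C @ np.linalg.inv(A), np.eye(2)]])
S = D - C @ np.linalg.inv(A) @ B
print(np.allclose((E @ M)[2:, 2:], S))   # True: the (2,2) block is the Schur complement
```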
Solvable system Ax = b.
The right side b is in the column space of A.
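One standard numerical test (a sketch on a hypothetical rank-1 matrix): Ax = b is solvable exactly when appending b to A does not raise the rank.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
for b in (np.array([1.0, 2.0]), np.array([1.0, 0.0])):
    in_column_space = (np.linalg.matrix_rank(np.column_stack([A, b]))
                       == np.linalg.matrix_rank(A))
    print(in_column_space)    # True for the first b, False for the second
```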
Special solutions to As = 0.
One free variable is sᵢ = 1, other free variables = 0.
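A sketch using sympy's exact nullspace, whose basis vectors have this special-solution form; the matrix is hypothetical, with free variables in columns 2 and 4:

```python
from sympy import Matrix

A = Matrix([[1, 2, 2, 4],
            [1, 2, 3, 6]])
for s in A.nullspace():   # one special solution per free variable
    print(s.T)            # (-2, 1, 0, 0) and (0, 0, -2, 1)
```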
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
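A sketch assuming scipy's ldl; the symmetric matrix is hypothetical and indefinite, so the sign pattern is visible:

```python
import numpy as np
from scipy.linalg import ldl

A = np.array([[2.0, 1.0, 0.0],
              [1.0, -3.0, 1.0],
              [0.0, 1.0, 1.0]])
L, D, perm = ldl(A)                    # A = L D Lᵀ (D may hold 1x1 or 2x2 blocks)
print(np.allclose(A, L @ D @ L.T))     # True

# Law of inertia: the eigenvalue signs of D match the eigenvalue signs of A (= Λ)
print(np.sign(np.sort(np.linalg.eigvalsh(D))))
print(np.sign(np.sort(np.linalg.eigvalsh(A))))   # same sign pattern
```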
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c₀ + ⋯ + cₙ₋₁xⁿ⁻¹ with p(xᵢ) = bᵢ. Vᵢⱼ = (xᵢ)ʲ⁻¹ and det V = product of (xₖ − xᵢ) for k > i.
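A sketch with hypothetical points and values, checking both the interpolation and the determinant product formula:

```python
import numpy as np
from itertools import combinations
from math import prod

x = np.array([1.0, 2.0, 4.0])               # hypothetical points x_i
b = np.array([3.0, 5.0, 21.0])              # target values p(x_i) = b_i
V = np.vander(x, increasing=True)           # V[i, j] = x_i ** j
c = np.linalg.solve(V, b)                   # coefficients of p(x) = c0 + c1 x + c2 x²
print(np.allclose(np.polyval(c[::-1], x), b))    # True: p interpolates b

det_formula = prod(x[k] - x[i] for i, k in combinations(range(3), 2))  # k > i
print(np.isclose(np.linalg.det(V), det_formula))                       # True
```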