- 7.3.1: Label the following statements as true or false. Assume that all ve...
- 7.3.2: Label the following statements as true or false. Assume that all ve...
- 7.3.3: For each linear operator T on V, find the minimal polynomial of T. ...
- 7.3.4: Determine which of the matrices and operators in Exercises 2 and 3 ...
- 7.3.5: Describe all linear operators T on R2 such that T is diagonalizable...
- 7.3.6: Prove Theorem 7.13 and its corollary.
- 7.3.7: Prove the corollary to Theorem 7.14.
- 7.3.8: Let T be a linear operator on a finite-dimensional vector space, an...
- 7.3.9: Let T be a diagonalizable linear operator on a finite-dimensional v...
- 7.3.10: Let T be a linear operator on a finite-dimensional vector space V, ...
- 7.3.11: Let g(t) be the auxiliary polynomial associated with a homogeneous ...
- 7.3.12: Let D be the differentiation operator on P(R), the space of polyno...
- 7.3.13: Let T be a linear operator on a finite-dimensional vector space, an...
- 7.3.14: Let T be a linear operator on a finite-dimensional vector space V, an...
- 7.3.15: Let T be a linear operator on a finite-dimensional vector space V, ...
- 7.3.16: Let T be a linear operator on a finite-dimensional vector space V, and ...
Solutions for Chapter 7.3: The Minimal Polynomial
Full solutions for Linear Algebra | 4th Edition
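The minimal polynomial at the center of these exercises is easy to check numerically. A minimal sketch (the matrix is an assumed example, not taken from the exercises): a diagonalizable operator with eigenvalues 2, 2, 3 has characteristic polynomial (t-2)^2(t-3) but minimal polynomial (t-2)(t-3) = t^2 - 5t + 6, since each distinct eigenvalue appears exactly once.

```python
import numpy as np

# Example matrix (assumed, not from the exercises): diagonalizable with
# eigenvalues 2, 2, 3.  The characteristic polynomial is (t-2)^2 (t-3),
# but the minimal polynomial needs each distinct eigenvalue only once:
# p(t) = (t-2)(t-3) = t^2 - 5t + 6.
A = np.diag([2.0, 2.0, 3.0])

# p(A) = A^2 - 5A + 6I must be the zero matrix.
p_of_A = A @ A - 5 * A + 6 * np.eye(3)
print(np.allclose(p_of_A, 0))             # True

# No proper divisor of p works: t - 2 alone gives A - 2I != 0.
print(np.allclose(A - 2 * np.eye(3), 0))  # False
```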
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Change of basis matrix M.
The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
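The relation d = Mc can be verified directly. A small sketch (the bases and coordinates below are assumed example values):

```python
import numpy as np

# Assumed 2x2 example: new basis w1, w2 in R^2, old basis
# v1 = m11*w1 + m21*w2 and v2 = m12*w1 + m22*w2.
M = np.array([[1.0, 2.0],
              [0.0, 1.0]])       # change of basis matrix
W = np.array([[1.0, 0.0],
              [1.0, 1.0]])       # columns are w1, w2
V = W @ M                        # columns are v_j = sum_i m_ij * w_i

c = np.array([3.0, -1.0])        # old coordinates: x = c1*v1 + c2*v2
x = V @ c
d = M @ c                        # new coordinates, d = M c
print(np.allclose(W @ d, x))     # True: same vector in either basis
```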
Cofactor C_ij.
Remove row i and column j; multiply the determinant by (-1)^(i+j).
Companion matrix.
Put c_1, ..., c_n in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c_1 + c_2 λ + c_3 λ^2 + ... + c_n λ^(n-1) - λ^n).
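A quick numerical check of this construction (the coefficients below are assumed example values): with c = (6, -11, 6), the polynomial is ±(6 - 11t + 6t^2 - t^3) = -(t-1)(t-2)(t-3), so the eigenvalues should be 1, 2, 3.

```python
import numpy as np

# Companion matrix for c = (c1, c2, c3) = (6, -11, 6).
c = [6.0, -11.0, 6.0]
n = len(c)
A = np.zeros((n, n))
A[:-1, 1:] = np.eye(n - 1)   # n - 1 ones just above the main diagonal
A[-1, :] = c                 # c1, ..., cn in row n

roots = sorted(np.linalg.eigvals(A).real)
print(roots)                 # eigenvalues = roots of the polynomial, approx. 1, 2, 3
```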
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
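The minimization above can be sketched as the textbook conjugate gradient iteration (the matrix and right side are assumed example data):

```python
import numpy as np

# Minimal conjugate gradient sketch for positive definite A x = b.
def conjugate_gradient(A, b, tol=1e-10):
    x = np.zeros_like(b)
    r = b - A @ x          # residual = negative gradient of (1/2) x^T A x - x^T b
    p = r.copy()           # first search direction
    while np.linalg.norm(r) > tol:
        alpha = (r @ r) / (p @ A @ p)
        x = x + alpha * p
        r_new = r - alpha * (A @ p)
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p   # next direction, A-conjugate to the previous ones
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite (assumed example)
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))             # True
```

In exact arithmetic CG finishes in at most n steps; here n = 2.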
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.
Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
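The "only solution is x = 0" test is a rank check in practice. A sketch with assumed example vectors:

```python
import numpy as np

# v1, v2, v3 are assumed examples; independence <=> rank equals the
# number of columns (Ax = 0 forces x = 0).
v1, v2, v3 = np.array([1.0, 0, 0]), np.array([1.0, 1, 0]), np.array([2.0, 1, 0])
A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))   # 2: v3 = v1 + v2, so the set is dependent

B = np.column_stack([v1, v2, np.array([0.0, 0, 1])])
print(np.linalg.matrix_rank(B))   # 3: independent
```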
Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^(-1) A^T has A^+ A = I_n.
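A small check of the formula (the tall matrix is an assumed example):

```python
import numpy as np

# Assumed example: A is 3x2 with full column rank, so
# A+ = (A^T A)^{-1} A^T is a left inverse (A+ A = I_2), but A A+ != I_3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
A_plus = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A_plus @ A, np.eye(2)))  # True: left inverse
print(np.allclose(A @ A_plus, np.eye(3)))  # False: only a projection onto C(A)
```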
Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.
Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector s, with Ms = s > 0.
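The convergence of the columns of M^k can be watched directly. A sketch (the entries of M are assumed example values):

```python
import numpy as np

# Assumed example Markov matrix: positive entries, columns sum to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])
c = np.array([1.0, 0.0])     # any starting distribution
for _ in range(100):
    c = M @ c                # repeated multiplication drives c toward s
print(np.round(c, 3))        # steady state s = (0.6, 0.4)
print(np.allclose(M @ c, c)) # True: M s = s, eigenvalue 1
```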
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution x̂ to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - A x̂) = 0.
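The orthogonality statement is checkable in a few lines (the data below are assumed example values):

```python
import numpy as np

# Assumed example: fit a line through three points via the normal equations.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])
xhat = np.linalg.solve(A.T @ A, A.T @ b)

# Residual b - A xhat is orthogonal to every column of A ...
print(np.allclose(A.T @ (b - A @ xhat), 0))                     # True
# ... and xhat agrees with the library least-squares solver.
print(np.allclose(xhat, np.linalg.lstsq(A, b, rcond=None)[0]))  # True
```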
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Outer product uv^T.
Column times row = rank-one matrix.
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
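One standard way to compute Q and H is through the SVD: if A = U S V^T, then Q = U V^T and H = V S V^T. A sketch with an assumed example matrix:

```python
import numpy as np

# Polar decomposition via the SVD (example matrix assumed):
# A = U S V^T  gives  Q = U V^T (orthogonal), H = V S V^T (positive semidefinite).
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt
print(np.allclose(Q @ H, A))               # True: A = Q H
print(np.allclose(Q.T @ Q, np.eye(2)))     # True: Q is orthogonal
print(np.all(np.linalg.eigvalsh(H) > 0))   # True here: A is invertible
```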
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
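A small symbolic check (the matrix is an assumed example; this uses SymPy's `Matrix.rref`, which returns the reduced matrix together with the pivot columns):

```python
import sympy as sp

# Assumed example: rows 2 and 3 are combinations of row 1 plus (0, 0, 1).
A = sp.Matrix([[1, 2, 3],
               [2, 4, 7],
               [1, 2, 4]])
R, pivot_cols = A.rref()
print(R)           # pivots = 1, zeros above and below; last row is zero
print(pivot_cols)  # (0, 2): pivot columns of A
```

The two nonzero rows of R, (1, 2, 0) and (0, 0, 1), are a basis for the row space of A.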
Row space C (AT) = all combinations of rows of A.
Column vectors by convention.
Singular Value Decomposition (SVD).
A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns of U and V are orthonormal bases of the nullspaces of A^T and A.
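Both identities, A = U Σ V^T and A v_i = σ_i u_i, can be checked numerically (the matrix is an assumed example):

```python
import numpy as np

# SVD sketch: numpy returns U, the singular values s, and V^T.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)
print(np.allclose(U @ np.diag(s) @ Vt, A))     # True: A = U Σ V^T
print(np.allclose(A @ Vt[0], s[0] * U[:, 0]))  # True: A v_1 = σ_1 u_1
```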
Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
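A short check with an assumed example matrix, using numpy's symmetric eigensolver:

```python
import numpy as np

# Spectral theorem sketch: real symmetric A = Q Λ Q^T with orthonormal Q.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)                    # eigh is for symmetric matrices
print(lam)                                    # real eigenvalues: 1 and 3
print(np.allclose(Q @ np.diag(lam) @ Q.T, A)) # True: A = Q Λ Q^T
print(np.allclose(Q.T @ Q, np.eye(2)))        # True: the q's are orthonormal
```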
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.