- 8.C.1: Suppose T ∈ L(C^4) is such that the eigenvalues of T are 3, 5, 8. Pr...
- 8.C.2: Suppose V is a complex vector space. Suppose T ∈ L(V) is such that...
- 8.C.3: Give an example of an operator on C^4 whose characteristic polynomia...
- 8.C.4: Give an example of an operator on C^4 whose characteristic polynomia...
- 8.C.5: Give an example of an operator on C^4 whose characteristic and minim...
- 8.C.6: Give an example of an operator on C^4 whose characteristic polynomia...
- 8.C.7: Suppose V is a complex vector space. Suppose T ∈ L(V) is such that...
- 8.C.8: Suppose T ∈ L(V). Prove that T is invertible if and only if the co...
- 8.C.9: Suppose T ∈ L(V) has minimal polynomial 4 + 5z − 6z^2 − 7z^3 + 2z^4 + z^5. Find the...
- 8.C.10: Suppose V is a complex vector space and T ∈ L(V) is invertible. Let...
- 8.C.11: Suppose T ∈ L(V) is invertible. Prove that there exists a polynomi...
- 8.C.12: Suppose V is a complex vector space and T ∈ L(V). Prove that V has ...
- 8.C.13: Suppose V is an inner product space and T ∈ L(V) is normal. Prove t...
- 8.C.14: Suppose V is a complex inner product space and S ∈ L(V) is an isome...
- 8.C.15: Suppose T ∈ L(V) and v ∈ V. (a) Prove that there exists a unique mo...
- 8.C.16: Suppose V is an inner product space and T ∈ L(V). Suppose a0 + a1z ...
- 8.C.17: Suppose F = C and T ∈ L(V). Suppose the minimal polynomial of T has...
- 8.C.18: Suppose a0, ..., a_{n-1} ∈ C. Find the minimal and characteristic polynom...
- 8.C.19: Suppose V is a complex vector space and T ∈ L(V). Suppose that with...
- 8.C.20: Suppose V is a complex vector space and V1, ..., Vm are nonzero subsp...
Solutions for Chapter 8.C: Characteristic and Minimal Polynomials
Full solutions for Linear Algebra Done Right (Undergraduate Texts in Mathematics) | 3rd Edition
Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors in F.
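A quick NumPy sketch (illustrative values, not from the text): build a circulant from powers of the cyclic shift S and check that Cx equals the circular convolution c * x.

```python
import numpy as np

# c0 I + c1 S + ... + c_{n-1} S^{n-1}, with S the cyclic shift (rows of I rolled down)
c = np.array([2.0, 5.0, 1.0, 3.0])
n = len(c)
S = np.roll(np.eye(n), 1, axis=0)           # S @ x rotates x downward by one slot
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([1.0, 0.0, 2.0, -1.0])
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))   # circular convolution
assert np.allclose(C @ x, conv)             # Cx = c * x
assert np.allclose(C[:, 0], c)              # first column of a circulant is c itself
```

The FFT route to the convolution also reflects why circulants diagonalize in the Fourier basis.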
Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn^{-1} c can be computed with nℓ/2 multiplications. Revolutionary.
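A small check (NumPy, hypothetical size n = 8): np.fft.fft returns the same vector as multiplying by the full Fourier matrix, but without ever forming the n-by-n matrix.

```python
import numpy as np

# Fourier matrix with NumPy's forward-FFT convention: F[j, k] = w^(j*k), w = exp(-2*pi*i/n)
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n)         # n^2 entries, n^2 multiplications to apply

x = np.arange(n, dtype=float)
assert np.allclose(F @ x, np.fft.fft(x))    # FFT gives F @ x in O(n log n) work
```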
Free variable xi.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
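A sketch with a made-up 2-by-3 system: column 2 has no pivot, so x2 is free; any value for x2 still leaves the pivot variables determined by back-substitution.

```python
import numpy as np

# A already in echelon form; rank r = 2, so n - r = 1 free variable (x2)
A = np.array([[1.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])
b = np.array([4.0, 6.0])

for t in (0.0, 1.0, -2.5):          # give the free variable x2 any value t
    x3 = b[1] / 3.0                 # pivot row 2: 3*x3 = 6
    x1 = b[0] - 2.0*t - x3          # pivot row 1: x1 + 2*x2 + x3 = 4
    assert np.allclose(A @ np.array([x1, t, x3]), b)
```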
Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0, with dimensions r and n − r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
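A numerical sketch (random integer matrix, nullspace basis taken from the SVD): the nullspace dimension is n − r, and every nullspace vector is perpendicular to every row of A.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-3, 4, size=(3, 5)).astype(float)
r = np.linalg.matrix_rank(A)

_, s, Vt = np.linalg.svd(A)
N = Vt[r:].T                         # columns form a basis of N(A)
assert N.shape[1] == A.shape[1] - r  # dim N(A) = n - r
assert np.allclose(A @ N, 0)         # rows of A are orthogonal to the nullspace
```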
Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
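A minimal check (made-up columns): independence means rank(A) equals the number of columns, and Ax = 0 forces x = 0.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0]])          # two independent columns in R^3
assert np.linalg.matrix_rank(A) == A.shape[1]

# the only least-squares solution of Ax = 0 is x = 0
x, *_ = np.linalg.lstsq(A, np.zeros(3), rcond=None)
assert np.allclose(x, 0)
```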
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
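A numerical sketch (made-up data): solve the normal equations and confirm the error is orthogonal to every column of A.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations A^T A x = A^T b
e = b - A @ x_hat
assert np.allclose(A.T @ e, 0)               # error orthogonal to all columns
assert np.allclose(x_hat, [5.0, -3.0])       # best fit for this particular data
```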
Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
Markov matrix M.
All mij >= 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady state eigenvector s with Ms = s > 0.
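A quick power-iteration sketch (made-up 2-by-2 Markov matrix): repeated multiplication drives any probability vector to the steady state s with Ms = s.

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])
assert np.allclose(M.sum(axis=0), 1.0)   # columns sum to 1

x = np.array([1.0, 0.0])                 # any starting probability vector
for _ in range(100):
    x = M @ x                            # columns of M^k carry x toward s
assert np.allclose(M @ x, x)             # Ms = s
assert np.allclose(x, [0.6, 0.4])        # steady state for this M
```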
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
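A sketch with a made-up diagonal matrix: A = diag(2, 2, 3) has characteristic polynomial of degree 3, but the degree-2 polynomial (z − 2)(z − 3) already annihilates A, because the repeated eigenvalue 2 needs only one factor.

```python
import numpy as np

A = np.diag([2.0, 2.0, 3.0])
I = np.eye(3)

assert np.allclose((A - 2*I) @ (A - 3*I), 0)   # m(A) = 0 with m of degree 2
# no degree-1 factor alone kills A, so degree 2 is minimal here
assert not np.allclose(A - 2*I, 0)
assert not np.allclose(A - 3*I, 0)
```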
Nullspace N(A).
All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^{-1}. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
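A sketch using a plane rotation (angle chosen arbitrarily): it satisfies Q^T Q = I, preserves lengths and inner products, and its eigenvalues lie on the unit circle.

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))                         # Q^T = Q^{-1}
x, y = np.array([3.0, 4.0]), np.array([1.0, -2.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))    # ||Qx|| = ||x||
assert np.isclose((Q @ x) @ (Q @ y), x @ y)                    # angles preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)          # all |lambda| = 1
```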
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
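A sketch (same kind of made-up system as above): the complete solution is one particular solution, free variables set to 0, plus anything from the nullspace.

```python
import numpy as np

A = np.array([[1.0, 2.0, 1.0],
              [0.0, 0.0, 3.0]])
b = np.array([4.0, 6.0])

x_p = np.array([2.0, 0.0, 2.0])      # particular solution with free variable x2 = 0
x_n = np.array([-2.0, 1.0, 0.0])     # special solution in N(A)
assert np.allclose(A @ x_p, b)
assert np.allclose(A @ x_n, 0)
assert np.allclose(A @ (x_p + 7*x_n), b)   # shifting by the nullspace still solves Ax = b
```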
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
Spanning set v1, ..., vm.
Combinations of v1, ..., vm fill the space. The columns of A span C(A)!
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
Transpose matrix A^T.
Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^{-1} are B^T A^T and (A^T)^{-1}.
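A quick check of these rules with random matrices: (A^T)ij = Aji, A^T A is symmetric positive semidefinite, and transposes reverse the order of products and inverses.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 2))
B = rng.standard_normal((2, 3))

assert A.T[0, 2] == A[2, 0]                       # (A^T)_ij = A_ji
G = A.T @ A
assert np.allclose(G, G.T)                        # symmetric
assert np.all(np.linalg.eigvalsh(G) >= -1e-12)    # positive semidefinite
assert np.allclose((A @ B).T, B.T @ A.T)          # (AB)^T = B^T A^T

M = rng.standard_normal((3, 3)) + 3*np.eye(3)     # comfortably invertible
assert np.allclose(np.linalg.inv(M).T, np.linalg.inv(M.T))   # (A^{-1})^T = (A^T)^{-1}
```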
Vector addition v + w.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.
Wavelets wjk(t).
Stretch and shift the time axis to create wjk(t) = w00(2^j t − k).