 5.1.1: Label the following statements as true or false. (a) Every linear o...
 5.1.2: For each of the following linear operators T on a vector space V an...
 5.1.3: For each of the following matrices A ∈ Mn×n(F), (i) Determine all ...
 5.1.4: For each linear operator T on V, find the eigenvalues of T and an o...
 5.1.5: Prove Theorem 5.4.
 5.1.6: Let T be a linear operator on a finite-dimensional vector space V, ...
 5.1.7: Let T be a linear operator on a finite-dimensional vector space V. ...
 5.1.8: (a) Prove that a linear operator T on a finite-dimensional vector s...
 5.1.9: Prove that the eigenvalues of an upper triangular matrix M are the ...
 5.1.10: Let V be a finite-dimensional vector space, and let λ be any scalar...
 5.1.11: A scalar matrix is a square matrix of the form λI for some scalar λ...
 5.1.12: (a) Prove that similar matrices have the same characteristic polyno...
 5.1.13: Let T be a linear operator on a finite-dimensional vector space V o...
 5.1.14: For any square matrix A, prove that A and Aᵗ have the same characte...
 5.1.15: (a) Let T be a linear operator on a vector space V, and let x be an...
 5.1.16: (a) Prove that similar matrices have the same trace. Hint: Use Exer...
 5.1.17: Let T be the linear operator on Mn×n(R) defined by T(A) = Aᵗ. (a)...
 5.1.18: Let A, B ∈ Mn×n(C). (a) Prove that if B is invertible, then there ...
 5.1.19: Let A and B be similar n×n matrices. Prove that there exists an ndi...
 5.1.20: Let A be an n×n matrix with characteristic polynomial f(t) = (−1)ⁿ ...
 5.1.21: Let A and f(t) be as in Exercise 20. (a) Prove that f(t) = (A11 − t)(A2...
 5.1.22: (a) Let T be a linear operator on a vector space V over the field F...
 5.1.23: Use Exercise 22 to prove that if f(t) is the characteristic polynom...
 5.1.24: Use Exercise 21(a) to prove Theorem 5.3.
 5.1.25: Prove Corollaries 1 and 2 of Theorem 5.3
 5.1.26: Determine the number of distinct characteristic polynomials of matr...
Solutions for Chapter 5.1: Eigenvalues and Eigenvectors
Full solutions for Linear Algebra, 4th Edition
ISBN: 9780130084514

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
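
A quick numerical check of the theorem, sketched in numpy for the 2×2 case, where the characteristic polynomial is p(t) = t² − trace(A)t + det(A) (the matrix here is just an illustrative example):

```python
import numpy as np

# Cayley-Hamilton check for a 2x2 matrix: substitute A into its own
# characteristic polynomial; the result should be the zero matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
assert np.allclose(p_of_A, 0)
```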

Cofactor Cij.
Remove row i and column j; multiply the determinant by (−1)^(i+j).

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
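
The defining equation can be verified numerically; this numpy sketch uses an illustrative 2×2 matrix whose eigenvalues are 2 and 5:

```python
import numpy as np

# numpy.linalg.eig returns eigenvalues and eigenvectors; each pair
# should satisfy A x = lambda x, and det(A - lambda I) should vanish.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam, X = np.linalg.eig(A)
for i in range(2):
    assert np.allclose(A @ X[:, i], lam[i] * X[:, i])
    assert abs(np.linalg.det(A - lam[i] * np.eye(2))) < 1e-9
```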

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
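
A small sketch of one elimination step on a hypothetical 2×2 matrix (numpy assumed):

```python
import numpy as np

# E21 is the identity with -l21 in entry (2,1); multiplying E21 @ A
# subtracts l21 times row 1 from row 2, zeroing the entry below the pivot.
A = np.array([[2.0, 4.0],
              [6.0, 8.0]])
l21 = A[1, 0] / A[0, 0]          # multiplier: 6/2 = 3
E21 = np.eye(2)
E21[1, 0] = -l21
B = E21 @ A
assert np.allclose(B[1], A[1] - l21 * A[0])  # row 2 was updated
assert B[1, 0] == 0                           # entry below pivot eliminated
```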

Ellipse (or ellipsoid) xᵀAx = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A⁻¹].
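
A minimal sketch of the method, assuming the pivots are nonzero so no row exchanges are needed (the matrix is an illustrative example):

```python
import numpy as np

# Gauss-Jordan: reduce the augmented matrix [A | I] to [I | A^-1]
# by scaling each pivot row and clearing entries above and below.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
n = 2
M = np.hstack([A, np.eye(n)])        # augmented matrix [A I]
for col in range(n):
    M[col] /= M[col, col]            # scale pivot row (nonzero pivot assumed)
    for row in range(n):
        if row != col:
            M[row] -= M[row, col] * M[col]   # clear the rest of the column
A_inv = M[:, n:]
assert np.allclose(A @ A_inv, np.eye(n))
```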

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Jordan form J = M⁻¹AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J₁, ..., Jₛ). The block Jₖ is λₖIₖ + Nₖ where Nₖ has 1's on diagonal 1. Each block has one eigenvalue λₖ and one eigenvector.

Multiplication Ax
= x₁(column 1) + ... + xₙ(column n) = combination of columns.
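
The column picture can be checked directly (illustrative numpy example):

```python
import numpy as np

# Ax equals the combination of the columns of A weighted by the entries of x.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([5.0, 6.0])
by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, by_columns)   # both give [17, 39]
```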

Nullspace N (A)
= All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
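
A rank-nullity check on a hypothetical matrix (numpy assumed; `matrix_rank` computes r numerically):

```python
import numpy as np

# For this 2x3 matrix, row 2 = 2 * row 1, so the rank is 1 and the
# nullspace dimension is n - r = 3 - 1 = 2.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])
r = np.linalg.matrix_rank(A)
null_dim = A.shape[1] - r
assert r == 1 and null_dim == 2
assert np.allclose(A @ np.array([-2.0, 1.0, 0.0]), 0)   # a nullspace vector
```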

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Orthonormal vectors q₁, ..., qₙ.
Dot products are qᵢᵀqⱼ = 0 if i ≠ j and qᵢᵀqᵢ = 1. The matrix Q with these orthonormal columns has QᵀQ = I. If m = n then Qᵀ = Q⁻¹ and q₁, ..., qₙ is an orthonormal basis for Rⁿ: every v = Σ(vᵀqⱼ)qⱼ.
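
These identities are easy to verify for a rotation matrix, whose columns are orthonormal (numpy sketch with an arbitrary angle):

```python
import numpy as np

# A 2x2 rotation has orthonormal columns, so Q^T Q = I, and any vector
# expands as v = sum (v^T q_j) q_j over the columns q_j.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
assert np.allclose(Q.T @ Q, np.eye(2))
v = np.array([1.0, 2.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
assert np.allclose(expansion, v)
```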

Pascal matrix
Ps = pascal(n) = the symmetric matrix with binomial entries C(i+j−2, i−1). Ps = P_L P_U; all contain Pascal's triangle with det = 1 (see Pascal in the index).
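
A sketch that builds the entries from the binomial formula directly, without assuming MATLAB's pascal(n) (numpy used only for the determinant check):

```python
from math import comb
import numpy as np

# Symmetric Pascal matrix: entry (i, j) is C(i+j-2, i-1) with 1-based
# indices; its determinant is 1.
n = 4
P = [[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
     for i in range(1, n + 1)]
assert P[0] == [1, 1, 1, 1]
assert P[3] == [1, 4, 10, 20]
assert round(np.linalg.det(np.array(P, dtype=float))) == 1
```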

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
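
numpy has no rref routine; sympy's `Matrix.rref` can illustrate the definition (assuming sympy is available; the matrix is an illustrative example):

```python
import sympy as sp

# rref: pivots equal 1 with zeros above and below; rref also reports
# which columns contain pivots.
A = sp.Matrix([[1, 2, 3],
               [2, 4, 7]])
R, pivot_cols = A.rref()
assert R == sp.Matrix([[1, 2, 0],
                       [0, 0, 1]])
assert pivot_cols == (0, 2)
```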

Schur complement S = D − CA⁻¹B.
Appears in block elimination on [A B; C D].
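
A sketch with 1×1 blocks, also checking the determinant identity det(M) = det(A)·det(S) that block elimination yields (a fact not stated in the glossary entry itself):

```python
import numpy as np

# Block elimination on [[A, B], [C, D]] subtracts C A^-1 times the top
# block row, leaving [[A, B], [0, S]] with S = D - C A^-1 B.
A = np.array([[2.0]]); B = np.array([[1.0]])
C = np.array([[4.0]]); D = np.array([[5.0]])
S = D - C @ np.linalg.inv(A) @ B          # 5 - 4*(1/2)*1 = 3
M = np.block([[A, B], [C, D]])
assert np.allclose(S, [[3.0]])
assert np.allclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```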

Spectral Theorem A = QΛQᵀ.
Real symmetric A has real λ's and orthonormal q's.

Unitary matrix Uᴴ = Ūᵀ = U⁻¹.
Orthonormal columns (complex analog of Q).

Vector v in Rn.
Sequence of n real numbers v = (v₁, ..., vₙ) = point in Rⁿ.