5.3.1: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.2: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.3: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.4: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.5: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.6: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.7: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.8: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.9: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.10: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.11: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.12: In Exercises 1–12, a matrix A and its characteristic polynomial ar...
5.3.13: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.14: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.15: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.16: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.17: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.18: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.19: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.20: In Exercises 13–20, a matrix A is given. Find, if possible, an inv...
5.3.21: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.22: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.23: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.24: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.25: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.26: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.27: In Exercises 21–28, use complex numbers to find an invertible matrix...
5.3.28: In Exercises 21–28, use complex numbers to find an invertible matrix...
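Exercises 13–28 ask for an invertible matrix P such that P^{-1}AP is a diagonal matrix D. A minimal NumPy sketch of that computation; the 2 × 2 matrix below is a made-up illustration, not one of the book's exercises:

```python
import numpy as np

# Made-up diagonalizable matrix (not one of the book's exercises).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.linalg.eig returns the eigenvalues and a matrix P whose columns
# are corresponding eigenvectors.
eigvals, P = np.linalg.eig(A)

# When P is invertible, D = P^{-1} A P is diagonal, with the
# eigenvalues (here 5 and 2, in some order) on its diagonal.
D = np.linalg.inv(P) @ A @ P

print(np.round(D, 10))  # diagonal matrix with 5 and 2 on the diagonal
print(np.allclose(A, P @ np.diag(eigvals) @ np.linalg.inv(P)))  # True
```

If A were not diagonalizable, the eigenvector matrix returned by eig would be singular (or nearly so) and the inversion would fail or be inaccurate.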
5.3.29: Every n × n matrix is diagonalizable.
5.3.30: An n × n matrix A is diagonalizable if and only if there is a basi...
5.3.31: If P is an invertible n × n matrix and D is a diagonal n × n matri...
5.3.32: If P is an invertible matrix and D is a diagonal matrix such that A ...
5.3.33: If A is a diagonalizable matrix, then there exists a unique diagonal...
5.3.34: If an n × n matrix has n distinct eigenvectors, then it is diago...
5.3.35: Every diagonalizable n × n matrix has n distinct eigenvalues.
5.3.36: If B1, B2, ..., Bk are bases for distinct eigenspaces of a matrix ...
5.3.37: If the sum of the multiplicities of the eigenvalues of an n × n ...
5.3.38: If, for each eigenvalue λ of A, the multiplicity of λ equals the di...
5.3.39: If A is a diagonalizable 6 × 6 matrix having two distinct eigenval...
5.3.40: If λ is an eigenvalue of A, then the dimension of the eigenspace co...
5.3.41: A diagonal n × n matrix has n distinct ei...
5.3.42: A diagonal matrix is diagonalizable.
5.3.43: The standard vectors are eigenvectors of a diagonal matrix.
5.3.44: Let A and P be n × n matrices. If the columns of P form a set of...
5.3.45: If S is a set of distinct eigenvectors of a matrix, then S is linea...
5.3.46: If S is a set of eigenvectors of a matrix A that correspond to dist...
5.3.47: If the characteristic polynomial of a matrix A factors into a produ...
5.3.48: If, for each eigenvalue λ of a matrix A, the dimension of the eig...
5.3.49: A 3 × 3 matrix has eigenvalues 4, 2, and 5. Is the matrix diagona...
5.3.50: A 4 × 4 matrix has eigenvalues 3, 1, 2, and 5. Is the matrix dia...
5.3.51: A 4 × 4 matrix has eigenvalues 3, 1, and 2. The eigenvalue 1 ...
5.3.52: A 5 × 5 matrix has eigenvalues 4, which has multiplicity 3, and 6....
5.3.53: A 5 × 5 matrix has eigenvalues 3, which has multiplicity 4, and 7...
5.3.54: Let A be a 4 × 4 matrix with exactly the eigenvalues 2 and 7, and c...
5.3.55: Let A be a 5 × 5 matrix with exactly the eigenvalues 4, 5, and 8, a...
5.3.56: Let A and B be the 2 × 2 matrices given in the text. (a) Show that AB and BA have the ...
5.3.57: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.58: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.59: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.60: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.61: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.62: In Exercises 57–62, an n × n matrix A, a basis for R^n consi...
5.3.63: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.64: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.65: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.66: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.67: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.68: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.69: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.70: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.71: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.72: In Exercises 63–72, a matrix and its characteristic polynomial are g...
5.3.73: Find a 2 × 2 matrix having eigenvalues 3 and 5, with corresponding...
5.3.74: Find a 2 × 2 matrix having eigenvalues 7 and 4, with correspond...
5.3.75: Find a 3 × 3 matrix having eigen...
5.3.76: Find a 3 × 3 matrix having eigenvalues 3, 2, and 2, with correspond...
5.3.77: Give an example of diagonalizable n × n matrices A and B such th...
5.3.78: Give an example of diagonalizable n × n matrices A and B such tha...
5.3.79: Show that every diagonal n × ...
5.3.80: Let A be an n × n matrix having a single eigenvalue c. Show tha...
5.3.81: If A is a diagonalizable matrix, prove that A^T is diagonalizable.
5.3.82: If A is an invertible matrix that is diagonalizable, prove that A^-1 ...
5.3.83: If A is a diagonalizable matrix, prove that A^2 is diagonalizable.
5.3.84: If A is a diagonalizable matrix, prove that A^t is diagonalizable fo...
5.3.85: Suppose that A and B are similar matrices such that B = P^-1 AP for ...
5.3.86: A matrix B is called a cube root of a matrix A if B^3 = A. Prove tha...
5.3.87: Prove that if a nilpotent matrix is diagonalizable, then it must be...
5.3.88: Let A be a diagonalizable n × n matrix. Prove that if the charact...
5.3.89: The trace of a square matrix is the sum of its diagonal entries. (...
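The trace introduced in Exercise 89 can be checked numerically against the eigenvalues; a small sketch with a made-up matrix (assuming NumPy is available), illustrating that the trace equals the sum of the eigenvalues:

```python
import numpy as np

# Made-up 3 x 3 example (upper triangular, so the eigenvalues
# are simply the diagonal entries 2, 3, and 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 5.0]])

trace = np.trace(A)                   # sum of diagonal entries
eig_sum = np.linalg.eigvals(A).sum()  # sum of eigenvalues

print(trace)         # 10.0
print(eig_sum.real)  # 10.0 -- trace equals the sum of the eigenvalues
```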
5.3.90: In Exercises 90–94, use either a calculator with matrix capabili...
5.3.91: In Exercises 90–94, use either a calculator with matrix capabili...
5.3.92: In Exercises 90–94, use either a calculator with matrix capabili...
5.3.93: In Exercises 90–94, use either a calculator with matrix capabili...
5.3.94: In Exercises 90–94, use either a calculator with matrix capabili...
Solutions for Chapter 5.3: EIGENVALUES, EIGENVECTORS, AND DIAGONALIZATION
Full solutions for Elementary Linear Algebra: A Matrix Approach, 2nd Edition
ISBN: 9780131871410
Chapter 5.3: EIGENVALUES, EIGENVECTORS, AND DIAGONALIZATION includes 94 full step-by-step solutions for the textbook Elementary Linear Algebra: A Matrix Approach, 2nd edition (ISBN 9780131871410).

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Cayley–Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
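A quick numerical check of the theorem on a made-up 2 × 2 matrix, for which the characteristic polynomial is λ^2 − (trace)λ + det:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
t = np.trace(A)       # 5
d = np.linalg.det(A)  # -2

# Cayley-Hamilton: substituting A into its own characteristic
# polynomial gives the zero matrix.
pA = A @ A - t * A + d * np.eye(2)
print(np.allclose(pA, 0))  # True
```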

Complete solution x = xp + xn to Ax = b.
(Particular xp) + (xn in nullspace).

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
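A hand-rolled sketch of one elimination step producing A = LU, on a made-up 2 × 2 matrix (assuming, as the definition does, that no row exchanges are needed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])

U = A.copy()
L = np.eye(2)

# Single elimination step: subtract a multiple of row 0 from row 1.
L[1, 0] = U[1, 0] / U[0, 0]   # multiplier l_21 = 6/2 = 3 (l_11 = l_22 = 1)
U[1] -= L[1, 0] * U[0]        # row 1 becomes [0, 5]

print(np.allclose(A, L @ U))  # True: L brings U back to A
```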

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0), with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.

Gram–Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
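NumPy's np.linalg.qr computes this factorization (note that it does not enforce the diag(R) > 0 convention; signs may be flipped). A sketch on a made-up matrix with independent columns:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)

print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q has orthonormal columns
print(np.allclose(R, np.triu(R)))       # True: R is upper triangular
print(np.allclose(A, Q @ R))            # True: A = QR
```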

Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Linearly dependent v1, ..., vn.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector s, with Ms = s > 0.
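A sketch of the convergence claim on a made-up 2 × 2 Markov matrix: high powers of M have nearly identical columns, and that common column is the steady-state eigenvector.

```python
import numpy as np

# Made-up Markov matrix: entries nonnegative, each column sums to 1.
M = np.array([[0.9, 0.2],
              [0.1, 0.8]])

# Powers of M approach a rank-one matrix whose columns are the
# steady state s.
Mk = np.linalg.matrix_power(M, 100)
s = Mk[:, 0]

print(np.allclose(M @ s, s))     # True: Ms = s, the lambda = 1 eigenvector
print(np.isclose(s.sum(), 1.0))  # True: s is a probability vector
```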

Norm ||A||.
The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
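np.linalg.norm computes each of these matrix norms; a sketch on a made-up matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 2.0]])

two_norm = np.linalg.norm(A, 2)       # sigma_max, the l^2 norm
fro_norm = np.linalg.norm(A, 'fro')   # sqrt(1 + 4 + 0 + 4) = 3
one_norm = np.linalg.norm(A, 1)       # largest column sum: |2| + |2| = 4
inf_norm = np.linalg.norm(A, np.inf)  # largest row sum: |1| + |2| = 3

print(one_norm, inf_norm)         # 4.0 3.0
print(np.isclose(fro_norm, 3.0))  # True
print(two_norm <= fro_norm)       # True: sigma_max never exceeds Frobenius
```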

Normal equation A^T A x̂ = A^T b.
Gives the least-squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
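Solving the normal equation directly, on a made-up overdetermined system (three equations, two unknowns):

```python
import numpy as np

# Made-up least-squares problem: fit the best line c + d*t through
# the points (0, 6), (1, 0), (2, 0).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equation: A^T A xhat = A^T b (A has independent columns).
xhat = np.linalg.solve(A.T @ A, A.T @ b)

print(xhat)                                  # approximately [5, -3]
print(np.allclose(A.T @ (b - A @ xhat), 0))  # True: the residual is
                                             # perpendicular to the columns
```

The best-fit line is 5 − 3t; the residual b − Ax̂ lies in the left nullspace of A, exactly as the definition above states.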

Nullspace matrix N.
The columns of N are the n − r special solutions to As = 0.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
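A rotation matrix gives a quick check of these properties (the angle below is arbitrary):

```python
import numpy as np

theta = np.pi / 3  # arbitrary angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation: orthogonal

x = np.array([3.0, 4.0])

print(np.allclose(Q.T @ Q, np.eye(2)))       # True: Q^T = Q^-1
print(np.isclose(np.linalg.norm(Q @ x), 5))  # True: length is preserved
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1))  # True: all |lambda| = 1
```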

Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S, then P = A(A^T A)^-1 A^T.
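A sketch of P = A(A^T A)^-1 A^T and its properties, using a made-up basis for a plane in R^3:

```python
import numpy as np

# Columns of A: a made-up basis for a 2-dimensional subspace S of R^3.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])

P = A @ np.linalg.inv(A.T @ A) @ A.T  # projection onto S

b = np.array([6.0, 0.0, 0.0])
p = P @ b   # closest point to b in S
e = b - p   # error, perpendicular to S

print(np.allclose(P @ P, P))    # True: P^2 = P
print(np.allclose(P.T, P))      # True: P = P^T
print(np.allclose(A.T @ e, 0))  # True: e is perpendicular to S
```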

Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.

Standard basis for R^n.
Columns of the n by n identity matrix (written i, j, k in R^3).

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n−1} x^{n−1} with p(x_i) = b_i. V_ij = (x_i)^{j−1} and det V = product of (x_k − x_i) for k > i.
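np.vander builds V (with increasing=True it matches V_ij = x_i^{j−1}, indexing columns from 0), and solving Vc = b recovers the interpolation coefficients. The points below are a made-up example:

```python
import numpy as np

# Made-up interpolation data: p(0) = 1, p(1) = 3, p(2) = 7.
x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])

V = np.vander(x, increasing=True)  # V[i, j] = x[i] ** j
c = np.linalg.solve(V, b)          # coefficients of c0 + c1*x + c2*x^2

print(c)                                  # approximately [1, 1, 1]
print(np.isclose(np.linalg.det(V), 2.0))  # True: (1-0)(2-0)(2-1) = 2
```

The interpolating polynomial is p(x) = 1 + x + x^2, which indeed passes through all three points.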