 6.3.1: In each of the following, factor the matrix A into a product XDX⁻¹, ...
 6.3.2: For each of the matrices in Exercise 1, use the XDX⁻¹ factorization ...
 6.3.3: For each of the nonsingular matrices in Exercise 1, use the XDX⁻¹ fa...
 6.3.4: For each of the following, find a matrix B such that B² = A: (a) A ...
 6.3.5: Let A be a nondefective n × n matrix with diagonalizing matrix X. Show...
 6.3.6: Let A be a diagonalizable matrix whose eigenvalues are all either 1...
 6.3.7: Show that any 3 × 3 matrix of the form [a 1 0; 0 a 1; 0 0 b] is defective.
 6.3.8: For each of the following, find all possible values of the scalar t...
 6.3.9: Let A be a 4 × 4 matrix and let λ be an eigenvalue of multiplicity 3. If...
 6.3.10: Let A be an n × n matrix with positive real eigenvalues λ₁ > λ₂ > ··· > λₙ. ...
 6.3.11: Let A be an n × n matrix with real entries and let λ₁ = a + bi (where a...
 6.3.12: Let A be an n × n matrix with an eigenvalue λ of multiplicity n. Show t...
 6.3.13: Show that a nonzero nilpotent matrix is defective.
 6.3.14: Let A be a diagonalizable matrix and let X be the diagonalizing mat...
 6.3.15: It follows from Exercise 14 that, for a diagonalizable matrix, the ...
 6.3.16: Let A be an n × n matrix and let λ be an eigenvalue of A whose eigenspa...
 6.3.17: Let x, y be nonzero vectors in Rⁿ, n ≥ 2, and let A = xyᵀ. Show that...
 6.3.18: Let A be a diagonalizable n × n matrix. Prove that if B is any matrix...
 6.3.19: Show that if A and B are two n × n matrices with the same diagonalizi...
 6.3.20: Let T be an upper triangular matrix with distinct diagonal entries ...
 6.3.21: Each year, employees at a company are given the option of donating ...
 6.3.22: The city of Mawtookit maintains a constant population of 300,000 pe...
 6.3.23: Let A = [1/2 1/3 1/5; 1/4 1/3 2/5; 1/4 1/3 2/5] be a transition matrix ...
 6.3.24: Consider a Web network consisting of only four sites that are linke...
 6.3.25: Let A be an n × n stochastic matrix and let e be the vector in Rⁿ who...
 6.3.26: The transition matrix in Example 5 has the property that both its r...
 6.3.27: Let A be the PageRank transition matrix and let xk be a vector in t...
 6.3.28: Use the definition of the matrix exponential to compute e^A for each...
 6.3.29: Compute e^A for each of the following matrices: (a) A = 2 1 6 3 (b) ...
 6.3.30: In each of the following, solve the initial value problem Y′ = AY, ...
 6.3.31: Let λ be an eigenvalue of an n × n matrix A and let x be an eigenvector...
 6.3.32: Show that e^A is nonsingular for any diagonalizable matrix A.
 6.3.33: Let A be a diagonalizable matrix with characteristic polynomial p(λ)...
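The recurring theme of Exercises 1–3 and 28–33, factoring A = XDX⁻¹ and then using e^A = X e^D X⁻¹, can be sketched numerically. This is an illustrative NumPy sketch, not part of the textbook; `expm_diag` is a name chosen here, and the method is valid only when A is diagonalizable:

```python
import numpy as np

def expm_diag(A):
    """Matrix exponential via diagonalization: if A = X D X^-1,
    then e^A = X e^D X^-1 (valid when A is diagonalizable)."""
    evals, X = np.linalg.eig(A)
    return X @ np.diag(np.exp(evals)) @ np.linalg.inv(X)

# A diagonalizable example with eigenvalues 1 and -1
A = np.array([[2.0, -3.0],
              [1.0, -2.0]])
E = expm_diag(A)

# As in Exercise 32, e^A is nonsingular: det(e^A) = e^{trace A} != 0
print(abs(np.linalg.det(E)) > 0)
```

Since trace A = 0 here, det(e^A) = e⁰ = 1, consistent with Exercise 32's claim that e^A is always nonsingular for diagonalizable A.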
Solutions for Chapter 6.3: Diagonalization
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
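This rank criterion is easy to check numerically. A small sketch using NumPy's `matrix_rank`; the helper name `is_solvable` is chosen here, not from the text:

```python
import numpy as np

def is_solvable(A, b):
    """Ax = b has a solution iff rank([A b]) == rank(A),
    i.e. b lies in the column space of A."""
    A = np.asarray(A, dtype=float)
    aug = np.column_stack([A, b])          # augmented matrix [A b]
    return np.linalg.matrix_rank(aug) == np.linalg.matrix_rank(A)

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                 # rank 1
print(is_solvable(A, [1.0, 2.0]))          # b in column space  -> True
print(is_solvable(A, [1.0, 0.0]))          # b not in column space -> False
```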

Basis for V.
Independent vectors v₁, ..., v_d whose linear combinations give each vector in V as v = c₁v₁ + ··· + c_d v_d. V has many bases; each basis gives unique c's.
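Finding the unique coefficients c for a given v amounts to solving a linear system with the basis vectors as columns. A minimal sketch with an assumed 2-D basis:

```python
import numpy as np

# Basis vectors v1, v2 as the columns of V (a small example chosen here)
V = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# The unique coefficients c with v = c1*v1 + c2*v2 solve V c = v
c = np.linalg.solve(V, v)
```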

Cayley–Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
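A numerical check of the theorem: evaluate the characteristic polynomial at the matrix itself and confirm the result is the zero matrix. A sketch using NumPy's `np.poly` (which returns the characteristic polynomial's coefficients); `char_poly_at_matrix` is a name chosen here:

```python
import numpy as np

def char_poly_at_matrix(A):
    """Evaluate A's characteristic polynomial at lambda = A.
    Cayley-Hamilton says the result is the zero matrix."""
    coeffs = np.poly(A)                # coefficients of det(lambda I - A)
    n = A.shape[0]
    P = np.zeros((n, n))
    for c in coeffs:
        P = P @ A + c * np.eye(n)      # Horner's rule with a matrix argument
    return P

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# p(lambda) = lambda^2 - 5*lambda - 2, and p(A) = 0
```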

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B|, |A⁻¹| = 1/|A|, and |Aᵀ| = |A|. The big formula for det(A) has a sum of n! terms; the cofactor formula uses determinants of size n − 1; volume of box = |det(A)|.

Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓᵢⱼ (and ℓᵢᵢ = 1) brings U back to A.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).
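The A = LU entry above can be made concrete. A minimal Doolittle-style sketch, assuming no row exchanges are needed (all pivots nonzero); `lu_no_pivot` is a name chosen here:

```python
import numpy as np

def lu_no_pivot(A):
    """Return unit lower triangular L (holding the elimination
    multipliers l_ij) and upper triangular U with A = L U.
    Assumes elimination succeeds without row exchanges."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for j in range(n - 1):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]    # multiplier l_ij
            U[i, :] -= L[i, j] * U[j, :]   # eliminate entry (i, j)
    return L, U

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
L, U = lu_no_pivot(A)   # L = [[1,0],[3,1]], U = [[2,1],[0,5]]
```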

Normal matrix.
If N Nᵀ = Nᵀ N, then N has orthonormal (complex) eigenvectors.

Orthonormal vectors q₁, ..., qₙ.
Dot products are qᵢᵀqⱼ = 0 if i ≠ j and qᵢᵀqᵢ = 1. The matrix Q with these orthonormal columns has QᵀQ = I. If m = n then Qᵀ = Q⁻¹ and q₁, ..., qₙ is an orthonormal basis for Rⁿ: every v = Σ (vᵀqⱼ)qⱼ.
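One standard way to produce orthonormal columns is Gram–Schmidt, available through NumPy's QR factorization. A sketch with a small example matrix chosen here:

```python
import numpy as np

# QR (Gram-Schmidt in effect) gives orthonormal columns q1, q2
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)

# Q^T Q = I for orthonormal columns
print(np.allclose(Q.T @ Q, np.eye(2)))   # prints True

# For v in the column space, v = sum of (v^T q_j) q_j
v = A[:, 0]
```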

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Plane (or hyperplane) in Rⁿ.
Vectors x with aᵀx = 0. The plane is perpendicular to a ≠ 0.

Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.

Rotation matrix.
R = [c −s; s c] rotates the plane by θ, and R⁻¹ = Rᵀ rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}; eigenvectors are (1, ±i). c, s = cos θ, sin θ.
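The complex eigenvalues e^{±iθ} of a rotation matrix can be verified directly. A small NumPy sketch with an arbitrary angle chosen here:

```python
import numpy as np

theta = 0.3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])   # rotation by theta

evals, evecs = np.linalg.eig(R)
# Eigenvalues are e^{i*theta} and e^{-i*theta}; also R^{-1} = R^T
```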

Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.

Singular Value Decomposition (SVD).
A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avᵢ = σᵢuᵢ and singular values σᵢ > 0. The last columns are orthonormal bases of the nullspaces.
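The defining relation Avᵢ = σᵢuᵢ can be checked with NumPy's SVD (which returns Vᵀ, with singular values sorted in decreasing order); the example matrix is chosen here:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)   # A = U diag(s) V^T
V = Vt.T

# Each column of V maps to a singular-value multiple of a column of U
```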

Spectral Theorem A = QΛQᵀ.
Real symmetric A has real λ's and orthonormal q's.
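For symmetric matrices, NumPy's `eigh` returns exactly this factorization: real eigenvalues (in ascending order) and orthonormal eigenvectors. A sketch with a small symmetric example chosen here:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # real symmetric
lam, Q = np.linalg.eigh(A)          # real eigenvalues, orthonormal Q

# A = Q diag(lam) Q^T, with Q^T Q = I
```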

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Transpose matrix Aᵀ.
Entries (Aᵀ)ᵢⱼ = Aⱼᵢ. If A is m by n, then Aᵀ is n by m; AᵀA is square, symmetric, positive semidefinite. The transposes of AB and A⁻¹ are BᵀAᵀ and (Aᵀ)⁻¹.

Vector addition.
v + w = (v₁ + w₁, ..., vₙ + wₙ) = diagonal of parallelogram.