 6.1.1: Find the eigenvalues and the corresponding eigenspaces for each of ...
 6.1.2: Show that the eigenvalues of a triangular matrix are the diagonal e...
 6.1.3: Let A be an n × n matrix. Prove that A is singular if and only if λ = 0...
 6.1.4: Let A be a nonsingular matrix and let λ be an eigenvalue of A. Show t...
 6.1.5: Let A and B be n × n matrices. Show that if none of the eigenvalues o...
 6.1.6: Let λ be an eigenvalue of A and let x be an eigenvector belonging t...
 6.1.7: Let A be an n × n matrix and let B = I − 2A + A^2. (a) Show that if x is...
 6.1.8: An n × n matrix A is said to be idempotent if A^2 = A. Show that if ...
 6.1.9: An n × n matrix is said to be nilpotent if A^k = O for some positive in...
 6.1.10: Let A be an n × n matrix and let B = A − αI for some scalar α. How do the...
 6.1.11: Let A be an n × n matrix and let B = A + I. Is it possible for A and ...
 6.1.12: Show that A and AT have the same eigenvalues. Do they necessarily h...
 6.1.13: Show that the matrix A = [cos θ  −sin θ; sin θ  cos θ] will have complex eigenval...
 6.1.14: Let A be a 2 × 2 matrix. If tr(A) = 8 and det(A) = 12, what are the e...
 6.1.15: Let A = (aij) be an n × n matrix with eigenvalues λ1, ..., λn. Show th...
 6.1.16: Let A be a 2 × 2 matrix and let p(λ) = λ^2 + bλ + c be the characteristic...
 6.1.17: Let λ be a nonzero eigenvalue of A and let x be an eigenvector belong...
 6.1.18: Let A be an n × n matrix and let λ be an eigenvalue of A. If A − λI has ...
 6.1.19: Let A be an n × n matrix. Show that a vector x in either Rn or Cn is ...
 6.1.20: Let α = a + bi and β = c + di be complex scalars and let A and B be mat...
 6.1.21: Let Q be an orthogonal matrix. (a) Show that if λ is an eigenvalue of...
 6.1.22: Let Q be an orthogonal matrix with an eigenvalue λ1 = 1 and let x be...
 6.1.23: Let Q be a 3 × 3 orthogonal matrix whose determinant is equal to 1. (...
 6.1.24: Let x1, ..., xr be eigenvectors of an n × n matrix A and let S be th...
 6.1.25: Let A be an n × n matrix and let λ be an eigenvalue of A. Show that if ...
 6.1.26: Let B = S^(-1)AS and let x be an eigenvector of B belonging to an eigen...
 6.1.27: Let A be an n × n matrix with an eigenvalue λ and let x be an eigenvect...
 6.1.28: Show that if two n × n matrices A and B have a common eigenvector x (...
 6.1.29: Let A be an n × n matrix and let λ be a nonzero eigenvalue of A. Show t...
 6.1.30: Let {u1, u2, ..., un} be an orthonormal basis for Rn and let A be ...
 6.1.31: Let A be a matrix whose columns all add up to a fixed constant δ. Sh...
 6.1.32: Let λ1 and λ2 be distinct eigenvalues of A. Let x be an eigenvector o...
 6.1.33: Let A and B be n × n matrices. Show that (a) If λ is a nonzero eigenval...
 6.1.34: Prove that there do not exist n × n matrices A and B such that AB − BA ...
 6.1.35: Let p(λ) = (−1)^n (λ^n − a_(n−1) λ^(n−1) − ... − a1 λ − a0) be a polynomial of degree n ≥ 1, and le...
 6.1.36: The result given in Exercise 35(b) holds even if all the eigenvalue...
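Most of these exercises are symbolic, but 6.1.14 has concrete numbers that can be checked directly: for a 2 × 2 matrix, det(A − λI) = λ^2 − tr(A)λ + det(A). A small NumPy sketch (the matrix A below is a made-up example with the stated trace and determinant, not from the text):

```python
import numpy as np

# Exercise 6.1.14: with tr(A) = 8 and det(A) = 12, the eigenvalues solve
# lambda^2 - 8*lambda + 12 = 0.
tr_A, det_A = 8, 12
eigs = np.roots([1, -tr_A, det_A])

# Any 2x2 matrix with this trace and determinant has the same eigenvalues;
# this particular A is a hypothetical example.
A = np.array([[5.0, 3.0],
              [1.0, 3.0]])   # tr = 8, det = 15 - 3 = 12
assert np.allclose(np.sort(eigs), np.sort(np.linalg.eigvals(A)))
```

The quadratic factors as (λ − 2)(λ − 6), so the eigenvalues are λ = 2 and λ = 6.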
Solutions for Chapter 6.1: Eigenvalues and Eigenvectors
Full solutions for Linear Algebra with Applications  9th Edition
ISBN: 9780321962218

Affine transformation
Tv = Av + v0 = linear transformation plus shift.

Cholesky factorization
A = C^T C = (L√D)(L√D)^T for positive definite A.
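NumPy's `cholesky` returns the lower-triangular factor L with A = L L^T, so the glossary's upper-triangular C is L^T. A quick check on a hypothetical positive definite matrix:

```python
import numpy as np

# A small positive definite matrix (made-up example).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

L = np.linalg.cholesky(A)   # lower triangular, A = L @ L.T
C = L.T                     # the glossary's upper-triangular factor
assert np.allclose(C.T @ C, A)
```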

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n−1) − λ^n).
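A check of the construction for n = 3, using the polynomial λ^3 − 6λ^2 + 11λ − 6 with known roots 1, 2, 3 (a hypothetical example):

```python
import numpy as np

# Companion matrix with lambda^3 = c3*lambda^2 + c2*lambda + c1,
# chosen so the characteristic polynomial is (l-1)(l-2)(l-3).
c1, c2, c3 = 6.0, -11.0, 6.0
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [c1,  c2,  c3]])   # coefficients in row n, ones above diagonal

eigs = np.sort(np.linalg.eigvals(A))
assert np.allclose(eigs, [1.0, 2.0, 3.0])
```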

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A) · (column j of B).

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
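A minimal Doolittle-style sketch of this factorization, assuming no row exchanges are needed (the helper `lu_no_pivot` and the test matrix are illustrative, not from the text):

```python
import numpy as np

def lu_no_pivot(A):
    """A = L @ U with L unit lower triangular; assumes nonzero pivots."""
    A = A.astype(float)
    n = A.shape[0]
    L = np.eye(n)
    U = A.copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]     # the multiplier l_ik
            U[i, :] -= L[i, k] * U[k, :]    # eliminate below the pivot
    return L, U

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)   # L (with unit diagonal) brings U back to A
```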

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
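Binet's formula and the matrix powers can be compared directly (a small NumPy sketch):

```python
import numpy as np

# Fibonacci matrix: its powers contain F_n, its eigenvalues give Binet's formula.
F = np.array([[1, 1],
              [1, 0]])

l1 = (1 + np.sqrt(5)) / 2   # golden ratio, the growth rate
l2 = (1 - np.sqrt(5)) / 2

def fib(n):
    # F_n = (l1^n - l2^n) / (l1 - l2), rounded to the nearest integer
    return round((l1**n - l2**n) / (l1 - l2))

# F^n = [[F_{n+1}, F_n], [F_n, F_{n-1}]]; check the (0, 1) entry for n = 10.
assert np.linalg.matrix_power(F, 10)[0, 1] == fib(10) == 55
```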

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).

Multiplication Ax
= Xl (column 1) + ... + xn(column n) = combination of columns.

Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b − Ax̂) = 0.
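A short sketch of solving the normal equations and checking the perpendicularity claim (the data below are hypothetical):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])       # full column rank n = 2
b = np.array([1.0, 2.0, 2.0])

# Solve A^T A xhat = A^T b.
xhat = np.linalg.solve(A.T @ A, A.T @ b)

# Same answer from the library's least squares routine.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(xhat, x_ls)

# The residual b - A xhat is perpendicular to every column of A.
assert np.allclose(A.T @ (b - A @ xhat), 0.0)
```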

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
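These properties are easy to verify numerically for a rotation matrix (the angle below is arbitrary):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation, an orthogonal Q

assert np.allclose(Q.T @ Q, np.eye(2))            # Q^T = Q^(-1)

x = np.array([3.0, 4.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))   # length preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)         # all |lambda| = 1
```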

Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q1, ..., qn is an orthonormal basis for Rn: every v = Σ (v^T qj) qj.

Outer product uv T
= column times row = rank one matrix.

Pseudoinverse A+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+ A and A A+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).
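NumPy's `pinv` computes this Moore-Penrose inverse; the sketch below checks defining properties on a hypothetical rank-1 matrix, where the ordinary inverse does not exist:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # singular, rank 1
Ap = np.linalg.pinv(A)

# Penrose conditions: A A+ A = A and A+ A A+ = A+.
assert np.allclose(A @ Ap @ A, A)
assert np.allclose(Ap @ A @ Ap, Ap)

# A A+ is the (symmetric) projection onto the column space; ranks match.
assert np.allclose((A @ Ap).T, A @ Ap)
assert np.linalg.matrix_rank(Ap) == np.linalg.matrix_rank(A) == 1
```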

Schur complement S = D − CA^(-1)B.
Appears in block elimination on [A B; C D].
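A sketch with 1 × 1 blocks (hypothetical numbers) showing the Schur complement and the determinant identity det [A B; C D] = det(A) det(S) when A is invertible:

```python
import numpy as np

A = np.array([[2.0]])
B = np.array([[1.0]])
C = np.array([[4.0]])
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B     # 5 - 4*(1/2)*1 = 3

M = np.block([[A, B],
              [C, D]])               # the full block matrix
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```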

Similar matrices A and B.
Every B = M^(-1)AM has the same eigenvalues as A.
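A quick numerical check that similarity preserves eigenvalues (the random M below is almost surely invertible; illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3))      # almost surely invertible

B = np.linalg.inv(M) @ A @ M         # B = M^(-1) A M, similar to A

# Same eigenvalues (compare as sorted complex multisets).
assert np.allclose(np.sort_complex(np.linalg.eigvals(A)),
                   np.sort_complex(np.linalg.eigvals(B)))
```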

Solvable system Ax = b.
The right side b is in the column space of A.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Unitary matrix U^H = Ū^T = U^(-1).
Orthonormal columns (complex analog of Q).
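A small complex example (the matrix below is a made-up unitary matrix) checking U^H U = I and length preservation:

```python
import numpy as np

# A 2x2 unitary matrix (hypothetical example).
U = np.array([[1.0, 1.0j],
              [1.0j, 1.0]]) / np.sqrt(2)

# U^H = conjugate transpose; for unitary U it equals U^(-1).
assert np.allclose(U.conj().T @ U, np.eye(2))

z = np.array([1.0 + 2.0j, 3.0 - 1.0j])
assert np.isclose(np.linalg.norm(U @ z), np.linalg.norm(z))   # length preserved
```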