 6.1.1: Find the eigenvalues and the corresponding eigenspaces for each of ...
 6.1.2: Show that the eigenvalues of a triangular matrix are the diagonal e...
6.1.3: Let A be an n × n matrix. Prove that A is singular if and only if λ = 0...
6.1.4: Let A be a nonsingular matrix and let λ be an eigenvalue of A. Show t...
6.1.5: Let A and B be n × n matrices. Show that if none of the eigenvalues o...
6.1.6: Let λ be an eigenvalue of A and let x be an eigenvector belonging to ...
6.1.7: Let A be an n × n matrix and let B = I − 2A + A². (a) Show that if x is ...
6.1.8: An n × n matrix A is said to be idempotent if A² = A. Show that if λ is...
6.1.9: An n × n matrix is said to be nilpotent if Aᵏ = O for some positive in...
6.1.10: Let A be an n × n matrix and let B = A − αI for some scalar α. How do the...
6.1.11: Let A be an n × n matrix and let B = A + I. Is it possible for A and ...
 6.1.12: Show that A and AT have the same eigenvalues. Do they necessarily h...
6.1.13: Show that the matrix A = [cos θ  −sin θ; sin θ  cos θ] will have complex eigenval...
6.1.14: Let A be a 2 × 2 matrix. If tr(A) = 8 and det(A) = 12, what are the e...
6.1.15: Let A = (aᵢⱼ) be an n × n matrix with eigenvalues λ₁, . . . , λₙ. Sho...
6.1.16: Let A be a 2 × 2 matrix and let p(λ) = λ² + bλ + c be the characteristic pol...
6.1.17: Let λ be a nonzero eigenvalue of A and let x be an eigenvector belong...
6.1.18: Let A be an n × n matrix and let λ be an eigenvalue of A. If A − λI has ran...
6.1.19: Let A be an n × n matrix. Show that a vector x in Rⁿ is an eigenvecto...
6.1.20: Let α = a + bi and β = c + di be complex scalars and let A and B be matri...
6.1.21: Let Q be an orthogonal matrix. (a) Show that if λ is an eigenvalue of...
6.1.22: Let Q be an orthogonal matrix with an eigenvalue λ₁ = 1 and let x be ...
6.1.23: Let Q be a 3 × 3 orthogonal matrix whose determinant is equal to 1. (...
6.1.24: Let x₁, . . . , xᵣ be eigenvectors of an n × n matrix A and let S be ...
6.1.25: Let A be an n × n matrix and let λ be an eigenvalue of A. Show that if ...
6.1.26: Let B = S⁻¹AS and let x be an eigenvector of B belonging to an eigen...
6.1.27: Let A be an n × n matrix with an eigenvalue λ and let x be an eigenvect...
6.1.28: Show that if two n × n matrices A and B have a common eigenvector x (...
6.1.29: Let A be an n × n matrix and let λ be a nonzero eigenvalue of A. Show t...
6.1.30: Let {u₁, u₂, . . . , uₙ} be an orthonormal basis for Rⁿ and let A b...
6.1.31: Let A be a matrix whose columns all add up to a fixed constant δ. Sh...
6.1.32: Let λ₁ and λ₂ be distinct eigenvalues of A. Let x be an eigenvector o...
6.1.33: Let A and B be n × n matrices. Show that (a) If λ is a nonzero eigenval...
6.1.34: Prove that there do not exist n × n matrices A and B such that AB − BA ...
6.1.35: Let p(λ) = (−1)ⁿ(λⁿ − aₙ₋₁λⁿ⁻¹ − · · · − a₁λ − a₀) be a polynomial of degree n ≥ 1, and let...
 6.1.36: The result given in Exercise 35(b) holds even if all the eigenvalue...
Solutions for Chapter 6.1: Eigenvalues and Eigenvectors
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290

Affine transformation
Tv = Av + Vo = linear transformation plus shift.

Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
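
For a 2 × 2 matrix the characteristic equation is λ² − tr(A)λ + det(A) = 0, so both eigenvalues follow from the quadratic formula. A minimal sketch (the function name and example entries are made up here; the entries are chosen so tr(A) = 8 and det(A) = 12, as in Exercise 6.1.14):

```python
import math

def eig2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]] from det(A - lam*I) = 0,
    i.e. lam^2 - tr(A)*lam + det(A) = 0 (real roots assumed)."""
    tr = a + d
    det = a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)
    return (tr - disc) / 2, (tr + disc) / 2

# tr(A) = 8 and det(A) = 12 give eigenvalues 2 and 6
print(eig2x2(5, 3, 1, 3))  # -> (2.0, 6.0)
```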

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c₀I + c₁S + · · · + cₙ₋₁Sⁿ⁻¹. Cx = convolution c * x. Eigenvectors in F.
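
A minimal pure-Python sketch (the helper names are my own) checking that multiplying by a circulant matrix agrees with cyclic convolution:

```python
def circulant(c):
    """C[i][j] = c[(i - j) % n]: first column is c, diagonals wrap around."""
    n = len(c)
    return [[c[(i - j) % n] for j in range(n)] for i in range(n)]

def cyclic_conv(c, x):
    """(c * x)_i = sum_j c[(i - j) % n] * x[j]  (cyclic convolution)."""
    n = len(c)
    return [sum(c[(i - j) % n] * x[j] for j in range(n)) for i in range(n)]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

c, x = [1, 2, 3], [4, 5, 6]
assert matvec(circulant(c), x) == cyclic_conv(c, x)
```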

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx − xᵀb over growing Krylov subspaces.
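
A bare-bones sketch of the iteration under this definition (not the textbook's code; the small test matrix is made up). For an n × n positive definite system, convergence in n steps is expected up to roundoff:

```python
def conjugate_gradient(A, b, steps):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A
    by conjugate-gradient steps (exact in n steps, up to roundoff)."""
    n = len(b)
    x = [0.0] * n
    r = b[:]                      # residual b - A x
    p = r[:]                      # search direction
    for _ in range(steps):
        Ap = [sum(A[i][j] * p[j] for j in range(n)) for i in range(n)]
        rr = sum(ri * ri for ri in r)
        alpha = rr / sum(pi * Api for pi, Api in zip(p, Ap))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        beta = sum(ri * ri for ri in r) / rr
        p = [ri + beta * pi for ri, pi in zip(r, p)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]     # symmetric positive definite
b = [1.0, 2.0]
x = conjugate_gradient(A, b, 2)  # x = [1/11, 7/11], the solution of Ax = b
```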

Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓᵢⱼ (and ℓᵢᵢ = 1) brings U back to A.
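
A minimal elimination sketch (function name and example matrix are my own; no row exchanges are assumed, as in the definition):

```python
def lu(A):
    """Factor A = L U by elimination without row exchanges:
    L is lower triangular with unit diagonal, U is upper triangular."""
    n = len(A)
    U = [row[:] for row in A]
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]        # multiplier l_ik
            for j in range(k, n):
                U[i][j] -= L[i][k] * U[k][j]   # eliminate below the pivot
    return L, U

L, U = lu([[2.0, 1.0], [6.0, 8.0]])
# L = [[1, 0], [3, 1]] and U = [[2, 1], [0, 5]]; multiplying L U rebuilds A
```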

Fourier matrix F.
Entries Fⱼₖ = e^(2πijk/n) give orthogonal columns: F̄ᵀF = nI. Then y = Fc is the (inverse) Discrete Fourier Transform yⱼ = Σ cₖ e^(2πijk/n).
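
A short numeric check of the column-orthogonality claim F̄ᵀF = nI for n = 4 (pure Python with cmath; the variable names are mine):

```python
import cmath

n = 4
# Fourier matrix entries F_jk = e^(2*pi*i*j*k/n)
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

# G = conj(F)^T F should equal n I
G = [[sum(F[m][j].conjugate() * F[m][k] for m in range(n))
      for k in range(n)] for j in range(n)]
for j in range(n):
    for k in range(n):
        target = n if j == k else 0
        assert abs(G[j][k] - target) < 1e-9
```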

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Fundamental Theorem.
The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (perpendicular from Ax = 0, with dimensions n − r and r). Applied to Aᵀ, the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Left inverse A⁺.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.

Markov matrix M.
All mᵢⱼ ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mᵢⱼ > 0, the columns of Mᵏ approach the steady-state eigenvector s: Ms = s > 0.
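
A small made-up 2 × 2 example of the convergence claim: repeated multiplication by M drives any starting vector with sum 1 toward the steady state s with Ms = s:

```python
def markov_step(M, x):
    """One step x -> M x; the column sums of M are 1, so sum(x) is preserved."""
    n = len(x)
    return [sum(M[i][j] * x[j] for j in range(n)) for i in range(n)]

M = [[0.8, 0.3],
     [0.2, 0.7]]          # positive entries, each column sums to 1
x = [1.0, 0.0]
for _ in range(50):
    x = markov_step(M, x)
# x approaches the steady state s = [0.6, 0.4], which satisfies M s = s
```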

Nullspace matrix N.
The columns of N are the n − r special solutions to As = 0.

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓᵢⱼ| ≤ 1. See condition number.

Permutation matrix P.
There are n! orders of 1, . . . , n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = +1 or −1) based on the number of row exchanges to reach I.

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: xᵀAx > 0 unless x = 0. Then A = LDLᵀ with diag(D) > 0.
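
One practical test of positive definiteness is attempting a Cholesky factorization A = LLᵀ, which succeeds exactly when all pivots are positive. A minimal sketch (the example matrix is made up):

```python
import math

def cholesky(A):
    """Cholesky factor L with A = L L^T; the sqrt fails exactly when a
    pivot is not positive, i.e. when the symmetric A is not positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # pivot must be > 0
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

A = [[4.0, 2.0], [2.0, 3.0]]   # symmetric, x^T A x > 0 for x != 0
L = cholesky(A)                # L = [[2, 0], [1, sqrt(2)]]
```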

Pseudoinverse A⁺ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(Aᵀ). A⁺A and AA⁺ are the projection matrices onto the row space and column space. Rank(A⁺) = rank(A).

Rank r(A).
Number of pivots = dimension of column space = dimension of row space.

Schur complement S = D − CA⁻¹B.
Appears in block elimination on [A B; C D].
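
A scalar sketch of block elimination (1 × 1 blocks and made-up numbers for clarity): subtracting CA⁻¹ times the first block row leaves the Schur complement S in the (2,2) position, and the determinant factors as det(A)·det(S) when A is invertible:

```python
# 1x1 "blocks" of [[A, B], [C, D]]
A, B, C, D = 4.0, 2.0, 6.0, 5.0

# Schur complement left after eliminating the first block row
S = D - C * (1.0 / A) * B      # 5 - 6*(1/4)*2 = 2

# Determinant identity: det [[A, B], [C, D]] = det(A) * det(S)
M = [[A, B], [C, D]]
det_M = M[0][0] * M[1][1] - M[0][1] * M[1][0]
assert abs(det_M - A * S) < 1e-12
```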

Spectrum of A = the set of eigenvalues {λ₁, . . . , λₙ}.
Spectral radius = max of |λᵢ|.