 6.1.1: Find the eigenvalues and the corresponding eigenspaces for each of ...
 6.1.2: Show that the eigenvalues of a triangular matrix are the diagonal e...
6.1.3: Let A be an n × n matrix. Prove that A is singular if and only if λ = 0...
6.1.4: Let A be a nonsingular matrix and let λ be an eigenvalue of A. Show t...
6.1.5: Let A and B be n × n matrices. Show that if none of the eigenvalues o...
6.1.6: Let λ be an eigenvalue of A and let x be an eigenvector belonging to ...
6.1.7: Let A be an n × n matrix and let B = I - 2A + A^2. (a) Show that if x is ...
6.1.8: An n × n matrix A is said to be idempotent if A^2 = A. Show that if λ is...
6.1.9: An n × n matrix is said to be nilpotent if A^k = O for some positive in...
6.1.10: Let A be an n × n matrix and let B = A - αI for some scalar α. How do the...
6.1.11: Let A be an n × n matrix and let B = A + I. Is it possible for A and ...
6.1.12: Show that A and A^T have the same eigenvalues. Do they necessarily h...
6.1.13: Show that the matrix A = [cos θ  -sin θ; sin θ  cos θ] will have complex eigenval...
6.1.14: Let A be a 2 × 2 matrix. If tr(A) = 8 and det(A) = 12, what are the e...
6.1.15: Let A = (a_ij) be an n × n matrix with eigenvalues λ_1, . . . , λ_n. Sho...
6.1.16: Let A be a 2 × 2 matrix and let p(λ) = λ^2 + bλ + c be the characteristic pol...
6.1.17: Let λ be a nonzero eigenvalue of A and let x be an eigenvector belong...
6.1.18: Let A be an n × n matrix and let λ be an eigenvalue of A. If A - λI has ran...
6.1.19: Let A be an n × n matrix. Show that a vector x in R^n is an eigenvecto...
6.1.20: Let α = a + bi and β = c + di be complex scalars and let A and B be matri...
6.1.21: Let Q be an orthogonal matrix. (a) Show that if λ is an eigenvalue of...
6.1.22: Let Q be an orthogonal matrix with an eigenvalue λ_1 = 1 and let x be ...
6.1.23: Let Q be a 3 × 3 orthogonal matrix whose determinant is equal to 1. (...
6.1.24: Let x_1, . . . , x_r be eigenvectors of an n × n matrix A and let S be ...
6.1.25: Let A be an n × n matrix and let λ be an eigenvalue of A. Show that if ...
6.1.26: Let B = S^(-1)AS and let x be an eigenvector of B belonging to an eigen...
6.1.27: Let A be an n × n matrix with an eigenvalue λ and let x be an eigenvect...
6.1.28: Show that if two n × n matrices A and B have a common eigenvector x (...
6.1.29: Let A be an n × n matrix and let λ be a nonzero eigenvalue of A. Show t...
6.1.30: Let {u_1, u_2, . . . , u_n} be an orthonormal basis for R^n and let A b...
6.1.31: Let A be a matrix whose columns all add up to a fixed constant δ. Sh...
6.1.32: Let λ_1 and λ_2 be distinct eigenvalues of A. Let x be an eigenvector o...
6.1.33: Let A and B be n × n matrices. Show that (a) If λ is a nonzero eigenval...
6.1.34: Prove that there do not exist n × n matrices A and B such that AB - BA ...
6.1.35: Let p(λ) = (-1)^n (λ^n - a_(n-1)λ^(n-1) - · · · - a_1λ - a_0) be a polynomial of degree n ≥ 1, and let...
 6.1.36: The result given in Exercise 35(b) holds even if all the eigenvalue...
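Exercises 35 and 36 concern the companion matrix whose characteristic polynomial is p(λ). A quick numerical sketch (NumPy assumed; the sample cubic is my own illustration, not from the text) that the eigenvalues of the companion matrix are exactly the roots of the polynomial:

```python
import numpy as np

# Sample cubic: p(x) = x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3).
# One common companion form: coefficients in the top row, ones below.
C = np.array([[6.0, -11.0, 6.0],
              [1.0,   0.0, 0.0],
              [0.0,   1.0, 0.0]])
eigs = np.sort(np.linalg.eigvals(C).real)
roots = np.sort(np.roots([1.0, -6.0, 11.0, -6.0]).real)
print(eigs)  # eigenvalues of C = roots of p: close to [1. 2. 3.]
```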
Solutions for Chapter 6.1: Eigenvalues and Eigenvectors
Full solutions for Linear Algebra with Applications  8th Edition
ISBN: 9780136009290

Affine transformation
T(v) = Av + v_0 = linear transformation plus shift.

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + · · · + c_(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
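As a concrete sketch (sizes and values are my own; NumPy assumed), one can build a small circulant from powers of the cyclic shift and confirm both the convolution property and the Fourier eigenvectors:

```python
import numpy as np

n = 4
c = np.array([2.0, 5.0, 1.0, 3.0])
S = np.roll(np.eye(n), 1, axis=0)   # cyclic shift: (S x)_i = x_{i-1 mod n}
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

# C x is the circular convolution c * x (computed here via the FFT).
x = np.array([1.0, 0.0, 2.0, 4.0])
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
assert np.allclose(C @ x, conv)

# Columns of the Fourier matrix are eigenvectors; eigenvalues are fft(c).
m = 1
v = np.exp(2j * np.pi * m * np.arange(n) / n)
assert np.allclose(C @ v, np.fft.fft(c)[m] * v)
```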

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

Column space C(A) =
space of all combinations of the columns of A.

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and ...
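These defining properties are easy to confirm numerically; a small sketch with random matrices (NumPy assumed, my own example):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
assert np.isclose(np.linalg.det(np.eye(3)), 1.0)            # det I = 1
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))      # |AB| = |A||B|
P = np.eye(3)[[1, 0, 2]]                                    # exchange rows 0, 1
assert np.isclose(np.linalg.det(P @ A), -np.linalg.det(A))  # sign reversal
```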

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = ∫₀¹ x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
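A short sketch (NumPy assumed) constructing hilb(6) directly and exhibiting the tiny smallest eigenvalue and the large condition number:

```python
import numpy as np

n = 6
i, j = np.indices((n, n))            # 0-based indices
H = 1.0 / (i + j + 1)                # H_ij = 1/(i + j - 1) in 1-based terms
eigs = np.linalg.eigvalsh(H)         # H is symmetric, so eigvalsh applies
print(eigs.min())                    # positive but tiny: positive definite
print(np.linalg.cond(H))             # huge: H is ill-conditioned
```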

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.

Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^(-1) A^T has A^+ A = I_n.
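A minimal sketch of the formula (my own 3×2 example, NumPy assumed), checking A^+ A = I_n:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])             # tall 3x2 matrix, full column rank 2
A_plus = np.linalg.inv(A.T @ A) @ A.T  # A^+ = (A^T A)^(-1) A^T
assert np.allclose(A_plus @ A, np.eye(2))  # left inverse: A^+ A = I_n
```

For full column rank this left inverse coincides with the pseudoinverse `np.linalg.pinv(A)`.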

Linearly dependent v_1, . . . , v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.
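For instance (my own vectors, NumPy assumed), dependence shows up as rank below the number of vectors, and an explicit nonzero combination sums to zero:

```python
import numpy as np

v1 = np.array([1.0, 2.0])
v2 = np.array([3.0, 4.0])
v3 = np.array([4.0, 6.0])                  # v3 = v1 + v2, so dependent
V = np.column_stack([v1, v2, v3])
assert np.linalg.matrix_rank(V) < 3        # rank below vector count
assert np.allclose(1*v1 + 1*v2 - 1*v3, 0)  # c = (1, 1, -1), not all zero
```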

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
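One standard way to compute the polar factors is through the SVD; a sketch with a random matrix (NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)        # A = U diag(s) V^T
Q = U @ Vt                         # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt         # symmetric positive semidefinite factor
assert np.allclose(Q @ H, A)
assert np.allclose(Q.T @ Q, np.eye(3))
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)
```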

Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.

Saddle point of f(x_1, . . . , x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.
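A tiny worked example (my own choice): f(x, y) = x^2 - y^2 has zero gradient at the origin, and its constant Hessian has one positive and one negative eigenvalue, hence is indefinite:

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2: d2f/dx2 = 2, d2f/dy2 = -2, mixed terms 0.
Hess = np.array([[2.0,  0.0],
                 [0.0, -2.0]])
eigs = np.linalg.eigvalsh(Hess)    # ascending order
assert eigs[0] < 0 < eigs[1]       # indefinite: the origin is a saddle point
```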

Schur complement S = D - C A^(-1) B.
Appears in block elimination on [A B; C D].
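A numerical sketch with random blocks (NumPy assumed, my own example); one consequence worth checking is that the block determinant factors as det(A)·det(S):

```python
import numpy as np

rng = np.random.default_rng(2)
A, B = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
C, D = rng.standard_normal((2, 2)), rng.standard_normal((2, 2))
S = D - C @ np.linalg.inv(A) @ B       # Schur complement of A
M = np.block([[A, B], [C, D]])
# block elimination leaves S in the lower-right, so det M = det(A) det(S)
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```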

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

Singular Value Decomposition (SVD).
A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
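A quick check of A v_i = σ_i u_i and the orthogonality of the factors (random 4×3 matrix, NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A, full_matrices=True)
for i in range(3):
    # rows of Vt are the right singular vectors v_i
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])   # A v_i = sigma_i u_i
assert np.allclose(U.T @ U, np.eye(4))              # U orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(3))            # V orthogonal
```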

Spectral Theorem A = Q Λ Q^T.
Real symmetric A has real λ's and orthonormal q's.
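A minimal sketch (my own 2×2 symmetric matrix, NumPy assumed): `eigh` returns real eigenvalues and orthonormal eigenvectors, and A = Q Λ Q^T holds:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # real symmetric
lam, Q = np.linalg.eigh(A)              # eigenvalues ascending: 1, 3
assert np.allclose(Q @ np.diag(lam) @ Q.T, A)   # A = Q Lambda Q^T
assert np.allclose(Q.T @ Q, np.eye(2))          # orthonormal q's
assert np.allclose(lam, [1.0, 3.0])             # real eigenvalues
```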

Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms, ||A + B|| ≤ ||A|| + ||B||.

Vandermonde matrix V.
V c = b gives the coefficients of p(x) = c_0 + · · · + c_(n-1) x^(n-1) with p(x_i) = b_i. V_ij = (x_i)^(j-1) and det V = product of (x_k - x_i) for k > i.
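For example (my own points, NumPy assumed), solving V c = b interpolates a quadratic through three points, and det V matches the product formula:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 5.0])              # target values p(x_i) = b_i
V = np.vander(x, increasing=True)          # V[i, j] = x_i ** j
c = np.linalg.solve(V, b)                  # coefficients c_0, c_1, c_2
assert np.allclose(np.polyval(c[::-1], x), b)   # p interpolates the points
# det V = (x_1 - x_0)(x_2 - x_0)(x_2 - x_1) = 1 * 2 * 1 = 2
assert np.isclose(np.linalg.det(V), 2.0)
```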