 7.2.1: Compute the eigenvalues and associated eigenvectors of the followin...
 7.2.2: Compute the eigenvalues and associated eigenvectors of the followin...
 7.2.3: Find the complex eigenvalues and associated eigenvectors for the fo...
 7.2.4: Find the complex eigenvalues and associated eigenvectors for the fo...
 7.2.5: Find the spectral radius for each matrix in Exercise 1.
 7.2.6: Find the spectral radius for each matrix in Exercise 2.
 7.2.7: Which of the matrices in Exercise 1 are convergent?
 7.2.8: Which of the matrices in Exercise 2 are convergent?
 7.2.9: Find the ℓ2 norm for the matrices in Exercise 1.
 7.2.10: Find the ℓ2 norm for the matrices in Exercise 2.
 7.2.11: Let A1 and A2 be the given matrices. Show that A1 ...
 7.2.12: An n x n matrix A is called nilpotent if an integer m exists with A...
 7.2.13: In Exercise 11 of Section 6.3, we assumed that the contribution a fe...
 7.2.14: In Exercise 11 of Section 6.5, a female beetle population was consi...
 7.2.15: Show that the characteristic polynomial p(λ) = det(A − λI) for the n ...
 7.2.16: a. Show that if A is an n × n matrix, then det A = ∏_{i=1}^{n} λ_i, where ...
 7.2.17: Let λ be an eigenvalue of the n × n matrix A and x ≠ 0 be an associate...
 7.2.18: Show that if A is symmetric, then ||A||_2 = ρ(A).
 7.2.19: Find matrices A and B for which ρ(A + B) > ρ(A) + ρ(B). (This shows...
 7.2.20: Show that if ||·|| is any natural norm, then ||A^(-1)||^(-1) ≤ |λ| ≤ ||A|| ...
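
Several of the exercises above ask for the spectral radius ρ(A) = max |λ| and whether ρ(A) < 1 (convergence). For a 2 × 2 matrix this can be checked by hand from the characteristic polynomial; the sketch below is my own illustration in plain Python, not a textbook solution, and the sample matrix is hand-picked.

```python
import cmath

def eig_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic
    polynomial lambda^2 - (a + d)*lambda + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)   # complex sqrt handles tr^2 < 4*det
    return (tr + disc) / 2, (tr - disc) / 2

def spectral_radius(a, b, c, d):
    """rho(A) = max |lambda| over the eigenvalues."""
    return max(abs(lam) for lam in eig_2x2(a, b, c, d))

# A = [[0.5, 0], [0.25, 0.5]] has both eigenvalues 0.5,
# so rho(A) = 0.5 < 1 and A is convergent (A^k -> 0).
print(spectral_radius(0.5, 0.0, 0.25, 0.5))  # 0.5
```

Using `cmath.sqrt` rather than `math.sqrt` lets the same code return complex eigenvalues (as in Exercises 7.2.3 and 7.2.4) without special-casing the discriminant.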
Solutions for Chapter 7.2: Eigenvalues and Eigenvectors
Full solutions for Numerical Analysis  10th Edition
ISBN: 9781305253667
Chapter 7.2: Eigenvalues and Eigenvectors includes 20 full step-by-step solutions for the textbook Numerical Analysis, 10th edition (ISBN 9781305253667).

Affine transformation
T(v) = Av + v_0 = linear transformation plus shift.

Diagonalization
Λ = S^(-1) A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^(-1).
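
The identity A^k = S Λ^k S^(-1) can be checked numerically; the sketch below uses a hand-picked 2 × 2 example (not from the text) and plain Python.

```python
def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Example: A = [[4, 1], [2, 3]] has eigenvalues 5 and 2 with
# eigenvectors (1, 1) and (1, -2); those are the columns of S.
A = [[4, 1], [2, 3]]
S = [[1, 1], [1, -2]]
detS = S[0][0]*S[1][1] - S[0][1]*S[1][0]           # -3
Sinv = [[ S[1][1]/detS, -S[0][1]/detS],
        [-S[1][0]/detS,  S[0][0]/detS]]
L3 = [[5**3, 0], [0, 2**3]]                         # Lambda^3 = diag(125, 8)
A3 = matmul(matmul(S, L3), Sinv)                    # A^3 = S Lambda^3 S^-1
print(A3)  # matches A*A*A = [[86, 39], [78, 47]] up to rounding
```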

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
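
A minimal check of this definition, using the hand-picked eigenpair λ = 5, x = (1, 1) of A = [[4, 1], [2, 3]] (my own example):

```python
# det(A - lambda*I) should be 0, and Ax should equal lambda*x.
A = [[4, 1], [2, 3]]
lam, x = 5, [1, 1]
det = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
print(det, Ax)  # 0 [5, 5]
```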

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −e_ij in the i, j entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.

Ellipse (or ellipsoid) x T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^(-1) y||^2 = y^T (A A^T)^(-1) y = 1 displayed by eigshow; axis lengths σ_i.)

Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative A e^(At); e^(At) u(0) solves u' = Au.
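
For a 1 × 1 matrix the series reduces to the scalar exponential, so the partial sums can be compared against `math.exp` (an illustrative sketch with assumed values a = 0.5, t = 2):

```python
import math

# Scalar case of the series: e^(a*t) = 1 + a*t + (a*t)^2/2! + ...
def exp_series(a, t, terms=30):
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        term *= a * t / (k + 1)   # next term: (a*t)^(k+1) / (k+1)!
    return total

print(exp_series(0.5, 2.0), math.exp(1.0))  # both ~2.718281828...
```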

Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^(-1) A^T has A^+ A = I_n.

Norm ||A||.
The "ℓ2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ1 and ℓ∞ norms are the largest column and row sums of |a_ij|.
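
The ℓ1 and ℓ∞ formulas are easy to verify directly on a small example (my own sketch, hand-picked matrix):

```python
# l1 norm = largest absolute column sum;
# l-infinity norm = largest absolute row sum.
A = [[1, -7], [2, 3]]
norm_1   = max(sum(abs(A[i][j]) for i in range(2)) for j in range(2))
norm_inf = max(sum(abs(A[i][j]) for j in range(2)) for i in range(2))
print(norm_1, norm_inf)  # 10 8
```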

Outer product uv T
= column times row = rank one matrix.

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
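
A short numerical sketch of the projection formula, on an assumed example a = (2, 1), b = (4, 3); the error b − p comes out orthogonal to a:

```python
a, b = [2, 1], [4, 3]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

c = dot(a, b) / dot(a, a)             # 11/5 = 2.2
p = [c * ai for ai in a]              # p = a * (a.b / a.a)
e = [bi - pi for bi, pi in zip(b, p)]
print(p, dot(a, e))                   # dot(a, e) ~ 0: e is orthogonal to a
```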

Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.
Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
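
Checked on the symmetric example A = [[2, 1], [1, 2]] (my own choice), whose eigenvalues are 1 and 3 with eigenvectors (1, −1) and (1, 1):

```python
A = [[2, 1], [1, 2]]

def q(x):
    """Rayleigh quotient x.Ax / x.x."""
    Ax = [A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1]]
    return (x[0]*Ax[0] + x[1]*Ax[1]) / (x[0]*x[0] + x[1]*x[1])

# The extremes 1.0 and 3.0 are hit at the eigenvectors;
# any other x gives a value strictly in between.
print(q([1, -1]), q([1, 1]), q([1, 0]))  # 1.0 3.0 2.0
```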

Row picture of Ax = b.
Each equation gives a plane in R^n; the planes intersect at x.

Row space C (AT) = all combinations of rows of A.
Column vectors by convention.

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂^2 f/∂x_i ∂x_j = Hessian matrix) is indefinite.

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.

Singular Value Decomposition
(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
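
Singular values can also be computed as the square roots of the eigenvalues of A^T A; a hand-rolled 2 × 2 sketch with the assumed example A = [[3, 0], [4, 5]]:

```python
import math

# A^T A = [[25, 20], [20, 25]] here, with eigenvalues 45 and 5.
A = [[3, 0], [4, 5]]
AtA = [[A[0][0]*A[0][0] + A[1][0]*A[1][0], A[0][0]*A[0][1] + A[1][0]*A[1][1]],
       [A[0][1]*A[0][0] + A[1][1]*A[1][0], A[0][1]*A[0][1] + A[1][1]*A[1][1]]]
tr  = AtA[0][0] + AtA[1][1]                        # 50
det = AtA[0][0]*AtA[1][1] - AtA[0][1]*AtA[1][0]    # 225
disc = math.sqrt(tr*tr - 4*det)
sigmas = [math.sqrt((tr + disc) / 2), math.sqrt((tr - disc) / 2)]
print(sigmas)  # sqrt(45) and sqrt(5); their product is |det A| = 15
```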

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
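
The identity Tr AB = Tr BA holds even when AB ≠ BA; a quick check on hand-picked non-commuting matrices (my own sketch):

```python
def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def tr(M):
    return M[0][0] + M[1][1]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
print(tr(matmul(A, B)), tr(matmul(B, A)))  # 55 55, though AB != BA
```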

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_(n-1) x^(n-1) with p(x_i) = b_i. V_ij = (x_i)^(j-1) and det V = product of (x_k − x_i) for k > i.
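
A small sketch of solving Vc = b, with assumed data points (0, 1), (1, 2), (2, 5) so that p(x) = 1 + x^2; note the code uses 0-based indexing, V[i][j] = x_i^j:

```python
xs, b = [0, 1, 2], [1, 2, 5]
V = [[x**j for j in range(3)] for x in xs]
M = [row[:] + [bi] for row, bi in zip(V, b)]   # augmented matrix [V | b]

# Gauss-Jordan elimination (no pivoting needed for these points).
for i in range(3):
    piv = M[i][i]
    M[i] = [v / piv for v in M[i]]
    for k in range(3):
        if k != i:
            M[k] = [vk - M[k][i] * vi for vk, vi in zip(M[k], M[i])]

c = [M[i][3] for i in range(3)]
print(c)  # [1.0, 0.0, 1.0]: coefficients of 1 + 0*x + 1*x^2
```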