
# Solutions for Chapter 5.3: EIGENVALUES, EIGENVECTORS, AND DIAGONALIZATION

## Full solutions for Elementary Linear Algebra: A Matrix Approach | 2nd Edition

ISBN: 9780131871410


Chapter 5.3: EIGENVALUES, EIGENVECTORS, AND DIAGONALIZATION includes 94 full step-by-step solutions, all of which have been answered; more than 22,816 students have viewed them. This textbook survival guide was created for Elementary Linear Algebra: A Matrix Approach, 2nd edition (ISBN 9780131871410), and covers all of the book's chapters and their solutions.

## Key math terms and definitions covered in this textbook
• Associative Law (AB)C = A(BC).

Parentheses can be removed to leave ABC.

• Cayley-Hamilton Theorem.

p(λ) = det(A − λI) has p(A) = zero matrix.
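
To see the theorem numerically, here is a minimal NumPy sketch (the 2-by-2 matrix is an arbitrary example): substituting A into its own characteristic polynomial yields the zero matrix.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For 2x2: p(lambda) = det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A)
trace, det = np.trace(A), np.linalg.det(A)

# Cayley-Hamilton: p(A) = A^2 - trace*A + det*I is the zero matrix
p_of_A = A @ A - trace * A + det * np.eye(2)
print(np.allclose(p_of_A, 0))  # True
```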

• Complete solution x = x_p + x_n to Ax = b.

(Particular solution x_p) + (x_n in the nullspace).
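
A small NumPy sketch of this split, using an arbitrary rank-deficient example: any particular solution plus any multiple of a nullspace vector still solves Ax = b.

```python
import numpy as np

# A has a one-dimensional nullspace: column 2 = 2 * column 1
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])  # b lies in the column space, so Ax = b is solvable

x_p = np.linalg.lstsq(A, b, rcond=None)[0]   # one particular solution
x_n = np.array([-2.0, 1.0])                  # spans the nullspace: A @ x_n = 0

# Every x_p + c * x_n also solves Ax = b
for c in (0.0, 1.0, -3.5):
    print(np.allclose(A @ (x_p + c * x_n), b))  # True each time
```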

• Dimension of vector space

dim(V) = number of vectors in any basis for V.

• Elimination.

A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.

• Factorization

A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
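
As an illustration, SciPy's lu routine computes this factorization, with any row exchanges folded into a permutation matrix P (so it returns A = PLU rather than A = LU); the matrix below is an arbitrary example.

```python
import numpy as np
from scipy.linalg import lu

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

# SciPy factors with partial pivoting: A = P @ L @ U
P, L, U = lu(A)
print(np.allclose(A, P @ L @ U))     # True
print(np.allclose(np.diag(L), 1.0))  # multipliers l_ij sit below a unit diagonal l_ii = 1
print(np.allclose(U, np.triu(U)))    # U is upper triangular
```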

• Fundamental Theorem.

The nullspace N(A) and the row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0) with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
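
A quick numerical check of the first statement, using scipy.linalg.null_space on an arbitrary rank-1 matrix: every nullspace vector is orthogonal to every row of A.

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank r = 1, so dim N(A) = n - r = 2

N = null_space(A)             # columns form an orthonormal basis for N(A)
print(np.allclose(A @ N, 0))  # True: rows of A are perpendicular to N(A)
print(N.shape[1])             # 2, matching n - r = 3 - 1
```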

• Gram-Schmidt orthogonalization A = QR.

Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
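
NumPy's qr routine produces such a factorization; note that its sign convention does not necessarily enforce diag(R) > 0. A minimal sketch with an arbitrary 3-by-2 example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])   # independent columns

Q, R = np.linalg.qr(A)       # reduced QR: Q is 3x2, R is 2x2
print(np.allclose(Q.T @ Q, np.eye(2)))  # orthonormal columns in Q
print(np.allclose(A, Q @ R))            # A = QR
print(np.allclose(R, np.triu(R)))       # R is upper triangular
```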

• Left nullspace N(A^T).

Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

• Linearly dependent v_1, ..., v_n.

A combination other than all c_i = 0 gives Σ c_i v_i = 0.

• Markov matrix M.

All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector Ms = s > 0.
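
A minimal sketch with an arbitrary 2-by-2 Markov matrix: repeated multiplication drives a probability vector toward the λ = 1 eigenvector.

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])   # columns sum to 1, all entries positive

# Powers of M push any probability vector toward the steady state s with Ms = s
x = np.array([1.0, 0.0])
for _ in range(50):
    x = M @ x

eigvals, eigvecs = np.linalg.eig(M)
s = eigvecs[:, np.argmax(eigvals.real)]  # eigenvector for the largest eigenvalue (1)
s = s / s.sum()                          # normalize to a probability vector
print(np.allclose(x, s))                 # True: M^k x approaches s = [0.6, 0.4]
```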

• Norm

‖A‖. The "ℓ^2 norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖‖x‖ and ‖AB‖ ≤ ‖A‖‖B‖ and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
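
All four of these norms are available through numpy.linalg.norm; a short sketch with an arbitrary matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

two_norm = np.linalg.norm(A, 2)        # largest singular value sigma_max
fro_norm = np.linalg.norm(A, 'fro')    # sqrt of the sum of squares of all entries
one_norm = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|

print(np.isclose(two_norm, np.linalg.svd(A, compute_uv=False)[0]))  # True
print(one_norm, inf_norm)  # 6.0 7.0
```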

• Normal equation A^T A x̂ = A^T b.

Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b − A x̂) = 0.
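
A minimal sketch, fitting a line through three arbitrary points: solving A^T A x̂ = A^T b directly agrees with NumPy's least-squares routine, and the residual is orthogonal to the columns of A.

```python
import numpy as np

# Overdetermined system: fit a line c0 + c1*t through three points
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

x_normal = np.linalg.solve(A.T @ A, A.T @ b)      # normal equation
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]    # library least squares
print(np.allclose(x_normal, x_lstsq))             # True

# Residual b - A x_hat is perpendicular to every column of A
print(np.allclose(A.T @ (b - A @ x_normal), 0))   # True
```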

• Nullspace matrix N.

The columns of N are the n − r special solutions to As = 0.
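
SymPy's nullspace method returns exactly these special solutions, computed from the reduced row echelon form; a sketch with an arbitrary 2-by-4 example of rank 2:

```python
import sympy as sp

A = sp.Matrix([[1, 2, 2, 4],
               [1, 2, 3, 6]])   # rank r = 2, n = 4, so n - r = 2 special solutions

# nullspace() returns one special solution per free variable
for s in A.nullspace():
    print(s.T, (A * s).T)   # each satisfies A s = 0
```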

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so Q^T = Q^{-1}. Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
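
A minimal sketch using a rotation matrix, one of the examples named above:

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation by theta

print(np.allclose(Q.T @ Q, np.eye(2)))            # Q^T = Q^{-1}

x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # length preserved

print(np.abs(np.linalg.eigvals(Q)))               # all |lambda| = 1
```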

• Plane (or hyperplane) in R^n.

Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

• Projection matrix P onto subspace S.

The projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A form a basis for S, then P = A(A^T A)^{-1} A^T.
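
A minimal NumPy sketch with an arbitrary 3-by-2 basis matrix, checking each property in turn:

```python
import numpy as np

# Columns of A form a basis for the subspace S
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P @ P, P))               # P^2 = P
print(np.allclose(P, P.T))                 # P = P^T
print(np.round(np.linalg.eigvalsh(P), 8))  # eigenvalues are 1 or 0

b = np.array([1.0, 2.0, 2.0])
p = P @ b
print(np.allclose(A.T @ (b - p), 0))       # error b - p is perpendicular to S
```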

• Semidefinite matrix A.

(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.
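
A short sketch: forming R^T R from a singular R gives a semidefinite (but not positive definite) matrix whose eigenvalues are all ≥ 0.

```python
import numpy as np

R = np.array([[1.0, 2.0],
              [2.0, 4.0]])   # singular, so R^T R is semidefinite, not definite
A = R.T @ R

eigvals = np.linalg.eigvalsh(A)    # A is symmetric, so use eigvalsh
print(np.all(eigvals >= -1e-12))   # all lambda >= 0 (up to roundoff)

rng = np.random.default_rng(0)
x = rng.standard_normal(2)
print(x @ A @ x >= 0)              # x^T A x = ||Rx||^2 >= 0 for any x
```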

• Standard basis for R^n.

Columns of the n by n identity matrix (written i, j, k in R^3).

• Vandermonde matrix V.

Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_ij = (x_i)^{j-1} and det V = product of (x_k − x_i) for k > i.
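
NumPy's vander builder uses decreasing powers by default, so increasing=True matches the convention V_ij = (x_i)^{j-1} above; a sketch with three arbitrary interpolation points:

```python
import numpy as np

x_pts = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 3.0, 6.0])

V = np.vander(x_pts, increasing=True)   # V_ij = x_i^(j-1)
c = np.linalg.solve(V, b)               # coefficients of p(x) = c0 + c1*x + c2*x^2

# The interpolating polynomial hits every point: p(x_i) = b_i
print(np.allclose(np.polyval(c[::-1], x_pts), b))  # polyval wants highest power first

# det V = product of (x_k - x_i) for k > i
det_formula = (x_pts[1]-x_pts[0]) * (x_pts[2]-x_pts[0]) * (x_pts[2]-x_pts[1])
print(np.isclose(np.linalg.det(V), det_formula))   # True
```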
