
# Solutions for Chapter 7: Eigenvalues and Eigenvectors

## Full solutions for Elementary Linear Algebra | 7th Edition

ISBN: 9781133110873


Elementary Linear Algebra (7th edition) is associated with ISBN 9781133110873. Chapter 7: Eigenvalues and Eigenvectors includes 406 full step-by-step solutions; since all 406 problems in this chapter have been answered, more than 11,587 students have viewed full step-by-step solutions from it. This textbook survival guide covers all of the book's chapters and their solutions.

## Key math terms and definitions covered in this textbook
• Cofactor Cij.

Remove row i and column j; multiply that minor's determinant by (-1)^(i+j).
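
As a quick illustration (using NumPy, which the textbook itself does not use, and an example matrix chosen here), a cofactor can be computed directly from the definition, and a cofactor expansion along a row reproduces the determinant:

```python
import numpy as np

def cofactor(A, i, j):
    """Cofactor C_ij: delete row i and column j, take the determinant
    of the remaining minor, and multiply by (-1)**(i + j)."""
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Cofactor expansion along row 0 reproduces det(A).
expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(3))
assert np.isclose(expansion, np.linalg.det(A))
```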

• Cramer's Rule for Ax = b.

Bj has b replacing column j of A; xj = det(Bj) / det(A).
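
A minimal sketch of Cramer's Rule (the 2-by-2 system here is an example of ours, not from the book): build each Bj by swapping b into column j of A, then divide determinants:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Cramer's Rule: x_j = det(B_j) / det(A), where B_j is A with
# column j replaced by b. Requires det(A) != 0.
x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

assert np.allclose(A @ x, b)
```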

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
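
As a sketch of the definition (NumPy, with an example matrix assumed here): a matrix with two distinct eigenvalues is diagonalized by the matrix S of its eigenvectors, and S⁻¹AS recovers the eigenvalue matrix Λ:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Eigenvalues 5 and 2 are distinct, so A has 2 independent
# eigenvectors and is diagonalizable.
eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors
Lam = np.linalg.inv(S) @ A @ S  # S^{-1} A S = Lambda

assert np.allclose(Lam, np.diag(eigvals), atol=1e-10)
```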

• Elimination matrix = Elementary matrix Eij.

The identity matrix with an extra -eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.
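
A small sketch of one elimination step (example matrix ours): putting the multiplier with a minus sign into the identity gives E21, and E21·A zeros out the entry below the first pivot:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])

# E21 = identity with -e21 in entry (2,1), where e21 = a21 / a11 = 2.
e21 = A[1, 0] / A[0, 0]
E21 = np.eye(2)
E21[1, 0] = -e21

# E21 @ A subtracts e21 times row 1 of A from row 2,
# zeroing the (2,1) entry.
U = E21 @ A
assert np.isclose(U[1, 0], 0.0)
```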

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Hypercube matrix pl.

Row n + 1 counts corners, edges, faces, ... of a cube in Rⁿ.

• Indefinite matrix.

A symmetric matrix with eigenvalues of both signs (+ and - ).

• Kirchhoff's Laws.

Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

• Linearly dependent v1, ..., vn.

A combination other than all ci = 0 gives Σ ci vi = 0.

• Multiplicities AM and G M.

The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
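
The classic case where the two multiplicities differ is a shear matrix; a sketch (NumPy, example matrix ours) computes AM by counting repeated roots and GM as the dimension of the nullspace of A - λI:

```python
import numpy as np

# Shear matrix: lambda = 1 is a double root (AM = 2) but the
# eigenspace is only one-dimensional (GM = 1), so A is not
# diagonalizable.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# AM: multiplicity of lam among the roots of det(A - lambda*I) = 0.
eigvals = np.linalg.eigvals(A)
AM = int(np.sum(np.isclose(eigvals, lam)))

# GM: dimension of the nullspace of A - lam*I.
n = A.shape[0]
GM = n - np.linalg.matrix_rank(A - lam * np.eye(n))

assert AM == 2 and GM == 1
```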

• Norm ‖A‖.

The "ℓ² norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σmax. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. The Frobenius norm is ‖A‖F² = Σ Σ aij². The ℓ¹ and ℓ^∞ norms are the largest column and row sums of |aij|.

• Orthogonal subspaces.

Every v in V is orthogonal to every w in W.

• Orthonormal vectors q1, ..., qn.

Dot products are qiᵀqj = 0 if i ≠ j and qiᵀqi = 1. The matrix Q with these orthonormal columns has QᵀQ = I. If m = n then Qᵀ = Q⁻¹ and q1, ..., qn is an orthonormal basis for Rⁿ: every v = Σ (vᵀqj)qj.
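
A sketch of both facts (NumPy, example matrices ours): QR factorization supplies orthonormal columns with QᵀQ = I, and in the square case any vector is recovered from its expansion Σ (vᵀqj)qj:

```python
import numpy as np

# QR factorization gives Q with orthonormal columns: Q^T Q = I.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)
assert np.allclose(Q.T @ Q, np.eye(2), atol=1e-12)

# Square case: the columns of Qs are an orthonormal basis for R^3,
# so every v equals the sum of its projections (v^T q_j) q_j.
Qs, _ = np.linalg.qr(np.random.default_rng(0).normal(size=(3, 3)))
v = np.array([1.0, 2.0, 3.0])
expansion = sum((v @ Qs[:, j]) * Qs[:, j] for j in range(3))
assert np.allclose(expansion, v)
```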

• Permutation matrix P.

There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges needed to reach I.
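
A small sketch for n = 3 (NumPy and itertools; example ordering ours): enumerate all 3! = 6 permutation matrices as reorderings of the rows of I, and check that PA reorders the rows of A the same way:

```python
import numpy as np
from itertools import permutations

n = 3
I = np.eye(n)

# The n! permutation matrices: rows of I in every possible order.
perms = [I[list(order)] for order in permutations(range(n))]
assert len(perms) == 6  # 3! = 6

# P @ A puts the rows of A in the same order, and det P = +1 or -1.
A = np.arange(9.0).reshape(3, 3)
P = I[[2, 0, 1]]
assert np.allclose(P @ A, A[[2, 0, 1]])
assert round(np.linalg.det(P)) in (1, -1)
```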

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.
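
One standard way to obtain the polar factors (a sketch under the assumption that the SVD is available; example matrix ours): if A = UΣVᵀ, then Q = UVᵀ is orthogonal and H = VΣVᵀ is positive semidefinite, with A = QH:

```python
import numpy as np

A = np.array([[2.0, -1.0],
              [1.0, 1.0]])

# Polar decomposition via the SVD: A = U S V^T gives
# Q = U V^T (orthogonal) and H = V S V^T (symmetric PSD).
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt
H = Vt.T @ np.diag(s) @ Vt

assert np.allclose(Q @ H, A)
assert np.allclose(Q.T @ Q, np.eye(2))          # Q is orthogonal
assert np.all(np.linalg.eigvalsh(H) >= -1e-12)  # H is PSD
```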

• Rank r(A).

r(A) = number of pivots = dimension of the column space = dimension of the row space.
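
A quick check of the equalities (NumPy, example matrix ours, with one row deliberately a multiple of another): the rank of A equals the rank of Aᵀ, since column space and row space have the same dimension:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 x row 1: adds nothing new
              [1.0, 0.0, 1.0]])

r = np.linalg.matrix_rank(A)

# dim(column space of A) = dim(row space of A) = rank of A^T.
assert r == np.linalg.matrix_rank(A.T) == 2
```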

• Reduced row echelon form R = rref(A).

Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

• Sum V + W of subspaces.

Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

• Transpose matrix AT.

Entries (Aᵀ)ij = Aji. Aᵀ is n by m, and AᵀA is square, symmetric, and positive semidefinite. The transposes of AB and A⁻¹ are BᵀAᵀ and (Aᵀ)⁻¹.

• Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.

For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.
