 7.1.1: Let L : R2 → R2 be counterclockwise rotation through the angle θ. Find all t...
 7.1.2: Let L : P1 → P1 be the linear operator defined by L(at + b) = b...
 7.1.3: Let L : P2 → P2 be the linear operator defined by L(at2 + b...
 7.1.4: Let L : R3 → R3 be defined by ... Using the natural basis for R3...
 7.1.5: Find the characteristic polynomial of each of the following matrice...
 7.1.6: Find the characteristic polynomial, the eigenvalues, and associated...
 7.1.7: Find the characteristic polynomial, the eigenvalues, and associated...
 7.1.8: (a) ... (b) ... (matrix entries garbled in the source)
 7.1.9: Find the characteristic polynomial, the eigenvalues, and associated...
 7.1.10: Find all the eigenvalues and associated eigenvectors of each of the...
 7.1.11: Prove that if A is an upper (lower) triangular matrix, then the ei...
 7.1.12: Prove that A and A^T have the same eigenvalues. What, if anything, ...
 7.1.13: Let A = [...] represent the linear transformation L : M2...
 7.1.14: Let L : V → V be a linear operator, where V is an n-dimension...
 7.1.15: Let λ be an eigenvalue of the n x n matrix A. Prove that the subse...
 7.1.16: In Exercises 14 and 15, why do we have to include 0_V in the set of ...
 7.1.17: In Exercises 17 and 18, find a basis for the eigenspace (see E...
 7.1.18: In Exercises 17 and 18, find a basis for the eigenspace (see E...
 7.1.19: Let A = [...] (a) Find a basis for the eigenspace associated with the e...
 7.1.20: Let A = [...] (a) Find a basis for the eigenspace associated with ...
 7.1.21: Prove that if λ is an eigenvalue of a matrix A with associated eig...
 7.1.22: Let A = [...] be the matrix of Exercise 5(a). Find the eigenvalue...
 7.1.23: Prove that if A^k = 0 for some positive integer k [i.e., if A is a ...
 7.1.24: Let A be an n x n matrix. (a) Show that det(A) is the product of...
 7.1.25: Let A be an n x n matrix. (a) Show that det(A) is the product of...
 7.1.26: Let A be an n x n matrix with eigenvalues λ1 and λ2, where λ1 ≠ ...
 7.1.27: Let λ be an eigenvalue of A with associated eigenvector x. Show th...
 7.1.28: Let A be an n x n matrix and consider the linear operator on R^n d...
 7.1.29: Let A and B be n x n matrices such that Ax = λx and Bx = μx. ...
 7.1.30: The Cayley-Hamilton theorem states that a matrix satisfies its c...
 7.1.31: Let A be an n x n matrix whose characteristic polynomial ... (a) (A + ...
 7.1.32: If ..., then A = [...]. The proof and applications of this result, unfo...
 7.1.33: Show that if A is a matrix all of whose columns add up to 1, then λ...
 7.1.34: Show that if A is an n x n matrix whose kth row is the same as th...
 7.1.35: Let A be a square matrix. (a) Suppose that the homogeneous system A...
 7.1.36: Determine whether your software has a command for finding the char...
 7.1.37: If your software has a command for finding the characteristic polyn...
 7.1.38: Assuming that your software has the commands discussed in Exercises...
 7.1.39: Most linear algebra software has a command for automatically findin...
 7.1.40: Following the ideas in Exercise 39, determine the command in your...
Solutions for Chapter 7.1: Eigenvalues and Eigenvectors
Full solutions for Elementary Linear Algebra with Applications  9th Edition
ISBN: 9780471669593
This textbook survival guide covers the listed chapters and their solutions. Since 40 problems in Chapter 7.1: Eigenvalues and Eigenvectors have been answered, more than 10232 students have viewed full step-by-step solutions from this chapter. The guide was created for the textbook Elementary Linear Algebra with Applications, edition 9 (ISBN 9780471669593). Chapter 7.1 includes 40 full step-by-step solutions.

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
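As a small illustration of the definition (not from the text), the sketch below builds a circulant matrix from its first column c and checks that Cx equals the cyclic convolution c * x; the helper names are my own.

```python
# Build the circulant matrix with first column c: C[i][j] = c[(i - j) mod n].
def circulant(c):
    n = len(c)
    return [[c[(i - j) % n] for j in range(n)] for i in range(n)]

# Cyclic convolution (c * x)_i = sum_j c[(i - j) mod n] * x[j].
def cyclic_convolution(c, x):
    n = len(c)
    return [sum(c[(i - j) % n] * x[j] for j in range(n)) for i in range(n)]

c = [1, 2, 3, 4]
x = [5, 6, 7, 8]
C = circulant(c)
Cx = [sum(C[i][j] * x[j] for j in range(4)) for i in range(4)]
print(Cx == cyclic_convolution(c, x))  # True
```

Matrix-vector multiplication by a circulant really is convolution, which is why the Fourier matrix diagonalizes it.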

Complex conjugate
The conjugate is z̄ = a - ib for any complex number z = a + ib. Then z̄z = |z|^2.
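The identity z̄z = |z|^2 can be checked directly with Python's built-in complex type:

```python
z = 3 + 4j                 # z = a + ib with a = 3, b = 4
zbar = z.conjugate()       # conjugate: a - ib
print(zbar * z)            # (25+0j)
print(abs(z) ** 2)         # 25.0
```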

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
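A minimal sketch of the rule for the 2 x 2 case (the function names `det2` and `cramer2` are illustrative, not from the text):

```python
def det2(M):
    # Determinant of a 2x2 matrix.
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer2(A, b):
    d = det2(A)
    B0 = [[b[0], A[0][1]], [b[1], A[1][1]]]   # b replaces column 0
    B1 = [[A[0][0], b[0]], [A[1][0], b[1]]]   # b replaces column 1
    return [det2(B0) / d, det2(B1) / d]

# 2x + y = 5, x + 3y = 10  has solution x = 1, y = 3
print(cramer2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```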

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -e_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
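A pure-Python sketch of one elimination step, assuming the convention above (entry -m in position (i, j) so that E A subtracts m times row j from row i); all names here are illustrative:

```python
def elimination_matrix(n, i, j, m):
    # Identity matrix with an extra -m in the (i, j) entry.
    E = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    E[i][j] = -m
    return E

def matmul(A, B):
    n, k, p = len(A), len(B), len(B[0])
    return [[sum(A[r][t] * B[t][c] for t in range(k)) for c in range(p)]
            for r in range(n)]

A = [[2.0, 1.0], [4.0, 5.0]]
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * row 0 from row 1
print(matmul(E, A))                    # [[2.0, 1.0], [0.0, 3.0]]
```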

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix F_n into l = log2(n) matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^(-1) c can be computed with n l / 2 multiplications. Revolutionary.
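The divide-and-conquer idea behind that factorization can be sketched with the standard recursive radix-2 FFT, checked against the direct O(n^2) DFT (this illustrates the speedup, not the S_i factorization itself):

```python
import cmath

def fft(x):
    # Recursive radix-2 FFT; len(x) must be a power of 2.
    n = len(x)
    if n == 1:
        return x[:]
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * cmath.pi * k / n)   # twiddle factor
        out[k] = even[k] + w * odd[k]
        out[k + n // 2] = even[k] - w * odd[k]
    return out

def dft(x):
    # Direct O(n^2) discrete Fourier transform, for comparison.
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n)) for k in range(n)]

x = [1, 2, 3, 4]
print(all(abs(a - b) < 1e-9 for a, b in zip(fft(x), dft(x))))  # True
```

Each level of the recursion does n/2 multiplications, and there are log2(n) levels, matching the n l / 2 count above.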

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0, with dimensions r and n - r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
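A tiny numerical illustration (my own example, not from the text): for a 2 x 3 matrix A, a nullspace vector x satisfies Ax = 0, and each entry of Ax is exactly the dot product of x with a row of A, so x is perpendicular to the whole row space.

```python
A = [[1, 2, 3], [0, 1, 1]]
x = [-1, -1, 1]   # a nullspace vector, found by back-substitution

# Ax = 0 means x . (each row of A) = 0: perpendicular to the row space.
dots = [sum(A[i][j] * x[j] for j in range(3)) for i in range(2)]
print(dots)  # [0, 0]
```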

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and  ).

Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk equals the zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
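A quick counterexample (my own): if v3 = v1 + v2, then c = (1, 1, -1) is a nonzero combination giving the zero vector, so the vectors are dependent.

```python
v1, v2, v3 = [1, 0, 2], [0, 1, 3], [1, 1, 5]   # v3 = v1 + v2
c = [1, 1, -1]
combo = [c[0] * v1[i] + c[1] * v2[i] + c[2] * v3[i] for i in range(3)]
print(combo)  # [0, 0, 0] -- a nontrivial combination hits zero
```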

Iterative method.
A sequence of steps intended to approach the desired solution.

Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A, because y^T A = 0^T.

Linear combination cv + dw or Σ cj vj.
Vector addition and scalar multiplication.

Outer product uv^T.
Column times row = rank one matrix.

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers then have |l_ij| ≤ 1. See condition number.
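A minimal pure-Python sketch of elimination with partial pivoting, assuming the rule above (swap up the row with the largest entry in each column before eliminating); the function name is illustrative:

```python
def eliminate_with_pivoting(A):
    A = [row[:] for row in A]
    n = len(A)
    for col in range(n):
        # Choose the largest available pivot in this column.
        p = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[p] = A[p], A[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]    # |m| <= 1 by construction
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
    return A

U = eliminate_with_pivoting([[1.0, 2.0], [3.0, 4.0]])
print([[round(v, 6) for v in row] for row in U])  # [[3.0, 4.0], [0.0, 0.666667]]
```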

Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.

Schur complement S = D - C A^(-1) B.
Appears in block elimination on the block matrix [A B; C D].

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
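For the 2 x 2 case K = [[0, a], [-a, 0]], the matrix exponential e^(Kt) is the rotation [[cos(at), sin(at)], [-sin(at), cos(at)]]; the sketch below (my own example) checks that Q^T Q = I, i.e. that it is orthogonal.

```python
import math

a, t = 2.0, 0.7
Q = [[math.cos(a * t), math.sin(a * t)],
     [-math.sin(a * t), math.cos(a * t)]]   # e^{Kt} for K = [[0,a],[-a,0]]

# Q^T Q should be the identity matrix (orthogonality).
QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
print(all(abs(QtQ[i][j] - (1.0 if i == j else 0.0)) < 1e-12
          for i in range(2) for j in range(2)))  # True
```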

Standard basis for Rn.
Columns of the n by n identity matrix (written i, j, k in R^3).

Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.