 5.1.1: Label the following statements as true or false. (a) Every linear o...
 5.1.2: For each of the following linear operators T on a vector space V an...
 5.1.3: For each of the following matrices A ∈ M_{n×n}(F), (i) Determine all ...
 5.1.4: For each linear operator T on V, find the eigenvalues of T and an o...
 5.1.5: Prove Theorem 5.4.
 5.1.6: Let T be a linear operator on a finite-dimensional vector space V, ...
 5.1.7: Let T be a linear operator on a finite-dimensional vector space V. ...
 5.1.8: (a) Prove that a linear operator T on a finite-dimensional vector s...
 5.1.9: Prove that the eigenvalues of an upper triangular matrix M are the ...
 5.1.10: Let V be a finite-dimensional vector space, and let λ be any scalar...
 5.1.11: A scalar matrix is a square matrix of the form λI for some scalar λ...
 5.1.12: (a) Prove that similar matrices have the same characteristic polyno...
 5.1.13: Let T be a linear operator on a finite-dimensional vector space V o...
 5.1.14: For any square matrix A, prove that A and A^t have the same characte...
 5.1.15: (a) Let T be a linear operator on a vector space V, and let x be an...
 5.1.16: (a) Prove that similar matrices have the same trace. Hint: Use Exer...
 5.1.17: Let T be the linear operator on M_{n×n}(R) defined by T(A) = A^t. (a)...
 5.1.18: Let A, B ∈ M_{n×n}(C). (a) Prove that if B is invertible, then there ...
 5.1.19: Let A and B be similar n×n matrices. Prove that there exists an ndi...
 5.1.20: Let A be an n×n matrix with characteristic polynomial f(t) = (−1)^n ...
 5.1.21: Let A and f(t) be as in Exercise 20. (a) Prove that f(t) = (A_{11} − t)(A_{2...
 5.1.22: (a) Let T be a linear operator on a vector space V over the field F...
 5.1.23: Use Exercise 22 to prove that if f(t) is the characteristic polynom...
 5.1.24: Use Exercise 21(a) to prove Theorem 5.3.
 5.1.25: Prove Corollaries 1 and 2 of Theorem 5.3.
 5.1.26: Determine the number of distinct characteristic polynomials of matr...
Solutions for Chapter 5.1: Eigenvalues and Eigenvectors
Full solutions for Linear Algebra, 4th Edition
ISBN: 9780130084514
This guide covers the textbook Linear Algebra, 4th edition (ISBN 9780130084514). Chapter 5.1: Eigenvalues and Eigenvectors includes 26 full step-by-step solutions.

Affine transformation
T(v) = Av + v0 = linear transformation plus shift.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Hermitian matrix A^H = Ā^T = A.
Complex analog a_ji = ā_ij of a symmetric matrix.
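The entry condition a_ji = ā_ij can be checked directly. A minimal sketch in plain Python using nested lists and built-in complex numbers (the helper names `conjugate_transpose` and `is_hermitian` are illustrative, not from the text):

```python
def conjugate_transpose(A):
    """Return A^H: transpose A and conjugate every entry."""
    n, m = len(A), len(A[0])
    return [[A[j][i].conjugate() for j in range(n)] for i in range(m)]

def is_hermitian(A):
    """A is Hermitian when a_ji equals the conjugate of a_ij."""
    return conjugate_transpose(A) == A

# Real diagonal, conjugate-mirrored off-diagonal entries:
A = [[2 + 0j, 1 + 3j],
     [1 - 3j, 5 + 0j]]
```

Note that a Hermitian matrix must have real diagonal entries, since a_ii = ā_ii forces the imaginary part to zero.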

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves A^T Ax̂ = A^T b. Then e = b − Ax̂ is orthogonal to all columns of A.
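A minimal sketch of this definition, assuming a tall matrix A with two independent columns so the 2×2 system A^T Ax̂ = A^T b can be solved by Cramer's rule; exact `Fraction` arithmetic lets us verify the orthogonality of the residual (helper name and data are illustrative):

```python
from fractions import Fraction as F

def normal_equations_2col(A, b):
    """Least squares for a tall matrix A with 2 columns:
    solve (A^T A) xhat = A^T b by Cramer's rule on the 2x2 system."""
    # Gram matrix A^T A and right-hand side A^T b
    g11 = sum(r[0] * r[0] for r in A)
    g12 = sum(r[0] * r[1] for r in A)
    g22 = sum(r[1] * r[1] for r in A)
    c1 = sum(r[0] * bi for r, bi in zip(A, b))
    c2 = sum(r[1] * bi for r, bi in zip(A, b))
    det = g11 * g22 - g12 * g12          # nonzero when columns are independent
    x1 = F(c1 * g22 - g12 * c2, det)
    x2 = F(g11 * c2 - c1 * g12, det)
    return x1, x2

A = [[1, 1], [1, 2], [1, 3]]             # independent columns (full column rank)
b = [1, 2, 2]
x1, x2 = normal_equations_2col(A, b)

# The residual e = b - A xhat is orthogonal to both columns of A.
e = [bi - (r[0] * x1 + r[1] * x2) for r, bi in zip(A, b)]
```

Here x̂ = (2/3, 1/2), and both dot products (column of A)·e come out exactly zero.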

Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector s, where Ms = s > 0.
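The approach of M^k to the steady state can be seen by repeated multiplication. A small sketch with a hypothetical 2×2 Markov matrix (columns sum to 1; the `matvec` helper is illustrative):

```python
def matvec(M, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# All entries positive and each column sums to 1 => Markov matrix.
M = [[0.9, 0.2],
     [0.1, 0.8]]

v = [1.0, 0.0]            # any starting probability vector
for _ in range(100):      # M^k v approaches the steady state
    v = matvec(M, v)

# The limit s solves Ms = s; for this M, s = (2/3, 1/3).
```

The convergence rate is governed by the second eigenvalue (here 0.7), so 100 steps put v within round-off of the steady state.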

Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
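For a strictly upper triangular example, each multiplication pushes the nonzero entries further above the diagonal, so an n×n such matrix satisfies N^n = 0. A minimal sketch (the `matmul` helper is illustrative):

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Strictly upper triangular (zero diagonal) => nilpotent.
N = [[0, 1, 2],
     [0, 0, 3],
     [0, 0, 0]]

N2 = matmul(N, N)    # only one nonzero entry remains
N3 = matmul(N2, N)   # N^3 = 0 for this 3x3 example
```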

Normal equation AT Ax = ATb.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.

Nullspace N (A)
= All solutions to Ax = 0. Dimension n − r = (# columns) − rank.

Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.

Similar matrices A and B.
Every B = M⁻¹AM has the same eigenvalues as A.
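For 2×2 matrices the characteristic polynomial is t² − (trace)t + det, so checking that B = M⁻¹AM preserves trace and determinant confirms equal eigenvalues. A small sketch with hypothetical matrices A and M (the `matmul2` helper is illustrative):

```python
def matmul2(A, B):
    """2x2 matrix product."""
    return [[A[0][0] * B[0][0] + A[0][1] * B[1][0],
             A[0][0] * B[0][1] + A[0][1] * B[1][1]],
            [A[1][0] * B[0][0] + A[1][1] * B[1][0],
             A[1][0] * B[0][1] + A[1][1] * B[1][1]]]

A = [[2, 1],
     [0, 3]]
M = [[1, 1],
     [0, 1]]
Minv = [[1, -1],
        [0,  1]]                       # inverse of M, since M * Minv = I

B = matmul2(Minv, matmul2(A, M))       # B = M^{-1} A M, similar to A

# Same characteristic polynomial t^2 - (trace)t + det => same eigenvalues.
trace_A, det_A = A[0][0] + A[1][1], A[0][0] * A[1][1] - A[0][1] * A[1][0]
trace_B, det_B = B[0][0] + B[1][1], B[0][0] * B[1][1] - B[0][1] * B[1][0]
```

For this choice B comes out diagonal, so the shared eigenvalues 2 and 3 are visible on its diagonal.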

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Solvable system Ax = b.
The right side b is in the column space of A.

Spectrum of A = the set of eigenvalues {λ1, ..., λn}.
Spectral radius = max of |λi|.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.