 5.1.1E: Is an eigenvalue of ? Why or why not?
 5.1.2E: Is an eigenvalue of ? Why or why not?
 5.1.3E: Is an eigenvector of ? If so, find the eigenvalue.
 5.1.4E: Is an eigenvector of ? If so, find the eigenvalue.
 5.1.5E: If so, find the eigenvalue.
 5.1.6E: If so, find the eigenvalue.
 5.1.7E: If so, find one corresponding eigenvector.
 5.1.8E: If so, find one corresponding eigenvector.
 5.1.9E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.10E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.11E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.12E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.13E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.14E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.15E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.16E: In Exercises 9–16, find a basis for the eigenspace corresponding to...
 5.1.17E: Find the eigenvalues of the matrices in Exercises 17 and 18.
 5.1.18E: Find the eigenvalues of the matrices in Exercises 17 and 18.
 5.1.19E: Find one eigenvalue, with no calculation. Justify your answer.
 5.1.20E: Without calculation, find one eigenvalue and two linearly independe...
 5.1.21E: In Exercises 21 and 22, A is an n × n matrix. Mark each statement T...
 5.1.22E: In Exercises 21 and 22, A is an n × n matrix. Mark each statement T...
 5.1.23E: Explain why a 2 × 2 matrix can have at most two distinct eigenvalue...
 5.1.24E: Construct an example of a 2 × 2 matrix with only one distinct eigen...
 5.1.25E: Let λ be an eigenvalue of an invertible matrix A. Show that λ⁻¹ is an eig...
 5.1.26E: Show that if A2 is the zero matrix, then the only eigenvalue of A i...
 5.1.27E: Show that λ is an eigenvalue of A if and only if λ is an eigenvalue of ...
 5.1.28E: Use Exercise 27 to complete the proof of Theorem 1 for the case in ...
 5.1.29E: Consider an n × n matrix A with the property that the row sums all ...
 5.1.30E: Consider an n × n matrix A with the property that the column sums a...
 5.1.31E: In Exercises 31 and 32, let A be the matrix of the linear transform...
 5.1.32E: In Exercises 31 and 32, let A be the matrix of the linear transform...
 5.1.33E: Let u and v be eigenvectors of a matrix A, with corresponding eigen...
 5.1.34E: Describe how you might try to build a solution of a difference equa...
 5.1.35E: Let u and v be the vectors shown in the figure, and suppose u and v...
 5.1.36E: Repeat Exercise 35, assuming u and v are eigenvectors of A that cor...
 5.1.37E: [M] In Exercises 37–40, use a matrix program to find the eigenvalue...
 5.1.38E: [M] In Exercises 37–40, use a matrix program to find the eigenvalue...
 5.1.39E: [M] In Exercises 37–40, use a matrix program to find the eigenvalue...
 5.1.40E: [M] In Exercises 37–40, use a matrix program to find the eigenvalue...
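Exercises 37-40 ask for eigenvalues via a matrix program. As a minimal sketch of that workflow (the matrix below is a made-up triangular example, not one from the text), NumPy's `eig` routine does the job:

```python
import numpy as np

# Hypothetical lower-triangular 3x3 matrix; a triangular matrix's
# eigenvalues are its diagonal entries, so we expect 2, 3, 6.
A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 5.0, 6.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
vals = np.sort(eigenvalues.real)
print(vals)   # -> [2. 3. 6.]
```

Each column of `eigenvectors` is a unit eigenvector paired with the eigenvalue in the same position of `eigenvalues`.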
Solutions for Chapter 5.1: Linear Algebra and Its Applications 4th Edition
ISBN: 9780321385178
This textbook survival guide was created for the textbook Linear Algebra and Its Applications, edition 4 (ISBN 9780321385178). Chapter 5.1 includes 40 full step-by-step solutions, which more than 32,449 students have viewed.

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
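As a quick numerical check (with a made-up 3 × 3 matrix, not one from any particular text), the elimination matrix can be sketched in NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
i, j = 2, 0                     # eliminate the (2, 0) entry (0-based rows)

l_ij = A[i, j] / A[j, j]        # multiplier l_ij = 8/2 = 4
E = np.eye(3)
E[i, j] = -l_ij                 # identity with an extra -l_ij in entry (i, j)

B = E @ A                       # subtracts l_ij * (row j) from row i
print(B[i, j])                  # -> 0.0, the entry is eliminated
```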

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
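A one-step sketch of this factorization on a hypothetical 2 × 2 matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])

# One elimination step: multiplier l21 = 6/2 = 3
l21 = A[1, 0] / A[0, 0]
U = A.copy()
U[1] = U[1] - l21 * U[0]        # U is now upper triangular

L = np.array([[1.0, 0.0],
              [l21, 1.0]])      # multipliers below the diagonal, ones on it

print(np.allclose(L @ U, A))    # -> True: L brings U back to A
```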

Four Fundamental Subspaces C(A), N(A), C(Aᵀ), N(Aᵀ).
Use Aᴴ (the conjugate transpose) in place of Aᵀ for complex A.

Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
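Building such a matrix for a small hypothetical directed graph (a triangle on 3 nodes) makes the definition concrete:

```python
import numpy as np

# Each edge runs from node i to node j: entry -1 in column i, +1 in column j.
edges = [(0, 1), (1, 2), (0, 2)]
m, n = len(edges), 3

A = np.zeros((m, n))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1
    A[row, j] = 1

# Every row sums to zero, so the all-ones vector lies in the nullspace.
print(A @ np.ones(n))   # -> [0. 0. 0.]
```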

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
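A minimal sketch, using made-up data points for a straight-line fit:

```python
import numpy as np

# Hypothetical overdetermined system: fit b ~ c0 + c1*t at three points.
t = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])

# Solve the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The error e = b - A x_hat is orthogonal to every column of A.
e = b - A @ x_hat
print(A.T @ e)   # ~ [0, 0]
```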

Left nullspace N(Aᵀ).
Nullspace of Aᵀ = "left nullspace" of A because yᵀA = 0ᵀ.

Lucas numbers
Ln = 2, 1, 3, 4, ... satisfy Ln = Ln−1 + Ln−2 = λ1ⁿ + λ2ⁿ, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L0 = 2 with F0 = 0.
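The recurrence and the eigenvalue closed form can be checked against each other directly:

```python
import numpy as np

# Lucas numbers: L0 = 2, L1 = 1, then L_n = L_{n-1} + L_{n-2}.
L = [2, 1]
for _ in range(10):
    L.append(L[-1] + L[-2])
print(L[:7])   # -> [2, 1, 3, 4, 7, 11, 18]

# Closed form L_n = lam1**n + lam2**n, where lam1, lam2 = (1 +/- sqrt(5))/2
# are the eigenvalues of the Fibonacci matrix [[1, 1], [1, 0]].
lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2
closed = [lam1**n + lam2**n for n in range(7)]
print(np.allclose(L[:7], closed))   # -> True
```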

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.

Pseudoinverse A⁺ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(Aᵀ). A⁺A and AA⁺ are the projection matrices onto the row space and column space. Rank(A⁺) = rank(A).
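These properties can be verified numerically on a made-up rank-1 matrix:

```python
import numpy as np

# Hypothetical rank-1 matrix (m = 2, n = 3), so A has no ordinary inverse.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

A_plus = np.linalg.pinv(A)          # n by m Moore-Penrose pseudoinverse

# A A+ and A+ A are projections: symmetric and idempotent.
P_col = A @ A_plus                  # projects onto the column space of A
P_row = A_plus @ A                  # projects onto the row space of A
print(np.allclose(P_col @ P_col, P_col))   # -> True
print(np.allclose(P_row @ P_row, P_row))   # -> True

# rank(A+) = rank(A)
print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # -> True
```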

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
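SymPy computes this form exactly; here is a sketch on a hypothetical rank-2 matrix:

```python
from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 4],
            [3, 6, 5]])

R, pivot_cols = A.rref()
print(R)            # pivots are 1, with zeros above and below them
print(pivot_cols)   # -> (0, 2): columns 1 and 3 hold the pivots
```

The two nonzero rows of `R` give a basis for the row space of `A`, matching r = rank(A) = 2.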

Semidefinite matrix A.
(Positive) semidefinite: all xᵀAx ≥ 0, all λ ≥ 0; A = any RᵀR.
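Both characterizations can be spot-checked on a made-up singular example A = RᵀR:

```python
import numpy as np

R = np.array([[1.0, 2.0],
              [0.0, 0.0]])
A = R.T @ R                     # any R^T R is positive semidefinite

# All eigenvalues are >= 0 (here 0 and 5, since A is singular) ...
print(np.all(np.linalg.eigvalsh(A) >= -1e-12))   # -> True

# ... and x^T A x >= 0 for every x (spot-check 100 random vectors).
rng = np.random.default_rng(0)
xs = rng.normal(size=(100, 2))
quad = np.einsum('ij,jk,ik->i', xs, A, xs)       # x^T A x per row
print(np.all(quad >= -1e-12))                    # -> True
```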

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
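SciPy's `linprog` solves this standard form; a tiny hypothetical LP shows the optimum landing at a corner:

```python
from scipy.optimize import linprog

# Minimize c^T x subject to A_eq x = b_eq and x >= 0.
# Feasible set: the segment from (1, 0) to (0, 1); costs 1 and 2 at the corners.
c = [1.0, 2.0]
A_eq = [[1.0, 1.0]]
b_eq = [1.0]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None), (0, None)])
print(res.x)     # optimal corner x* = (1, 0)
print(res.fun)   # minimum cost 1.0
```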

Solvable system Ax = b.
The right side b is in the column space of A.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Tridiagonal matrix T: tij = 0 if |i − j| > 1.
T⁻¹ has rank 1 above and below the diagonal.
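The rank-1 structure of the inverse can be seen on the familiar −1, 2, −1 second-difference matrix (chosen here as an example):

```python
import numpy as np

# Tridiagonal second-difference matrix: 2 on the diagonal, -1 beside it.
n = 4
T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Tinv = np.linalg.inv(T)

# A block taken strictly above the diagonal of T^{-1} has rank 1.
upper_block = Tinv[:2, 2:]
print(np.linalg.matrix_rank(upper_block))   # -> 1
```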

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + ··· + cn−1 xⁿ⁻¹ with p(xi) = bi. Vij = (xi)ʲ⁻¹ and det V = product of (xk − xi) for k > i.
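A sketch of polynomial interpolation through three made-up points (which happen to lie on 1 + x + x²):

```python
import numpy as np

x_pts = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])

# V_ij = (x_i)^(j-1): np.vander with increasing=True gives exactly this layout.
V = np.vander(x_pts, increasing=True)
c = np.linalg.solve(V, b)
print(c)   # coefficients c0, c1, c2 of the interpolating polynomial

# p(x_i) = b_i at every sample point (polyval wants highest degree first).
print(np.allclose(np.polyval(c[::-1], x_pts), b))   # -> True

# det V = (1-0)(2-0)(2-1) = 2, the product of (x_k - x_i) for k > i.
print(np.linalg.det(V))
```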

Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.

Volume of box.
The rows (or the columns) of A generate a box with volume I det(A) I.
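A quick check on a made-up example: a 2 × 3 × 4 rectangular box, whose volume survives a shear:

```python
import numpy as np

# The rows of A span a box (parallelepiped) with volume |det(A)|.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 4.0]])
print(abs(np.linalg.det(A)))   # -> 24.0, a 2 x 3 x 4 box

# Shearing one row by a multiple of another leaves the volume unchanged.
A[1] += 5 * A[0]
print(abs(np.linalg.det(A)))   # -> still 24.0
```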