5.1: If T is a linear transformation from R^n to R^n such that T(e1), T(...
5.2: If A is an invertible matrix, then the equation (A^T)^-1 = (A^-1)^T must...
5.3: If matrix A is orthogonal, then matrix A^2 must be orthogonal as well.
5.4: The equation (AB)^T = A^T B^T holds for all n×n matrices A and B.
5.5: If A and B are symmetric n×n matrices, then A + B must be symmetric...
5.6: If matrices A and S are orthogonal, then S^-1 A S is orthogonal as well.
5.7: All nonzero symmetric matrices are invertible.
5.8: If A is an n×n matrix such that A A^T = I_n, then A must be an orthogo...
5.9: If u is a unit vector in R^n, and L = span(u), then proj_L(x) = (x · u...
5.10: If A is a symmetric matrix, then 7A must be symmetric as well.
5.11: If x and y are two vectors in R^n, then the equation ||x + y||^2 = ||x||^2 + ||y...
5.12: The equation det(A^T) = det(A) holds for all 2×2 matrices A.
5.13: If matrix A is orthogonal, then A^T must be orthogonal as well.
5.14: If A and B are symmetric n×n matrices, then AB must be symmetric as ...
5.15: If matrices A and B commute, then A must commute with B^T as well.
5.16: If A is any matrix with ker(A) = {0}, then the matrix A A^T represent...
5.17: If A and B are symmetric n×n matrices, then ABBA must be symmetric ...
5.18: If matrices A and B commute, then matrices A^T and B^T must commute a...
5.19: There exists a subspace V of R^5 such that dim(V) = dim(V^⊥), where V...
5.20: Every invertible matrix A can be expressed as the product of an ortho...
5.21: The determinant of all orthogonal 2×2 matrices is 1.
5.22: If A is any square matrix, then the matrix (1/2)(A − A^T) is skew-symmetric.
5.23: The entries of an orthogonal matrix are all less than or equal to 1.
5.24: Every nonzero subspace of R^n has an orthonormal basis.
5.25: The 2×2 matrix [3 4; 4 3] is an orthogonal matrix.
5.26: If V is a subspace of R^n and x is a vector in R^n, then vector proj_V...
5.27: If A and B are orthogonal 2×2 matrices, then AB = BA.
5.28: If A is a symmetric matrix, vector v is in the image of A, and w is...
5.29: The formula ker(A) = ker(A^T A) holds for all matrices A.
5.30: If A^T A = A A^T for an n×n matrix A, then A must be orthogonal.
5.31: There exist orthogonal 2×2 matrices A and B such that A + B is ortho...
5.32: If ||Ax|| ≤ ||x|| for all x in R^n, then A must represent the orthogonal projec...
5.33: If A is an invertible matrix such that A^-1 = A, then A must be ortho...
5.34: If the entries of two vectors v and w in R^n are all positive, then ...
5.35: The formula (ker B)^⊥ = im(B^T) holds for all matrices B.
5.36: The matrix A^T A is symmetric for all matrices A.
5.37: If matrix A is similar to B and A is orthogonal, then B must be ortho...
5.38: The formula im(B) = im(B^T B) holds for all square matrices B.
5.39: If matrix A is symmetric and matrix S is orthogonal, then matrix S^-1...
5.40: If A is a square matrix such that A^T A = A A^T, then ker(A) = ker(A^T).
5.41: Any square matrix can be written as the sum of a symmetric and a sk...
5.42: If x1, x2, ..., xn are any real numbers, then the inequality ...
5.43: If A A^T = A^2 for a 2×2 matrix A, then A must be symmetric.
5.44: If V is a subspace of R^n and x is a vector in R^n, then the inequali...
5.45: If A is an n×n matrix such that ||Au|| = 1 for all unit vectors u, then...
5.46: If A is any symmetric 2×2 matrix, then there must exist a real numb...
5.47: There exists a basis of R^{2×2} that consists of orthogonal matrices.
5.48: If A = [1 2; 2 1], then the matrix Q in the QR factorization of A is ...
5.49: There exists a linear transformation L from R^{3×3} to R^{2×2} whose kernel...
5.50: If a 3×3 matrix A represents the orthogonal projection onto a plane...
Solutions for Chapter 5: Linear Algebra with Applications 5th Edition
ISBN: 9780321796974

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
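The theorem can be checked numerically; here is a minimal numpy sketch (the matrix A is an arbitrary example, not from the text). For a 2×2 matrix, p(λ) = λ^2 − trace(A)·λ + det(A), so p(A) should come out as the zero matrix:

```python
import numpy as np

# Cayley-Hamilton check for an example 2x2 matrix:
# p(lambda) = lambda^2 - trace(A)*lambda + det(A), so p(A) = 0.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
ok_cayley = np.allclose(p_of_A, np.zeros((2, 2)))
```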

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

Diagonalization
Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
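As a numerical illustration (a sketch with an arbitrary example matrix, assuming distinct real eigenvalues so that S is invertible), numpy's eig returns exactly the Λ and S of this definition:

```python
import numpy as np

# Diagonalization: A = S Lambda S^{-1}, and powers A^k = S Lambda^k S^{-1}.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])               # eigenvalues 5 and 2 (distinct)
eigvals, S = np.linalg.eig(A)            # columns of S are eigenvectors
Lam = np.diag(eigvals)                   # Lambda = eigenvalue matrix
S_inv = np.linalg.inv(S)
ok_diag  = np.allclose(A, S @ Lam @ S_inv)
ok_power = np.allclose(np.linalg.matrix_power(A, 3),
                       S @ np.linalg.matrix_power(Lam, 3) @ S_inv)
```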

Free variable x_i.
Column i has no pivot in elimination. We can give the n − r free variables any values; then Ax = b determines the r pivot variables (if solvable!).

Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
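A quick numpy check of these properties (an arbitrary example matrix; note that numpy's qr does not enforce the diag(R) > 0 sign convention):

```python
import numpy as np

# QR factorization of a matrix with independent columns:
# Q has orthonormal columns, R is upper triangular, and A = QR.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3x2, R is 2x2
ok_orthonormal = np.allclose(Q.T @ Q, np.eye(2))   # Q^T Q = I
ok_factor      = np.allclose(Q @ R, A)             # A = QR
ok_triangular  = np.allclose(R, np.triu(R))        # R upper triangular
```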

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 (equivalently rank(A) < n, equivalently Ax = 0 for some nonzero vector x). The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
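The two inverse identities above can be verified numerically; a minimal sketch with arbitrary invertible example matrices:

```python
import numpy as np

# Verify (AB)^{-1} = B^{-1} A^{-1} and (A^T)^{-1} = (A^{-1})^T.
A = np.array([[1.0, 2.0], [3.0, 4.0]])   # det = -2, invertible
B = np.array([[0.0, 1.0], [1.0, 1.0]])   # det = -1, invertible
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)
ok_product   = np.allclose(np.linalg.inv(A @ B), Binv @ Ainv)
ok_transpose = np.allclose(np.linalg.inv(A.T), Ainv.T)
```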

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^{j-1} b. Numerical methods approximate A^-1 b by x_j with residual b − A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
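The "one multiplication by A per step" construction looks like this in numpy (A, b, and j are arbitrary examples, not from the text):

```python
import numpy as np

# Build the Krylov vectors b, Ab, ..., A^{j-1} b by repeated
# matrix-vector products; their span is K_j(A, b).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 0.0, 0.0])
j = 3
vectors = [b]
for _ in range(j - 1):
    vectors.append(A @ vectors[-1])      # one multiplication by A per step
K = np.column_stack(vectors)             # columns span K_j(A, b)
ok_full_rank = np.linalg.matrix_rank(K) == j
```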

Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
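A short numpy check of the formula (the tall matrix A is an arbitrary full-column-rank example):

```python
import numpy as np

# Left inverse of a full-column-rank matrix: A^+ = (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])               # full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T
ok_left_inverse = np.allclose(A_plus @ A, np.eye(2))   # A^+ A = I_n
```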

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
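Both facts, Q^T Q = I and the expansion v = Σ (v^T q_j) q_j, can be checked with a rotation matrix as an example orthogonal Q (a sketch, not from the text):

```python
import numpy as np

# A rotation matrix has orthonormal columns, so Q^T Q = I and Q^T = Q^{-1};
# any vector v expands as v = sum_j (v^T q_j) q_j.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
ok_qtq = np.allclose(Q.T @ Q, np.eye(2))
v = np.array([2.0, -1.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
ok_expand = np.allclose(expansion, v)
```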

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A form a basis for S then P = A (A^T A)^-1 A^T.
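All three properties (P^2 = P, P = P^T, error perpendicular to S) can be verified numerically; a minimal sketch with an arbitrary example basis and vector:

```python
import numpy as np

# Projection onto the column space of A: P = A (A^T A)^{-1} A^T.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])               # columns form a basis for S
P = A @ np.linalg.inv(A.T @ A) @ A.T
b = np.array([1.0, 2.0, 3.0])
e = b - P @ b                            # error e = b - Pb
ok_idempotent = np.allclose(P @ P, P)    # P^2 = P
ok_symmetric  = np.allclose(P, P.T)      # P = P^T
ok_perp       = np.allclose(A.T @ e, np.zeros(2))   # e perpendicular to S
```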

Projection p = a (a^T b / a^T a) onto the line through a.
P = a a^T / (a^T a) has rank 1.

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.

Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.

Schwarz inequality
|v · w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
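Both inequalities can be checked on concrete vectors; a sketch with arbitrary example vectors and a positive definite diagonal A:

```python
import numpy as np

# Schwarz inequality |v.w| <= ||v|| ||w||, and the weighted version
# |v^T A w|^2 <= (v^T A v)(w^T A w) for positive definite A.
v = np.array([1.0, -2.0, 3.0])
w = np.array([4.0, 0.0, -1.0])
ok_schwarz = abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w)
A = np.diag([2.0, 1.0, 3.0])             # positive diagonal => positive definite
ok_weighted = (v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w)
```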

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

Singular Value Decomposition
(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
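The factorization and the relation A v_i = σ_i u_i are easy to verify with numpy (the matrix A is an arbitrary example; note numpy returns V^T, so its rows are the right singular vectors v_i):

```python
import numpy as np

# SVD: A = U Sigma V^T with A v_i = sigma_i u_i for each singular pair.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)
ok_svd   = np.allclose(A, U @ np.diag(sigma) @ Vt)
ok_pairs = all(np.allclose(A @ Vt[i], sigma[i] * U[:, i]) for i in range(2))
```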

Standard basis for Rn.
Columns of the n by n identity matrix (written i, j, k in R^3).