 5.1: If T is a linear transformation from R^n to R^n such that T(e_1), T(...
 5.2: If A is an invertible matrix, then the equation (A^T)^{-1} = (A^{-1})^T must...
 5.3: If matrix A is orthogonal, then matrix A^2 must be orthogonal as well.
 5.4: The equation (AB)^T = A^T B^T holds for all n × n matrices A and B.
 5.5: If A and B are symmetric n × n matrices, then A + B must be symmetric...
 5.6: If matrices A and S are orthogonal, then S^{-1}AS is orthogonal as well.
 5.7: All nonzero symmetric matrices are invertible.
 5.8: If A is an n × n matrix such that AA^T = I_n, then A must be an orthogo...
 5.9: If u is a unit vector in R^n, and L = span(u), then proj_L(x) = (x · u...
 5.10: If A is a symmetric matrix, then 7A must be symmetric as well.
 5.11: If x and y are two vectors in R^n, then the equation ||x + y||^2 = ||x||^2 + ||y...
 5.12: The equation det(A^T) = det(A) holds for all 2 × 2 matrices A.
 5.13: If matrix A is orthogonal, then A^T must be orthogonal as well.
 5.14: If A and B are symmetric n × n matrices, then AB must be symmetric as ...
 5.15: If matrices A and B commute, then A must commute with B^T as well.
 5.16: If A is any matrix with ker(A) = {0}, then the matrix AA^T represent...
 5.17: If A and B are symmetric n × n matrices, then ABBA must be symmetric ...
 5.18: If matrices A and B commute, then matrices A^T and B^T must commute a...
 5.19: There exists a subspace V of R^5 such that dim(V) = dim(V^⊥), where V^⊥...
 5.20: Every invertible matrix A can be expressed as the product of an ort...
 5.21: The determinant of all orthogonal 2 × 2 matrices is 1.
 5.22: If A is any square matrix, then matrix (1/2)(A − A^T) is skew-symmetric.
 5.23: The entries of an orthogonal matrix are all less than or equal to 1.
 5.24: Every nonzero subspace of R^n has an orthonormal basis.
 5.25: The matrix [3 4; 4 3] is an orthogonal matrix.
 5.26: If V is a subspace of R^n and x is a vector in R^n, then vector proj_V...
 5.27: If A and B are orthogonal 2 × 2 matrices, then AB = BA.
 5.28: If A is a symmetric matrix, vector v is in the image of A, and w is...
 5.29: The formula ker(A) = ker(A^T A) holds for all matrices A.
 5.30: If A^T A = AA^T for an n × n matrix A, then A must be orthogonal.
 5.31: There exist orthogonal 2 × 2 matrices A and B such that A + B is ortho...
 5.32: If ||Ax|| ≤ ||x|| for all x in R^n, then A must represent the orthogonal projec...
 5.33: If A is an invertible matrix such that A^{-1} = A, then A must be ortho...
 5.34: If the entries of two vectors v and w in R^n are all positive, then ...
 5.35: The formula (ker B)^⊥ = im(B^T) holds for all matrices B.
 5.36: The matrix A^T A is symmetric for all matrices A.
 5.37: If matrix A is similar to B and A is orthogonal, then B must be ortho...
 5.38: The formula im(B) = im(B^T B) holds for all square matrices B.
 5.39: If matrix A is symmetric and matrix S is orthogonal, then matrix S^{-1}...
 5.40: If A is a square matrix such that A^T A = AA^T, then ker(A) = ker(A^T).
 5.41: Any square matrix can be written as the sum of a symmetric and a sk...
 5.42: If x_1, x_2, ..., x_n are any real numbers, then the inequality ...
 5.43: If AA^T = A^2 for a 2 × 2 matrix A, then A must be symmetric.
 5.44: If V is a subspace of R^n and x is a vector in R^n, then the inequali...
 5.45: If A is an n × n matrix such that ||Au|| = 1 for all unit vectors u, then...
 5.46: If A is any symmetric 2 × 2 matrix, then there must exist a real numb...
 5.47: There exists a basis of R^{2×2} that consists of orthogonal matrices.
 5.48: If A = [1 2; 2 1], then the matrix Q in the QR factorization of A is ...
 5.49: There exists a linear transformation L from R^{3×3} to R^{2×2} whose kernel...
 5.50: If a 3 × 3 matrix A represents the orthogonal projection onto a plane...
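
Two of the statements above lend themselves to a quick numerical sanity check. The sketch below is not part of the textbook; it assumes NumPy is available and verifies the projection formula of 5.9 and the symmetric/skew-symmetric split of 5.41 on small examples:

```python
import numpy as np

# Exercise 5.9: for a unit vector u and L = span(u), proj_L(x) = (x . u) u.
u = np.array([3.0, 4.0]) / 5.0          # unit vector in R^2
x = np.array([1.0, 2.0])
proj = (x @ u) * u                      # projection of x onto L

# The residual x - proj should be orthogonal to u.
residual = x - proj
print(np.isclose(residual @ u, 0.0))    # True

# Exercise 5.41: A = (A + A^T)/2 + (A - A^T)/2 splits any square matrix
# into a symmetric part S and a skew-symmetric part K.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
S = (A + A.T) / 2
K = (A - A.T) / 2
print(np.allclose(S, S.T), np.allclose(K, -K.T), np.allclose(S + K, A))
```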
Solutions for Chapter 5: Linear Algebra with Applications 5th Edition
ISBN: 9780321796974
Chapter 5 includes 50 full step-by-step solutions.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Column space C(A).
The space of all combinations of the columns of A.

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix F_n into l = log_2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^{-1} c can be computed with nl/2 multiplications. Revolutionary.
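
As a quick check on the Fourier matrix itself (a NumPy sketch, not from the glossary; NumPy's FFT uses the sign convention w = e^{-2πi/n}), F_n x computed by the dense matrix matches the FFT:

```python
import numpy as np

# Build the n x n Fourier matrix F with entries w^(jk), w = exp(-2*pi*i/n),
# matching NumPy's FFT sign convention, and compare F @ x with np.fft.fft(x).
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * j * k / n)
x = np.arange(n, dtype=float)
ok = np.allclose(F @ x, np.fft.fft(x))
print(ok)                               # True
```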

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions r and n − r. Applied to A^T: the column space C(A) is the orthogonal complement of N(A^T) in R^m.
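
A small numerical illustration (a sketch assuming NumPy, not from the glossary): for a rank-1 matrix, every nullspace vector is perpendicular to every row.

```python
import numpy as np

# A rank-1 example: vectors in the nullspace N(A) should be orthogonal
# to the rows of A (which span the row space C(A^T)).
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])          # rank 1, so dim N(A) = 3 - 1 = 2

# Two independent special solutions to Ax = 0.
n1 = np.array([-2.0, 1.0, 0.0])
n2 = np.array([-3.0, 0.0, 1.0])
print(np.allclose(A @ n1, 0), np.allclose(A @ n2, 0))      # both in N(A)
print(np.isclose(A[0] @ n1, 0), np.isclose(A[0] @ n2, 0))  # perpendicular to the row
```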

Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
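
A worked QR factorization (a NumPy sketch, not from the glossary; np.linalg.qr does not guarantee a positive diagonal in R, so the signs are flipped to match the convention above):

```python
import numpy as np

# QR factorization of a small matrix with independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)

# Enforce diag(R) > 0: multiplying column j of Q and row j of R by the
# same sign leaves the product Q @ R unchanged.
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R

print(np.allclose(Q @ R, A))            # A = QR
print(np.allclose(Q.T @ Q, np.eye(2)))  # orthonormal columns
print(np.all(np.diag(R) > 0))           # convention satisfied
```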

Identity matrix I (or I_n).
Diagonal entries = 1, off-diagonal entries = 0.

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 = ||b − Ax||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
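
A small least-squares example (a NumPy sketch, not from the glossary): fit a line through three points by solving the normal equations, then confirm the error is orthogonal to the columns of A.

```python
import numpy as np

# Overdetermined system: fit b ~ c + d*t through points t = 0, 1, 2.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# Normal equations: A^T A x_hat = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The error e = b - A x_hat is orthogonal to every column of A.
e = b - A @ x_hat
print(np.allclose(A.T @ e, 0.0))        # True
```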

Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).

Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Nullspace matrix N.
The columns of N are the n − r special solutions to As = 0.

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges needed to reach I.

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.
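
The pure-imaginary spectrum is easy to check numerically (a NumPy sketch, not from the glossary):

```python
import numpy as np

# A 3 x 3 skew-symmetric matrix: K^T = -K.
K = np.array([[ 0.0, -2.0, 1.0],
              [ 2.0,  0.0, 3.0],
              [-1.0, -3.0, 0.0]])
skew = np.allclose(K.T, -K)

# Its eigenvalues should have zero real part (pure imaginary).
pure_imag = np.allclose(np.linalg.eigvals(K).real, 0.0)
print(skew, pure_imag)                  # True True
```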

Special solutions to As = 0.
One free variable is s_i = 1, the other free variables = 0.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Symmetric factorizations A = LDL^T and A = QΛQ^T.
The signs in Λ match the signs in D.

Symmetric matrix A.
The transpose is A^T = A, and a_ij = a_ji. A^{-1} is also symmetric.

Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.