 2.3.1: For each of the following, compute (i) det(A), (ii) adj A, and (iii...
2.3.2: Use Cramer's rule to solve each of the following systems: (a) x1 + 2...
2.3.3: Given A = [1 2 1; 0 4 3; 1 2 2], determine the (2, 3) entry of A^-1 by com...
2.3.4: Let A be the matrix in Exercise 3. Compute the third column of A^-1 b...
2.3.5: Let A = [1 2 3; 2 3 4; 3 4 5]. (a) Compute the determinant of A. Is A no...
 2.3.6: If A is singular, what can you say about the product A adj A?
 2.3.7: Let Bj denote the matrix obtained by replacing the j th column of t...
2.3.8: Let A be a nonsingular n × n matrix with n > 1. Show that det(adj A) ...
2.3.9: Let A be a 4 × 4 matrix. If adj A = [2 0 0 0; 0 2 1 0; 0 4 3 2; 0 2 1 2] (...
 2.3.10: Show that if A is nonsingular, then adj A is nonsingular and (adj A...
2.3.11: Show that if A is singular, then adj A is also singular.
2.3.12: Show that if det(A) = 1, then adj(adj A) = A
2.3.13: Suppose that Q is a matrix with the property Q^-1 = Q^T. Show that qi...
 2.3.14: In coding a message, a blank space was represented by 0, an A by 1,...
 2.3.15: Let x, y, and z be vectors in R3. Show each of the following: (a) x...
2.3.16: Let x and y be vectors in R3 and define the skew-symmetric matrix Ax...
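Several of the exercises above (2.3.6 through 2.3.12) turn on the adjugate identity A (adj A) = det(A) I. A minimal numpy sketch of the adjugate, checked on the matrix from Exercise 2.3.3 (the helper name `adjugate` is ours, not the textbook's):

```python
import numpy as np

def adjugate(A):
    """Adjugate (classical adjoint): transpose of the cofactor matrix."""
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # (i, j) cofactor: signed determinant of the minor
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T

A = np.array([[1.0, 2.0, 1.0],
              [0.0, 4.0, 3.0],
              [1.0, 2.0, 2.0]])   # the matrix from Exercise 2.3.3; det(A) = 4

# Key identity: A @ adj(A) = adj(A) @ A = det(A) * I
print(A @ adjugate(A))
```

Since det(A) = 4 is nonzero here, A^-1 = adj(A) / det(A), which is exactly what Exercises 2.3.3 and 2.3.4 exploit.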
Solutions for Chapter 2.3: Additional Topics and Applications
Full solutions for Linear Algebra with Applications, 8th Edition
ISBN: 9780136009290
This guide covers all 16 problems in Chapter 2.3: Additional Topics and Applications of Linear Algebra with Applications, 8th edition (ISBN 9780136009290), each with a full step-by-step solution.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
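A quick numerical check of the theorem on a small symmetric example of our own choosing, whose characteristic polynomial is p(λ) = λ² − 4λ + 3:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
# p(lambda) = lambda^2 - 4*lambda + 3 (trace 4, determinant 3).
# Substituting A for lambda gives the zero matrix, as Cayley-Hamilton asserts.
pA = A @ A - 4 * A + 3 * np.eye(2)
print(pA)
```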

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
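Cramer's rule translates directly into code; a sketch on an arbitrary 2 × 2 system (the function name `cramer` is ours):

```python
import numpy as np

def cramer(A, b):
    """Solve Ax = b by Cramer's rule: x_j = det(B_j) / det(A)."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Bj = A.copy()
        Bj[:, j] = b                  # B_j: column j of A replaced by b
        x[j] = np.linalg.det(Bj) / d
    return x

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 6.0])
print(cramer(A, b))
```

Computing n + 1 determinants is far more expensive than elimination for large n; the rule is mainly a theoretical tool and a way to read off single entries of the solution, as in Exercises 2.3.2-2.3.4.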

Cyclic shift S.
Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
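The eigenvalue claim is easy to verify numerically for a small n of our choosing:

```python
import numpy as np

n = 4
S = np.zeros((n, n))
S[np.arange(1, n), np.arange(n - 1)] = 1.0   # S_21 = S_32 = S_43 = 1
S[0, n - 1] = 1.0                            # S_1n = 1 closes the cycle

# S cyclically shifts coordinates; its eigenvalues are the 4th roots of unity.
eigs = np.linalg.eigvals(S)
print(eigs)
```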

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.

Elimination matrix = Elementary matrix E_ij.
The identity matrix with an extra −ℓ_ij in the (i, j) entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0), with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
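A small check of this orthogonality on a rank-1 matrix of our choosing, using the SVD to extract a nullspace basis (the rows of Vt beyond the rank span N(A)):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank r = 1, n = 3
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
N = Vt[r:].T                           # columns: orthonormal basis of N(A), dimension n - r

# Every nullspace vector solves Ax = 0, so it is perpendicular to every row of A.
print(A @ N)
```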

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
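The row-operation recipe can be sketched directly; a minimal implementation (our own, with partial pivoting added for stability, assuming A is invertible):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce [A | I] to [I | A^-1]; assumes A is square and invertible."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for j in range(n):
        p = j + int(np.argmax(np.abs(M[j:, j])))  # partial pivoting
        M[[j, p]] = M[[p, j]]                     # swap in the largest pivot
        M[j] /= M[j, j]                           # scale the pivot row to make the pivot 1
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]            # clear the rest of column j
    return M[:, n:]

A = np.array([[1.0, 2.0, 1.0],
              [0.0, 4.0, 3.0],
              [1.0, 2.0, 2.0]])   # the matrix from Exercise 2.3.3
Ainv = gauss_jordan_inverse(A)
print(Ainv)
```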

Jordan form J = M^-1 AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

Multiplier ℓ_ij.
The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the (i, j) entry: ℓ_ij = (entry to eliminate) / (jth pivot).
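One elimination step, on a 2 × 2 example of our choosing, showing both the multiplier and the elementary matrix that carries it:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 8.0]])
l21 = A[1, 0] / A[0, 0]       # multiplier: (entry to eliminate) / (pivot) = 6/2 = 3
E21 = np.eye(2)
E21[1, 0] = -l21              # elementary matrix: extra -l_21 in the (2, 1) entry
U = E21 @ A                   # subtracts 3 * (row 1) from row 2, zeroing the (2, 1) entry
print(U)
```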

Norm ‖A‖.
The "ℓ^2 norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. The Frobenius norm has ‖A‖_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
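All four norms are available through `numpy.linalg.norm`; on an example matrix of our choosing:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
two_norm = np.linalg.norm(A, 2)        # sigma_max, the largest singular value
fro_norm = np.linalg.norm(A, 'fro')    # sqrt(sum of a_ij^2) = sqrt(30)
one_norm = np.linalg.norm(A, 1)        # largest column sum of |a_ij| = 6
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij| = 7
print(two_norm, fro_norm, one_norm, inf_norm)
```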

Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

Plane (or hyperplane) in R^n.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Right inverse A+.
If A has full row rank m, then A+ = A^T (AA^T)^-1 has AA+ = I_m.
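The formula is a one-liner to check on a full-row-rank matrix of our choosing:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # full row rank, m = 2, so A A^T is invertible
A_plus = A.T @ np.linalg.inv(A @ A.T)    # right inverse A+ = A^T (A A^T)^-1
print(A @ A_plus)
```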

Schwarz inequality
|v·w| ≤ ‖v‖ ‖w‖. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.

Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.

Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
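`numpy.linalg.eigh` computes exactly this factorization for symmetric matrices; checked on a small example of our choosing:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # real symmetric
lam, Q = np.linalg.eigh(A)       # real eigenvalues (ascending), orthonormal eigenvector columns
print(lam)
```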

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Transpose matrix A^T.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, and positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.