Chapter 3.1: det(A + B) = det(A) + det(B)
Chapter 3.2: det(A^-1 B) = …
Chapter 3.3: If det(A) = 0, then A has at least two equal rows.
Chapter 3.4: If A has a column of all zeros, then det(A) = 0.
Chapter 3.5: A is singular if and only if det(A) = 0.
Chapter 3.6: If B is the reduced row echelon form of A, then det(B) = det(A).
Chapter 3.7: The determinant of an elementary matrix is always 1.
Chapter 3.8: If A is nonsingular, then A^-1 = (1/det(A)) adj(A).
Chapter 3.9: If T is a matrix transformation from R^2 to R^2 defined by A …
Chapter 3.10: If all the diagonal elements of an n x n matrix A are zero, then …
Chapter 3.11: det(A B^T A^-1) = det(B).
Chapter 3.12: det(cA) = det(A).
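Several of these statements can be probed numerically. A minimal numpy sketch, using random test matrices (the matrices and the scalar c are illustrative choices, not from the textbook):

```python
import numpy as np

# Numerically probe a few of the statements above with random
# 3x3 matrices (illustrative choices, not from the textbook).
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
n, c = 3, 2.0

# 3.1: det(A + B) = det(A) + det(B) fails in general.
sum_det = np.linalg.det(A) + np.linalg.det(B)
det_sum = np.linalg.det(A + B)

# 3.11: det(A B^T A^-1) = det(B^T) = det(B) holds.
d1 = np.linalg.det(A @ B.T @ np.linalg.inv(A))
d2 = np.linalg.det(B)

# 3.12: det(cA) = c^n det(A), so det(cA) = det(A) fails for c = 2.
d3 = np.linalg.det(c * A)
d4 = c**n * np.linalg.det(A)
```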
Solutions for Chapter 3: Compute |A| for each of the following:
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780471669593
Chapter 3 ("Compute |A| for each of the following") includes 12 full step-by-step solutions.

Companion matrix.
Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2 λ + c3 λ^2 + ... + cn λ^(n-1) - λ^n).
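A quick numerical check of this construction for n = 3 (the coefficients c1, c2, c3 below are hypothetical):

```python
import numpy as np

# Companion matrix for n = 3: c's in row n, ones above the diagonal.
c1, c2, c3 = 2.0, -1.0, 3.0
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [c1,  c2,  c3 ]])

lam = 1.5  # any test scalar
det_val = np.linalg.det(A - lam * np.eye(3))
# The stated polynomial, with the + sign for n = 3:
poly_val = c1 + c2 * lam + c3 * lam**2 - lam**3
```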

Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).
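A numpy sketch of this decomposition (A and b are hypothetical; A has rank 2, so its nullspace is one-dimensional):

```python
import numpy as np

# x_p solves Ax = b; adding any nullspace vector x_n still solves it.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 7.0]])              # rank 2, nullspace dim = 1
b = np.array([6.0, 13.0])

x_p = np.linalg.lstsq(A, b, rcond=None)[0]   # one particular solution
x_n = np.linalg.svd(A)[2][-1]                # right singular vector for sigma = 0
x = x_p + 5.0 * x_n                          # another complete solution
```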

Eigenvalue A and eigenvector x.
Ax = λx with x ≠ 0, so det(A - λI) = 0.
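A two-line numpy check of the definition (the matrix is a hypothetical example):

```python
import numpy as np

# Verify Ax = lambda*x and det(A - lambda*I) = 0 for one eigenpair.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
evals, evecs = np.linalg.eig(A)
lam, x = evals[0], evecs[:, 0]
```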

Four Fundamental Subspaces C (A), N (A), C (AT), N (AT).
Use A^H (the conjugate transpose) for complex A.

Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
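The rule h_ij = f(i + j) gives a direct construction (a minimal sketch; the choice f(i + j) = i + j is arbitrary):

```python
import numpy as np

# Hankel matrix: entry (i, j) depends only on i + j (0-based here).
n = 4
H = np.fromfunction(lambda i, j: i + j, (n, n))
```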

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
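One way to produce an upper Hessenberg matrix in numpy (the starting matrix is arbitrary):

```python
import numpy as np

# np.triu(A, -1) keeps the upper triangle plus one subdiagonal,
# which is exactly the upper Hessenberg pattern.
A = np.arange(16.0).reshape(4, 4) + 1.0
H = np.triu(A, -1)
```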

Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
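Independence is a rank condition, easy to test numerically (the example vectors are hypothetical):

```python
import numpy as np

# Two independent columns in R^3: rank(A) equals the column count,
# so Ax = 0 forces x = 0.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
rank = np.linalg.matrix_rank(A)
```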

Left inverse A+.
If A has full column rank n, then A+ = (A^T A)^-1 A^T has A+ A = I_n.
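A direct check of the formula (the matrix is a hypothetical full-column-rank example):

```python
import numpy as np

# Left inverse A+ = (A^T A)^-1 A^T, so A+ A = I_n.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])             # full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T
```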

Lucas numbers
L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
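The recurrence and the eigenvalue formula can be compared directly:

```python
import numpy as np

# Lucas numbers: L_n = L_{n-1} + L_{n-2} with L_0 = 2, L_1 = 1,
# versus the closed form lam1^n + lam2^n.
L = [2, 1]
for _ in range(10):
    L.append(L[-1] + L[-2])

lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2
closed = [lam1**n + lam2**n for n in range(len(L))]
```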

Network.
A directed graph that has constants Cl, ... , Cm associated with the edges.

Norm ‖A‖.
The ℓ^2 norm of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖ ‖x‖, ‖AB‖ ≤ ‖A‖ ‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. The Frobenius norm has ‖A‖_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
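All four norms are available through np.linalg.norm (the matrix is a hypothetical example):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

two_norm = np.linalg.norm(A, 2)        # sigma_max, the l2 norm
frob     = np.linalg.norm(A, 'fro')    # sqrt of the sum of a_ij^2
one_norm = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|
```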

Normal equation A^T A x = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0.
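Solving the normal equation reproduces numpy's built-in least squares (the data below are hypothetical):

```python
import numpy as np

# Fit by the normal equation A^T A x = A^T b and compare to lstsq.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])     # full rank n = 2
b = np.array([1.0, 2.0, 4.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]
```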

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Orthonormal vectors q1, ..., qn.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n, then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
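A QR factorization supplies orthonormal columns, so both properties can be verified (the example matrix and vector are hypothetical):

```python
import numpy as np

# Q^T Q = I, and for square Q every v equals the sum of (v^T q_j) q_j.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
Q, R = np.linalg.qr(A)          # Q is 2x2 with orthonormal columns
v = np.array([3.0, 4.0])
expansion = Q @ (Q.T @ v)       # sum over j of (v^T q_j) q_j
```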

Rank one matrix A = u v^T ≠ 0.
Column and row spaces = lines cu and cv.
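An outer product makes this concrete (u and v are hypothetical):

```python
import numpy as np

# A = u v^T has rank one: every column is a multiple of u.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)
```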

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂^2 f / ∂x_i ∂x_j = Hessian matrix) is indefinite.
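The standard example f(x, y) = x^2 - y^2 has a saddle at the origin; its Hessian there is indefinite:

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 at (0, 0): eigenvalues 2 and -2,
# one of each sign, so the matrix is indefinite.
H = np.array([[2.0,  0.0],
              [0.0, -2.0]])
eigs = np.linalg.eigvalsh(H)    # ascending order
```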

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.

Singular Value Decomposition (SVD).
A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
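The relation A v_i = σ_i u_i can be checked column by column (the matrix is a hypothetical example):

```python
import numpy as np

# numpy returns A = U @ diag(s) @ Vt; the rows of Vt are the v_i.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)
```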

Solvable system Ax = b.
The right side b is in the column space of A.

Toeplitz matrix.
Constant down each diagonal; a time-invariant (shift-invariant) filter.
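The rule t_ij = f(i - j) builds one directly (the generating values below are arbitrary):

```python
import numpy as np

# Toeplitz matrix: entry (i, j) depends only on i - j.
c = [0.0, 1.0, 2.0, 3.0]      # values for i - j = 0, 1, 2, 3
r = [0.0, -1.0, -2.0, -3.0]   # values for j - i = 0, 1, 2, 3
n = 4
T = np.empty((n, n))
for i in range(n):
    for j in range(n):
        T[i, j] = c[i - j] if i >= j else r[j - i]
```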