 3.3.1: Let A = [...]. Find the following minors: (a) det(M13) (b)...
 3.3.2: Let A = [...]. Find the following minors: (a) d...
 3.3.3: Let A = [...]. Find the following cofactors: (a) A13 (b) A21
 3.3.4: Let A = [...]. Find the following cofactors: (a) A...
 3.3.5: Use Theorem 3.10 to evaluate the determinants in Exercise 1(a). (d)...
 3.3.6: Use Theorem 3.10 to evaluate the determinants in Exercise 1(b). (c)...
 3.3.7: Use Theorem 3.10 to evaluate the determinants in Exercise 2(a). (c)...
 3.3.8: Use Theorem 3.10 to evaluate the determinants in Exercise 2(b). (d)...
 3.3.9: Show by a column (row) expansion that if A = [aij] is upper (lowe...
 3.3.10: If A = [aij] is an n × n matrix ...
 3.3.11: Find all values of t for which (a) ... = 0
 3.3.12: Find all values of t for which
 3.3.13: Let A be an n × n matrix. (a) Show that f(t) = det(tIn − A) is ...
 3.3.14: Verify your answers to Exercise 13 with the following matrices: (a)...
 3.3.15: Let T be the triangle with vertices (3, 3), (−1, −1), (4, 1). (a) ...
 3.3.16: Find the area of the parallelogram with vertices (2, 3), (5, 3), (4...
 3.3.17: Let Q be the quadrilateral with vertices (−2, 3), (−1, 4), (3, 0), ...
 3.3.18: Prove that a rotation leaves the area of a triangle unchanged.
 3.3.19: Let T be the triangle with vertices (x1, y1), (x2, y2), and (x3, ...
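Theorem 3.10 referenced in Exercises 5–8 is cofactor expansion: det(A) = Σj (−1)^(1+j) a1j det(M1j) along the first row. A minimal sketch of that recursion in Python, using a made-up 3×3 matrix rather than the ones from Exercises 1–2:

```python
def minor(A, i, j):
    """M_ij: the submatrix of A with row i and column j deleted (0-indexed)."""
    return [row[:j] + row[j + 1:] for k, row in enumerate(A) if k != i]

def det(A):
    """Determinant by cofactor expansion along the first row:
    det(A) = sum_j (-1)^(1+j) * a_1j * det(M_1j)."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

A = [[2, 1, 0],
     [1, 3, 2],
     [0, 1, 4]]
print(det(A))  # 2*(12 - 2) - 1*(4 - 0) + 0*(1 - 0) = 16
```

Expanding along a row or column with many zeros (as in Exercise 9's triangular case) minimizes the number of minors that must be evaluated.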
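Exercises 15–19 rest on the fact that the area of a triangle is half the absolute value of a 2×2 determinant built from its edge vectors. A sketch with hypothetical vertices:

```python
def triangle_area(p1, p2, p3):
    """Area = (1/2) * |det [[x2-x1, y2-y1], [x3-x1, y3-y1]]|."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2

print(triangle_area((0, 0), (4, 0), (0, 3)))  # right triangle with legs 4 and 3: 6.0
```

This also explains Exercise 18: a rotation multiplies the edge vectors by a matrix of determinant 1, so the determinant, and hence the area, is unchanged.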
Solutions for Chapter 3.3: Cofactor Expansion
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780132296540
Chapter 3.3: Cofactor Expansion includes 19 full step-by-step solutions from Elementary Linear Algebra with Applications, 9th edition (ISBN 9780132296540).

Affine transformation
Tv = Av + v0 = linear transformation plus shift.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
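A quick numerical illustration of the theorem with numpy and a made-up 2×2 matrix (np.poly returns the coefficients of the characteristic polynomial of a square array):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
coeffs = np.poly(A)          # characteristic polynomial: lambda^2 - 5*lambda + 6
# Substitute the matrix for lambda: p(A) = A^2 - 5A + 6I
pA = coeffs[0] * (A @ A) + coeffs[1] * A + coeffs[2] * np.eye(2)
print(np.allclose(pA, np.zeros((2, 2))))   # True: p(A) is the zero matrix
```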

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^{-1}AS = Λ = eigenvalue matrix.
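A sketch of the factorization with numpy, using a hypothetical matrix whose two eigenvalues are different (so diagonalizability is automatic):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])             # eigenvalues 5 and 2 (distinct)
eigvals, S = np.linalg.eig(A)          # eigenvectors go in the columns of S
Lam = np.linalg.inv(S) @ A @ S         # S^{-1} A S
print(np.allclose(Lam, np.diag(eigvals)))   # True: Lam is the eigenvalue matrix
```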

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0), with dimensions r and n − r. Applied to A^T: the column space C(A) is the orthogonal complement of N(A^T) in R^m.
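A numerical check of the first statement, using the SVD to get a nullspace basis for a hypothetical rank-one matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank r = 1, n = 3
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
N = Vt[r:].T                           # columns span N(A), dimension n - r = 2
print(r, N.shape[1])                   # 1 2
print(np.allclose(A @ N, 0))           # True: every row of A is perpendicular to N(A)
```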

Lucas numbers
Ln = 2, 1, 3, 4, ... satisfy Ln = Ln−1 + Ln−2 = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L0 = 2 with F0 = 0.
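A pure-Python sketch of both formulas, the recurrence and the eigenvalue powers:

```python
def lucas(n):
    """L_n from the recurrence L_n = L_{n-1} + L_{n-2}, with L_0 = 2, L_1 = 1."""
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([lucas(n) for n in range(8)])   # [2, 1, 3, 4, 7, 11, 18, 29]

# The same numbers come from lambda1^n + lambda2^n, the Fibonacci-matrix eigenvalues
lam1 = (1 + 5 ** 0.5) / 2
lam2 = (1 - 5 ** 0.5) / 2
print(all(abs(lam1 ** n + lam2 ** n - lucas(n)) < 1e-9 for n in range(20)))  # True
```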

Network.
A directed graph that has constants Cl, ... , Cm associated with the edges.

Outer product uv^T
= column times row = rank one matrix.
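A numpy sketch with made-up vectors, confirming the rank:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])
A = np.outer(u, v)                     # 3x2 matrix uv^T
print(A)
print(np.linalg.matrix_rank(A))        # 1: every column is a multiple of u
```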

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |l_ij| ≤ 1. See condition number.
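One elimination step on a made-up 2×2 system: swapping rows makes the pivot 10 instead of 1, so the multiplier is 0.1 rather than 10.

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [10.0, 1.0]])
# Partial pivoting: bring the largest entry of column 0 to the pivot position
if abs(A[1, 0]) > abs(A[0, 0]):
    A = A[[1, 0]]                      # rows swapped; pivot is now 10
m = A[1, 0] / A[0, 0]                  # multiplier l_21 = 0.1, so |l_21| <= 1
A[1] -= m * A[0]                       # eliminate below the pivot
print(m)                               # 0.1
print(A)
```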

Row space C (AT) = all combinations of rows of A.
Column vectors by convention.

Schur complement S = D − CA^{-1}B.
Appears in block elimination on [A B; C D].
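A sketch with made-up blocks, also checking the determinant identity det(M) = det(A) det(S) that block elimination produces:

```python
import numpy as np

A = np.array([[4.0, 0.0], [0.0, 2.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[2.0, 2.0]])
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B       # Schur complement of A: here [[3.5]]
M = np.block([[A, B], [C, D]])         # the full block matrix [A B; C D]
print(np.isclose(np.linalg.det(M), np.linalg.det(A) * S[0, 0]))   # True
```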

Simplex method for linear programming.
The minimum cost vector x * is found by moving from comer to lower cost comer along the edges of the feasible set (where the constraints Ax = b and x > 0 are satisfied). Minimum cost at a comer!

Skewsymmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.
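A small sketch that builds a Toeplitz matrix from its first column and first row (hypothetical values) and checks the constant-diagonal property: the entry depends only on i − j.

```python
import numpy as np

c = [1.0, 2.0, 3.0]      # first column
r = [1.0, 4.0, 5.0]      # first row (r[0] == c[0])
n = len(c)
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(n)]
              for i in range(n)])
print(T)                 # [[1. 4. 5.] [2. 1. 4.] [3. 2. 1.]]
print(all(T[i + 1, j + 1] == T[i, j]
          for i in range(n - 1) for j in range(n - 1)))       # True
```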

Unitary matrix U^H = Ū^T = U^{-1}.
Orthonormal columns (complex analog of Q).

Vector v in R^n.
Sequence of n real numbers v = (v1, ..., vn) = point in R^n.