 3.4.1: Verify Theorem 3.11 for the given matrix A.
 3.4.2: For the given matrix A: (a) Find adj A. (b) Compute det(A). (c) Verify Theorem 3.12;...
 3.4.3: Follow the directions of Exercise 2 for the given matrix A.
 3.4.4: Find the inverse of the matrix in Exercise 2 by the method given in...
 3.4.5: Repeat Exercise 11 of Section 2.3 by the method given in Corollary...
 3.4.6: Prove that if A is a symmetric matrix, then adj A is symmetric.
 3.4.7: Use the method given in Corollary 3.4 to find the inverse, if it exists, of ...
 3.4.8: Prove that if A is a nonsingular upper triangular matrix, then A^-1 ...
 3.4.9: Use the method given in Corollary 3.4 to find the inverse of A = ...
 3.4.10: Use the method given in Corollary 3.4 to find the inverse of the given matrix. [Hint: ...]
 3.4.11: Use the method given in Corollary 3.4 to find the inverse of
 3.4.12: Use the method given in Corollary 3.4 to find the inverse of
 3.4.13: Prove that if A is singular, then adj A is singular. (Hint: First ...)
 3.4.14: Prove that if A is an n × n matrix, then det(adj A) = [det(A)]^(n-1).
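The identity in Exercise 3.4.14 can be spot-checked numerically before attempting the proof. A minimal pure-Python sketch for a sample 3 × 3 matrix (the matrix A and the helper names det2, det3, adj3 are illustrative, not from the text):

```python
# Spot-check det(adj A) = [det(A)]^(n-1) for a sample 3x3 matrix (n - 1 = 2).

def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    return (M[0][0] * det2(M[1][1], M[1][2], M[2][1], M[2][2])
            - M[0][1] * det2(M[1][0], M[1][2], M[2][0], M[2][2])
            + M[0][2] * det2(M[1][0], M[1][1], M[2][0], M[2][1]))

def adj3(M):
    """Adjugate: transpose of the cofactor matrix."""
    C = [[0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            rows = [r for r in range(3) if r != i]
            cols = [c for c in range(3) if c != j]
            minor = det2(M[rows[0]][cols[0]], M[rows[0]][cols[1]],
                         M[rows[1]][cols[0]], M[rows[1]][cols[1]])
            C[i][j] = (-1) ** (i + j) * minor
    return [[C[j][i] for j in range(3)] for i in range(3)]  # transpose

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
assert det3(adj3(A)) == det3(A) ** 2
```

Integer arithmetic keeps the check exact; any nonsingular integer matrix works the same way.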
 3.4.15: Assuming that your software has a command for computing the inverse...
Solutions for Chapter 3.4: Inverse of a Matrix
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780132296540
Chapter 3.4: Inverse of a Matrix includes 15 full step-by-step solutions.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c_(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors in F.
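The circulant construction above can be checked with a short pure-Python sketch (the size n = 3 and the vectors c, x are illustrative): building C = c0 I + c1 S + c2 S^2 from the cyclic shift S makes Cx equal to the circular convolution c * x.

```python
# Build a 3x3 circulant C = c0*I + c1*S + c2*S^2 from the cyclic shift S,
# then check that C x equals the circular convolution c * x.
n = 3
c = [2, 5, 7]
x = [1, 4, 6]

# Cyclic shift S: S[i][j] = 1 when j = (i - 1) mod n
S = [[1 if j == (i - 1) % n else 0 for j in range(n)] for i in range(n)]
I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

C = [[0] * n for _ in range(n)]
Sk = I                       # S^k, starting at S^0 = I
for k in range(n):
    for i in range(n):
        for j in range(n):
            C[i][j] += c[k] * Sk[i][j]
    Sk = matmul(S, Sk)

Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
conv = [sum(c[k] * x[(i - k) % n] for k in range(n)) for i in range(n)]
assert Cx == conv
```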

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
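Both conditions in the definition can be verified together on a small example. A sketch for one hand-picked 2 × 2 matrix (the matrix, eigenvalue 3, and eigenvector (1, 1) are illustrative choices, not from the text):

```python
# For A = [[2, 1], [1, 2]]: lambda = 3 with eigenvector x = (1, 1).
A = [[2, 1], [1, 2]]
lam = 3
x = [1, 1]

# Ax = lambda * x with x != 0
Ax = [A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1]]
assert Ax == [lam * x[0], lam * x[1]]

# det(A - lambda * I) = 0
det = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
assert det == 0
```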

Fourier matrix F.
Entries F_jk = e^(2πijk/n) give orthogonal columns: F̄^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform: y_j = Σ_k c_k e^(2πijk/n).
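The column orthogonality F̄^T F = nI can be confirmed numerically with the standard library's cmath (n = 4 is an illustrative choice):

```python
import cmath

# Fourier matrix F with entries F_jk = e^(2*pi*i*j*k/n).
# Columns are orthogonal: conj(column a) . (column b) = n if a == b else 0.
n = 4
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

for a in range(n):
    for b in range(n):
        dot = sum(F[j][a].conjugate() * F[j][b] for j in range(n))
        expected = n if a == b else 0
        assert abs(dot - expected) < 1e-9
```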

GaussJordan method.
Invert A by row operations on [A I] to reach [I A^-1].
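The method can be sketched directly in pure Python; exact Fraction arithmetic avoids round-off, and the sample matrix and the name gauss_jordan_inverse are illustrative, not from the text:

```python
from fractions import Fraction

def gauss_jordan_inverse(A):
    """Row-reduce [A | I] to [I | A^-1]; raises StopIteration if A is singular."""
    n = len(A)
    # Augmented matrix [A | I] in exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # Find a row with a nonzero pivot and swap it up.
        piv = next(r for r in range(col, n) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        # Scale the pivot row so the pivot equals 1.
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # Eliminate the pivot column from every other row.
        for r in range(n):
            if r != col and M[r][col] != 0:
                f = M[r][col]
                M[r] = [M[r][j] - f * M[col][j] for j in range(2 * n)]
    return [row[n:] for row in M]     # right half is A^-1

A = [[2, 1], [5, 3]]                  # det = 1; inverse is [[3, -1], [-5, 2]]
assert gauss_jordan_inverse(A) == [[3, -1], [-5, 2]]
```

The partial-pivot swap also handles a zero in the pivot position, which plain elimination would miss.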

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
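A small construction makes the convention concrete (the three-edge graph below is an illustrative example, not from the text):

```python
# Edge-node incidence matrix: each row has -1 in the start-node column
# and +1 in the end-node column of its edge.
edges = [(0, 1), (1, 2), (0, 2)]      # (from node i, to node j)
n_nodes = 3
A = [[0] * n_nodes for _ in edges]
for row, (i, j) in enumerate(edges):
    A[row][i] = -1
    A[row][j] = 1

# Every row sums to zero, so the all-ones vector lies in the nullspace.
assert all(sum(row) == 0 for row in A)
assert A[0] == [-1, 1, 0]
```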

Kronecker product (tensor product) A ® B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).

Linearly dependent v1, ..., vn.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Network.
A directed graph that has constants c1, ..., cm associated with the edges.

Orthonormal vectors q1, ..., qn.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n, then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
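The expansion v = Σ (v^T q_j) q_j can be checked with a rotated basis of R^2 (the angle and test vector are illustrative choices):

```python
import math

# Orthonormal basis of R^2 from a rotation by angle t.
t = 0.6
q1 = [math.cos(t), math.sin(t)]
q2 = [-math.sin(t), math.cos(t)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Orthonormality: q_i . q_j = 0 for i != j, q_i . q_i = 1.
assert abs(dot(q1, q2)) < 1e-12
assert abs(dot(q1, q1) - 1) < 1e-12

# Expansion v = (v.q1) q1 + (v.q2) q2 recovers any v.
v = [3.0, -2.0]
recon = [dot(v, q1) * q1[i] + dot(v, q2) * q2[i] for i in range(2)]
assert all(abs(recon[i] - v[i]) < 1e-12 for i in range(2))
```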

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
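The formula can be verified by checking that the error b − p is orthogonal to a (the vectors a and b below are illustrative):

```python
# Projection of b onto the line through a: p = a * (a.b / a.a).
a = [1.0, 2.0, 2.0]
b = [3.0, 0.0, 3.0]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

coeff = dot(a, b) / dot(a, a)          # (a^T b) / (a^T a)
p = [coeff * ai for ai in a]

# The error e = b - p is orthogonal to a.
e = [bi - pi for bi, pi in zip(b, p)]
assert abs(dot(a, e)) < 1e-12
```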

Rank r (A)
= number of pivots = dimension of column space = dimension of row space.

Row space C (AT) = all combinations of rows of A.
Column vectors by convention.

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
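The cyclic property Tr AB = Tr BA holds even when the products differ; a 2 × 2 check (entries are illustrative):

```python
# Tr(AB) = Tr(BA) even though AB != BA.
A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(M):
    return M[0][0] + M[1][1]

AB, BA = matmul(A, B), matmul(B, A)
assert AB != BA
assert trace(AB) == trace(BA)
```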

Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.

Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + ... + c_(n-1) x^(n-1) with p(x_i) = b_i. V_ij = (x_i)^(j-1) and det V = product of (x_k − x_i) for k > i.
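The determinant formula can be checked directly for a 3 × 3 Vandermonde matrix (the sample points xs are an illustrative choice):

```python
# Vandermonde V with V[i][j] = xs[i]**j (0-indexed columns), and the
# identity det V = product of (x_k - x_i) over k > i.
xs = [1, 2, 4]
n = len(xs)
V = [[x ** j for j in range(n)] for x in xs]

def det3(M):
    """Cofactor expansion of a 3x3 determinant along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

prod = 1
for i in range(n):
    for k in range(i + 1, n):
        prod *= xs[k] - xs[i]

assert det3(V) == prod
```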