- 3.4.1: Verify Theorem 3.11 for the given matrix A.
- 3.4.2: For the given matrix A: (a) Find adj A. (b) Compute det(A). (c) Verify Theorem 3.12;...
- 3.4.3: Follow the directions of Exercise 2 for the given matrix.
- 3.4.4: Find the inverse of the matrix in Exercise 2 by the method given in...
- 3.4.5: Repeat Exercise 11 of Section 2.3 by the method given in Corollary...
- 3.4.6: Prove that if A is a symmetric matrix, then adj A is symmetric.
- 3.4.7: Use the method given in Corollary 3.4 to find the inverse, if it exists, of the given matrix.
- 3.4.8: Prove that if A is a nonsingular upper triangular matrix, then A^(-1) is upper triangular.
- 3.4.9: Use the method given in Corollary 3.4 to find the inverse of the given matrix A.
- 3.4.10: Use the method given in Corollary 3.4 to find the inverse of the given matrix.
- 3.4.11: Use the method given in Corollary 3.4 to find the inverse of the given matrix.
- 3.4.12: Use the method given in Corollary 3.4 to find the inverse of the given matrix.
- 3.4.13: Prove that if A is singular, then adj A is singular. (Hint: First ...)
- 3.4.14: Prove that if A is an n × n matrix, then det(adj A) = [det(A)]^(n-1).
- 3.4.15: Assuming that your software has a command for computing the inverse...
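The identity in Exercise 3.4.14, det(adj A) = [det(A)]^(n-1), is easy to spot-check with software. A minimal numpy sketch (the 3 × 3 matrix is an arbitrary invertible example; adj A is formed as det(A)·A^(-1), which is valid for invertible A):

```python
import numpy as np

# Spot-check det(adj A) = det(A)^(n-1) on an invertible example matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
n = A.shape[0]

# For invertible A, adj A = det(A) * A^(-1).
adjA = np.linalg.det(A) * np.linalg.inv(A)

lhs = np.linalg.det(adjA)
rhs = np.linalg.det(A) ** (n - 1)
print(np.isclose(lhs, rhs))  # True
```

A numeric check like this does not prove the identity (that is the exercise), but it catches a wrong exponent immediately.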
Solutions for Chapter 3.4: Inverse of a Matrix
Full solutions for Elementary Linear Algebra with Applications | 9th Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
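The rank criterion above can be checked directly; a short numpy sketch with an assumed rank-1 example matrix, comparing rank([A b]) to rank(A) for a solvable and an unsolvable right side:

```python
import numpy as np

# Ax = b is solvable exactly when rank([A b]) == rank(A),
# i.e. when b lies in the column space of A.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [1.0, 2.0]])                 # rank 1: both columns are multiples of (1, 2, 1)
b_in  = np.array([[3.0], [6.0], [3.0]])    # in the column space
b_out = np.array([[1.0], [0.0], [0.0]])    # not in the column space

rank = np.linalg.matrix_rank
print(rank(np.hstack([A, b_in]))  == rank(A))   # True  -> solvable
print(rank(np.hstack([A, b_out])) == rank(A))   # False -> no solution
```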
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
Change of basis matrix M.
The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c1v1 + ... + cnvn = d1w1 + ... + dnwn are related by d = Mc. (For n = 2, set v1 = m11w1 + m21w2, v2 = m12w1 + m22w2.)
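In matrix form, v_j = Σ m_ij w_i says V = WM, where the columns of V and W are the old and new basis vectors. A minimal numpy sketch (the bases here are arbitrary examples) verifying that c and d = Mc describe the same vector:

```python
import numpy as np

# Change of basis: old basis V relates to new basis W by V = W M,
# so x = V c = W d forces d = M c.
W = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # new basis w1, w2 as columns
M = np.array([[2.0, 1.0],
              [1.0, 1.0]])      # v_j = sum_i m_ij w_i
V = W @ M                       # old basis v1, v2 as columns

c = np.array([3.0, -1.0])       # coordinates in the old basis
d = M @ c                       # coordinates in the new basis
print(np.allclose(V @ c, W @ d))  # True: same vector x
```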
Companion matrix C.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n-1) − λ^n).
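The characteristic polynomial of a companion matrix can be read off directly; a numpy sketch with assumed coefficients c1 = 6, c2 = −11, c3 = 6 (chosen so the polynomial factors as (λ−1)(λ−2)(λ−3)):

```python
import numpy as np

# 3x3 companion matrix: ones just above the diagonal, c1..c3 in the last row.
c1, c2, c3 = 6.0, -11.0, 6.0    # example coefficients (assumed)
A = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [c1,  c2,  c3]])

# np.poly(A) returns characteristic-polynomial coefficients, highest power first:
# lambda^3 - c3*lambda^2 - c2*lambda - c1
coeffs = np.poly(A)
print(coeffs)   # [ 1. -6. 11. -6.]
```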
Condition number cond(A).
cond(A) = c(A) = ||A|| ||A^(-1)|| = σmax/σmin. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to change in the input.
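The error bound is easy to observe on a badly conditioned system; a numpy sketch with an assumed nearly singular 2 × 2 example, using the 2-norm throughout:

```python
import numpy as np

# ||dx|| / ||x||  <=  cond(A) * ||db|| / ||b||
A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])        # nearly singular -> large cond(A)
b  = np.array([2.0, 2.0001])
db = np.array([0.0, 0.0001])         # a tiny perturbation of b

x  = np.linalg.solve(A, b)
dx = np.linalg.solve(A, b + db) - x  # the (large) resulting change in x

cond = np.linalg.cond(A)             # sigma_max / sigma_min in the 2-norm
lhs = np.linalg.norm(dx) / np.linalg.norm(x)
rhs = cond * np.linalg.norm(db) / np.linalg.norm(b)
print(lhs <= rhs)                    # True: the bound holds
```

Here a relative change of about 3·10^-5 in b produces a relative change of order 1 in x, exactly the amplification cond(A) permits.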
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fnx and Fn^(-1)c can be computed with nℓ/2 multiplications. Revolutionary.
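The FFT computes the same product Fnx as explicit multiplication by the Fourier matrix, just in O(n log n) operations; a numpy sketch comparing the two (the sign convention exp(−2πi jk/n) matches numpy's `fft`):

```python
import numpy as np

# Fourier matrix F with F[j, k] = exp(-2*pi*i*j*k / n), compared against np.fft.fft.
np.random.seed(0)
n = 8
x = np.random.rand(n)
j, k = np.arange(n), np.arange(n)
F = np.exp(-2j * np.pi * np.outer(j, k) / n)

print(np.allclose(F @ x, np.fft.fft(x)))   # True: same transform
```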
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of Rm. Full rank means full column rank or full row rank.
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.
Length II x II.
Square root of x^T x (Pythagoras in n dimensions).
Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
Right inverse A+.
If A has full row rank m, then A+ = A^T (AA^T)^(-1) has AA+ = Im.
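The right-inverse formula can be verified directly; a numpy sketch with an assumed 2 × 3 full-row-rank example:

```python
import numpy as np

# Right inverse for full row rank m: A+ = A^T (A A^T)^(-1), so A A+ = I_m.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])       # 2x3 with rank 2 (full row rank)

A_plus = A.T @ np.linalg.inv(A @ A.T) # AA^T is 2x2 and invertible
print(np.allclose(A @ A_plus, np.eye(2)))   # True
```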
Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
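Any A = R^T R is positive semidefinite, since x^T Ax = ||Rx||^2 ≥ 0; a numpy sketch with an assumed 2 × 3 R (so A is singular, hence only semidefinite), checking the eigenvalues:

```python
import numpy as np

# A = R^T R is symmetric positive semidefinite: every eigenvalue is >= 0.
R = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])       # rank 2, so the 3x3 matrix A below is singular
A = R.T @ R

eigvals = np.linalg.eigvalsh(A)       # eigvalsh: eigenvalues of a symmetric matrix
print(np.all(eigvals >= -1e-12))      # True: no negative eigenvalues (up to roundoff)
```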
Special solutions to As = O.
One free variable is si = 1, other free variables = 0.
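For a matrix already in reduced echelon form, the special solutions can be written down by hand and checked; a numpy sketch with an assumed 2 × 4 example whose pivots sit in columns 1 and 3 (so x2 and x4 are free):

```python
import numpy as np

# A is in reduced echelon form: pivot columns 1 and 3, free columns 2 and 4.
A = np.array([[1.0, 2.0, 0.0, 3.0],
              [0.0, 0.0, 1.0, 4.0]])

# Special solutions: set one free variable to 1, the others to 0,
# then back-substitute for the pivot variables.
s1 = np.array([-2.0, 1.0,  0.0, 0.0])   # free variable x2 = 1
s2 = np.array([-3.0, 0.0, -4.0, 1.0])   # free variable x4 = 1

print(np.allclose(A @ s1, 0), np.allclose(A @ s2, 0))   # True True
```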
Toeplitz matrix T.
Constant down each diagonal = time-invariant (shift-invariant) filter.
Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms, ||A + B|| ≤ ||A|| + ||B||.
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c0 + ... + c_{n-1}x^(n-1) with p(xi) = bi. Vij = (xi)^(j-1) and det V = product of (xk − xi) for k > i.
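Both facts (interpolation via Vc = b and the product formula for det V) can be checked numerically; a numpy sketch with assumed points x = 1, 2, 3 and values taken from p(x) = 1 + x^2:

```python
import numpy as np

# Vandermonde interpolation: V[i, j] = x_i**j, solve V c = b for p(x_i) = b_i.
x = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 5.0, 10.0])            # values of p(x) = 1 + x^2 at x

V = np.vander(x, increasing=True)         # columns 1, x, x^2
c = np.linalg.solve(V, b)
print(np.allclose(c, [1.0, 0.0, 1.0]))    # True: recovers p(x) = 1 + x^2

# det V = product of (x_k - x_i) over k > i
detV = np.prod([x[k] - x[i] for k in range(len(x)) for i in range(k)])
print(np.isclose(np.linalg.det(V), detV)) # True
```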
Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.