3.5.1: If possible, solve the following linear systems by Cramer's rule:
3.5.2: Repeat Exercise 1 for the linear system
3.5.3: Solve the following linear system for x3 by Cramer's rule:
3.5.4: Repeat Exercise 5 of Section 2.2; use Cramer's rule.
3.5.5: Repeat Exercise 1 for the following linear system: 2x1 − x2 + 3x3 ...
3.5.6: Repeat Exercise 6(b) of Section 2.2; use Cramer's rule.
3.5.7: Repeat Exercise 1 for the following linear systems: x1 + 3x2 + 7x3 ...
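All of the exercises above apply Cramer's rule. A minimal sketch of the method in Python; the 2×2 system here is invented for illustration and is not one of the textbook's systems:

```python
import numpy as np

# Cramer's rule: x_i = det(A_i) / det(A), where A_i is A with
# column i replaced by the right-hand side b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

d = np.linalg.det(A)
assert abs(d) > 1e-12        # Cramer's rule needs det(A) != 0

x = np.empty(len(b))
for i in range(len(b)):
    Ai = A.copy()
    Ai[:, i] = b             # replace column i by b
    x[i] = np.linalg.det(Ai) / d
```

For this system det(A) = 5, so the rule applies and x solves Ax = b exactly.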
Solutions for Chapter 3.5: Other Applications of Determinants
Full solutions for Elementary Linear Algebra with Applications, 9th Edition
ISBN: 9780471669593
Since the 7 problems in Chapter 3.5: Other Applications of Determinants have been answered, more than 9442 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for Elementary Linear Algebra with Applications, 9th edition, ISBN 9780471669593, and covers each chapter and its solutions.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
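A minimal sketch of this definition in numpy; the example graph is invented for illustration:

```python
import numpy as np

# Adjacency matrix of a directed graph on nodes 0..3:
# A[i, j] = 1 when there is an edge from node i to node j.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1

undirected = np.array_equal(A, A.T)  # A = A^T only when edges go both ways
walks2 = A @ A                       # (A^2)[i, j] counts walks of length 2 from i to j
```

Powers of A count walks: here walks2[0, 2] = 1 for the single length-2 walk 0 → 1 → 2.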

Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
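A small check of this statement for an invented 2×2 example, where the characteristic polynomial is λ² − trace(A)·λ + det(A):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Roots of the characteristic polynomial det(A - lambda*I) = 0:
# for 2x2, lambda^2 - trace(A)*lambda + det(A) = 0.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs))

eigs = np.sort(np.linalg.eigvals(A))  # the same values from the library routine
```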

Cholesky factorization
A = CC^T = (L√D)(L√D)^T for positive definite A.
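A hedged sketch using numpy's built-in routine, with an invented positive definite matrix (numpy returns the lower triangular factor):

```python
import numpy as np

# Cholesky factorization A = C C^T with C lower triangular,
# valid when A is symmetric positive definite.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
C = np.linalg.cholesky(A)   # lower triangular factor
```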

Distributive Law
A(B + C) = AB + AC. Add then multiply, or multiply then add.

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn^−1 c can be computed with nℓ/2 multiplications. Revolutionary.
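A small numeric check that the FFT gives the same result as multiplying by the Fourier matrix directly; the input vector is invented for illustration:

```python
import numpy as np

n = 8
x = np.arange(n, dtype=float)

# Direct DFT: Fourier matrix F with entries w^(jk), w = exp(-2*pi*i/n).
# This costs n^2 multiplications.
w = np.exp(-2j * np.pi / n)
F = w ** np.outer(np.arange(n), np.arange(n))
direct = F @ x

fast = np.fft.fft(x)   # the same transform in O(n log n) via the FFT
```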

Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H for complex A.

Free variable Xi.
Column i has no pivot in elimination. We can give the n − r free variables any values; then Ax = b determines the r pivot variables (if solvable!).
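A minimal sketch of this idea on an invented 2×3 system already in echelon form, where column 3 has no pivot so x3 is free:

```python
import numpy as np

# Ax = b with n = 3 unknowns and rank r = 2: one free variable (x3).
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
b = np.array([4.0, 3.0])

x3 = 5.0                      # any value may be assigned to the free variable
x1 = b[0] - 2.0 * x3          # the pivot variables are then determined
x2 = b[1] - 1.0 * x3
x = np.array([x1, x2, x3])
```

Changing x3 gives a different particular solution, tracing out the full solution set.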

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^−1].
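A hedged sketch of the method on an invented 2×2 matrix; for brevity it assumes the pivots are nonzero, so there is no row exchange step:

```python
import numpy as np

# Gauss-Jordan: row-reduce the augmented matrix [A I] until the
# left half becomes I; the right half is then A^{-1}.
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
n = A.shape[0]
M = np.hstack([A, np.eye(n)])        # the augmented matrix [A I]

for col in range(n):
    M[col] /= M[col, col]            # scale the pivot row (pivot assumed nonzero)
    for row in range(n):
        if row != col:
            M[row] -= M[row, col] * M[col]   # clear the rest of column col

A_inv = M[:, n:]
```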

Jordan form J = M^−1 AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λk Ik + Nk where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.

Multiplier ℓij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
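One elimination step on an invented 2×2 matrix, following this definition:

```python
import numpy as np

# Multiplier l_21 = (entry to eliminate) / (1st pivot);
# pivot row 1 times l_21 is subtracted from row 2.
A = np.array([[2.0, 4.0],
              [3.0, 7.0]])

l21 = A[1, 0] / A[0, 0]     # 3 / 2
A[1] -= l21 * A[0]          # eliminates the (2, 1) entry
```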

Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
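A tiny check on an invented example of the triangular-with-zero-diagonal kind:

```python
import numpy as np

# N is nilpotent: strictly upper triangular, so some power is zero.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
N2 = N @ N                        # already the zero matrix for this 2x2 N
eigs = np.linalg.eigvals(N)       # every eigenvalue is 0
```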

Normal matrix.
If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.

Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^−1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
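A small check of the expansion v = Σ (v^T qj) qj, using orthonormal columns obtained from QR factorization of an invented matrix:

```python
import numpy as np

# Orthonormal columns from QR: Q^T Q = I, and for square Q, Q^T = Q^{-1}.
Q, _ = np.linalg.qr(np.array([[1.0, 2.0],
                              [3.0, 4.0]]))

v = np.array([1.0, 5.0])
coeffs = Q.T @ v            # the coefficients v^T q_j
v_rebuilt = Q @ coeffs      # the sum of (v^T q_j) q_j recovers v
```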

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
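The formulas above, checked on invented vectors a and b:

```python
import numpy as np

# Projection of b onto the line through a: p = a (a^T b / a^T a).
a = np.array([1.0, 2.0])
b = np.array([3.0, 3.0])

p = a * (a @ b) / (a @ a)        # the projected vector
P = np.outer(a, a) / (a @ a)     # the rank-1 projection matrix, P b = p
```

P is idempotent (P² = P), as every projection matrix must be.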

Rotation matrix
R = [c −s; s c] rotates the plane by θ and R^−1 = R^T rotates back by −θ. Eigenvalues are e^iθ and e^−iθ, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
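A numeric check of these properties for one invented angle:

```python
import numpy as np

# Rotation by theta: R = [[c, -s], [s, c]], c = cos(theta), s = sin(theta).
theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])

eigs = np.linalg.eigvals(R)   # e^{i theta} and e^{-i theta}
```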

Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Unitary matrix U^H = Ū^T = U^−1.
Orthonormal columns (complex analog of Q).