 3.5.1: Give the dimensions of the four fundamental subspaces of the incide...
 3.5.2: Let A denote the incidence matrix of the disconnected graph shown i...
 3.5.3: Give the dimensions of the four fundamental subspaces of the incide...
 3.5.4: a. Show that in a graph with n nodes and n edges, there must be a l...
 3.5.5: Ohm's Law says that V = IR; that is, voltage (in volts) = current (i...
 3.5.6: Use the approach of Exercise 5 to obtain the answer to Example 5 in...
Solutions for Chapter 3.5: A Graphic Example
Full solutions for Linear Algebra: A Geometric Approach  2nd Edition
ISBN: 9781429215213
Since 6 problems in chapter 3.5: A Graphic Example have been answered, more than 4447 students have viewed full step-by-step solutions from this chapter. Chapter 3.5: A Graphic Example includes 6 full step-by-step solutions. This expansive textbook survival guide covers the following chapters and their solutions. Linear Algebra: A Geometric Approach is associated to the ISBN: 9781429215213. This textbook survival guide was created for the textbook: Linear Algebra: A Geometric Approach, edition: 2.

Companion matrix.
Put c1, …, cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ⋯ + cnλ^(n−1) − λ^n).
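A minimal NumPy sketch of this definition; the coefficients c1..c4 are assumed example values, not from the text. It builds the companion matrix and checks that its eigenvalues are the roots of the stated polynomial.

```python
import numpy as np

# Assumed example coefficients c1..c4.
c = np.array([2.0, 3.0, 5.0, 7.0])
n = len(c)

A = np.zeros((n, n))
A[np.arange(n - 1), np.arange(1, n)] = 1.0  # n-1 ones just above the diagonal
A[-1, :] = c                                # c1..cn in row n

# Eigenvalues of A are the roots of c1 + c2*t + ... + cn*t^(n-1) - t^n:
poly = np.concatenate(([-1.0], c[::-1]))    # coefficients, highest power first
roots = np.sort_complex(np.roots(poly))
eigs = np.sort_complex(np.linalg.eigvals(A))
print(np.allclose(roots, eigs))
```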

Cross product u xv in R3:
Vector perpendicular to u and v, length ‖u‖‖v‖|sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
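A quick check of both properties with NumPy; the vectors u and v are assumed examples.

```python
import numpy as np

# Assumed example vectors.
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 0.0, 1.0])

w = np.cross(u, v)            # u x v = (2, 11, -8)
print(w @ u, w @ v)           # both 0: perpendicular to u and to v

# ||u x v|| equals the parallelogram area ||u|| ||v|| |sin(theta)|:
cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_t**2)
print(np.isclose(np.linalg.norm(w), area))
```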

Dot product = Inner product x^T y = x1y1 + ⋯ + xnyn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A)·(column j of B).
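A short NumPy illustration with assumed example vectors, including the conjugated complex dot product.

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])   # assumed example vectors
y = np.array([2.0, -1.0, 0.0])
print(x @ y)                    # 1*2 + 2*(-1) + 2*0 = 0, so x is perpendicular to y

# The complex dot product conjugates the first vector (np.vdot does this):
z = np.array([1 + 1j, 2j])
print(np.vdot(z, z))            # (6+0j): squared length, always real
```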

Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
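Both conditions can be verified numerically; the matrix A below is an assumed example.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])              # assumed example matrix

lam, X = np.linalg.eig(A)               # eigenvalues 1 and 3
for l, x in zip(lam, X.T):              # columns of X are eigenvectors
    assert np.allclose(A @ x, l * x)    # Ax = lambda x, with x nonzero

# det(A - lambda I) = 0 at each eigenvalue:
print(np.allclose([np.linalg.det(A - l * np.eye(2)) for l in lam], 0.0))
```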

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
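A sketch of one elimination step on an assumed 2 × 2 example.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])    # assumed example

l21 = A[1, 0] / A[0, 0]       # multiplier l21 = 2
E21 = np.eye(2)
E21[1, 0] = -l21              # identity with an extra -l21 in the (2,1) entry

# E21 @ A subtracts l21 times row 1 from row 2, zeroing the entry below the pivot:
print(E21 @ A)                # rows [2, 1] and [0, 3]
```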

Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, …, A^(j−1)b. Numerical methods approximate A^(−1)b by x_j with residual b − Ax_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
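A sketch of the idea with an assumed symmetric A and right side b (here j = n = 3, so the Krylov basis spans the whole space and the least-squares solve is exact).

```python
import numpy as np

# Assumed example system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 0.0, 0.0])
j = 3

# Build the basis b, Ab, A^2 b -- one multiplication by A per step:
cols = [b]
for _ in range(j - 1):
    cols.append(A @ cols[-1])
K = np.column_stack(cols)

# Approximate A^{-1} b by x_j = K c, minimizing the residual ||b - A x_j||:
c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
x_j = K @ c
print(np.allclose(A @ x_j, b))   # exact here, since this Krylov basis spans R^3
```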

Linear combination cv + dw or Σ cjvj.
Vector addition and scalar multiplication.

Network.
A directed graph that has constants c1, …, cm associated with the edges.

Normal equation A^T Ax̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
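A small least-squares sketch (the line-fitting data below is an assumed example): solving the normal equations and checking that the columns of A are perpendicular to the residual.

```python
import numpy as np

# Assumed data: fit a line C + D*t to three points by least squares.
t = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])    # independent columns, rank n = 2

x_hat = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations A^T A x = A^T b

# Columns of A are perpendicular to the residual b - A x_hat:
print(np.allclose(A.T @ (b - A @ x_hat), 0.0))
# Same answer as NumPy's least-squares solver:
print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))
```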

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(−1). Preserves length and angles: ‖Qx‖ = ‖x‖ and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
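The rotation example can be checked directly; the angle is an assumed value.

```python
import numpy as np

theta = 0.3                                        # assumed angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])    # rotation matrix

print(np.allclose(Q.T @ Q, np.eye(2)))             # Q^T = Q^{-1}
x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # length preserved
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0))        # all |lambda| = 1
```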

Outer product uv^T
= column times row = rank one matrix.

Pseudoinverse A^+ (Moore–Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+A and AA^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
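NumPy computes the pseudoinverse directly; the rank-one matrix below is an assumed example.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])          # assumed rank-one example, m = 3, n = 2

A_plus = np.linalg.pinv(A)          # the n by m pseudoinverse

# A+ "inverts" A between row space and column space:
print(np.allclose(A @ A_plus @ A, A))
print(np.allclose(A_plus @ A @ A_plus, A_plus))
print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))
```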

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

Right inverse A^+.
If A has full row rank m, then A^+ = A^T(AA^T)^(−1) has AA^+ = I_m.
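The formula can be checked on an assumed full-row-rank example.

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])          # assumed example: full row rank, m = 2

A_plus = A.T @ np.linalg.inv(A @ A.T)    # right inverse A^T (A A^T)^{-1}
print(np.allclose(A @ A_plus, np.eye(2)))   # A A+ = I_m
```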

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
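A numerical check on an assumed 3 × 3 example (uses SciPy's matrix exponential, which I assume is available).

```python
import numpy as np
from scipy.linalg import expm

K = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])   # K^T = -K (assumed example)

print(np.allclose(K.T, -K))
print(np.allclose(np.linalg.eigvals(K).real, 0.0))   # pure imaginary eigenvalues

Q = expm(0.5 * K)                    # e^{Kt} at t = 0.5
print(np.allclose(Q.T @ Q, np.eye(3)))               # orthogonal
```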

Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.

Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms ‖A + B‖ ≤ ‖A‖ + ‖B‖.

Vector space V.
Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.