
# Solutions for Chapter 1.3: Applications of Systems of Linear Equations

## Full solutions for Elementary Linear Algebra | 6th Edition

ISBN: 9780618783762


This textbook survival guide was created for Elementary Linear Algebra, 6th edition. Chapter 1.3: Applications of Systems of Linear Equations includes 36 full step-by-step solutions, and more than 18,149 students have viewed them. Elementary Linear Algebra is associated with the ISBN 9780618783762.

## Key Math Terms and Definitions Covered in This Textbook
• Determinant |A| = det(A).

Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^T| = |A|.
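These determinant properties are easy to verify numerically. A small sketch of my own (not from the textbook), using NumPy:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[0.0, 1.0], [4.0, 2.0]])

# det I = 1
assert np.isclose(np.linalg.det(np.eye(2)), 1.0)
# Exchanging two rows reverses the sign
A_swapped = A[[1, 0], :]
assert np.isclose(np.linalg.det(A_swapped), -np.linalg.det(A))
# Product rule |AB| = |A| |B|
assert np.isclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B))
# |A^T| = |A|
assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))
```

The matrices A and B here are arbitrary examples; any square matrices of the same size would satisfy the same identities.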

• Distributive Law

A(B + C) = AB + AC. Add then multiply, or multiply then add.

• Exponential e^(At) = I + At + (At)^2/2! + ...

has derivative Ae^(At); e^(At) u(0) solves u' = Au.
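The series definition can be summed directly. This sketch (my own, not the book's) truncates the series and checks it against the exact exponential of a diagonal matrix, where e^(At) is just the entrywise exponential of the diagonal:

```python
import numpy as np

def expm_series(A, t, terms=30):
    """Truncated matrix-exponential series I + At + (At)^2/2! + ..."""
    result = np.eye(A.shape[0])
    term = np.eye(A.shape[0])
    for k in range(1, terms):
        term = term @ (A * t) / k      # builds (At)^k / k! incrementally
        result = result + term
    return result

A = np.diag([1.0, -2.0])
t = 0.5
approx = expm_series(A, t)
exact = np.diag(np.exp(np.diag(A) * t))   # exact e^(At) for diagonal A
assert np.allclose(approx, exact)
```

Truncating at 30 terms is plenty for a small matrix; in practice a library routine such as SciPy's `expm` would be used instead of a raw series.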

• Fourier matrix F.

Entries F_jk = e^(2πijk/n) give orthogonal columns: conj(F)^T F = nI. Then y = Fc is the (inverse) Discrete Fourier Transform: y_j = Σ c_k e^(2πijk/n).
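The orthogonality of the columns can be checked directly. A short illustration of my own (not from the textbook):

```python
import numpy as np

# Build the n-by-n Fourier matrix with entries F_jk = e^(2*pi*i*j*k/n)
n = 8
j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(2j * np.pi * j * k / n)

# Orthogonal columns: conj(F)^T F = n I
assert np.allclose(F.conj().T @ F, n * np.eye(n))
```

Note the conjugate transpose: the columns are orthogonal under the complex inner product, and each has squared length n.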

• Free columns of A.

Columns without pivots; these are combinations of earlier columns.

• Hypercube matrix pl.

Row n + 1 counts corners, edges, faces, ... of a cube in Rn.

• Markov matrix M.

All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector s with Ms = s > 0.
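Convergence to the steady state can be seen by repeatedly applying M. A sketch of my own (the matrix below is an arbitrary example, not from the book):

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])          # all entries > 0, columns sum to 1
assert np.allclose(M.sum(axis=0), 1.0)

x = np.array([1.0, 0.0])            # any starting probability vector
for _ in range(100):
    x = M @ x                       # columns of M^k pull x toward s

assert np.allclose(M @ x, x)        # converged: Ms = s
assert np.all(x > 0)                # steady state is strictly positive
```

The second eigenvalue here is 0.5, so each step halves the distance to the steady state (0.6, 0.4).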

• Normal matrix.

If N N^H = N^H N, then N has orthonormal (complex) eigenvectors.

• Particular solution x p.

Any solution to Ax = b; often x_p has free variables = 0.

• Pascal matrix

Ps = pascal(n) = the symmetric matrix with binomial entries C(i + j − 2, i − 1). Ps = P_L P_U; all contain Pascal's triangle, with det = 1 (see Pascal in the index).
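The factorization and the determinant are quick to verify. A sketch of my own (not from the textbook), building the matrices from binomial coefficients:

```python
import numpy as np
from math import comb

n = 4
# Symmetric Pascal matrix: entry (i, j) is C(i + j - 2, i - 1), 1-based
Ps = np.array([[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
               for i in range(1, n + 1)], dtype=float)
# Lower-triangular Pascal matrix: rows of Pascal's triangle
P_L = np.array([[comb(i - 1, j - 1) for j in range(1, n + 1)]
                for i in range(1, n + 1)], dtype=float)

# Ps = P_L P_U, where P_U = P_L^T (Vandermonde's identity)
assert np.allclose(P_L @ P_L.T, Ps)
# P_L has ones on its diagonal, so det(Ps) = 1
assert np.isclose(np.linalg.det(Ps), 1.0)
```

The `Ps = P_L @ P_L.T` check works because the (i, j) entry of the product is Σ_k C(i−1, k)C(j−1, k) = C(i + j − 2, i − 1).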

• Pseudoinverse A+ (Moore-Penrose inverse).

The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
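NumPy's `np.linalg.pinv` computes the Moore–Penrose pseudoinverse. A sketch of my own (the rank-one matrix below is an arbitrary example) checking the projection properties:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1, so not invertible
Aplus = np.linalg.pinv(A)

P_row = Aplus @ A                   # projection onto the row space
P_col = A @ Aplus                   # projection onto the column space
assert np.allclose(P_row @ P_row, P_row)   # projections are idempotent
assert np.allclose(P_col @ P_col, P_col)
assert np.allclose(A @ Aplus @ A, A)       # Penrose condition A A+ A = A
assert np.linalg.matrix_rank(Aplus) == np.linalg.matrix_rank(A)
```

For an invertible square A, the pseudoinverse reduces to the ordinary inverse and both projections become the identity.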

• Rank one matrix A = uv^T ≠ 0.

Column and row spaces = lines cu and cv.

• Schur complement S = D − C A^(-1) B.

Appears in block elimination on [[A, B], [C, D]].
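Block elimination leaves S in the (2, 2) block, so det of the block matrix is det(A) · det(S). A numerical sketch of my own (the blocks are arbitrary examples):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 1.0]])
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B    # Schur complement of A
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M),
                  np.linalg.det(A) * np.linalg.det(S))
```

Here S = 5, det(A) = 6, and both sides of the assertion equal 30.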

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). The minimum cost occurs at a corner!

• Spanning set.

Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

• Symmetric matrix A.

The transpose is A^T = A, and a_ij = a_ji. A^(-1) is also symmetric.

• Unitary matrix U^H = conj(U)^T = U^(-1).

Orthonormal columns (complex analog of Q).

• Vector addition.

v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.

• Vector space V.

Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.

• Vector v in Rn.

Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.
