
# Solutions for Chapter 4.5: Linear Dependence and Linear Independence

## Full solutions for Differential Equations | 4th Edition

ISBN: 9780321964670


This textbook survival guide covers the listed chapters and their solutions, and was created for the textbook Differential Equations, edition 4, associated with ISBN 9780321964670. Since the 54 problems in chapter 4.5: Linear Dependence and Linear Independence have been answered, more than 21380 students have viewed full step-by-step solutions from this chapter. Chapter 4.5 includes 54 full step-by-step solutions.

## Key Math Terms and Definitions Covered in This Textbook
• Back substitution.

Upper triangular systems are solved in reverse order, x_n back to x_1.
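The definition above can be sketched in Python. This is a minimal illustration (the function name and 2×2 example are hypothetical, not from the textbook): starting from the last equation, each unknown is found once all the later unknowns are known.

```python
def back_substitute(U, b):
    """Solve Ux = b for upper triangular U (given as lists of lists)."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # reverse order: x_n down to x_1
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x

# 2x2 example: the last equation gives x2 = 4/2 = 2, then x1 = (3 - 1*2)/1 = 1
print(back_substitute([[1.0, 1.0], [0.0, 2.0]], [3.0, 4.0]))  # → [1.0, 2.0]
```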

• Big formula for n by n determinants.

det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A — rows in order 1, ..., n and column order given by a permutation P. Each of the n! permutations carries a + or − sign.
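The big formula can be written out directly for small matrices. A minimal sketch (function names are hypothetical), summing sign(P) times one entry from each row and column over all n! permutations:

```python
from itertools import permutations

def perm_sign(p):
    """Sign of a permutation: +1 if even, -1 if odd (via inversion count)."""
    inv = sum(1 for i in range(len(p))
                for j in range(i + 1, len(p)) if p[i] > p[j])
    return -1 if inv % 2 else 1

def det_big_formula(A):
    """Sum over all n! permutations: sign(p) * A[0][p0] * ... * A[n-1][p_{n-1}]."""
    n = len(A)
    total = 0
    for p in permutations(range(n)):
        prod = 1
        for i in range(n):
            prod *= A[i][p[i]]
        total += perm_sign(p) * prod
    return total

# 2x2 check: 1*4 - 2*3 = -2
print(det_big_formula([[1, 2], [3, 4]]))  # → -2
```

Note the n! growth makes this impractical beyond tiny n; it is stated here for the formula, not as an algorithm.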

• Cayley-Hamilton Theorem.

p(λ) = det(A − λI) satisfies p(A) = zero matrix.

• Change of basis matrix M.

The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = Mc. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)

• Conjugate gradient method.

A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½xᵀAx − xᵀb over growing Krylov subspaces.

• Cyclic shift S.

Permutation with s_21 = 1, s_32 = 1, ..., finally s_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F.
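The eigenvalue claim can be checked numerically. A minimal sketch (variable names are illustrative; the Fourier-column sign convention shown here is one common choice, matching this S): the shift matrix applied to the vector (1, λ⁻¹, λ⁻², ...) reproduces that vector times λ = e^(2πik/n).

```python
import cmath

n = 4
# Cyclic shift S: entry (i, j) is 1 when i = j+1 mod n, so (Sv)_i = v_{i-1 mod n}.
S = [[1 if i == (j + 1) % n else 0 for j in range(n)] for i in range(n)]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

k = 1
lam = cmath.exp(2j * cmath.pi * k / n)      # an nth root of 1
f = [lam ** (-j) for j in range(n)]         # Fourier-type eigenvector (1, λ⁻¹, λ⁻², λ⁻³)
Sf = matvec(S, f)
ok = all(abs(Sf[i] - lam * f[i]) < 1e-12 for i in range(n))
print(ok)  # → True
```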

• Fundamental Theorem.

The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (perpendicularity comes from Ax = 0), with dimensions r and n − r. Applied to Aᵀ: the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.

• Left nullspace N (AT).

Nullspace of Aᵀ = "left nullspace" of A, because yᵀA = 0ᵀ.

• Nilpotent matrix N.

Some power of N is the zero matrix, Nᵏ = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
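The triangular-with-zero-diagonal example above is easy to verify directly. A minimal sketch (the 3×3 matrix below is an illustrative choice): squaring pushes the nonzero band further from the diagonal until it falls off, so N³ = 0 here.

```python
def matmul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Strictly upper triangular 3x3 matrix: zero diagonal, so it is nilpotent.
N = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
N2 = matmul(N, N)
N3 = matmul(N2, N)
print(N3)  # → [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```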

• Pascal matrix

P_S = pascal(n) = the symmetric matrix with binomial entries C(i + j − 2, i − 1). P_S = P_L P_U; all contain Pascal's triangle, with det = 1 (see Pascal in the index).

• Permutation matrix P.

There are n! orders of 1, ..., n. The n! permutation matrices have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges needed to reach I.

• Projection p = a(aᵀb/aᵀa) onto the line through a.

P = aaᵀ/aᵀa has rank 1.
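The projection formula amounts to one scalar ratio. A minimal sketch (function name is hypothetical): compute c = aᵀb/aᵀa and scale a by it.

```python
def project_onto_line(a, b):
    """Project b onto the line through a: p = a * (aᵀb / aᵀa)."""
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    c = dot(a, b) / dot(a, a)   # the scalar aᵀb / aᵀa
    return [c * x for x in a]

# Projecting (3, 4) onto the x-axis direction (1, 0) keeps only the first component.
print(project_onto_line([1.0, 0.0], [3.0, 4.0]))  # → [3.0, 0.0]
```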

• Saddle point of f(x_1, ..., x_n).

A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.

• Simplex method for linear programming.

The minimum-cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

• Special solutions to As = O.

One free variable is s_i = 1, other free variables = 0.

• Subspace S of V.

Any vector space inside V, including V and Z = {zero vector only}.

• Sum V + W of subspaces.

Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

• Symmetric matrix A.

The transpose is Aᵀ = A, and a_ij = a_ji. A⁻¹ is also symmetric.

• Vector space V.

Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.

• Wavelets Wjk(t).

Stretch and shift the time axis to create w_jk(t) = w_00(2^j t − k).
