# Solutions for Chapter 1.3: Classification of Differential Equations

## Full solutions for Elementary Differential Equations | 10th Edition

ISBN: 9780470458327


This textbook survival guide was created for Elementary Differential Equations, edition 10 (ISBN: 9780470458327). Chapter 1.3: Classification of Differential Equations includes 31 full step-by-step solutions. Since all 31 problems in this chapter have been answered, more than 28,007 students have viewed full step-by-step solutions from it. This expansive survival guide covers the textbook's other chapters and their solutions as well.

## Key Math Terms and Definitions Covered in This Textbook
• Affine transformation

$Tv = Av + v_0$ = linear transformation plus shift.

• Basis for $V$.

Independent vectors $v_1, \ldots, v_d$ whose linear combinations give each vector in $V$ as $v = c_1v_1 + \cdots + c_dv_d$. $V$ has many bases; each basis gives unique $c$'s.

• Cayley-Hamilton Theorem.

$p(\lambda) = \det(A - \lambda I)$ has $p(A) =$ zero matrix.
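As a numerical sketch, the theorem can be checked directly for a $2\times 2$ matrix, where the characteristic polynomial is $p(\lambda) = \lambda^2 - \mathrm{tr}(A)\lambda + \det(A)$. The matrix below is an arbitrary illustrative choice.

```python
# Check the Cayley-Hamilton theorem for a 2x2 matrix:
# p(lambda) = det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A),
# and p(A) should come out as the zero matrix.

def mat_mul(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1],
     [1, 3]]
tr = A[0][0] + A[1][1]                       # trace = 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # determinant = 5

A2 = mat_mul(A, A)
I = [[1, 0], [0, 1]]
# p(A) = A^2 - tr(A)*A + det(A)*I, entry by entry
pA = [[A2[i][j] - tr * A[i][j] + det * I[i][j] for j in range(2)]
      for i in range(2)]
print(pA)  # [[0, 0], [0, 0]]
```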

• Circulant matrix $C$.

Constant diagonals wrap around as in the cyclic shift $S$. Every circulant is $c_0I + c_1S + \cdots + c_{n-1}S^{n-1}$. $Cx$ = convolution $c * x$. Eigenvectors in the Fourier matrix $F$.
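The identity $Cx = c * x$ can be verified in a few lines of pure Python: build the circulant from its first column $c$ and compare the matrix-vector product against the circular convolution. The vectors used are arbitrary illustrative values.

```python
# A circulant matrix is determined by its first column c; entry (i, j)
# is c[(i - j) mod n], so each row is a cyclic shift of the one above.
# Multiplying Cx should equal the circular convolution c * x.

def circulant(c):
    n = len(c)
    return [[c[(i - j) % n] for j in range(n)] for i in range(n)]

def circ_conv(c, x):
    """Circular convolution (c * x)_i = sum_j c[(i-j) mod n] * x[j]."""
    n = len(c)
    return [sum(c[(i - j) % n] * x[j] for j in range(n)) for i in range(n)]

c = [1, 2, 3, 4]
x = [5, 6, 7, 8]
C = circulant(c)
Cx = [sum(C[i][j] * x[j] for j in range(4)) for i in range(4)]
print(Cx == circ_conv(c, x))  # True
```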

• Column picture of Ax = b.

The vector $b$ becomes a combination of the columns of $A$. The system is solvable only when $b$ is in the column space $C(A)$.

• Column space $C(A)$ =

space of all combinations of the columns of $A$.
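The column picture can be made concrete with a small example (the matrix and vector below are arbitrary): computing $Ax$ row by row and as a weighted sum of columns gives the same vector, which by construction lies in $C(A)$.

```python
# Column picture: Ax is the combination x1*(column 1) + x2*(column 2).
# Compute Ax both ways and check they agree.

A = [[1, 2],
     [3, 4],
     [5, 6]]
x = [2, -1]

# Row picture: dot each row of A with x.
row_way = [sum(A[i][j] * x[j] for j in range(2)) for i in range(3)]

# Column picture: weighted sum of the columns of A.
col1 = [A[i][0] for i in range(3)]
col2 = [A[i][1] for i in range(3)]
col_way = [x[0] * col1[i] + x[1] * col2[i] for i in range(3)]

print(row_way, col_way)  # [0, 2, 4] [0, 2, 4]
```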

• Fast Fourier Transform (FFT).

A factorization of the Fourier matrix $F_n$ into $\ell = \log_2 n$ matrices $S_i$ times a permutation. Each $S_i$ needs only $n/2$ multiplications, so $F_nx$ and $F_n^{-1}c$ can be computed with $n\ell/2$ multiplications. Revolutionary.
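A minimal recursive radix-2 FFT sketches the divide-and-conquer idea behind this factorization: each level does about $n/2$ twiddle-factor multiplications, matching the cost of one $S_i$. Comparing against the direct $O(n^2)$ DFT confirms correctness.

```python
import cmath

# Recursive radix-2 FFT: split into even- and odd-indexed samples,
# recurse, then recombine with n/2 twiddle multiplications per level.

def fft(x):
    n = len(x)                      # n must be a power of 2
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + w
        out[k + n // 2] = even[k] - w
    return out

def dft(x):
    """Direct O(n^2) definition, used only as a reference."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * j * k / n)
                for j in range(n)) for k in range(n)]

x = [1, 2, 3, 4, 5, 6, 7, 8]
a, b = fft(x), dft(x)
print(all(abs(ai - bi) < 1e-9 for ai, bi in zip(a, b)))  # True
```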

• Hermitian matrix $A^{\mathrm{H}} = \bar{A}^{\mathrm{T}} = A$.

Complex analog $\bar{a}_{ji} = a_{ij}$ of a symmetric matrix.

• Indefinite matrix.

A symmetric matrix with eigenvalues of both signs (+ and - ).

• Jordan form $J = M^{-1}AM$.

If $A$ has $s$ independent eigenvectors, its "generalized" eigenvector matrix $M$ gives $J = \mathrm{diag}(J_1, \ldots, J_s)$. The block $J_k$ is $\lambda_kI_k + N_k$ where $N_k$ has 1's on diagonal 1. Each block has one eigenvalue $\lambda_k$ and one eigenvector.

• Linear combination $cv + dw$ or $\sum c_jv_j$.

• Matrix multiplication $AB$.

The $i, j$ entry of $AB$ is (row $i$ of $A$)·(column $j$ of $B$) $= \sum a_{ik}b_{kj}$. By columns: column $j$ of $AB$ = $A$ times column $j$ of $B$. By rows: row $i$ of $A$ multiplies $B$. Columns times rows: $AB$ = sum of (column $k$)(row $k$). All these equivalent definitions come from the rule that $AB$ times $x$ equals $A$ times $Bx$.
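Two of these equivalent views can be checked side by side for a small example (the matrices are arbitrary): the row-times-column definition versus the "columns times rows" sum of outer products.

```python
# The i,j entry of AB is (row i of A) . (column j of B).
# The "columns times rows" view instead accumulates the outer product
# (column k of A)(row k of B) for each k.  Both give the same product.

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]

# Row-times-column definition.
by_entries = [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
              for i in range(2)]

# Sum over k of the outer products (column k of A)(row k of B).
by_outer = [[0, 0], [0, 0]]
for k in range(2):
    for i in range(2):
        for j in range(2):
            by_outer[i][j] += A[i][k] * B[k][j]

print(by_entries)  # [[19, 22], [43, 50]]
print(by_entries == by_outer)  # True
```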

• Nilpotent matrix $N$.

Some power of $N$ is the zero matrix, $N^k = 0$. The only eigenvalue is $\lambda = 0$ (repeated $n$ times). Examples: triangular matrices with zero diagonal.
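The triangular example is easy to watch in action: for a strictly upper triangular $3\times 3$ matrix (arbitrary nonzero entries above the diagonal), each power pushes the nonzero diagonal further up until $N^3 = 0$.

```python
# A strictly upper triangular matrix is nilpotent: squaring shifts the
# nonzero band up one diagonal, so for this 3x3 example N^3 = 0.

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

N = [[0, 1, 2],
     [0, 0, 3],
     [0, 0, 0]]
N2 = mat_mul(N, N)
N3 = mat_mul(N2, N)
print(N2)  # [[0, 0, 3], [0, 0, 0], [0, 0, 0]]
print(N3)  # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```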

• Rank $r(A)$

= number of pivots = dimension of column space = dimension of row space.
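"Number of pivots" suggests a direct computation: run Gaussian elimination and count the pivots. Below is a minimal sketch (the test matrix is an arbitrary example with one dependent row).

```python
# Rank = number of pivots found by Gaussian elimination.
# Row 3 of A below equals row 1 + row 2, so the rank is 2, not 3.

def rank(M, tol=1e-12):
    M = [row[:] for row in M]           # work on a copy
    rows, cols = len(M), len(M[0])
    r = 0                               # next pivot row
    for c in range(cols):
        # find a row at or below r with a nonzero entry in column c
        pivot = next((i for i in range(r, rows) if abs(M[i][c]) > tol), None)
        if pivot is None:
            continue                    # free column, no pivot here
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(r + 1, rows):    # eliminate below the pivot
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

A = [[1, 2, 3],
     [4, 5, 6],
     [5, 7, 9]]   # row 3 = row 1 + row 2
print(rank(A))   # 2
```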

• Right inverse $A^+$.

If $A$ has full row rank $m$, then $A^+ = A^{\mathrm{T}}(AA^{\mathrm{T}})^{-1}$ has $AA^+ = I_m$.
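For a $2 \times 3$ matrix of full row rank, $AA^{\mathrm{T}}$ is only $2 \times 2$, so its inverse can be written with the adjugate formula and the whole identity $AA^+ = I_2$ checked by hand-rolled arithmetic. The matrix below is an arbitrary full-row-rank example.

```python
# Right inverse of a 2x3 full-row-rank matrix: A+ = A^T (A A^T)^(-1),
# so A A+ should be the 2x2 identity.

A = [[1, 0, 1],
     [0, 1, 1]]

# G = A A^T  (2x2, invertible because A has full row rank)
G = [[sum(A[i][k] * A[j][k] for k in range(3)) for j in range(2)]
     for i in range(2)]
d = G[0][0] * G[1][1] - G[0][1] * G[1][0]
Ginv = [[ G[1][1] / d, -G[0][1] / d],    # adjugate formula for 2x2
        [-G[1][0] / d,  G[0][0] / d]]

# A+ = A^T Ginv  (3x2)
Aplus = [[sum(A[k][i] * Ginv[k][j] for k in range(2)) for j in range(2)]
         for i in range(3)]

# A A+ should equal I_2 up to rounding
AAplus = [[sum(A[i][k] * Aplus[k][j] for k in range(3)) for j in range(2)]
          for i in range(2)]
print([[round(v, 10) for v in row] for row in AAplus])  # [[1.0, 0.0], [0.0, 1.0]]
```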

• Spectral Theorem $A = Q\Lambda Q^{\mathrm{T}}$.

Real symmetric $A$ has real $\lambda$'s and orthonormal $q$'s.

• Trace of $A$

= sum of diagonal entries = sum of eigenvalues of $A$. $\mathrm{Tr}\,AB = \mathrm{Tr}\,BA$.
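The identity $\mathrm{Tr}\,AB = \mathrm{Tr}\,BA$ holds even when $AB \neq BA$, since both traces equal $\sum_{i,k} a_{ik}b_{ki}$. A quick check with arbitrary non-commuting matrices:

```python
# Tr(AB) = Tr(BA) even when AB != BA: both equal sum over i,k of
# a_ik * b_ki, just summed in a different order.

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2],
     [3, 4]]
B = [[0, 5],
     [6, 7]]

AB, BA = mat_mul(A, B), mat_mul(B, A)
print(AB != BA, trace(AB) == trace(BA))  # True True
print(trace(AB))  # 55
```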

• Vandermonde matrix $V$.

$Vc = b$ gives coefficients of $p(x) = c_0 + \cdots + c_{n-1}x^{n-1}$ with $p(x_i) = b_i$. $V_{ij} = (x_i)^{j-1}$ and $\det V$ = product of $(x_k - x_i)$ for $k > i$.
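The determinant formula can be verified for a small case by brute force: build $V$ from sample points (arbitrary values here), compute $\det V$ from the permutation expansion, and compare with the product of differences.

```python
from itertools import permutations

# Build the Vandermonde matrix with entries x_i^(j-1) (0-indexed: x_i^j)
# and check det V = product of (x_k - x_i) for k > i on a 3x3 case.

x = [1, 2, 4]
n = len(x)
V = [[xi ** j for j in range(n)] for xi in x]

def det(M):
    """Permutation expansion of the determinant (fine for tiny n)."""
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        # sign = (-1)^(number of inversions in the permutation)
        inversions = sum(perm[i] > perm[j]
                         for i in range(n) for j in range(i + 1, n))
        term = (-1) ** inversions
        for i in range(n):
            term *= M[i][perm[i]]
        total += term
    return total

product_form = 1
for i in range(n):
    for k in range(i + 1, n):
        product_form *= x[k] - x[i]

print(det(V), product_form)  # 6 6
```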