 Chapter 1.1: Some Basic Mathematical Models; Direction Fields
 Chapter 1.2: Solutions of Some Differential Equations
 Chapter 1.3: Classification of Differential Equations
 Chapter 2: First Order Differential Equations
 Chapter 2.1: Linear Equations; Method of Integrating Factors
 Chapter 2.2: Separable Equations
 Chapter 2.3: Modeling with First Order Equations
 Chapter 2.4: Differences Between Linear and Nonlinear Equations
 Chapter 2.5: Autonomous Equations and Population Dynamics
 Chapter 2.6: Exact Equations and Integrating Factors
 Chapter 2.7: Numerical Approximations: Euler's Method
 Chapter 2.8: The Existence and Uniqueness Theorem
 Chapter 2.9: First Order Difference Equations
 Chapter 3.1: Homogeneous Equations with Constant Coefficients
 Chapter 3.2: Solutions of Linear Homogeneous Equations; the Wronskian
 Chapter 3.3: Complex Roots of the Characteristic Equation
 Chapter 3.4: Repeated Roots; Reduction of Order
 Chapter 3.5: Nonhomogeneous Equations; Method of Undetermined Coefficients
 Chapter 3.6: Variation of Parameters
 Chapter 3.7: Mechanical and Electrical Vibrations
 Chapter 3.8: Forced Vibrations
 Chapter 4.1: General Theory of nth Order Linear Equations
 Chapter 4.2: Homogeneous Equations with Constant Coefficients
 Chapter 4.3: The Method of Undetermined Coefficients
 Chapter 4.4: The Method of Variation of Parameters
 Chapter 5.1: Review of Power Series
 Chapter 5.2: Series Solutions Near an Ordinary Point, Part I
 Chapter 5.3: Series Solutions Near an Ordinary Point, Part II
 Chapter 5.4: Euler Equations; Regular Singular Points
 Chapter 5.5: Series Solutions Near a Regular Singular Point, Part I
 Chapter 5.6: Series Solutions Near a Regular Singular Point, Part II
 Chapter 5.7: Bessel's Equation
 Chapter 6.1: Definition of the Laplace Transform
 Chapter 6.2: Solution of Initial Value Problems
 Chapter 6.3: Step Functions
 Chapter 6.4: Differential Equations with Discontinuous Forcing Functions
 Chapter 6.5: Impulse Functions
 Chapter 6.6: The Convolution Integral
 Chapter 7.1: Introduction
 Chapter 7.2: Review of Matrices
 Chapter 7.3: Systems of Linear Algebraic Equations; Linear Independence, Eigenvalues, Eigenvectors
 Chapter 7.4: Basic Theory of Systems of First Order Linear Equations
 Chapter 7.5: Homogeneous Linear Systems with Constant Coefficients
 Chapter 7.6: Complex Eigenvalues
 Chapter 7.7: Fundamental Matrices
 Chapter 7.8: Repeated Eigenvalues
 Chapter 7.9: Nonhomogeneous Linear Systems
 Chapter 8.1: The Euler or Tangent Line Method
 Chapter 8.2: Improvements on the Euler Method
 Chapter 8.3: The Runge-Kutta Method
 Chapter 8.4: Multistep Methods
 Chapter 8.5: Systems of First Order Equations
 Chapter 8.6: More on Errors; Stability
 Chapter 9.1: The Phase Plane: Linear Systems
 Chapter 9.2: Autonomous Systems and Stability
 Chapter 9.3: Locally Linear Systems
 Chapter 9.4: Competing Species
 Chapter 9.5: Predator-Prey Equations
 Chapter 9.6: Liapunov's Second Method
 Chapter 9.7: Periodic Solutions and Limit Cycles
 Chapter 9.8: Chaos and Strange Attractors: The Lorenz Equations
Elementary Differential Equations, 10th Edition (William E. Boyce / Richard C. DiPrima): Solutions by Chapter
ISBN: 9780470458327

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases; each basis gives unique c's.
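A minimal NumPy sketch of this definition (the basis vectors and v below are made-up examples): stacking the basis as columns of B and solving Bc = v recovers the unique coefficients.

```python
import numpy as np

# Hypothetical basis for V = R^2: two independent vectors v1, v2.
v1 = np.array([1.0, 0.0])
v2 = np.array([1.0, 1.0])
B = np.column_stack([v1, v2])   # columns are the basis vectors

v = np.array([3.0, 2.0])
c = np.linalg.solve(B, v)       # unique coefficients: v = c[0]*v1 + c[1]*v2
assert np.allclose(c[0] * v1 + c[1] * v2, v)
```

A different basis for the same space would give different (but again unique) coefficients for the same v.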

CayleyHamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
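A quick numerical check of the theorem on an arbitrary 2×2 example: for a 2×2 matrix, p(λ) = λ² − (trace A)λ + det A, so substituting A should give the zero matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])               # arbitrary example matrix
tr, det = np.trace(A), np.linalg.det(A)
# Cayley-Hamilton for 2x2: p(A) = A^2 - tr(A)*A + det(A)*I = 0
pA = A @ A - tr * A + det * np.eye(2)
assert np.allclose(pA, np.zeros((2, 2)))
```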

Cholesky factorization
A = CC^T = (L√D)(L√D)^T for positive definite A.
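NumPy's built-in Cholesky routine returns the lower-triangular factor C directly; a small check on an example positive definite matrix:

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])         # example positive definite matrix
C = np.linalg.cholesky(A)          # lower triangular factor
assert np.allclose(C @ C.T, A)     # A = C C^T
```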

Cofactor Cij.
Remove row i and column j; multiply the determinant by (−1)^(i+j).
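A short sketch of the definition; the helper `cofactor` and the test matrix are illustrative, not from the text. Cofactor expansion along a row reproduces the determinant. (The code uses 0-based indices; shifting both i and j by 1 does not change the parity of i + j, so the sign is the same.)

```python
import numpy as np

def cofactor(A, i, j):
    """C_ij = (-1)^(i+j) * determinant of A with row i and column j removed."""
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# Cofactor expansion along row 0 reproduces det(A) = -2
expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(2))
assert np.isclose(expansion, np.linalg.det(A))
```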

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Cyclic shift S.
Permutation with s21 = 1, s32 = 1, ..., finally s1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are the columns of the Fourier matrix F.
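A sketch of building S and checking its eigenvalues numerically (n = 4 is an arbitrary choice). Rather than comparing eigenvalue lists directly, it is more robust to check S^n = I and that each eigenvalue raised to the nth power equals 1.

```python
import numpy as np

n = 4
# Cyclic shift: s21 = s32 = s43 = 1 and s1n = 1 (1-indexed entries)
S = np.roll(np.eye(n), 1, axis=0)
assert np.allclose(np.linalg.matrix_power(S, n), np.eye(n))   # S^n = I
eigvals = np.linalg.eigvals(S)
assert np.allclose(eigvals ** n, 1)   # each eigenvalue is an n-th root of 1
```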

Diagonalization
Λ = S^(-1)AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = SΛ^k S^(-1).
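A small numerical illustration (the matrix is a made-up example with distinct real eigenvalues, so it is diagonalizable): `np.linalg.eig` returns the eigenvalues and the eigenvector matrix S, and powers of A reduce to powers of the diagonal Λ.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # example matrix; eigenvalues are 5 and 2
lam, S = np.linalg.eig(A)           # eigenvalues and eigenvector matrix
Lam = np.diag(lam)
Sinv = np.linalg.inv(S)
assert np.allclose(S @ Lam @ Sinv, A)                                 # A = S Λ S^(-1)
assert np.allclose(S @ Lam**3 @ Sinv, np.linalg.matrix_power(A, 3))   # A^3 = S Λ^3 S^(-1)
```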

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative Ae^(At); e^(At)u(0) solves u' = Au.

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions n − r and r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
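A numerical check using the SVD (the rank-1 example matrix is made up): rows of V^T beyond the rank span N(A), the first r rows span the row space, and the two sets are perpendicular.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])     # rank 1, so dim N(A) = 3 - 1 = 2
_, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))          # numerical rank
nullspace = Vt[r:].T                # columns span N(A)
rowspace = Vt[:r].T                 # columns span C(A^T)
assert np.allclose(A @ nullspace, 0)            # Ax = 0 for nullspace vectors
assert np.allclose(rowspace.T @ nullspace, 0)   # the subspaces are perpendicular
assert nullspace.shape[1] == 3 - r
```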

GaussJordan method.
Invert A by row operations on [A I] to reach [I A^(-1)].
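A minimal sketch of the method (a toy implementation without pivoting, so it assumes nonzero pivots appear on the diagonal; it is not how production libraries invert matrices):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce [A I] to [I A^(-1)]. No pivoting: assumes nonzero pivots."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])
    for i in range(n):
        M[i] /= M[i, i]                    # scale so the pivot becomes 1
        for k in range(n):
            if k != i:
                M[k] -= M[k, i] * M[i]     # eliminate column i in other rows
    return M[:, n:]                        # right half is now A^(-1)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
assert np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2))
```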

Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Orthonormal vectors q1, ..., qn.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
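A quick numerical illustration (the starting matrix is an arbitrary full-rank example): QR factorization produces orthonormal columns, and for square Q every vector expands in the q's.

```python
import numpy as np

M = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])     # any full-rank matrix works here
Q, _ = np.linalg.qr(M)              # columns of Q are orthonormal
assert np.allclose(Q.T @ Q, np.eye(3))
# Square Q: Q^T = Q^(-1), and every v = sum of (v^T q_j) q_j
v = np.array([1.0, 2.0, 3.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
assert np.allclose(expansion, v)
```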

Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
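A small check on an example symmetric matrix: `np.linalg.eigh` (for symmetric/Hermitian input) returns real eigenvalues and orthonormal eigenvectors, which reassemble A as QΛQ^T.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # example real symmetric matrix
lam, Q = np.linalg.eigh(A)          # real eigenvalues (ascending), orthonormal Q
assert np.allclose(Q @ np.diag(lam) @ Q.T, A)   # A = Q Λ Q^T
assert np.allclose(Q.T @ Q, np.eye(2))          # columns are orthonormal
```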

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs of the eigenvalues in Λ = signs of the pivots in D.

Tridiagonal matrix T: t_ij = 0 if |i − j| > 1.
T^(-1) has rank 1 above and below the diagonal.
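A numerical illustration with an example tridiagonal matrix (the second-difference matrix): a block of T^(-1) drawn from the upper-triangular part comes out rank 1.

```python
import numpy as np

T = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])   # example tridiagonal matrix
Tinv = np.linalg.inv(T)
# Block of T^(-1) from the upper-triangular part (rows 1-2, columns 2-3)
upper_block = Tinv[:2, 1:]
assert np.linalg.matrix_rank(upper_block) == 1
```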