- 5.2.1: dx/dt = 2 - x - y, dy/dt = y - x^2; (a) x0 = 2, y0 = 1 (b) x0 = 0, y0 = 1 (c...
- 5.2.2: dx/dt = 2 - x - y, dy/dt = y - |x|; (a) x0 = 1, y0 = 1 (b) x0 = 2, y0 = 1 (...
- 5.2.3: dx/dt = x(x - 1), dy/dt = x^2 - y; (a) x0 = 1, y0 = 0 (b) x0 = 0.8, y0 = 0...
- 5.2.4: The Volterra-Lotka system for two competitive species is dx/dt = x(...
- 5.2.5: dx/dt = x(-x - 3y + 150), dy/dt = y(-2x - y + 100)
- 5.2.6: dx/dt = x(10 - x - y), dy/dt = y(30 - 2x - y)
- 5.2.7: dx/dt = x(100 - x - 2y), dy/dt = y(150 - x - 6y)
- 5.2.8: dx/dt = x(-x - y + 100), dy/dt = y(-x^2 - y^2 + 2500)
- 5.2.9: dx/dt = x(-x - y + 40), dy/dt = y(-x^2 - y^2 + 2500)
- 5.2.10: dx/dt = x(-4x - y + 160), dy/dt = y(-x^2 - y^2 + 2500)
- 5.2.11: dx/dt = x(-8x - 6y + 480), dy/dt = y(-x^2 - y^2 + 2500)
- 5.2.12: dx/dt = x(2 - x - y), dy/dt = y(y - x^2)
- 5.2.13: dx/dt = x(2 - x - y), dy/dt = y(y - x)
- 5.2.14: dx/dt = x(x - 1), dy/dt = y(x^2 - y)
- 5.2.15: Some species live in a cooperative manner, each species helping the o...
- 5.2.16: da/dt = ab/2, db/dt = ab/2
- 5.2.17: da/dt = 2 - ab/2, db/dt = 3/2 - ab/2
- 5.2.18: da/dt = 2 - ab/2 - a^2/3, db/dt = 3/2 - ab/2
- 5.2.19: da/dt = 2 - ab/2 + b^2/6, db/dt = 3/2 - ab/2 - b^2/3
- 5.2.20: da/dt = 2 - ab/2 - ab^2/3, db/dt = 3/2 - ab/2 - 2ab^2/3
- 5.2.21: Sketch the nullclines and find the direction of the vector field al...
- 5.2.22: Show that there is at least one solution in each of the second and ...
- 5.2.23: Find the linearized system near the equilibrium points (0, 0) and (...
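The linearization asked for in 5.2.23 follows the same recipe for any of the systems above: evaluate the Jacobian matrix of the right-hand side at each equilibrium point. A minimal Python sketch, using the system of 5.2.12 under the assumed reading dx/dt = x(2 - x - y), dy/dt = y(y - x^2), and checking a hand-computed Jacobian at (0, 0) against finite differences:

```python
import numpy as np

def rhs(v):
    # assumed system (exercise 5.2.12): x' = x(2 - x - y), y' = y(y - x^2)
    x, y = v
    return np.array([x * (2 - x - y), y * (y - x**2)])

def jacobian_fd(f, v, h=1e-6):
    """Numerical Jacobian of f at v by central differences."""
    n = len(v)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = h
        J[:, j] = (f(v + e) - f(v - e)) / (2 * h)
    return J

# Hand linearization at the equilibrium (0, 0):
#   d/dx [x(2-x-y)] = 2 - 2x - y -> 2    d/dy [x(2-x-y)] = -x      -> 0
#   d/dx [y(y-x^2)] = -2xy       -> 0    d/dy [y(y-x^2)] = 2y - x^2 -> 0
J_analytic = np.array([[2.0, 0.0], [0.0, 0.0]])
J = jacobian_fd(rhs, np.array([0.0, 0.0]))
assert np.allclose(J, J_analytic, atol=1e-4)
```

The finite-difference check is a cheap way to catch sign errors in the hand-computed partial derivatives before classifying the equilibrium.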
Solutions for Chapter 5.2: QUALITATIVE ANALYSIS
Full solutions for Differential Equations | 4th Edition
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
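A minimal sketch of this definition, using a small example graph (the edges are illustrative, not from the text):

```python
import numpy as np

# Directed edges on nodes 0, 1, 2
edges = [(0, 1), (1, 2), (2, 0)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1                      # a_ij = 1 when there is an edge i -> j

assert not np.array_equal(A, A.T)    # one-way edges: A != A^T

# Make every edge go both ways: the graph is now undirected and A = A^T
U = ((A + A.T) > 0).astype(int)
assert np.array_equal(U, U.T)
```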
Characteristic equation det(A - λI) = 0.
The n roots are the eigenvalues of A.
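A quick numerical sketch of "the n roots are the eigenvalues": compute the characteristic polynomial of a small example matrix, take its roots, and compare with a direct eigenvalue computation.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])

coeffs = np.poly(A)               # coefficients of det(A - λI) as a polynomial in λ
lam = np.sort(np.roots(coeffs))   # the n roots of the characteristic equation

assert np.allclose(lam, [1.0, 3.0])                    # eigenvalues of this A
assert np.allclose(np.sort(np.linalg.eigvals(A)), lam) # same answer directly
```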
Cholesky factorization A = C^T C = (L√D)(L√D)^T for positive definite A.
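A sketch of the factorization with numpy. Note that numpy's `cholesky` returns the lower-triangular factor L with A = L L^T, so the glossary's upper-triangular C is L^T:

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 3.0]])   # positive definite
L = np.linalg.cholesky(A)                # lower triangular, A = L L^T
C = L.T                                  # then A = C^T C as in the glossary

assert np.allclose(C.T @ C, A)
assert np.allclose(L, np.tril(L))        # L really is lower triangular
```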
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B|.
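The defining properties can be spot-checked numerically on random matrices (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))
det = np.linalg.det

# det I = 1
assert np.isclose(det(np.eye(4)), 1.0)

# Sign reversal for a row exchange (swap rows 0 and 1)
A_swapped = A[[1, 0, 2, 3], :]
assert np.isclose(det(A_swapped), -det(A))

# Product rule |AB| = |A| |B|
assert np.isclose(det(A @ B), det(A) * det(B))
```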
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓ_ij in the i, j entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
Elimination. A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
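Elimination and the resulting A = LU can be sketched directly, recording each multiplier ℓ_ij as it is used (the example matrix is illustrative and needs no row exchanges):

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])

n = len(A)
U = A.copy()
L = np.eye(n)                       # multipliers ℓ_ij go below the diagonal
for j in range(n - 1):
    for i in range(j + 1, n):
        m = U[i, j] / U[j, j]       # multiplier ℓ_ij
        L[i, j] = m
        U[i, :] -= m * U[j, :]      # subtract ℓ_ij times row j from row i

assert np.allclose(np.triu(U), U)   # U is upper triangular
assert np.allclose(L @ U, A)        # A = LU
```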
Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; axis lengths σ_i.)
Fast Fourier Transform (FFT).
A factorization of the Fourier matrix F_n into ℓ = log2(n) matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
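A small sketch comparing the full Fourier matrix (n^2 multiplications) with numpy's FFT, which uses the factored form; both use the convention F[j, k] = w^(jk) with w = exp(-2πi/n):

```python
import numpy as np

n = 8
x = np.arange(n, dtype=float)

# Full DFT matrix: multiplying by it costs n^2 operations;
# the FFT gets the same result in (n/2) log2 n.
w = np.exp(-2j * np.pi / n)
F = w ** np.outer(np.arange(n), np.arange(n))

assert np.allclose(F @ x, np.fft.fft(x))   # same transform, fewer operations
```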
Free variable x_i.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Krylov subspace K_j(A, b).
The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^-1 b by x_j with residual b - Ax_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
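A rough sketch of the idea: build the Krylov basis by repeated multiplication with A, take the best x_j in each subspace by least squares, and watch the residual shrink (the SPD example matrix is illustrative, and plain `lstsq` stands in for a real Krylov solver such as CG or GMRES):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)            # symmetric positive definite, so invertible
b = rng.standard_normal(4)

residuals = []
K = b.reshape(-1, 1)               # basis so far: b, Ab, A^2 b, ...
for j in range(1, 5):
    # best x_j = K c in K_j: minimize ||b - A K c|| by least squares
    c, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
    x_j = K @ c
    residuals.append(np.linalg.norm(b - A @ x_j))
    K = np.hstack([K, (A @ K[:, -1]).reshape(-1, 1)])  # one more power of A

# Nested subspaces: residuals can only shrink; K_4 = R^4 recovers x = A^-1 b
assert all(r1 >= r2 - 1e-12 for r1, r2 in zip(residuals, residuals[1:]))
assert residuals[-1] < 1e-6
```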
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
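A worked sketch on a standard textbook fitting problem: fit a line c + dt through three points where Ax = b has no exact solution, solve the normal equations, and confirm the residual is orthogonal to the columns of A:

```python
import numpy as np

# Fit a line c + d t to the points (0, 6), (1, 0), (2, 0)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x̂ = A^T b
e = b - A @ x_hat

assert np.allclose(x_hat, [5.0, -3.0])      # best line is 5 - 3t
assert np.allclose(A.T @ e, 0.0)            # e is orthogonal to the columns of A
assert np.allclose(np.linalg.lstsq(A, b, rcond=None)[0], x_hat)
```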
Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
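A quick sketch with a tall full-column-rank example; for such a matrix the formula agrees with numpy's pseudoinverse:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])                    # full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T         # left inverse (A^T A)^-1 A^T

assert np.allclose(A_plus @ A, np.eye(2))     # A^+ A = I_n
assert np.allclose(A_plus, np.linalg.pinv(A)) # matches the pseudoinverse here
```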
Linear combination cv + dw or Σ c_j v_j.
Vector addition and scalar multiplication.
Lucas numbers L_n. L_n = 2, 1, 3, 4, ... satisfy L_n = L_(n-1) + L_(n-2) = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
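The two formulas can be checked against each other directly (the powers λ1^n + λ2^n are exact integers up to rounding):

```python
# Lucas numbers two ways: the recurrence and the eigenvalue formula λ1^n + λ2^n
lam1 = (1 + 5**0.5) / 2
lam2 = (1 - 5**0.5) / 2

L = [2, 1]                       # L0 = 2, L1 = 1
for n in range(2, 12):
    L.append(L[-1] + L[-2])      # L_n = L_(n-1) + L_(n-2)

for n in range(12):
    assert round(lam1**n + lam2**n) == L[n]
```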
Normal matrix N. If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.
Projection p = a(a^T b / a^T a) onto the line through a.
P = a a^T / a^T a has rank 1.
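A minimal sketch with example vectors, checking the rank-1 property and that P is idempotent (projecting twice changes nothing):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 0.0])

p = a * (a @ b) / (a @ a)        # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)     # projection matrix P = a a^T / a^T a

assert np.allclose(P @ b, p)
assert np.linalg.matrix_rank(P) == 1
assert np.allclose(P @ P, P)     # projecting twice changes nothing
```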
Schwarz inequality. |v · w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
Similar matrices A and B.
Every B = M^-1 A M has the same eigenvalues as A.
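A numerical sketch on random matrices: a similar matrix B = M^-1 A M has the same characteristic polynomial, hence the same eigenvalues (the random M is assumed invertible, which a generic random matrix is):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
M = rng.standard_normal((3, 3))   # generic random matrix, assumed invertible
B = np.linalg.inv(M) @ A @ M      # B is similar to A

# Same characteristic polynomial, hence the same eigenvalues
assert np.allclose(np.poly(B), np.poly(A))
```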
Toeplitz matrix. Constant down each diagonal = time-invariant (shift-invariant) filter.