- 10.10.1.119: The second-order differential equation x″ + f(x′) + g(x) = 0 can be written...
- 10.10.1.120: If X = X(t) is a solution to a plane autonomous system and X(t1) = X(t2...
- 10.10.1.121: If the trace of the matrix A is 0 and det A ≠ 0, then the critical po...
- 10.10.1.122: If the critical point (0, 0) of the linear system X′ = AX is a stable ...
- 10.10.1.123: If the critical point (0, 0) of the linear system X′ = AX is a saddle ...
- 10.10.1.124: If the Jacobian matrix A = g′(X1) at a critical point of a plane auton...
- 10.10.1.125: It is possible to show, using linearization, that a nonlinear plane...
- 10.10.1.126: All solutions to the pendulum equation are periodic.
- 10.10.1.127: For what value(s) of a does the plane autonomous system possess per...
- 10.10.1.128: For what values of n is x = nπ an asymptotically stable critical poin...
- 10.10.1.129: Solve the nonlinear plane autonomous system by switching to polar c...
- 10.10.1.130: Discuss the geometric nature of the solutions to the linear system ...
- 10.10.1.131: Classify the critical point (0, 0) of the given linear system by co...
- 10.10.1.132: Find and classify (if possible) the critical points of the plane au...
- 10.10.1.133: Determine the value(s) of a for which (0, 0) is a stable critical p...
- 10.10.1.134: Classify the critical point (0, 0) of the plane autonomous system c...
- 10.10.1.135: Without solving explicitly, classify (if possible) the critical poi...
- 10.10.1.136: Use the phase-plane method to show that the solutions to the nonlin...
- 10.10.1.137: In Section 5.1 we assumed that the restoring force F of the spring ...
- 10.10.1.138: The rod of a pendulum is attached to a movable joint at a point P a...
Solutions for Chapter 10: Plane Autonomous Systems
Full solutions for Differential Equations with Boundary-Value Problems, 8th Edition
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
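This rank criterion can be checked numerically; the matrix and right-hand sides below are illustrative, not from the text:

```python
# Solvability of Ax = b: b is in the column space of A exactly when
# the augmented matrix [A b] has the same rank as A.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])            # rank 1: column 2 = 2 * column 1
b_good = np.array([1.0, 2.0, 3.0])    # equals column 1, so it is in C(A)
b_bad = np.array([1.0, 2.0, 4.0])     # not a combination of the columns

rank_A = np.linalg.matrix_rank(A)
solvable = np.linalg.matrix_rank(np.column_stack([A, b_good])) == rank_A
unsolvable = np.linalg.matrix_rank(np.column_stack([A, b_bad])) != rank_A
```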
Basis for V.
Independent vectors v1, …, vd whose linear combinations give every vector in V as v = c1v1 + … + cdvd. Each basis gives unique c's; a vector space has many bases!
Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
Companion matrix.
Put c1, …, cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ² + … + cnλ^(n−1) − λ^n).
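A small sketch of this construction (the companion matrix), with coefficients chosen for illustration, confirming that the eigenvalues are the roots of the stated polynomial:

```python
# Companion matrix: coefficients in the last row, ones above the diagonal.
# With c = (2, 1) the polynomial is -(2 + lam - lam^2) = lam^2 - lam - 2,
# whose roots are -1 and 2.
import numpy as np

c = [2.0, 1.0]
n = len(c)
A = np.zeros((n, n))
A[np.arange(n - 1), np.arange(1, n)] = 1.0   # n - 1 ones just above the diagonal
A[-1, :] = c                                  # c1, ..., cn in row n

eigs = sorted(np.linalg.eigvals(A).real)      # expect [-1, 2]
```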
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T A x − x^T b over growing Krylov subspaces.
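A minimal conjugate-gradient sketch for a tiny positive definite system (the textbook algorithm; variable names and the test matrix are ours):

```python
# Conjugate gradient for positive definite Ax = b: each step does an exact
# line search along a direction that is A-conjugate to the previous ones.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    x = np.zeros_like(b)
    r = b - A @ x            # residual = negative gradient of (1/2)x^T A x - x^T b
    p = r.copy()             # first search direction
    for _ in range(max_iter):
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)      # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        beta = (r @ r) / rr        # makes the next direction A-conjugate to p
        p = r + beta * p
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG on this 2 by 2 system converges in at most two steps.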
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Independent vectors v1, …, vk.
No combination c1v1 + … + ckvk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
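The columns-of-A test can be run numerically; the vectors below are illustrative:

```python
# Independence check: the v's are independent exactly when the matrix with
# those columns has rank equal to the number of columns (Ax = 0 only for x = 0).
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2                                  # deliberately dependent

A_indep = np.column_stack([v1, v2])
A_dep = np.column_stack([v1, v2, v3])

indep = np.linalg.matrix_rank(A_indep) == A_indep.shape[1]   # full column rank
dep = np.linalg.matrix_rank(A_dep) < A_dep.shape[1]          # Ax = 0 has x != 0
```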
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Multiplier ℓij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate)/(jth pivot).
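One elimination step with the multiplier (entry to eliminate)/(jth pivot), on a small example matrix of our choosing:

```python
# Eliminate the (1, 0) entry of A: multiplier = A[1,0] / pivot A[0,0] = 6/2 = 3,
# then subtract 3 times pivot row 0 from row 1.
import numpy as np

A = np.array([[2.0, 1.0],
              [6.0, 4.0]])
i, j = 1, 0
l_ij = A[i, j] / A[j, j]              # multiplier = 3
A[i, :] = A[i, :] - l_ij * A[j, :]    # row 1 becomes [0, 1]
```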
A directed graph that has constants Cl, ... , Cm associated with the edges.
Norm ‖A‖.
The "ℓ2 norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σmax. Then ‖Ax‖ ≤ ‖A‖‖x‖, ‖AB‖ ≤ ‖A‖‖B‖, and ‖A + B‖ ≤ ‖A‖ + ‖B‖. Frobenius norm: ‖A‖F² = Σ Σ aij². The ℓ1 and ℓ∞ norms are the largest column and row sums of |aij|.
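These norm identities can be verified with NumPy (the example matrix is ours):

```python
# l2 norm = largest singular value; Frobenius norm squares the entries;
# l1 and l-infinity norms are the largest column and row sums of |a_ij|.
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
l2 = np.linalg.norm(A, 2)                        # sigma_max
sigma = np.linalg.svd(A, compute_uv=False)[0]    # largest singular value
frob = np.linalg.norm(A, 'fro')                  # sqrt(1 + 4 + 9 + 16) = sqrt(30)
l1 = np.linalg.norm(A, 1)                        # max column sum: |2| + |4| = 6
linf = np.linalg.norm(A, np.inf)                 # max row sum: |3| + |4| = 7
```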
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
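A short least squares check on small example data (the fitting points are chosen here for illustration), comparing the normal equation with NumPy's built-in solver:

```python
# Solve A^T A xhat = A^T b directly and confirm the residual b - A xhat
# is orthogonal to every column of A.
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])              # full column rank (fit C + D*t)
b = np.array([6.0, 0.0, 0.0])

xhat = np.linalg.solve(A.T @ A, A.T @ b)        # normal equation
ref, *_ = np.linalg.lstsq(A, b, rcond=None)      # built-in least squares

residual_orthogonal = np.allclose(A.T @ (b - A @ xhat), 0)
```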
Nullspace N(A).
All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
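One way to exhibit the n − r dimensions is via the SVD (the example matrix is ours):

```python
# The right singular vectors belonging to zero singular values span N(A),
# so the nullspace dimension is n - r.
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])         # rank r = 1, n = 3 columns
r = np.linalg.matrix_rank(A)
_, s, Vt = np.linalg.svd(A)
null_basis = Vt[r:]                      # n - r = 2 rows spanning N(A)

dim_null = null_basis.shape[0]
in_nullspace = np.allclose(A @ null_basis.T, 0)
```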
Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
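Three equivalent numerical checks of positive definiteness on a sample matrix of our choosing:

```python
# Positive definite: all eigenvalues > 0, x^T A x > 0 for x != 0, and a
# Cholesky factorization A = L L^T exists.
import numpy as np

A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])             # symmetric

eig_ok = bool(np.all(np.linalg.eigvalsh(A) > 0))

rng = np.random.default_rng(0)
x = rng.standard_normal(2)               # a (nonzero) test vector
quad_ok = x @ A @ x > 0

L = np.linalg.cholesky(A)                # succeeds only for positive definite A
chol_ok = np.allclose(L @ L.T, A)
```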
Projection p = a(a^T b / a^T a) onto the line through a.
P = aa^T / a^T a has rank 1.
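A quick check of the projection formula and the rank-1 projection matrix (the vectors are chosen for illustration):

```python
# p = a (a^T b / a^T a) is the projection of b onto the line through a;
# P = a a^T / a^T a is the rank-1, idempotent projection matrix.
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 0.0])

p = a * (a @ b) / (a @ a)                # a^T b = 3, a^T a = 9, so p = a / 3
P = np.outer(a, a) / (a @ a)

rank_one = np.linalg.matrix_rank(P) == 1
idempotent = np.allclose(P @ P, P)       # projecting twice changes nothing
```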
Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
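These properties can be verified with numpy.linalg.pinv on a rank-1 example of our choosing:

```python
# A+ A and A A+ are the projections onto the row space and column space,
# and the pseudoinverse has the same rank as A.
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])              # rank 1
A_plus = np.linalg.pinv(A)

same_rank = np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A)
row_proj = A_plus @ A                    # projection onto the row space
col_proj = A @ A_plus                    # projection onto the column space
proj_ok = (np.allclose(row_proj @ row_proj, row_proj)
           and np.allclose(col_proj @ col_proj, col_proj))
```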
Rotation matrix.
R = [c −s; s c] rotates the plane by θ, and R^(−1) = R^T rotates back by −θ. Eigenvalues are e^(iθ) and e^(−iθ); eigenvectors are (1, ±i). c, s = cos θ, sin θ.
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
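The "minimum cost at a corner" fact can be illustrated by brute-force enumeration of basic feasible solutions for a tiny LP; this sketches the corner idea, not the simplex pivoting rules, and the data are ours:

```python
# Each corner of {Ax = b, x >= 0} uses m basic variables (m = rows of A):
# solve for those, set the rest to zero, keep the feasible ones, take the
# cheapest. For min x1 + 2 x2 subject to x1 + x2 + x3 = 4, x >= 0, the
# corners are (4,0,0), (0,4,0), (0,0,4) with costs 4, 8, 0.
import numpy as np
from itertools import combinations

c = np.array([1.0, 2.0, 0.0])            # cost vector
A = np.array([[1.0, 1.0, 1.0]])          # equality constraints Ax = b
b = np.array([4.0])
m, n = A.shape

best_x, best_cost = None, np.inf
for basis in combinations(range(n), m):  # choose which variables are basic
    B = A[:, basis]
    if np.linalg.matrix_rank(B) < m:
        continue                          # not a corner: singular basis
    x = np.zeros(n)
    x[list(basis)] = np.linalg.solve(B, b)
    if np.all(x >= -1e-9) and c @ x < best_cost:   # feasible and cheaper
        best_x, best_cost = x, c @ x
```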
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.