- 4.7.1: Draw the phase portraits of each of the following systems of differ...
- 4.7.2: Draw the phase portraits of each of the following systems of differ...
- 4.7.3: Draw the phase portraits of each of the following systems of differ...
- 4.7.4: Draw the phase portraits of each of the following systems of differ...
- 4.7.5: Draw the phase portraits of each of the following systems of differ...
- 4.7.6: Draw the phase portraits of each of the following systems of differ...
- 4.7.7: Draw the phase portraits of each of the following systems of differ...
- 4.7.8: Draw the phase portraits of each of the following systems of differ...
- 4.7.9: Draw the phase portraits of each of the following systems of differ...
- 4.7.10: Show that every orbit of the given system is an ellipse.
- 4.7.11: The equation of motion of a spring-mass system with damping (see Se...
- 4.7.12: Suppose that a 2 x 2 matrix A has 2 linearly independent eigenvecto...
- 4.7.13: This problem illustrates Theorem 4. Consider the system (a) Show th...
- 4.7.14: Verify Equation (6). Hint: The expression a cos ωt + b sin ωt can always...
Solutions for Chapter 4.7: Phase portraits of linear systems
Full solutions for Differential Equations and Their Applications: An Introduction to Applied Mathematics | 3rd Edition
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
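For instance, a quick NumPy check (the matrices are illustrative assumptions; B is built as a polynomial in A, so it automatically commutes with A):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = A @ A - 3 * np.eye(2)            # any polynomial in A commutes with A
print(np.allclose(A @ B, B @ A))     # True

S = np.array([[1.0, 1.0], [1.0, -1.0]])   # shared eigenvectors (1,1), (1,-1)
print(np.linalg.inv(S) @ A @ S)      # diagonal
print(np.linalg.inv(S) @ B @ S)      # diagonal
```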
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
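A minimal NumPy sketch of the iteration, assuming an illustrative positive definite system (the matrix, right side, and tolerance are not from the text):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve Ax = b for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual
    p = r.copy()             # first search direction
    rs_old = r @ r
    for _ in range(len(b)):  # at most n steps in exact arithmetic
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # exact minimizer along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p    # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # matches np.linalg.solve(A, b)
```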
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
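A direct NumPy translation of the rule (the 2 x 2 system is an illustrative assumption):

```python
import numpy as np

def cramer(A, b):
    """x_j = det(B_j) / det(A), where B_j is A with column j replaced by b."""
    det_A = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        B_j = A.copy()
        B_j[:, j] = b       # build B_j
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer(A, b))             # [0.8 1.4]
print(np.linalg.solve(A, b))    # same answer
```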
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B|, |A^T| = |A|, and |A^{-1}| = 1/|A|.
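A quick numerical spot-check of these product rules on random matrices (illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))  # True
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))                       # True
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A)))      # True
```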
Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.
Elimination matrix = Elementary matrix E_ij.
The identity matrix with an extra -e_ij in the i, j entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
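A small NumPy illustration (the matrix and multiplier are assumptions):

```python
import numpy as np

def elimination_matrix(n, i, j, e):
    """Identity with an extra -e in entry (i, j)."""
    E = np.eye(n)
    E[i, j] = -e
    return E

A = np.array([[2.0, 1.0], [4.0, 5.0]])
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * (row 0) from row 1
print(E @ A)                           # [[2. 1.] [0. 3.]]
```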
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
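An illustrative NumPy check of both claims (the example matrix is an assumption):

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])            # 2 x 3 with independent rows
print(np.linalg.matrix_rank(A))            # r = m = 2: full row rank

b = np.array([3.0, 4.0])                   # any b in R^2 is reachable
x = np.linalg.pinv(A) @ b                  # one particular solution
print(np.allclose(A @ x, b))               # True: at least one solution
```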
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
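A classical Gram-Schmidt sketch in NumPy (the input matrix is an illustrative assumption; in practice np.linalg.qr is preferred):

```python
import numpy as np

def gram_schmidt_qr(A):
    """A = QR with orthonormal columns in Q and upper triangular R."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component along earlier q_i
            v -= R[i, j] * Q[:, i]        # subtract it off
        R[j, j] = np.linalg.norm(v)       # convention: diag(R) > 0
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q @ R, A))              # True
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: orthonormal columns
```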
Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
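An illustrative check with np.kron (the example matrices are assumptions):

```python
import numpy as np

A = np.array([[1.0, 2.0], [0.0, 3.0]])   # eigenvalues 1, 3
B = np.array([[2.0, 0.0], [1.0, 1.0]])   # eigenvalues 2, 1
K = np.kron(A, B)                         # 4 x 4 matrix of blocks a_ij * B

# Eigenvalues of A ⊗ B are all products λ_p(A) * λ_q(B)
products = np.outer(np.linalg.eigvals(A), np.linalg.eigvals(B)).ravel()
print(np.allclose(sorted(np.linalg.eigvals(K).real),
                  sorted(products.real)))   # True
```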
Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^{-1} A^T has A^+ A = I_n.
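A short NumPy demonstration (the tall matrix is an illustrative assumption):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                   # full column rank n = 2
A_plus = np.linalg.inv(A.T @ A) @ A.T        # left inverse (A^T A)^{-1} A^T
print(np.allclose(A_plus @ A, np.eye(2)))    # True: A+ A = I_n
# A @ A_plus is only a projection onto the column space, not I_m
```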
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
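Two illustrative 2 x 2 cases showing when m(λ) has lower degree than p(λ):

```python
import numpy as np

A = 2 * np.eye(2)                            # p(λ) = (λ - 2)^2, but m(λ) = λ - 2
print(np.allclose(A - 2 * np.eye(2), 0))     # True: degree 1 already kills A

J = np.array([[2.0, 1.0], [0.0, 2.0]])       # same p(λ), only one eigenvector
M = J - 2 * np.eye(2)
print(np.allclose(M, 0), np.allclose(M @ M, 0))   # False True: here m = p
```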
Nilpotent matrix N.
Some power of N is the zero matrix: N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
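A quick NumPy example (the strictly triangular matrix is an assumption):

```python
import numpy as np

N = np.triu(np.ones((3, 3)), k=1)   # zero diagonal, strictly upper triangular
print(N @ N @ N)                    # N^3 = zero matrix
print(np.linalg.eigvals(N))         # all eigenvalues are 0
```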
Normal equation A^T A x = A^T b.
Gives the least-squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - Ax) = 0.
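An illustrative least-squares fit via the normal equation (the data points are assumptions, not from the text):

```python
import numpy as np

t = np.array([0.0, 1.0, 2.0])                # fit b ≈ c + d t
b = np.array([1.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])    # full column rank

x = np.linalg.solve(A.T @ A, A.T @ b)        # normal equation A^T A x = A^T b
print(x)                                     # least squares coefficients c, d
print(A.T @ (b - A @ x))                     # ~ 0: residual ⊥ columns of A
```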
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal for randn.
Rank one matrix A = u v^T ≠ 0.
Column and row spaces = lines cu and cv.
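A small NumPy illustration (u and v are assumptions):

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0]])
A = u @ v.T                          # 3 x 2 rank-one matrix
print(np.linalg.matrix_rank(A))      # 1
print(A)                             # every column is a multiple of u
```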
Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.
Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
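An illustrative numerical check of the bounds (the matrix and random samples are assumptions):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # symmetric, eigenvalues 1 and 3

def q(x):
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(1)
samples = [q(rng.standard_normal(2)) for _ in range(1000)]
print(min(samples), max(samples))        # stays inside [1, 3]
print(q(np.array([1.0, 1.0])))           # eigenvector for λ_max: exactly 3.0
```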
Rotation matrix R.
R = [c -s; s c] rotates the plane by θ, and R^{-1} = R^T rotates back by -θ. Eigenvalues are e^{iθ} and e^{-iθ}; eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
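A quick NumPy check of these properties (θ = π/3 is an assumption):

```python
import numpy as np

theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s], [s, c]])

print(np.allclose(R.T @ R, np.eye(2)))    # True: R^{-1} = R^T
ev = sorted(np.linalg.eigvals(R), key=lambda z: z.imag)
print(np.allclose(ev, [np.exp(-1j * theta), np.exp(1j * theta)]))   # True
```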
Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
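A sketch assuming the Haar wavelet for w_00 (the glossary does not fix w_00, so that choice is an assumption):

```python
import numpy as np

def w00(t):
    """Haar mother wavelet (assumed): +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
    return np.where((t >= 0) & (t < 0.5), 1.0,
                    np.where((t >= 0.5) & (t < 1.0), -1.0, 0.0))

def w(j, k, t):
    """Stretched and shifted copy: w_jk(t) = w00(2^j t - k)."""
    return w00(2.0 ** j * t - k)

t = np.linspace(0, 1, 9)
print(w(1, 0, t))   # compressed into [0, 1/2)
print(w(1, 1, t))   # same shape shifted into [1/2, 1)
```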