14.9.1: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.2: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.3: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.4: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.5: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.6: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.7: In all of the following exercises, assume 0 < ε ≪ 1. In Exercises 14...
14.9.8: In Exercises 14.9.8–14.9.10, use the method of multiply scaled varia...
14.9.9: In Exercises 14.9.8–14.9.10, use the method of multiply scaled varia...
14.9.10: In Exercises 14.9.8–14.9.10, use the method of multiply scaled varia...
14.9.11: Because the reduced wave equation (14.9.49) is linear, we can simpl...
Solutions for Chapter 14.9: Dispersive Waves: Slow Variations, Stability, Nonlinearity, and Perturbation Methods
Full solutions for Applied Partial Differential Equations with Fourier Series and Boundary Value Problems  5th Edition
ISBN: 9780321797056

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
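The definition above can be sketched in NumPy for a small made-up graph (the 3-node edge list below is an illustrative assumption, not from the text):

```python
import numpy as np

# Hypothetical 3-node graph: directed edges 0->1 and 1->2,
# plus the undirected pair 0<->2 stored as two directed edges.
edges = [(0, 1), (1, 2), (0, 2), (2, 0)]
n = 3
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1          # aij = 1 when there is an edge from node i to node j

# A = AT only when every edge goes both ways; here 0->1 has no reverse edge.
undirected = np.array_equal(A, A.T)
```

Because edge 0→1 has no partner 1→0, `undirected` comes out False for this graph.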

Cholesky factorization
A = CTC = (L√D)(L√D)T for positive definite A.
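A minimal NumPy check of this factorization (the matrix A below is a made-up positive definite example; note `np.linalg.cholesky` returns the lower-triangular factor L with A = LLT, so the glossary's C is its transpose):

```python
import numpy as np

# B^T B + I is always positive definite (B is an arbitrary example matrix).
B = np.array([[1.0, 2.0], [3.0, 4.0]])
A = B.T @ B + np.eye(2)

L = np.linalg.cholesky(A)   # lower triangular, A = L @ L.T
C = L.T                     # upper triangular, so A = C^T C as in the glossary

ok = np.allclose(C.T @ C, A)
```

Cholesky only succeeds for positive definite A; for an indefinite matrix `np.linalg.cholesky` raises `LinAlgError`.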

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0I + c1S + ··· + cn−1Sn−1. Cx = convolution c * x. Eigenvectors are in the Fourier matrix F.
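A small NumPy sketch of both facts, building C from powers of the shift S and checking Cx against a cyclic convolution via the FFT (n = 4 and the coefficients c are arbitrary choices for the demo):

```python
import numpy as np

# Cyclic shift S for n = 4: (S @ x)[i] = x[(i-1) % n].
n = 4
S = np.roll(np.eye(n), 1, axis=0)

c = np.array([2.0, -1.0, 0.0, 3.0])  # arbitrary coefficients c0..c3
# C = c0*I + c1*S + c2*S^2 + c3*S^3
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([1.0, 0.0, 1.0, 0.0])
# Cyclic convolution c * x computed in the Fourier domain.
circ_conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
ok = np.allclose(C @ x, circ_conv)
```

The FFT route works precisely because the DFT matrix F diagonalizes every circulant, which is the "eigenvectors in F" remark.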

Condition number
cond(A) = c(A) = ||A|| ||A−1|| = σmax/σmin. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to change in the input.
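A quick NumPy illustration (the nearly singular 2×2 matrix is a made-up example): `np.linalg.cond` uses the 2-norm by default, so it should agree with the singular-value ratio σmax/σmin.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0001]])          # nearly singular, so badly conditioned
kappa = np.linalg.cond(A)              # defaults to the 2-norm: sigma_max / sigma_min

s = np.linalg.svd(A, compute_uv=False) # singular values, descending
same = np.isclose(kappa, s[0] / s[-1])
```

With this matrix the condition number is on the order of 10^4, so a tiny relative change in b can be amplified by that factor in x.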

Covariance matrix Σ.
When random variables xi have mean = average value = 0, their covariances Σij are the averages of xixj. With means x̄i, the matrix Σ = mean of (x − x̄)(x − x̄)T is positive (semi)definite; Σ is diagonal if the xi are independent.

Exponential eAt = I + At + (At)2/2! + ···
has derivative AeAt; eAt u(O) solves u' = Au.
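A minimal sketch of the series, checked against a case with a known closed form (A = [[0, 1], [−1, 0]] satisfies A² = −I, so eAt is the rotation-type matrix [[cos t, sin t], [−sin t, cos t]]; the 30-term cutoff is an assumption that is ample for ‖At‖ = 1):

```python
import math
import numpy as np

def expm_series(A, t, terms=30):
    """Partial sum of e^{At} = I + At + (At)^2/2! + ..."""
    At = A * t
    term = np.eye(A.shape[0])   # current term (At)^k / k!
    total = term.copy()
    for k in range(1, terms):
        term = term @ At / k
        total += term
    return total

A = np.array([[0.0, 1.0], [-1.0, 0.0]])
t = 1.0
E = expm_series(A, t)
expected = np.array([[math.cos(t), math.sin(t)],
                     [-math.sin(t), math.cos(t)]])
ok = np.allclose(E, expected)
```

In practice one would use `scipy.linalg.expm` rather than the raw series, which can lose accuracy for large ‖At‖.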

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn−1 + Fn−2 = (λ1n − λ2n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
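A short NumPy check of both formulas, using n = 10 (F10 = 55) as the test case: powers of the Fibonacci matrix carry Fn in their off-diagonal entries, and the Binet formula reproduces the same number.

```python
import numpy as np

Fib = np.array([[1, 1], [1, 0]])
M = np.linalg.matrix_power(Fib, 10)    # M = [[F11, F10], [F10, F9]]
F10 = M[0, 1]

# Binet formula F_n = (lam1^n - lam2^n) / (lam1 - lam2).
lam1 = (1 + 5 ** 0.5) / 2              # golden ratio
lam2 = (1 - 5 ** 0.5) / 2
binet = round((lam1 ** 10 - lam2 ** 10) / (lam1 - lam2))

growth = np.abs(np.linalg.eigvals(Fib)).max()   # largest eigenvalue = lam1
```

For large n the λ2 term dies out, which is why the growth rate is λ1 alone.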

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of Rm. Full rank means full column rank or full row rank.

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

Normal matrix.
If NNT = NTN, then N has orthonormal (complex) eigenvectors.

Pascal matrix
PS = pascal(n) = the symmetric matrix with binomial entries C(i + j − 2, i − 1). PS = PLPU; all contain Pascal's triangle, with det = 1 (see Pascal in the index).
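A NumPy sketch of the symmetric Pascal matrix and its triangular factorization (built directly from binomial coefficients rather than MATLAB's `pascal(n)`; n = 5 is an arbitrary size, and the code uses 0-based indices, so entry (i, j) is C(i + j, i), matching C(i + j − 2, i − 1) in 1-based notation):

```python
import numpy as np
from math import comb

n = 5
# Symmetric Pascal matrix: PS[i, j] = C(i + j, i) with 0-based i, j.
PS = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)])

# Lower-triangular Pascal factor PL[i, j] = C(i, j); the upper factor is PL^T.
PL = np.array([[comb(i, j) for j in range(n)] for i in range(n)])

factored = np.array_equal(PS, PL @ PL.T)       # PS = PL PU with PU = PL^T
det_one = round(np.linalg.det(PS)) == 1        # det = 1 since det(PL) = 1
```

The factorization identity is Vandermonde's convolution: Σk C(i, k)C(j, k) = C(i + j, i).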

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0 1] for rand and standard normal distribution for randn.

Rayleigh quotient q(x) = xTAx / xTx for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
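A NumPy check of the bounds and the extremes (the symmetric 4×4 matrix is a random illustrative example; `np.linalg.eigh` returns eigenvalues in ascending order, which the code relies on):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # an arbitrary symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)   # ascending eigenvalues, orthonormal columns

def rayleigh(x):
    return (x @ A @ x) / (x @ x)

x = rng.standard_normal(4)             # any nonzero test vector
q = rayleigh(x)
within_bounds = eigvals[0] - 1e-12 <= q <= eigvals[-1] + 1e-12

# Plugging in the extreme eigenvectors recovers lambda_min and lambda_max.
extremes = (np.isclose(rayleigh(eigvecs[:, 0]), eigvals[0])
            and np.isclose(rayleigh(eigvecs[:, -1]), eigvals[-1]))
```

This is the basis of variational eigenvalue methods: minimizing (or maximizing) q(x) over unit vectors finds the extreme eigenvalues.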

Row picture of Ax = b.
Each equation gives a plane in Rn; the planes intersect at x.

Special solutions to As = O.
One free variable is si = 1, other free variables = 0.

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = ATCA where C has spring constants from Hooke's Law and Ax = stretching.
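A minimal sketch of K = ATCA for a hypothetical line of two nodes held by three springs (wall–node 0, node 0–node 1, node 1–wall); the spring constants and displacements below are assumed values for illustration:

```python
import numpy as np

# A maps node movements x (length 2) to the three spring stretches.
A = np.array([[ 1.0,  0.0],    # stretch of the left wall spring
              [-1.0,  1.0],    # stretch of the middle spring
              [ 0.0, -1.0]])   # stretch of the right wall spring
C = np.diag([100.0, 50.0, 100.0])   # Hooke's-law constants (assumed)

K = A.T @ C @ A                 # stiffness matrix
x = np.array([0.01, -0.02])     # node movements
f = K @ x                       # internal forces at the nodes

symmetric = np.allclose(K, K.T)              # ATCA is always symmetric
pos_def = np.all(np.linalg.eigvalsh(K) > 0)  # positive definite: walls pin the structure
```

If both wall springs were removed, K would become singular (a rigid-body translation would cost no energy), which is why boundary supports matter.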

Symmetric factorizations A = LDLT and A = QΛQT.
Signs in Λ = signs in D.

Symmetric matrix A.
The transpose is AT = A, and aij = aji. A−1 is also symmetric.

Volume of box.
The rows (or the columns) of A generate a box with volume I det(A) I.
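A quick NumPy illustration (the 3×3 matrix is a made-up example chosen so the volume is easy to see: a sheared box over a 2 × 3 base with height 4):

```python
import numpy as np

# Rows of A generate a parallelepiped ("box"); its volume is |det A|.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 4.0]])      # lower triangular: det = 2 * 3 * 4

volume = abs(np.linalg.det(A))
# det(A) = det(A^T), so rows and columns generate boxes of equal volume.
same_for_columns = np.isclose(volume, abs(np.linalg.det(A.T)))
```

Shearing the top row by the (1, 1, 0) offsets does not change the volume, which is exactly the row-operation invariance of the determinant.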