Solutions for Chapter 15.1: Numerical Solutions of Partial Differential Equations
Full solutions for Differential Equations with Boundary-Value Problems, 8th Edition
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).
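A minimal sketch of the complete solution, using an arbitrarily chosen rank-1 system (the matrix, right side, and nullspace vector below are illustrative assumptions, not from the glossary):

```python
import numpy as np

# A rank-1 system Ax = b whose complete solution is x = x_p + t * x_n:
# a particular solution plus any multiple of a nullspace vector.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])          # rank 1; nullspace spanned by (2, -1)
b = np.array([3.0, 6.0])            # b is in the column space, so Ax = b is solvable

x_p, *_ = np.linalg.lstsq(A, b, rcond=None)   # one particular solution
x_n = np.array([2.0, -1.0])                    # nullspace direction: A @ x_n = 0

# Every x_p + t * x_n solves Ax = b.
for t in (0.0, 1.5, -3.0):
    assert np.allclose(A @ (x_p + t * x_n), b)
```

`lstsq` is used only to produce one particular solution of the (consistent) singular system; any other particular solution would work equally well.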
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1) A S = Λ = eigenvalue matrix.
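A quick check of the diagonalization S^(-1) A S = Λ, with a small matrix chosen arbitrarily so that its two eigenvalues are distinct:

```python
import numpy as np

# Diagonalize a 2x2 matrix with distinct eigenvalues (5 and 2),
# then verify S^(-1) A S equals the eigenvalue matrix Lambda.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)      # columns of S are the eigenvectors
Lam = np.diag(eigvals)

assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)
```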
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)^T (column j of B).
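A small sketch of the complex dot product x̄^T y: NumPy's `vdot` conjugates its first argument, which matches the definition (the two vectors below are arbitrary examples):

```python
import numpy as np

# np.vdot(x, y) computes conj(x)^T y, the complex dot product.
x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 1j])

assert np.isclose(np.vdot(x, y), np.conj(x) @ y)
assert np.isclose(np.vdot(x, x).imag, 0.0)   # conj(x)^T x is real (= ||x||^2)
```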
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
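A sketch of the factorization using SciPy, on a matrix chosen so that partial pivoting performs no row exchanges (so the permutation returned by `lu` is the identity):

```python
import numpy as np
from scipy.linalg import lu

# scipy.linalg.lu returns A = P L U; here P = I because the
# largest pivot is already in place, so no rows are exchanged.
A = np.array([[4.0, 3.0],
              [2.0, 5.0]])
P, L, U = lu(A)

assert np.allclose(P, np.eye(2))          # no row exchanges
assert np.allclose(L, np.tril(L))         # multipliers sit below a unit diagonal
assert np.allclose(np.diag(L), 1.0)
assert np.allclose(P @ L @ U, A)          # L brings U back to A
```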
Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
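The ill-conditioning is easy to see numerically; this sketch uses SciPy's `hilbert` and checks the entries, positive definiteness, and the already-huge condition number at n = 5:

```python
import numpy as np
from scipy.linalg import hilbert

# hilbert(n) has entries 1/(i + j - 1) in 1-based indexing.
H = hilbert(5)

assert np.isclose(H[1, 2], 1.0 / (2 + 3 - 1))     # 1-based i = 2, j = 3
assert np.all(np.linalg.eigvalsh(H) > 0)          # positive definite
assert np.linalg.cond(H) > 1e5                    # ill-conditioned even at n = 5
```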
Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves length and angles: ‖Qx‖ = ‖x‖ and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
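A sketch of an orthonormal basis built by QR factorization (the starting matrix is random, an arbitrary choice), checking Q^T Q = I and the expansion v = Σ (v^T q_j) q_j:

```python
import numpy as np

# Build orthonormal columns with QR, then expand a vector in that basis.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))    # generic full-rank matrix
Q, _ = np.linalg.qr(A)

assert np.allclose(Q.T @ Q, np.eye(4))   # orthonormal columns

v = rng.standard_normal(4)
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(4))
assert np.allclose(expansion, v)         # v = sum of (v^T q_j) q_j
```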
Pascal matrix P_S = pascal(n).
The symmetric matrix with binomial entries C(i+j−2, i−1). P_S = P_L P_U all contain Pascal's triangle with det = 1 (see Pascal in the index).
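SciPy ships these matrices directly; this sketch checks the factorization P_S = P_L P_U, the binomial entries, and det = 1 for n = 5:

```python
import numpy as np
from math import comb
from scipy.linalg import pascal

# The three Pascal matrices: symmetric = lower * upper.
n = 5
PS = pascal(n, kind='symmetric')
PL = pascal(n, kind='lower')
PU = pascal(n, kind='upper')

assert np.array_equal(PL @ PU, PS)
# Entries C(i+j-2, i-1) in 1-based indexing = C(i+j, i) in 0-based.
assert all(PS[i, j] == comb(i + j, i) for i in range(n) for j in range(n))
assert round(np.linalg.det(PS.astype(float))) == 1
```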
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
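A minimal sketch: one particular permutation matrix (swapping the first two rows of I, an arbitrary example), showing that PA reorders rows and that one exchange gives det P = −1:

```python
import numpy as np

# Rows of I in the order (2, 1, 3): a single row exchange.
P = np.eye(3)[[1, 0, 2]]
A = np.arange(9.0).reshape(3, 3)

assert np.allclose(P @ A, A[[1, 0, 2]])   # PA puts rows of A in that order
assert round(np.linalg.det(P)) == -1      # one exchange: odd permutation
```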
Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.
Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.
Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
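A sketch with an arbitrarily chosen symmetric 2×2 matrix: random vectors keep q(x) between the extreme eigenvalues, and the eigenvectors attain them:

```python
import numpy as np

# Rayleigh quotient q(x) = x^T A x / x^T x for a symmetric A.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
lam, V = np.linalg.eigh(A)          # eigenvalues in ascending order

def q(x):
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(1)
for _ in range(100):
    x = rng.standard_normal(2)
    assert lam[0] - 1e-12 <= q(x) <= lam[-1] + 1e-12

assert np.isclose(q(V[:, 0]), lam[0])    # minimum at the first eigenvector
assert np.isclose(q(V[:, -1]), lam[-1])  # maximum at the last
```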
Right inverse A+.
If A has full row rank m, then A^+ = A^T (A A^T)^(-1) has A A^+ = I_m.
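A sketch with a full-row-rank 2×3 matrix (chosen arbitrarily): the formula gives a genuine right inverse, and it agrees with NumPy's pseudoinverse in this full-row-rank case:

```python
import numpy as np

# Right inverse A^+ = A^T (A A^T)^(-1) for full row rank m = 2 < n = 3.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])
A_plus = A.T @ np.linalg.inv(A @ A.T)

assert np.allclose(A @ A_plus, np.eye(2))       # A A^+ = I_m
assert np.allclose(A_plus, np.linalg.pinv(A))   # matches the pseudoinverse here
```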
Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.
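A hedged sketch of K = A^T C A for a line of two springs fixed at the top; the geometry (incidence matrix A) and spring constants in C are illustrative assumptions, not taken from the glossary:

```python
import numpy as np

# Two springs in a line, top end fixed: A maps node movements to
# spring stretches, C holds the Hooke's-law spring constants.
A = np.array([[1.0, 0.0],      # stretch of spring 1: x1 - 0 (fixed end)
              [-1.0, 1.0]])    # stretch of spring 2: x2 - x1
C = np.diag([3.0, 2.0])        # assumed spring constants
K = A.T @ C @ A                # stiffness matrix

x = np.array([0.5, 1.0])
assert np.allclose(K @ x, A.T @ C @ (A @ x))   # internal forces K x
assert np.allclose(K, K.T)                      # K is symmetric
assert np.all(np.linalg.eigvalsh(K) > 0)        # positive definite (one end fixed)
```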
Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t − k).
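A hedged sketch using the Haar mother wavelet as w_00 (an assumption; the definition w_jk(t) = w_00(2^j t − k) works for any mother wavelet):

```python
import numpy as np

# Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere.
def w00(t):
    t = np.asarray(t, dtype=float)
    return np.where((0 <= t) & (t < 0.5), 1.0,
                    np.where((0.5 <= t) & (t < 1), -1.0, 0.0))

def w(j, k, t):
    return w00(2.0**j * t - k)     # stretch by 2^j, shift by k

t = np.linspace(0, 1, 8, endpoint=False)
assert np.allclose(w(0, 0, t), w00(t))
# w(1, 1, t) lives on [1/2, 1), so it vanishes on the first half of [0, 1).
assert np.allclose(w(1, 1, t[:4]), 0.0)
```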