9.3.1: The Green's function for (9.3.1) is given explicitly by (9.3.16). Th...
9.3.2: (a) Derive (9.3.17). (b) Integrate (9.3.17) by parts to derive (9.3...
9.3.3: Consider ∂u/∂t = k ∂²u/∂x² + Q(x, t) subject to u(0, t) = 0, ∂u/∂x(L, t)...
9.3.4: Consider ∂u/∂t = k ∂²u/∂x² + Q(x, t) subject to u(0, t) = 0, ∂u/∂x(L, t)...
9.3.5: Consider d²u/dx² = f(x) with u(0) = 0 and du/dx(L) = 0. *(a) Solve by...
9.3.6: Consider d²G/dx² = δ(x − x₀) with G(0, x₀) = 0 and dG/dx(L, x₀) = 0. *(...
9.3.7: Redo Exercise 9.3.5 with the following change: du/dx(L) + hu(L) = 0, ...
9.3.8: Redo Exercise 9.3.6 with the following change: dG/dx(L, x₀) + hG(L, x₀) = 0, ...
9.3.9: Consider d²u/dx² + u = f(x) with u(0) = 0 and u(L) = 0. Assume that (...
9.3.10: Redo Exercise 9.3.9 using the method of eigenfunction expansion.
9.3.11: Consider d²G/dx² + G = δ(x − x₀) with G(0, x₀) = 0 and G(L, x₀) = 0. *(a...
9.3.12: For the following problems, determine a representation of the solut...
9.3.13: Consider the one-dimensional infinite-space wave equation with a pe...
9.3.14: Consider L(u) = f(x) with L = d/dx(p d/dx) + q. Assume that the appr...
9.3.15: Consider L(G) = δ(x − x₀) with L = d/dx(p d/dx) + q subject to the bounda...
9.3.16: Reconsider (9.3.41), whose solution we have obtained, (9.3.46). For...
9.3.17: ...reduces to (9.3.46) for (9.3.41). Consider L(u) = f(x) with...
9.3.18: Reconsider Exercise 9.3.17. Determine u(x) by the method of eigenfu...
9.3.19: (a) If a concentrated source is placed at a node of some mode (eige...
9.3.20: Derive the eigenfunction expansion of the Green's function (9.3.23) ...
9.3.21: Solve dG/dx = δ(x − x₀) with G(0, x₀) = 0. Show that G(x, x₀) is not sym...
9.3.22: Solve dG/dx + G = δ(x − x₀) with G(0, x₀) = 0. Show that G(x, x₀) is not...
9.3.23: Solve d⁴G/dx⁴ = δ(x − x₀) with G(0, x₀) = 0, G(L, x₀) = 0, dG/dx(0, x₀) = 0, d²G...
9.3.24: Use Exercise 9.3.23 to solve d⁴u/dx⁴ = f(x) with u(0) = 0, u(L) = 0, du/dx(...
9.3.25: Use the convolution theorem for Laplace transforms to obtain partic...
9.3.26: Determine the Green's function satisfying d²G/dx² − G = δ(x − x₀): (a) Di...
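Several of these exercises (for instance 9.3.5 and 9.3.6) build the Green's function for d²u/dx² = f(x) with u(0) = 0 and du/dx(L) = 0. For this pair of boundary conditions the standard derivation (assumed here, not quoted from the text) gives G(x, x₀) = −min(x, x₀), and the solution is u(x) = ∫₀ᴸ G(x, x₀) f(x₀) dx₀. A minimal numerical check with NumPy, using the test choice f(x) = sin x so the exact solution u(x) = x cos L − sin x is available for comparison:

```python
import numpy as np

L = 2.0
N = 2001
x = np.linspace(0.0, L, N)
dx = x[1] - x[0]

f = np.sin(x)                        # test right-hand side f(x) = sin x

# Green's function for u'' = f, u(0) = 0, u'(L) = 0: G(x, x0) = -min(x, x0)
G = -np.minimum.outer(x, x)          # G[i, j] = -min(x_i, x_j)

# u(x) = integral of G(x, x0) f(x0) dx0, approximated by the trapezoid rule
w = np.full(N, dx)
w[0] = w[-1] = dx / 2.0
u = (G * f) @ w

u_exact = x * np.cos(L) - np.sin(x)  # exact solution of u'' = sin x with these BCs
err = np.max(np.abs(u - u_exact))
```

The kink of G at x₀ = x always falls on a grid node, so the trapezoid rule retains second-order accuracy here.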
Solutions for Chapter 9.3: Green's Functions for Time-Independent Problems
Full solutions for Applied Partial Differential Equations with Fourier Series and Boundary Value Problems, 5th Edition
ISBN: 9780321797056

Cross product u × v in R³:
Vector perpendicular to u and v, length ‖u‖‖v‖ |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
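As an illustration (not part of the glossary), the perpendicularity and area properties of the cross product can be checked numerically with NumPy:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 0.0, -1.0])

w = np.cross(u, v)            # u x v, perpendicular to both u and v
perp_u = np.dot(w, u)         # should be 0
perp_v = np.dot(w, v)         # should be 0

# |u x v| = |u| |v| |sin(theta)| = area of the parallelogram spanned by u, v
area = np.linalg.norm(w)
cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
area_check = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1.0 - cos_t**2)
```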

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |Aᵀ| = |A|.
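A quick numerical sketch of these determinant properties (illustrative only, using NumPy):

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])
B = np.array([[1.0, 4.0], [2.0, 9.0]])

det_A = np.linalg.det(A)               # 2*3 - 1*5 = 1
det_B = np.linalg.det(B)
det_product = np.linalg.det(A @ B)     # product rule: |AB| = |A||B|

singular = np.array([[1.0, 2.0], [2.0, 4.0]])   # dependent rows
det_singular = np.linalg.det(singular)          # 0 for a singular matrix
```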

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S⁻¹AS = Λ = eigenvalue matrix.
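A small worked example of diagonalization (illustrative, with NumPy; the 2×2 matrix below is an arbitrary choice with two different eigenvalues, 2 and 5):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # distinct eigenvalues => diagonalizable

eigvals, S = np.linalg.eig(A)            # columns of S are the eigenvectors
Lam = np.diag(eigvals)                   # eigenvalue matrix Lambda

# S^{-1} A S = Lambda when A is diagonalizable
recovered = np.linalg.inv(S) @ A @ S
```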

Dot product = Inner product xᵀy = x1y1 + ... + xnyn.
Complex dot product is x̄ᵀy. Perpendicular vectors have xᵀy = 0. (AB)ij = (row i of A) · (column j of B).

Factorization A = LU.
If elimination takes A to U without row exchanges, then the lower triangular L with multipliers lij (and lii = 1) brings U back to A.
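The multiplier description above can be sketched as a short Doolittle-style elimination (an illustrative implementation, assuming no row exchanges are needed):

```python
import numpy as np

def lu_no_pivot(A):
    """A = L U with unit diagonal in L; assumes every pivot is nonzero."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]    # multiplier l_ik
            U[i, :] -= L[i, k] * U[k, :]   # eliminate below the pivot
    return L, U

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])
L, U = lu_no_pivot(A)   # multiplying L @ U brings U back to A
```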

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Free variable xi.
Column i has no pivot in elimination. We can give the n − r free variables any values; then Ax = b determines the r pivot variables (if solvable!).

Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column qj of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
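A classical Gram-Schmidt sketch that follows the convention diag(R) > 0 (illustrative NumPy code, not the only way to compute a QR factorization):

```python
import numpy as np

def gram_schmidt_qr(A):
    """A = QR by classical Gram-Schmidt: orthonormal Q, upper triangular R, diag(R) > 0."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component of column j along earlier q_i
            v -= R[i, j] * Q[:, i]        # subtract it off
        R[j, j] = np.linalg.norm(v)       # positive by convention
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = gram_schmidt_qr(A)
```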

Particular solution xp.
Any solution to Ax = b; often xp has free variables = 0.

Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
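One standard way to obtain the polar factors is from the SVD (an illustrative sketch: if A = UΣVᵀ, then Q = UVᵀ and H = VΣVᵀ):

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # arbitrary square test matrix

U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                       # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt       # symmetric positive semidefinite factor

polar_check = Q @ H              # should reproduce A
```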

Semidefinite matrix A.
(Positive) semidefinite: all xᵀAx ≥ 0, all λ ≥ 0; A = any RᵀR.
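The "A = any RᵀR" characterization can be checked numerically (illustrative; R below is an arbitrary 2×3 test matrix, so RᵀR is 3×3 with rank at most 2):

```python
import numpy as np

R = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
A = R.T @ R                      # R^T R is always positive semidefinite

eigs = np.linalg.eigvalsh(A)     # all eigenvalues >= 0
x = np.array([1.0, -2.0, 3.0])
quad = x @ A @ x                 # x^T A x = |Rx|^2 >= 0
```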

Singular Value Decomposition (SVD) A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal).
First r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Avi = σiui and singular value σi > 0. Last columns are orthonormal bases of the nullspaces.
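The relation Avi = σiui can be verified directly (illustrative NumPy sketch; note that np.linalg.svd returns the singular values in decreasing order and Vᵀ rather than V):

```python
import numpy as np

A = np.array([[3.0, 0.0], [4.0, 5.0]])

U, s, Vt = np.linalg.svd(A)      # A = U diag(s) Vt

# A v_i = sigma_i u_i for the leading singular pair
v1 = Vt[0]                       # first right singular vector (row of Vt)
residual = A @ v1 - s[0] * U[:, 0]
```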

Spectral Theorem A = QΛQᵀ.
Real symmetric A has real λ's and orthonormal q's.
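A minimal check of the spectral theorem (illustrative; np.linalg.eigh is NumPy's routine for symmetric/Hermitian matrices and returns eigenvalues in ascending order):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])   # real symmetric test matrix

lam, Q = np.linalg.eigh(A)               # real eigenvalues, orthonormal columns in Q
reconstructed = Q @ np.diag(lam) @ Q.T   # A = Q Lambda Q^T
```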

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

Triangle inequality ‖u + v‖ ≤ ‖u‖ + ‖v‖.
For matrix norms, ‖A + B‖ ≤ ‖A‖ + ‖B‖.

Vandermonde matrix V.
Vc = b gives coefficients of p(x) = c0 + ... + cn−1 xⁿ⁻¹ with p(xi) = bi. Vij = (xi)ʲ⁻¹ and det V = product of (xk − xi) for k > i.
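The Vandermonde interpolation system and the determinant product formula can be illustrated with NumPy (np.vander with increasing=True gives Vij = xiʲ in increasing powers, matching the 1-indexed (xi)ʲ⁻¹ above):

```python
import numpy as np

xs = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])    # target values p(x_i) = b_i

V = np.vander(xs, increasing=True)   # rows [1, x_i, x_i^2]
c = np.linalg.solve(V, b)            # coefficients of c0 + c1 x + c2 x^2

# det V = product of (x_k - x_i) for k > i
det_formula = (xs[1] - xs[0]) * (xs[2] - xs[0]) * (xs[2] - xs[1])
```

Here the interpolating polynomial through (0, 1), (1, 3), (2, 7) is p(x) = 1 + x + x².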