10.7.1: Consider the step potential u(x) = U for 1 0). [Hints: (i) Since we...
10.7.2: Show that R² + T² = 1 for the delta function potential in the...
10.7.3: (a) Show that the Wronskian W(ψ₁, ψ₂) = ψ₁ψ₂′ − ψ₂ψ₁′ of two independent ...
10.7.4: Orthogonality conditions. (a) Show that the discrete eigenfunctions ...
10.7.5: (a) Show that a pole of the transmission coefficient in the upper h...
10.7.6: (a) Show that a pole of the transmission coefficient in the upper h...
Solutions for Chapter 10.7: Infinite Domain Problems: Fourier Transform Solutions of Partial Differential Equations
Full solutions for Applied Partial Differential Equations with Fourier Series and Boundary Value Problems, 5th Edition
ISBN: 9780321797056
Chapter 10.7 includes 6 full step-by-step solutions for the problems listed above.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = Aᵀ when edges go both ways (undirected).
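A minimal sketch of this definition in Python with numpy (numpy and the tiny edge list are assumptions for illustration, not part of the glossary):

import numpy as np

edges = [(0, 1), (1, 2), (2, 0)]           # hypothetical directed edges i -> j
n = 3
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1                            # aij = 1 when there is an edge from i to j
print(np.array_equal(A, A.T))              # True only if every edge goes both ways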

Cayley-Hamilton Theorem.
p(λ) = det(A − λI) has p(A) = zero matrix.
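A quick numerical check of the theorem (numpy assumed; the 2-by-2 matrix is a made-up example):

import numpy as np

A = np.array([[2.0, 1.0], [0.0, 3.0]])
c = np.poly(A)                             # coefficients of det(lambda*I - A)
n = A.shape[0]
pA = sum(ck * np.linalg.matrix_power(A, n - k) for k, ck in enumerate(c))
print(np.allclose(pA, 0))                  # True: p(A) is the zero matrix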

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
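To see the column picture concretely (numpy assumed; A and b are arbitrary sample data):

import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([5.0, 6.0])
x = np.linalg.solve(A, b)                  # solvable, so b is in C(A)
combo = x[0] * A[:, 0] + x[1] * A[:, 1]    # b as a combination of the columns
print(np.allclose(combo, b))               # True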

Complex conjugate
z̄ = a − ib for any complex number z = a + ib. Then zz̄ = |z|².
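A one-line check in plain Python (the value of z is arbitrary):

z = 3 + 4j
print(z * z.conjugate(), abs(z) ** 2)      # (25+0j) 25.0, so z zbar = |z|^2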

Diagonalization
Λ = S⁻¹AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
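A numerical sketch (numpy assumed; the matrix is an arbitrary diagonalizable example):

import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])     # eigenvalues 5 and 2
lam, S = np.linalg.eig(A)                  # S = eigenvector matrix
Lam = np.diag(lam)                         # Lambda = eigenvalue matrix
print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))    # Lambda = S^-1 A S
print(np.allclose(S @ Lam**3 @ np.linalg.inv(S),
                  np.linalg.matrix_power(A, 3)))     # A^k = S Lambda^k S^-1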

Ellipse (or ellipsoid) x T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A⁻¹y||² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)
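Checking the axis lengths 1/√λ numerically (numpy assumed; the positive definite matrix is a made-up example):

import numpy as np

A = np.array([[5.0, 4.0], [4.0, 5.0]])     # positive definite, eigenvalues 1 and 9
lam, V = np.linalg.eigh(A)
for i in range(2):
    x = V[:, i] / np.sqrt(lam[i])          # semi-axis endpoint of length 1/sqrt(lambda)
    print(np.isclose(x @ A @ x, 1.0))      # True: the endpoint lies on x^T A x = 1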

Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
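A short construction (scipy and the sample values are assumptions for illustration):

import numpy as np
from scipy.linalg import hankel

H = hankel([1, 2, 3], [3, 4, 5])           # first column, last row
vals = [1, 2, 3, 4, 5]
H2 = np.array([[vals[i + j] for j in range(3)] for i in range(3)])
print(np.array_equal(H, H2))               # True: entry (i, j) depends only on i + j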

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j−1)b. Numerical methods approximate A⁻¹b by xⱼ with residual b − Axⱼ in this subspace. A good basis for Kⱼ requires only multiplication by A at each step.
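A sketch of the idea (numpy assumed; the matrix, right-hand side, and helper name krylov_basis are all hypothetical):

import numpy as np

def krylov_basis(A, b, j):
    # Orthonormal basis for K_j(A, b) = span{b, Ab, ..., A^(j-1) b}
    K = np.empty((len(b), j))
    v = b
    for k in range(j):
        K[:, k] = v
        v = A @ v                          # one multiplication by A per step
    Q, _ = np.linalg.qr(K)
    return Q

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6)) + 6 * np.eye(6)
b = rng.standard_normal(6)
Q = krylov_basis(A, b, 3)
y, *_ = np.linalg.lstsq(A @ Q, b, rcond=None)   # minimize ||b - A(Qy)|| over K_3
x3 = Q @ y                                      # approximation to A^-1 b
print(np.linalg.norm(b - A @ x3))               # norm of the residual b - A x3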

Markov matrix M.
All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If all mij > 0, the columns of Mᵏ approach the steady-state eigenvector Ms = s > 0.
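A steady-state check (numpy assumed; M is a made-up 2-by-2 Markov matrix):

import numpy as np

M = np.array([[0.8, 0.3], [0.2, 0.7]])     # columns sum to 1, all entries positive
s = np.linalg.matrix_power(M, 50)[:, 0]    # columns of M^k approach the steady state
print(s, np.allclose(M @ s, s))            # Ms = s, the eigenvector for lambda = 1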

Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
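A worked example with a repeated eigenvalue (numpy assumed):

import numpy as np

A = np.eye(2)                 # eigenvalue 1 repeated: p(lambda) = (lambda - 1)^2
print(np.allclose(np.linalg.matrix_power(A - np.eye(2), 2), 0))   # p(A) = 0
print(np.allclose(A - np.eye(2), 0))   # m(A) = 0 already for m(lambda) = lambda - 1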

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓij| ≤ 1. See condition number.
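A sketch of elimination with partial pivoting (numpy assumed; lu_partial_pivot is a hypothetical helper, not a library routine):

import numpy as np

def lu_partial_pivot(A):
    # In each column pick the largest available pivot, then eliminate below it.
    A = A.astype(float).copy()
    n = A.shape[0]
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))        # largest available pivot
        A[[k, p]] = A[[p, k]]                      # row swap
        A[k+1:, k] /= A[k, k]                      # multipliers, all |l_ij| <= 1
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])
    return A                                       # L strictly below, U on/above

LU = lu_partial_pivot(np.array([[1e-12, 1.0], [1.0, 1.0]]))
print(np.abs(np.tril(LU, -1)).max() <= 1.0)        # True: multipliers stay bounded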

Particular solution x p.
Any solution to Ax = b; often xp has free variables = 0.
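One way to compute such an xp (numpy assumed; A and b are sample data with pivots in columns 0 and 2):

import numpy as np

A = np.array([[1.0, 2.0, 3.0], [0.0, 0.0, 1.0]])   # pivot columns: 0 and 2
b = np.array([6.0, 1.0])
xp = np.zeros(3)                                   # free variable (column 1) set to 0
xp[[0, 2]] = np.linalg.solve(A[:, [0, 2]], b)      # solve on the pivot columns only
print(A @ xp)                                      # [6. 1.], so xp solves Ax = b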

Pascal matrix
Ps = pascal(n) = the symmetric matrix with binomial entries C(i+j−2, i−1). Ps = PL PU; all of them contain Pascal's triangle and have det = 1 (see Pascal in the index).
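A quick check using scipy (assumed available):

import numpy as np
from scipy.linalg import pascal

Ps = pascal(4, kind='symmetric')                   # entries C(i+j-2, i-1)
PL = pascal(4, kind='lower')
PU = pascal(4, kind='upper')
print(np.array_equal(Ps, PL @ PU))                 # True: Ps = PL PU
print(round(np.linalg.det(Ps.astype(float))))      # 1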

Projection p = a(aᵀb/aᵀa) onto the line through a.
P = aaᵀ/aᵀa has rank 1.
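Both formulas in numpy (assumed; a and b are arbitrary sample vectors):

import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])
p = a * (a @ b) / (a @ a)                          # p = a (a^T b / a^T a)
P = np.outer(a, a) / (a @ a)                       # P = a a^T / a^T a
print(np.allclose(P @ b, p), np.linalg.matrix_rank(P))   # True 1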

Rank one matrix A = uvᵀ ≠ 0.
Column and row spaces = lines cu and cv.
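Seeing the column space as the line cu (numpy assumed; u and v are sample vectors):

import numpy as np

u, v = np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0])
A = np.outer(u, v)                                 # A = u v^T
print(np.linalg.matrix_rank(A))                    # 1
print(all(np.allclose(A[:, j], v[j] * u) for j in range(3)))   # each column is cu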

Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
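A check that row rank equals column rank (numpy assumed; the matrix is arbitrary):

import numpy as np

A = np.array([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [1.0, 0.0, 1.0]])
print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(A.T))    # 2 2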

Schwarz inequality
|v·w| ≤ ||v|| ||w||. Then |vᵀAw|² ≤ (vᵀAv)(wᵀAw) for positive definite A.
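Numerical verification of both inequalities (numpy assumed; the vectors are random and A is a made-up positive definite matrix):

import numpy as np

rng = np.random.default_rng(1)
v, w = rng.standard_normal(3), rng.standard_normal(3)
print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))     # |v.w| <= ||v|| ||w||
A = np.array([[2.0, 1.0, 0.0], [1.0, 2.0, 1.0], [0.0, 1.0, 2.0]])  # positive definite
print((v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w))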

Simplex method for linear programming.
The minimum-cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
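A tiny linear program solved with scipy's linprog (assumed available; modern scipy defaults to the HiGHS solvers rather than the classical simplex tableau, but the optimum still sits at a corner of the feasible set):

import numpy as np
from scipy.optimize import linprog

c = np.array([2.0, 3.0])                   # minimize cost c^T x
A_eq = np.array([[1.0, 1.0]])              # subject to Ax = b
b_eq = np.array([4.0])
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))   # and x >= 0
print(res.x, res.fun)                      # [4. 0.] 8.0: minimum cost at a corner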

Spectrum of A = the set of eigenvalues {λ₁, ..., λₙ}.
Spectral radius = max of |λᵢ|.
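A spectral-radius computation (numpy assumed; the matrix is a made-up example with complex eigenvalues):

import numpy as np

A = np.array([[0.0, 2.0], [-2.0, 0.0]])    # eigenvalues 2i and -2i
lam = np.linalg.eigvals(A)                 # the spectrum of A
print(np.max(np.abs(lam)))                 # 2.0 = spectral radius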

Transpose matrix Aᵀ.
Entries (Aᵀ)ij = Aji. Aᵀ is n by m, AᵀA is square, symmetric, positive semidefinite. The transposes of AB and A⁻¹ are BᵀAᵀ and (A⁻¹)ᵀ = (Aᵀ)⁻¹.
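Checking the shape and the two transpose rules (numpy assumed; A, B, M are sample matrices):

import numpy as np

A = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])   # 2 by 3, so A.T is 3 by 2
B = np.array([[1.0, 0.0], [2.0, 1.0], [0.0, 3.0]])
print(np.allclose((A @ B).T, B.T @ A.T))            # (AB)^T = B^T A^T
M = np.array([[2.0, 1.0], [1.0, 3.0]])
print(np.allclose(np.linalg.inv(M).T, np.linalg.inv(M.T)))   # (A^-1)^T = (A^T)^-1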