 8.1.1: In 1 through 10, find a power series solution of the given differen...
 8.1.2: In 1 through 10, find a power series solution of the given differen...
 8.1.3: In 1 through 10, find a power series solution of the given differen...
 8.1.4: In 1 through 10, find a power series solution of the given differen...
 8.1.5: In 1 through 10, find a power series solution of the given differen...
 8.1.6: In 1 through 10, find a power series solution of the given differen...
 8.1.7: In 1 through 10, find a power series solution of the given differen...
 8.1.8: In 1 through 10, find a power series solution of the given differen...
 8.1.9: In 1 through 10, find a power series solution of the given differen...
 8.1.10: In 1 through 10, find a power series solution of the given differen...
 8.1.11: In 11 through 14, use the method of Example 4 to find two linearly ...
 8.1.12: In 11 through 14, use the method of Example 4 to find two linearly ...
 8.1.13: In 11 through 14, use the method of Example 4 to find two linearly ...
 8.1.14: In 11 through 14, use the method of Example 4 to find two linearly ...
 8.1.15: Show (as in Example 3) that the power series method fails to yield ...
 8.1.16: Show (as in Example 3) that the power series method fails to yield ...
 8.1.17: Show (as in Example 3) that the power series method fails to yield ...
 8.1.18: Show (as in Example 3) that the power series method fails to yield ...
 8.1.19: In 19 through 22, first derive a recurrence relation giving cn for ...
 8.1.20: In 19 through 22, first derive a recurrence relation giving cn for ...
 8.1.21: In 19 through 22, first derive a recurrence relation giving cn for ...
 8.1.22: In 19 through 22, first derive a recurrence relation giving cn for ...
 8.1.23: Show that the equation x^2 y'' + x^2 y' + y = 0 has no power series sol...
 8.1.24: Establish the binomial series in (12) by means of the following ste...
 8.1.25: For the initial value problem y'' = y' + y; y(0) = 0, y(1) = 1 deri...
 8.1.26: (a) Show that the solution of the initial value problem y' = 1 + y^2...
 8.1.27: This section introduces the use of infinite series to solve differe...
Solutions for Chapter 8.1: Introduction and Review of Power Series
Full solutions for Differential Equations and Boundary Value Problems: Computing and Modeling, 5th Edition
ISBN: 9780321796981
Chapter 8.1: Introduction and Review of Power Series includes 27 full step-by-step solutions for this textbook (ISBN 9780321796981).

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
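A minimal NumPy sketch of this definition (the 3-node directed graph and its edge list are hypothetical examples; NumPy is assumed available):

```python
import numpy as np

# Hypothetical directed graph on 3 nodes: edges 0->1, 1->2, 2->0
edges = [(0, 1), (1, 2), (2, 0)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1          # aij = 1 when there is an edge from i to j

print(A)
# A == A^T only when every edge goes both ways (undirected graph);
# this directed cycle is not symmetric.
print(np.array_equal(A, A.T))
```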

Complex conjugate z̄.
z̄ = a − ib for any complex number z = a + ib. Then z z̄ = |z|^2.

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
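This can be checked numerically; a sketch with NumPy (assumed available) on a hypothetical sample of two independent variables:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))      # 1000 samples of two random variables
Xc = X - X.mean(axis=0)             # subtract the means (x - x_bar)
Sigma = (Xc.T @ Xc) / len(X)        # mean of (x - x_bar)(x - x_bar)^T

print(np.allclose(Sigma, Sigma.T))                     # symmetric
print(np.linalg.eigvalsh(Sigma).min() >= -1e-12)       # positive semidefinite
```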

Cross product u × v in R^3.
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
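A short NumPy check of these properties, with two hypothetical vectors at a right angle:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0])
w = np.cross(u, v)                   # the "determinant" formula

print(np.dot(w, u), np.dot(w, v))    # perpendicular to both u and v
# length = ||u|| ||v|| |sin(theta)| = area of the parallelogram (theta = 90 deg here)
print(np.linalg.norm(w), np.linalg.norm(u) * np.linalg.norm(v))
```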

Diagonal matrix D.
dij = 0 if i ≠ j. Block-diagonal: zero outside square blocks Dii.

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity from Ax = 0) with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
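This orthogonality can be verified numerically; a sketch using a hypothetical rank-1 matrix, with the SVD supplying bases (NumPy assumed available):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])      # rank r = 1, n = 3

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
row_basis = Vt[:r].T                 # columns span the row space C(A^T)
null_basis = Vt[r:].T                # columns span the nullspace N(A)

print(np.allclose(A @ null_basis, 0))            # really solves Ax = 0
print(np.allclose(row_basis.T @ null_basis, 0))  # orthogonal complements
print(r, null_basis.shape[1])                    # dimensions r and n - r
```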

Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
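NumPy's qr produces exactly this factorization (note that np.linalg.qr does not enforce the diag(R) > 0 sign convention); the matrix below is a hypothetical example:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])           # independent columns

Q, R = np.linalg.qr(A)               # reduced factorization: Q is 3x2, R is 2x2
print(np.allclose(Q.T @ Q, np.eye(2)))   # orthonormal columns in Q
print(np.allclose(R, np.triu(R)))        # R is upper triangular
print(np.allclose(Q @ R, A))             # A = QR
```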

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.

Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
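A small NumPy sketch of this construction, for a hypothetical 3-node directed graph:

```python
import numpy as np

# Hypothetical directed graph on 3 nodes: edges 0->1, 1->2, 0->2
edges = [(0, 1), (1, 2), (0, 2)]
m, n = len(edges), 3
A = np.zeros((m, n), dtype=int)
for row, (i, j) in enumerate(edges):
    A[row, i] = -1       # edge leaves node i
    A[row, j] = 1        # edge enters node j

# Each row sums to zero, so the all-ones vector lies in the nullspace
print(A @ np.ones(n))
```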

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j−1)b. Numerical methods approximate A^(−1)b by x_j with residual b − Ax_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
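The "one multiplication by A per step" idea can be sketched directly (hypothetical 2×2 example, NumPy assumed; real Krylov methods would also orthogonalize these columns):

```python
import numpy as np

def krylov_basis(A, b, j):
    """Columns b, Ab, ..., A^(j-1)b -- one multiplication by A per step."""
    cols = [b]
    for _ in range(j - 1):
        cols.append(A @ cols[-1])
    return np.column_stack(cols)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])
K = krylov_basis(A, b, 2)
print(K)
```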

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(−1). Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
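These properties are easy to confirm for a rotation matrix (a hypothetical angle, NumPy assumed):

```python
import numpy as np

theta = 0.7                          # hypothetical rotation angle
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(Q.T @ Q, np.eye(2)))                       # Q^T = Q^-1
x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0))        # all |lambda| = 1
```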

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
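The equivalent tests can be run with NumPy (hypothetical 2×2 example; Cholesky gives the closely related A = LL^T rather than LDL^T):

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])         # symmetric, hypothetical example

print(np.all(np.linalg.eigvalsh(A) > 0))    # positive eigenvalues
x = np.array([1.0, 5.0])
print(float(x @ A @ x) > 0)                 # x^T A x > 0 for this x != 0
L = np.linalg.cholesky(A)                   # exists only for positive definite A
print(np.allclose(L @ L.T, A))
```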

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.

Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
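SymPy (assumed available; NumPy has no rref) computes this form symbolically; the matrix below is a hypothetical example with one dependent row:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])              # second row is a multiple of the first

R, pivot_cols = A.rref()
print(R)                             # pivots are 1, zeros above and below them
print(pivot_cols)                    # the r = 2 pivot columns
# The r nonzero rows of R are a basis for the row space of A.
```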

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.

Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular values σ_i > 0. Last columns are orthonormal bases of the nullspaces.
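NumPy's svd returns these three factors directly (hypothetical 2×2 example):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])           # hypothetical 2x2 example

U, s, Vt = np.linalg.svd(A)
print(np.allclose(U @ np.diag(s) @ Vt, A))   # A = U Sigma V^T
print(np.all(s > 0) and s[0] >= s[1])        # singular values, sorted
print(np.allclose(A @ Vt.T, U * s))          # A v_i = sigma_i u_i
```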

Spanning set.
Combinations of v1, ..., vm fill the space. The columns of A span C(A)!

Vector space V.
Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.