 5.1.1: Use Theorem 5.4 to show that each of the following initial-value pro...
 5.1.2: Show that each of the following initial-value problems has a unique...
 5.1.3: For each choice of f(t, y) given in parts (a)-(d): i. Does f satisf...
 5.1.4: For each choice of f(t, y) given in parts (a)-(d): i. Does f satisf...
 5.1.5: For the following initial-value problems, show that the given equat...
 5.1.6: Suppose the perturbation δ(t) is proportional to t, that is, δ(t) = δt...
 5.1.7: Show that any point on the line joining (t1, y1) to (t2, y2) correspond...
 5.1.8: Prove Theorem 5.3 by applying the Mean Value Theorem 1.8 to f(t, y)...
 5.1.9: Show that, for any constants a and b, the set D = {(t, y) | a ≤ t ≤...
 5.1.10: Picard's method for solving the initial-value problem y' = f(t, y), a
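The last exercise refers to Picard's method of successive substitution: starting from y_0(t) = y(a), iterate y_{k+1}(t) = y(a) + ∫ from a to t of f(s, y_k(s)) ds. A minimal sketch for the model problem y' = y, y(0) = 1 — an example chosen here, not taken from the truncated exercise — storing each iterate as exact polynomial coefficients:

```python
from fractions import Fraction

def picard_step(coeffs):
    """One Picard iterate for y' = y, y(0) = 1:
    y_{k+1}(t) = 1 + integral_0^t y_k(s) ds,
    with y_k stored as coefficients [c0, c1, ...]."""
    integral = [c / (n + 1) for n, c in enumerate(coeffs)]  # term-by-term integration
    return [Fraction(1)] + integral                          # prepend y(0) = 1

y = [Fraction(1)]          # y_0(t) = 1
for _ in range(5):
    y = picard_step(y)

# y is now [1, 1, 1/2, 1/6, 1/24, 1/120]: the degree-5 Taylor
# polynomial of the exact solution e^t.
```

After k steps the iterate agrees with e^t through degree k, which illustrates the convergence that the existence-uniqueness theory guarantees.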
Solutions for Chapter 5.1: The Elementary Theory of Initial-Value Problems
Full solutions for Numerical Analysis, 10th Edition
ISBN: 9781305253667
Chapter 5.1: The Elementary Theory of Initial-Value Problems includes 10 full step-by-step solutions.

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases, and each basis gives unique c's.

Determinant |A| = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B|.
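These defining properties can be checked on concrete matrices with a naive cofactor-expansion determinant (a pure-Python illustration with matrices chosen here; not an efficient algorithm):

```python
def det(A):
    """Determinant by cofactor expansion along row 0 (fine for small n)."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j+1:] for row in A[1:]])
               for j in range(len(A)))

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1, 0], [1, 3, 1], [0, 1, 2]]
B = [[1, 0, 2], [0, 1, 1], [3, 1, 0]]
assert det([[1, 0], [0, 1]]) == 1              # det I = 1
assert det(matmul(A, B)) == det(A) * det(B)    # |AB| = |A||B|
```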

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
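A sketch of that row operation in pure Python (example matrix chosen here), using the sign convention that the extra entry is -l so that EA subtracts l times row j from row i:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def elimination_matrix(n, i, j, l):
    """Identity matrix with an extra -l in entry (i, j), i != j."""
    E = [[1 if r == c else 0 for c in range(n)] for r in range(n)]
    E[i][j] = -l
    return E

A = [[2, 1], [4, 5]]
E = elimination_matrix(2, 1, 0, 2)      # subtract 2 * (row 0) from row 1
assert matmul(E, A) == [[2, 1], [0, 3]]
```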

Ellipse (or ellipsoid) x^T A x = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^(-1) y||^2 = y^T (A A^T)^(-1) y = 1 displayed by eigshow; axis lengths σi.)

Exponential e^(At) = I + At + (At)^2/2! + ...
has derivative A e^(At); e^(At) u(0) solves u' = Au.
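The series can be summed term by term; for the nilpotent matrix below (an example chosen here) it terminates after two terms, and e^(At) = [[1, t], [0, 1]] exactly, so a partial sum is easy to check:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def expm(A, t, terms=25):
    """Partial sum of the series e^(At) = I + At + (At)^2/2! + ..."""
    n = len(A)
    At = [[t * x for x in row] for row in A]
    result = [[float(i == j) for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]            # holds (At)^k / k!
    for k in range(1, terms):
        term = [[x / k for x in row] for row in matmul(term, At)]
        result = [[result[i][j] + term[i][j] for j in range(n)]
                  for i in range(n)]
    return result

A = [[0.0, 1.0], [0.0, 0.0]]                     # nilpotent: (At)^2 = 0
E = expm(A, 0.5)
assert all(abs(E[i][j] - [[1.0, 0.5], [0.0, 1.0]][i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

Applying expm(A, t) to a starting vector u(0) then gives the solution of u' = Au described in the entry above.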

Free variable xi.
Column i has no pivot in elimination. We can give the n - r free variables any values; then Ax = b determines the r pivot variables (if solvable!).

Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions r and n - r. Applied to A^T: the column space C(A) is the orthogonal complement of N(A^T) in R^m.
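A tiny concrete check (matrix chosen here for illustration): for this rank-2 matrix with n = 3, the nullspace is the line through (1, 1, -1), and it is perpendicular to both rows, hence to all of the row space:

```python
A = [[1, 0, 1], [0, 1, 1]]      # rank r = 2, n = 3 columns
x = [1, 1, -1]                  # spans N(A): dimension n - r = 1

# Ax = 0, so x is orthogonal to every row of A: N(A) perpendicular to C(A^T)
for row in A:
    assert sum(a * xi for a, xi in zip(row, x)) == 0
```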

Independent vectors v1, ..., vk.
No combination c1v1 + ... + ckvk gives the zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.

Inverse matrix A^(-1).
Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0 (equivalently, rank(A) < n, or Ax = 0 for a nonzero vector x). The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))ij = Cji / det A.
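The cofactor formula transcribes directly; note the transposed indices, (A^(-1))ij = Cji / det A. A pure-Python sketch with exact Fractions (matrix chosen here; cofactor inversion is only practical for small n):

```python
from fractions import Fraction

def det(A):
    """Determinant by cofactor expansion along row 0."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j+1:] for row in A[1:]])
               for j in range(len(A)))

def cofactor_inverse(A):
    """(A^-1)[i][j] = C[j][i] / det A, with C[j][i] the cofactor
    from deleting row j and column i."""
    n = len(A)
    d = Fraction(det(A))
    inv = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            minor = [row[:i] + row[i+1:] for k, row in enumerate(A) if k != j]
            inv[i][j] = (-1) ** (i + j) * Fraction(det(minor)) / d
    return inv

A = [[2, 1], [5, 3]]                            # det A = 1
assert cofactor_inverse(A) == [[3, -1], [-5, 2]]
```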

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
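A worked instance of the normal equations (data points chosen here for illustration): fit a line c + d·t to three points, solve A^T A x̂ = A^T b by Cramer's rule, and confirm the residual is orthogonal to both columns:

```python
from fractions import Fraction as F

# Fit a line c + d*t to the points (0, 0), (1, 1), (2, 1).
A = [[F(1), F(0)], [F(1), F(1)], [F(1), F(2)]]
b = [F(0), F(1), F(1)]

# Normal equations: (A^T A) xhat = A^T b
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system by Cramer's rule
den = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
xhat = [(Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / den,
        (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / den]
assert xhat == [F(1, 6), F(1, 2)]          # best line: 1/6 + t/2

# Residual e = b - A xhat is orthogonal to every column of A
e = [b[k] - sum(A[k][j] * xhat[j] for j in range(2)) for k in range(3)]
assert all(sum(A[k][j] * e[k] for k in range(3)) == 0 for j in range(2))
```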

Pascal matrix PS.
PS = pascal(n) = the symmetric matrix with binomial entries C(i + j - 2, i - 1). PS = PL PU, and all three contain Pascal's triangle, with det PS = 1 (see Pascal in the index).
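A quick check of the binomial entries and of det = 1 (pure Python; the det helper is a naive cofactor expansion, fine at this size):

```python
from math import comb

def pascal(n):
    """Symmetric Pascal matrix: entry (i, j) is C(i + j - 2, i - 1)
    in 1-based indexing, i.e. comb(i + j, i) with 0-based i, j."""
    return [[comb(i + j, i) for j in range(n)] for i in range(n)]

def det(A):
    """Cofactor expansion along row 0."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j+1:] for row in A[1:]])
               for j in range(len(A)))

P = pascal(4)
assert P == [[1, 1, 1, 1], [1, 2, 3, 4], [1, 3, 6, 10], [1, 4, 10, 20]]
assert det(P) == 1     # consistent with PS = PL * PU, both unit triangular
```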

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries: uniformly distributed on [0, 1] for rand, standard normal for randn.

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = the Hessian matrix) is indefinite.

Schur complement S = D - C A^(-1) B.
Appears in block elimination on the block matrix [A B; C D].
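With 1×1 blocks the algebra is easy to watch: block elimination subtracts C A^(-1) times the first block row from the second, leaving S = D - C A^(-1) B in the (2, 2) position, so det M = det(A) · det(S). A sketch with scalars standing in for the blocks (example values chosen here):

```python
from fractions import Fraction as F

# Block matrix M = [A B; C D] with 1x1 blocks for clarity.
A, B, C, D = F(2), F(1), F(4), F(5)

S = D - C * A**-1 * B        # Schur complement of A
assert S == 3

# Elimination leaves [A B; 0 S], hence det M = det(A) * det(S).
M_det = A * D - B * C
assert M_det == A * S
```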

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.

Tridiagonal matrix T: tij = 0 if |i - j| > 1.
T^(-1) has rank 1 above and below the diagonal.

Unitary matrix U: U^H = U^(-1), where U^H is the conjugate transpose of U.
Orthonormal columns (complex analog of Q).

Vandermonde matrix V.
V c = b gives the coefficients of p(x) = c0 + ... + c_(n-1) x^(n-1) with p(xi) = bi. Vij = (xi)^(j-1) and det V = product of (xk - xi) for k > i.
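The determinant formula is easy to confirm on a small case (sample points chosen here): build V with entries (xi)^(j-1) and compare a cofactor-expansion determinant against the product of differences:

```python
from fractions import Fraction as F

def det(A):
    """Cofactor expansion along row 0."""
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det([row[:j] + row[j+1:] for row in A[1:]])
               for j in range(len(A)))

x = [F(1), F(2), F(4)]                          # interpolation points
n = len(x)
V = [[xi ** j for j in range(n)] for xi in x]   # V[i][j] = x_i^j (0-based)

product = F(1)
for k in range(n):
    for i in range(k):
        product *= x[k] - x[i]                  # (x_k - x_i) over k > i

assert det(V) == product == 6
```

Solving V c = b with this matrix recovers the coefficients of the interpolating polynomial through the points (xi, bi), as the entry states.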

Vector addition.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.