- 5.3.1: Use Taylor's method of order two to approximate the solutions for e...
- 5.3.2: Use Taylor's method of order two to approximate the solutions for e...
- 5.3.3: Repeat Exercise 1 using Taylor's method of order four.
- 5.3.4: Repeat Exercise 2 using Taylor's method of order four.
- 5.3.5: Use Taylor's method of order two to approximate the solution for ea...
- 5.3.6: Use Taylor's method of order two to approximate the solution for ea...
- 5.3.7: Repeat Exercise 5 using Taylor's method of order four.
- 5.3.8: Repeat Exercise 6 using Taylor's method of order four.
- 5.3.9: Given the initial-value problem y' = (2/t)y + t^2 e^t, 1 ≤ t ≤ 2, y(1)...
- 5.3.10: Given the initial-value problem y' = 1/t^2 - y/t - y^2, y(1) = -1, with exac...
- 5.3.11: Use the Taylor method of order two with h = 0.1 to approximate the...
- 5.3.12: A projectile of mass m = 0.11 kg shot vertically upward with initia...
- 5.3.13: A large tank holds 1000 gallons of water containing 50 pounds of di...
Solutions for Chapter 5.3: Higher-Order Taylor Methods
Full solutions for Numerical Analysis | 10th Edition
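The exercises above all apply Taylor's method of order two, which advances y_{i+1} = y_i + h f + (h^2/2) f', where f' = f_t + f_y f is the total derivative of f along the solution. A minimal Python sketch, assuming a generic IVP; the helper name `taylor2` and the test problem y' = y (exact solution e^t) are illustrative, not from the text:

```python
import math

def taylor2(f, df, t0, y0, h, n):
    """Taylor's method of order two: y_{i+1} = y_i + h*f + (h^2/2)*f',
    where df is the total derivative f' = f_t + f_y * f."""
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y) + (h ** 2 / 2) * df(t, y)
        t = t + h
    return y

# Illustration on y' = y, y(0) = 1 (exact solution e^t);
# here f(t, y) = y and its total derivative along the solution is also y.
approx = taylor2(lambda t, y: y, lambda t, y: y, 0.0, 1.0, 0.1, 10)
```

With h = 0.1 over [0, 1], the order-two method is within about 4e-3 of e.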
cond(A) = c(A) = ||A|| ||A^-1|| = sigma_max / sigma_min. In Ax = b, the relative change ||δx|| / ||x|| is less than cond(A) times the relative change ||δb|| / ||b||. Condition numbers measure the sensitivity of the output to changes in the input.
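The two expressions for cond(A) agree, as a quick NumPy check illustrates (the sample matrix is arbitrary):

```python
import numpy as np

# cond(A) = ||A|| * ||A^-1|| = sigma_max / sigma_min  (2-norm)
A = np.array([[1.0, 2.0], [3.0, 4.0]])
sigma = np.linalg.svd(A, compute_uv=False)          # singular values, descending
cond_from_svd = sigma[0] / sigma[-1]
cond_from_norms = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)
```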
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T Ax - x^T b over growing Krylov subspaces.
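A minimal sketch of the conjugate gradient iteration, assuming a symmetric positive definite A (the function name and the 2x2 example are illustrative):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve positive definite Ax = b by minimizing (1/2)x^T A x - x^T b
    over growing Krylov subspaces span{b, Ab, A^2 b, ...}."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x              # residual = negative gradient of the quadratic
    p = r.copy()               # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # keep directions A-conjugate
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic the method reaches the solution in at most n steps.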
Free variable x_i.
Column i has no pivot in elimination. We can give the n - r free variables any values, then Ax = b determines the r pivot variables (if solvable!).
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
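A sketch of classical Gram-Schmidt producing A = QR under the stated assumptions (full column rank, diag(R) > 0); the function name and sample matrix are illustrative:

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt: A = QR with orthonormal columns in Q,
    upper triangular R, and diag(R) > 0 (A assumed full column rank)."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # component of column j along q_i
            v -= R[i, j] * Q[:, i]        # subtract it off
        R[j, j] = np.linalg.norm(v)       # positive by convention
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
Q, R = gram_schmidt_qr(A)
```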
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = ∫_0^1 x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
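A short check of both claims (positive definite, badly conditioned); the helper name `hilb` mirrors the MATLAB-style name in the entry:

```python
import numpy as np

def hilb(n):
    """Hilbert matrix: H[i, j] = 1/(i + j - 1) with 1-based i, j."""
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)   # 0-based indices shift each by +1

H = hilb(5)
eigs = np.linalg.eigvalsh(H)       # all positive: H is positive definite
condition = np.linalg.cond(H)      # already huge for n = 5
```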
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Independent vectors v_1, ..., v_k.
No combination c_1 v_1 + ... + c_k v_k = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
Inverse matrix A^-1.
Square matrix with A^-1 A = I and AA^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
Iterative method.
A sequence of steps intended to approach the desired solution.
Kronecker product (tensor product) A ® B.
Blocks a_ij B; eigenvalues λ_p(A) λ_q(B).
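The eigenvalue rule is easy to see with diagonal factors; a small illustration (the matrices are arbitrary):

```python
import numpy as np

# Eigenvalues of A (x) B are the products lambda_p(A) * lambda_q(B).
A = np.array([[2.0, 0.0], [0.0, 3.0]])   # eigenvalues 2, 3
B = np.array([[1.0, 0.0], [0.0, 5.0]])   # eigenvalues 1, 5
K = np.kron(A, B)                        # blocks a_ij * B
```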
Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
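A direct check of the formula on a tall full-rank matrix (the example matrix is arbitrary):

```python
import numpy as np

# For full column rank A, the left inverse A+ = (A^T A)^{-1} A^T satisfies A+ A = I_n.
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])   # 3x2, rank 2
A_plus = np.linalg.inv(A.T @ A) @ A.T
```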
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
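The expansion v = Σ (v^T q_j) q_j can be verified numerically with any orthonormal basis; this example uses the normalized columns of [[1, 1], [1, -1]]:

```python
import numpy as np

Q = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # orthonormal columns
v = np.array([3.0, 4.0])

# Sum of the components of v along each basis vector recovers v.
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
```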
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
Projection p = a(a^T b / a^T a) onto the line through a.
P = aa^T / a^T a has rank 1.
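Both forms of the projection (the vector formula and the rank-1 matrix) in a short example with arbitrary a and b:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

p = a * (a @ b) / (a @ a)        # projection of b onto the line through a
P = np.outer(a, a) / (a @ a)     # projection matrix, rank 1
```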
Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
Reflection matrix (Householder) Q = I - 2uu^T.
Unit vector u is reflected to Qu = -u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
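All three properties (Qu = -u, Qx = x on the mirror, Q^T = Q^-1 = Q) checked on one unit vector (the vector is arbitrary):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0]) / 3.0       # unit vector (norm 1)
Q = np.eye(3) - 2.0 * np.outer(u, u)      # Householder reflection

x = np.array([2.0, -1.0, 0.0])            # lies in the mirror: u^T x = 0
```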
Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.
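The standard illustration is f(x, y) = x^2 - y^2: the gradient (2x, -2y) vanishes at the origin, and the constant Hessian has one positive and one negative eigenvalue, so the Hessian is indefinite and the origin is a saddle:

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2 at the origin.
H = np.array([[2.0, 0.0], [0.0, -2.0]])
eigs = np.linalg.eigvalsh(H)   # one negative, one positive: indefinite
```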
Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
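Any R^T R is semidefinite; choosing a rank-deficient R gives a zero eigenvalue, showing why the inequalities are "≥" rather than ">" (the matrix R below is arbitrary):

```python
import numpy as np

R = np.array([[1.0, 2.0], [0.0, 0.0]])   # rank 1
A = R.T @ R                              # semidefinite: eigenvalues 0 and 5
eigs = np.linalg.eigvalsh(A)             # ascending order
```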
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
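In two dimensions the "box" is a parallelogram; the rows (2, 0) and (1, 3) span one with base 2 and height 3, so the area is |det A| = 6:

```python
import numpy as np

A = np.array([[2.0, 0.0], [1.0, 3.0]])
volume = abs(np.linalg.det(A))   # area of the parallelogram from the rows
```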