Big formula for n by n determinants.
det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
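The permutation sum above can be sketched directly in pure Python (the 2-by-2 example matrix is an illustrative choice, not from the glossary); it is O(n!) and only practical for small n:

```python
import itertools

def det_big_formula(A):
    """Determinant via the n!-term permutation sum:
    det(A) = sum over permutations P of sign(P) * A[0][P(0)] * ... * A[n-1][P(n-1)]."""
    n = len(A)
    total = 0
    for perm in itertools.permutations(range(n)):
        # Sign of P: +1 for an even permutation, -1 for odd (counted by inversions).
        inversions = sum(1 for i in range(n) for j in range(i + 1, n) if perm[i] > perm[j])
        term = -1 if inversions % 2 else 1
        for row, col in enumerate(perm):
            term *= A[row][col]   # one entry from each row and each column
        total += term
    return total

print(det_big_formula([[1, 2], [3, 4]]))  # 1*4 - 2*3 = -2
```

For n = 2 the two permutations give exactly the familiar ad − bc.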
Cholesky factorization.
A = C^T C = (L√D)(L√D)^T for positive definite A.
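A minimal NumPy sketch of the factorization A = C^T C for positive definite A (the example matrix is an illustrative choice). Note that `np.linalg.cholesky` returns the lower-triangular factor L with A = L L^T, so the upper-triangular C here is L^T:

```python
import numpy as np

# An illustrative symmetric positive definite matrix.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# NumPy returns lower-triangular L with A = L @ L.T;
# the upper-triangular factor with A = C^T C is C = L.T.
L = np.linalg.cholesky(A)
C = L.T

print(np.allclose(A, C.T @ C))  # True
```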
Companion matrix.
Put c_1, ..., c_n in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c_1 + c_2 λ + c_3 λ^2 + ... + c_n λ^{n-1} − λ^n).
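The construction above (c_1, ..., c_n in the last row, ones just above the diagonal) can be sketched in NumPy; its eigenvalues are the roots of λ^n = c_1 + c_2 λ + ... + c_n λ^{n-1}. The polynomial chosen below is an illustrative example:

```python
import numpy as np

def companion(c):
    """Matrix with c_1, ..., c_n in row n and n-1 ones just above
    the main diagonal; eigenvalues solve λ^n = c_1 + c_2 λ + ... + c_n λ^(n-1)."""
    n = len(c)
    A = np.zeros((n, n))
    A[:-1, 1:] = np.eye(n - 1)   # n-1 ones just above the main diagonal
    A[-1, :] = c                 # c_1, ..., c_n in the last row
    return A

# λ^2 = -2 + 3λ has roots λ = 1 and λ = 2.
A = companion([-2.0, 3.0])
print(np.allclose(sorted(np.linalg.eigvals(A).real), [1.0, 2.0]))  # True
```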
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T Ax − x^T b over growing Krylov subspaces.
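A minimal sketch of the standard conjugate gradient iteration (the stopping tolerance and example system are illustrative choices, not from the glossary):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Solve Ax = b for symmetric positive definite A by minimizing
    (1/2) x^T A x - x^T b over growing Krylov subspaces."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x          # residual = -gradient of the quadratic
    p = r.copy()           # first search direction
    rs_old = r @ r
    for _ in range(n):     # exact arithmetic converges in at most n steps
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next direction, A-conjugate to the old ones
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))  # True
```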
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Distributive Law.
A(B + C) = AB + AC. Add then multiply, or multiply then add.
Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
Inverse matrix A^{-1}.
Square matrix with A^{-1}A = I and AA^{-1} = I. No inverse if det A = 0, rank(A) < n, and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^{-1}A^{-1} and (A^{-1})^T. Cofactor formula: (A^{-1})_{ij} = C_{ji}/det A.
Normal matrix N.
If NN^T = N^T N, then N has orthonormal (complex) eigenvectors.
Nullspace N (A)
= all solutions to Ax = 0. Dimension n − r = (# columns) − rank.
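A short NumPy sketch of the dimension count n − r; computing a nullspace basis from the SVD is one common numerical approach (my choice here, not specified in the glossary), and the example matrix is illustrative:

```python
import numpy as np

def nullspace_basis(A, tol=1e-12):
    """Orthonormal basis for N(A) = {x : Ax = 0}, via the SVD.
    The basis has n - r = (# columns) - rank vectors."""
    _, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T   # right singular vectors beyond the rank span N(A)

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, n = 3, so dim N(A) = 3 - 1 = 2
N = nullspace_basis(A)
print(N.shape[1])                 # 2
print(np.allclose(A @ N, 0))      # True
```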
Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
Projection p = a(a^T b / a^T a) onto the line through a.
P = aa^T / a^T a has rank 1.
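Both formulas above are easy to check numerically (the vectors a and b below are illustrative choices):

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 3.0])

# Projection of b onto the line through a: p = a (a^T b / a^T a).
p = a * (a @ b) / (a @ a)

# Projection matrix P = a a^T / a^T a reproduces p and has rank 1.
P = np.outer(a, a) / (a @ a)

print(np.allclose(P @ b, p))      # True
print(np.linalg.matrix_rank(P))   # 1
```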
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
Right inverse A+.
If A has full row rank m, then A^+ = A^T(AA^T)^{-1} has AA^+ = I_m.
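A quick NumPy check of the right-inverse formula (the 2-by-3 example matrix, with independent rows, is an illustrative choice):

```python
import numpy as np

# A has full row rank m = 2: its rows are independent.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 2.0]])
m = A.shape[0]

# Right inverse A+ = A^T (A A^T)^{-1}, so A @ A_plus = I_m.
A_plus = A.T @ np.linalg.inv(A @ A.T)

print(np.allclose(A @ A_plus, np.eye(m)))  # True
```

Note that A_plus @ A is only a projection onto the row space, not I_n, since A has more columns than rows.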
Row space C (AT) = all combinations of rows of A.
Column vectors by convention.
Schwarz inequality.
|v·w| ≤ ‖v‖ ‖w‖. Then |v^T Aw|^2 ≤ (v^T Av)(w^T Aw) for positive definite A.
Skew-symmetric matrix K.
The transpose is −K, since K_{ij} = −K_{ji}. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^{Kt} is an orthogonal matrix.
Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
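A NumPy sketch of the spectral factorization (the symmetric example matrix is an illustrative choice); `np.linalg.eigh` is designed for symmetric matrices and returns real eigenvalues with orthonormal eigenvectors:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # real symmetric

# eigh: real eigenvalues λ and orthonormal eigenvector columns of Q.
lam, Q = np.linalg.eigh(A)
Lambda = np.diag(lam)

print(np.allclose(A, Q @ Lambda @ Q.T))   # True: A = Q Λ Q^T
print(np.allclose(Q.T @ Q, np.eye(2)))    # True: Q has orthonormal columns
```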
Vandermonde matrix V.
Vc = b gives the coefficients of p(x) = c_0 + ... + c_{n-1} x^{n-1} with p(x_i) = b_i. V_{ij} = (x_i)^{j-1} and det V = product of (x_k − x_i) for k > i.
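Both claims above can be checked with NumPy (`np.vander` with `increasing=True` matches the column order 1, x, x^2, ...; the sample points and polynomial are illustrative choices):

```python
import numpy as np
from itertools import combinations

x = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 5.0, 10.0])   # values of p(x) = 1 + x^2 at x = 1, 2, 3

# V_ij = (x_i)^(j-1): increasing=True orders the columns 1, x, x^2, ...
V = np.vander(x, increasing=True)
c = np.linalg.solve(V, b)        # interpolation coefficients c_0, ..., c_{n-1}
print(np.allclose(c, [1.0, 0.0, 1.0]))       # True: p(x) = 1 + x^2

# det V = product of (x_k - x_i) over all pairs with k > i.
prod = np.prod([x[k] - x[i] for i, k in combinations(range(len(x)), 2)])
print(np.allclose(np.linalg.det(V), prod))   # True
```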
Vector v in R^n.
Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.