- 8.3.1: Use the zeros of T3 to construct an interpolating polynomial of deg...
- 8.3.2: Use the zeros of T4 to construct an interpolating polynomial of degree...
- 8.3.3: Find a bound for the maximum error of the approximation in Exercise ...
- 8.3.4: Repeat Exercise 3 for the approximations computed in Exercise 2.
- 8.3.5: Use the zeros of T3 and transformations of the given interval to con...
- 8.3.6: Find the sixth Maclaurin polynomial for xe^x and use Chebyshev econo...
- 8.3.7: Find the sixth Maclaurin polynomial for sin x and use Chebyshev econ...
- 8.3.8: The Chebyshev polynomials Tn(x) are solutions to the differential equ...
- 8.3.9: An interesting fact is that Tn (x) equals the determinant of the tr...
- 8.3.10: Show that for any positive integers i and j with i > j, we have Ti(...
- 8.3.11: Show that for each Chebyshev polynomial Tn(x), we have ∫ [Tn(x)]^2 =...
- 8.3.12: Show that for each n, the Chebyshev polynomial Tn(x) has n distinct ...
- 8.3.13: Show that for each n, the derivative of the Chebyshev polynomial Tn(...
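Computations in the style of Exercise 8.3.1 can be sketched numerically: interpolate at the zeros of T3 and observe the near-minimax error. This is a minimal NumPy sketch; f(x) = e^x is an illustrative choice, not necessarily the textbook's function.

```python
import numpy as np

# Zeros of the Chebyshev polynomial T3 on [-1, 1]:
# x_k = cos((2k + 1) * pi / 6), k = 0, 1, 2
k = np.arange(3)
nodes = np.cos((2 * k + 1) * np.pi / 6)

# Degree-2 interpolating polynomial through the three Chebyshev nodes
# (f = exp is an illustrative stand-in for the exercise's function).
f = np.exp
coeffs = np.polyfit(nodes, f(nodes), 2)

# Maximum error on [-1, 1]; the Chebyshev-node choice keeps it small,
# bounded by max|f'''| / (2^2 * 3!) = e/24 for f = exp.
xs = np.linspace(-1, 1, 1001)
max_err = np.max(np.abs(np.polyval(coeffs, xs) - f(xs)))
```

Using the zeros of T3 minimizes the max of |(x − x0)(x − x1)(x − x2)| in the interpolation error term, which is what makes these nodes the standard choice here.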
Solutions for Chapter 8.3: Chebyshev Polynomials and Economization of Power Series
Full solutions for Numerical Analysis | 10th Edition
Cholesky factorization A = CC^T = (L√D)(L√D)^T for positive definite A.
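A quick numerical check of this factorization, sketched with NumPy (the example matrix is arbitrary):

```python
import numpy as np

# A small symmetric positive definite matrix (illustrative values).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

# numpy returns the lower-triangular factor C with A = C @ C.T;
# the glossary's L*sqrt(D) form produces this same C.
C = np.linalg.cholesky(A)
```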
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
Complete solution x = x p + Xn to Ax = b.
(Particular x p) + (x n in nullspace).
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing ½x^T Ax − x^T b over growing Krylov subspaces.
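The steps can be sketched in a few lines; this is a minimal textbook-style CG loop, not a production solver, and the test matrix is an arbitrary SPD example:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimal CG sketch for symmetric positive definite A:
    minimizes 0.5 x^T A x - x^T b over growing Krylov subspaces."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                      # residual = negative gradient
    p = r.copy()                       # first search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n steps, one new Krylov direction per iteration.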
Cross product u × v in R^3:
Vector perpendicular to u and v, length ‖u‖‖v‖|sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
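Both properties are easy to verify numerically; a small sketch with arbitrary example vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.cross(u, v)   # the "determinant" expansion of [i j k; u; v]

# w is perpendicular to both u and v.
perp = abs(w @ u) < 1e-12 and abs(w @ v) < 1e-12

# |w| = |u||v||sin theta| = area of the parallelogram on u, v.
cos_t = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_t**2)
```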
Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A)^T (column j of B).
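The conjugation in the complex case is what keeps lengths real; a sketch with arbitrary example vectors (np.vdot conjugates its first argument):

```python
import numpy as np

x = np.array([1 + 1j, 2 - 1j])
y = np.array([3 + 0j, 1j])

# Complex inner product conjugates one factor: vdot gives conj(x)^T y.
inner = np.vdot(x, y)

# The conjugated form makes ||x||^2 = conj(x)^T x real and nonnegative;
# plain x @ x (no conjugate) could come out complex.
length_sq = np.vdot(x, x)
```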
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
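A sketch of one elimination step built this way (the matrix entries are arbitrary example values):

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [4.0, 5.0, 6.0],
              [0.0, 2.0, 8.0]])

# Multiplier chosen to zero out A[2, 1] using row 2 (0-indexed row 1).
l = A[2, 1] / A[1, 1]

E = np.eye(3)
E[2, 1] = -l        # identity plus an extra -l in the (3, 2) entry
EA = E @ A          # row 3 of A minus l times row 2; other rows unchanged
```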
Ellipse (or ellipsoid) x T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A^{-1}y‖^2 = y^T (AA^T)^{-1} y = 1 displayed by eigshow; axis lengths σi.)
Graph G.
Set of n nodes connected pairwise by m edges. A complete graph has all n(n − 1)/2 edges between nodes. A tree has only n − 1 edges and no closed loops.
Length ‖x‖.
Square root of x^T x (Pythagoras in n dimensions).
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A − λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
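The triangular example is easy to check directly; a small sketch with arbitrary entries above the diagonal:

```python
import numpy as np

# Strictly upper-triangular, hence nilpotent: N^3 = 0 for this 3x3 case.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

N3 = N @ N @ N

# Every eigenvalue is 0 (repeated n times).
eigenvalues = np.linalg.eigvals(N)
```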
Normal equation A^T Ax = A^T b.
Gives the least-squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax) = 0.
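A sketch of this with an arbitrary overdetermined system, checking that the residual is orthogonal to the columns of A:

```python
import numpy as np

# 3 equations, 2 unknowns, independent columns (example values).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# Least squares via the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The residual b - A x_hat is perpendicular to every column of A.
residual = b - A @ x_hat
```

(Forming A^T A squares the condition number; in practice np.linalg.lstsq or a QR factorization is preferred, but the normal equations are the cleanest statement of the geometry.)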
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
Schur complement S = D − CA^{-1}B.
Appears in block elimination on [A B; C D].
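A sketch with arbitrary example blocks, checking the block-elimination identity det M = det A · det S:

```python
import numpy as np

# Blocks of M = [[A, B], [C, D]] (illustrative values, A invertible).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 3.0]])
D = np.array([[5.0]])

# Eliminating the A block leaves the Schur complement in the D position.
S = D - C @ np.linalg.inv(A) @ B

M = np.block([[A, B], [C, D]])
# Block elimination gives det M = det A * det S.
```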
Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.
Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
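Both trace identities are easy to confirm numerically; a sketch with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[0.0, 5.0], [6.0, 7.0]])

tr_A = np.trace(A)                       # sum of diagonal entries
eig_sum = np.sum(np.linalg.eigvals(A))   # equals the trace

# Tr(AB) = Tr(BA) even though AB != BA in general.
tr_AB = np.trace(A @ B)
tr_BA = np.trace(B @ A)
```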
Vector space V.
Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.