 8.3.1: Use the zeros of T3 to construct an interpolating polynomial of deg...
8.3.2: Use the zeros of T4 to construct an interpolating polynomial of degree...
8.3.3: Find a bound for the maximum error of the approximation in Exercise ...
8.3.4: Repeat Exercise 3 for the approximations computed in Exercise 2.
8.3.5: Use the zeros of T3 and transformations of the given interval to con...
8.3.6: Find the sixth Maclaurin polynomial for xe^x and use Chebyshev econo...
8.3.7: Find the sixth Maclaurin polynomial for sin x and use Chebyshev econ...
8.3.8: The Chebyshev polynomials Tn(x) are solutions to the differential equ...
 8.3.9: An interesting fact is that Tn (x) equals the determinant of the tr...
8.3.10: Show that for any positive integers i and j with i > j, we have Ti(...
8.3.11: Show that for each Chebyshev polynomial Tn(x), we have ∫₋₁¹ [Tn(x)]² = ...
8.3.12: Show that for each n, the Chebyshev polynomial Tn(x) has n distinct ...
8.3.13: Show that for each n, the derivative of the Chebyshev polynomial Tn(...
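As a minimal NumPy sketch of the ideas behind these exercises (illustrative, not part of the original list): the zeros of Tn are x_k = cos((2k − 1)π/(2n)), and evaluating Tn at them should give zero.

```python
import numpy as np

def chebyshev_zeros(n):
    """Zeros of T_n on [-1, 1]: x_k = cos((2k - 1) pi / (2n)), k = 1..n."""
    k = np.arange(1, n + 1)
    return np.cos((2 * k - 1) * np.pi / (2 * n))

# Verify: T_4 vanishes at its four zeros.
x = chebyshev_zeros(4)
T4 = np.polynomial.chebyshev.Chebyshev([0, 0, 0, 0, 1])  # coefficient vector of T_4
print(np.max(np.abs(T4(x))))  # numerically zero
```

These nodes are the interpolation points used throughout Section 8.3, because they minimize the maximum of |(x − x_1)⋯(x − x_n)| on [−1, 1].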
Solutions for Chapter 8.3: Chebyshev Polynomials and Economization of Power Series
Full solutions for Numerical Analysis  10th Edition
ISBN: 9781305253667

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
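A minimal NumPy sketch (illustrative, not from the glossary) of building an undirected adjacency matrix:

```python
import numpy as np

# Undirected graph on nodes 0-1-2 with edges (0,1) and (1,2).
A = np.zeros((3, 3), dtype=int)
for i, j in [(0, 1), (1, 2)]:
    A[i, j] = 1
    A[j, i] = 1  # undirected: edges go both ways, so A = A^T

print((A == A.T).all())  # symmetric, as the definition states
```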

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
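A quick numerical check of associativity (illustrative sketch, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
# (AB)C equals A(BC), so parentheses can be dropped: ABC.
print(np.allclose((A @ B) @ C, A @ (B @ C)))  # True
```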

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
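The rank test above can be checked directly; a small sketch (hypothetical example matrices, not from the text):

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])   # rank 1
b_good = np.array([[3.], [6.]])      # in the column space of A (3 * first column)
b_bad = np.array([[3.], [7.]])       # not in the column space

rank = np.linalg.matrix_rank
print(rank(np.hstack([A, b_good])) == rank(A))  # True  -> Ax = b is solvable
print(rank(np.hstack([A, b_bad])) == rank(A))   # False -> no solution
```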

Cholesky factorization
A = CTC = (L√D)(L√D)T for positive definite A.
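A sketch using NumPy's built-in routine (note: np.linalg.cholesky returns the lower-triangular factor L with A = LLᵀ, a transposed convention relative to the CᵀC form above):

```python
import numpy as np

A = np.array([[4., 2.], [2., 3.]])   # symmetric positive definite
L = np.linalg.cholesky(A)            # lower triangular, A = L L^T
print(np.allclose(L @ L.T, A))       # True
```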

Cross product u xv in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
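Both properties can be verified numerically; a small sketch with hypothetical vectors:

```python
import numpy as np

u = np.array([1., 2., 0.])
v = np.array([0., 1., 3.])
w = np.cross(u, v)

print(np.dot(w, u), np.dot(w, v))  # both 0: w is perpendicular to u and v
# ||w|| equals the parallelogram area sqrt(||u||^2 ||v||^2 - (u.v)^2)
area = np.sqrt(np.dot(u, u) * np.dot(v, v) - np.dot(u, v) ** 2)
print(np.isclose(np.linalg.norm(w), area))  # True
```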

Exponential e^At = I + At + (At)^2/2! + ...
has derivative Ae^At; e^At u(0) solves u' = Au.
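The power series can be summed directly; a sketch (truncated-series implementation, a simplification of production algorithms) checked against the known rotation e^{At} for A = [[0, 1], [-1, 0]]:

```python
import numpy as np

def expm_series(M, terms=30):
    """Matrix exponential by the series I + M + M^2/2! + ... (truncated)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

A = np.array([[0., 1.], [-1., 0.]])
t = np.pi / 2
E = expm_series(A * t)
# For this A, e^{At} = [[cos t, sin t], [-sin t, cos t]]; at t = pi/2:
print(np.allclose(E, [[0., 1.], [-1., 0.]], atol=1e-8))  # True
```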

Fast Fourier Transform (FFT).
A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn−1 c can be computed with nℓ/2 multiplications. Revolutionary.
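A sketch comparing the dense Fourier matrix against NumPy's FFT (same transform, O(n log n) instead of O(n²) work; the DFT matrix here uses NumPy's e^{−2πi kn/N} sign convention):

```python
import numpy as np

n = 8
x = np.arange(n, dtype=float)
k = np.arange(n)
F = np.exp(-2j * np.pi * np.outer(k, k) / n)  # dense DFT matrix F_n
print(np.allclose(F @ x, np.fft.fft(x)))      # True: identical results
```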

Four Fundamental Subspaces C (A), N (A), C (AT), N (AT).
Use AH for complex A.
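The four dimensions obey dim C(A) = dim C(Aᵀ) = r, dim N(A) = n − r, dim N(Aᵀ) = m − r; a sketch with a hypothetical rank-1 matrix:

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])          # 2 x 3, rank 1
r = np.linalg.matrix_rank(A)
m, n = A.shape
# dim C(A) = dim C(A^T) = r; dim N(A) = n - r; dim N(A^T) = m - r
print(r, n - r, m - r)
```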

Free columns of A.
Columns without pivots; these are combinations of earlier columns.
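A small sketch (hypothetical matrix): elimination on this A puts pivots in columns 0 and 2, and the free column 1 is a combination of the earlier pivot column.

```python
import numpy as np

A = np.array([[1., 2., 1.],
              [2., 4., 3.]])
# Column 1 is free: it equals 2 times the pivot column 0.
print(np.allclose(A[:, 1], 2 * A[:, 0]))   # True
print(np.linalg.matrix_rank(A))            # 2 pivot columns (0 and 2)
```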

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
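A sketch of the zero pattern (hypothetical 3×3 upper Hessenberg matrix): everything below the first subdiagonal vanishes.

```python
import numpy as np

H = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [0., 7., 8.]])
# Upper Hessenberg: H[i, j] = 0 whenever i > j + 1
is_hessenberg = all(H[i, j] == 0 for i in range(3) for j in range(3) if i > j + 1)
print(is_hessenberg)  # True
```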

Hilbert matrix hilb(n).
Entries Hij = 1/(i + j − 1) = ∫0^1 x^(i−1) x^(j−1) dx. Positive definite but extremely small λmin and large condition number: H is ill-conditioned.
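The ill-conditioning shows up already for modest n; a sketch building hilb(8) directly:

```python
import numpy as np

n = 8
i, j = np.indices((n, n))
H = 1.0 / (i + j + 1)          # 0-based indices: H[i, j] = 1/(i + j + 1)
print(np.linalg.cond(H))       # enormous: hilb(8) is badly ill-conditioned
```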

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
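A sketch for a small hypothetical directed graph; each row has one −1 (start node) and one +1 (end node), so the rows sum to zero:

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]   # directed edges: 0->1, 1->2, 0->2
A = np.zeros((len(edges), 3))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1
    A[row, j] = 1

print(A.sum(axis=1))  # each row: -1 + 1 = 0
```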

Jordan form J = M−1AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λkIk + Nk where Nk has 1's on diagonal 1. Each block has one eigenvalue λk and one eigenvector.
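A sketch of a single 2×2 Jordan block: one repeated eigenvalue but only a one-dimensional eigenvector space.

```python
import numpy as np

lam = 3.0
N = np.array([[0., 1.], [0., 0.]])   # 1 on diagonal 1
J = lam * np.eye(2) + N              # Jordan block lam*I + N

print(np.linalg.eigvals(J))                        # [3, 3]: one eigenvalue, twice
print(np.linalg.matrix_rank(J - lam * np.eye(2)))  # 1 -> only one eigenvector
```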

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
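The NumPy analogs (a sketch, using the Generator API rather than MATLAB):

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.random((3, 3))            # uniform on [0, 1), like rand(3)
G = rng.standard_normal((3, 3))   # standard normal, like randn(3)
print(((U >= 0) & (U < 1)).all()) # True
```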

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
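A sketch with a hypothetical matrix whose second row is twice the first:

```python
import numpy as np

A = np.array([[1., 2.], [2., 4.]])    # dependent rows
print(np.linalg.det(A))               # 0: A is singular, no inverse exists
```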

Spectral Theorem A = QAQT.
Real symmetric A has real λ's and orthonormal q's.
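A sketch using np.linalg.eigh, which returns real eigenvalues and an orthonormal eigenvector matrix Q for a symmetric input:

```python
import numpy as np

A = np.array([[2., 1.], [1., 2.]])             # real symmetric
lam, Q = np.linalg.eigh(A)                     # real lambdas, orthonormal columns
print(np.allclose(Q @ np.diag(lam) @ Q.T, A))  # True: A = Q Lambda Q^T
```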

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = ATCA where C has spring constants from Hooke's Law and Ax = stretching.
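A sketch of K = AᵀCA for a hypothetical system of two springs (constants 3 and 5) attached to one free node between fixed walls:

```python
import numpy as np

A = np.array([[1.], [-1.]])   # stretching of each spring per unit node movement
C = np.diag([3., 5.])         # Hooke's-law spring constants
K = A.T @ C @ A               # stiffness matrix
print(K)                      # [[8.]]: internal force = K x
```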

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
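Both identities can be checked numerically; a sketch with random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

print(np.isclose(np.trace(A @ B), np.trace(B @ A)))                 # Tr AB = Tr BA
print(np.isclose(np.trace(A), np.sum(np.linalg.eigvals(A)).real))   # trace = sum of eigenvalues
```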

Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.
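A sketch checking both forms (vector 2-norm and matrix spectral norm, with hypothetical inputs):

```python
import numpy as np

u = np.array([3., 4.])
v = np.array([1., -2.])
print(np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v))  # True

A = np.array([[1., 0.], [0., 1.]])
B = np.array([[0., 2.], [2., 0.]])
print(np.linalg.norm(A + B, 2) <= np.linalg.norm(A, 2) + np.linalg.norm(B, 2))  # True
```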

Unitary matrix UH = ŪT = U−1.
Orthonormal columns (complex analog of Q).
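A sketch with a hypothetical 2×2 unitary matrix; the conjugate transpose acts as the inverse:

```python
import numpy as np

U = np.array([[1., 1.], [1j, -1j]]) / np.sqrt(2)   # orthonormal complex columns
print(np.allclose(U.conj().T @ U, np.eye(2)))      # True: U^H = U^{-1}
```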