Solutions for Chapter 8.3: Chebyshev Polynomials and Economization of Power Series

Numerical Analysis | 10th Edition | ISBN: 9781305253667 | Authors: Richard L. Burden, J. Douglas Faires, Annette M. Burden

Full solutions for Numerical Analysis | 10th Edition

ISBN: 9781305253667


This textbook survival guide was created for Numerical Analysis, 10th edition (ISBN 9781305253667). Chapter 8.3, Chebyshev Polynomials and Economization of Power Series, includes 13 full step-by-step solutions, and more than 33,890 students have viewed solutions from this chapter. The guide covers every chapter of the textbook.

Key Math Terms and definitions covered in this textbook
  • Cholesky factorization

A = CC^T = (L√D)(L√D)^T for positive definite A.
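
As an illustration (a minimal pure-Python sketch, not from the textbook), the factor can be computed by the standard Cholesky recurrence; here the √D is absorbed into L, so A = LL^T with L lower triangular:

```python
import math

def cholesky(A):
    """Return lower-triangular L with A = L L^T, for symmetric positive definite A."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry: needs A positive definite
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

A = [[4.0, 2.0], [2.0, 3.0]]
L = cholesky(A)   # L = [[2, 0], [1, sqrt(2)]]
```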

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Complete solution x = x_p + x_n to Ax = b.

    (Particular x_p) + (x_n in nullspace).

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T Ax - x^T b over growing Krylov subspaces.
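
A minimal pure-Python sketch of the idea (illustrative only, not the textbook's algorithm): each iteration takes an exact minimizing step of (1/2) x^T Ax - x^T b along a search direction built from the current residual:

```python
def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve A x = b for symmetric positive definite A (lists of lists / list)."""
    n = len(b)
    matvec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    x = [0.0] * n
    r = [bi - Axi for bi, Axi in zip(b, matvec(A, x))]   # residual b - A x
    p = r[:]                                             # first search direction
    rs_old = dot(r, r)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rs_old / dot(p, Ap)                      # exact line-search step length
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * Api for ri, Api in zip(r, Ap)]
        rs_new = dot(r, r)
        if rs_new < tol ** 2:
            break
        # next direction: residual plus a multiple of the old direction (A-conjugacy)
        p = [ri + (rs_new / rs_old) * pi for ri, pi in zip(r, p)]
        rs_old = rs_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b)   # exact solution is (1/11, 7/11)
```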

  • Cross product u × v in R^3:

    Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
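
A short pure-Python sketch (illustrative, not from the textbook) expanding that "determinant" and checking perpendicularity:

```python
def cross(u, v):
    """Cross product u x v in R^3, from the cofactors of [i j k; u; v]."""
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

u, v = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]
w = cross(u, v)                          # [0, 0, 1]: area of the unit square is 1
assert dot(w, u) == 0 and dot(w, v) == 0  # w is perpendicular to both u and v
```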

  • Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.

    Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
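
A small pure-Python sketch (illustrative, not from the textbook) of the last point: matrix multiplication as dot products of rows with columns:

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def matmul(A, B):
    """(AB)_ij = (row i of A) . (column j of B)."""
    cols = list(zip(*B))
    return [[dot(row, col) for col in cols] for row in A]

x, y = [1, 2, -2], [2, 1, 2]
perp = dot(x, y)                 # 2 + 2 - 4 = 0: x and y are perpendicular
C = matmul([[1, 2], [3, 4]],
           [[5, 6], [7, 8]])     # [[19, 22], [43, 50]]
```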

  • Elimination matrix = Elementary matrix E_ij.

    The identity matrix with an extra -ℓ_ij in the i, j entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
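
A pure-Python sketch of that action (illustrative, not from the textbook): building E and multiplying it into A produces a zero below the pivot:

```python
def elimination_matrix(n, i, j, l):
    """Identity with -l placed in entry (i, j): E A subtracts l * (row j) from row i."""
    E = [[1.0 if r == c else 0.0 for c in range(n)] for r in range(n)]
    E[i][j] = -l
    return E

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2.0, 1.0], [4.0, 5.0]]
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * (row 0) from row 1
EA = matmul(E, A)                      # [[2, 1], [0, 3]]: entry (1, 0) eliminated
```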

  • Ellipse (or ellipsoid) x^T Ax = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^{-1} y||^2 = y^T (AA^T)^{-1} y = 1 displayed by eigshow; axis lengths σ_i.)

  • Graph G.

    Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Minimal polynomial of A.

    The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
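
A quick pure-Python check (illustrative, not from the textbook) on a strictly upper triangular example, where the powers march toward zero:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# Triangular with zero diagonal, so nilpotent
N = [[0, 1, 2],
     [0, 0, 3],
     [0, 0, 0]]
N2 = matmul(N, N)    # [[0, 0, 3], [0, 0, 0], [0, 0, 0]]
N3 = matmul(N2, N)   # the zero matrix: N^3 = 0
```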

  • Normal equation A^T A x̂ = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b - A x̂) = 0.
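
A pure-Python sketch (illustrative, not from the textbook; the 2×2 solver `solve2` is a hypothetical helper using Cramer's rule) fitting a straight line by the normal equations:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve2(M, c):
    """Solve a 2x2 system M x = c by Cramer's rule (hypothetical helper)."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(c[0] * M[1][1] - M[0][1] * c[1]) / det,
            (M[0][0] * c[1] - c[0] * M[1][0]) / det]

# Fit y = c0 + c1 * t to the points (0, 1), (1, 2), (2, 4)
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [[1.0], [2.0], [4.0]]
At = transpose(A)
AtA = matmul(At, A)                       # [[3, 3], [3, 5]]
Atb = [row[0] for row in matmul(At, b)]   # [7, 10]
x = solve2(AtA, Atb)                      # least squares coefficients (5/6, 3/2)
```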

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Particular solution x p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on [A B; C D].

  • Symmetric factorizations A = LDL^T and A = QΛQ^T.

    Signs in Λ = signs in D.

  • Trace of A

    = sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
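
A two-line pure-Python check (illustrative, not from the textbook) of the identity Tr AB = Tr BA:

```python
def trace(A):
    return sum(A[i][i] for i in range(len(A)))

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 5], [6, 7]]
assert trace(matmul(A, B)) == trace(matmul(B, A))   # Tr AB = Tr BA (both are 55)
```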

  • Vector space V.

    Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.