Solutions for Chapter 2.6: Zeros of Polynomials and Muller's Method

Full solutions for Numerical Analysis | 10th Edition

ISBN: 9781305253667

Authors: Richard L. Burden, J. Douglas Faires, Annette M. Burden

This textbook survival guide was created for the textbook Numerical Analysis, 10th edition, written by Richard L. Burden, J. Douglas Faires, and Annette M. Burden and associated with ISBN 9781305253667. The guide covers the textbook's chapters and their solutions. Chapter 2.6: Zeros of Polynomials and Muller's Method includes 10 full step-by-step solutions, and more than 42,780 students have viewed step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Block matrix.

    A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with the multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E. (A worked LU sketch follows this list.)

  • Factorization

    A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.

  • Gauss-Jordan method.

    Invert A by row operations on [A I] to reach [I A^-1]. (See the Gauss-Jordan sketch after this list.)

  • Gram-Schmidt orthogonalization A = QR.

    Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0. (See the Gram-Schmidt sketch after this list.)

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.

  • Hypercube matrix P_L.

    Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

  • Iterative method.

    A sequence of steps intended to approach the desired solution.

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A. (See the least squares sketch after this list.)

  • Left inverse A^+.

    If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.

  • Linear combination cv + dw or Σ c_j v_j.

    Vector addition and scalar multiplication.

  • Nullspace N(A).

    All solutions to Ax = 0. Dimension n - r = (# columns) - rank.

  • Pseudoinverse A^+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+A and AA^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).

  • Rotation matrix

    R = [c -s; s c] rotates the plane by θ and R^-1 = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ), eigenvectors are (1, ±i). c, s = cos θ, sin θ.

  • Saddle point of f(x_1, ..., x_n).

    A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.

  • Standard basis for R^n.

    Columns of the n by n identity matrix (written i, j, k in R^3).

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has the spring constants from Hooke's Law and Ax = stretching.

  • Symmetric matrix A.

    The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.

  • Toeplitz matrix.

    Constant down each diagonal = time-invariant (shift-invariant) filter.

  • Wavelets w_jk(t).

    Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
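
A few of the procedural terms above are illustrated below with short NumPy sketches. These are illustrative only, not the textbook's own solutions, and the function names are made up for the examples. First, a minimal sketch of elimination without row exchanges, producing the factorization A = LU with the multipliers ℓ_ij stored in L.

    import numpy as np

    def lu_no_pivot(A):
        """Gaussian elimination without row exchanges: returns a unit lower
        triangular L (holding the multipliers l_ij) and an upper triangular U
        with A = L @ U.  Assumes no zero pivot is encountered."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        L = np.eye(n)
        U = A.copy()
        for k in range(n - 1):
            for i in range(k + 1, n):
                L[i, k] = U[i, k] / U[k, k]      # multiplier l_ik
                U[i, k:] -= L[i, k] * U[k, k:]   # eliminate below the pivot
        return L, U

    A = np.array([[2.0, 1.0, 1.0],
                  [4.0, 3.0, 3.0],
                  [8.0, 7.0, 9.0]])
    L, U = lu_no_pivot(A)
    print(np.allclose(L @ U, A))   # True: L brings U back to A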
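
Next, a sketch of the Gauss-Jordan method: row-reduce the augmented matrix [A I] until the left block becomes the identity, then read A^-1 from the right block. Partial pivoting is added here only for numerical safety.

    import numpy as np

    def gauss_jordan_inverse(A):
        """Row operations on [A | I] until the left block is I; the right
        block is then A^-1."""
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        M = np.hstack([A, np.eye(n)])            # the augmented matrix [A I]
        for k in range(n):
            p = k + np.argmax(np.abs(M[k:, k]))  # choose a pivot row
            M[[k, p]] = M[[p, k]]                # row exchange
            M[k] /= M[k, k]                      # scale the pivot row
            for i in range(n):
                if i != k:
                    M[i] -= M[i, k] * M[k]       # clear the rest of column k
        return M[:, n:]

    A = np.array([[4.0, 7.0],
                  [2.0, 6.0]])
    print(np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2)))   # True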
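
A sketch of classical Gram-Schmidt orthogonalization A = QR, assuming the columns of A are independent, with the convention diag(R) > 0.

    import numpy as np

    def gram_schmidt_qr(A):
        """Classical Gram-Schmidt: orthonormal columns in Q, upper triangular
        R with positive diagonal, and A = Q @ R."""
        A = np.asarray(A, dtype=float)
        m, n = A.shape
        Q = np.zeros((m, n))
        R = np.zeros((n, n))
        for j in range(n):
            v = A[:, j].copy()
            for i in range(j):
                R[i, j] = Q[:, i] @ A[:, j]   # component of column j along q_i
                v -= R[i, j] * Q[:, i]        # subtract that component
            R[j, j] = np.linalg.norm(v)       # positive by convention
            Q[:, j] = v / R[j, j]
        return Q, R

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])
    Q, R = gram_schmidt_qr(A)
    print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(2)))   # True True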
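
Finally, a small worked example of the least squares solution x̂: solve the normal equations A^T A x̂ = A^T b and check that the error e = b - A x̂ is orthogonal to the columns of A. The data values are made up for illustration.

    import numpy as np

    # An overdetermined system: four equations, two unknowns.
    A = np.array([[1.0, 1.0],
                  [1.0, 2.0],
                  [1.0, 3.0],
                  [1.0, 4.0]])
    b = np.array([1.0, 2.0, 2.0, 4.0])

    # Normal equations A^T A x = A^T b give the least squares solution.
    x_hat = np.linalg.solve(A.T @ A, A.T @ b)

    # The error e = b - A x_hat is orthogonal to every column of A.
    e = b - A @ x_hat
    print(x_hat)      # [0.  0.9]
    print(A.T @ e)    # ~[0. 0.]: residual orthogonal to the columns of A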