Solutions for Chapter 4.1: Numerical Differentiation

Full solutions for Numerical Analysis | 10th Edition

Textbook: Numerical Analysis
Edition: 10
Authors: Richard L. Burden, J. Douglas Faires, Annette M. Burden
ISBN: 9781305253667

Numerical Analysis (10th edition) was written by Richard L. Burden, J. Douglas Faires, and Annette M. Burden and is associated with ISBN 9781305253667. Chapter 4.1: Numerical Differentiation includes 29 full step-by-step solutions, and more than 42931 students have viewed solutions from this chapter. This textbook survival guide covers every chapter of the textbook and its solutions.

Key Math Terms and definitions covered in this textbook
  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

  • Back substitution.

    Upper triangular systems are solved in reverse order, from x_n back to x_1 (see the sketch after this list).

  • Block matrix.

    A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

  • Cayley-Hamilton Theorem.

    p(λ) = det(A - λI) has p(A) = zero matrix (see the sketch after this list).

  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

  • Cyclic shift S.

    Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are columns of the Fourier matrix F (see the sketch after this list).

  • Diagonalization Λ = S^(-1) A S.

    Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. Then A^k = S Λ^k S^(-1) (see the sketch after this list).

  • Exponential e^(At) = I + At + (At)^2/2! + ...

    has derivative A e^(At); e^(At) u(0) solves u' = Au (see the sketch after this list).

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace) (see the sketch after this list).

  • Network.

    A directed graph that has constants c_1, ..., c_m associated with the edges.

  • Nullspace N(A)

    = all solutions to Ax = 0. Dimension n - r = (# columns) - rank (see the sketch after this list).

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, its eigenvalues are 1 or 0, and its eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^(-1) A^T (see the sketch after this list).

  • Rayleigh quotient q(x) = x^T A x / x^T x for symmetric A: λ_min ≤ q(x) ≤ λ_max.

    Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A) (see the sketch after this list).

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

  • Row picture of Ax = b.

    Each equation gives a plane in R^n; the planes intersect at x.

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. The eigenvalues are purely imaginary, the eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix (see the sketch after this list).

  • Triangle inequality ||u + v|| ≤ ||u|| + ||v||.

    For matrix norms, ||A + B|| ≤ ||A|| + ||B||.
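
The sketches below illustrate several of the terms above in Python with NumPy/SciPy. They are our own illustrative examples under stated assumptions, not code from the textbook, and every function and variable name in them is ours.

Back substitution: a minimal sketch assuming the upper triangular matrix U has nonzero diagonal entries; back_substitution is a hypothetical helper name.

    import numpy as np

    def back_substitution(U, b):
        """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
        n = len(b)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):               # last equation first
            x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
        return x

    U = np.array([[2.0, 1.0, 1.0],
                  [0.0, 3.0, 2.0],
                  [0.0, 0.0, 4.0]])
    b = np.array([5.0, 8.0, 8.0])
    print(np.allclose(back_substitution(U, b), np.linalg.solve(U, b)))   # True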
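
Cayley-Hamilton Theorem: a quick numerical check on a small example matrix; np.poly returns the coefficients of the characteristic polynomial of a square matrix.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    coeffs = np.poly(A)                              # characteristic polynomial, highest power first
    n = len(coeffs) - 1
    p_of_A = sum(c * np.linalg.matrix_power(A, n - k) for k, c in enumerate(coeffs))
    print(np.allclose(p_of_A, 0))                    # True: p(A) = zero matrix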
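
Cyclic shift: a sketch for n = 4, checking that the spectrum of S is the set of 4th roots of unity and that the columns of the Fourier matrix are eigenvectors (for this shift direction the eigenvalue attached to column k is ω^(-k)).

    import numpy as np

    n = 4
    S = np.roll(np.eye(n), 1, axis=0)                # S_21 = S_32 = S_43 = 1, S_14 = 1
    k = np.arange(n)
    roots = np.exp(2j * np.pi * k / n)               # n-th roots of unity e^(2πik/n)
    eig = np.linalg.eigvals(S)
    print(all(np.isclose(eig, r).any() for r in roots))   # True: spectrum = roots of unity

    F = np.exp(2j * np.pi * np.outer(k, k) / n)      # Fourier matrix
    print(np.allclose(S @ F, F * roots.conj()))      # True: column k of F is an eigenvector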
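
Diagonalization: a sketch with a matrix that has n independent eigenvectors, verifying A = S Λ S^(-1) and A^k = S Λ^k S^(-1).

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])                       # eigenvalues 5 and 2
    eigvals, S = np.linalg.eig(A)                    # columns of S are eigenvectors
    Lam = np.diag(eigvals)
    S_inv = np.linalg.inv(S)
    print(np.allclose(A, S @ Lam @ S_inv))           # True: A = S Λ S^(-1)
    k = 5
    print(np.allclose(np.linalg.matrix_power(A, k),
                      S @ np.linalg.matrix_power(Lam, k) @ S_inv))   # True: A^k = S Λ^k S^(-1)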
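
Exponential e^(At): a sketch comparing the closed-form solution e^(At) u(0) of u' = Au with a numerical ODE solve, using scipy.linalg.expm and scipy.integrate.solve_ivp.

    import numpy as np
    from scipy.linalg import expm
    from scipy.integrate import solve_ivp

    A = np.array([[0.0, 1.0],
                  [-1.0, 0.0]])                      # u' = Au: a harmonic oscillator
    u0 = np.array([1.0, 0.0])
    t = 2.0

    u_closed = expm(A * t) @ u0                      # e^(At) u(0)
    sol = solve_ivp(lambda s, u: A @ u, (0.0, t), u0, rtol=1e-9, atol=1e-12)
    print(np.allclose(u_closed, sol.y[:, -1], atol=1e-6))   # True: both give u(t)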
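
Multiplicities AM and GM: a sketch using a 2x2 Jordan block, where the repeated eigenvalue has AM = 2 but only GM = 1 independent eigenvector.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 2.0]])                       # Jordan block, λ = 2 repeated
    lam = 2.0
    n = A.shape[0]
    AM = int(np.sum(np.isclose(np.linalg.eigvals(A), lam)))    # roots of det(A - λI) = 0
    GM = n - np.linalg.matrix_rank(A - lam * np.eye(n))        # dimension of the eigenspace
    print(AM, GM)                                    # 2 1: A is not diagonalizable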
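
Nullspace N(A): a sketch using scipy.linalg.null_space to confirm that dim N(A) = n - r.

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])                  # rank 1, n = 3 columns
    N = null_space(A)                                # orthonormal basis of N(A)
    r = np.linalg.matrix_rank(A)
    print(N.shape[1] == A.shape[1] - r)              # True: dimension n - r = 2
    print(np.allclose(A @ N, 0))                     # True: every basis vector solves Ax = 0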
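
Projection matrix P onto subspace S: a sketch with S = column space of a 3x2 matrix, checking P^2 = P = P^T and that the error b - Pb is perpendicular to S.

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])                       # columns are a basis for S
    b = np.array([6.0, 0.0, 0.0])

    P = A @ np.linalg.inv(A.T @ A) @ A.T             # P = A (A^T A)^(-1) A^T
    p = P @ b                                        # closest point to b in S
    e = b - p                                        # error
    print(np.allclose(P @ P, P), np.allclose(P, P.T))    # True True: P^2 = P = P^T
    print(np.allclose(A.T @ e, 0))                   # True: e is perpendicular to S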
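
Rayleigh quotient: a sketch checking λ_min ≤ q(x) ≤ λ_max for many random x and a random symmetric A.

    import numpy as np

    rng = np.random.default_rng(0)
    B = rng.standard_normal((4, 4))
    A = (B + B.T) / 2                                # random symmetric matrix
    lam = np.linalg.eigvalsh(A)                      # eigenvalues in ascending order

    for _ in range(1000):
        x = rng.standard_normal(4)
        q = x @ A @ x / (x @ x)
        assert lam[0] - 1e-12 <= q <= lam[-1] + 1e-12    # λ_min ≤ q(x) ≤ λ_max
    print("all Rayleigh quotients lie in [λ_min, λ_max]")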
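
Skew-symmetric matrix K: a sketch checking that the eigenvalues are purely imaginary and that e^(Kt) is an orthogonal matrix.

    import numpy as np
    from scipy.linalg import expm

    K = np.array([[ 0.0,  1.0, -2.0],
                  [-1.0,  0.0,  3.0],
                  [ 2.0, -3.0,  0.0]])
    print(np.allclose(K.T, -K))                      # True: K is skew-symmetric
    print(np.allclose(np.linalg.eigvals(K).real, 0)) # True: purely imaginary eigenvalues
    Q = expm(0.7 * K)
    print(np.allclose(Q.T @ Q, np.eye(3)))           # True: e^(Kt) is orthogonal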