
# Solutions for Chapter 3.4: ZEROS OF POLYNOMIAL FUNCTIONS

## Full solutions for College Algebra | 8th Edition

ISBN: 9781439048696


This expansive textbook survival guide covers the following chapters and their solutions. Chapter 3.4: ZEROS OF POLYNOMIAL FUNCTIONS includes 139 full step-by-step solutions. This guide was created for the textbook College Algebra, edition 8 (ISBN: 9781439048696). Since all 139 problems in Chapter 3.4 have been answered, more than 36,855 students have viewed full step-by-step solutions from this chapter.

## Key Math Terms and Definitions Covered in This Textbook
• Back substitution.

Upper triangular systems are solved in reverse order, x_n back to x_1.
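The definition above can be sketched in a few lines; this is a minimal back-substitution routine (hypothetical helper name and example numbers) for an upper triangular system Ux = b:

```python
def back_substitute(U, b):
    """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # subtract the already-known terms U[i][j] * x[j] for j > i
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x

# 2x2 example: 2*x1 + 1*x2 = 5 and 3*x2 = 6  ->  x = [1.5, 2.0]
```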

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

• Circulant matrix C.

Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_(n-1) S^(n-1). Cx = convolution c * x. Eigenvectors are in the Fourier matrix F.
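A quick sketch (made-up 3-by-3 example) showing the constant, wrap-around diagonals of a circulant built from its first column:

```python
c = [5, 1, 2]        # first column of C: c_0, c_1, c_2
n = len(c)
# entry (i, j) depends only on (i - j) mod n, so each diagonal is constant
C = [[c[(i - j) % n] for j in range(n)] for i in range(n)]
# C == [[5, 2, 1],
#       [1, 5, 2],
#       [2, 1, 5]]
```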

• Commuting matrices AB = BA.

If diagonalizable, they share n eigenvectors.

• Hilbert matrix hilb(n).

Entries H_ij = 1/(i + j - 1) = ∫_0^1 x^(i-1) x^(j-1) dx. Positive definite but with extremely small λ_min and large condition number: H is ill-conditioned.
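A small sketch (hypothetical helper names, exact rational arithmetic) that builds hilb(n) and computes det(hilb(3)); the tiny determinant hints at the huge condition number:

```python
from fractions import Fraction

def hilb(n):
    # 1-based entries H_ij = 1/(i + j - 1), kept exact with Fraction
    return [[Fraction(1, i + j - 1) for j in range(1, n + 1)]
            for i in range(1, n + 1)]

def det3(A):
    # cofactor expansion of a 3x3 determinant
    (a, b, c), (d, e, f), (g, h, i) = A
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# det(hilb(3)) works out to 1/2160
```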

• Length ||x||.

Square root of x^T x (Pythagoras in n dimensions).

• Markov matrix M.

All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector s: Ms = s > 0.
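A toy sketch (made-up 2-by-2 chain) of the columns of M^k approaching the steady state:

```python
M = [[0.9, 0.2],
     [0.1, 0.8]]        # all entries positive, each column sums to 1

def apply(M, v):
    # v -> M v
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v = [1.0, 0.0]          # start at the first column of the identity
for _ in range(100):
    v = apply(M, v)
# v approaches the steady state s = [2/3, 1/3], which satisfies M s = s
```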

• Multiplication Ax

= x_1 (column 1) + ... + x_n (column n) = combination of columns.
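The column picture above can be sketched directly (hypothetical helper name): build Ax by scaling and adding columns rather than by row-times-column dot products:

```python
def mat_vec(A, x):
    # Ax built as x_1*(column 1) + ... + x_n*(column n)
    m, n = len(A), len(x)
    result = [0.0] * m
    for j in range(n):              # walk the columns
        for i in range(m):
            result[i] += x[j] * A[i][j]
    return result

# [[1, 2], [3, 4]] times [1, 1] adds the two columns: [3.0, 7.0]
```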

• Orthogonal subspaces.

Every v in V is orthogonal to every w in W.

• Orthonormal vectors q_1, ..., q_n.

Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.

• Outer product uv^T

= column times row = rank one matrix.
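A two-line sketch (hypothetical helper name) of column times row, showing why the result has rank one:

```python
def outer(u, v):
    # uv^T: column u times row v^T
    return [[ui * vj for vj in v] for ui in u]

A = outer([1, 2], [3, 4])
# A == [[3, 4], [6, 8]]: every row is a multiple of v, so rank(A) = 1
```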

• Permutation matrix P.

There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
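A small sketch (made-up 3-row example) of building P from a row order and checking that PA reorders the rows of A the same way:

```python
order = [2, 0, 1]                     # desired row order of I
n = len(order)
P = [[1 if j == order[i] else 0 for j in range(n)] for i in range(n)]

A = [[1, 1], [2, 2], [3, 3]]
PA = [[sum(P[i][k] * A[k][j] for k in range(n)) for j in range(2)]
      for i in range(n)]
# PA == [[3, 3], [1, 1], [2, 2]]: the rows of A in the order [2, 0, 1]
```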

• Projection matrix P onto subspace S.

Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S then P = A(A^T A)^(-1) A^T.
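For the one-column case (A is a single vector a), the formula reduces to P = a a^T / (a^T a); a quick sketch (made-up vector) checking that P^2 = P:

```python
a = [3.0, 4.0]
aTa = sum(ai * ai for ai in a)        # a^T a = 25.0
P = [[a[i] * a[j] / aTa for j in range(2)] for i in range(2)]

P2 = [[sum(P[i][k] * P[k][j] for k in range(2)) for j in range(2)]
      for i in range(2)]
# P2 equals P (projecting twice changes nothing) and P is symmetric
```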

• Rank one matrix A = uv^T ≠ 0.

Column and row spaces = lines cu and cv.

• Rank r(A)

= number of pivots = dimension of column space = dimension of row space.

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
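The "minimum cost at a corner" fact can be illustrated by brute force (a made-up 2-variable example; the simplex method itself walks edges instead of checking every corner):

```python
# corners of the feasible box 0 <= x <= 4, 0 <= y <= 3
corners = [(0, 0), (4, 0), (4, 3), (0, 3)]
cost = lambda p: 2 * p[0] + 5 * p[1]          # linear cost c^T x
best = min(corners, key=cost)
# best == (0, 0): the minimum of a linear cost lands on a corner
```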

• Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

Spectral radius = max of |λ_i|.

• Standard basis for R^n.

Columns of the n by n identity matrix (written i, j, k in R^3).

• Triangle inequality ||u + v|| ≤ ||u|| + ||v||.

For matrix norms, ||A + B|| ≤ ||A|| + ||B||.
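A quick numeric check of the vector version (made-up 3-4-5 example):

```python
import math

def norm(v):
    # ||v|| = square root of v^T v
    return math.sqrt(sum(x * x for x in v))

u, v = [3.0, 0.0], [0.0, 4.0]
w = [u[i] + v[i] for i in range(2)]
# ||u + v|| = 5.0 while ||u|| + ||v|| = 7.0, so the inequality holds
```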

• Volume of box.

The rows (or the columns) of A generate a box with volume |det(A)|.
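In two dimensions the "box" is a parallelogram; a minimal sketch (made-up rows) of |det| as its area:

```python
def det2(A):
    # 2x2 determinant: signed area of the parallelogram spanned by the rows
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

# rows (2, 0) and (1, 3) span a parallelogram of area |det| = 6
```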
