
# Solutions for Chapter 3.3: Dividing Polynomials; Remainder and Factor Theorems

## Full solutions for College Algebra | 7th Edition

ISBN: 9780134469164


College Algebra (7th edition) is associated with ISBN 9780134469164. This textbook survival guide was created for that textbook and covers its chapters and their solutions. Chapter 3.3, Dividing Polynomials; Remainder and Factor Theorems, includes 84 full step-by-step solutions, and more than 29,819 students have viewed full step-by-step solutions from this chapter.

## Key math terms and definitions covered in this textbook
• Affine transformation

$T(v) = Av + v_0$: a linear transformation plus a shift.
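
As a quick illustration, here is a minimal NumPy sketch of such a map; the matrix A, the shift v0, and the test vector are made-up example values, not from the text.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # example linear part
v0 = np.array([1.0, -1.0])   # example shift

def T(v):
    """Affine transformation: linear map A v plus the shift v0."""
    return A @ v + v0

print(T(np.array([1.0, 1.0])))   # [3. 2.]
```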

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of $AB$ is allowed if the block shapes permit.
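
A sketch of block partitioning and block multiplication using NumPy's np.block; the block values are arbitrary examples.

```python
import numpy as np

# Partition two 4x4 matrices into 2x2 blocks (arbitrary example values).
A11, A12 = np.eye(2), np.zeros((2, 2))
A21, A22 = np.ones((2, 2)), 2.0 * np.eye(2)
B11, B12 = 3.0 * np.eye(2), np.ones((2, 2))
B21, B22 = np.zeros((2, 2)), np.eye(2)

A = np.block([[A11, A12], [A21, A22]])
B = np.block([[B11, B12], [B21, B22]])

# Block multiplication follows the usual row-times-column rule:
# the (1,1) block of AB is A11 B11 + A12 B21.
print(np.allclose((A @ B)[:2, :2], A11 @ B11 + A12 @ B21))   # True
```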

• Change of basis matrix M.

The old basis vectors $v_j$ are combinations $\sum_i m_{ij} w_i$ of the new basis vectors. The coordinates of $c_1 v_1 + \cdots + c_n v_n = d_1 w_1 + \cdots + d_n w_n$ are related by $d = Mc$. (For $n = 2$, $v_1 = m_{11} w_1 + m_{21} w_2$ and $v_2 = m_{12} w_1 + m_{22} w_2$.)
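
A small NumPy check of the coordinate relation $d = Mc$; the bases and the matrix M below are invented for illustration, with the old basis built as $V = WM$.

```python
import numpy as np

W = np.array([[1.0, 1.0],     # columns: new basis vectors w1, w2
              [0.0, 1.0]])
M = np.array([[2.0, 1.0],     # change of basis matrix
              [1.0, 1.0]])
V = W @ M                     # columns: old basis vectors v_j = sum_i m_ij w_i

c = np.array([3.0, -1.0])     # coordinates in the old basis
d = M @ c                     # coordinates in the new basis

# c1 v1 + c2 v2 and d1 w1 + d2 w2 name the same vector.
print(np.allclose(V @ c, W @ d))   # True
```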

• Characteristic equation $\det(A - \lambda I) = 0$.

The n roots are the eigenvalues of A.
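
A quick NumPy verification on an arbitrary example matrix: the computed eigenvalues make $\det(A - \lambda I)$ vanish.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)   # approximately [3. 1.]

# det(A - lambda I) is numerically zero at each eigenvalue.
for lam in eigenvalues:
    print(np.linalg.det(A - lam * np.eye(2)))   # ~0
```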

• Commuting matrices AB = BA.

If diagonalizable, they share n eigenvectors.

• Conjugate gradient method.

A sequence of steps (end of Chapter 9) to solve positive definite $Ax = b$ by minimizing $\frac{1}{2} x^T A x - x^T b$ over growing Krylov subspaces.
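
Below is a minimal, textbook-style sketch of the conjugate gradient iteration in NumPy, assuming $A$ is symmetric positive definite; it is an illustration of the idea, not code from this guide.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual, also the negative gradient
    p = r.copy()             # first search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # matches np.linalg.solve(A, b)
print(np.linalg.solve(A, b))
```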

• Diagonalizable matrix A.

Must have $n$ independent eigenvectors (in the columns of $S$; automatic with $n$ different eigenvalues). Then $S^{-1} A S = \Lambda$ = the eigenvalue matrix.
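
A NumPy check of $S^{-1} A S = \Lambda$ on a small example matrix, chosen here to have two different eigenvalues (5 and 2).

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of S are the eigenvectors; Lambda holds the eigenvalues.
eigenvalues, S = np.linalg.eig(A)

# S^{-1} A S recovers the eigenvalue matrix Lambda.
print(np.linalg.inv(S) @ A @ S)   # diagonal, with 5 and 2 on the diagonal
print(np.diag(eigenvalues))       # the same Lambda
```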

• Hilbert matrix hilb(n).

Entries $H_{ij} = 1/(i + j - 1) = \int_0^1 x^{i-1} x^{j-1}\,dx$. Positive definite but with an extremely small $\lambda_{\min}$ and a large condition number: $H$ is ill-conditioned.
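
A sketch that builds the Hilbert matrix directly in NumPy and confirms the tiny $\lambda_{\min}$ and large condition number; the helper hilb here is hand-rolled rather than MATLAB's built-in.

```python
import numpy as np

def hilb(n):
    """Hilbert matrix with entries H[i, j] = 1/(i + j - 1), 1-based."""
    i = np.arange(1, n + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1)

H = hilb(8)
print(np.linalg.eigvalsh(H).min())   # extremely small lambda_min
print(np.linalg.cond(H))             # huge condition number: ill-conditioned
```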

• Least squares solution $\hat{x}$.

The vector $\hat{x}$ that minimizes the error $\|e\|^2$ solves $A^T A \hat{x} = A^T b$. Then $e = b - A\hat{x}$ is orthogonal to all columns of $A$.
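
A short NumPy demonstration with arbitrary data: solve the normal equations, then check that the error is orthogonal to the columns of A.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The error e = b - A x is orthogonal to every column of A.
e = b - A @ x_hat
print(A.T @ e)   # ~ [0, 0]

# The library routine gives the same least squares solution.
print(np.linalg.lstsq(A, b, rcond=None)[0], x_hat)
```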

• Linear transformation T.

Each vector $v$ in the input space transforms to $T(v)$ in the output space, and linearity requires $T(cv + dw) = c\,T(v) + d\,T(w)$. Examples: matrix multiplication $Av$, differentiation and integration in function space.
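
A numeric spot-check of linearity for the matrix-multiplication example; the matrix, vectors, and scalars are random stand-ins.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

def T(v):
    """Linear transformation given by matrix multiplication."""
    return A @ v

v, w = rng.standard_normal(3), rng.standard_normal(3)
c, d = 2.0, -1.5
print(np.allclose(T(c * v + d * w), c * T(v) + d * T(w)))   # True
```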

• Network.

A directed graph that has constants Cl, ... , Cm associated with the edges.

• Norm $\|A\|$.

The $\ell^2$ norm of $A$ is the maximum ratio $\|Ax\|/\|x\| = \sigma_{\max}$. Then $\|Ax\| \le \|A\|\,\|x\|$, $\|AB\| \le \|A\|\,\|B\|$, and $\|A + B\| \le \|A\| + \|B\|$. The Frobenius norm satisfies $\|A\|_F^2 = \sum\sum a_{ij}^2$. The $\ell^1$ and $\ell^\infty$ norms are the largest column and row sums of $|a_{ij}|$.
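
All four norms are available through np.linalg.norm; the matrix is an arbitrary example.

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

print(np.linalg.norm(A, 2))        # l2 norm: the largest singular value
print(np.linalg.norm(A, 'fro'))    # Frobenius norm: sqrt(sum of a_ij^2)
print(np.linalg.norm(A, 1))        # largest absolute column sum
print(np.linalg.norm(A, np.inf))   # largest absolute row sum
```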

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so $Q^T = Q^{-1}$. Preserves lengths and angles: $\|Qx\| = \|x\|$ and $(Qx)^T (Qy) = x^T y$. All $|\lambda| = 1$, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
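
A sketch that obtains an orthogonal Q from a QR factorization of a random matrix, then checks the stated properties.

```python
import numpy as np

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

print(np.allclose(Q.T @ Q, np.eye(3)))   # Q^T = Q^{-1}

x = rng.standard_normal(3)
print(np.linalg.norm(Q @ x), np.linalg.norm(x))   # equal lengths

print(np.abs(np.linalg.eigvals(Q)))      # all |lambda| = 1
```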

• Positive definite matrix A.

Symmetric matrix with positive eigenvalues and positive pivots. Definition: $x^T A x > 0$ unless $x = 0$. Then $A = LDL^T$ with $\operatorname{diag}(D) > 0$.
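
A NumPy check on a small example: positive eigenvalues, a successful Cholesky factorization (the $LL^T$ relative of the $LDL^T$ form above), and $x^T A x > 0$ for a sample nonzero x.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

print(np.linalg.eigvalsh(A))   # both eigenvalues positive

# Cholesky succeeds exactly when A is symmetric positive definite.
L = np.linalg.cholesky(A)
print(np.allclose(L @ L.T, A))   # True

x = np.array([1.0, -2.0])
print(x @ A @ x)   # 12.0 > 0
```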

• Random matrix rand(n) or randn(n).

MATLAB creates a matrix with random entries, uniformly distributed on $[0, 1]$ for rand and drawn from the standard normal distribution for randn.
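
The closest NumPy analogues, shown as a sketch (note that NumPy's uniform sampler uses the half-open interval $[0, 1)$).

```python
import numpy as np

rng = np.random.default_rng()
print(rng.random((3, 3)))            # uniform on [0, 1), like rand(3)
print(rng.standard_normal((3, 3)))   # standard normal, like randn(3)
```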

• Schwarz inequality

$|v \cdot w| \le \|v\|\,\|w\|$. Then $|v^T A w|^2 \le (v^T A v)(w^T A w)$ for positive definite $A$.
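
A numeric spot-check of both inequalities with random vectors; the positive definite A is manufactured as $M^T M + I$ purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
v = rng.standard_normal(4)
w = rng.standard_normal(4)

# |v . w| <= ||v|| ||w||
print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))   # True

# Build a positive definite A, then |v^T A w|^2 <= (v^T A v)(w^T A w).
M = rng.standard_normal((4, 4))
A = M.T @ M + np.eye(4)
print((v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w))         # True
```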

• Singular Value Decomposition

(SVD) $A = U \Sigma V^T$ = (orthogonal)(diagonal)(orthogonal). The first $r$ columns of $U$ and $V$ are orthonormal bases of $C(A)$ and $C(A^T)$, with $A v_i = \sigma_i u_i$ and singular values $\sigma_i > 0$. The last columns are orthonormal bases of the nullspaces.
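
A NumPy sketch verifying the factorization and $A v_i = \sigma_i u_i$ on an arbitrary matrix.

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

U, sigma, VT = np.linalg.svd(A)
print(sigma)                                     # singular values sigma_i > 0
print(np.allclose(U @ np.diag(sigma) @ VT, A))   # A = U Sigma V^T

# A v_1 = sigma_1 u_1 (rows of VT are the right singular vectors v_i).
print(np.allclose(A @ VT[0], sigma[0] * U[:, 0]))   # True
```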

• Spanning set.

Combinations of $v_1, \ldots, v_m$ fill the space. The columns of $A$ span $C(A)$!

• Spectrum of $A$ = the set of eigenvalues $\{\lambda_1, \ldots, \lambda_n\}$.

Spectral radius = $\max |\lambda_i|$.
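
Computing the spectrum and the spectral radius in NumPy for an arbitrary example.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                  # the spectrum: eigenvalues -1 and -2
print(np.abs(eigenvalues).max())    # spectral radius = 2
```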

• Vector $v$ in $\mathbf{R}^n$.

Sequence of $n$ real numbers $v = (v_1, \ldots, v_n)$ = a point in $\mathbf{R}^n$.
