
# Solutions for Chapter 7-5: Multiplying a Polynomial by a Monomial

## Full solutions for Algebra 1, Student Edition (MERRILL ALGEBRA 1) | 1st Edition

ISBN: 9780078738227

This textbook survival guide was created for the textbook Algebra 1, Student Edition (MERRILL ALGEBRA 1), Edition 1, which is associated with the ISBN 9780078738227. Chapter 7-5: Multiplying a Polynomial by a Monomial includes 90 full step-by-step solutions, and since all 90 problems in the chapter have been answered, more than 23,048 students have viewed solutions from this chapter. This expansive survival guide also covers the textbook's other chapters and their solutions.

## Key Math Terms and definitions covered in this textbook
• Affine transformation

$Tv = Av + v_0$ = linear transformation plus shift.
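
For concreteness, here is a minimal NumPy sketch of an affine map; the matrix `A` and shift `v0` below are arbitrary illustration values, not from the textbook:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])   # arbitrary linear part
v0 = np.array([1.0, -1.0])   # arbitrary shift

def T(v):
    """Affine transformation: a linear map followed by a shift."""
    return A @ v + v0

print(T(np.array([1.0, 1.0])))  # [3. 2.]
```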

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of $AB$ is allowed if the block shapes permit.

• Cofactor $C_{ij}$.

Remove row $i$ and column $j$; multiply the determinant by $(-1)^{i+j}$.
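
A short NumPy sketch, with an arbitrary example matrix, computing one cofactor straight from this definition:

```python
import numpy as np

def cofactor(A, i, j):
    """C_ij: delete row i and column j, take the determinant, apply the sign (-1)^(i+j)."""
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])
print(cofactor(A, 0, 0))   # det([[4,5],[0,6]]) = 24
```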

• Condition number

$\text{cond}(A) = c(A) = \|A\|\,\|A^{-1}\| = \sigma_{\max}/\sigma_{\min}$. In $Ax = b$, the relative change $\|\delta x\|/\|x\|$ is less than $\text{cond}(A)$ times the relative change $\|\delta b\|/\|b\|$. Condition numbers measure the sensitivity of the output to changes in the input.
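
A quick NumPy check of these equalities on an arbitrary 2 by 2 matrix (np.linalg.cond uses the 2-norm by default); all three lines print the same number:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

sigma = np.linalg.svd(A, compute_uv=False)       # singular values, largest first
print(sigma[0] / sigma[-1])                      # sigma_max / sigma_min
print(np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2))
print(np.linalg.cond(A))                         # NumPy's built-in condition number
```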

• Cramer's Rule for $Ax = b$.

$B_j$ has $b$ replacing column $j$ of $A$; $x_j = \det B_j / \det A$.
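
A small NumPy sketch of Cramer's Rule, assuming $\det A \neq 0$; the matrix and right side below are arbitrary illustration values:

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b by Cramer's Rule: x_j = det(B_j) / det(A), where B_j is A with column j replaced by b."""
    detA = np.linalg.det(A)
    x = np.empty(len(b))
    for j in range(len(b)):
        Bj = A.copy()
        Bj[:, j] = b
        x[j] = np.linalg.det(Bj) / detA
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
print(cramer_solve(A, b))          # [0.8 1.4]
print(np.linalg.solve(A, b))       # same answer
```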

• Diagonal matrix D.

$d_{ij} = 0$ if $i \neq j$. Block-diagonal: zero outside square blocks $D_{ii}$.

• Diagonalization

$\Lambda = S^{-1}AS$. $\Lambda$ = eigenvalue matrix and $S$ = eigenvector matrix of $A$. $A$ must have $n$ independent eigenvectors to make $S$ invertible. All $A^k = S\Lambda^k S^{-1}$.
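
A brief NumPy check of diagonalization on an arbitrary matrix with two independent eigenvectors:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, S = np.linalg.eig(A)      # S holds the eigenvectors as columns
Lam = np.diag(eigvals)             # Lambda = eigenvalue matrix

print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))    # Lambda = S^-1 A S
print(np.allclose(np.linalg.matrix_power(A, 3),
                  S @ Lam**3 @ np.linalg.inv(S)))    # A^k = S Lambda^k S^-1
```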

• Independent vectors $v_1, \ldots, v_k$.

No combination $c_1 v_1 + \cdots + c_k v_k$ = zero vector unless all $c_i = 0$. If the $v$'s are the columns of $A$, the only solution to $Ax = 0$ is $x = 0$.
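
One way to test this numerically is to check the rank of the matrix whose columns are the $v$'s; a short NumPy sketch with arbitrary vectors:

```python
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])     # v3 = v1 + v2, so the set is dependent

A = np.column_stack([v1, v2, v3])
print(np.linalg.matrix_rank(A))    # 2 < 3 columns: some nonzero x solves Ax = 0
```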

• Inverse matrix $A^{-1}$.

Square matrix with $A^{-1}A = I$ and $AA^{-1} = I$. No inverse if $\det A = 0$ and $\text{rank}(A) < n$ and $Ax = 0$ for a nonzero vector $x$. The inverses of $AB$ and $A^T$ are $B^{-1}A^{-1}$ and $(A^{-1})^T$. Cofactor formula: $(A^{-1})_{ij} = C_{ji}/\det A$.
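
A small NumPy sketch of the cofactor formula for the inverse, compared against np.linalg.inv on an arbitrary invertible matrix:

```python
import numpy as np

def inverse_by_cofactors(A):
    """(A^-1)_ij = C_ji / det(A): transpose of the cofactor matrix divided by the determinant."""
    n = A.shape[0]
    C = np.empty_like(A, dtype=float)
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / np.linalg.det(A)

A = np.array([[2.0, 1.0], [5.0, 3.0]])
print(np.allclose(inverse_by_cofactors(A), np.linalg.inv(A)))   # True
```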

• Linearly dependent $v_1, \ldots, v_n$.

A combination other than all $c_i = 0$ gives $\sum c_i v_i = 0$.

• Norm

$\|A\|$. The "$\ell^2$ norm" of $A$ is the maximum ratio $\|Ax\|/\|x\| = \sigma_{\max}$. Then $\|Ax\| \le \|A\|\|x\|$ and $\|AB\| \le \|A\|\|B\|$ and $\|A + B\| \le \|A\| + \|B\|$. Frobenius norm: $\|A\|_F^2 = \sum\sum a_{ij}^2$. The $\ell^1$ and $\ell^\infty$ norms are the largest column and row sums of $|a_{ij}|$.
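
A quick NumPy illustration of the four norms named above, on an arbitrary matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.norm(A, 2))        # l2 norm = sigma_max
print(np.linalg.norm(A, 'fro'))    # Frobenius norm = sqrt(sum of a_ij^2)
print(np.linalg.norm(A, 1))        # largest column sum of |a_ij|  (= 6)
print(np.linalg.norm(A, np.inf))   # largest row sum of |a_ij|     (= 7)
```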

• Particular solution $x_p$.

Any solution to $Ax = b$; often $x_p$ has free variables $= 0$.

• Pseudoinverse A+ (Moore-Penrose inverse).

The $n$ by $m$ matrix that "inverts" $A$ from column space back to row space, with $N(A^+) = N(A^T)$. $A^+A$ and $AA^+$ are the projection matrices onto the row space and column space. $\text{rank}(A^+) = \text{rank}(A)$.
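
A short NumPy check of these pseudoinverse properties, using an arbitrary rank-deficient matrix:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1, so A has no ordinary inverse

A_plus = np.linalg.pinv(A)          # n-by-m Moore-Penrose pseudoinverse

P_row = A_plus @ A                  # projection onto the row space of A
P_col = A @ A_plus                  # projection onto the column space of A
print(np.allclose(P_row @ P_row, P_row), np.allclose(P_col @ P_col, P_col))   # True True
print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))              # True
```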

• Schwarz inequality

$|v \cdot w| \le \|v\|\,\|w\|$. Then $|v^T A w|^2 \le (v^T A v)(w^T A w)$ for positive definite $A$.

• Semidefinite matrix A.

(Positive) semidefinite: all $x^T A x \ge 0$, all $\lambda \ge 0$; $A$ = any $R^T R$.
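
A brief NumPy sketch: build $A$ as $R^T R$ from an arbitrary $R$ and confirm the semidefinite conditions:

```python
import numpy as np

R = np.array([[1.0, 2.0],
              [0.0, 0.0]])          # any R gives a positive semidefinite R^T R
A = R.T @ R

print(np.linalg.eigvalsh(A))        # eigenvalues >= 0 up to rounding (here 0 and 5)
x = np.array([3.0, -1.0])
print(x @ A @ x >= 0)               # x^T A x >= 0 for every x
```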

• Subspace S of V.

Any vector space inside V, including V and Z = {zero vector only}.

• Symmetric matrix A.

The transpose is $A^T = A$, and $a_{ij} = a_{ji}$. $A^{-1}$ is also symmetric.

• Triangle inequality $\|u + v\| \le \|u\| + \|v\|$.

For matrix norms, $\|A + B\| \le \|A\| + \|B\|$.

• Unitary matrix $U^H = \bar{U}^T = U^{-1}$.

Orthonormal columns (complex analog of Q).

• Vector addition $v + w$.

$v + w = (v_1 + w_1, \ldots, v_n + w_n)$ = diagonal of parallelogram.
