
Solutions for Chapter R.5 : Factoring Polynomials

Full solutions for College Algebra | 9th Edition

ISBN: 9780321716811

Textbook: College Algebra
Edition: 9
Author: Michael Sullivan
ISBN: 9780321716811

This expansive textbook survival guide covers the following chapters and their solutions. College Algebra was written by Michael Sullivan and is associated with the ISBN 9780321716811. This textbook survival guide was created for the textbook College Algebra, edition 9. Chapter R.5 : Factoring Polynomials includes 136 full step-by-step solutions. Since all 136 problems in Chapter R.5 : Factoring Polynomials have been answered, more than 53,627 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Cofactor C_ij.

    Remove row i and column j; multiply the determinant by (-1)^(i+j).
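
    As a quick illustration, here is a minimal NumPy sketch of this rule (NumPy and the 3x3 matrix are my own choices, not from the text):

        import numpy as np

        def cofactor(A, i, j):
            # Delete row i and column j, then apply the sign (-1)^(i+j).
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            return (-1) ** (i + j) * np.linalg.det(minor)

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])
        print(cofactor(A, 0, 1))   # cofactor C_01 (0-based indices here)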

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Condition number

    cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
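
    A minimal sketch of this definition, assuming NumPy (my choice of tool and matrix): the condition number built from singular values matches ||A|| ||A^-1|| in the 2-norm.

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])

        sigma = np.linalg.svd(A, compute_uv=False)        # singular values, largest first
        cond_from_svd = sigma[0] / sigma[-1]              # σ_max / σ_min
        cond_from_norms = np.linalg.norm(A, 2) * np.linalg.norm(np.linalg.inv(A), 2)

        print(cond_from_svd, cond_from_norms, np.linalg.cond(A))   # all three agree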

  • Covariance matrix Σ.

    When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
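
    One way to see this numerically is the sketch below (NumPy and the sample data are assumptions of mine): subtract the means, then average the outer products (x - x̄)(x - x̄)^T.

        import numpy as np

        # Each row of X is one observation of the random variables x_1, ..., x_n.
        X = np.array([[1.0, 2.0, 0.5],
                      [2.0, 1.5, 0.0],
                      [0.5, 3.0, 1.0],
                      [1.5, 2.5, 0.5]])

        xbar = X.mean(axis=0)             # means x̄_i
        D = X - xbar                      # deviations from the mean
        Sigma = D.T @ D / X.shape[0]      # mean of (x - x̄)(x - x̄)^T

        print(np.linalg.eigvalsh(Sigma))  # eigenvalues >= 0 (up to roundoff): positive (semi)definite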

  • Diagonal matrix D.

    d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

  • Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.

    Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)^T (column j of B).

  • Hilbert matrix hilb(n).

    Entries H_ij = 1/(i + j - 1) = ∫_0^1 x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
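
    A short sketch, assuming NumPy, that builds hilb(n) from H_ij = 1/(i + j - 1) and shows how quickly it becomes ill-conditioned:

        import numpy as np

        def hilb(n):
            # H_ij = 1/(i + j - 1) with 1-based indices i, j.
            i, j = np.indices((n, n)) + 1
            return 1.0 / (i + j - 1)

        for n in (3, 6, 9):
            H = hilb(n)
            # λ_min shrinks toward 0 and cond(H) explodes as n grows.
            print(n, np.linalg.eigvalsh(H).min(), np.linalg.cond(H))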

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Linear combination cv + dw or Σ c_j v_j.

    Vector addition and scalar multiplication.

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Lucas numbers

    L_n = 2, 1, 3, 4, ... satisfy L_n = L_(n-1) + L_(n-2) = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
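
    A small Python sketch (my own illustration) comparing the recurrence L_n = L_(n-1) + L_(n-2) with the closed form λ_1^n + λ_2^n:

        import math

        def lucas(n):
            # Recurrence with L_0 = 2, L_1 = 1.
            a, b = 2, 1
            for _ in range(n):
                a, b = b, a + b
            return a

        lam1 = (1 + math.sqrt(5)) / 2   # eigenvalues of the Fibonacci matrix [[1, 1], [1, 0]]
        lam2 = (1 - math.sqrt(5)) / 2

        for n in range(8):
            print(n, lucas(n), round(lam1 ** n + lam2 ** n))   # the two columns agree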

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
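
    A sketch of the difference, assuming NumPy (the 3x3 matrix is my own example): the eigenvalue 2 below has AM = 2 but GM = 1.

        import numpy as np

        # The eigenvalue 2 appears twice as a root of det(A - λI) = 0 ...
        A = np.array([[2.0, 1.0, 0.0],
                      [0.0, 2.0, 0.0],
                      [0.0, 0.0, 3.0]])

        lam = 2.0
        AM = np.sum(np.isclose(np.linalg.eigvals(A), lam))   # algebraic multiplicity

        # ... but its eigenspace (null space of A - 2I) is only one-dimensional.
        GM = 3 - np.linalg.matrix_rank(A - lam * np.eye(3))  # geometric multiplicity

        print(AM, GM)   # 2 1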

  • Right inverse A^+.

    If A has full row rank m, then A^+ = A^T(AA^T)^(-1) has AA^+ = I_m.
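
    A minimal check of this formula, assuming NumPy, for a matrix with full row rank m = 2 (the matrix is my own example):

        import numpy as np

        A = np.array([[1.0, 0.0, 2.0],
                      [0.0, 1.0, 1.0]])           # full row rank m = 2

        A_plus = A.T @ np.linalg.inv(A @ A.T)     # right inverse A^+ = A^T(AA^T)^(-1)
        print(np.allclose(A @ A_plus, np.eye(2))) # True: AA^+ = I_m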

  • Saddle point of f(x_1, ..., x_n).

    A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.
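
    A quick illustration (my own example, assuming NumPy): f(x, y) = x^2 - y^2 has zero first derivatives at the origin, and its Hessian there is indefinite.

        import numpy as np

        # f(x, y) = x^2 - y^2: the gradient (2x, -2y) vanishes at (0, 0).
        H = np.array([[2.0, 0.0],     # Hessian: matrix of second derivatives ∂²f/∂x_i ∂x_j
                      [0.0, -2.0]])

        print(np.linalg.eigvalsh(H))  # one positive and one negative eigenvalue: indefinite, so (0, 0) is a saddle point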

  • Semidefinite matrix A.

    (Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
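
    A short sketch, assuming NumPy: any A = R^T R is positive semidefinite, since x^T Ax = ||Rx||^2 ≥ 0 (R is my own example).

        import numpy as np

        R = np.array([[1.0, 2.0, 3.0],
                      [0.0, 1.0, 1.0]])
        A = R.T @ R                    # 3x3 but only rank 2

        print(np.linalg.eigvalsh(A))   # eigenvalues >= 0 up to roundoff; one is 0, so semidefinite, not definite

        x = np.array([1.0, -2.0, 0.5])
        print(x @ A @ x >= 0)          # x^T Ax = ||Rx||^2 >= 0 for every x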

  • Spanning set.

    Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

    Spectral radius = max of |λ_i|.

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.
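
    A small sketch of K = A^T C A, assuming NumPy, for a line of three springs joining two free nodes to fixed walls (the numbers are my own example):

        import numpy as np

        A = np.array([[ 1.0,  0.0],           # stretching of each spring: e = Ax
                      [-1.0,  1.0],
                      [ 0.0, -1.0]])
        C = np.diag([100.0, 200.0, 100.0])    # spring constants from Hooke's Law

        K = A.T @ C @ A                       # stiffness matrix

        x = np.array([0.01, -0.02])           # movements of the two free nodes
        print(K @ x)                          # internal forces at the nodes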

  • Trace of A

    = sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
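
    A quick check of both statements, assuming NumPy (the random matrices are my own choice):

        import numpy as np

        A = np.random.default_rng(0).random((3, 3))
        B = np.random.default_rng(1).random((3, 3))

        print(np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real))  # trace = sum of eigenvalues
        print(np.isclose(np.trace(A @ B), np.trace(B @ A)))              # Tr AB = Tr BA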