Solutions for Chapter 17: Factorization of Polynomials

Textbook: Contemporary Abstract Algebra
Edition: 8
Author: Joseph Gallian
ISBN: 9781133599708

Summary of Chapter 17: Factorization of Polynomials

In high school, students spend much time factoring polynomials and finding their zeros. In this chapter, we consider the same problems in a more abstract setting.

This expansive textbook survival guide covers the following chapters and their solutions. Chapter 17: Factorization of Polynomials includes 41 full step-by-step solutions, which have been viewed by more than 229,266 students. This textbook survival guide was created for Contemporary Abstract Algebra, edition 8, written by Joseph Gallian and associated with the ISBN 9781133599708.

Key Math Terms and definitions covered in this textbook
  • Eigenvalue λ and eigenvector x.

    Ax = λx with x ≠ 0, so det(A - λI) = 0.
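
A short NumPy check of this definition (the 2 by 2 symmetric matrix below is an assumed example; numpy.linalg.eig solves det(A - λI) = 0 numerically):

```python
import numpy as np

# Assumed example matrix with eigenvalues 1 and 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eig returns the eigenvalues lambda and eigenvectors x
eigvals, eigvecs = np.linalg.eig(A)

# Verify A x = lambda x for the first eigenpair
lam = eigvals[0]
x = eigvecs[:, 0]
residual = np.linalg.norm(A @ x - lam * x)
```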

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
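
A minimal elimination sketch producing A = LU, assuming the example matrix below needs no row exchanges (so P = I):

```python
import numpy as np

# Assumed example: elimination works without row exchanges
A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
n = A.shape[0]
U = A.copy()
L = np.eye(n)

for j in range(n):                    # pivot column j
    for i in range(j + 1, n):         # rows below the pivot
        L[i, j] = U[i, j] / U[j, j]   # multiplier l_ij = entry / pivot
        U[i, :] -= L[i, j] * U[j, :]  # subtract l_ij times the pivot row
```

After the loops, U is upper triangular and L holds the multipliers below its unit diagonal.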

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Gauss-Jordan method.

    Invert A by row operations on [A I] to reach [I A^-1].
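
A small NumPy sketch of the method: row-reduce the augmented block [A I] until the left half is I, and read A^-1 off the right half (the 2 by 2 matrix is an assumed invertible example):

```python
import numpy as np

# Assumed example with det(A) = 1, so A is invertible
A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
n = A.shape[0]
aug = np.hstack([A, np.eye(n)])       # the block [A I]

for j in range(n):
    aug[j] /= aug[j, j]               # scale so the pivot becomes 1
    for i in range(n):
        if i != j:
            aug[i] -= aug[i, j] * aug[j]  # clear column j above and below

A_inv = aug[:, n:]                    # right half is now A^-1
```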

  • Independent vectors v1, ..., vk.

    No combination c1 v1 + ... + ck vk equals the zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
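
One practical test, sketched with assumed example vectors: the columns of A are independent exactly when rank(A) equals the number of columns.

```python
import numpy as np

# Assumed example vectors in R^3
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2                      # deliberately a combination of v1, v2

A_indep = np.column_stack([v1, v2])
A_dep = np.column_stack([v1, v2, v3])

rank_indep = np.linalg.matrix_rank(A_indep)  # equals 2 columns: independent
rank_dep = np.linalg.matrix_rank(A_dep)      # 2 < 3 columns: dependent
```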

  • Kirchhoff's Laws.

    Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

  • Left nullspace N(A^T).

    Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.

  • Length ||x||.

    Square root of x^T x (Pythagoras in n dimensions).

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady state eigenvector: Ms = s > 0.
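
A sketch of the steady-state behavior with an assumed 2 by 2 column-stochastic example: applying M repeatedly (i.e. forming M^k s0) drives any starting distribution toward the eigenvector s with Ms = s.

```python
import numpy as np

# Assumed example: all entries positive, each column sums to 1
M = np.array([[0.9, 0.2],
              [0.1, 0.8]])

s = np.array([0.5, 0.5])   # any starting probability distribution
for _ in range(100):
    s = M @ s              # s becomes M^k s0, approaching the steady state

# s now satisfies M s = s (the eigenvalue-1 eigenvector)
```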

  • Minimal polynomial of A.

    The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).

  • Norm ||A||.

    The ℓ2 norm of A is the maximum ratio ||Ax||/||x|| = σmax. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||F^2 = sum of all aij^2. The ℓ1 and ℓ∞ norms are the largest column and row sums of |aij|.
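
All four norms from this definition are available through numpy.linalg.norm; the matrix below is an assumed example:

```python
import numpy as np

# Assumed example matrix
A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

two_norm = np.linalg.norm(A, 2)        # l2 norm = sigma_max
fro_norm = np.linalg.norm(A, 'fro')    # sqrt of the sum of all a_ij^2
one_norm = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|

# Check ||Ax|| <= ||A|| ||x|| for one assumed vector
x = np.array([1.0, 1.0])
bound_holds = np.linalg.norm(A @ x) <= two_norm * np.linalg.norm(x) + 1e-12
```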

  • Rayleigh quotient q(x) = x^T Ax / x^T x for symmetric A: λmin ≤ q(x) ≤ λmax.

    Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
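
A quick numerical illustration of the bounds, using an assumed symmetric example and random test vectors: every quotient q(x) stays inside [λmin, λmax].

```python
import numpy as np

# Assumed symmetric example with eigenvalues 1 and 3
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

def rayleigh(A, x):
    """Rayleigh quotient q(x) = x^T A x / x^T x."""
    return (x @ A @ x) / (x @ x)

lam_min, lam_max = np.linalg.eigvalsh(A)  # sorted ascending for symmetric A

rng = np.random.default_rng(0)
qs = [rayleigh(A, rng.standard_normal(2)) for _ in range(50)]
```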

  • Similar matrices A and B.

    Every B = M^-1 A M has the same eigenvalues as A.

  • Skew-symmetric matrix K.

    The transpose is -K, since Kij = -Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Standard basis for Rn.

    Columns of the n by n identity matrix (written i, j, k in R^3).

  • Toeplitz matrix.

    Constant down each diagonal = time-invariant (shift-invariant) filter.

  • Transpose matrix A^T.

    Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, and positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
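
The identities in this entry can be checked directly in NumPy (A and B below are assumed examples of compatible shapes):

```python
import numpy as np

# Assumed example: A is 2 by 3, so A.T is 3 by 2
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
B = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])

AtA = A.T @ A                                      # square, symmetric
ok_symmetric = np.allclose(AtA, AtA.T)
ok_product = np.allclose((A @ B).T, B.T @ A.T)     # (AB)^T = B^T A^T
ok_psd = bool(np.all(np.linalg.eigvalsh(AtA) >= -1e-9))  # A^T A is PSD
```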

  • Vandermonde matrix V.

    Vc = b gives the coefficients of p(x) = c0 + ... + c(n-1) x^(n-1) with p(xi) = bi. Vij = (xi)^(j-1) and det V = product of (xk - xi) for k > i.
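
Solving Vc = b is polynomial interpolation; a sketch with assumed sample points, using numpy.vander to build V:

```python
import numpy as np

# Assumed sample points x_i and values b_i = p(x_i); here p(x) = 1 + x + x^2
xs = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 3.0, 7.0])

V = np.vander(xs, increasing=True)  # V[i, j] = xs[i] ** j
c = np.linalg.solve(V, b)           # coefficients c0, c1, c2

# det V = product of (x_k - x_i) for k > i: (1-0)(2-0)(2-1) = 2
detV = np.linalg.det(V)
```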