Solutions for Chapter 6.1: The Greatest Common Factor; Factoring by Grouping

Full solutions for Beginning Algebra | 11th Edition

Textbook: Beginning Algebra
Edition: 11
Authors: Margaret L. Lial, John Hornsby, Terry McGinnis
ISBN: 9780321673480

Chapter 6.1: The Greatest Common Factor; Factoring by Grouping includes 100 full step-by-step solutions. This expansive textbook survival guide covers this chapter along with the rest of the book. All 100 problems in Chapter 6.1 have been answered, and more than 38056 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for Beginning Algebra, 11th edition, by Margaret L. Lial, John Hornsby, and Terry McGinnis (ISBN 9780321673480).
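
For readers skimming what the chapter covers, here is a minimal worked sketch of both techniques using sympy; the polynomials are arbitrary examples, not problems from the guide:

    from sympy import symbols, factor

    x = symbols("x")

    # Greatest common factor: every term shares 3*x**2
    print(factor(6*x**3 + 9*x**2))           # 3*x**2*(2*x + 3)

    # Factoring by grouping: x**2*(x + 3) + 2*(x + 3) = (x + 3)*(x**2 + 2)
    print(factor(x**3 + 3*x**2 + 2*x + 6))   # (x + 3)*(x**2 + 2)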

Key math terms and definitions covered in this textbook
  • Associative Law (AB)C = A(BC).

    Parentheses can be removed to leave ABC.

  • Determinant |A| = det(A).

    Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B| and |A^T| = |A|.
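
    The listed properties are easy to spot-check numerically; a minimal sketch using numpy (matrix values arbitrary):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((4, 4))
        B = rng.standard_normal((4, 4))

        assert np.isclose(np.linalg.det(np.eye(4)), 1.0)         # det I = 1
        assert np.isclose(np.linalg.det(A @ B),                  # |AB| = |A||B|
                          np.linalg.det(A) * np.linalg.det(B))
        assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))  # |A^T| = |A|
        A_swap = A[[1, 0, 2, 3], :]                              # exchange rows 0 and 1
        assert np.isclose(np.linalg.det(A_swap), -np.linalg.det(A))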

  • Distributive Law

    A(B + C) = AB + AC. Add then multiply, or multiply then add.

  • Elimination matrix = Elementary matrix E_ij.

    The identity matrix with an extra -ℓ_ij in the i, j entry (i ≠ j). Then E_ij A subtracts ℓ_ij times row j of A from row i.
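
    As a sketch (0-based indices, example numbers arbitrary), E_ij can be built and applied with numpy:

        import numpy as np

        def elimination_matrix(n, i, j, ell):
            # Identity with -ell in the (i, j) entry, i != j
            E = np.eye(n)
            E[i, j] = -ell
            return E

        A = np.array([[2.0, 1.0, 1.0],
                      [4.0, 3.0, 3.0],
                      [8.0, 7.0, 9.0]])
        E = elimination_matrix(3, 1, 0, 2.0)   # ell = a10/a00 = 2
        print(E @ A)                           # row 1 becomes [0, 1, 1]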

  • Jordan form J = M^{-1} A M.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1 (the superdiagonal). Each block has one eigenvalue λ_k and one eigenvector.
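
    Floating-point Jordan forms are unreliable, so a sketch uses exact arithmetic in sympy; the 2x2 matrix is an arbitrary example with a repeated eigenvalue:

        from sympy import Matrix

        A = Matrix([[5, 1],
                    [-1, 3]])
        M, J = A.jordan_form()       # A = M * J * M**(-1)
        print(J)                     # one 2x2 block for the repeated eigenvalue 4
        assert A == M * J * M.inv()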

  • Left inverse A^+.

    If A has full column rank n, then A^+ = (A^T A)^{-1} A^T has A^+ A = I_n.
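
    A quick numpy check of the formula (the tall matrix is arbitrary, with full column rank n = 2):

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [1.0, 2.0]])
        A_plus = np.linalg.inv(A.T @ A) @ A.T   # (A^T A)^{-1} A^T
        print(np.round(A_plus @ A, 12))         # I_2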

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
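
    All four descriptions can be checked side by side; a minimal numpy sketch (shapes and values arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        A, B = rng.standard_normal((3, 4)), rng.standard_normal((4, 2))

        by_entries = np.array([[A[i] @ B[:, j] for j in range(2)] for i in range(3)])
        by_columns = np.column_stack([A @ B[:, j] for j in range(2)])
        by_rows    = np.vstack([A[i] @ B for i in range(3)])
        by_outer   = sum(np.outer(A[:, k], B[k]) for k in range(4))

        for C in (by_entries, by_columns, by_rows, by_outer):
            assert np.allclose(C, A @ B)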

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
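
    A hand-picked example makes the distinction concrete: for A = I (2x2), p(λ) = (λ - 1)^2 while m(λ) = λ - 1, and m divides p. A numpy check:

        import numpy as np

        A = np.eye(2)
        assert np.allclose(A - np.eye(2), 0)                      # m(A) = A - I = 0
        assert np.allclose((A - np.eye(2)) @ (A - np.eye(2)), 0)  # p(A) = (A - I)^2 = 0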

  • Norm ||A||.

    The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
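
    Each identity can be verified with numpy's built-in norms (matrices arbitrary; the small eps guards against round-off in the inequalities):

        import numpy as np

        rng = np.random.default_rng(2)
        A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
        x = rng.standard_normal(3)
        eps = 1e-12

        l2 = np.linalg.norm(A, 2)   # sigma_max, the largest singular value
        assert np.isclose(l2, np.linalg.svd(A, compute_uv=False)[0])
        assert np.linalg.norm(A @ x) <= l2 * np.linalg.norm(x) + eps
        assert np.linalg.norm(A @ B, 2) <= l2 * np.linalg.norm(B, 2) + eps
        assert np.linalg.norm(A + B, 2) <= l2 + np.linalg.norm(B, 2) + eps

        assert np.isclose(np.linalg.norm(A, "fro")**2, (A**2).sum())
        assert np.isclose(np.linalg.norm(A, 1), np.abs(A).sum(axis=0).max())       # max column sum
        assert np.isclose(np.linalg.norm(A, np.inf), np.abs(A).sum(axis=1).max())  # max row sum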

  • Normal equation A^T A x̂ = A^T b.

    Gives the least-squares solution x̂ to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - A x̂) = 0.
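
    A minimal least-squares sketch (the data points are made up) that solves the normal equation directly and checks the orthogonality statement:

        import numpy as np

        t = np.array([0.0, 1.0, 2.0, 3.0])
        b = np.array([1.0, 2.0, 2.0, 4.0])
        A = np.column_stack([np.ones_like(t), t])     # fit b ≈ c + d*t, full rank n = 2

        x_hat = np.linalg.solve(A.T @ A, A.T @ b)     # A^T A x_hat = A^T b
        assert np.allclose(A.T @ (b - A @ x_hat), 0)  # columns of A are orthogonal to the residual
        assert np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0])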

  • Normal matrix.

    If N N^T = N^T N, then N has orthonormal (complex) eigenvectors.

  • Orthonormal vectors q_1, ..., q_n.

    Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
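
    numpy's QR factorization supplies orthonormal columns to test the statements on (the starting matrix and vector are arbitrary):

        import numpy as np

        rng = np.random.default_rng(3)
        Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # m = n = 3

        assert np.allclose(Q.T @ Q, np.eye(3))             # Q^T Q = I
        assert np.allclose(Q.T, np.linalg.inv(Q))          # Q^T = Q^{-1} when square

        v = rng.standard_normal(3)
        recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(3))
        assert np.allclose(recon, v)                       # v = sum of (v^T q_j) q_j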

  • Outer product uv^T.

    Column times row = rank-one matrix.
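
    One line of numpy confirms the rank (u and v arbitrary):

        import numpy as np

        u, v = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0])
        assert np.linalg.matrix_rank(np.outer(u, v)) == 1   # column times row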

  • Particular solution x_p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Pascal matrix

    P_S = pascal(n) = the symmetric matrix with binomial entries C(i+j-2, i-1). P_S = P_L P_U all contain Pascal's triangle with det = 1 (see Pascal in the index).
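
    A sketch that builds the matrices from binomial coefficients and checks both claims (n = 5 is arbitrary; P_U is taken as P_L^T):

        import numpy as np
        from math import comb

        n = 5
        PS = np.array([[comb(i + j - 2, i - 1) for j in range(1, n + 1)]
                       for i in range(1, n + 1)], dtype=float)
        PL = np.array([[comb(i - 1, j - 1) for j in range(1, n + 1)]
                       for i in range(1, n + 1)], dtype=float)   # comb gives 0 above the diagonal

        assert np.allclose(PS, PL @ PL.T)          # PS = PL * PU
        assert np.isclose(np.linalg.det(PS), 1.0)
        assert np.isclose(np.linalg.det(PL), 1.0)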

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

  • Schur complement S = D - C A^{-1} B.

    Appears in block elimination on the block matrix [A B; C D].
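
    A block-elimination sketch in numpy (block values arbitrary; A is shifted to keep it invertible):

        import numpy as np

        rng = np.random.default_rng(4)
        A = rng.standard_normal((2, 2)) + 2 * np.eye(2)
        B, C, D = (rng.standard_normal((2, 2)) for _ in range(3))

        M = np.block([[A, B], [C, D]])
        S = D - C @ np.linalg.inv(A) @ B          # Schur complement of A

        # Eliminate the C block: multiply on the left by [[I, 0], [-C A^{-1}, I]]
        E = np.block([[np.eye(2), np.zeros((2, 2))],
                      [-C @ np.linalg.inv(A), np.eye(2)]])
        assert np.allclose((E @ M)[2:, 2:], S)    # S lands in the lower-right block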

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix.
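
    All three properties can be checked numerically; a sketch using scipy's matrix exponential (K is an arbitrary skew-symmetric example):

        import numpy as np
        from scipy.linalg import expm

        K = np.array([[0.0, 2.0, -1.0],
                      [-2.0, 0.0, 3.0],
                      [1.0, -3.0, 0.0]])
        assert np.allclose(K.T, -K)

        assert np.allclose(np.linalg.eigvals(K).real, 0.0)   # pure imaginary eigenvalues

        Q = expm(0.7 * K)                                    # e^{Kt} with t = 0.7
        assert np.allclose(Q.T @ Q, np.eye(3))               # orthogonal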

  • Symmetric factorizations A = LDL^T and A = QΛQ^T.

    The signs of the eigenvalues in Λ match the signs of the pivots in D.
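
    A 2x2 indefinite example (hand-picked so no row exchanges are needed) shows the sign agreement:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, -1.0]])

        ell = A[1, 0] / A[0, 0]                       # one elimination step
        L = np.array([[1.0, 0.0], [ell, 1.0]])
        D = np.diag([A[0, 0], A[1, 1] - ell * A[0, 1]])
        assert np.allclose(L @ D @ L.T, A)            # A = L D L^T

        # One positive and one negative pivot; same signs as the eigenvalues
        assert sorted(np.sign(np.diag(D))) == sorted(np.sign(np.linalg.eigvalsh(A)))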

  • Tridiagonal matrix T: t_ij = 0 if |i - j| > 1.

    T^{-1} has rank 1 above and below the diagonal.
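
    For the standard second-difference matrix (one convenient tridiagonal example), the inverse has the known closed form min(i, j)(n+1-max(i, j))/(n+1), which equals u_i v_j on and above the diagonal; a numpy check:

        import numpy as np

        n = 4
        T = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        Tinv = np.linalg.inv(T)

        u = np.arange(1, n + 1, dtype=float)              # u_i = i
        v = (n + 1 - np.arange(1, n + 1)) / (n + 1)       # v_j = (n+1-j)/(n+1)
        upper = np.triu(np.ones((n, n), dtype=bool))
        assert np.allclose(Tinv[upper], np.outer(u, v)[upper])   # rank 1 above the diagonal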
