College Algebra 7th Edition - Solutions by Chapter

Textbook: College Algebra
Edition: 7
Author: Robert F. Blitzer
ISBN: 9780134469164

This textbook survival guide was created for College Algebra, 7th edition, by Robert F. Blitzer (ISBN 9780134469164). The guide covers all 63 chapters, and more than 48,156 students have viewed its full step-by-step answers. The step-by-step solutions were prepared by our top Math solution expert on 03/08/18, 08:30PM.

Key Math Terms and Definitions covered in this textbook
  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

  • Companion matrix.

    Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n-1) - λ^n).

  • Eigenvalue λ and eigenvector x.

    Ax = λx with x ≠ 0, so det(A - λI) = 0. (A numerical check appears in the sketches after this glossary.)

  • Elimination matrix = Elementary matrix Eij.

    The identity matrix with an extra -eij in the i, j entry (i ≠ j). Then Eij A subtracts eij times row j of A from row i.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers eij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.

  • Exponential e^(At) = I + At + (At)^2/2! + ...

    has derivative Ae^(At); e^(At) u(0) solves u' = Au.

  • Fast Fourier Transform (FFT).

    A factorization of the Fourier matrix Fn into ℓ = log2 n matrices Si times a permutation. Each Si needs only n/2 multiplications, so Fn x and Fn^(-1) c can be computed with nℓ/2 multiplications. Revolutionary.

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Fundamental Theorem.

    The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0), with dimensions r and n - r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.

  • Hessenberg matrix H.

    Triangular matrix with one extra nonzero adjacent diagonal.

  • Inverse matrix A^(-1).

    Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))ij = Cji / det A.

  • Linear combination cv + dw or Σ cj vj.

    Vector addition and scalar multiplication.

  • Markov matrix M.

    All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady-state eigenvector Ms = s > 0 (illustrated in a sketch after this glossary).

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.

  • Projection p = a(a^T b / a^T a) onto the line through a.

    P = a a^T / a^T a has rank 1 (see the sketch after this glossary).

  • Simplex method for linear programming.

    The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

  • Singular Value Decomposition (SVD).

    A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Avi = σi ui and singular values σi > 0. The last columns are orthonormal bases of the nullspaces. (A numerical check appears after this glossary.)

  • Transpose matrix A^T.

    Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, and positive semidefinite. The transposes of AB and A^(-1) are B^T A^T and (A^T)^(-1).

  • Vandermonde matrix V.

    Vc = b gives the coefficients of p(x) = c0 + ... + c_(n-1) x^(n-1) with p(xi) = bi. Vij = (xi)^(j-1) and det V = product of (xk - xi) for k > i.
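
The eigenvalue entry above (Ax = λx with x ≠ 0, so det(A - λI) = 0) can be checked numerically. Below is a minimal sketch using NumPy; the 2 by 2 matrix A is only an illustrative choice, not an example from the textbook.

    import numpy as np

    # Example matrix, chosen only for illustration.
    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # np.linalg.eig returns the eigenvalues and unit eigenvectors (columns of V).
    eigvals, V = np.linalg.eig(A)

    for lam, x in zip(eigvals, V.T):
        print(np.allclose(A @ x, lam * x))         # Ax = λx holds for each pair
        print(np.linalg.det(A - lam * np.eye(2)))  # det(A - λI) is numerically 0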
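
The Markov matrix entry can be illustrated the same way: raise a small column-stochastic matrix to a high power and compare its columns with the λ = 1 eigenvector. A sketch under the same assumptions (NumPy; the matrix M is an invented example).

    import numpy as np

    # 2 by 2 Markov matrix: positive entries, each column sums to 1 (illustrative).
    M = np.array([[0.9, 0.3],
                  [0.1, 0.7]])

    # Columns of M^k approach the steady-state eigenvector s with Ms = s.
    print(np.linalg.matrix_power(M, 50))

    eigvals, V = np.linalg.eig(M)
    s = V[:, np.argmax(eigvals)].real
    s = s / s.sum()   # scale the λ = 1 eigenvector so its entries sum to 1
    print(s)          # both columns of M^50 should be close to s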
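
For the projection entry, the formula p = a(a^T b / a^T a) and the rank-1 matrix P = a a^T / a^T a can be verified directly. A short sketch; the vectors a and b are assumed examples.

    import numpy as np

    a = np.array([1.0, 2.0, 2.0])   # line direction (illustrative)
    b = np.array([3.0, 0.0, 3.0])   # vector to project (illustrative)

    # Projection of b onto the line through a: p = a (a^T b / a^T a).
    p = a * (a @ b) / (a @ a)

    # Projection matrix P = a a^T / a^T a; it has rank 1 and P @ b equals p.
    P = np.outer(a, a) / (a @ a)
    print(p, P @ b, np.linalg.matrix_rank(P))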
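
Finally, the SVD entry (A = UΣV^T with Avi = σi ui) can be checked with NumPy's reduced SVD. The 3 by 2 matrix A below is an assumed example.

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [0.0, 2.0],
                  [3.0, 0.0]])

    # Reduced SVD: A = U diag(sigma) V^T with orthonormal columns in U and V.
    U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

    print(np.allclose(A, U @ np.diag(sigma) @ Vt))   # A is reassembled exactly
    for s_i, u_i, v_i in zip(sigma, U.T, Vt):
        print(np.allclose(A @ v_i, s_i * u_i))       # A vi = σi ui for each pair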