College Algebra 6th Edition - Solutions by Chapter

Textbook: College Algebra
Edition: 6
Author: Robert F. Blitzer
ISBN: 9780321782281

College Algebra (6th edition), by Robert F. Blitzer, is associated with ISBN 9780321782281. This textbook survival guide was created for that textbook, and its full step-by-step solutions were answered by our top Math solution expert on 03/08/18, 08:26 PM. Problems from all 63 chapters in College Algebra have been answered, and more than 72,243 students have viewed the full step-by-step answers.

Key Math Terms and definitions covered in this textbook
  • Block matrix.

    A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

  • Circulant matrix C.

    Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors are in the Fourier matrix F.

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^{-1}AS = Λ = eigenvalue matrix.

  • Diagonalization

    Λ = S^{-1}AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^{-1} (a numerical sketch follows this glossary).

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Distributive Law

    A(B + C) = AB + AC. Add then multiply, or multiply then add.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with the multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E (see the LU sketch after this glossary).

  • Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).

    Use A^H (the conjugate transpose) in place of A^T for complex A.

  • Fundamental Theorem.

    The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions r and n - r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m (a numerical check follows this glossary).

  • Jordan form J = M^{-1}AM.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of an eigenvalue λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).

  • Network.

    A directed graph that has constants c_1, ..., c_m associated with the edges.

  • Norm ||A||.

    The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. The Frobenius norm satisfies ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column sum and row sum of |a_ij| (compared numerically after this glossary).

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Particular solution x_p.

    Any solution to Ax = b; often x_p has free variables = 0.

  • Positive definite matrix A.

    Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0 (a numerical check follows this glossary).

  • Rank r(A)

    = number of pivots = dimension of column space = dimension of row space.

  • Reduced row echelon form R = rref(A).

    Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.

  • Saddle point of f(x_1, ..., x_n).

    A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = the Hessian matrix) is indefinite (a small numerical example follows this glossary).

  • Vector addition.

    v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.
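
Numerical sketches for selected terms

Diagonalization. A minimal sketch of S^{-1}AS = Λ and A^k = S Λ^k S^{-1}, assuming NumPy is available; the 2x2 matrix here is an illustrative choice, not taken from the textbook.

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])          # illustrative matrix with eigenvalues 5 and 2

    # Columns of S are eigenvectors of A; lam holds the eigenvalues.
    lam, S = np.linalg.eig(A)
    Lam = np.diag(lam)

    # S^{-1} A S reproduces the eigenvalue matrix Lam.
    print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))            # True

    # Powers follow the same pattern: A^k = S Lam^k S^{-1}.
    k = 3
    print(np.allclose(np.linalg.matrix_power(A, k),
                      S @ np.diag(lam**k) @ np.linalg.inv(S)))   # True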
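
Elimination and PA = LU. A minimal sketch assuming SciPy is installed; note that scipy.linalg.lu returns factors with the convention A = P L U, which is the glossary's PA = LU with P replaced by its transpose. The 3x3 matrix is an illustrative choice that needs a row exchange.

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[0.0, 2.0, 1.0],
                  [1.0, 1.0, 0.0],
                  [2.0, 1.0, 3.0]])

    P, L, U = lu(A)                      # L holds the multipliers, U is upper triangular
    print(np.allclose(A, P @ L @ U))     # True: SciPy's A = P L U convention
    print(np.allclose(P.T @ A, L @ U))   # True: the row-exchanged matrix factors as LU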
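
Fundamental Theorem and rank. A minimal sketch checking that every nullspace vector is perpendicular to the rows of A, i.e. that N(A) and C(A^T) are orthogonal, assuming NumPy and SciPy; the 3x3 matrix is an illustrative choice with rank 2.

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],       # dependent row, so the rank drops to 2
                  [1.0, 0.0, 1.0]])

    r = np.linalg.matrix_rank(A)         # number of pivots
    N = null_space(A)                    # orthonormal basis for N(A), shape (3, 3 - r)
    print(r, N.shape)                    # 2 (3, 1)

    # Rows of A span C(A^T); A @ N = 0 says they are perpendicular to N(A).
    print(np.allclose(A @ N, 0))         # True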
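
Norms. A minimal sketch comparing the ℓ^2, Frobenius, ℓ^1, and ℓ^∞ matrix norms from the glossary, assuming NumPy; the matrix is an illustrative choice.

    import numpy as np

    A = np.array([[1.0, -2.0],
                  [3.0,  4.0]])

    sigma_max = np.linalg.svd(A, compute_uv=False)[0]    # largest singular value
    print(np.isclose(np.linalg.norm(A, 2), sigma_max))   # True: l2 norm = sigma_max

    print(np.linalg.norm(A, 'fro'))      # Frobenius norm: sqrt of the sum of a_ij^2
    print(np.linalg.norm(A, 1))          # l1 norm: largest column sum of |a_ij| -> 6.0
    print(np.linalg.norm(A, np.inf))     # l-infinity norm: largest row sum -> 7.0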
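
Positive definite matrix. A minimal sketch of the checks in the glossary entry (positive eigenvalues, x^T A x > 0, and a factorization with positive D), assuming NumPy; the 2x2 matrix is an illustrative choice, and the Cholesky form A = L L^T is used as the computational stand-in for A = LDL^T.

    import numpy as np

    A = np.array([[ 2.0, -1.0],
                  [-1.0,  2.0]])              # symmetric, eigenvalues 1 and 3

    print(np.all(np.linalg.eigvalsh(A) > 0))  # True: positive eigenvalues

    # Cholesky succeeds exactly when A is symmetric positive definite;
    # here the positive diagonal D is absorbed into the triangular factor L.
    L = np.linalg.cholesky(A)
    print(np.allclose(L @ L.T, A))            # True

    x = np.random.randn(2)                    # spot-check x^T A x > 0
    print(x @ A @ x > 0)                      # True for nonzero x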
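
Saddle point. A small numerical example, assuming NumPy: f(x, y) = x^2 - y^2 has zero first derivatives at the origin, and its Hessian there is indefinite, so the origin is a saddle point.

    import numpy as np

    # Hessian of f(x, y) = x^2 - y^2 at (0, 0): d2f/dx2 = 2, d2f/dy2 = -2, mixed = 0.
    H = np.array([[2.0,  0.0],
                  [0.0, -2.0]])

    eigs = np.linalg.eigvalsh(H)
    print(eigs)                          # [-2.  2.]
    print(eigs.min() < 0 < eigs.max())   # True: indefinite Hessian -> saddle point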