Chapter R.1: Real Numbers
Chapter R.2: Algebra Essentials
Chapter R.3: Geometry Essentials
Chapter R.4: Polynomials
Chapter R.5: Factoring Polynomials
Chapter R.6: Synthetic Division
Chapter R.7: Rational Expressions
Chapter R.8: nth Roots; Rational Exponents
Chapter 1: Equations and Inequalities
Chapter 1.1: Linear Equations
Chapter 1.2: Quadratic Equations
Chapter 1.3: Complex Numbers; Quadratic Equations in the Complex Number System
Chapter 1.4: Radical Equations; Equations Quadratic in Form; Factorable Equations
Chapter 1.5: Solving Inequalities
Chapter 1.6: Equations and Inequalities Involving Absolute Value
Chapter 1.7: Problem Solving: Interest, Mixture, Uniform Motion, Constant Rate Job Applications
Chapter 2: Graphs
Chapter 2.1: The Distance and Midpoint Formulas
Chapter 2.2: Graphs of Equations in Two Variables; Intercepts; Symmetry
Chapter 2.3: Lines
Chapter 2.4: Circles
Chapter 2.5: Variation
Chapter 3: Functions and Their Graphs
Chapter 3.1: Functions
Chapter 3.2: The Graph of a Function
Chapter 3.3: Properties of Functions
Chapter 3.4: Library of Functions; Piecewise-defined Functions
Chapter 3.5: Graphing Techniques: Transformations
Chapter 3.6: Mathematical Models: Building Functions
Chapter 4: Linear and Quadratic Functions
Chapter 4.1: Linear Functions and Their Properties
Chapter 4.2: Linear Models: Building Linear Functions from Data
Chapter 4.3: Quadratic Functions and Their Properties
Chapter 4.4: Build Quadratic Models from Verbal Descriptions and from Data
Chapter 4.5: Inequalities Involving Quadratic Functions
Chapter 5.1: Polynomial Functions and Models
Chapter 6: Exponential and Logarithmic Functions
Chapter 6.1: Composite Functions
Chapter 6.2: One-to-One Functions; Inverse Functions
Chapter 6.3: Exponential Functions
Chapter 6.4: Logarithmic Functions
Chapter 6.5: Properties of Logarithms
Chapter 6.6: Logarithmic and Exponential Equations
Chapter 6.7: Financial Models
Chapter 6.8: Exponential Growth and Decay Models; Newton's Law; Logistic Growth and Decay Models
Chapter 6.9: Building Exponential, Logarithmic, and Logistic Models from Data
Chapter 7: Analytic Geometry
Chapter 7.2: The Parabola
Chapter 7.3: The Ellipse
Chapter 7.4: The Hyperbola
Chapter 8: Systems of Equations and Inequalities
Chapter 8.1: Systems of Linear Equations: Substitution and Elimination
Chapter 8.2: Systems of Linear Equations: Matrices
Chapter 8.3: Systems of Linear Equations: Determinants
Chapter 8.4: Matrix Algebra
Chapter 8.5: Partial Fraction Decomposition
Chapter 8.6: Systems of Nonlinear Equations
Chapter 8.7: Systems of Inequalities
Chapter 9: Sequences; Induction; the Binomial Theorem
Chapter 9.1: Sequences
Chapter 9.2: Arithmetic Sequences
Chapter 9.3: Geometric Sequences; Geometric Series
Chapter 9.4: Mathematical Induction
Chapter 9.5: The Binomial Theorem
Chapter 10: Counting and Probability
Chapter 10.1: Counting
Chapter 10.2: Permutations and Combinations
Chapter 10.3: Probability
College Algebra, 9th Edition: Solutions by Chapter
Full solutions for College Algebra, 9th Edition
ISBN: 9780321716811
The full step-by-step solutions to the problems in College Algebra were answered by our top Math solution expert on 03/19/18, 03:33PM. College Algebra is associated with ISBN 9780321716811. This expansive textbook survival guide covers 68 chapters. Since problems from all 68 chapters in College Algebra have been answered, more than 24,164 students have viewed full step-by-step answers. This textbook survival guide was created for the textbook College Algebra, Edition 9.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
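A quick NumPy sketch of this rank criterion (the matrices A and the right sides here are made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])          # rank 1: the columns are dependent
b_in = A @ np.array([1.0, 1.0])     # a combination of A's columns, so solvable
b_out = np.array([1.0, 0.0, 0.0])   # not in the column space of A

rank_A = np.linalg.matrix_rank(A)
rank_Ab_in = np.linalg.matrix_rank(np.column_stack([A, b_in]))
rank_Ab_out = np.linalg.matrix_rank(np.column_stack([A, b_out]))
# Ax = b is solvable exactly when rank([A b]) == rank(A)
```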

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
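A small NumPy check of block multiplication (the 4x4 matrices and the 2x2 partition are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Partition each matrix into four 2x2 blocks.
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# Block rule: the (1,1) block of AB is A11 B11 + A12 B21, and so on.
C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])

blocks_match = np.allclose(C, A @ B)
```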

Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
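For a 2x2 matrix the characteristic polynomial is λ² − (trace A)λ + det A, so the two routes to the eigenvalues can be compared directly (the matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Roots of det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A) = 0
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(char_poly))

# Same numbers straight from the eigenvalue routine
eigs = np.sort(np.linalg.eigvals(A))
```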

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ··· + cnλ^(n−1) − λ^n).
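A minimal sketch of that construction for the polynomial λ² − 5λ + 6 = (λ − 2)(λ − 3), i.e. c1 = −6 and c2 = 5 (the polynomial is chosen just so the roots are easy to check):

```python
import numpy as np

# Companion matrix: ones above the main diagonal, c1, ..., cn in the last row.
c1, c2 = -6.0, 5.0
C = np.array([[0.0, 1.0],
              [c1,  c2]])

# Its eigenvalues are the roots of lambda^2 - c2*lambda - c1 = lambda^2 - 5*lambda + 6.
eigs = np.sort(np.linalg.eigvals(C))
```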

Complex conjugate
z̄ = a − ib for any complex number z = a + ib. Then z z̄ = |z|^2.
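A one-line check of z z̄ = |z|² (z = 3 + 4i is an arbitrary example):

```python
import numpy as np

z = 3 + 4j
zbar = np.conj(z)     # a - ib

product = z * zbar    # real and equal to |z|^2 = 9 + 16 = 25
```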

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
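A minimal elimination sketch, assuming no row exchanges are needed (the matrix is chosen so every pivot is nonzero; a production code would pivot):

```python
import numpy as np

def lu_no_pivot(A):
    """Elimination without row exchanges: A = L U with unit lower triangular L.
    Assumes every pivot is nonzero, so no permutation P is needed."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]     # multiplier l_ik stored in L
            U[i, :] -= L[i, k] * U[k, :]    # row_i -= l_ik * row_k
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
```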

Hilbert matrix hilb(n).
Entries Hij = 1/(i + j − 1) = ∫₀¹ x^(i−1) x^(j−1) dx. Positive definite but with extremely small λmin and large condition number: H is ill-conditioned.
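The ill-conditioning is easy to observe numerically; a sketch with a hand-built Hilbert matrix (n = 8 is an arbitrary size):

```python
import numpy as np

def hilbert(n):
    # H[i, j] = 1 / (i + j - 1) with 1-based indices i, j
    i = np.arange(1, n + 1)
    return 1.0 / (i[:, None] + i[None, :] - 1)

H = hilbert(8)
cond = np.linalg.cond(H)                 # enormous: H is ill-conditioned
lam_min = np.linalg.eigvalsh(H).min()    # tiny but positive: H is positive definite
```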

Inverse matrix A⁻¹.
Square matrix with A⁻¹A = I and AA⁻¹ = I. No inverse if det A = 0, rank(A) < n, and Ax = 0 for a nonzero vector x. The inverses of AB and Aᵀ are B⁻¹A⁻¹ and (A⁻¹)ᵀ. Cofactor formula: (A⁻¹)ij = Cji / det A.
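A quick numerical check of the inverse identities (random 3x3 matrices, which are invertible with probability 1):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

Ainv = np.linalg.inv(A)

# (AB)^-1 = B^-1 A^-1  and  (A^T)^-1 = (A^-1)^T
check_product = np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv)
check_transpose = np.allclose(np.linalg.inv(A.T), Ainv.T)
```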

Jordan form J = M⁻¹AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J1, ..., Js). The block Jk is λkIk + Nk where Nk has 1's on diagonal 1 (the superdiagonal). Each block has one eigenvalue λk and one eigenvector.

Left nullspace N (AT).
Nullspace of Aᵀ = "left nullspace" of A because yᵀA = 0ᵀ.

Length ‖x‖.
Square root of xᵀx (Pythagoras in n dimensions).

Markov matrix M.
All mij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady-state eigenvector s with Ms = s > 0.
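A small sketch of that convergence (the 2x2 column-stochastic matrix and the starting vector are arbitrary examples; the steady state here is s = (0.6, 0.4)):

```python
import numpy as np

# Markov matrix: entries >= 0, each column sums to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])

# Repeated multiplication drives any probability vector toward s with M s = s.
v = np.array([1.0, 0.0])
for _ in range(100):
    v = M @ v

eigs = np.linalg.eigvals(M)   # largest eigenvalue is 1
```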

Nullspace N(A)
= All solutions to Ax = 0. Dimension n − r = (# columns) − rank.
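The dimension count n − r can be checked with an SVD: the rows of Vᵀ past the rank span the nullspace (the rank-1 matrix here is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])    # rank 1, n = 3 columns

# Nullspace basis from the SVD: right singular vectors beyond the rank.
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T            # columns span N(A)

dim_nullspace = null_basis.shape[1]  # should equal n - r = 3 - 1 = 2
```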

Outer product uvᵀ
= column times row = rank-one matrix.
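One line in NumPy (the vectors are arbitrary examples):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

R = np.outer(u, v)                 # 3x2 matrix: column u times row v
rank = np.linalg.matrix_rank(R)    # always rank one (for nonzero u, v)
```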

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
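A quick sketch with an even 3x3 permutation (the order 2, 0, 1 is an arbitrary choice, written with 0-based indices):

```python
import numpy as np

order = [2, 0, 1]          # an even permutation (one 3-cycle = two exchanges)
P = np.eye(3)[order]       # rows of I in that order

A = np.arange(9.0).reshape(3, 3)
permuted = P @ A           # rows of A in the same order 2, 0, 1

detP = np.linalg.det(P)    # +1 for even, -1 for odd
```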

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
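The standard example is f(x, y) = x² − y², whose gradient vanishes at the origin while the (constant) Hessian is indefinite:

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2: d2f/dx2 = 2, d2f/dy2 = -2, mixed partials 0.
hessian = np.array([[2.0, 0.0],
                    [0.0, -2.0]])

eigs = np.linalg.eigvalsh(hessian)          # one positive, one negative
indefinite = eigs.min() < 0 < eigs.max()    # indefinite => saddle point
```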

Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
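Both trace facts can be verified numerically (random 3x3 matrices; a real matrix's eigenvalues may be complex, but their sum is real):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

trace_A = np.trace(A)                           # sum of diagonal entries
sum_eigs = np.sum(np.linalg.eigvals(A)).real    # sum of eigenvalues

cyclic = np.isclose(np.trace(A @ B), np.trace(B @ A))   # Tr AB = Tr BA
```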