- Chapter 1: Graphs, Functions, and Models
- Chapter 1.1: Introduction to Graphing
- Chapter 1.2: Functions and Graphs
- Chapter 1.3: Linear Functions, Slope, and Applications
- Chapter 1.4: Equations of Lines and Modeling
- Chapter 1.5: Linear Equations, Functions, Zeros, and Applications
- Chapter 1.6: Solving Linear Inequalities
- Chapter 2: More on Functions
- Chapter 2.1: Increasing, Decreasing, and Piecewise Functions; Applications
- Chapter 2.2: The Algebra of Functions
- Chapter 2.3: The Composition of Functions
- Chapter 2.4: Symmetry
- Chapter 2.5: Transformations
- Chapter 2.6: Variation and Applications
- Chapter 3: Quadratic Functions and Equations; Inequalities
- Chapter 3.1: The Complex Numbers
- Chapter 3.2: Quadratic Equations, Functions, Zeros, and Models
- Chapter 3.3: Analyzing Graphs of Quadratic Functions
- Chapter 3.4: Solving Rational Equations and Radical Equations
- Chapter 3.5: Solving Equations and Inequalities with Absolute Value
- Chapter 4: Polynomial Functions and Rational Functions
- Chapter 4.1: Polynomial Functions and Modeling
- Chapter 4.2: Graphing Polynomial Functions
- Chapter 4.3: Polynomial Division; The Remainder Theorem and the Factor Theorem
- Chapter 4.4: Theorems about Zeros of Polynomial Functions
- Chapter 4.5: Rational Functions
- Chapter 4.6: Polynomial Inequalities and Rational Inequalities
- Chapter 5: Exponential Functions and Logarithmic Functions
- Chapter 5.1: Inverse Functions
- Chapter 5.2: Exponential Functions and Graphs
- Chapter 5.3: Logarithmic Functions and Graphs
- Chapter 5.4: Properties of Logarithmic Functions
- Chapter 5.5: Solving Exponential Equations and Logarithmic Equations
- Chapter 5.6: Applications and Models: Growth and Decay; Compound Interest
- Chapter 6: Systems of Equations and Matrices
- Chapter 6.1: Systems of Equations in Two Variables
- Chapter 6.2: Systems of Equations in Three Variables
- Chapter 6.3: Matrices and Systems of Equations
- Chapter 6.4: Matrix Operations
- Chapter 6.5: Inverses of Matrices
- Chapter 6.6: Determinants and Cramer's Rule
- Chapter 6.7: Systems of Inequalities and Linear Programming
- Chapter 6.8: Partial Fractions
- Chapter 7: Conic Sections
- Chapter 7.1: The Parabola
- Chapter 7.2: The Circle and the Ellipse
- Chapter 7.3: The Hyperbola
- Chapter 7.4: Nonlinear Systems of Equations and Inequalities
- Chapter 8: Sequences, Series, and Combinatorics
- Chapter 8.1: Sequences and Series
- Chapter 8.2: Arithmetic Sequences and Series
- Chapter 8.3: Geometric Sequences and Series
- Chapter 8.4: Mathematical Induction
- Chapter 8.5: Combinatorics: Permutations
- Chapter 8.6: Combinatorics: Combinations
- Chapter 8.7: The Binomial Theorem
- Chapter 8.8: Probability
- Chapter R: Basic Concepts of Algebra
- Chapter R.1: The Real-Number System
- Chapter R.2: Integer Exponents, Scientific Notation, and Order of Operations
- Chapter R.3: Addition, Subtraction, and Multiplication of Polynomials
- Chapter R.4: Factoring
- Chapter R.5: The Basics of Equation Solving
- Chapter R.6: Rational Expressions
- Chapter R.7: Radical Notation and Rational Exponents
College Algebra: Graphs and Models 5th Edition - Solutions by Chapter
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
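The symmetry test for an undirected graph is easy to demonstrate numerically (a NumPy sketch, not part of the glossary; the 3-node graph is an arbitrary example):

```python
import numpy as np

# Undirected graph on 3 nodes with edges 0-1 and 1-2,
# stored in both directions so that A equals its transpose.
A = np.zeros((3, 3), dtype=int)
for i, j in [(0, 1), (1, 2)]:
    A[i, j] = 1
    A[j, i] = 1

undirected = np.array_equal(A, A.T)   # A = A^T for an undirected graph
```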
Cayley-Hamilton theorem.
p(λ) = det(A - λI) has p(A) = zero matrix.
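A quick numerical check of this identity (an illustrative NumPy sketch; the matrix A below is an arbitrary example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Coefficients of the characteristic polynomial p(lambda) = det(A - lambda*I),
# highest degree first: for this A they are [1, -5, 5].
coeffs = np.poly(A)

# Evaluate p(A) by Horner's rule, with matrix powers in place of powers of x.
pA = np.zeros_like(A)
for c in coeffs:
    pA = pA @ A + c * np.eye(A.shape[0])
```

pA comes out as the zero matrix, as the theorem promises.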
Distributive law.
A(B + C) = AB + AC. Add then multiply, or multiply then add.
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)^T (column j of B).
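These formulas can be checked directly (a NumPy sketch; the vectors are arbitrary examples):

```python
import numpy as np

x = np.array([1.0, 2.0, 2.0])
y = np.array([2.0, 2.0, -3.0])
dot = x @ y        # x_1*y_1 + ... + x_n*y_n = 2 + 4 - 6 = 0, so x is perpendicular to y

# The complex dot product conjugates the first vector:
# np.vdot computes conj(u)^T v.
u = np.array([1 + 1j, 2j])
v = np.array([1 - 1j, 1j])
complex_dot = np.vdot(u, v)
```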
Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.
Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Linearly dependent v_1, ..., v_n.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.
Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
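The equivalence of these definitions is easy to verify on a small example (a NumPy sketch, not part of the glossary):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[5.0, 6.0],
              [7.0, 8.0]])

AB = A @ B

# Entry rule: (AB)[i, j] = (row i of A) . (column j of B)
entry_01 = sum(A[0, k] * B[k, 1] for k in range(2))

# Columns times rows: AB = sum over k of (column k of A)(row k of B)
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
```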
Minimal polynomial of A.
The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
Nullspace N(A)
= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.
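The dimension count n - r can be confirmed with the SVD (a NumPy sketch; the rank-1 matrix below is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second row = 2 * first row, so rank 1
r = np.linalg.matrix_rank(A)
n = A.shape[1]

# The last n - r right singular vectors span the nullspace N(A).
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[r:].T             # columns solve A x = 0
```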
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
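Both properties, row reordering and det P = ±1, show up in a small example (a NumPy sketch):

```python
import numpy as np

order = [2, 0, 1]               # rows of I in this order
P = np.eye(3)[order]

A = np.arange(9.0).reshape(3, 3)
PA = P @ A                      # rows of A in the same order

sign = round(np.linalg.det(P))  # +1 here: [2, 0, 1] is an even permutation
```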
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S then P = A(A^T A)^(-1) A^T.
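The listed properties follow from the formula and can be verified numerically (a NumPy sketch; A and b are arbitrary examples):

```python
import numpy as np

# Columns of A form a basis for a plane S in R^3.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([1.0, 2.0, 0.0])
p = P @ b          # closest point to b in S
e = b - p          # error, perpendicular to every column of A
```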
Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.
Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i ∂x_j = Hessian matrix) is indefinite.
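The classic example is f(x, y) = x² - y², which has a saddle at the origin; its Hessian is constant and indefinite (a NumPy sketch):

```python
import numpy as np

# Hessian of f(x, y) = x^2 - y^2: diagonal second derivatives 2 and -2,
# mixed second derivatives 0.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])
eigs = np.linalg.eigvalsh(H)
indefinite = eigs.min() < 0 < eigs.max()   # eigenvalues of both signs
```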
Similar matrices A and B.
Every B = M^(-1)AM has the same eigenvalues as A.
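A quick check with an arbitrary invertible M (a NumPy sketch, not part of the glossary):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # eigenvalues 5 and 2
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # any invertible matrix works
B = np.linalg.inv(M) @ A @ M     # B is similar to A

eigA = np.sort(np.linalg.eigvals(A))
eigB = np.sort(np.linalg.eigvals(B))
```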
Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs of the eigenvalues in Λ = signs of the pivots in D.
Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
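Both identities are easy to confirm (a NumPy sketch; A and B are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

trace_A = np.trace(A)                    # 1 + 4 = 5
eig_sum = np.sum(np.linalg.eigvals(A))   # also 5: sum of eigenvalues
same = np.isclose(np.trace(A @ B), np.trace(B @ A))   # Tr AB = Tr BA
```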
Vector addition.
v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.