- Chapter 1: Algebra, Mathematical Models, and Problem Solving
- Chapter 1.1: Algebraic Expressions, Real Numbers, and Interval Notation
- Chapter 1.2: Operations with Real Numbers and Simplifying Algebraic Expressions
- Chapter 1.3: Graphing Equations
- Chapter 1.4: Solving Linear Equations
- Chapter 1.5: Problem Solving and Using Formulas
- Chapter 1.6: Properties of Integral Exponents
- Chapter 1.7: Scientific Notation
- Chapter 2: Functions and Linear Equations
- Chapter 2.1: Introduction to Functions
- Chapter 2.2: Graphs of Functions
- Chapter 2.3: The Algebra of Functions
- Chapter 2.4: Linear Functions and Slope
- Chapter 2.5: The Point-Slope Form of the Equation of a Line
- Chapter 3: Systems of Linear Equations
- Chapter 3.1: Systems of Linear Equations in Two Variables
- Chapter 3.2: Problem Solving and Business Applications Using Systems of Equations
- Chapter 3.3: Systems of Linear Equations in Three Variables
- Chapter 3.4: Matrix Solutions of Linear Systems
- Chapter 3.5: Determinants and Cramer's Rule
- Chapter 4: Inequalities and Problem Solving
- Chapter 4.1: Solving Linear Inequalities
- Chapter 4.2: Compound Inequalities
- Chapter 4.3: Equations and Inequalities Involving Absolute Value
- Chapter 4.4: Linear Inequalities in Two Variables
- Chapter 4.5: Linear Programming
- Chapter 5: Polynomials, Polynomial Functions, and Factoring
- Chapter 5.1: Introduction to Polynomials and Polynomial Functions
- Chapter 5.2: Multiplication of Polynomials
- Chapter 5.3: Greatest Common Factors and Factoring by Grouping
- Chapter 5.4: Factoring Trinomials
- Chapter 5.5: Factoring Special Forms
- Chapter 5.6: A General Factoring Strategy
- Chapter 5.7: Polynomial Equations and Their Applications
- Chapter 6: Rational Expressions, Functions, and Equations
- Chapter 6.1: Rational Expressions and Functions: Multiplying and Dividing
- Chapter 6.2: Adding and Subtracting Rational Expressions
- Chapter 6.3: Complex Rational Expressions
- Chapter 6.4: Division of Polynomials
- Chapter 6.5: Synthetic Division and the Remainder Theorem
- Chapter 6.6: Rational Equations
- Chapter 6.7: Formulas and Applications of Rational Equations
- Chapter 6.8: Modeling Using Variation
- Chapter 7: Radicals, Radical Functions, and Rational Exponents
- Chapter 7.1: Radical Expressions and Functions
- Chapter 7.2: Rational Exponents
- Chapter 7.3: Multiplying and Simplifying Radical Expressions
- Chapter 7.4: Adding, Subtracting, and Dividing Radical Expressions
- Chapter 7.5: Multiplying with More Than One Term and Rationalizing Denominators
- Chapter 7.6: Radical Equations
- Chapter 8: Quadratic Equations and Functions
- Chapter 8.1: The Square Root Property and Completing the Square
- Chapter 8.2: The Quadratic Formula
- Chapter 8.3: Quadratic Functions and Their Graphs
- Chapter 8.4: Equations Quadratic in Form
- Chapter 8.5: Polynomial and Rational Inequalities
- Chapter 9: Exponential and Logarithmic Functions
- Chapter 9.1: Exponential Functions
- Chapter 9.2: Composite and Inverse Functions
- Chapter 9.3: Logarithmic Functions
- Chapter 9.4: Properties of Logarithms
- Chapter 9.5: Exponential and Logarithmic Equations
- Chapter 9.6: Exponential Growth and Decay; Modeling Data
- Chapter 10: Conic Sections and Systems of Nonlinear Equations
- Chapter 10.1: Distance and Midpoint Formulas; Circles
- Chapter 10.2: The Ellipse
- Chapter 10.3: The Hyperbola
- Chapter 10.4: The Parabola; Identifying Conic Sections
- Chapter 10.5: Systems of Nonlinear Equations in Two Variables
- Chapter 11: Sequences, Series, and the Binomial Theorem
- Chapter 11.1: Sequences and Summation Notation
- Chapter 11.2: Arithmetic Sequences
- Chapter 11.3: Geometric Sequences and Series
- Chapter 11.4: The Binomial Theorem
Intermediate Algebra for College Students 6th Edition - Solutions by Chapter
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
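The definition above can be sketched in a few lines of Python; the edge list and node count here are made-up values for illustration only.

```python
# Build the adjacency matrix of a small directed graph (hypothetical example).
# A[i][j] = 1 when there is an edge from node i to node j, else 0.
edges = [(0, 1), (1, 2), (2, 0), (1, 0)]  # assumed edge list
n = 3
A = [[0] * n for _ in range(n)]
for i, j in edges:
    A[i][j] = 1

# A equals its transpose exactly when every edge goes both ways (undirected).
is_undirected = all(A[i][j] == A[j][i] for i in range(n) for j in range(n))
print(A)              # [[0, 1, 0], [1, 0, 1], [1, 0, 0]]
print(is_undirected)  # False: edge 1 -> 2 has no reverse edge
```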
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Kronecker product (tensor product) A ⊗ B.
Blocks aij B, eigenvalues λp(A)λq(B).
Determinant |A|.
|A^-1| = 1/|A| and |AT| = |A|. The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, volume of box = |det(A)|.
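The n!-term "big formula" mentioned above sums sign(p) times a product of entries, one from each row, over all permutations p. A minimal sketch (the test matrix is a made-up example):

```python
from itertools import permutations

def det_big_formula(M):
    """Determinant via the n!-term big formula:
    sum over permutations p of sign(p) * M[0][p[0]] * ... * M[n-1][p[n-1]]."""
    n = len(M)
    total = 0
    for p in permutations(range(n)):
        # sign of p: +1 for an even number of inversions, -1 for odd
        sign = 1
        for i in range(n):
            for j in range(i + 1, n):
                if p[i] > p[j]:
                    sign = -sign
        term = sign
        for i in range(n):
            term *= M[i][p[i]]
        total += term
    return total

A = [[2, 1, 0], [1, 2, 1], [0, 1, 2]]
print(det_big_formula(A))  # 4
```

With n = 3 this loop has 3! = 6 terms; the cofactor formula would instead reduce to 2 x 2 determinants.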
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves AT A x̂ = AT b. Then e = b - A x̂ is orthogonal to all columns of A.
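A small worked sketch of the normal equations, fitting a line y = c + d·t through three made-up data points (the data and the 2x2 Cramer's-rule solve are illustrative, not part of the glossary):

```python
# Least squares: solve AT A xhat = AT b for a two-column A by Cramer's rule.
ts = [0.0, 1.0, 2.0]   # assumed sample points
bs = [1.0, 2.0, 4.0]   # assumed measurements
A = [[1.0, t] for t in ts]  # columns: all-ones and t

ATA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
ATb = [sum(A[k][i] * bs[k] for k in range(3)) for i in range(2)]

det = ATA[0][0] * ATA[1][1] - ATA[0][1] * ATA[1][0]
c = (ATb[0] * ATA[1][1] - ATA[0][1] * ATb[1]) / det
d = (ATA[0][0] * ATb[1] - ATb[0] * ATA[1][0]) / det

# The error e = b - A xhat is orthogonal to both columns of A.
e = [bs[k] - (c + d * ts[k]) for k in range(3)]
for i in range(2):
    assert abs(sum(A[k][i] * e[k] for k in range(3))) < 1e-9
print(c, d)  # intercept and slope of the best line
```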
Lucas numbers Ln = 2, 1, 3, 4, ...
They satisfy Ln = Ln-1 + Ln-2 = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L0 = 2 with F0 = 0.
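The recurrence and the closed form above can be checked against each other numerically:

```python
import math

# Lucas numbers by the recurrence Ln = Ln-1 + Ln-2, starting L0 = 2, L1 = 1.
L = [2, 1]
for n in range(2, 10):
    L.append(L[-1] + L[-2])

# Closed form: Ln = lam1**n + lam2**n with lam1, lam2 = (1 ± sqrt 5)/2.
lam1 = (1 + math.sqrt(5)) / 2
lam2 = (1 - math.sqrt(5)) / 2
for n, Ln in enumerate(L):
    assert abs(Ln - (lam1 ** n + lam2 ** n)) < 1e-9

print(L)  # [2, 1, 3, 4, 7, 11, 18, 29, 47, 76]
```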
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
Nullspace matrix N.
The columns of N are the n - r special solutions to As = 0.
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
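A sketch of this entry, using a made-up ordering of three rows:

```python
# Permutation matrix P from an ordering: P has the rows of I in that order,
# and multiplying P A puts the rows of A in the same order.
order = [2, 0, 1]  # assumed ordering of rows 0..2
n = len(order)
P = [[1 if j == order[i] else 0 for j in range(n)] for i in range(n)]

A = [[10, 11], [20, 21], [30, 31]]
PA = [A[order[i]] for i in range(n)]  # same result as the matrix product P A

# det P = +1 or -1 according to the parity of the ordering.
inversions = sum(1 for i in range(n) for j in range(i + 1, n)
                 if order[i] > order[j])
detP = -1 if inversions % 2 else 1
print(PA)    # [[30, 31], [10, 11], [20, 21]]
print(detP)  # 1: [2, 0, 1] is even (two row exchanges reach I)
```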
Projection p = a(aTb/aTa) onto the line through a.
P = aaT/aTa has rank 1.
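The projection formula above in a few lines, with made-up vectors a and b:

```python
# Project b onto the line through a: p = a * (aTb / aTa).
a = [1.0, 2.0, 2.0]  # assumed direction vector
b = [3.0, 0.0, 3.0]  # assumed vector to project

aTb = sum(ai * bi for ai, bi in zip(a, b))
aTa = sum(ai * ai for ai in a)
p = [ai * (aTb / aTa) for ai in a]

# The error b - p is orthogonal to a.
e = [bi - pi for bi, pi in zip(b, p)]
assert abs(sum(ai * ei for ai, ei in zip(a, e))) < 1e-9
print(p)  # [1.0, 2.0, 2.0]
```

Here aTb = aTa = 9, so the projection happens to equal a itself; the rank-1 matrix form P = aaT/aTa would give the same p for any b.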
Rank r (A)
= number of pivots = dimension of column space = dimension of row space.
Row space C (AT) = all combinations of rows of A.
Column vectors by convention.
Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi ∂xj = Hessian matrix) is indefinite.
Symmetric factorizations A = LDLT and A = QΛQT.
Signs in Λ = signs in D.
Symmetric matrix A.
The transpose is AT = A, and aij = aji. A^-1 is also symmetric.
Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
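The identity Tr AB = Tr BA holds even when AB ≠ BA; a quick check with made-up 2x2 matrices:

```python
# Verify Tr(AB) = Tr(BA) on small example matrices.
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    # sum of diagonal entries
    return sum(M[i][i] for i in range(len(M)))

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
print(trace(matmul(A, B)), trace(matmul(B, A)))  # 9 9
```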
Vector addition v + w.
v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.