Discovering Algebra: An Investigative Approach, 2nd Edition - Solutions by Chapter

- Chapter 0: Out of Chaos
- Chapter 0.1: The Same yet Smaller
- Chapter 0.2: More and More
- Chapter 0.3: Shorter yet Longer
- Chapter 0.4: Going Somewhere?
- Chapter 0.5: Out of Chaos
- Chapter 1: Data Exploration
- Chapter 1.1: Bar Graphs and Dot Plots
- Chapter 1.2: Summarizing Data with Measures of Center
- Chapter 1.3: Five-Number Summaries and Box Plots
- Chapter 1.4: Histograms and Stem-and-Leaf Plots
- Chapter 1.6: Two-Variable Data
- Chapter 1.7: Estimating
- Chapter 1.8: Using Matrices to Organize and Combine Data
- Chapter 2: Proportional Reasoning and Variation
- Chapter 2.1: Proportions
- Chapter 2.3: Proportions and Measurement Systems
- Chapter 2.4: Direct Variation
- Chapter 2.5: Inverse Variation
- Chapter 2.7: Evaluating Expressions
- Chapter 2.8: Undoing Operations
- Chapter 3: Linear Explorations
- Chapter 3.1: Recursive Sequences
- Chapter 3.2: Linear Plots
- Chapter 3.3: Time-Distance Relationships
- Chapter 3.4: Linear Equations and the Intercept Form
- Chapter 3.5: Linear Equations and Rate of Change
- Chapter 3.6: Solving Equations Using the Balancing Method
- Chapter 4: Fitting a Line to Data
- Chapter 4.1: A Formula for Slope
- Chapter 4.2: Writing a Linear Equation to Fit Data
- Chapter 4.3: Point-Slope Form of a Linear Equation
- Chapter 4.4: Equivalent Algebraic Equations
- Chapter 4.5: Writing Point-Slope Equations to Fit Data
- Chapter 4.6: More on Modeling
- Chapter 4.7: Applications of Modeling
- Chapter 5: Systems of Equations and Inequalities
- Chapter 5.1: Solving Systems of Equations
- Chapter 5.2: Solving Systems of Equations Using Substitution
- Chapter 5.3: Solving Systems of Equations Using Elimination
- Chapter 5.4: Solving Systems of Equations Using Matrices
- Chapter 5.5: Inequalities in One Variable
- Chapter 5.6: Graphing Inequalities in Two Variables
- Chapter 5.7: Systems of Inequalities
- Chapter 6: Exponents and Exponential Models
- Chapter 6.1: Recursive Routines
- Chapter 6.2: Exponential Equations
- Chapter 6.3: Multiplication and Exponents
- Chapter 6.4: Scientific Notation for Large Numbers
- Chapter 6.5: Looking Back with Exponents
- Chapter 6.6: Zero and Negative Exponents
- Chapter 6.7: Fitting Exponential Models to Data
- Chapter 7: Functions
- Chapter 7.1: Secret Codes
- Chapter 7.2: Functions and Graphs
- Chapter 7.3: Graphs of Real-World Situations
- Chapter 7.4: Function Notation
- Chapter 7.5: Defining the Absolute-Value Function
- Chapter 7.6: Squares, Squaring, and Parabolas
- Chapter 8: Transformations
- Chapter 8.1: Translating Points
- Chapter 8.2: Translating Graphs
- Chapter 8.3: Reflecting Points and Graphs
- Chapter 8.4: Stretching and Shrinking Graphs
- Chapter 8.6: Introduction to Rational Functions
- Chapter 8.7: Transformations with Matrices
- Chapter 9: Quadratic Models
- Chapter 9.1: Solving Quadratic Equations
- Chapter 9.2: Finding the Roots and the Vertex
- Chapter 9.3: From Vertex to General Form
- Chapter 9.4: Factored Form
- Chapter 9.6: Completing the Square
- Chapter 9.7: The Quadratic Formula
- Chapter 9.8: Cubic Functions
- Chapter 10: Probability
- Chapter 10.1: Relative Frequency Graphs
- Chapter 10.2: Probability Outcomes and Trials
- Chapter 10.3: Random Outcomes
- Chapter 10.4: Counting Techniques
- Chapter 10.5: Multiple-Stage Experiments
- Chapter 10.6: Expected Value
- Chapter 11: Introduction to Geometry
- Chapter 11.1: Parallel and Perpendicular
- Chapter 11.2: Finding the Midpoint
- Chapter 11.3: Squares, Right Triangles, and Areas
- Chapter 11.4: The Pythagorean Theorem
- Chapter 11.5: Operations with Roots
- Chapter 11.6: A Distance Formula
- Chapter 11.7: Similar Triangles and Trigonometric Functions
- Chapter 11.8: Trigonometry
Affine transformation Tv = Av + v_0.
Linear transformation plus shift.
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
Companion matrix.
Put c_1, ..., c_n in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c_1 + c_2 λ + c_3 λ^2 + ... + c_n λ^(n−1) − λ^n).
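A quick numerical check of this property, as a minimal NumPy sketch; the coefficients c_1, c_2, c_3 are made up so that the roots come out to 1, 2, 3:

```python
import numpy as np

# Companion matrix for det(A - λI) = ±(c1 + c2·λ + c3·λ² - λ³).
# With (c1, c2, c3) = (6, -11, 6) the polynomial is -(λ-1)(λ-2)(λ-3).
c1, c2, c3 = 6.0, -11.0, 6.0
A = np.array([[0.0, 1.0, 0.0],     # ones just above the main diagonal
              [0.0, 0.0, 1.0],
              [c1,  c2,  c3 ]])    # c's in row n
print(np.sort(np.linalg.eigvals(A)))   # ≈ [1. 2. 3.]
```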
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
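A minimal NumPy sketch of the rule on a made-up 2×2 system; in practice np.linalg.solve is faster, since Cramer's Rule recomputes a determinant per component:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                                # B_j: column j of A replaced by b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

print(x)                      # [1. 3.]
print(np.linalg.solve(A, b))  # same solution
```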
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Fast Fourier Transform (FFT).
A factorization of the Fourier matrix F_n into ℓ = log_2 n matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^(−1) c can be computed with nℓ/2 multiplications. Revolutionary.
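The factorization itself is not reproduced here, but a short NumPy sketch can confirm that the FFT computes exactly the product F_n x (note that np.fft.fft uses the ω = e^(−2πi/n) sign convention for F_n):

```python
import numpy as np

n = 8
rows, cols = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
F = np.exp(-2j * np.pi * rows * cols / n)   # full n×n Fourier matrix
x = np.random.rand(n)

# O(n log n) FFT agrees with the O(n²) matrix-vector product.
print(np.allclose(F @ x, np.fft.fft(x)))    # True
```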
Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
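A tiny NumPy sketch of the pattern, with arbitrary sizes and values:

```python
import numpy as np

c = np.arange(5)                 # value for each antidiagonal, indexed by i + j
H = c[np.add.outer(np.arange(3), np.arange(3))]   # h_ij = c[i + j]
print(H)
# [[0 1 2]
#  [1 2 3]
#  [2 3 4]]
```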
Iterative method.
A sequence of steps intended to approach the desired solution.
Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
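A minimal NumPy sketch; A and b are made up, and the orthogonality of the error to the columns of A is checked directly:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # solve AᵀA x̂ = Aᵀb
e = b - A @ x_hat
print(A.T @ e)                              # ≈ [0. 0.]: e ⟂ every column of A
```

In practice np.linalg.lstsq is preferred, since it also handles rank-deficient A.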
Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector s, with Ms = s > 0.
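A small NumPy sketch with a made-up 2×2 Markov matrix; its steady state works out to s = (0.6, 0.4):

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])                 # m_ij > 0, columns sum to 1

vals, vecs = np.linalg.eig(M)
s = vecs[:, np.argmax(vals.real)].real     # eigenvector for λ = 1
s = s / s.sum()                            # scale so the entries sum to 1
print(s)                                   # [0.6 0.4]
print(np.linalg.matrix_power(M, 50))       # both columns ≈ s
```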
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
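A minimal NumPy sketch using the standard 2×2 example where the two multiplicities differ:

```python
import numpy as np

A = np.array([[5.0, 1.0],
              [0.0, 5.0]])
print(np.linalg.eigvals(A))                # [5. 5.]: λ = 5 has AM = 2
gm = 2 - np.linalg.matrix_rank(A - 5.0 * np.eye(2))
print(gm)                                  # GM = 1: only one eigenvector
```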
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A) · (b − A x̂) = 0.
Nullspace N(A)
= all solutions to Ax = 0. Dimension n − r = (# columns) − rank.
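A short sketch of the dimension count; it assumes SciPy is available for null_space, and the 2×3 matrix is an arbitrary rank-2 example:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])    # n = 3 columns, rank r = 2
N = null_space(A)                  # basis for N(A), one column per dimension
print(N.shape[1])                  # 1 = n - r
print(np.allclose(A @ N, 0))       # True: columns of N solve Ax = 0
```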
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n, then Q^T = Q^(−1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ_j (v^T q_j) q_j.
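A quick NumPy check of both identities, using the Q from a QR factorization of a random square matrix:

```python
import numpy as np

Q, _ = np.linalg.qr(np.random.rand(4, 4))     # orthonormal columns
print(np.allclose(Q.T @ Q, np.eye(4)))        # True

v = np.random.rand(4)
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(4))  # Σ (vᵀq_j) q_j
print(np.allclose(v, expansion))              # True: {q_j} is a basis
```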
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
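A minimal sketch, assuming SciPy's polar is available; the 3×3 matrix is random:

```python
import numpy as np
from scipy.linalg import polar

A = np.random.rand(3, 3)
Q, H = polar(A)                                # right polar decomposition A = QH
print(np.allclose(A, Q @ H))                   # True
print(np.allclose(Q.T @ Q, np.eye(3)))         # Q is orthogonal
print(np.all(np.linalg.eigvalsh(H) >= -1e-10)) # H is positive semidefinite
```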
Rank r(A)
= number of pivots = dimension of column space = dimension of row space.
Schur complement S = D − C A^(−1) B.
Appears in block elimination on the block matrix [A B; C D].
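A small NumPy sketch of that block elimination, with made-up blocks:

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0], [1.0]])
C = np.array([[1.0, 1.0]])
D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B               # Schur complement

M = np.block([[A, B], [C, D]])
E = np.block([[np.eye(2),             np.zeros((2, 1))],
              [-C @ np.linalg.inv(A), np.eye(1)]])
print(np.allclose((E @ M)[2:, 2:], S))         # True: S appears after elimination
```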
Trace of A
= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
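A one-line NumPy check that Tr(AB) = Tr(BA) holds even when AB and BA have different sizes:

```python
import numpy as np

A = np.random.rand(2, 3)
B = np.random.rand(3, 2)
# AB is 2×2 and BA is 3×3, yet the traces agree.
print(np.isclose(np.trace(A @ B), np.trace(B @ A)))   # True
```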
Unitary matrix U^H = U̅^T = U^(−1).
Orthonormal columns (complex analog of Q).