 Chapter 0: Out of Chaos
 Chapter 0.1: The Same yet Smaller
 Chapter 0.2: More and More
 Chapter 0.3: Shorter yet Longer
 Chapter 0.4: Going Somewhere?
 Chapter 0.5: Out of Chaos
 Chapter 1: Data Exploration
 Chapter 1.1: Bar Graphs and Dot Plots
 Chapter 1.2: Summarizing Data with Measures of Center
Chapter 1.3: Five-Number Summaries and Box Plots
Chapter 1.4: Histograms and Stem-and-Leaf Plots
Chapter 1.6: Two-Variable Data
 Chapter 1.7: Estimating
 Chapter 1.8: Using Matrices to Organize and Combine Data
Chapter 2: Proportional Reasoning and Variation
Chapter 2.1: Proportions
Chapter 2.3: Proportions and Measurement Systems
Chapter 2.4: Direct Variation
Chapter 2.5: Inverse Variation
Chapter 2.7: Evaluating Expressions
Chapter 2.8: Undoing Operations
Chapter 3: Linear Explorations
Chapter 3.1: Recursive Sequences
Chapter 3.2: Linear Plots
Chapter 3.3: Time-Distance Relationships
Chapter 3.4: Linear Equations and the Intercept Form
Chapter 3.5: Linear Equations and Rate of Change
Chapter 3.6: Solving Equations Using the Balancing Method
Chapter 4: Fitting a Line to Data
Chapter 4.1: A Formula for Slope
Chapter 4.2: Writing a Linear Equation to Fit Data
Chapter 4.3: Point-Slope Form of a Linear Equation
Chapter 4.4: Equivalent Algebraic Equations
Chapter 4.5: Writing Point-Slope Equations to Fit Data
Chapter 4.6: More on Modeling
Chapter 4.7: Applications of Modeling
Chapter 5: Systems of Equations and Inequalities
Chapter 5.1: Solving Systems of Equations
Chapter 5.2: Solving Systems of Equations Using Substitution
Chapter 5.3: Solving Systems of Equations Using Elimination
Chapter 5.4: Solving Systems of Equations Using Matrices
Chapter 5.5: Inequalities in One Variable
Chapter 5.6: Graphing Inequalities in Two Variables
Chapter 5.7: Systems of Inequalities
Chapter 6: Exponents and Exponential Models
Chapter 6.1: Recursive Routines
Chapter 6.2: Exponential Equations
Chapter 6.3: Multiplication and Exponents
Chapter 6.4: Scientific Notation for Large Numbers
Chapter 6.5: Looking Back with Exponents
Chapter 6.6: Zero and Negative Exponents
Chapter 6.7: Fitting Exponential Models to Data
Chapter 7: Functions
Chapter 7.1: Secret Codes
Chapter 7.2: Functions and Graphs
Chapter 7.3: Graphs of Real-World Situations
Chapter 7.4: Function Notation
Chapter 7.5: Defining the Absolute-Value Function
Chapter 7.6: Squares, Squaring, and Parabolas
Chapter 8: Transformations
Chapter 8.1: Translating Points
Chapter 8.2: Translating Graphs
Chapter 8.3: Reflecting Points and Graphs
Chapter 8.4: Stretching and Shrinking Graphs
Chapter 8.6: Introduction to Rational Functions
Chapter 8.7: Transformations with Matrices
Chapter 9: Quadratic Models
Chapter 9.1: Solving Quadratic Equations
Chapter 9.2: Finding the Roots and the Vertex
Chapter 9.3: From Vertex to General Form
Chapter 9.4: Factored Form
Chapter 9.6: Completing the Square
Chapter 9.7: The Quadratic Formula
Chapter 9.8: Cubic Functions
Chapter 10: Probability
Chapter 10.1: Relative Frequency Graphs
Chapter 10.2: Probability Outcomes and Trials
Chapter 10.3: Random Outcomes
Chapter 10.4: Counting Techniques
Chapter 10.5: Multiple-Stage Experiments
Chapter 10.6: Expected Value
Chapter 11: Introduction to Geometry
Chapter 11.1: Parallel and Perpendicular
Chapter 11.2: Finding the Midpoint
Chapter 11.3: Squares, Right Triangles, and Areas
Chapter 11.4: The Pythagorean Theorem
Chapter 11.5: Operations with Roots
Chapter 11.6: A Distance Formula
Chapter 11.7: Similar Triangles and Trigonometric Functions
Chapter 11.8: Trigonometry
Discovering Algebra: An Investigative Approach, 2nd Edition - Solutions by Chapter
Full solutions for Discovering Algebra: An Investigative Approach, 2nd Edition
ISBN: 9781559537636

This expansive textbook survival guide covers all 90 chapters of Discovering Algebra: An Investigative Approach, 2nd edition (ISBN: 9781559537636). Since problems from all 90 chapters have been answered, more than 7,241 students have viewed full step-by-step answers. The step-by-step solutions were answered by our top Math solution expert on 03/13/18, 07:06 PM.

Affine transformation
Tv = Av + v0 = linear transformation plus shift.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

Cofactor C_ij.
Remove row i and column j; multiply the remaining determinant by (-1)^(i+j).
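As a quick sketch in numpy (the 3x3 matrix here is an arbitrary example), cofactor expansion along a row recovers the determinant:

```python
import numpy as np

# Cofactor C_ij: delete row i and column j, take the determinant of the
# remaining matrix, and attach the sign (-1)^(i+j).
def cofactor(A, i, j):
    minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return (-1) ** (i + j) * np.linalg.det(minor)

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 4.0, 5.0],
              [1.0, 0.0, 6.0]])

# Cofactor expansion along row 0 recovers det(A):
expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(3))
print(np.isclose(expansion, np.linalg.det(A)))  # True
```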

Companion matrix.
Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2 λ + c3 λ^2 + ... + cn λ^(n-1) - λ^n).

Diagonalization
Λ = S^(-1) A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All powers A^k = S Λ^k S^(-1).
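A minimal numpy sketch (the 2x2 matrix is an arbitrary example with distinct real eigenvalues): diagonalizing once makes matrix powers cheap, since only the diagonal Λ has to be raised to the k-th power.

```python
import numpy as np

# Diagonalization: Λ = S^(-1) A S, so A = S Λ S^(-1) and A^k = S Λ^k S^(-1).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)        # columns of S are eigenvectors
Lam = np.diag(eigvals)               # Λ = diagonal eigenvalue matrix

k = 5
A_k = S @ np.linalg.matrix_power(Lam, k) @ np.linalg.inv(S)
print(np.allclose(A_k, np.linalg.matrix_power(A, k)))  # True
```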

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Dot product = Inner product x^T y = x1 y1 + ... + xn yn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy F_n = F_(n-1) + F_(n-2) = (λ1^n - λ2^n)/(λ1 - λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
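A short numpy check of both claims: powers of the Fibonacci matrix generate the sequence, and its largest eigenvalue is the golden ratio growth rate.

```python
import numpy as np

# The n-th power of the Fibonacci matrix [[1, 1], [1, 0]] is
# [[F_(n+1), F_n], [F_n, F_(n-1)]].
F = np.array([[1, 1],
              [1, 0]])
print(np.linalg.matrix_power(F, 10)[0, 1])   # F_10 = 55

# Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of F.
lam1 = (1 + 5 ** 0.5) / 2
print(np.isclose(np.linalg.eigvals(F.astype(float)).max(), lam1))  # True
```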

Free variable x_i.
Column i has no pivot in elimination. We can give the n - r free variables any values; then Ax = b determines the r pivot variables (if solvable!).

Kronecker product (tensor product) A ⊗ B.
Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
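A numpy sketch with two arbitrary 2x2 example matrices: `np.kron` builds the block matrix, and its eigenvalues are exactly the pairwise products of the factors' eigenvalues.

```python
import numpy as np

# Kronecker product A ⊗ B: block (i, j) equals a_ij * B;
# eigenvalues are all products λ_p(A) λ_q(B).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])           # eigenvalues 2, 3
B = np.array([[1.0, 1.0],
              [0.0, 5.0]])           # eigenvalues 1, 5
K = np.kron(A, B)                    # 4x4 block matrix

products = sorted(lp * lq for lp in np.linalg.eigvals(A)
                          for lq in np.linalg.eigvals(B))
print(np.allclose(sorted(np.linalg.eigvals(K)), products))  # True
```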

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
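A numpy sketch with a small made-up overdetermined system (3 equations, 2 unknowns): the least-squares solution satisfies the normal equations, and the residual is orthogonal to every column of A.

```python
import numpy as np

# Least squares: x̂ minimizes ‖b - Ax‖^2; the residual e = b - A x̂
# satisfies A^T e = 0, i.e. e is orthogonal to the columns of A.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
e = b - A @ x_hat
print(np.allclose(A.T @ e, 0.0))              # True: e ⊥ columns of A
print(np.allclose(A.T @ A @ x_hat, A.T @ b))  # normal equations hold
```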

Linearly dependent v1, ..., vn.
A combination other than all c_i = 0 gives Σ c_i v_i = 0.

Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).

Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

Pseudoinverse A⁺ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A⁺) = N(A^T). A⁺A and AA⁺ are the projection matrices onto the row space and column space. Rank(A⁺) = rank(A).
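A numpy sketch using a rank-1 example matrix: `np.linalg.pinv` computes A⁺, and the two products A⁺A and AA⁺ behave as projections (idempotent), with rank preserved.

```python
import numpy as np

# Pseudoinverse A⁺: A⁺A projects onto the row space, AA⁺ onto the column space.
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [0.0, 0.0]])           # rank 1
A_plus = np.linalg.pinv(A)

P_row = A_plus @ A                   # projection onto row space of A
P_col = A @ A_plus                   # projection onto column space of A
print(np.allclose(P_row @ P_row, P_row))  # projections are idempotent
print(np.allclose(P_col @ P_col, P_col))
print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # True
```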

Rotation matrix
R = [[c, -s], [s, c]] rotates the plane by θ, and R^(-1) = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ); eigenvectors are (1, ±i). Here c, s = cos θ, sin θ.
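A numpy sketch with θ = π/3 chosen arbitrarily: the transpose undoes the rotation, and the eigenvalues sit on the unit circle.

```python
import numpy as np

# Rotation by θ: R = [[cos θ, -sin θ], [sin θ, cos θ]]; R^(-1) = R^T.
theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])

v = np.array([1.0, 0.0])
print(np.allclose(R.T @ (R @ v), v))                   # R^T rotates back
print(np.allclose(np.linalg.inv(R), R.T))              # R^(-1) = R^T
print(np.allclose(np.abs(np.linalg.eigvals(R)), 1.0))  # |e^(±iθ)| = 1
```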

Schur complement S = D - C A^(-1) B.
Appears in block elimination on [[A, B], [C, D]].

Singular Value Decomposition (SVD)
A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
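A numpy sketch with an arbitrary invertible 2x2 example: `np.linalg.svd` returns U, the singular values, and V^T, and the pieces satisfy the defining relations above.

```python
import numpy as np

# SVD: A = U Σ V^T with orthogonal U, V and singular values σ_i ≥ 0.
A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)

print(np.allclose(U @ np.diag(sigma) @ Vt, A))  # A = U Σ V^T
print(np.allclose(U.T @ U, np.eye(2)))          # U is orthogonal
# Each A v_i = σ_i u_i (U * sigma scales column i of U by σ_i):
print(np.allclose(A @ Vt.T, U * sigma))         # True
```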

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax = stretching.

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.