Chapter 0: Out of Chaos
Chapter 0.1: The Same yet Smaller
Chapter 0.2: More and More
Chapter 0.3: Shorter yet Longer
Chapter 0.4: Going Somewhere?
Chapter 0.5: Out of Chaos
Chapter 1: Data Exploration
Chapter 1.1: Bar Graphs and Dot Plots
Chapter 1.2: Summarizing Data with Measures of Center
Chapter 1.3: Five-Number Summaries and Box Plots
Chapter 1.4: Histograms and Stem-and-Leaf Plots
Chapter 1.6: Two-Variable Data
Chapter 1.7: Estimating
Chapter 1.8: Using Matrices to Organize and Combine Data
Chapter 2: Proportional Reasoning and Variation
Chapter 2.1: Proportions
Chapter 2.3: Proportions and Measurement Systems
Chapter 2.4: Direct Variation
Chapter 2.5: Inverse Variation
Chapter 2.7: Evaluating Expressions
Chapter 2.8: Undoing Operations
Chapter 3: Linear Explorations
Chapter 3.1: Recursive Sequences
Chapter 3.2: Linear Plots
Chapter 3.3: Time-Distance Relationships
Chapter 3.4: Linear Equations and the Intercept Form
Chapter 3.5: Linear Equations and Rate of Change
Chapter 3.6: Solving Equations Using the Balancing Method
Chapter 4: Fitting a Line to Data
Chapter 4.1: A Formula for Slope
Chapter 4.2: Writing a Linear Equation to Fit Data
Chapter 4.3: Point-Slope Form of a Linear Equation
Chapter 4.4: Equivalent Algebraic Equations
Chapter 4.5: Writing Point-Slope Equations to Fit Data
Chapter 4.6: More on Modeling
Chapter 4.7: Applications of Modeling
Chapter 5: Systems of Equations and Inequalities
Chapter 5.1: Solving Systems of Equations
Chapter 5.2: Solving Systems of Equations Using Substitution
Chapter 5.3: Solving Systems of Equations Using Elimination
Chapter 5.4: Solving Systems of Equations Using Matrices
Chapter 5.5: Inequalities in One Variable
Chapter 5.6: Graphing Inequalities in Two Variables
Chapter 5.7: Systems of Inequalities
Chapter 6: Exponents and Exponential Models
Chapter 6.1: Recursive Routines
Chapter 6.2: Exponential Equations
Chapter 6.3: Multiplication and Exponents
Chapter 6.4: Scientific Notation for Large Numbers
Chapter 6.5: Looking Back with Exponents
Chapter 6.6: Zero and Negative Exponents
Chapter 6.7: Fitting Exponential Models to Data
Chapter 7: Functions
Chapter 7.1: Secret Codes
Chapter 7.2: Functions and Graphs
Chapter 7.3: Graphs of Real-World Situations
Chapter 7.4: Function Notation
Chapter 7.5: Defining the Absolute-Value Function
Chapter 7.6: Squares, Squaring, and Parabolas
Chapter 8: Transformations
Chapter 8.1: Translating Points
Chapter 8.2: Translating Graphs
Chapter 8.3: Reflecting Points and Graphs
Chapter 8.4: Stretching and Shrinking Graphs
Chapter 8.6: Introduction to Rational Functions
Chapter 8.7: Transformations with Matrices
Chapter 9: Quadratic Models
Chapter 9.1: Solving Quadratic Equations
Chapter 9.2: Finding the Roots and the Vertex
Chapter 9.3: From Vertex to General Form
Chapter 9.4: Factored Form
Chapter 9.6: Completing the Square
Chapter 9.7: The Quadratic Formula
Chapter 9.8: Cubic Functions
Chapter 10: Probability
Chapter 10.1: Relative Frequency Graphs
Chapter 10.2: Probability Outcomes and Trials
Chapter 10.3: Random Outcomes
Chapter 10.4: Counting Techniques
Chapter 10.5: Multiple-Stage Experiments
Chapter 10.6: Expected Value
Chapter 11: Introduction to Geometry
Chapter 11.1: Parallel and Perpendicular
Chapter 11.2: Finding the Midpoint
Chapter 11.3: Squares, Right Triangles, and Areas
Chapter 11.4: The Pythagorean Theorem
Chapter 11.5: Operations with Roots
Chapter 11.6: A Distance Formula
Chapter 11.7: Similar Triangles and Trigonometric Functions
Chapter 11.8: Trigonometry
Discovering Algebra: An Investigative Approach, 2nd Edition - Solutions by Chapter
ISBN: 9781559537636
This textbook survival guide covers all 90 chapters of Discovering Algebra: An Investigative Approach, 2nd edition (ISBN 9781559537636). Problems from all 90 chapters have been answered, and more than 12,652 students have viewed the full step-by-step solutions, which were prepared by our top Math solution expert on 03/13/18, 07:06PM.

Block matrix.
A matrix can be partitioned into blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
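A short NumPy sketch of this definition (the 4x4 matrices and the 2x2 cuts are made-up examples): multiplying block-by-block agrees with ordinary multiplication when the cuts of A's columns match the cuts of B's rows.

```python
import numpy as np

A = np.arange(16, dtype=float).reshape(4, 4)
B = np.arange(16, dtype=float).reshape(4, 4)[::-1]

# Partition each matrix into four 2x2 blocks.
A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

# Block multiplication: each block of AB is a sum of block products.
AB_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
assert np.allclose(AB_blocks, A @ B)   # same answer as ordinary A @ B
```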

Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
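A NumPy sketch (the 2x2 matrix is a made-up example): the roots of the characteristic polynomial coincide with the eigenvalues NumPy computes directly.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - trace(A)*lambda + det(A).
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.roots(coeffs)
eigs = np.linalg.eigvals(A)

assert np.allclose(np.sort(roots), np.sort(eigs))   # both give {2, 5}
```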

Column picture of Ax = b.
The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).
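A NumPy sketch of the column picture (matrix and coefficients are made-up): b built from A's columns, plus a rank test for whether a given b lies in C(A).

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])
b = A @ x

# b is literally x1*(column 1) + x2*(column 2):
assert np.allclose(x[0] * A[:, 0] + x[1] * A[:, 1], b)

# Solvability test: b is in C(A) iff appending b does not raise the rank.
assert np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)
```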

Column space C(A).
The space of all combinations of the columns of A.

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
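A NumPy sketch (random made-up data, not from the text): the sample covariance matrix, computed as the average of (x − x̄)(x − x̄)^T, is symmetric positive semidefinite.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))      # 500 observations of 3 variables

xbar = X.mean(axis=0)
D = X - xbar                       # centered data
Sigma = (D.T @ D) / len(X)         # average of (x - xbar)(x - xbar)^T

assert np.allclose(Sigma, Sigma.T)                  # symmetric
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-12)  # positive semidefinite
```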

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
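A minimal NumPy implementation of the rule (the 2x2 system is a made-up example), checked against direct substitution:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                   # replace column j of A with b
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

assert np.allclose(A @ x, b)       # Cramer's Rule really solves Ax = b
```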

Cross product u xv in R3:
Vector perpendicular to u and v, with length ‖u‖ ‖v‖ |sin θ| = area of the parallelogram; u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
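A NumPy sketch with made-up vectors: the cross product is perpendicular to both inputs and its length equals the parallelogram area ‖u‖ ‖v‖ |sin θ|.

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])
w = np.cross(u, v)

# Perpendicular to both u and v:
assert np.isclose(w @ u, 0.0) and np.isclose(w @ v, 0.0)

# Length equals ||u|| ||v|| |sin theta|, the parallelogram's area:
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
area = np.linalg.norm(u) * np.linalg.norm(v) * np.sqrt(1 - cos_theta**2)
assert np.isclose(np.linalg.norm(w), area)
```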

Cyclic shift S.
Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are the columns of the Fourier matrix F.
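A NumPy sketch (n = 5 is an arbitrary choice): build the cyclic shift, confirm every eigenvalue is an nth root of 1, and check that a Fourier-matrix column is an eigenvector.

```python
import numpy as np

n = 5
S = np.roll(np.eye(n), 1, axis=0)      # S[1,0] = S[2,1] = ... = S[0,n-1] = 1

eigs = np.linalg.eigvals(S)
assert np.allclose(eigs**n, 1.0)       # every eigenvalue is an nth root of 1
assert np.allclose(np.linalg.matrix_power(S, n), np.eye(n))   # S^n = I

omega = np.exp(2j * np.pi / n)
f = omega ** np.arange(n)              # a column of the Fourier matrix F
assert np.allclose(S @ f, f / omega)   # eigenvector with eigenvalue omega^(-1)
```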

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
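A NumPy sketch (the 2x3 matrix is a made-up example): with full row rank r = m, Ax = b has at least one solution for every b in R^m.

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])
m = A.shape[0]
assert np.linalg.matrix_rank(A) == m         # full row rank: r = m = 2

b = np.array([5.0, -2.0])
x = np.linalg.lstsq(A, b, rcond=None)[0]     # one particular solution
assert np.allclose(A @ x, b)                 # it satisfies Ax = b exactly
```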

GaussJordan method.
Invert A by row operations on [A I] to reach [I A^(-1)].
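A minimal NumPy implementation of the method (the 2x2 matrix is a made-up example, and the elimination below assumes nonzero pivots, so no row exchanges are needed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
n = len(A)
M = np.hstack([A, np.eye(n)])        # the augmented matrix [A  I]

for col in range(n):
    M[col] /= M[col, col]            # scale the pivot row (pivot assumed nonzero)
    for row in range(n):
        if row != col:
            M[row] -= M[row, col] * M[col]   # eliminate above and below the pivot

A_inv = M[:, n:]                     # the right half is now A^(-1)
assert np.allclose(A @ A_inv, np.eye(n))
```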

Hessenberg matrix H.
Triangular matrix with one extra nonzero adjacent diagonal.

Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on the first superdiagonal. Each block has one eigenvalue λ_k and one eigenvector.

Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.
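A NumPy sketch (made-up 3x2 fitting problem): solving the normal equation matches NumPy's least-squares routine, and the residual is perpendicular to every column of A.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

xhat = np.linalg.solve(A.T @ A, A.T @ b)     # the normal equation
assert np.allclose(xhat, np.linalg.lstsq(A, b, rcond=None)[0])
assert np.allclose(A.T @ (b - A @ xhat), 0.0)   # residual perpendicular to C(A)
```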

Pascal matrix P_S.
P_S = pascal(n) = the symmetric matrix with binomial entries C(i + j − 2, i − 1). P_S = P_L P_U; all three contain Pascal's triangle and have det = 1 (see Pascal in the index).
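A NumPy sketch of these facts (the construction mimics MATLAB's pascal(n) using Python's math.comb, with 0-based indices; n = 5 is arbitrary). P_U is taken as P_L^T, so P_S = P_L P_U follows from Vandermonde's identity.

```python
import numpy as np
from math import comb

n = 5
# Symmetric Pascal matrix: entry (i, j) is C(i + j, i) with 0-based indices.
PS = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)], float)
# Lower-triangular Pascal matrix: entry (i, j) is C(i, j).
PL = np.array([[comb(i, j) for j in range(n)] for i in range(n)], float)

assert np.allclose(PS, PL @ PL.T)            # P_S = P_L P_U with P_U = P_L^T
assert np.isclose(np.linalg.det(PS), 1.0)    # det = 1
assert np.isclose(np.linalg.det(PL), 1.0)
```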

Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = +1 or −1) based on the number of row exchanges needed to reach I.
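A NumPy sketch for n = 3 (a made-up example): build all 3! = 6 permutation matrices from the rows of I, and check that PA reorders A's rows and det P = ±1.

```python
import numpy as np
from itertools import permutations

n = 3
I = np.eye(n)
A = np.arange(9.0).reshape(3, 3)

for order in permutations(range(n)):
    P = I[list(order)]                              # rows of I in this order
    assert np.allclose(P @ A, A[list(order)])       # PA reorders A's rows
    assert np.isclose(abs(np.linalg.det(P)), 1.0)   # det P = +1 or -1
```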

Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Projection matrix P onto subspace S.
The projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, its eigenvalues are 1 or 0, and its eigenvectors lie in S or S⊥. If the columns of A form a basis for S, then P = A (A^T A)^(-1) A^T.
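A NumPy sketch (the basis matrix A and the vector b are made-up): verify P^2 = P = P^T, the 0/1 eigenvalues, and that the error is perpendicular to the subspace.

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                 # basis for a plane S in R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T       # projection onto S

assert np.allclose(P @ P, P)               # P^2 = P
assert np.allclose(P, P.T)                 # P = P^T
assert np.allclose(np.linalg.eigvalsh(P), [0.0, 1.0, 1.0])   # eigenvalues 0, 1

b = np.array([0.0, 0.0, 6.0])
p = P @ b                                  # closest point to b in S
e = b - p
assert np.allclose(A.T @ e, 0.0)           # error perpendicular to S
```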

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0 and all λ ≥ 0; A = any R^T R.
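A NumPy sketch with a random made-up R: any A = R^T R is symmetric positive semidefinite (here R has only 2 rows, so A is singular but still semidefinite).

```python
import numpy as np

rng = np.random.default_rng(1)
R = rng.normal(size=(2, 4))        # fewer rows than columns -> A is singular
A = R.T @ R                        # A = R^T R

assert np.allclose(A, A.T)                          # symmetric
assert np.all(np.linalg.eigvalsh(A) >= -1e-10)      # all eigenvalues >= 0

x = rng.normal(size=4)
assert x @ A @ x >= -1e-10          # x^T A x = ||Rx||^2 >= 0 for any x
```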

Spanning set.
Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!