 3.6.1: Give the equation that each picture models.
 3.6.2: Copy and fill in the table to solve the equation as in Example A.
 3.6.3: Give the next stages of the equation, matching the action taken, to...
 3.6.4: Complete the tables to solve the equations.
 3.6.5: Give the additive inverse of each number. a. b. 17 c. 23 d. x
 3.6.6: A multiplicative inverse is a number or expression that you can mul...
 3.6.7: Solve these equations. Tell what action you take at each stage. a. ...
3.6.8: Mini-Investigation A solution to the equation 10 + 3x = 5 is shown ...
 3.6.9: Solve the equation 4 + 1.2x = 12.4 by using each method. a. balanci...
 3.6.10: Solve each equation symbolically using the balancing method. a. 3 +...
 3.6.11: You can solve familiar formulas for a specific variable. For exampl...
 3.6.12: An equation can have the variable on both sides. In these cases you...
 3.6.13: APPLICATION Economy drapes for a certain size window cost $90. They...
 3.6.14: Run the easy level of the LINES program on your calculator. [ See C...
3.6.15: The local bagel store sells a baker's dozen of bagels for $6.49, whi...
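The balancing method used throughout this chapter undoes each operation on both sides of the equation. As a sketch (not part of the textbook's solutions), here is problem 3.6.9's equation, 4 + 1.2x = 12.4, solved that way:

```python
# Balancing method for 4 + 1.2x = 12.4: apply the same undo step to both sides.
rhs = 12.4
rhs = rhs - 4          # subtract 4 from both sides: 1.2x = 8.4
x = rhs / 1.2          # divide both sides by 1.2: x = 7
assert abs((4 + 1.2 * x) - 12.4) < 1e-9   # substitute back to check
```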
Solutions for Chapter 3.6: Solving Equations Using the Balancing Method
Full solutions for Discovering Algebra: An Investigative Approach, 2nd Edition
ISBN: 9781559537636
This textbook survival guide was created for the textbook Discovering Algebra: An Investigative Approach, edition 2, which was written by Patricia and is associated with ISBN 9781559537636. Chapter 3.6: Solving Equations Using the Balancing Method includes 15 full step-by-step solutions, and more than 2880 students have viewed full step-by-step solutions from this chapter.

Cholesky factorization
A = C^T C = (L√D)(L√D)^T for positive definite A.
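A quick numerical check of this factorization (a NumPy sketch, not part of the glossary; note that `numpy.linalg.cholesky` returns the lower-triangular factor L with A = L L^T, so the upper-triangular C here is L^T):

```python
import numpy as np

A = np.array([[4.0, 2.0],
              [2.0, 3.0]])          # positive definite: both eigenvalues > 0
L = np.linalg.cholesky(A)           # lower triangular, A = L @ L.T
C = L.T                             # upper triangular factor in A = C^T C
assert np.allclose(C.T @ C, A)
```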

Column space C (A) =
space of all combinations of the columns of A.

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
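For example (a NumPy sketch; taking B to be a polynomial in the same symmetric matrix M forces AB = BA, and M's eigenvectors then diagonalize both):

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
A = M
B = M @ M + 3 * np.eye(2)           # a polynomial in M, so it commutes with M
assert np.allclose(A @ B, B @ A)
# the eigenvectors of M diagonalize both A and B
w, V = np.linalg.eigh(M)
for X in (A, B):
    D = V.T @ X @ V
    assert np.allclose(D, np.diag(np.diag(D)))
```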

Cyclic shift S.
Permutation with S_21 = 1, S_32 = 1, ..., finally S_1n = 1. Its eigenvalues are the nth roots e^(2πik/n) of 1; eigenvectors are the columns of the Fourier matrix F.
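A numerical check of the eigenvalue claim (a NumPy sketch, using 0-based indices rather than the glossary's 1-based subscripts):

```python
import numpy as np

n = 4
S = np.roll(np.eye(n), 1, axis=0)   # S[1,0] = S[2,1] = S[3,2] = 1, S[0,3] = 1
eig = np.linalg.eigvals(S)
roots = np.exp(2j * np.pi * np.arange(n) / n)     # the nth roots of 1
for r in roots:                     # every root of unity appears as an eigenvalue
    assert np.min(np.abs(eig - r)) < 1e-9
assert np.allclose(np.linalg.matrix_power(S, n), np.eye(n))   # S^n = I
```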

Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)^T (column j of B).
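These identities are easy to verify numerically (a NumPy sketch; `np.vdot` conjugates its first argument, matching the complex dot product x̄^T y):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])
assert x @ y == 1*4 + 2*(-1) + 3*2              # x^T y
u = np.array([1 + 1j, 2j])
v = np.array([3 + 0j, 1 - 1j])
assert np.isclose(np.vdot(u, v), np.conj(u) @ v)  # complex dot product x̄^T y
A = np.arange(6.0).reshape(2, 3)
B = np.arange(6.0).reshape(3, 2)
assert np.isclose((A @ B)[0, 1], A[0] @ B[:, 1])  # (AB)_ij = (row i)·(column j)
```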

Factorization
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
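Elimination with stored multipliers can be sketched directly (assuming, as the definition does, that no row exchanges are needed):

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])
U = A.copy()
n = len(A)
L = np.eye(n)
for j in range(n):
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]   # multiplier l_ij
        U[i] -= L[i, j] * U[j]        # subtract l_ij times row j
assert np.allclose(L @ U, A)          # L brings U back to A
assert np.allclose(U, np.triu(U))     # U is upper triangular
```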

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Free variable x_i.
Column i has no pivot in elimination. We can give the n − r free variables any values, then Ax = b determines the r pivot variables (if solvable!).

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
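A minimal sketch of the method (assuming every pivot is nonzero, so no row exchanges are needed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
n = len(A)
M = np.hstack([A, np.eye(n)])        # the augmented block [A | I]
for j in range(n):
    M[j] /= M[j, j]                  # scale the pivot row
    for i in range(n):
        if i != j:
            M[i] -= M[i, j] * M[j]   # clear column j in every other row
Ainv = M[:, n:]                      # the block now reads [I | A^-1]
assert np.allclose(A @ Ainv, np.eye(n))
```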

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
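Building one from an edge list (a NumPy sketch for a hypothetical 3-node directed graph):

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2)]     # hypothetical directed edges (i -> j)
m, n = len(edges), 3
A = np.zeros((m, n))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1                   # edge leaves node i
    A[row, j] = +1                   # edge enters node j
assert np.allclose(A.sum(axis=1), 0)  # each row has exactly one -1 and one +1
```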

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).
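For example (a NumPy sketch):

```python
import numpy as np

K = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # symmetric but indefinite
w = np.linalg.eigvalsh(K)            # eigenvalues in ascending order
assert w[0] < 0 < w[1]               # one negative, one positive (-1 and +1)
```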

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
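A small check of the normal equations (a NumPy sketch fitting the best line C + Dt through three points):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])           # columns: ones and t-values 0, 1, 2
b = np.array([6.0, 0.0, 0.0])        # heights at those t-values
x = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x = A^T b
assert np.allclose(x, [5.0, -3.0])   # best line is 5 - 3t
e = b - A @ x
assert np.allclose(A.T @ e, 0)       # residual orthogonal to every column of A
```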

Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
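Orthonormal columns and the expansion formula can be checked with a QR factorization (a NumPy sketch for the square case m = n):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
Q, R = np.linalg.qr(A)               # Q has orthonormal columns
assert np.allclose(Q.T @ Q, np.eye(2))
v = np.array([3.0, -2.0])
# for square Q, v = sum_j (v^T q_j) q_j
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
assert np.allclose(expansion, v)
```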

Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
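The polar factors can be built from the SVD (a NumPy sketch: A = U S V^T gives Q = U V^T and H = V S V^T):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                           # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt           # symmetric positive (semi)definite factor
assert np.allclose(Q @ H, A)
assert np.allclose(Q.T @ Q, np.eye(2))
assert np.all(np.linalg.eigvalsh(H) > 0)
```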

Rank r(A).
Number of pivots = dimension of column space = dimension of row space.
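For example (a NumPy sketch; `matrix_rank` counts the independent columns, which equals the pivot count):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],       # 2 times row 1: contributes no new pivot
              [1.0, 0.0, 1.0]])
r = np.linalg.matrix_rank(A)
assert r == 2
assert np.linalg.matrix_rank(A.T) == r   # row space has the same dimension
```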

Right inverse A^+.
If A has full row rank m, then A^+ = A^T (A A^T)^-1 has A A^+ = I_m.
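A direct check of the formula (a NumPy sketch with full row rank m = 2):

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])      # full row rank: A A^T is invertible
Aplus = A.T @ np.linalg.inv(A @ A.T)  # right inverse A^+ = A^T (A A^T)^-1
assert np.allclose(A @ Aplus, np.eye(2))
```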

Similar matrices A and B.
Every B = M^-1 A M has the same eigenvalues as A.
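For example (a NumPy sketch; A is triangular, so its eigenvalues 2 and 3 are visible on the diagonal):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])
B = np.linalg.inv(M) @ A @ M          # similar to A
assert np.allclose(np.sort(np.linalg.eigvals(B).real), [2.0, 3.0])
```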

Singular Value Decomposition
(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
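Checking Av_i = σ_i u_i numerically (a NumPy sketch; `np.linalg.svd` returns V^T, so its rows are the v_i):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)
for i in range(len(s)):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])   # A v_i = sigma_i u_i
assert np.allclose(U @ np.diag(s) @ Vt, A)          # reassemble A
```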

Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
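A numerical check (a NumPy sketch; for this 2 by 2 K, the matrix exponential e^(Kt) works out to a rotation by angle 2t, which is orthogonal):

```python
import numpy as np

K = np.array([[ 0.0, 2.0],
              [-2.0, 0.0]])
assert np.allclose(K.T, -K)          # skew-symmetric
w = np.linalg.eigvals(K)
assert np.allclose(w.real, 0)        # eigenvalues are pure imaginary (+-2i)
t = 0.7
R = np.array([[ np.cos(2*t), np.sin(2*t)],   # e^(Kt) for this K
              [-np.sin(2*t), np.cos(2*t)]])
assert np.allclose(R.T @ R, np.eye(2))       # orthogonal
```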