51.1: Solve for the unknown in each equation. 5A = 20
51.2: Solve for the unknown in each equation. B/7 = 4
51.3: Solve for the unknown in each equation. 7C = 56
51.4: Solve for the unknown in each equation. 4M = 48
51.5: Solve for the unknown in each equation. R/12 = 3
51.6: Solve for the unknown in each equation. P/5 = 8
51.7: Solve for the unknown in each equation. B + 7 = 12
51.8: Solve for the unknown in each equation. A - 9 = 15
51.9: Solve for the unknown in each equation. R + 7 = 28
51.10: Solve for the unknown in each equation. A - 16 = 3
51.11: Solve for the unknown in each equation. X - 48 = 36
51.12: Solve for the unknown in each equation. C + 5 = 21
51.13: Solve for the unknown in each equation. 4A + 3 = 27
51.14: Solve for the unknown in each equation. B/3 + 2 = 7
51.15: Solve for the unknown in each equation. 3B - 1 = 11
51.16: Solve for the unknown in each equation. K/4 - 5 = 3
51.17: Solve for the unknown in each equation. K/2 + 3 = 5
51.18: Solve for the unknown in each equation. 7B - 1 = 6
51.19: Solve for the unknown in each equation. C + 5 = 21
51.20: Solve for the unknown in each equation. 8A - 1 = 19
51.21: Solve for the unknown in each equation. 2A + 5A = 35
51.22: Solve for the unknown in each equation. B + 2B = 27
51.23: Solve for the unknown in each equation. 5K - 3K = 40
51.24: Solve for the unknown in each equation. 8K - 2K = 42
51.25: Solve for the unknown in each equation. 3J + J = 28
51.26: Solve for the unknown in each equation. 2J - J = 21
51.27: Solve for the unknown in each equation. 3B + 2B - 6 = 9
51.28: Solve for the unknown in each equation. 8C - C + 6 = 48
51.29: Solve for the unknown in each equation. 2(X - 3) = 6
51.30: Solve for the unknown in each equation. 4(A + 3) = 16
51.31: Solve for the unknown in each equation. 3(B - 1) = 21
51.32: Solve for the unknown in each equation. 6(B + 2) = 30
 51.33: Solve each proportion for N.
 51.34: Solve each proportion for N.
 51.35: Solve each proportion for N.
 51.36: Solve each proportion for N.
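Every one-variable problem above reduces to the same two moves: undo the addition or subtraction, then undo the multiplication or division. A minimal sketch of that procedure (the helper name solve_linear is my own, not the textbook's):

```python
from fractions import Fraction

def solve_linear(a, b, c):
    """Solve a*x + b = c: subtract b from both sides, then divide by a."""
    return Fraction(c - b, a)

# 51.1: 5A = 20       -> A = 4
assert solve_linear(5, 0, 20) == 4
# 51.13: 4A + 3 = 27  -> A = 6
assert solve_linear(4, 3, 27) == 6
# 51.23: 5K - 3K = 40 combines like terms to 2K = 40 -> K = 20
assert solve_linear(2, 0, 40) == 20
```

Using Fraction keeps answers exact even when the unknown is not a whole number.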
Solutions for Chapter 51: EQUATIONS
Full solutions for Business Math, 9th Edition
ISBN: 9780135108178
Chapter 51: EQUATIONS includes 36 full step-by-step solutions.

Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
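A small sketch of this definition, using an assumed 3-node undirected graph of my own with edges 0-1 and 1-2:

```python
import numpy as np

# Build the adjacency matrix: each undirected edge fills two entries.
A = np.zeros((3, 3), dtype=int)
for i, j in [(0, 1), (1, 2)]:
    A[i, j] = 1
    A[j, i] = 1

# Undirected graph: A equals its transpose.
assert (A == A.T).all()
```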

Affine transformation
T(v) = Av + v0 = linear transformation plus shift.
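A sketch with an example A and v0 of my own; the shift v0 is what makes T affine rather than linear (a linear map must send 0 to 0):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
v0 = np.array([1.0, -1.0])

def T(v):
    return A @ v + v0       # linear part A v, then the shift v0

# T(0) = v0, not 0, so T is affine but not linear when v0 != 0.
assert np.allclose(T(np.zeros(2)), v0)
```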

Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1v1 + ... + cdvd. V has many bases; each basis gives unique c's.
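The unique coefficients c come from solving a linear system with the basis vectors as columns. A sketch with an assumed basis of R^2 of my own:

```python
import numpy as np

# Two independent columns form a basis of R^2.
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Solve B c = v for the unique coordinates c1, c2 of v in this basis.
c = np.linalg.solve(B, v)

assert np.allclose(B @ c, v)
```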

Circulant matrix C.
Constant diagonals wrap around as in cyclic shift S. Every circulant is c0I + c1S + ... + c(n-1)S^(n-1). Cx = convolution c * x. Eigenvectors in F.
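A sketch building a circulant from powers of the cyclic shift S (the coefficient values c are my own example); applying C to a unit impulse returns c, the discrete analogue of convolution with an impulse:

```python
import numpy as np

n = 4
c = np.array([1.0, 2.0, 0.0, 3.0])

# Cyclic shift matrix: rolls each basis vector e_k to e_(k+1 mod n).
S = np.roll(np.eye(n), 1, axis=0)

# C = c0*I + c1*S + ... + c3*S^3: constant diagonals that wrap around.
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([1.0, 0.0, 0.0, 0.0])   # unit impulse
assert np.allclose(C @ x, c)          # circular convolution with an impulse gives c
```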

Cross product u xv in R3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u x v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
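A sketch with example vectors of my own in the xy-plane, so both the perpendicularity and the area are easy to see:

```python
import numpy as np

u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 2.0, 0.0])
w = np.cross(u, v)

# Perpendicular to both u and v (here: points along the z-axis).
assert np.isclose(np.dot(w, u), 0.0) and np.isclose(np.dot(w, v), 0.0)

# Length equals the parallelogram area: 1 * 2 * sin(90 deg) = 2.
assert np.isclose(np.linalg.norm(w), 2.0)
```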

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1)AS = Λ = eigenvalue matrix.
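A sketch checking S^(-1)AS = Λ numerically; the matrix A is my own example, chosen with two distinct eigenvalues so diagonalizability is automatic:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [0.0, 2.0]])          # distinct eigenvalues 4 and 2

lam, S = np.linalg.eig(A)           # eigenvectors fill the columns of S
Lam = np.linalg.inv(S) @ A @ S      # similarity transform

assert np.allclose(Lam, np.diag(lam))
```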

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j-1)b. Numerical methods approximate A^(-1)b by xj with residual b - Axj in this subspace. A good basis for Kj requires only multiplication by A at each step.
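A sketch building K2(A, b) = span{b, Ab} one matrix-vector product at a time (A and b are my own small example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])

# Each new Krylov vector costs only one multiplication by A.
vecs = [b]
vecs.append(A @ vecs[-1])           # Ab
K = np.column_stack(vecs)           # columns span K_2(A, b)

assert np.linalg.matrix_rank(K) == 2   # b and Ab are independent here
```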

Left inverse A+.
If A has full column rank n, then A+ = (AT A)^(-1) AT has A+ A = In.
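A sketch verifying the left-inverse property on a tall matrix of my own with full column rank:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # 3x2, rank 2 (full column rank)

# A+ = (A^T A)^-1 A^T: defined because A^T A is invertible.
A_plus = np.linalg.inv(A.T @ A) @ A.T

assert np.allclose(A_plus @ A, np.eye(2))   # left inverse: A+ A = I_2
```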

Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
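A sketch checking the linearity requirement for the matrix-multiplication example (A, v, w, c, d are my own test values):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
T = lambda v: A @ v                  # matrix multiplication is linear

v = np.array([1.0, 0.0])
w = np.array([0.0, 1.0])
c, d = 2.0, -3.0

# Linearity: T(c v + d w) = c T(v) + d T(w).
assert np.allclose(T(c * v + d * w), c * T(v) + d * T(w))
```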

Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives sum ci vi = 0.
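A sketch of dependence with example vectors of my own: v3 = v1 + v2, so the nonzero combination v1 + v2 - v3 = 0 exists and the rank falls below the vector count:

```python
import numpy as np

v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = v1 + v2                         # dependent on v1 and v2

M = np.column_stack([v1, v2, v3])
assert np.linalg.matrix_rank(M) < 3          # dependence: rank below count
assert np.allclose(v1 + v2 - v3, 0.0)        # the nonzero combination
```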

Nullspace matrix N.
The columns of N are the n - r special solutions to As = 0.

Nullspace N (A)
= All solutions to Ax = 0. Dimension n - r = (# columns) - rank.
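A sketch with a 1x2 example matrix of my own: one pivot column, one free column, so the nullspace has dimension n - r = 2 - 1 = 1 and one special solution:

```python
import numpy as np

A = np.array([[1.0, 2.0]])           # rank 1, two columns

# Special solution: set the free variable x2 = 1, back-solve x1 = -2.
s = np.array([-2.0, 1.0])

assert np.allclose(A @ s, 0.0)       # s lies in the nullspace N(A)
```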

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| <= 1. See condition number.
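A 2x2 sketch of my own showing the row swap: without pivoting the multiplier would be 1/1e-4 = 10000; after swapping in the largest pivot it drops below 1:

```python
import numpy as np

A = np.array([[1e-4, 1.0],
              [1.0,  1.0]])

# Choose the largest available pivot in column 0, then swap it into place.
p = int(np.argmax(np.abs(A[:, 0])))
A[[0, p]] = A[[p, 0]]

m = A[1, 0] / A[0, 0]                # elimination multiplier after pivoting
assert abs(m) <= 1.0
```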

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: xT Ax > 0 unless x = 0. Then A = LDLT with diag(D) > 0.
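A sketch with an example matrix of my own, checking positive eigenvalues and the closely related Cholesky factorization (which exists exactly when A is symmetric positive definite):

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])         # symmetric, eigenvalues 1 and 3

assert (np.linalg.eigvalsh(A) > 0).all()   # all eigenvalues positive

L = np.linalg.cholesky(A)            # fails (raises) if A is not positive definite
assert np.allclose(L @ L.T, A)
```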

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x >= 0 are satisfied). Minimum cost at a corner!
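A toy sketch of the "minimum at a corner" fact only: it enumerates the corners of an assumed feasible triangle of my own rather than walking edges the way the simplex method actually does:

```python
# Feasible set: x1 >= 0, x2 >= 0, x1 + x2 <= 4 (a triangle with three corners).
corners = [(0, 0), (4, 0), (0, 4)]

# Minimize the linear cost 3*x1 + 1*x2 over the corners.
cost = lambda x: 3 * x[0] + 1 * x[1]
best = min(corners, key=cost)

assert best == (0, 0)                # lowest cost occurs at a corner
```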

Special solutions to As = O.
One free variable is si = 1, other free variables = 0.

Spectral Theorem A = QΛQT.
Real symmetric A has real λ's and orthonormal q's.
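A sketch verifying both claims on a symmetric example matrix of my own: real eigenvalues, orthonormal eigenvectors, and the factorization A = QΛQT:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # real symmetric

lam, Q = np.linalg.eigh(A)           # eigh: real eigenvalues, orthonormal Q

assert np.allclose(Q @ np.diag(lam) @ Q.T, A)   # A = Q Lambda Q^T
assert np.allclose(Q.T @ Q, np.eye(2))          # orthonormal columns
```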

Subspace S of V.
Any vector space inside V, including V and Z = {zero vector only}.

Vector v in Rn.
Sequence of n real numbers v = (v1, ..., vn) = point in Rn.