 10-1.1: If Timothy Oaks earns a salary of $35,204 a year and is paid weekly...
 10-1.2: If Nita McMillan earns a salary of $31,107.96 a year and is paid bi...
 10-1.3: Gregory Maksi earns a salary of $52,980 annually and is paid monthl...
 10-1.4: Amelia Mattix is an accountant and is paid semimonthly. Her annual ...
 10-1.5: William Melton worked 47 hours in one week. His regular pay was $7....
 10-1.6: Bethany Colangelo, whose regular rate of pay is $8.25 per hour, wit...
 10-1.7: Carlos Espinosa earns $15.90 per hour with time and a half for over...
 10-1.8: Lacy Dodd earns a monthly salary of $2,988 and is nonexempt from FL...
 10-1.9: Rob Farinelli earns a monthly salary of $2,756 and is nonexempt fro...
 10-1.10: A belt manufacturer pays a worker $0.84 for each buckle she correct...
 10-1.11: Last week, Laurie Golson packaged 289 boxes of Holiday Cheese Assor...
 10-1.12: Joe Thweatt makes icons for a major distributor. He is paid $9.13 f...
 10-1.13: Mark Moses is a paper mill sales representative who receives 6% of ...
 10-1.14: Mary Lee Strode is paid a straight commission on sales as a real es...
 10-1.15: Dwayne Moody is paid on a salary-plus-commission basis. He receives...
 10-1.16: Vincent Ores sells equipment to receive satellite signals. He earns...
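The pay-period and overtime conversions these problems rely on can be sketched in Python. The figures below reuse the salaries from the first, third, and seventh problems above; the function names and the 45-hour week are my own illustration:

```python
# Gross-pay helpers: annual salary -> per-period pay, and hourly pay with
# time-and-a-half overtime past 40 hours. (Illustrative sketch; names are mine.)

PERIODS_PER_YEAR = {"weekly": 52, "biweekly": 26, "semimonthly": 24, "monthly": 12}

def salary_per_period(annual, schedule):
    """Divide an annual salary evenly across the pay periods in one year."""
    return round(annual / PERIODS_PER_YEAR[schedule], 2)

def hourly_gross(rate, hours, overtime_multiplier=1.5):
    """Regular pay for the first 40 hours, overtime pay beyond that."""
    regular = min(hours, 40) * rate
    overtime = max(hours - 40, 0) * rate * overtime_multiplier
    return round(regular + overtime, 2)

# $35,204 a year, paid weekly
print(salary_per_period(35204, "weekly"))    # 677.0
# $52,980 a year, paid monthly
print(salary_per_period(52980, "monthly"))   # 4415.0
# $15.90/hour with time and a half for overtime, for an assumed 45-hour week
print(hourly_gross(15.90, 45))               # 755.25
```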
Solutions for Chapter 10-1: GROSS PAY
Full solutions for Business Math, 9th Edition
ISBN: 9780135108178
This textbook survival guide covers the listed chapters and their solutions, and was created for the textbook Business Math, 9th edition (ISBN 9780135108178). Chapter 10-1: GROSS PAY includes 16 full step-by-step solutions; since all 16 problems in this chapter have been answered, more than 17,375 students have viewed full step-by-step solutions from it.

Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
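A quick numerical check of this definition (the 3-node example graph is my own):

```python
import numpy as np

# Undirected graph on 3 nodes with edges 0-1 and 1-2 (example of my own).
edges = [(0, 1), (1, 2)]
A = np.zeros((3, 3), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1  # undirected: each edge goes both ways, so A = A^T

print(A)
print(np.array_equal(A, A.T))   # symmetry confirms the graph is undirected
```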

Back substitution.
Upper triangular systems are solved in reverse order, from x_n back to x_1.
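The reverse-order solve can be written in a few lines (example system is my own):

```python
import numpy as np

def back_substitute(U, b):
    """Solve Ux = b for upper-triangular U, computing x_n first, then x_1 last."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # Subtract the already-known later unknowns, then divide by the pivot.
        x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

U = np.array([[2.0, 1.0, 1.0],
              [0.0, 3.0, 2.0],
              [0.0, 0.0, 4.0]])
b = np.array([5.0, 8.0, 8.0])
x = back_substitute(U, b)
print(x)
```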

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c_0 I + c_1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
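A sketch (values are my own) building C from powers of the cyclic shift S and checking that Cx matches the circular convolution c * x computed with the FFT:

```python
import numpy as np

n = 4
c = np.array([1.0, 2.0, 0.0, 3.0])       # first column of C (example values)
S = np.roll(np.eye(n), 1, axis=0)        # cyclic shift: S maps e_k to e_{k+1 mod n}
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

x = np.array([4.0, 1.0, 2.0, 5.0])
# Circular convolution via the FFT: diagonalization by the Fourier matrix.
conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
print(np.allclose(C @ x, conv))
```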

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x − x^T b over growing Krylov subspaces.
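A minimal sketch of the method, assuming the standard residual/search-direction recurrences (the 2x2 test system is my own):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
    n = len(b)
    max_iter = max_iter or n
    x = np.zeros(n)
    r = b - A @ x            # residual = negative gradient
    p = r.copy()             # first search direction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)     # exact line search along p
        x = x + alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p           # keep directions A-conjugate
        r = r_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)
```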

Determinant IAI = det(A).
Defined by det I = 1, sign reversal for a row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^T| = |A|.
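These properties are easy to verify numerically (random matrices and the singular example are my own):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# Product rule: det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B)))
# Transpose rule: det(A^T) = det(A)
print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))

S = np.array([[1.0, 2.0], [2.0, 4.0]])   # dependent rows -> singular
print(np.isclose(np.linalg.det(S), 0.0))
```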

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

Jordan form J = M^{-1} A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

Kronecker product (tensor product) A ® B.
Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
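A quick check of the eigenvalue rule with np.kron (the two 2x2 matrices are my own examples):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 3.0]])
B = np.array([[1.0, 1.0], [0.0, 2.0]])

K = np.kron(A, B)   # 4x4 matrix of blocks a_ij * B

# Eigenvalues of the Kronecker product are all products lambda_p(A) * lambda_q(B).
eig_K = np.sort(np.linalg.eigvals(K).real)
products = np.sort([la * lb
                    for la in np.linalg.eigvals(A).real
                    for lb in np.linalg.eigvals(B).real])
print(np.allclose(eig_K, products))
```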

Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^{j-1} b. Numerical methods approximate A^{-1} b by x_j with residual b − A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
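A sketch of building such a basis (here via QR on the raw power basis; Arnoldi would orthogonalize step by step, but either way only matrix-vector products with A are needed; the example matrix is my own):

```python
import numpy as np

def krylov_basis(A, b, j):
    """Orthonormal basis for K_j(A, b) = span{b, Ab, ..., A^(j-1) b}."""
    vectors = [b]
    for _ in range(j - 1):
        vectors.append(A @ vectors[-1])    # one multiplication by A per step
    Q, _ = np.linalg.qr(np.column_stack(vectors))
    return Q

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 0.0, 0.0])
Q = krylov_basis(A, b, 3)
print(Q.shape)   # columns span K_3(A, b)
```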

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
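The normal equations and the orthogonality of the error can be checked directly (the 3x2 system is my own example):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: A^T A x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
e = b - A @ x_hat

print(x_hat)
print(np.allclose(A.T @ e, 0.0))   # error is orthogonal to every column of A
```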

Left inverse A+.
If A has full column rank n, then A^+ = (A^T A)^{-1} A^T has A^+ A = I_n.
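A numerical check with a tall full-column-rank matrix (example of my own):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # full column rank n = 2

A_plus = np.linalg.inv(A.T @ A) @ A.T    # left inverse (A^T A)^(-1) A^T
print(np.allclose(A_plus @ A, np.eye(2)))          # A^+ A = I_n
print(np.allclose(A_plus, np.linalg.pinv(A)))      # agrees with the pseudoinverse
```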

Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
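The triangular example can be checked in a few lines (entries are my own):

```python
import numpy as np

# Strictly upper-triangular (zero diagonal), hence nilpotent.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

print(np.allclose(np.linalg.matrix_power(N, 3), 0.0))   # N^3 = 0
print(np.allclose(np.linalg.eigvals(N), 0.0))           # only eigenvalue is 0
```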

Norm ‖A‖.
The "ℓ^2 norm" of A is the maximum ratio ‖Ax‖/‖x‖ = σ_max. Then ‖Ax‖ ≤ ‖A‖ ‖x‖ and ‖AB‖ ≤ ‖A‖ ‖B‖ and ‖A + B‖ ≤ ‖A‖ + ‖B‖. The Frobenius norm is ‖A‖_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column and row sums of |a_ij|.
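All four norms are available in NumPy (the 2x2 example matrix is my own):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])

two_norm = np.linalg.norm(A, 2)        # sigma_max, the largest singular value
fro_norm = np.linalg.norm(A, 'fro')    # sqrt of the sum of squares of entries
one_norm = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
inf_norm = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|

print(two_norm, fro_norm, one_norm, inf_norm)
```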

Outer product uv^T.
Column times row = rank one matrix.
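A one-line check that the outer product has rank one (vectors are my own):

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

M = np.outer(u, v)   # column u times row v^T gives a 3x2 matrix
print(M)
print(np.linalg.matrix_rank(M))   # rank one
```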

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
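One standard route to Q and H goes through the SVD: from A = U S V^T, take Q = U V^T and H = V S V^T (the 2x2 example is my own):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                      # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt      # symmetric positive semidefinite factor

print(np.allclose(A, Q @ H))               # A = QH
print(np.allclose(Q.T @ Q, np.eye(2)))     # Q is orthogonal
print(np.linalg.eigvalsh(H).min() >= -1e-12)   # H is positive semidefinite
```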

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S then P = A (A^T A)^{-1} A^T.
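The stated properties of P can all be verified on a small example (A and b are my own):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])   # columns form a basis for the subspace S

P = A @ np.linalg.inv(A.T @ A) @ A.T   # projection onto S

b = np.array([6.0, 0.0, 0.0])
p = P @ b            # closest point to b in S
e = b - p            # error, perpendicular to S

print(np.allclose(P @ P, P))        # P^2 = P
print(np.allclose(P, P.T))          # P = P^T
print(np.allclose(A.T @ e, 0.0))    # e is orthogonal to S
```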

Semidefinite matrix A.
(Positive) semidefinite: all x^T A x ≥ 0, all λ ≥ 0; A = any R^T R.
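Any R^T R is semidefinite, as a quick check shows (R and the random x are my own choices):

```python
import numpy as np

R = np.array([[1.0, 2.0],
              [0.0, 0.0]])   # any R gives a semidefinite A = R^T R
A = R.T @ R

eigenvalues = np.linalg.eigvalsh(A)
print(eigenvalues)   # all >= 0 (here one of them is exactly 0)

rng = np.random.default_rng(1)
x = rng.standard_normal(2)
print(x @ A @ x >= -1e-12)   # x^T A x = ||Rx||^2 >= 0 for every x
```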

Skew-symmetric matrix K.
The transpose is −K, since K_ij = −K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix.
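A small check (K is my own example; for this K the exponential e^{Kt} has the closed form of a plane rotation by 2t, which I use to avoid computing a matrix exponential):

```python
import numpy as np

K = np.array([[0.0, 2.0],
              [-2.0, 0.0]])   # K^T = -K

print(np.allclose(K.T, -K))
print(np.allclose(np.linalg.eigvals(K).real, 0.0))   # eigenvalues are pure imaginary

# e^{Kt} for this K is the rotation by angle 2t, an orthogonal matrix.
t = 0.7
expKt = np.array([[np.cos(2*t), np.sin(2*t)],
                  [-np.sin(2*t), np.cos(2*t)]])
print(np.allclose(expKt.T @ expKt, np.eye(2)))
```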

Symmetric factorizations A = LDL^T and A = QΛQ^T.
Signs in Λ = signs in D.
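For a 2x2 symmetric example (my own), one elimination step gives D, and its signs match the eigenvalue signs in Λ:

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [4.0, 3.0]])   # symmetric example of my own

# LDL^T by one elimination step: d1 = a11, l = a21/a11, d2 = a22 - l^2 * a11.
d1 = A[0, 0]
l = A[1, 0] / d1
d2 = A[1, 1] - l * l * d1
L = np.array([[1.0, 0.0], [l, 1.0]])
D = np.array([d1, d2])

print(np.allclose(L @ np.diag(D) @ L.T, A))   # A = LDL^T

eigenvalues = np.linalg.eigvalsh(A)           # the entries of Lambda
# Same count of positive and of negative entries: signs in Lambda = signs in D.
print(np.sum(D > 0) == np.sum(eigenvalues > 0))
print(np.sum(D < 0) == np.sum(eigenvalues < 0))
```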