- 8.1: Maximize f = 2x + 3y subject to 2x + 4y ≤ 16, 3x + 2y ≤ 12, x ≥ 0, y ≥ 0
- 8.2: Maximize f = 6x + 4y subject to x + 2y ≤ 16, 3x + 2y ≤ 24, x ≥ 0, y ≥ 0
- 8.3: Minimize f = 4x + y subject to 3x + 2y ≤ 21, x + 5y ≤ 20, x ≥ 0, y ≥ 0
- 8.4: A farmer has to decide how many acres of a 40-acre plot are to be d...
- 8.5: A company is buying lockers. It has narrowed the choice down to two...
- 8.6: Use the simplex method to maximize f = 2x + y + z under the constra...
- 8.7: A furniture company finishes two kinds of tables, X and Y. There ar...
- 8.8: Minimize f = x - 2y + 4z subject to x - y + 3z ≤ 4, 2x + 2y - 3z ...
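Small two-variable LPs like 8.1 can be checked by enumerating the vertices of the feasible region, since the optimum of a linear objective is always attained at a vertex. A minimal numpy sketch (a brute-force check, not the simplex method the chapter develops):

```python
import itertools
import numpy as np

# Problem 8.1: maximize f = 2x + 3y subject to
#   2x + 4y <= 16,  3x + 2y <= 12,  x >= 0,  y >= 0.
# Write every constraint as a row of A @ [x, y] <= b.
A = np.array([[2.0, 4.0], [3.0, 2.0], [-1.0, 0.0], [0.0, -1.0]])
b = np.array([16.0, 12.0, 0.0, 0.0])
c = np.array([2.0, 3.0])  # objective coefficients

best_f, best_xy = -np.inf, None
# A vertex of the feasible region lies on two constraint boundaries.
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue  # parallel boundaries, no intersection point
    v = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ v <= b + 1e-9):  # keep only feasible vertices
        f = c @ v
        if f > best_f:
            best_f, best_xy = f, v

print(best_xy, best_f)  # optimum at the vertex (2, 3), f = 13
```

This scales badly (it tries every pair of boundaries), which is exactly why the simplex method of Problem 8.6 walks only from vertex to neighboring vertex.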
Solutions for Chapter 8: Linear Programming
Full solutions for Linear Algebra with Applications | 8th Edition
Affine transformation.
Tv = Av + v0 = linear transformation plus shift.
Companion matrix.
Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n-1) - λ^n).
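The companion-matrix construction above is easy to verify numerically. A sketch with hypothetical coefficients chosen so the characteristic polynomial is (λ - 2)(λ - 3):

```python
import numpy as np

# Companion matrix for n = 2: put c1, c2 in the last row and a
# single 1 just above the main diagonal.
c1, c2 = -6.0, 5.0          # chosen so λ^2 - 5λ + 6 = (λ - 2)(λ - 3)
A = np.array([[0.0, 1.0],
              [c1,  c2]])

# det(A - λI) = ±(c1 + c2·λ - λ^2), so the eigenvalues are the roots.
eigs = np.sort(np.linalg.eigvals(A).real)
print(eigs)  # → [2. 3.]
```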
Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in nullspace).
Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T Ax - x^T b over growing Krylov subspaces.
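The conjugate gradient iteration described above can be sketched in a few lines of numpy (a bare-bones version with no preconditioning):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual = negative gradient
    p = r.copy()           # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p  # next direction, A-conjugate to p
        rs = rs_new
    return x

# Small positive definite test system.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(x)  # agrees with np.linalg.solve(A, b)
```

In exact arithmetic the iterate after k steps minimizes the objective over the k-th Krylov subspace, so at most n steps are needed.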
Diagonal matrix D.
d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.
Dot product = Inner product x^T y = x1y1 + ... + xnyn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A)·(column j of B).
Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
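The [A I] → [I A^-1] recipe transcribes directly into code. A sketch with numpy row operations (partial pivoting added for numerical safety):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the block matrix [A I] until the left block is I;
    the right block is then A^-1."""
    n = len(A)
    M = np.hstack([A.astype(float), np.eye(n)])   # [A I]
    for col in range(n):
        # Swap up the largest available pivot to avoid dividing by ~0.
        pivot = col + np.argmax(np.abs(M[col:, col]))
        M[[col, pivot]] = M[[pivot, col]]
        M[col] /= M[col, col]                     # scale pivot row to 1
        for row in range(n):
            if row != col:
                M[row] -= M[row, col] * M[col]    # clear the rest of the column
    return M[:, n:]                               # right block = A^-1

A = np.array([[2.0, 1.0], [5.0, 3.0]])
Ainv = gauss_jordan_inverse(A)
print(Ainv)  # A @ Ainv gives the identity
```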
Hankel matrix H.
Constant along each antidiagonal; hij depends on i + j.
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = ∫₀¹ x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
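The ill-conditioning is easy to see numerically. A small sketch building hilb(n) from the entry formula and watching the condition number explode:

```python
import numpy as np

def hilb(n):
    """Hilbert matrix: H[i, j] = 1 / (i + j - 1) with 1-based i, j."""
    i, j = np.indices((n, n)) + 1
    return 1.0 / (i + j - 1)

H = hilb(8)
# Symmetric positive definite: every eigenvalue is positive...
print(np.all(np.linalg.eigvalsh(H) > 0))
# ...yet the condition number grows explosively with n.
print(np.linalg.cond(hilb(4)))   # already above 1e4
print(np.linalg.cond(hilb(8)))   # above 1e9: severely ill-conditioned
```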
Jordan form J = M^-1 AM.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b - Ax̂ is orthogonal to all columns of A.
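The normal-equations characterization above can be checked directly. A sketch fitting a line through four (hypothetical) data points:

```python
import numpy as np

# Overdetermined system: fit a line b ≈ C + D·t through four points.
t = np.array([0.0, 1.0, 2.0, 3.0])
b = np.array([1.0, 2.0, 2.0, 4.0])
A = np.column_stack([np.ones_like(t), t])   # columns: [1, t]

# Normal equations A^T A x̂ = A^T b give the least squares solution.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

e = b - A @ x_hat
print(x_hat)
print(A.T @ e)   # ≈ [0, 0]: the error is orthogonal to every column of A
```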
Left nullspace N (AT).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Network.
A directed graph that has constants c1, ..., cm associated with the edges.
Normal matrix N.
If N N̄^T = N̄^T N, then N has orthonormal (complex) eigenvectors.
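A quick numerical check of this claim, using a hypothetical example (a real skew-symmetric matrix, which is normal but has complex eigenvectors):

```python
import numpy as np

# Skew-symmetric, hence normal: N commutes with its conjugate transpose.
N = np.array([[0.0, 1.0], [-1.0, 0.0]])
print(np.allclose(N @ N.conj().T, N.conj().T @ N))  # normality holds

# Its eigenvalues are ±i; numpy returns unit eigenvectors, and for a
# normal matrix with distinct eigenvalues they are orthogonal.
eigvals, V = np.linalg.eig(N)
print(np.allclose(V.conj().T @ V, np.eye(2)))  # V is unitary
```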
Particular solution x_p.
Any solution to Ax = b; often x_p has free variables = 0.
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
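A one-line numerical illustration with hypothetical vectors u and v: every column of uv^T is a multiple of u, so the rank is 1.

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])   # column vector u
v = np.array([[4.0], [5.0]])          # column vector v
A = u @ v.T                           # 3x2 outer product u v^T

# Column space = line through u, row space = line through v, rank = 1.
print(np.linalg.matrix_rank(A))
```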
Similar matrices A and B.
Every B = M^-1 AM has the same eigenvalues as A.
Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of nullspaces.
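The relation Av_i = σ_i u_i can be verified with numpy's SVD on a small example (note numpy returns V^T, not V, so row i of Vt is v_i):

```python
import numpy as np

A = np.array([[3.0, 0.0], [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)   # A = U Σ V^T, σ in descending order

# Check A v_i = σ_i u_i for each singular pair.
for i, s in enumerate(sigma):
    print(np.allclose(A @ Vt[i], s * U[:, i]))

# U Σ V^T rebuilds A exactly.
print(np.allclose(U @ np.diag(sigma) @ Vt, A))
```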
Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
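The stretch-and-shift rule translates into one line of code once a mother wavelet is fixed. A sketch assuming the Haar wavelet as w_00 (a standard choice; the text's w_00 could be any mother wavelet):

```python
import numpy as np

def w00(t):
    """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), else 0."""
    t = np.asarray(t, dtype=float)
    return np.where((t >= 0) & (t < 0.5), 1.0,
                    np.where((t >= 0.5) & (t < 1.0), -1.0, 0.0))

def w(j, k, t):
    """w_jk(t) = w00(2^j t - k): compress by 2^j, then shift by k."""
    return w00(2.0 ** j * t - k)

# w_11 is supported on [1/2, 1): half the width of w00, shifted right.
print(w(1, 1, 0.6), w(1, 1, 0.9), w(1, 1, 0.4))
```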