 5.6.1: Match each graph with an inequality. a. y 3 + 2x b. y 2 + 3x c. 2x ...
 5.6.2: Solve each inequality for y. a. 84x + 7y 70 b. 4.8x
 5.6.3: Sketch each inequality on a number line. a. x 5 b. x > 2.5 c. 3 x 3...
 5.6.4: Consider the inequality y < 2 0.5x. a. Graph the boundary line for ...
 5.6.5: Consider the inequality y 1 + 2x. a. Graph the boundary line for th...
 5.6.6: Sketch each inequality. a. y 3 + x b. y > 2 1.5x c. 2x y 4
 5.6.7: Write the inequality for each graph.
5.6.8: Mini-Investigation Consider the inequality 3x 2y 6. a. Solve the eq...
 5.6.9: Sketch each inequality on coordinate axes. a. y < 4 b. x 3 c. y 1 d...
 5.6.10: APPLICATION The total number of points from a combination of onepo...
 5.6.11: Graph the inequalities in Exercises 4 and 5 on your calculator. [ S...
 5.6.12: These data are federal minimum wages of the past 70 years.a. Graph ...
 5.6.13: Ellie was talking with her grandmother about a trip she took this s...
 5.6.14: Solve each equation for y. a. 7x 3y = 22 b. 5x + 4y = 12
Solutions for Chapter 5.6: Graphing Inequalities in Two Variables
Full solutions for Discovering Algebra: An Investigative Approach, 2nd Edition
ISBN: 9781559537636
Chapter 5.6: Graphing Inequalities in Two Variables includes 14 full step-by-step solutions.
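A quick programmatic check can back up a sketch like the one asked for in Exercise 5.6.4: test individual points against the inequality. This is a minimal sketch assuming the inequality is the strict y < 2 - 0.5x (the minus sign appears to have been lost in extraction), so boundary points are excluded.

```python
# Test points against the inequality y < 2 - 0.5x (Exercise 5.6.4).
# Points strictly below the dashed boundary line y = 2 - 0.5x satisfy it.
def satisfies(x, y):
    return y < 2 - 0.5 * x

assert satisfies(0, 0)          # (0, 0) is below the line
assert not satisfies(0, 2)      # (0, 2) is on the boundary (excluded by strict <)
assert not satisfies(0, 3)      # (0, 3) is above the line
```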

Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
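A NumPy check of this definition, using hypothetical sample data (200 samples of 3 variables, rows = samples):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))     # hypothetical data: 200 samples, 3 variables

xbar = X.mean(axis=0)             # means x̄_i
D = X - xbar                      # deviations x - x̄
Sigma = (D.T @ D) / len(X)        # Σ = mean of (x - x̄)(x - x̄)^T

# Σ is symmetric positive semidefinite: every eigenvalue is >= 0.
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-12)
```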

Dimension of vector space
dim(V) = number of vectors in any basis for V.

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
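The two conditions above are easy to verify mechanically; here is a small checker (the function name is my own, not from the text):

```python
import numpy as np

def is_echelon(U):
    """True if each pivot (first nonzero in a row) lies strictly to the
    right of the pivot in the previous row, and all zero rows come last."""
    last_pivot = -1
    seen_zero_row = False
    for row in np.asarray(U):
        nz = np.flatnonzero(row)
        if nz.size == 0:
            seen_zero_row = True        # from here on, only zero rows allowed
            continue
        if seen_zero_row or nz[0] <= last_pivot:
            return False
        last_pivot = nz[0]
    return True

U = np.array([[2., 1., 3.],
              [0., 0., 5.],
              [0., 0., 0.]])
assert is_echelon(U)
assert not is_echelon(U[::-1])          # zero row first -> not echelon
```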

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Fundamental Theorem.
The nullspace N(A) and the row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0), with dimensions r and n − r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
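A numerical illustration with a hypothetical rank-one matrix: nullspace vectors are perpendicular to every row, and the dimensions r and n − r add to n.

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])                  # rank r = 1, n = 3 columns

# Two special solutions spanning N(A) (free variables x2, x3):
nullspace = np.array([[-2., 1., 0.],
                      [-3., 0., 1.]]).T       # columns solve Ax = 0

# Perpendicular from Ax = 0: every row of A times every nullspace vector is 0.
assert np.allclose(A @ nullspace, 0)

# dim N(A) + dim row space = n  (here 2 + 1 = 3).
r = np.linalg.matrix_rank(A)
assert r + nullspace.shape[1] == A.shape[1]
```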

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^-1].
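A sketch of the method in NumPy, with partial pivoting added for numerical stability (the function name is mine; it assumes A is invertible):

```python
import numpy as np

def gauss_jordan_inverse(A):
    """Row-reduce the augmented matrix [A | I] to [I | A^-1]."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    M = np.hstack([A, np.eye(n)])              # the augmented matrix [A I]
    for j in range(n):
        p = j + np.argmax(np.abs(M[j:, j]))    # pick the largest pivot
        M[[j, p]] = M[[p, j]]                  # swap it into row j
        M[j] /= M[j, j]                        # scale the pivot to 1
        for i in range(n):
            if i != j:
                M[i] -= M[i, j] * M[j]         # clear column j above and below
    return M[:, n:]                            # right half is now A^-1

A = np.array([[2., 1.], [5., 3.]])
Ainv = gauss_jordan_inverse(A)
assert np.allclose(A @ Ainv, np.eye(2))
```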

Hypercube matrix P.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
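Building the incidence matrix for a small hypothetical directed graph (4 nodes, 5 edges given as (from, to) pairs):

```python
import numpy as np

edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]   # hypothetical graph
n = 4                                              # number of nodes

A = np.zeros((len(edges), n))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1      # edge leaves node i
    A[row, j] = +1      # edge enters node j

# Each row has one -1 and one +1, so rows sum to zero:
# the all-ones vector is always in the nullspace.
assert np.allclose(A @ np.ones(n), 0)
```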

Matrix multiplication AB.
The i, j entry of AB is (row i of A)·(column j of B) = Σ_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k of A)(row k of B). All these equivalent definitions come from the rule that AB times x equals A times Bx.
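The four equivalent views of AB, checked on a small example:

```python
import numpy as np

A = np.array([[1., 2.], [3., 4.]])
B = np.array([[5., 6.], [7., 8.]])

AB = A @ B                                           # entry (i,j) = Σ_k a_ik b_kj

# By columns: column j of AB = A times column j of B.
cols = np.column_stack([A @ B[:, j] for j in range(2)])

# By rows: row i of AB = (row i of A) times B.
rows = np.vstack([A[i, :] @ B for i in range(2)])

# Columns times rows: AB = sum over k of (column k of A)(row k of B).
outer = sum(np.outer(A[:, k], B[k, :]) for k in range(2))

assert np.allclose(cols, AB) and np.allclose(rows, AB) and np.allclose(outer, AB)
```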

Multiplier ℓ_ij.
The pivot row j is multiplied by ℓ_ij and subtracted from row i to eliminate the i, j entry: ℓ_ij = (entry to eliminate) / (jth pivot).
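One elimination step on a hypothetical 3 by 3 matrix, using the multiplier exactly as defined:

```python
import numpy as np

A = np.array([[2., 1., 1.],
              [4., 3., 3.],
              [8., 7., 9.]])

# Eliminate the (1, 0) entry: multiplier = (entry to eliminate) / (0th pivot).
l10 = A[1, 0] / A[0, 0]          # 4 / 2 = 2
A[1] -= l10 * A[0]               # subtract 2 * (pivot row 0) from row 1
assert A[1, 0] == 0              # the (1, 0) entry is now gone
```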

Network.
A directed graph that has constants c_1, ..., c_m associated with the edges.

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
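Checking the defining properties on one of the listed examples, a permutation matrix:

```python
import numpy as np

# A permutation matrix has orthonormal columns, so it is orthogonal.
Q = np.array([[0., 1., 0.],
              [0., 0., 1.],
              [1., 0., 0.]])

assert np.allclose(Q.T @ Q, np.eye(3))     # Q^T = Q^-1

x = np.array([3., 4., 12.])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))   # ‖Qx‖ = ‖x‖
```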

Orthogonal subspaces.
Every v in V is orthogonal to every w in W.

Pseudoinverse A^+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. rank(A^+) = rank(A).
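NumPy's `np.linalg.pinv` computes A^+; the projection and rank properties can be checked directly on a hypothetical rank-one matrix:

```python
import numpy as np

A = np.array([[1., 2.],
              [2., 4.],
              [3., 6.]])                   # rank 1, so A has no ordinary inverse

Aplus = np.linalg.pinv(A)                  # the n by m pseudoinverse

# A+ A and A A+ are the projections onto row space and column space:
# projections are idempotent (P @ P = P).
P_row = Aplus @ A
P_col = A @ Aplus
assert np.allclose(P_row @ P_row, P_row)
assert np.allclose(P_col @ P_col, P_col)
assert np.linalg.matrix_rank(Aplus) == np.linalg.matrix_rank(A)
```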

Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.

Rotation matrix R.
R = [c −s; s c] rotates the plane by θ, and R^-1 = R^T rotates back by −θ. Eigenvalues are e^{iθ} and e^{−iθ}, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
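Checking these claims numerically for one angle (θ = 0.7 is an arbitrary choice):

```python
import numpy as np

theta = 0.7
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])

# R^-1 = R^T: rotating back undoes the rotation.
assert np.allclose(R.T @ R, np.eye(2))

# Eigenvalues are e^{i*theta} = c + i*s and e^{-i*theta} = c - i*s.
eigvals = np.linalg.eigvals(R)
assert np.allclose(eigvals.real, c)
assert np.allclose(sorted(eigvals.imag), sorted([-s, s]))
```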

Schur complement S = D − C A^-1 B.
Appears in block elimination on [A B; C D].
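A numerical check on hypothetical 2 by 2 blocks, using the standard identity that block elimination gives det [A B; C D] = det(A) det(S):

```python
import numpy as np

A = np.array([[4., 1.], [1., 3.]])    # must be invertible
B = np.array([[2., 0.], [0., 2.]])
C = np.array([[1., 1.], [0., 1.]])
D = np.array([[5., 2.], [2., 5.]])

S = D - C @ np.linalg.inv(A) @ B      # Schur complement of A

# After block elimination, the determinant factors through S:
M = np.block([[A, B], [C, D]])
assert np.isclose(np.linalg.det(M), np.linalg.det(A) * np.linalg.det(S))
```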

Special solutions to As = O.
One free variable is s_i = 1, other free variables = 0.
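For a hypothetical matrix already in reduced row echelon form (pivots in columns 0 and 2, free variables x_1 and x_3), the two special solutions are:

```python
import numpy as np

A = np.array([[1., 2., 0., 3.],
              [0., 0., 1., 4.]])      # RREF: pivot columns 0 and 2

# One special solution per free variable: that variable = 1, the other = 0;
# pivot variables are read off from the equations Ax = 0.
s1 = np.array([-2., 1., 0., 0.])      # x_1 = 1, x_3 = 0
s2 = np.array([-3., 0., -4., 1.])     # x_3 = 1, x_1 = 0

assert np.allclose(A @ s1, 0) and np.allclose(A @ s2, 0)
```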

Stiffness matrix
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax = stretching.
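A minimal sketch assuming a hypothetical line of two springs, from a fixed wall to node 1 and from node 1 to node 2 (the spring constants 10 and 5 are arbitrary):

```python
import numpy as np

# A maps node movements x to spring stretches; C holds the spring constants.
A = np.array([[ 1., 0.],     # stretch of spring 1 = x1 - 0 (wall is fixed)
              [-1., 1.]])    # stretch of spring 2 = x2 - x1
C = np.diag([10., 5.])       # Hooke's-law constants

K = A.T @ C @ A              # the stiffness matrix

assert np.allclose(K, K.T)                    # K is symmetric
assert np.all(np.linalg.eigvalsh(K) > 0)      # and positive definite here
```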

Symmetric matrix A.
The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.