Solutions for Chapter 5.6: Graphing Inequalities in Two Variables

Full solutions for Discovering Algebra: An Investigative Approach | 2nd Edition

ISBN: 9781559537636 | Authors: Jerald Murdock, Ellen Kamischke, Eric Kamischke

This expansive textbook survival guide covers the following chapters and their solutions. It was created for the textbook Discovering Algebra: An Investigative Approach, 2nd edition (ISBN: 9781559537636). Chapter 5.6: Graphing Inequalities in Two Variables includes 14 full step-by-step solutions, and more than 9130 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Covariance matrix Σ.

    When random variables Xi have mean = average value = 0, their covariances Σij are the averages of XiXj. With means x̄i, the matrix Σ = mean of (x - x̄)(x - x̄)T is positive (semi)definite; Σ is diagonal if the Xi are independent.
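
    A minimal NumPy sketch of this definition (the data and names are illustrative, not from the textbook):

        import numpy as np

        # rows are samples, columns are the random variables X1, X2, X3 (made-up data)
        X = np.random.default_rng(0).normal(size=(1000, 3))

        # subtract the means, then average the outer products (x - xbar)(x - xbar)^T
        Xc = X - X.mean(axis=0)
        Sigma = Xc.T @ Xc / len(X)

        # Sigma is symmetric positive semidefinite and matches np.cov with the 1/n convention
        print(np.allclose(Sigma, Sigma.T))
        print(np.all(np.linalg.eigvalsh(Sigma) >= -1e-12))
        print(np.allclose(Sigma, np.cov(X.T, bias=True)))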

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Echelon matrix U.

    The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
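
    An illustrative check in NumPy (the matrix is made up): the pivot of each nonzero row appears in a strictly later column, and zero rows come last.

        import numpy as np

        # a matrix already in echelon form: pivots move to the right, zero rows last
        U = np.array([[2., 1., 4., 3.],
                      [0., 0., 5., 1.],
                      [0., 0., 0., 7.],
                      [0., 0., 0., 0.]])

        # column index of the first nonzero entry (the pivot) in each nonzero row;
        # column 1 has no pivot, so it is a free column (see the entry below)
        pivot_cols = [int(np.flatnonzero(row)[0]) for row in U if row.any()]
        print(pivot_cols)                                     # [0, 2, 3] -- strictly increasing
        print(len(pivot_cols) == np.linalg.matrix_rank(U))    # rank = number of pivots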

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Fundamental Theorem.

    The nullspace N(A) and row space C(AT) are orthogonal complements in Rn (the perpendicularity comes from Ax = 0), with dimensions n - r and r. Applied to AT, the column space C(A) is the orthogonal complement of N(AT) in Rm.
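
    A small numerical illustration, assuming SciPy's null_space helper (the matrix is made up):

        import numpy as np
        from scipy.linalg import null_space

        A = np.array([[1., 2., 3.],
                      [2., 4., 6.]])          # rank r = 1, n = 3 columns

        N = null_space(A)                     # columns span N(A)
        r = np.linalg.matrix_rank(A)

        print(N.shape[1] == A.shape[1] - r)   # dim N(A) = n - r
        print(np.allclose(A @ N, 0))          # every row of A is perpendicular to N(A)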

  • Gauss-Jordan method.

    Invert A by row operations on [A I] to reach [I A⁻¹].
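
    A bare-bones sketch of the method in NumPy (no row exchanges or zero-pivot handling; the 2 by 2 matrix is illustrative):

        import numpy as np

        A = np.array([[2., 1.],
                      [6., 4.]])
        n = len(A)
        M = np.hstack([A, np.eye(n)])         # the augmented block [A I]

        for j in range(n):                    # clear above and below each pivot
            M[j] /= M[j, j]                   # scale the pivot row so the pivot is 1
            for i in range(n):
                if i != j:
                    M[i] -= M[i, j] * M[j]    # subtract a multiple of the pivot row

        A_inv = M[:, n:]                      # [A I] has become [I A^-1]
        print(np.allclose(A_inv, np.linalg.inv(A)))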

  • Hypercube matrix pl.

    Row n + 1 counts corners, edges, faces, ... of a cube in Rn.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
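
    A hypothetical 3-node, 3-edge example in NumPy: each row gets -1 at its start node and +1 at its end node.

        import numpy as np

        # directed edges (i, j): node i -> node j, for a small triangle graph
        edges = [(0, 1), (1, 2), (0, 2)]
        m, n = len(edges), 3

        A = np.zeros((m, n))
        for row, (i, j) in enumerate(edges):
            A[row, i] = -1                     # edge leaves node i
            A[row, j] = +1                     # edge enters node j

        print(A)
        print(np.allclose(A @ np.ones(n), 0))  # row sums are 0: (1, ..., 1) is in the nullspace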

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
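
    The four equivalent descriptions, checked on small random matrices with NumPy (shapes are arbitrary):

        import numpy as np

        rng = np.random.default_rng(1)
        A, B = rng.normal(size=(3, 4)), rng.normal(size=(4, 2))
        AB = A @ B

        # entry (i, j) is (row i of A) . (column j of B)
        print(np.isclose(AB[0, 1], A[0, :] @ B[:, 1]))

        # column j of AB is A times column j of B
        print(np.allclose(AB[:, 1], A @ B[:, 1]))

        # row i of AB is (row i of A) times B
        print(np.allclose(AB[0, :], A[0, :] @ B))

        # columns times rows: AB is the sum of (column k of A)(row k of B)
        print(np.allclose(AB, sum(np.outer(A[:, k], B[k, :]) for k in range(4))))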

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
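
    One elimination step as a NumPy sketch (the matrix is illustrative): the multiplier ℓ21 clears the (2, 1) entry.

        import numpy as np

        A = np.array([[2., 1.],
                      [6., 4.]])

        # multiplier = (entry to eliminate) / (pivot): here l21 = 6 / 2 = 3
        l21 = A[1, 0] / A[0, 0]
        A[1] -= l21 * A[0]            # subtract 3 times row 1 from row 2

        print(l21)                    # 3.0
        print(A)                      # [[2. 1.] [0. 1.]] -- the (2, 1) entry is gone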

  • Network.

    A directed graph that has constants c1, ..., cm associated with the edges.

  • Orthogonal matrix Q.

    Square matrix with orthonormal columns, so QT = Q⁻¹. Preserves length and angles: ||Qx|| = ||x|| and (Qx)T(Qy) = xTy. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
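
    A quick NumPy check of these properties, using a rotation as the example orthogonal matrix (the angle is arbitrary):

        import numpy as np

        theta = 0.7
        c, s = np.cos(theta), np.sin(theta)
        Q = np.array([[c, -s],
                      [s,  c]])       # orthonormal columns

        x, y = np.array([3., 4.]), np.array([-1., 2.])

        print(np.allclose(Q.T, np.linalg.inv(Q)))                    # Q^T = Q^-1
        print(np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x)))  # lengths preserved
        print(np.isclose((Q @ x) @ (Q @ y), x @ y))                  # inner products (angles) preserved
        print(np.allclose(np.abs(np.linalg.eigvals(Q)), 1))          # all |lambda| = 1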

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Pseudoinverse A+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(AT). A+A and AA+ are the projection matrices onto the row space and column space. Rank(A+) = rank(A).
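
    A sketch using NumPy's np.linalg.pinv on a made-up rank-deficient matrix:

        import numpy as np

        A = np.array([[1., 2.],
                      [2., 4.],
                      [3., 6.]])      # rank 1, so no ordinary inverse exists

        A_plus = np.linalg.pinv(A)    # the n by m Moore-Penrose pseudoinverse

        P_col = A @ A_plus            # projection onto the column space C(A)
        P_row = A_plus @ A            # projection onto the row space C(A^T)

        print(A_plus.shape)                                               # (2, 3)
        print(np.allclose(P_col @ P_col, P_col), np.allclose(P_row @ P_row, P_row))
        print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # equal ranks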

  • Rank one matrix A = uvT ≠ 0.

    Column and row spaces = lines cu and cv.
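
    A tiny NumPy illustration (u and v are arbitrary nonzero vectors):

        import numpy as np

        u, v = np.array([1., 2., 3.]), np.array([4., 5.])
        A = np.outer(u, v)                # A = u v^T, a 3 by 2 matrix

        print(np.linalg.matrix_rank(A))   # 1: every column is a multiple of u,
                                          #    every row is a multiple of v^T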

  • Rotation matrix

    R = [c -s; s c] rotates the plane by θ and R⁻¹ = RT rotates back by -θ. Eigenvalues are e^{iθ} and e^{-iθ}, eigenvectors are (1, ±i). c, s = cos θ, sin θ.
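
    A numerical check of the eigenvalue claim in NumPy (the angle θ is arbitrary):

        import numpy as np

        theta = 0.5
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s],
                      [s,  c]])

        vals = np.sort_complex(np.linalg.eigvals(R))
        expected = np.sort_complex(np.array([np.exp(1j * theta), np.exp(-1j * theta)]))
        print(np.allclose(vals, expected))                 # eigenvalues are e^{±i theta}

        # (1, -i) is an eigenvector for e^{i theta}
        v = np.array([1, -1j])
        print(np.allclose(R @ v, np.exp(1j * theta) * v))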

  • Schur complement S = D - CA⁻¹B.

    Appears in block elimination on [A B; C D].
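
    A block-elimination sketch in NumPy (the four 2 by 2 blocks are made up): clearing the C block leaves S = D - CA⁻¹B in the lower-right corner.

        import numpy as np

        A = np.array([[4., 1.], [2., 3.]])
        B = np.array([[1., 0.], [2., 1.]])
        C = np.array([[0., 1.], [1., 1.]])
        D = np.array([[5., 2.], [1., 4.]])

        M = np.block([[A, B],
                      [C, D]])

        # block elimination: subtract (C A^-1) times the first block row from the second
        L = C @ np.linalg.inv(A)
        M[2:] -= L @ M[:2]

        S = D - C @ np.linalg.inv(A) @ B      # the Schur complement
        print(np.allclose(M[2:, 2:], S))      # the lower-right block is now S
        print(np.allclose(M[2:, :2], 0))      # and the C block has been eliminated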

  • Special solutions to As = 0.

    One free variable is si = 1, other free variables = 0.
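
    A worked sketch in NumPy for a made-up reduced matrix with free variables x2 and x4 (the special solutions are read off by hand in the comments):

        import numpy as np

        # a reduced echelon matrix R with pivot columns 0, 2 and free columns 1, 3
        R = np.array([[1., 3., 0., 2.],
                      [0., 0., 1., 4.]])

        # special solution with free variable x2 = 1 (and x4 = 0): back-substitute the pivots
        s1 = np.array([-3., 1., 0., 0.])
        # special solution with free variable x4 = 1 (and x2 = 0)
        s2 = np.array([-2., 0., -4., 1.])

        print(np.allclose(R @ s1, 0), np.allclose(R @ s2, 0))   # both solve Rs = 0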

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = ATCA where C has the spring constants from Hooke's Law and Ax = stretching.
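
    A minimal sketch in NumPy for a made-up line of two free nodes between fixed supports, joined by three springs, just to show the shape of K = ATCA:

        import numpy as np

        A = np.array([[ 1.,  0.],             # stretching of each spring from the node movements x
                      [-1.,  1.],
                      [ 0., -1.]])
        C = np.diag([100., 200., 300.])       # spring constants from Hooke's Law (made-up values)

        K = A.T @ C @ A                       # the stiffness matrix
        f = K @ np.array([0.01, 0.02])        # K x = internal forces for given node movements x

        print(K)                              # [[ 300. -200.] [-200.  500.]]
        print(f)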

  • Symmetric matrix A.

    The transpose is AT = A, and aij = aji. A⁻¹ is also symmetric.
