Solutions for Chapter 1.6: Equations and Inequalities Involving Absolute Value

Full solutions for College Algebra | 9th Edition

Textbook: College Algebra
Edition: 9
Author: Michael Sullivan
ISBN: 9780321716811

This expansive textbook survival guide covers the following chapters and their solutions. Chapter 1.6: Equations and Inequalities Involving Absolute Value includes 95 full step-by-step solutions, and more than 37319 students have viewed them. College Algebra was written by Michael Sullivan and is associated with the ISBN 9780321716811. This textbook survival guide was created for College Algebra, 9th edition.
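
As a quick taste of the chapter's topic (this particular exercise is my own illustration, not one of the 95 solved problems), an absolute value inequality is solved by rewriting it as a compound inequality, and sympy can check the result:

    # |2x - 1| < 5  means  -5 < 2x - 1 < 5,  i.e.  -2 < x < 3.
    from sympy import Abs, Symbol
    from sympy.solvers.inequalities import solve_univariate_inequality

    x = Symbol('x', real=True)
    print(solve_univariate_inequality(Abs(2*x - 1) < 5, x))  # (-2 < x) & (x < 3)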

Key Math Terms and definitions covered in this textbook
  • Associative Law (AB)C = A(BC).

    Parentheses can be removed to leave ABC.

  • Cofactor C_ij.

    Remove row i and column j; multiply the determinant by (-1)^(i+j).
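
    A hedged numeric sketch of this definition (the helper name cofactor is mine, using numpy):

        import numpy as np

        def cofactor(A, i, j):
            # Delete row i and column j, then attach the sign (-1)^(i+j)
            # to the determinant of the remaining minor.
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            return (-1) ** (i + j) * np.linalg.det(minor)

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        print(cofactor(A, 0, 0))  # minor is [[4.]], so C_00 = 4.0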

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Echelon matrix U.

    The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
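
    To see the shape this describes, sympy's echelon_form can reduce a small example (a sketch; the input matrix is arbitrary):

        from sympy import Matrix

        A = Matrix([[1, 2, 3],
                    [2, 4, 7],
                    [0, 0, 1]])
        U = A.echelon_form()
        # Each pivot lies in a later column than the pivot above it,
        # and the all-zero row comes last.
        print(U)  # expected: Matrix([[1, 2, 3], [0, 0, 1], [0, 0, 0]])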

  • Graph G.

    Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
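
    A quick check of the edge counts (plain Python; nodes labeled 0..n-1 for illustration):

        from itertools import combinations

        n = 5
        complete_edges = list(combinations(range(n), 2))  # all unordered pairs
        assert len(complete_edges) == n * (n - 1) // 2    # 10 edges for n = 5

        path_tree = [(0, 1), (1, 2), (2, 3), (3, 4)]      # a tree: n - 1 = 4 edges, no loops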

  • Incidence matrix of a directed graph.

The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
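
    A minimal construction sketch (numpy; the three-edge graph is a made-up example):

        import numpy as np

        edges = [(0, 1), (1, 2), (0, 2)]   # directed edges (node i -> node j)
        m, n = len(edges), 3

        A = np.zeros((m, n))
        for row, (i, j) in enumerate(edges):
            A[row, i] = -1                 # the edge leaves node i
            A[row, j] = +1                 # the edge enters node j
        print(A)                           # m by n edge-node incidence matrix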

  • Krylov subspace K_j(A, b).

    The subspace spanned by b, Ab, ..., A^(j-1)b. Numerical methods approximate A^(-1)b by x_j with residual b - Ax_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
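
    A bare-bones sketch of building these vectors (the function name krylov_basis is mine; practical methods such as Arnoldi orthogonalize as they go):

        import numpy as np

        def krylov_basis(A, b, j):
            # Columns b, Ab, ..., A^(j-1) b; each new column costs one product with A.
            cols = [b]
            for _ in range(j - 1):
                cols.append(A @ cols[-1])
            return np.column_stack(cols)

        A = np.array([[2.0, 1.0], [0.0, 3.0]])
        b = np.array([0.0, 1.0])
        print(krylov_basis(A, b, 2))   # basis for K_2(A, b): columns b and Ab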

  • Linear transformation T.

Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av; differentiation and integration in function space.
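
    A one-line numeric check of the linearity requirement (numpy, with an arbitrary 2 by 2 matrix standing in for T):

        import numpy as np

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        T = lambda x: A @ x                    # matrix multiplication as T
        v, w = np.array([1.0, 0.0]), np.array([0.0, 1.0])
        c, d = 2.0, -3.0
        print(np.allclose(T(c*v + d*w), c*T(v) + d*T(w)))   # True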

  • Minimal polynomial of A.

The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).
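
    A numeric sketch for a matrix with distinct eigenvalues, where m equals the characteristic polynomial (the matrix is a made-up example):

        import numpy as np

        A = np.array([[2.0, 0.0], [0.0, 3.0]])   # eigenvalues 2 and 3, none repeated
        I = np.eye(2)
        print((A - 2*I) @ (A - 3*I))             # m(A) = (A - 2I)(A - 3I) = zero matrix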

  • Orthogonal subspaces.

    Every v in V is orthogonal to every w in W.

  • Orthonormal vectors q_1, ..., q_n.

    Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
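
    A numeric check of both properties (numpy; the two columns form a made-up orthonormal pair in R^2):

        import numpy as np

        Q = np.array([[1.0, 1.0],
                      [1.0, -1.0]]) / np.sqrt(2)
        print(np.allclose(Q.T @ Q, np.eye(2)))   # Q^T Q = I

        v = np.array([3.0, 4.0])
        recon = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))  # sum of (v^T q_j) q_j
        print(np.allclose(recon, v))             # the expansion recovers v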

  • Plane (or hyperplane) in R^n.

    Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
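
    scipy ships this factorization directly; a short check (the matrix is arbitrary):

        import numpy as np
        from scipy.linalg import polar

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        Q, H = polar(A)                          # A = QH
        print(np.allclose(Q @ H, A))             # True
        print(np.allclose(Q.T @ Q, np.eye(2)))   # Q is orthogonal
        print(np.allclose(H, H.T))               # H is symmetric (and PSD)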

  • Pseudoinverse A^+ (Moore-Penrose inverse).

    The n by m matrix that "inverts" A from column space back to row space, with N(A^+) = N(A^T). A^+ A and A A^+ are the projection matrices onto the row space and column space. Rank(A^+) = rank(A).
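
    A short numpy check of these properties on a rank-1 example:

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [0.0, 0.0]])               # rank 1
        A_plus = np.linalg.pinv(A)
        print(np.allclose(A @ A_plus @ A, A))    # A A+ projects, so A A+ A = A
        print(np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A))  # True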

  • Rank r(A).

    = number of pivots = dimension of column space = dimension of row space.

  • Row space C(A^T) = all combinations of rows of A.

    Column vectors by convention.

  • Spanning set.

Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

    Spectral radius = max of |λ_i|.
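
    A direct computation (numpy; the matrix is a made-up companion-style example):

        import numpy as np

        A = np.array([[0.0, 1.0],
                      [-2.0, -3.0]])
        eigenvalues = np.linalg.eigvals(A)       # the spectrum: -1 and -2
        print(max(abs(eigenvalues)))             # spectral radius = 2.0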

  • Unitary matrix U^H = Ū^T = U^(-1).

    Orthonormal columns (complex analog of Q).

  • Vector v in R^n.

    Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.