
Solutions for Chapter 7.3: Square Root Functions and Inequalities


Full solutions for California Algebra 2: Concepts, Skills, and Problem Solving | 1st Edition

ISBN: 9780078778568

Textbook: California Algebra 2: Concepts, Skills, and Problem Solving
Edition: 1
Author: Berchie Holliday
ISBN: 9780078778568

This textbook survival guide was created for California Algebra 2: Concepts, Skills, and Problem Solving, 1st edition. The expansive survival guide covers the textbook's chapters and their solutions; Chapter 7.3: Square Root Functions and Inequalities includes 47 full step-by-step solutions. California Algebra 2: Concepts, Skills, and Problem Solving was written by Berchie Holliday and is associated with ISBN 9780078778568. All 47 problems in Chapter 7.3 have been answered, and more than 36524 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).
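
    A minimal numpy sketch of this definition (the 3-node graph below is made up purely for illustration):

        import numpy as np

        # Undirected graph on 3 nodes with edges 0-1 and 1-2 (hypothetical example).
        A = np.zeros((3, 3), dtype=int)
        for i, j in [(0, 1), (1, 2)]:
            A[i, j] = 1   # edge from node i to node j
            A[j, i] = 1   # undirected: the edge goes both ways, so A = A^T

        print(np.array_equal(A, A.T))  # True for an undirected graph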

  • Affine transformation

    Tv = Av + v0 = linear transformation plus shift.

  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
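
    A hedged numpy check of the rank criterion (the 2x2 system here is invented for the example):

        import numpy as np

        A = np.array([[1.0, 2.0], [2.0, 4.0]])    # rank 1 (second row = 2 * first row)
        b_good = np.array([[3.0], [6.0]])          # in the column space of A
        b_bad  = np.array([[3.0], [5.0]])          # not in the column space of A

        # Ax = b is solvable exactly when rank([A b]) equals rank(A).
        rank_A = np.linalg.matrix_rank(A)
        print(np.linalg.matrix_rank(np.hstack([A, b_good])) == rank_A)  # True: solvable
        print(np.linalg.matrix_rank(np.hstack([A, b_bad]))  == rank_A)  # False: no solution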

  • Block matrix.

    A matrix can be partitioned into matrix blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

  • Cofactor Cij.

    Remove row i and column j; multiply the determinant of what remains by (-1)^(i+j).
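
    A small numpy sketch of the cofactor formula (the 3x3 matrix is arbitrary), verified against the cofactor expansion of the determinant:

        import numpy as np

        def cofactor(A, i, j):
            # Delete row i and column j, then scale the minor's determinant by (-1)^(i+j).
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            return (-1) ** (i + j) * np.linalg.det(minor)

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])

        # Cofactor expansion along row 0 reproduces det(A).
        expansion = sum(A[0, j] * cofactor(A, 0, j) for j in range(3))
        print(np.isclose(expansion, np.linalg.det(A)))  # True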

  • Column picture of Ax = b.

    The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C(A).

  • Condition number

    cond(A) = c(A) = ||A|| ||A^-1|| = σ_max / σ_min. In Ax = b, the relative change ||δx|| / ||x|| is less than cond(A) times the relative change ||δb|| / ||b||. Condition numbers measure the sensitivity of the output to changes in the input.
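
    A numpy illustration of that sensitivity bound (the nearly singular matrix and the perturbation are chosen only for the example):

        import numpy as np

        A = np.array([[1.0, 1.0], [1.0, 1.0001]])   # nearly singular, so cond(A) is large
        b = np.array([2.0, 2.0001])
        db = np.array([0.0, 1e-4])                   # small change in the right side

        x  = np.linalg.solve(A, b)
        dx = np.linalg.solve(A, b + db) - x

        lhs = np.linalg.norm(dx) / np.linalg.norm(x)
        rhs = np.linalg.cond(A) * np.linalg.norm(db) / np.linalg.norm(b)
        print(lhs <= rhs)   # relative change in x is bounded by cond(A) times that in b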

  • Hilbert matrix hilb(n).

    Entries H_ij = 1/(i + j - 1) = integral of x^(i-1) x^(j-1) dx over [0, 1]. Positive definite, but with an extremely small λ_min and a large condition number: H is ill-conditioned.
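
    A short numerical check of the ill-conditioning claim, using scipy's hilbert helper (the sizes are chosen arbitrarily):

        import numpy as np
        from scipy.linalg import hilbert

        for n in (4, 8, 12):
            H = hilbert(n)                    # H[i, j] = 1 / (i + j + 1) with 0-based indices
            eigs = np.linalg.eigvalsh(H)      # H is symmetric positive definite
            print(n, eigs.min(), np.linalg.cond(H))   # lambda_min shrinks, cond(H) explodes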

  • Hypercube matrix pl.

    Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

  • Length II x II.

    Square root of x^T x (Pythagoras in n dimensions).

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
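
    A tiny numpy example with a strictly upper triangular matrix (the entries are made up):

        import numpy as np

        # Strictly upper triangular, hence nilpotent: N^3 = 0 for this 3x3 example.
        N = np.array([[0, 1, 2],
                      [0, 0, 3],
                      [0, 0, 0]])

        print(np.linalg.matrix_power(N, 3))   # the zero matrix
        print(np.linalg.eigvals(N))           # all eigenvalues are 0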

  • Normal equation A^T Ax = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0.
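
    A hedged numpy sketch of least squares via the normal equation (the data points are invented):

        import numpy as np

        # Fit a line c0 + c1 * t to four made-up data points (overdetermined system).
        t = np.array([0.0, 1.0, 2.0, 3.0])
        b = np.array([1.1, 1.9, 3.2, 3.9])
        A = np.column_stack([np.ones_like(t), t])   # full column rank (independent columns)

        x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equation A^T A x = A^T b

        # The residual b - A x_hat is orthogonal to every column of A.
        print(A.T @ (b - A @ x_hat))                # approximately the zero vector
        print(np.allclose(x_hat, np.linalg.lstsq(A, b, rcond=None)[0]))  # matches lstsq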

  • Rank one matrix A = uv^T ≠ 0.

    Column and row spaces = lines cu and cv.
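
    A quick numpy check (u and v are arbitrary nonzero vectors):

        import numpy as np

        u = np.array([[1.0], [2.0], [3.0]])
        v = np.array([[4.0], [5.0]])
        A = u @ v.T                      # 3x2 outer product, A = u v^T

        # Rank is 1: every column is a multiple of u, every row is a multiple of v^T.
        print(np.linalg.matrix_rank(A))  # 1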

  • Reflection matrix (Householder) Q = I - 2uu^T.

    Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
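
    A numpy sketch of the Householder matrix (the unit vector u is arbitrary):

        import numpy as np

        u = np.array([1.0, 2.0, 2.0])
        u = u / np.linalg.norm(u)             # unit vector
        Q = np.eye(3) - 2.0 * np.outer(u, u)  # reflection matrix Q = I - 2 u u^T

        print(np.allclose(Q @ u, -u))              # u is reflected to -u
        print(np.allclose(Q.T, np.linalg.inv(Q)))  # Q^T = Q^-1
        print(np.allclose(Q, Q.T))                 # and Q is symmetric, so Q^T = Q^-1 = Q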

  • Saddle point of f(x1, ..., xn).

    A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.

  • Simplex method for linear programming.

    The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
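
    An illustrative linear program solved with scipy (its default solver is not the classical simplex method, but the optimum still lands at a corner of the feasible set; the numbers are invented):

        import numpy as np
        from scipy.optimize import linprog

        # Minimize c^T x subject to A_eq x = b_eq and x >= 0 (a made-up 2-variable problem).
        c    = np.array([1.0, 2.0])
        A_eq = np.array([[1.0, 1.0]])
        b_eq = np.array([1.0])

        res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None), (0, None)])
        print(res.x)    # optimum at the corner (1, 0) of the feasible segment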

  • Skew-symmetric matrix K.

    The transpose is -K, since Kij = -Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
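
    A small numerical check with scipy (K is an arbitrary 2x2 skew-symmetric example):

        import numpy as np
        from scipy.linalg import expm

        K = np.array([[0.0, 2.0], [-2.0, 0.0]])   # K^T = -K

        print(np.linalg.eigvals(K))                # purely imaginary: 2i and -2i
        Q = expm(K * 0.5)                          # e^(Kt) for t = 0.5
        print(np.allclose(Q.T @ Q, np.eye(2)))     # e^(Kt) is an orthogonal matrix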

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Trace of A

    = sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
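
    A short numpy check of both identities (the matrices are random, chosen only for the example):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((4, 4))
        B = rng.standard_normal((4, 4))

        print(np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real))  # trace = sum of eigenvalues
        print(np.isclose(np.trace(A @ B), np.trace(B @ A)))              # Tr AB = Tr BA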

  • Tridiagonal matrix T: tij = 0 if |i - j| > 1.

    T^-1 has rank 1 above and below the diagonal.
