Chapter 7.1: x^2 - 5x + 2 = 0
Chapter 7.2: 3x^2 + x - 4 = 0
Chapter 7.3: (3x^2 - 14x - 24) ÷ (x - 6)
Chapter 7.4: (a^2 - 2a - 30) ÷ (a + 7)
 Chapter 7.5: Perform the division indicated by 170 t _2 t 2 + 1
 Chapter 7.6: Use the formula to estimate how many people will become ill during ...
 Chapter 7.7: Express the price of the meal after the discount and the price of t...
 Chapter 7.8: Which composition of functions represents the price of the meal, p[...
Chapter 7.9: f(x) = x + 73; g(x) = x - 73
Chapter 7.10: g(x) = 7x - 11; h(x) = (1/7)x + 11
 Chapter 7.11: Find the inverse f 1(x). What is the significance of f 1(x) for K...
 Chapter 7.12: What will the new carpet cost Kimi?
Chapter 7.13: y < √(x + 3)
 Chapter 7.14: y 5 x
 Chapter 7.15: y = 3  x 1
 Chapter 7.16: y = 5x
Chapter 7.17: y = √(2x - 7) + 4
Chapter 7.18: y = 2√(6x) - 1
Chapter 7.19: MULTIPLE CHOICE What is the domain of f(x) = √(5x - 3)? A {x | x > 3/5} ...
Chapter 7.20: √(36x^2y^6)
Chapter 7.21: ∛(64a^6b^9)
Chapter 7.22: √(4n^2 + 12n + 9)
Chapter 7.23: √(x^4 / y^3)
 Chapter 7.24: MULTIPLE CHOICE The relationship between the length and mass of Pac...
 Chapter 7.25: BASEBALL Refer to the drawing below. How far does the catcher have ...
Chapter 7.26: y = √(5x - 3)
Chapter 7.27: y = 4 + 2√(x - 3)
 Chapter 7.28: y x  2
Chapter 7.29: y < √(4x - 5)
 Chapter 7.30: OCEAN The speed a tsunami, or tidal wave, can travel is modeled by ...
Chapter 7.31: ⁴√256
Chapter 7.32: ∛216
Chapter 7.33: (-8)^(2/3)
Chapter 7.34: ⁵√(c^5 d^15)
Chapter 7.35: (x^4 - 3)^(2/3)
Chapter 7.36: ∛((512 + x^2)^3)
Chapter 7.37: √(16m^8)
Chapter 7.38: √(a^2 - 10a + 25)
 Chapter 7.39: PHYSICS The velocity v of an object can be defined as v = _2K m , w...
Chapter 7.40: √128
Chapter 7.41: 5√12 - 3√75
Chapter 7.42: 6·⁵√11 - 8·⁵√11
Chapter 7.43: (8 + √12)^2
 Chapter 7.44: 8 15 21
 Chapter 7.45: 243 3
Chapter 7.46: 1 / (3 + √5)
Chapter 7.47: 10 / (4 + √2)
 Chapter 7.48: GEOMETRY The measures of the legs of a right triangle can be repres...
Chapter 7.49: 27^(-2/3)
Chapter 7.50: 9^(1/3) · 9^(5/3)
Chapter 7.51: (8/27)^(-2/3)
Chapter 7.52: 1 / y^(2/5)
 Chapter 7.53: xy 3 z
 Chapter 7.54: 3x + 4 x _2 x  _2 3
 Chapter 7.55: ELECTRICITY The amount of current in amperes I that an appliance us...
Chapter 7.56: √x = 6
Chapter 7.57: y^(1/3) - 7 = 0
Chapter 7.58: (x - 2)^(3/2) = 8
Chapter 7.59: √(x + 5) - 3 = 0
Chapter 7.60: √(3t - 5) - 3 = 4
Chapter 7.61: √(2x - 1) = 3
Chapter 7.62: ⁴√(2x - 1) = 2
Chapter 7.63: √(y + 5) = √(2y - 3)
Chapter 7.64: √(y + 1) + √(y - 4) = 5
Chapter 7.65: 1 + √(5x - 2) > 4
Chapter 7.66: √(2x + 14) - 6 ≤ 4
Chapter 7.67: 10 - √(2x + 7) ≥ 3
Chapter 7.68: 6 + √(3y + 4) < 6
Chapter 7.69: √(d + 3) + √(d + 7) > 4
Chapter 7.70: √(2x + 5) - √(9 + x) > 0
 Chapter 7.71: GRAVITY Hugo drops his keys from the top of a Ferris wheel. The for...
Solutions for Chapter 7: Radical Equations and Inequalities
Full solutions for Algebra 2, Student Edition (MERRILL ALGEBRA 2), 1st Edition
ISBN: 9780078738302

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Basis for V.
Independent vectors v_1, ..., v_d whose linear combinations give each vector in V as v = c_1 v_1 + ... + c_d v_d. Each basis gives unique c's; a vector space has many bases!

Cayley-Hamilton Theorem.
p(λ) = det(A - λI) has p(A) = zero matrix.
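As a quick numerical check of this entry (my own sketch using numpy, not part of the glossary; the 2-by-2 matrix is made up), the characteristic polynomial really does annihilate the matrix itself:

```python
import numpy as np

# Verify Cayley-Hamilton: p(A) = 0 where p(lambda) = det(A - lambda*I).
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Coefficients of the characteristic polynomial, highest degree first.
coeffs = np.poly(A)                     # here: lambda^2 - 5*lambda + 6

# Evaluate p at the matrix itself: A^2 - 5A + 6I.
pA = sum(c * np.linalg.matrix_power(A, len(coeffs) - 1 - k)
         for k, c in enumerate(coeffs))

print(np.allclose(pA, np.zeros_like(A)))  # True: p(A) is the zero matrix
```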

Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.

Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in the nullspace).
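A small numpy sketch of this idea (my illustration; the singular matrix and right-hand side are made up): any particular solution plus any multiple of a nullspace vector still solves Ax = b.

```python
import numpy as np

# A is singular (row 2 = 2 * row 1), so the nullspace is nontrivial.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b = np.array([3.0, 6.0])                    # consistent right-hand side

x_p = np.linalg.lstsq(A, b, rcond=None)[0]  # one particular solution
x_n = np.array([-2.0, 1.0])                 # nullspace vector: A @ x_n = 0

# Every x_p + c * x_n is a complete solution of Ax = b.
for c in (0.0, 1.5, -7.0):
    assert np.allclose(A @ (x_p + c * x_n), b)
print("x_p + c*x_n solves Ax = b for every c")
```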

Complex conjugate
z̄ = a - ib for any complex number z = a + ib. Then z·z̄ = |z|².
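A one-line check in Python (my example, not from the glossary) that a number times its conjugate gives the squared absolute value:

```python
# z * conj(z) = |z|^2 for any complex z = a + ib.
z = 3 + 4j
z_bar = z.conjugate()          # a - ib
print(z * z_bar)               # (25+0j), which equals |z|^2 = 5^2
assert z * z_bar == abs(z) ** 2
```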

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T Ax - x^T b over growing Krylov subspaces.
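A minimal CG iteration (my own sketch, assuming numpy; not the book's code). Each step does an exact line search along a search direction that is kept A-conjugate to the previous ones:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Solve positive definite Ax = b by conjugate gradients."""
    x = np.zeros_like(b)
    r = b - A @ x                   # residual
    p = r.copy()                    # first search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p   # next direction, A-conjugate to p
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
print(np.allclose(A @ x, b))             # True
```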

Diagonalization
Λ = S^(-1) A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^(-1).
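A numpy sketch of diagonalization (my example matrix, not from the text): the eigenvector matrix S and eigenvalue matrix Λ recover A, and powers of A come from powers of Λ:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])             # eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)          # columns of S are eigenvectors
Lam = np.diag(eigvals)                 # Lambda = eigenvalue matrix

# A = S Lambda S^-1, and A^k = S Lambda^k S^-1.
assert np.allclose(A, S @ Lam @ np.linalg.inv(S))
A3 = S @ np.linalg.matrix_power(Lam, 3) @ np.linalg.inv(S)
assert np.allclose(A3, np.linalg.matrix_power(A, 3))
print("A = S Lambda S^-1 verified")
```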

Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).

Ellipse (or ellipsoid) x^T Ax = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^(-1)y||^2 = y^T (A A^T)^(-1) y = 1 displayed by eigshow; axis lengths a…

Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
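A quick construction (my illustration; the values are arbitrary) showing the defining property: entry (i, j) depends only on i + j, so every antidiagonal is constant.

```python
import numpy as np

vals = [1, 2, 3, 4, 5, 6, 7]           # one value per antidiagonal
n = 4
H = np.array([[vals[i + j] for j in range(n)] for i in range(n)])
print(H)

# Moving down-left along an antidiagonal keeps i + j, hence the entry, fixed.
assert all(H[i, j] == H[i + 1, j - 1]
           for i in range(n - 1) for j in range(1, n))
```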

Hypercube matrix P.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and +1 in columns i and j.
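A sketch of the construction (my example graph, not from the text): each edge contributes one row with -1 at its start node and +1 at its end node, so every row sums to zero.

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]       # directed edges (node i -> node j)
n_nodes = 3
A = np.zeros((len(edges), n_nodes))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1                     # edge leaves node i
    A[row, j] = +1                     # edge enters node j
print(A)

# Row sums are zero: the all-ones vector is in the nullspace of A.
assert np.allclose(A.sum(axis=1), 0)
```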

Jordan form J = M^(-1) A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.

Krylov subspace K_j(A, b).
The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^(-1) b by x_j with residual b - Ax_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
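A minimal sketch (my own, assuming numpy) of building the Krylov basis by repeated matrix-vector products, one matvec per new basis vector:

```python
import numpy as np

def krylov_basis(A, b, j):
    """Columns b, Ab, ..., A^(j-1) b, built with matvecs only."""
    cols = [b]
    for _ in range(j - 1):
        cols.append(A @ cols[-1])      # never form A^(-1) or powers of A
    return np.column_stack(cols)

A = np.array([[2.0, 1.0], [0.0, 3.0]])
b = np.array([1.0, 1.0])
K = krylov_basis(A, b, 2)
print(K.shape)                         # (2, 2)
assert np.allclose(K[:, 1], A @ b)
```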

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
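A numpy check of these properties (my example; a plane rotation stands in for a general Q):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrix

assert np.allclose(Q.T @ Q, np.eye(2))            # Q^T Q = I, so Q^T = Q^-1
x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # length kept
assert np.isclose((Q @ x) @ (Q @ y), x @ y)                  # angles kept
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)        # all |lambda| = 1
print("orthogonality checks passed")
```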

Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A(A^T A)^(-1) A^T.
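A numpy sketch (my example basis, not from the text) building P from a basis for S and checking the identities in this entry:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])             # columns: basis for a plane S in R^3
P = A @ np.linalg.inv(A.T @ A) @ A.T   # P = A (A^T A)^{-1} A^T

b = np.array([6.0, 0.0, 0.0])
p = P @ b                              # closest point to b in S
e = b - p                              # error, perpendicular to S

assert np.allclose(P @ P, P)           # P^2 = P
assert np.allclose(P, P.T)             # P = P^T
assert np.allclose(A.T @ e, 0)         # e is perpendicular to the basis of S
print("projection identities verified")
```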

Pseudoinverse A+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+A and AA+ are the projection matrices onto the row space and column space. Rank(A+) = rank(A).
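A numpy check (my rank-1 example) that A+A and AA+ behave as the projections this entry describes:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])             # rank 1
A_plus = np.linalg.pinv(A)             # Moore-Penrose pseudoinverse

P_row = A_plus @ A                     # projection onto the row space
P_col = A @ A_plus                     # projection onto the column space
assert np.allclose(P_row @ P_row, P_row)
assert np.allclose(P_col @ P_col, P_col)
assert np.linalg.matrix_rank(A_plus) == np.linalg.matrix_rank(A) == 1
print("pseudoinverse projections verified")
```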

Saddle point of f(x_1, ..., x_n).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.

Wavelets w_jk(t).
Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
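The glossary does not fix a particular mother wavelet w_00, so this sketch (my own) uses the Haar wavelet as a common choice and builds the stretched and shifted family from it:

```python
import numpy as np

def w00(t):
    """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), else 0."""
    t = np.asarray(t, dtype=float)
    return np.where((0 <= t) & (t < 0.5), 1.0,
           np.where((0.5 <= t) & (t < 1.0), -1.0, 0.0))

def w(j, k, t):
    """Stretched and shifted wavelet w_jk(t) = w_00(2^j t - k)."""
    return w00(2.0**j * t - k)

t = np.linspace(0, 1, 8, endpoint=False)
print(w(1, 0, t))      # w_10 is compressed by 2 and lives on [0, 1/2)
```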