
Solutions for Chapter 5-7: Analyzing Graphs of Quadratic Functions

Full solutions for Algebra 2, Student Edition (MERRILL ALGEBRA 2) | 1st Edition

Textbook: Algebra 2, Student Edition (MERRILL ALGEBRA 2)
Edition: 1
Author: McGraw-Hill Education
ISBN: 9780078738302

Algebra 2, Student Edition (MERRILL ALGEBRA 2) was written by McGraw-Hill Education and is associated with the ISBN 9780078738302. This expansive textbook survival guide covers the following chapters and their solutions, and was created for the textbook Algebra 2, Student Edition (MERRILL ALGEBRA 2), Edition 1. Chapter 5-7: Analyzing Graphs of Quadratic Functions includes 70 full step-by-step solutions. Since all 70 problems in Chapter 5-7 have been answered, more than 56,137 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = A^T when edges go both ways (undirected).

  • Back substitution.

    Upper triangular systems are solved in reverse order, from xn back to x1.
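
    As a quick illustration (a minimal Python sketch, not from the textbook; the function name back_substitution and the 2-by-2 example are made up), the last equation gives xn directly and each earlier equation uses the x's already found:

        def back_substitution(U, b):
            """Solve U x = b for an upper triangular matrix U (lists of lists)."""
            n = len(b)
            x = [0.0] * n
            # Work backwards from the last unknown to the first.
            for i in range(n - 1, -1, -1):
                s = sum(U[i][j] * x[j] for j in range(i + 1, n))
                x[i] = (b[i] - s) / U[i][i]
            return x

        # Example: 2x + 3y = 8 and 4y = 4  ->  y = 1, x = 2.5
        print(back_substitution([[2, 3], [0, 4]], [8, 4]))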

  • Cofactor Cij.

    Remove row i and column j; multiply the determinant by (-1)^(i+j).
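
    For instance (a hypothetical NumPy sketch, not part of the glossary; the helper name cofactor is invented), delete row i and column j and attach the sign (-1)^(i+j):

        import numpy as np

        def cofactor(A, i, j):
            # Remove row i and column j, then multiply the minor's determinant by (-1)**(i + j).
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            return (-1) ** (i + j) * np.linalg.det(minor)

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        print(cofactor(A, 0, 0))   # removes row 0 and column 0, leaving det([[4.]]) = 4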

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
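
    A bare-bones version of the iteration might look like the following sketch (illustrative only; the tolerance, iteration cap, and 2-by-2 test matrix are arbitrary choices, not the textbook's):

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
            """Solve A x = b for a symmetric positive definite matrix A."""
            x = np.zeros_like(b)
            r = b - A @ x              # residual
            p = r.copy()               # search direction
            for _ in range(max_iter):
                alpha = (r @ r) / (p @ (A @ p))
                x = x + alpha * p
                r_new = r - alpha * (A @ p)
                if np.linalg.norm(r_new) < tol:
                    break
                beta = (r_new @ r_new) / (r @ r)
                p = r_new + beta * p
                r = r_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))    # agrees with np.linalg.solve(A, b)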

  • Determinant |A| = det(A).

    Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^T| = |A|.

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^(-1) A S = Λ = eigenvalue matrix.
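
    A small numerical check (an illustrative NumPy sketch with an arbitrary 2-by-2 matrix, not from the book): the eigenvectors go into the columns of S, and S^(-1) A S comes out diagonal.

        import numpy as np

        A = np.array([[2.0, 1.0], [0.0, 3.0]])    # two different eigenvalues, so diagonalizable
        eigvals, S = np.linalg.eig(A)             # columns of S are eigenvectors
        Lambda = np.linalg.inv(S) @ A @ S         # the eigenvalue matrix, up to roundoff
        print(np.round(Lambda, 10))
        print(eigvals)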

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.

  • Left inverse A+.

    If A has full column rank n, then A+ = (A^T A)^(-1) A^T has A+ A = In.
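
    A quick check of the formula (an illustrative NumPy sketch; the 3-by-2 matrix is arbitrary):

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [1.0, 2.0]])               # full column rank n = 2
        A_plus = np.linalg.inv(A.T @ A) @ A.T    # left inverse (A^T A)^(-1) A^T
        print(np.round(A_plus @ A, 10))          # the 2-by-2 identity, as claimed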

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A) · (column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
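
    All four descriptions can be compared directly; the NumPy sketch below (with made-up 2-by-2 matrices) computes the same product each way:

        import numpy as np

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        B = np.array([[5.0, 6.0], [7.0, 8.0]])

        entry_01 = A[0, :] @ B[:, 1]                                   # (row 0 of A) . (column 1 of B)
        by_columns = np.column_stack([A @ B[:, j] for j in range(2)])  # column j of AB = A times column j of B
        by_rows = np.vstack([A[i, :] @ B for i in range(2)])           # row i of A multiplies B
        cols_times_rows = sum(np.outer(A[:, k], B[k, :]) for k in range(2))

        print(A @ B)                                                   # [[19. 22.] [43. 50.]]
        print(entry_01, by_columns, by_rows, cols_times_rows, sep="\n")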

  • Multiplicities AM and GM.

    The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A - λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
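
    A numerical illustration (the matrix is made up): a 2-by-2 matrix with a repeated eigenvalue and only one independent eigenvector has AM = 2 but GM = 1.

        import numpy as np

        A = np.array([[5.0, 1.0],
                      [0.0, 5.0]])                # eigenvalue 5 appears twice: AM = 2

        print(np.linalg.eigvals(A))               # [5. 5.]

        # GM = dimension of the nullspace of A - 5I; that matrix has rank 1, so GM = 2 - 1 = 1.
        GM = 2 - np.linalg.matrix_rank(A - 5.0 * np.eye(2))
        print(GM)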

  • Nilpotent matrix N.

    Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.

  • Orthogonal matrix Q.

    Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves lengths and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.

  • Orthonormal vectors q1, ..., qn.

    Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
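
    The expansion v = Σ (v^T qj) qj can be verified numerically; here is a small sketch with an arbitrary orthonormal basis of R^2 (the angle and the vector v are made up):

        import numpy as np

        theta = 0.3                                       # any rotation angle works
        q1 = np.array([np.cos(theta), np.sin(theta)])
        q2 = np.array([-np.sin(theta), np.cos(theta)])

        Q = np.column_stack([q1, q2])
        print(np.round(Q.T @ Q, 10))                      # identity: the columns are orthonormal

        v = np.array([2.0, -1.0])
        print((v @ q1) * q1 + (v @ q2) * q2)              # reconstructs v exactly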

  • Partial pivoting.

    In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| ≤ 1. See condition number.
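
    As a sketch of the pivot choice (hypothetical code, not the textbook's algorithm; lu_partial_pivoting is an invented name), each step swaps up the row with the largest entry in the current column before eliminating:

        import numpy as np

        def lu_partial_pivoting(A):
            """Elimination with row swaps; returns the resulting upper triangular U."""
            U = A.astype(float).copy()
            n = U.shape[0]
            for k in range(n - 1):
                # Choose the largest available pivot in column k (rows k..n-1) and swap it up.
                p = k + np.argmax(np.abs(U[k:, k]))
                U[[k, p]] = U[[p, k]]
                for i in range(k + 1, n):
                    m = U[i, k] / U[k, k]        # multiplier, never larger than 1 in size
                    U[i, k:] -= m * U[k, k:]
            return U

        print(lu_partial_pivoting(np.array([[1.0, 2.0], [3.0, 4.0]])))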

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Transpose matrix AT.

    Entries (A^T)ij = Aji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^(-1) are B^T A^T and (A^T)^(-1).

  • Vandermonde matrix V.

    Vc = b gives the coefficients of p(x) = c0 + c1 x + ... + c(n-1) x^(n-1) with p(xi) = bi. Vij = (xi)^(j-1) and det V = product of (xk - xi) for k > i.
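
    A short NumPy illustration (the interpolation points and values are chosen arbitrarily): solving Vc = b recovers the coefficients of the interpolating polynomial.

        import numpy as np

        x = np.array([0.0, 1.0, 2.0])          # interpolation points x_i
        b = np.array([1.0, 3.0, 7.0])          # prescribed values p(x_i) = b_i

        V = np.vander(x, increasing=True)       # columns are 1, x, x^2
        c = np.linalg.solve(V, b)               # coefficients c_0, c_1, c_2
        print(c)                                # here p(x) = 1 + x + x^2
        print(V @ c)                            # reproduces b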

  • Vector space V.

    Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.

  • Wavelets wjk(t).

    Stretch and shift the time axis to create wjk(t) = w00(2^j t - k).
