
Solutions for Chapter 9: Quadratic and Exponential Functions

Algebra 1, Student Edition (MERRILL ALGEBRA 1) | 1st Edition | ISBN: 9780078738227 | Authors: Berchie Holliday, Gilbert J. Cuevas, Beatrice Luchin, Ruth M. Casey, Linda M. Hayek, John A. Carter, Daniel Marks, Roger Day, & 2 more

Full solutions for Algebra 1, Student Edition (MERRILL ALGEBRA 1) | 1st Edition

ISBN: 9780078738227

Textbook: Algebra 1, Student Edition (MERRILL ALGEBRA 1)
Edition: 1
Author: Berchie Holliday, Gilbert J. Cuevas, Beatrice Luchin, Ruth M. Casey, Linda M. Hayek, John A. Carter, Daniel Marks, Roger Day, & 2 more
ISBN: 9780078738227

Chapter 9: Quadratic and Exponential Functions includes 51 full step-by-step solutions. This expansive textbook survival guide covers the following chapters and their solutions. Algebra 1, Student Edition (MERRILL ALGEBRA 1) is associated with the ISBN 9780078738227. This textbook survival guide was created for Algebra 1, Student Edition (MERRILL ALGEBRA 1), edition 1. Since the 51 problems in Chapter 9: Quadratic and Exponential Functions have been answered, more than 37145 students have viewed full step-by-step solutions from this chapter.

Key Math Terms and definitions covered in this textbook
  • Associative Law (AB)C = A(BC).

    Parentheses can be removed to leave ABC.

  • Cayley-Hamilton Theorem.

    p(λ) = det(A - λI) has p(A) = zero matrix.
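
A quick numerical check of this identity for a 2×2 matrix, sketched in Python (numpy assumed; the matrix is an arbitrary example, not from the text):

```python
import numpy as np

# For a 2x2 matrix, p(lambda) = lambda^2 - trace(A) lambda + det(A),
# so Cayley-Hamilton says A^2 - trace(A) A + det(A) I = zero matrix.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
p_of_A = A @ A - np.trace(A) * A + np.linalg.det(A) * np.eye(2)
print(np.allclose(p_of_A, np.zeros((2, 2))))  # True
```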

  • Change of basis matrix M.

    The old basis vectors v_j are combinations Σ_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
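
The relation d = M c can be verified numerically; a small sketch in Python (numpy assumed, with arbitrary example bases):

```python
import numpy as np

# Columns of W are the new basis vectors w_i. The rule v_j = sum_i m_ij w_i
# means V = W @ M column by column, so the same vector written as
# V @ c (v-coordinates c) equals W @ d (w-coordinates d) with d = M @ c.
W = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # new basis w1, w2 as columns
M = np.array([[2.0, 1.0],
              [1.0, 1.0]])        # change of basis matrix
V = W @ M                         # old basis v1, v2 as columns
c = np.array([3.0, -1.0])         # coordinates in the v-basis
d = M @ c                         # coordinates in the w-basis
print(np.allclose(V @ c, W @ d))  # True: same vector in both bases
```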

  • Commuting matrices AB = BA.

    If diagonalizable, they share n eigenvectors.

  • Complete solution x = x_p + x_n to Ax = b.

    (Particular x_p) + (x_n in nullspace).
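
A sketch of this split in Python (numpy assumed; the singular matrix below is an illustrative example): a particular solution from least squares plus any nullspace vector still solves Ax = b.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])                    # rank 1, so Ax = 0 has nonzero solutions
b = np.array([3.0, 6.0])                      # b lies in the column space of A
x_p, *_ = np.linalg.lstsq(A, b, rcond=None)   # a particular solution x_p
_, s, Vt = np.linalg.svd(A)
x_n = Vt[-1]                                  # nullspace direction (singular value 0)
x = x_p + 2.5 * x_n                           # any multiple of x_n may be added
print(np.allclose(A @ x, b))                  # True
```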

  • Diagonal matrix D.

    d_ij = 0 if i ≠ j. Block-diagonal: zero outside square blocks D_ii.

  • Diagonalization

    Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. Then A^k = S Λ^k S^-1 for all k.
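
The power formula A^k = S Λ^k S^-1 is easy to check numerically; a sketch in Python (numpy assumed, arbitrary example matrix):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])              # eigenvalues 5 and 2: diagonalizable
eigvals, S = np.linalg.eig(A)           # Lambda entries and eigenvector matrix S
Lam = np.diag(eigvals)
A5 = S @ np.linalg.matrix_power(Lam, 5) @ np.linalg.inv(S)
print(np.allclose(A5, np.linalg.matrix_power(A, 5)))  # True
```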

  • Ellipse (or ellipsoid) x^T Ax = 1.

    A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ||x|| = 1 the vectors y = Ax lie on the ellipse ||A^-1 y||^2 = y^T (A A^T)^-1 y = 1 displayed by eigshow; the axis lengths are the singular values σ_i.)

  • Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).

    Use A^H in place of A^T for complex A.

  • Gram-Schmidt orthogonalization A = QR.

    Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
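
A sketch of the factorization in Python (numpy assumed; note that numpy's QR does not enforce the diag(R) > 0 convention):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])               # independent columns
Q, R = np.linalg.qr(A)                   # reduced QR: Q is 3x2, R is 2x2
print(np.allclose(Q.T @ Q, np.eye(2)))   # True: orthonormal columns of Q
print(np.allclose(Q @ R, A))             # True: A = QR
print(np.allclose(R, np.triu(R)))        # True: R is upper triangular
```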

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.

  • Inverse matrix A^-1.

    Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 (equivalently rank(A) < n, equivalently Ax = 0 for some nonzero vector x). The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
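
For a 2×2 matrix the cofactor formula reduces to the familiar [[d, -b], [-c, a]] / (ad - bc); a sketch in Python (numpy assumed, arbitrary example matrix):

```python
import numpy as np

# Cofactor formula (A^-1)_ij = C_ji / det A, specialized to 2x2.
A = np.array([[3.0, 1.0],
              [4.0, 2.0]])
a, b = A[0]
c, d = A[1]
det = a * d - b * c                    # det A = 2, so A is invertible
A_inv = np.array([[d, -b],
                  [-c, a]]) / det
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
print(np.allclose(A @ A_inv, np.eye(2)))     # True
```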

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av; differentiation and integration in function space.

  • Minimal polynomial of A.

    The lowest-degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).

  • Network.

    A directed graph that has constants c_1, ..., c_m associated with the edges.

  • Norm ||A||.

    The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x||, ||AB|| ≤ ||A|| ||B||, and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm: ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are the largest column sum and row sum of |a_ij|.
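
These four norms can be compared on a small example; a sketch in Python (numpy assumed, arbitrary example matrix):

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])
l2   = np.linalg.norm(A, 2)        # sigma_max, the largest singular value
fro  = np.linalg.norm(A, 'fro')    # sqrt(sum of squares of all entries)
l1   = np.linalg.norm(A, 1)        # largest column sum of |a_ij|
linf = np.linalg.norm(A, np.inf)   # largest row sum of |a_ij|
print(np.isclose(fro, np.sqrt(1 + 4 + 9 + 16)))  # True
print(np.isclose(l1, 6.0))                       # True: |-2| + |4|
print(np.isclose(linf, 7.0))                     # True: |3| + |4|
print(l2 <= fro)                                 # True: sigma_max <= Frobenius
```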

  • Rotation matrix

    R = [c -s; s c] rotates the plane by θ, and R^-1 = R^T rotates back by -θ. Eigenvalues are e^{iθ} and e^{-iθ}, eigenvectors are (1, ±i). Here c = cos θ, s = sin θ.
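
A sketch of these properties in Python (numpy assumed; θ = π/3 is an arbitrary example angle):

```python
import numpy as np

theta = np.pi / 3
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c, -s],
              [s,  c]])
print(np.allclose(R.T @ R, np.eye(2)))        # True: R^-1 = R^T
print(np.allclose(R @ [1.0, 0.0], [c, s]))    # True: (1,0) rotates by theta
eigvals = np.linalg.eigvals(R)
print(np.allclose(eigvals.real, [c, c]))            # real parts cos(theta)
print(np.allclose(np.sort(eigvals.imag), [-s, s]))  # imag parts +/- sin(theta)
```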

  • Row picture of Ax = b.

    Each equation gives a plane in R^n; the planes intersect at the solution x.

  • Stiffness matrix

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C has the spring constants from Hooke's Law and Ax gives the stretching.

  • Symmetric matrix A.

    The transpose is A^T = A, and a_ij = a_ji. A^-1 is also symmetric.
