Solutions for Chapter 11: Analytic Geometry

Full solutions for Algebra and Trigonometry | 8th Edition

Textbook: Algebra and Trigonometry
Edition: 8
Author: Michael Sullivan
ISBN: 9780132329033

This textbook survival guide was created for Algebra and Trigonometry, 8th edition, by Michael Sullivan (ISBN 9780132329033). Chapter 11: Analytic Geometry includes 494 full step-by-step solutions, and more than 87,260 students have viewed solutions from this chapter. This expansive survival guide covers the following chapters and their solutions.

Key Math Terms and definitions covered in this textbook
  • Back substitution.

    Upper triangular systems are solved in reverse order, x_n back to x_1 (see the back-substitution sketch after this glossary).

  • Big formula for n by n determinants.

    det(A) is a sum of n! terms, one for each permutation P of the columns. Each term multiplies one entry from every row and column of A: rows in order 1, ..., n and columns in the order given by P. Each of the n! permutations carries a + or - sign (a brute-force sketch appears after this glossary).

  • Block matrix.

    A matrix can be partitioned into matrix blocks by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

  • Change of basis matrix M.

    The old basis vectors v_j are combinations sum_i m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)

  • Complex conjugate

    z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.

  • Elimination.

    A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓ_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E (an LU sketch appears after this glossary).

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Identity matrix I (or I_n).

    Diagonal entries = 1, off-diagonal entries = 0.

  • Indefinite matrix.

    A symmetric matrix with eigenvalues of both signs (+ and - ).

  • Left inverse A+.

    If A has full column rank n, then A+ = (A^T A)^-1 A^T has A+ A = I_n.

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Linearly dependent v_1, ..., v_n.

    A combination other than all c_i = 0 gives sum_i c_i v_i = 0.

  • Matrix multiplication AB.

    The i, j entry of AB is (row i of A)·(column j of B) = sum_k a_ik b_kj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum over k of (column k of A)(row k of B). All these equivalent definitions come from the rule that (AB)x equals A(Bx) (the four views are compared in a sketch after this glossary).

  • Normal equation A^T A x = A^T b.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0 (see the least-squares sketch after this glossary).

  • Partial pivoting.

    In each column, choose the largest available pivot to control roundoff; all multipliers then have |ℓ_ij| ≤ 1. See condition number.

  • Projection matrix P onto subspace S.

    The projection p = Pb is the closest point to b in S, and the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A form a basis for S, then P = A (A^T A)^-1 A^T (see the projection sketch after this glossary).

  • Skew-symmetric matrix K.

    The transpose is -K, since K_ij = -K_ji. Eigenvalues are purely imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.

  • Stiffness matrix K.

    If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A, where C holds the spring constants from Hooke's law and Ax gives the stretching.

  • Symmetric factorizations A = LDL^T and A = QΛQ^T.

    Signs in Λ = signs in D.
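
A few worked sketches in Python

The short Python sketches below illustrate some of the terms above. They are illustrative examples written for this guide (using NumPy, and SciPy where noted), not code from the textbook, and all matrices and data are made up for demonstration. First, back substitution on an upper triangular system, solving from x_n back to x_1:

    import numpy as np

    def back_substitution(U, b):
        # Solve Ux = b for upper triangular U, starting from the last unknown.
        n = len(b)
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            # subtract the already-known components, then divide by the pivot U[i, i]
            x[i] = (b[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
        return x

    U = np.array([[2.0, 1.0, 1.0],
                  [0.0, 3.0, 2.0],
                  [0.0, 0.0, 4.0]])
    b = np.array([6.0, 10.0, 8.0])
    print(back_substitution(U, b))   # [1. 2. 2.]
    print(np.linalg.solve(U, b))     # same answer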
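
A brute-force version of the big formula for det(A): one signed product for each of the n! permutations, with the sign found by counting inversions. This is only for illustration (it costs O(n!)):

    from itertools import permutations
    import numpy as np

    def det_big_formula(A):
        # det(A) as a sum of n! signed products, one per column permutation.
        A = np.asarray(A, dtype=float)
        n = A.shape[0]
        total = 0.0
        for perm in permutations(range(n)):
            # sign = (-1)^(number of inversions in the permutation)
            inversions = sum(perm[i] > perm[j] for i in range(n) for j in range(i + 1, n))
            sign = -1.0 if inversions % 2 else 1.0
            # one entry from each row i and column perm[i]
            total += sign * np.prod([A[i, perm[i]] for i in range(n)])
        return total

    A = np.array([[2.0, 1.0], [5.0, 3.0]])
    print(det_big_formula(A), np.linalg.det(A))   # both are 1.0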
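
A quick look at elimination as a factorization, using SciPy's LU routine (assumed available; SciPy returns the factors in the convention A = P L U, so P^T A = L U matches the PA = LU form in the glossary):

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[ 2.0,  1.0, 1.0],
                  [ 4.0, -6.0, 0.0],
                  [-2.0,  7.0, 2.0]])

    P, L, U = lu(A)                      # SciPy convention: A = P @ L @ U
    print(np.allclose(P @ L @ U, A))     # True
    print(L)                             # multipliers below the unit diagonal
    print(U)                             # upper triangular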
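
The four equivalent views of matrix multiplication AB, checked against NumPy's own product on a small made-up example:

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.array([[5.0, 6.0], [7.0, 8.0]])

    # Entry by entry: (AB)_ij = (row i of A) . (column j of B)
    entry = np.array([[A[i, :] @ B[:, j] for j in range(2)] for i in range(2)])

    # By columns: column j of AB = A times column j of B
    by_cols = np.column_stack([A @ B[:, j] for j in range(2)])

    # Columns times rows: AB = sum over k of (column k of A)(row k of B)
    outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(2))

    print(np.allclose(entry, A @ B),
          np.allclose(by_cols, A @ B),
          np.allclose(outer_sum, A @ B))   # True True True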
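
The normal equation A^T A x = A^T b for least squares, on a small line-fitting example (the data points are invented for illustration). The residual b - Ax comes out perpendicular to the columns of A:

    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 3.0])
    b = np.array([1.0, 2.1, 2.9, 4.2])
    A = np.column_stack([np.ones_like(t), t])   # fit b ~ c0 + c1*t; full column rank

    x = np.linalg.solve(A.T @ A, A.T @ b)       # normal equation
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

    print(x, x_lstsq)            # same least-squares solution both ways
    print(A.T @ (b - A @ x))     # ~[0, 0]: residual is perpendicular to the columns of A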
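
Finally, the projection matrix P = A (A^T A)^-1 A^T onto the column space of A, with its key properties checked numerically (the basis below is an arbitrary plane in R^3 chosen for the example):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 2.0]])                  # columns = basis for the subspace S
    P = A @ np.linalg.inv(A.T @ A) @ A.T        # P = A (A^T A)^-1 A^T

    b = np.array([3.0, 1.0, 2.0])
    p = P @ b                                   # closest point to b in S
    e = b - p                                   # error, perpendicular to S

    print(np.allclose(P @ P, P), np.allclose(P.T, P))   # P^2 = P = P^T
    print(A.T @ e)                                       # ~[0, 0]
    print(np.round(np.linalg.eigvalsh(P), 6))            # eigenvalues 0, 1, 1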