Solutions for Chapter 8.2: Algebra and Trigonometry 9th Edition

Textbook: Algebra and Trigonometry
Edition: 9
Author: Michael Sullivan
ISBN: 9780321716569

Chapter 8.2 includes 86 full step-by-step solutions, and more than 57,688 students have viewed solutions from this chapter. This textbook survival guide was created for Algebra and Trigonometry, 9th edition (ISBN 9780321716569), written by Michael Sullivan, and covers every chapter of the textbook and its solutions.

Key Math Terms and definitions covered in this textbook
  • Basis for V.

    Independent vectors v_1, ..., v_d whose linear combinations give each vector in V as v = c_1 v_1 + ... + c_d v_d. V has many bases; each basis gives unique c's.
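
    For example, the coordinates c come from solving a linear system whose columns are the basis vectors. A minimal sketch in Python with NumPy; the basis B and vector v below are made-up examples:

        import numpy as np

        # Columns of B form a basis for R^2 (hypothetical example).
        B = np.array([[1.0, 1.0],
                      [0.0, 1.0]])
        v = np.array([3.0, 2.0])

        # Unique coordinates c with B @ c = v, i.e. v = c_1 b_1 + c_2 b_2.
        c = np.linalg.solve(B, v)
        print(c)  # [1. 2.]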

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
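
    A minimal sketch of the method in Python with NumPy, assuming A is symmetric positive definite; the helper name conjugate_gradient and the 2x2 test system are made-up examples:

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
            """Solve Ax = b for symmetric positive definite A."""
            n = len(b)
            max_iter = max_iter or n
            x = np.zeros(n)
            r = b - A @ x          # residual b - Ax
            p = r.copy()           # first search direction
            rs = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs / (p @ Ap)      # step length along p
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs) * p  # next A-conjugate direction
                rs = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))  # matches np.linalg.solve(A, b)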

  • Diagonalization

    Λ = S^(-1) A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^(-1).
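
    A quick numerical check in Python with NumPy (the 2x2 matrix below is a made-up example with distinct eigenvalues 5 and 2):

        import numpy as np

        A = np.array([[4.0, 1.0],
                      [2.0, 3.0]])
        eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors
        Lam = np.diag(eigvals)          # Λ = eigenvalue matrix

        # A = S Λ S^(-1), and powers follow: A^3 = S Λ^3 S^(-1).
        print(np.allclose(S @ Lam @ np.linalg.inv(S), A))   # True
        print(np.allclose(S @ np.diag(eigvals**3) @ np.linalg.inv(S),
                          np.linalg.matrix_power(A, 3)))    # True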

  • Dimension of vector space

    dim(V) = number of vectors in any basis for V.
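
    As an illustration, the dimension of a matrix's column space equals its rank, which counts the vectors in any basis for that space. A quick check in Python with NumPy; the matrix is a made-up example:

        import numpy as np

        # Third column is the sum of the first two, so the column
        # space has dimension 2, not 3.
        A = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0],
                      [1.0, 1.0, 2.0]])
        print(np.linalg.matrix_rank(A))  # 2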

  • Distributive Law

    A(B + C) = AB + AC. Add then multiply, or multiply then add.

  • Factorization

    A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓ_ij (and ℓ_ii = 1) brings U back to A.
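
    A quick check with SciPy's LU routine; the matrix is a made-up example, and SciPy returns A = PLU where P records any row exchanges:

        import numpy as np
        from scipy.linalg import lu

        A = np.array([[2.0, 1.0],
                      [1.0, 2.0]])
        P, L, U = lu(A)
        print(P)   # identity here: no row exchanges were needed
        print(L)   # unit lower triangular, multiplier l_21 = 0.5
        print(U)   # upper triangular
        print(np.allclose(P @ L @ U, A))  # True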

  • Fundamental Theorem.

    The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicularity comes from Ax = 0), with dimensions n - r and r. Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
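
    The orthogonality is easy to verify numerically. A sketch in Python with SciPy; the rank-1 matrix is a made-up example:

        import numpy as np
        from scipy.linalg import null_space

        A = np.array([[1.0, 2.0, 3.0],
                      [2.0, 4.0, 6.0]])   # rank 1, so dim N(A) = 3 - 1 = 2
        N = null_space(A)                 # orthonormal basis for the nullspace
        print(N.shape)                    # (3, 2)
        print(np.allclose(A @ N, 0))      # True: rows of A ⊥ nullspace vectors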

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j.
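
    A minimal sketch in Python with NumPy that builds the incidence matrix for a small made-up directed graph:

        import numpy as np

        # Hypothetical directed graph on 4 nodes with 5 edges (i -> j).
        edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
        m, n = len(edges), 4

        A = np.zeros((m, n))
        for row, (i, j) in enumerate(edges):
            A[row, i] = -1   # edge leaves node i
            A[row, j] = +1   # edge enters node j
        print(A)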

  • Inverse matrix A^(-1).

    Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))_ij = C_ji / det A.
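
    A quick check in Python with NumPy; the 2x2 matrix is a made-up example with det A = 1, so the cofactor formula is easy to see:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [5.0, 3.0]])       # det A = 1
        Ainv = np.linalg.inv(A)
        print(np.allclose(A @ Ainv, np.eye(2)))   # True
        # Cofactor formula (A^-1)_ij = C_ji / det A gives this inverse:
        print(np.allclose(Ainv, np.array([[3.0, -1.0],
                                          [-5.0, 2.0]])))  # True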

  • Jordan form J = M^(-1) A M.

    If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k, where N_k has 1's on diagonal 1 (the superdiagonal). Each block has one eigenvalue λ_k and one eigenvector.
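
    SymPy can compute Jordan forms exactly. A minimal sketch; the matrix is a made-up example with a repeated eigenvalue and only one eigenvector:

        from sympy import Matrix

        A = Matrix([[2, 1],
                    [0, 2]])
        M, J = A.jordan_form()   # SymPy returns A = M * J * M**(-1)
        print(J)  # Matrix([[2, 1], [0, 2]]): one block, a 1 on diagonal 1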

  • Krylov subspace K_j(A, b).

    The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^(-1) b by x_j with residual b - A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
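
    A minimal sketch in Python with NumPy; the helper name krylov_basis and the test matrix are made-up examples. Each step multiplies by A once, then QR orthonormalizes the stacked Krylov vectors:

        import numpy as np

        def krylov_basis(A, b, j):
            """Orthonormal basis for K_j(A, b) = span{b, Ab, ..., A^(j-1) b}."""
            cols = [b]
            for _ in range(j - 1):
                cols.append(A @ cols[-1])   # one multiplication by A per step
            K = np.column_stack(cols)
            Q, _ = np.linalg.qr(K)          # orthonormalize the Krylov vectors
            return Q

        A = np.array([[2.0, 1.0], [1.0, 3.0]])
        b = np.array([1.0, 0.0])
        print(krylov_basis(A, b, 2))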

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.
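
    A quick numerical check of linearity for T(v) = Av, in Python with NumPy (the random matrix, vectors, and scalars are made-up examples):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.standard_normal((3, 3))
        v, w = rng.standard_normal(3), rng.standard_normal(3)
        c, d = 2.0, -1.5
        # Linearity: T(cv + dw) = c T(v) + d T(w).
        print(np.allclose(A @ (c*v + d*w), c*(A @ v) + d*(A @ w)))  # True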

  • Markov matrix M.

    All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady-state eigenvector Ms = s > 0.
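
    A quick check in Python with NumPy; the 2x2 Markov matrix is a made-up example with all entries positive:

        import numpy as np

        M = np.array([[0.8, 0.3],
                      [0.2, 0.7]])        # columns sum to 1, all m_ij > 0
        Mk = np.linalg.matrix_power(M, 50)
        print(Mk)                         # both columns ≈ steady state (0.6, 0.4)
        s = Mk[:, 0]
        print(np.allclose(M @ s, s))      # True: Ms = s, eigenvalue λ = 1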

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
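
    SciPy computes this directly. A minimal sketch; the matrix is a made-up example:

        import numpy as np
        from scipy.linalg import polar

        A = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        Q, H = polar(A)                          # A = QH
        print(np.allclose(Q @ Q.T, np.eye(2)))   # True: Q orthogonal
        print(np.allclose(H, H.T))               # True: H symmetric (and PSD)
        print(np.allclose(Q @ H, A))             # True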

  • Projection matrix P onto subspace S.

    Projection p = Pb is the closest point to b in S; the error e = b - Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S^⊥. If the columns of A are a basis for S, then P = A (A^T A)^(-1) A^T.
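
    A quick check of the formula and properties in Python with NumPy; the basis matrix A and vector b are made-up examples:

        import numpy as np

        A = np.array([[1.0, 0.0],
                      [1.0, 1.0],
                      [0.0, 1.0]])            # columns = basis for a plane S in R^3
        P = A @ np.linalg.inv(A.T @ A) @ A.T  # P = A (A^T A)^(-1) A^T
        b = np.array([1.0, 2.0, 3.0])
        p = P @ b                             # closest point to b in S
        print(np.allclose(P @ P, P), np.allclose(P, P.T))  # True True
        print(np.allclose(A.T @ (b - p), 0))  # error is perpendicular to S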

  • Right inverse A^+.

    If A has full row rank m, then A^+ = A^T (A A^T)^(-1) has A A^+ = I_m.
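
    A quick check in Python with NumPy; the 2x3 matrix is a made-up example with full row rank:

        import numpy as np

        A = np.array([[1.0, 0.0, 1.0],
                      [0.0, 1.0, 1.0]])         # full row rank, m = 2
        A_plus = A.T @ np.linalg.inv(A @ A.T)   # A^+ = A^T (A A^T)^(-1)
        print(np.allclose(A @ A_plus, np.eye(2)))  # True: A A^+ = I_m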

  • Spanning set.

    Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!

  • Sum V + W of subspaces.

    Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.

  • Symmetric matrix A.

    The transpose is A^T = A, and a_ij = a_ji. A^(-1) is also symmetric.

  • Transpose matrix A^T.

    Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^(-1) are B^T A^T and (A^T)^(-1).
