
Solutions for Chapter 1.3: Trigonometric Functions

Textbook: Trigonometry
Edition: 10
Author: Margaret L. Lial, John Hornsby, David I. Schneider, Callie Daniels
ISBN: 9780321671776

This expansive textbook survival guide covers the following chapters and their solutions. Chapter 1.3: Trigonometric Functions includes 112 full step-by-step solutions, and more than 32925 students have viewed solutions from this chapter. The guide was created for Trigonometry, 10th edition, by Margaret L. Lial, John Hornsby, David I. Schneider, and Callie Daniels (ISBN 9780321671776).

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
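
    A minimal sketch in Python/NumPy, assuming a hypothetical 3-node undirected graph with edges 0-1 and 1-2, of how the adjacency matrix records edges:

      import numpy as np

      # Hypothetical undirected graph on nodes 0, 1, 2 with edges 0-1 and 1-2.
      edges = [(0, 1), (1, 2)]
      A = np.zeros((3, 3), dtype=int)
      for i, j in edges:
          A[i, j] = 1                    # edge from node i to node j
          A[j, i] = 1                    # undirected, so the reverse edge too
      print(A)                           # a_ij = 1 exactly where an edge exists
      print(np.array_equal(A, A.T))      # True: undirected graph gives A = A^T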

  • Affine transformation

    Tv = Av + v_0 = linear transformation plus shift.
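
    A minimal NumPy sketch; the matrix A and shift v_0 below are arbitrary choices for illustration:

      import numpy as np

      A = np.array([[2.0, 0.0],
                    [0.0, 3.0]])      # arbitrary linear part
      v0 = np.array([1.0, -1.0])      # arbitrary shift

      def T(v):
          # Affine transformation: the linear map Av plus the shift v0.
          return A @ v + v0

      print(T(np.array([1.0, 1.0])))  # [3. 2.]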

  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
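
    The rank test can be checked numerically; the system below is a made-up example in which b does lie in the column space of A:

      import numpy as np

      A = np.array([[1.0, 2.0],
                    [2.0, 4.0]])           # rank 1 (second row is twice the first)
      b = np.array([[3.0], [6.0]])         # lies in the column space of A
      Ab = np.hstack([A, b])               # augmented matrix [A b]

      # Ax = b is solvable exactly when rank([A b]) equals rank(A).
      print(np.linalg.matrix_rank(A), np.linalg.matrix_rank(Ab))   # 1 1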

  • Basis for V.

    Independent vectors v_1, ..., v_d whose linear combinations give each vector in V as v = c_1 v_1 + ... + c_d v_d. A vector space has many bases; each basis gives unique coefficients c.
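
    A quick NumPy check with an arbitrary basis of R^2: the coefficients are found (uniquely) by solving a linear system.

      import numpy as np

      v1 = np.array([1.0, 1.0])
      v2 = np.array([1.0, -1.0])       # v1, v2: an arbitrary basis of R^2
      V = np.column_stack([v1, v2])

      v = np.array([3.0, 1.0])         # any vector in R^2
      c = np.linalg.solve(V, v)        # unique coefficients c_1, c_2
      print(c)                         # [2. 1.], since v = 2*v1 + 1*v2
      print(c[0] * v1 + c[1] * v2)     # reproduces v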

  • Change of basis matrix M.

    The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = Mc. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
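
    A sketch with two arbitrary bases of R^2, checking that d = Mc gives the same vector in either coordinate system:

      import numpy as np

      W = np.array([[1.0, 1.0],
                    [0.0, 1.0]])       # new basis w_1, w_2 as columns (arbitrary)
      M = np.array([[2.0, 1.0],
                    [1.0, 1.0]])       # change of basis matrix (arbitrary)
      V = W @ M                        # old basis: v_j = sum_i m_ij w_i

      c = np.array([1.0, 3.0])         # coordinates in the old basis
      d = M @ c                        # coordinates in the new basis: d = Mc
      print(np.allclose(V @ c, W @ d)) # True: both describe the same vector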

  • Diagonalizable matrix A.

    Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
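
    A numerical check with an arbitrary matrix whose two eigenvalues are different (so diagonalization is automatic):

      import numpy as np

      A = np.array([[4.0, 1.0],
                    [0.0, 2.0]])           # distinct eigenvalues 4 and 2
      eigvals, S = np.linalg.eig(A)        # eigenvectors sit in the columns of S
      Lambda = np.linalg.inv(S) @ A @ S    # S^-1 A S
      print(np.round(Lambda, 10))          # the diagonal eigenvalue matrix Λ
      print(eigvals)                       # [4. 2.]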

  • Distributive Law

    A(B + C) = AB + AC. Add then multiply, or multiply then add.
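
    A quick numerical confirmation with random matrices (NumPy):

      import numpy as np

      rng = np.random.default_rng(0)
      A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
      # "Add then multiply" agrees with "multiply then add".
      print(np.allclose(A @ (B + C), A @ B + A @ C))   # True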

  • Hermitian matrix A^H = Ā^T = A.

    Complex analog a_ji = ā_ij of a symmetric matrix.
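
    A small complex example (the matrix below is just one Hermitian instance chosen for illustration):

      import numpy as np

      A = np.array([[2.0 + 0j, 1.0 - 1j],
                    [1.0 + 1j, 3.0 + 0j]])   # a_ji is the conjugate of a_ij
      print(np.allclose(A, A.conj().T))      # True: A^H = A
      print(np.linalg.eigvalsh(A))           # real eigenvalues, as expected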

  • Hypercube matrix pl.

    Row n + 1 counts corners, edges, faces, ... of a cube in R^n.

  • Inverse matrix A^-1.

    Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
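
    A 2-by-2 check of the cofactor formula against NumPy's built-in inverse; the matrix is arbitrary but invertible:

      import numpy as np

      A = np.array([[4.0, 7.0],
                    [2.0, 6.0]])                # det A = 10, so A is invertible
      Ainv = np.linalg.inv(A)
      print(np.allclose(A @ Ainv, np.eye(2)))   # True: A A^-1 = I

      # Cofactor formula for 2x2: (A^-1)_ij = C_ji / det A
      C = np.array([[ 6.0, -2.0],
                    [-7.0,  4.0]])              # cofactor matrix of A
      print(np.allclose(Ainv, C.T / np.linalg.det(A)))   # True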

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
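
    A short NumPy sketch with made-up data, solving the normal equations and checking the orthogonality of the error:

      import numpy as np

      A = np.array([[1.0, 0.0],
                    [1.0, 1.0],
                    [1.0, 2.0]])                 # more equations than unknowns
      b = np.array([1.0, 2.0, 2.0])              # made-up right-hand side

      x_hat = np.linalg.solve(A.T @ A, A.T @ b)  # normal equations A^T A x̂ = A^T b
      e = b - A @ x_hat                          # error vector
      print(x_hat)
      print(np.allclose(A.T @ e, 0))             # True: e is orthogonal to every column of A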

  • Linear combination cv + dw or Σ c_j v_j.

    Vector addition and scalar multiplication.

  • Markov matrix M.

    All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector: Ms = s > 0.
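
    A sketch with an arbitrary positive Markov matrix, raising M to a high power to reach the steady state:

      import numpy as np

      M = np.array([[0.8, 0.3],
                    [0.2, 0.7]])             # columns sum to 1, all entries positive

      print(np.linalg.eigvals(M))            # largest eigenvalue is 1

      Mk = np.linalg.matrix_power(M, 50)     # a high power of M
      s = Mk[:, 0]                           # each column approaches the steady state s
      print(s, np.allclose(M @ s, s))        # Ms = s, with s > 0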

  • Minimal polynomial of A.

    The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; always m(λ) divides p(λ).
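
    A tiny illustration with a repeated eigenvalue, where the minimal polynomial has lower degree than the characteristic polynomial (the matrix is chosen only for this purpose):

      import numpy as np

      A = 2 * np.eye(2)            # eigenvalue 2, repeated twice
      # Characteristic polynomial: p(λ) = (λ - 2)^2.
      # Minimal polynomial: m(λ) = λ - 2, since A - 2I is already the zero matrix.
      mA = A - 2 * np.eye(2)
      print(np.allclose(mA, 0))          # True: m(A) = 0
      print(np.allclose(mA @ mA, 0))     # True: p(A) = 0 as well, and m divides p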

  • Multiplication Ax

    = x_1 (column 1) + ... + x_n (column n) = combination of columns.
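
    The column picture written out in NumPy for an arbitrary 2-by-2 example:

      import numpy as np

      A = np.array([[1.0, 2.0],
                    [3.0, 4.0]])
      x = np.array([5.0, 6.0])

      by_columns = x[0] * A[:, 0] + x[1] * A[:, 1]   # x_1(column 1) + x_2(column 2)
      print(np.allclose(A @ x, by_columns))          # True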

  • Pivot columns of A.

    Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
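
    A short check using SymPy's exact row reduction; the matrix is an arbitrary rank-2 example:

      import sympy as sp

      A = sp.Matrix([[1, 2, 3],
                     [2, 4, 7],
                     [3, 6, 10]])       # column 2 is twice column 1

      rref, pivot_cols = A.rref()       # reduced row echelon form + pivot column indices
      print(pivot_cols)                 # (0, 2): the first and third columns hold pivots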

  • Projection p = a(a^T b / a^T a) onto the line through a.

    P = a a^T / a^T a has rank 1.
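
    A NumPy sketch projecting an arbitrary b onto the line through an arbitrary a:

      import numpy as np

      a = np.array([1.0, 2.0])
      b = np.array([3.0, 3.0])

      p = a * (a @ b) / (a @ a)           # p = a (a^T b / a^T a)
      P = np.outer(a, a) / (a @ a)        # projection matrix a a^T / a^T a
      print(p, np.allclose(P @ b, p))     # the same projection either way
      print(np.linalg.matrix_rank(P))     # 1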

  • Reflection matrix (Householder) Q = I - 2uu^T.

    Unit vector u is reflected to Qu = -u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
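
    A small numerical check with an arbitrary unit vector u in R^2:

      import numpy as np

      u = np.array([1.0, 1.0]) / np.sqrt(2)     # unit vector
      Q = np.eye(2) - 2 * np.outer(u, u)        # Householder reflection

      print(np.allclose(Q @ u, -u))             # True: u reflects to -u
      x = np.array([1.0, -1.0])                 # u^T x = 0, so x lies in the mirror
      print(np.allclose(Q @ x, x))              # True: Qx = x
      print(np.allclose(Q.T @ Q, np.eye(2)))    # True: Q^T = Q^-1 (= Q here)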

  • Singular Value Decomposition

    (SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
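
    A NumPy check that A v_i = σ_i u_i, using an arbitrary rectangular matrix:

      import numpy as np

      A = np.array([[3.0, 0.0],
                    [4.0, 5.0],
                    [0.0, 0.0]])                 # arbitrary 3x2 matrix of rank 2

      U, sigma, Vt = np.linalg.svd(A)            # A = U Σ V^T
      for i, s in enumerate(sigma):
          # Each right singular vector v_i maps to σ_i times u_i.
          print(np.allclose(A @ Vt[i], s * U[:, i]))   # True, True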

  • Wavelets w_jk(t).

    Stretch and shift the time axis to create w_jk(t) = w_00(2^j t - k).
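
    A sketch that takes the Haar wavelet as the mother wavelet w_00 (the Haar choice is an assumption, made only to make the formula concrete):

      import numpy as np

      def w00(t):
          # Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere.
          t = np.asarray(t, dtype=float)
          return np.where((0 <= t) & (t < 0.5), 1.0,
                          np.where((0.5 <= t) & (t < 1.0), -1.0, 0.0))

      def w(j, k, t):
          # Stretch and shift the time axis: w_jk(t) = w00(2^j t - k).
          return w00(2**j * t - k)

      t = np.linspace(0, 1, 9)
      print(w(1, 1, t))   # a copy of w00 squeezed to width 1/2 and shifted onto [1/2, 1)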
