
# Solutions for Chapter 5: Trigonometric Identities

## Full solutions for Trigonometry | 11th Edition

ISBN: 9780134217437


Since the 72 problems in Chapter 5: Trigonometric Identities have been answered, more than 9643 students have viewed full step-by-step solutions from this chapter. Chapter 5: Trigonometric Identities includes 72 full step-by-step solutions. This textbook survival guide was created for Trigonometry, 11th edition (ISBN: 9780134217437), and covers the textbook's chapters and their solutions.

## Key Math Terms and Definitions Covered in This Textbook
• Cramer's Rule for Ax = b.

B_j has b replacing column j of A; x_j = det(B_j) / det(A).
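As a quick numerical sketch (not from the textbook; the 2-by-2 system below is an invented example), Cramer's rule can be checked with NumPy:

```python
import numpy as np

def cramer(A, b):
    # Cramer's rule: x_j = det(B_j) / det(A),
    # where B_j is A with column j replaced by b.
    n = len(b)
    det_A = np.linalg.det(A)
    x = np.empty(n)
    for j in range(n):
        B_j = A.copy()
        B_j[:, j] = b
        x[j] = np.linalg.det(B_j) / det_A
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])  # made-up invertible matrix
b = np.array([5.0, 10.0])
x = cramer(A, b)
assert np.allclose(A @ x, b)            # agrees with a direct solve
```

Cramer's rule is exact but costs n+1 determinants, so elimination is preferred in practice.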

• Diagonalization

Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
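A small NumPy sketch of diagonalization (the matrix below is a made-up example with distinct real eigenvalues):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])  # example with 2 independent eigenvectors
lam, S = np.linalg.eig(A)               # eigenvalues lam, eigenvector matrix S
Lam = np.diag(lam)                      # Λ = eigenvalue matrix

# Λ = S^-1 A S
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)

# A^k = S Λ^k S^-1 (here k = 3; Λ^k is just the diagonal entries cubed)
assert np.allclose(np.linalg.matrix_power(A, 3),
                   S @ Lam**3 @ np.linalg.inv(S))
```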

• Fast Fourier Transform (FFT).

A factorization of the Fourier matrix F_n into ℓ = log2(n) matrices S_i times a permutation. Each S_i needs only n/2 multiplications, so F_n x and F_n^-1 c can be computed with nℓ/2 multiplications. Revolutionary.
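A minimal recursive Cooley-Tukey FFT sketch (an illustration of the even/odd splitting behind the factorization, not the textbook's code; it uses the same exp(-2πi jk/n) sign convention as `np.fft.fft`):

```python
import numpy as np

def fft(x):
    # Recursive FFT for n a power of 2: split into even- and
    # odd-indexed halves, then recombine with n/2 twiddle factors.
    n = len(x)
    if n == 1:
        return np.array(x, dtype=complex)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    w = np.exp(-2j * np.pi * np.arange(n // 2) / n)  # twiddle factors
    return np.concatenate([even + w * odd, even - w * odd])

x = np.array([1.0, 2.0, 3.0, 4.0, 0.0, -1.0, 2.0, 5.0])  # made-up input
assert np.allclose(fft(x), np.fft.fft(x))
```

Each of the log2(n) levels does n/2 complex multiplications, giving the (n/2) log2(n) count.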

• Hermitian matrix A^H = Ā^T = A.

Complex analog a_ji = ā_ij of a symmetric matrix.
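A quick check with a made-up 2-by-2 Hermitian example (Hermitian matrices have real eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])            # a_ji = conj(a_ij)
assert np.allclose(A, A.conj().T)        # A^H = A
assert np.allclose(np.linalg.eigvals(A).imag, 0)  # eigenvalues are real
```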

• Hessenberg matrix H.

Triangular matrix with one extra nonzero adjacent diagonal.

• Hilbert matrix hilb(n).

Entries H_ij = 1/(i + j - 1) = ∫_0^1 x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
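A small sketch of the Hilbert matrix in NumPy (the size n = 6 and the 10^6 threshold are illustrative choices, not from the text):

```python
import numpy as np

def hilb(n):
    # H[i, j] = 1 / (i + j - 1) with 1-based i, j
    i, j = np.indices((n, n))
    return 1.0 / (i + j + 1)

H = hilb(6)
eig = np.linalg.eigvalsh(H)
assert eig.min() > 0                 # positive definite: all eigenvalues > 0
assert np.linalg.cond(H) > 1e6       # but already badly conditioned at n = 6
```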

• Incidence matrix of a directed graph.

The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j .
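A tiny made-up directed graph makes the construction concrete (4 nodes, 4 edges, all names invented for illustration):

```python
import numpy as np

# Edge-node incidence matrix: one row per directed edge (i -> j),
# with -1 in column i and +1 in column j.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]  # example graph
n_nodes = 4
A = np.zeros((len(edges), n_nodes))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1
    A[row, j] = 1

# Each row sums to zero, so the all-ones vector is in the nullspace.
assert np.allclose(A @ np.ones(n_nodes), 0)
```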

• Inverse matrix A^-1.

Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
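The cofactor formula can be verified numerically on a small made-up matrix (a sketch for illustration; practical code uses elimination, not cofactors):

```python
import numpy as np

def inverse_by_cofactors(A):
    # (A^-1)_ij = C_ji / det A, where C_ij is the (i, j) cofactor of A
    n = A.shape[0]
    C = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            minor = np.delete(np.delete(A, i, axis=0), j, axis=1)
            C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return C.T / np.linalg.det(A)        # note the transpose: C_ji

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])          # det A = 8, so A is invertible
assert np.allclose(inverse_by_cofactors(A) @ A, np.eye(3))
```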

• Linear transformation T.

Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
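These properties are easy to confirm on a rotation matrix (the angle and test vector below are arbitrary choices for illustration):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # rotation by theta

assert np.allclose(Q.T @ Q, np.eye(2))           # Q^T = Q^-1

x = np.array([3.0, -4.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # length preserved
assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)        # all |λ| = 1
```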

• Partial pivoting.

In each column, choose the largest available pivot to control roundoff; all multipliers have |l_ij| ≤ 1. See condition number.
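A minimal sketch of elimination with partial pivoting (not the textbook's algorithm statement; the tiny-pivot matrix below is a standard made-up example of why pivoting helps):

```python
import numpy as np

def lu_partial_pivoting(A):
    # PA = LU, choosing the largest available pivot in each column
    # so that every multiplier satisfies |l_ij| <= 1.
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    P = np.eye(n)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(U[k:, k]))   # row of the largest pivot
        U[[k, p]] = U[[p, k]]                 # swap rows of U and P
        P[[k, p]] = P[[p, k]]
        L[[k, p], :k] = L[[p, k], :k]         # carry along earlier multipliers
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]       # |multiplier| <= 1 by pivot choice
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

A = np.array([[1e-4, 1.0], [1.0, 1.0]])       # tiny pivot without row exchange
P, L, U = lu_partial_pivoting(A)
assert np.allclose(P @ A, L @ U)
assert np.max(np.abs(np.tril(L, -1))) <= 1.0  # all multipliers bounded by 1
```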

• Plane (or hyperplane) in R^n.

Vectors x with a^T x = 0. Plane is perpendicular to a ≠ 0.

• Reflection matrix (Householder) Q = I - 2uu^T.

Unit vector u is reflected to Qu = -u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
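A short check in NumPy (the vectors u and v are made-up; v is chosen perpendicular to u so it lies in the mirror plane):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)               # unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)    # Householder reflection

assert np.allclose(Q @ u, -u)           # u reflects to -u

v = np.array([2.0, -1.0, 0.0])          # u^T v = 0: v is in the mirror
assert np.isclose(u @ v, 0)
assert np.allclose(Q @ v, v)            # mirror-plane vectors are unchanged

assert np.allclose(Q.T @ Q, np.eye(3))  # Q^T = Q^-1 = Q
assert np.allclose(Q, Q.T)
```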

• Semidefinite matrix A.

(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
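A sketch using a random R (the shape of R is chosen so that A = R^T R is singular, hence semidefinite but not definite; seed and sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.standard_normal((2, 3))   # 2x3, so A = R^T R has rank at most 2
A = R.T @ R                        # 3x3 positive semidefinite

# all eigenvalues >= 0 (up to roundoff)
assert np.all(np.linalg.eigvalsh(A) >= -1e-10)

# x^T A x = ||Rx||^2 >= 0 for every x
for _ in range(100):
    x = rng.standard_normal(3)
    assert x @ A @ x >= -1e-10
```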

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

• Singular matrix A.

A square matrix that has no inverse: det(A) = 0.

• Solvable system Ax = b.

The right side b is in the column space of A.

• Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.

Spectral radius = max of |λ_i|.
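A one-line computation on a made-up 2-by-2 example (its eigenvalues are -1 and -2, so the spectral radius is 2):

```python
import numpy as np

A = np.array([[0.0, 1.0], [-2.0, -3.0]])  # characteristic polynomial λ^2 + 3λ + 2
spectrum = np.linalg.eigvals(A)           # the set {λ_1, ..., λ_n}
rho = np.abs(spectrum).max()              # spectral radius = max |λ_i|
assert np.isclose(rho, 2.0)
```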

• Trace of A

= sum of diagonal entries = sum of eigenvalues of A. Tr AB = Tr BA.
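Both facts are quick to confirm on random matrices (sizes and seed are arbitrary illustration choices):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# trace = sum of diagonal entries = sum of eigenvalues
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum().real)

# Tr(AB) = Tr(BA)
assert np.isclose(np.trace(A @ B), np.trace(B @ A))
```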

• Transpose matrix AT.

Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^-1)^T.
