
# Solutions for Chapter 13: Trigonometric Functions

## Full solutions for California Algebra 2: Concepts, Skills, and Problem Solving | 1st Edition

ISBN: 9780078778568


California Algebra 2: Concepts, Skills, and Problem Solving (1st edition) is associated with ISBN 9780078778568. This textbook survival guide covers the book's chapters and their solutions. Chapter 13: Trigonometric Functions includes 51 full step-by-step solutions; more than 44153 students have viewed full step-by-step solutions from this chapter.

## Key math terms and definitions covered in this textbook

• Big formula for n by n determinants.

Det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or - sign.
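As an illustration (not from the textbook), the big formula can be sketched in Python; the function names here are mine:

```python
import math
from itertools import permutations

def perm_sign(p):
    # + or - sign of permutation p, via its inversion count
    inversions = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return -1 if inversions % 2 else 1

def det_big_formula(A):
    # sum over all n! column orders P of sign(P) * a[0][P(0)] * ... * a[n-1][P(n-1)]
    n = len(A)
    return sum(perm_sign(p) * math.prod(A[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

print(det_big_formula([[1, 2], [3, 4]]))                    # -2
print(det_big_formula([[1, 2, 3], [4, 5, 6], [7, 8, 10]]))  # -3
```

The n! growth makes this formula impractical beyond small n; it is a definition, not an algorithm.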

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

• Cofactor Cij.

Remove row i and column j; multiply the remaining determinant by (-1)^(i+j).
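A minimal sketch of this definition (my own helper names), with the determinant computed by cofactor expansion along row 0:

```python
def minor(A, i, j):
    # delete row i and column j
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    # cofactor expansion along row 0
    if len(A) == 1:
        return A[0][0]
    return sum(A[0][j] * cofactor(A, 0, j) for j in range(len(A)))

def cofactor(A, i, j):
    # C_ij = (-1)^(i+j) times the determinant with row i and column j removed
    return (-1) ** (i + j) * det(minor(A, i, j))

A = [[1, 2], [3, 4]]
print(cofactor(A, 0, 0), cofactor(A, 0, 1))  # 4 -3
print(det(A))                                # -2
```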

• Column picture of Ax = b.

The vector b becomes a combination of the columns of A. The system is solvable only when b is in the column space C (A).

• Complete solution x = x p + Xn to Ax = b.

(Particular x p) + (x n in nullspace).

• Cramer's Rule for Ax = b.

B_j has b replacing column j of A; x_j = det(B_j) / det(A).
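A quick sketch of the rule in Python (function names are mine, not the textbook's), using a compact recursive determinant:

```python
def det(A):
    # cofactor-expansion determinant along row 0
    if len(A) == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j]
               * det([row[:j] + row[j+1:] for row in A[1:]])
               for j in range(len(A)))

def cramer(A, b):
    # x_j = det(B_j) / det(A), where B_j is A with column j replaced by b
    d = det(A)
    return [det([row[:j] + [b[i]] + row[j+1:] for i, row in enumerate(A)]) / d
            for j in range(len(A))]

# 2x + y = 3 and x + 3y = 5  ->  x = 0.8, y = 1.4
print(cramer([[2, 1], [1, 3]], [3, 5]))
```

Like the big formula, Cramer's Rule is a theoretical tool; elimination is far cheaper for actual solving.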

• Cross product u × v in R^3:

Vector perpendicular to u and v, length ‖u‖ ‖v‖ |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
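Expanding that symbolic determinant gives the three components directly; a small sketch (my own function names):

```python
def cross(u, v):
    # expand the symbolic determinant [i j k; u1 u2 u3; v1 v2 v3]
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

u, v = [1, 2, 0], [0, 1, 3]
w = cross(u, v)
print(w)                     # [6, -3, 1]
print(dot(w, u), dot(w, v))  # 0 0  (perpendicular to both inputs)
```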

• Distributive Law

A(B + C) = AB + AC. Add then multiply, or multiply then add.

• Echelon matrix U.

The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.

• Elimination matrix = Elementary matrix Eij.

The identity matrix with an extra -e_ij in the i, j entry (i ≠ j). Then E_ij A subtracts e_ij times row j of A from row i.
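A minimal sketch of the effect (helper names are mine): build the elimination matrix, multiply, and watch one entry become zero.

```python
def elim_matrix(n, i, j, e):
    # identity with an extra -e in entry (i, j): E A subtracts e * (row j) from row i
    E = [[float(r == c) for c in range(n)] for r in range(n)]
    E[i][j] = -e
    return E

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(len(B)))
             for c in range(len(B[0]))] for r in range(len(A))]

A = [[2.0, 1.0], [4.0, 5.0]]
E = elim_matrix(2, 1, 0, 2.0)  # subtract 2 * row 0 from row 1
print(matmul(E, A))            # [[2.0, 1.0], [0.0, 3.0]]
```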

• Fibonacci numbers

0, 1, 1, 2, 3, 5, ... satisfy F_n = F_{n-1} + F_{n-2} = (λ1^n - λ2^n)/(λ1 - λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [1 1; 1 0].
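The recurrence and the closed form can be checked against each other in a few lines (function names are mine):

```python
import math

def fib(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def binet(n):
    # closed form (l1^n - l2^n) / (l1 - l2), with l1, l2 = (1 ± √5)/2
    l1 = (1 + math.sqrt(5)) / 2
    l2 = (1 - math.sqrt(5)) / 2
    return round((l1 ** n - l2 ** n) / (l1 - l2))

print([fib(n) for n in range(8)])                  # [0, 1, 1, 2, 3, 5, 8, 13]
print(all(fib(n) == binet(n) for n in range(40)))  # True
```

The round() absorbs floating-point error, so the closed form stays exact here for moderate n.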

• Gram-Schmidt orthogonalization A = QR.

Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
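A sketch of classical Gram-Schmidt on a short list of column vectors (my own function name); each step subtracts projections onto the earlier q's, then normalizes:

```python
import math

def gram_schmidt(cols):
    # classical Gram-Schmidt: subtract projections onto earlier q's, then normalize
    qs = []
    for a in cols:
        v = a[:]
        for q in qs:
            r = sum(qi * ai for qi, ai in zip(q, a))   # r_ij = q_i . a_j
            v = [vi - r * qi for vi, qi in zip(v, q)]
        norm = math.sqrt(sum(vi * vi for vi in v))
        qs.append([vi / norm for vi in v])
    return qs

q1, q2 = gram_schmidt([[1, 1, 0], [1, 0, 1]])
print(abs(sum(a * b for a, b in zip(q1, q2))) < 1e-12)  # True (orthogonal)
print(abs(sum(a * a for a in q2) - 1) < 1e-12)          # True (unit length)
```

In floating point, the modified variant (projecting against the running v instead of a) is numerically more stable; this classical version matches the definition above.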

• Hankel matrix H.

Constant along each antidiagonal; hij depends on i + j.

• Markov matrix M.

All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector s with Ms = s > 0.
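The approach to the steady state can be seen by repeated multiplication; a small sketch with a column-stochastic matrix I made up for illustration:

```python
def mat_vec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

# every entry positive, each column sums to 1
M = [[0.9, 0.2],
     [0.1, 0.8]]
x = [1.0, 0.0]
for _ in range(100):
    x = mat_vec(M, x)            # M^k x approaches the steady state
print([round(v, 4) for v in x])  # [0.6667, 0.3333], the eigenvector with Ms = s
```

Here the steady state solves (M - I)s = 0 with components summing to 1, giving s = (2/3, 1/3).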

• Pivot columns of A.

Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

• Singular matrix A.

A square matrix that has no inverse: det(A) = 0.

• Special solutions to As = O.

One free variable is s_i = 1, other free variables = 0.

• Transpose matrix A^T.

Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^(-1) are B^T A^T and (A^T)^(-1).
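These identities are easy to verify numerically; a short sketch (helper names are mine):

```python
def transpose(A):
    return [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]    # 2 by 3
B = [[1, 0], [0, 1], [1, 1]]  # 3 by 2

# (AB)^T == B^T A^T
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True

# A^T A is square and symmetric
AtA = matmul(transpose(A), A)
print(AtA == transpose(AtA))  # True
```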

• Vector space V.

Set of vectors such that all combinations cv + dw remain within V. Eight required rules are given in Section 3.1 for scalars c, d and vectors v, w.
