 4.1-4.2.1: Give the amplitude, period, vertical translation, and phase shift o...
 4.1-4.2.2: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.3: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.4: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.5: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.6: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.7: Graph each function over a two-period interval. Give the period and...
 4.1-4.2.8: Connecting Graphs with Equations Each function graphed is of the fo...
 4.1-4.2.9: Connecting Graphs with Equations Each function graphed is of the fo...
 4.1-4.2.10: Connecting Graphs with Equations Each function graphed is of the fo...
 4.1-4.2.11: What is the average temperature in April?
 4.1-4.2.12: What is the lowest average monthly temperature? What is the highest
Solutions for Chapter 4.1-4.2: Quiz
Full solutions for Trigonometry, 10th Edition
ISBN: 9780321671776
Since 12 problems in Chapter 4.1-4.2: Quiz have been answered, more than 35,748 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Trigonometry, edition 10 (ISBN 9780321671776), and Chapter 4.1-4.2: Quiz includes 12 full step-by-step solutions.

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
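A minimal numerical check of block multiplication, sketched with NumPy (the 4 x 4 matrices and the 2 x 2 partition are made-up examples):

```python
import numpy as np

# Partition a 4x4 product AB into 2x2 blocks and multiply blockwise.
A = np.arange(16.0).reshape(4, 4)
B = np.arange(16.0, 32.0).reshape(4, 4)

A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# Block formula: (AB)_11 = A11 B11 + A12 B21, and so on for each block.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bot = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blockwise = np.vstack([top, bot])

# The block shapes permit, so the blockwise product equals the full product.
assert np.allclose(blockwise, A @ B)
```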

Conjugate Gradient Method.
A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2)x^T A x - x^T b over growing Krylov subspaces.
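The method can be sketched in a few lines; this is a textbook-style implementation, not the book's exact listing, and the 2 x 2 test system is a made-up example:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10):
    """Minimize (1/2) x^T A x - x^T b for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x            # residual, also the negative gradient
    p = r.copy()             # first search direction
    rs = r @ r
    for _ in range(len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p    # new direction, A-conjugate to the old ones
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
assert np.allclose(A @ x, b, atol=1e-8)  # n steps reach the exact solution
```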

Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
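A quick check of the rule on a small made-up system (a sketch using NumPy):

```python
import numpy as np

A = np.array([[2.0, 1.0], [5.0, 3.0]])   # det A = 1, so A is invertible
b = np.array([4.0, 11.0])

x = np.empty(2)
for j in range(2):
    Bj = A.copy()
    Bj[:, j] = b                          # B_j: b replaces column j of A
    x[j] = np.linalg.det(Bj) / np.linalg.det(A)

assert np.allclose(A @ x, b)              # Cramer's Rule recovers the solution
```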

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
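A numerical illustration (the matrix is a made-up example with two different eigenvalues, so diagonalization is automatic):

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2 (distinct)
eigvals, S = np.linalg.eig(A)             # eigenvectors in the columns of S
Lambda = np.linalg.inv(S) @ A @ S         # S^-1 A S = eigenvalue matrix

assert np.allclose(Lambda, np.diag(eigvals), atol=1e-10)
```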

Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j - 1) = integral from 0 to 1 of x^(i-1) x^(j-1) dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
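The ill-conditioning shows up already at small n; a sketch using NumPy (n = 8 chosen for illustration):

```python
import numpy as np

n = 8
i, j = np.indices((n, n))          # 0-based indices
H = 1.0 / (i + j + 1)              # H_ij = 1/(i + j - 1) with 1-based i, j

eigs = np.linalg.eigvalsh(H)       # H is symmetric, so use the symmetric solver
assert eigs.min() > 0              # positive definite: all eigenvalues positive
assert np.linalg.cond(H) > 1e9     # already severely ill-conditioned at n = 8
```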

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and +1 in columns i and j.
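Building the matrix for a small made-up graph (a sketch using NumPy):

```python
import numpy as np

# Directed graph with 3 nodes and 3 edges: 0->1, 1->2, 0->2.
edges = [(0, 1), (1, 2), (0, 2)]
m, n = len(edges), 3
A = np.zeros((m, n))
for row, (i, j) in enumerate(edges):
    A[row, i] = -1.0     # edge leaves node i
    A[row, j] = +1.0     # edge enters node j

# Each row sums to zero, so the all-ones vector is in the nullspace.
assert np.allclose(A @ np.ones(n), 0)
```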

Inverse matrix A^-1.
Square matrix with A^-1 A = I and A A^-1 = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^-1 A^-1 and (A^-1)^T. Cofactor formula: (A^-1)_ij = C_ji / det A.
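The identities above can be verified numerically on a made-up 2 x 2 example (a sketch using NumPy; the cofactor matrix is written out by hand for the 2 x 2 case):

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 1.0]])   # det A = 1
B = np.array([[1.0, 2.0], [0.0, 1.0]])
I = np.eye(2)

Ainv = np.linalg.inv(A)
assert np.allclose(Ainv @ A, I) and np.allclose(A @ Ainv, I)

# (AB)^-1 = B^-1 A^-1 and (A^T)^-1 = (A^-1)^T
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv)
assert np.allclose(np.linalg.inv(A.T), Ainv.T)

# Cofactor formula (A^-1)_ij = C_ji / det A, with the 2x2 cofactors by hand.
detA = np.linalg.det(A)
C = np.array([[A[1, 1], -A[1, 0]],
              [-A[0, 1], A[0, 0]]])      # cofactor matrix C
assert np.allclose(Ainv, C.T / detA)
```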

Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
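Solving the normal equations on a made-up overdetermined system (a sketch using NumPy):

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # 3 equations, 2 unknowns
b = np.array([1.0, 2.0, 4.0])

xhat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations A^T A x = A^T b
e = b - A @ xhat                           # error vector

assert np.allclose(A.T @ e, 0)             # e is orthogonal to all columns of A
```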

Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
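A direct check on a made-up full-column-rank matrix (a sketch using NumPy):

```python
import numpy as np

A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])   # full column rank n = 2
Aplus = np.linalg.inv(A.T @ A) @ A.T                 # A^+ = (A^T A)^-1 A^T

assert np.allclose(Aplus @ A, np.eye(2))             # A^+ A = I_n
```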

Markov matrix M.
All m_ij >= 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of M^k approach the steady state eigenvector: Ms = s > 0.
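The approach to the steady state can be seen by repeatedly applying a made-up positive Markov matrix (a sketch using NumPy):

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])          # entries positive, each column sums to 1

# Powers M^k drive every starting distribution toward the steady state s.
v = np.array([1.0, 0.0])
for _ in range(100):
    v = M @ v
s = v                               # here s is approximately (0.6, 0.4)

assert np.allclose(M @ s, s)        # Ms = s: eigenvalue 1
assert np.all(s > 0) and np.isclose(s.sum(), 1.0)
```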

Minimal polynomial of A.
The lowest degree polynomial with m(A) = zero matrix. This is p(λ) = det(A - λI) if no eigenvalues are repeated; m(λ) always divides p(λ).

Multiplication Ax
= x_1 (column 1) + ... + x_n (column n) = combination of columns.

Pascal matrix
P_S = pascal(n) = the symmetric matrix with binomial entries C(i + j - 2, i - 1). P_S = P_L P_U; all contain Pascal's triangle with det = 1 (see Pascal in the index).
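The factorization P_S = P_L P_U (with P_U = P_L^T) and det = 1 can be checked directly; a sketch using NumPy and Python's math.comb, with n = 5 as a made-up size:

```python
import numpy as np
from math import comb

n = 5
# 1-based entry (i, j) is C(i+j-2, i-1); with 0-based indices this is comb(i+j, i).
PS = np.array([[comb(i + j, i) for j in range(n)] for i in range(n)], dtype=float)
PL = np.array([[comb(i, j) for j in range(n)] for i in range(n)], dtype=float)  # lower triangular Pascal

assert np.allclose(PS, PL @ PL.T)              # P_S = P_L P_U with P_U = P_L^T
assert np.isclose(np.linalg.det(PS), 1.0)      # det = 1
```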

Reflection matrix (Householder) Q = I - 2uu^T.
Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
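All three properties can be verified on a made-up unit vector (a sketch using NumPy):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)             # unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)  # Householder reflection

assert np.allclose(Q @ u, -u)         # u reflects to -u
x = np.array([2.0, 1.0, -2.0])        # u^T x = 0, so x lies in the mirror plane
assert np.isclose(u @ x, 0)
assert np.allclose(Q @ x, x)          # the mirror plane is fixed
assert np.allclose(Q.T, np.linalg.inv(Q)) and np.allclose(Q, Q.T)
```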

Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^(Kt) is an orthogonal matrix.
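For a made-up 2 x 2 skew-symmetric K, the matrix exponential has a closed form (a rotation), so the properties can be checked without a general expm routine; a sketch using NumPy:

```python
import numpy as np

K = np.array([[0.0, -2.0],
              [2.0,  0.0]])          # K^T = -K

assert np.allclose(K.T, -K)
eigs = np.linalg.eigvals(K)
assert np.allclose(eigs.real, 0)     # pure imaginary eigenvalues (here ±2i)

# For this 2x2 K, e^(Kt) is rotation by angle 2t, an orthogonal matrix.
t = 0.7
expKt = np.array([[np.cos(2 * t), -np.sin(2 * t)],
                  [np.sin(2 * t),  np.cos(2 * t)]])
assert np.allclose(expKt.T @ expKt, np.eye(2))
```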

Solvable system Ax = b.
The right side b is in the column space of A.

Spectrum of A = the set of eigenvalues {λ_1, ..., λ_n}.
Spectral radius = max of |λ_i|.

Transpose matrix A^T.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^-1)^T.

Vector addition.
v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of parallelogram.