
# Solutions for Chapter 4.5: Writing Point-Slope Equations to Fit Data

## Full solutions for Discovering Algebra: An Investigative Approach | 2nd Edition

ISBN: 9781559537636


Chapter 4.5: Writing Point-Slope Equations to Fit Data includes 11 full step-by-step solutions, and more than 2880 students have viewed solutions from this chapter. This textbook survival guide was created for Discovering Algebra: An Investigative Approach, 2nd edition, written by Patricia and associated with ISBN 9781559537636.

## Key Math Terms and Definitions Covered in This Textbook
• Change of basis matrix $M$.

The old basis vectors $v_j$ are combinations $\sum_i m_{ij} w_i$ of the new basis vectors. The coordinates of $c_1 v_1 + \cdots + c_n v_n = d_1 w_1 + \cdots + d_n w_n$ are related by $d = Mc$. (For $n = 2$, set $v_1 = m_{11} w_1 + m_{21} w_2$, $v_2 = m_{12} w_1 + m_{22} w_2$.)
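
A small numeric sketch of the relation $d = Mc$ in the plane (the basis vectors and matrix entries below are invented for illustration, not taken from the textbook):

```python
# Hypothetical 2-D change-of-basis example. New basis w1, w2 and old
# basis v1, v2 are given in standard coordinates.
w1, w2 = (1.0, 0.0), (1.0, 1.0)

# Columns of M hold the new-basis coordinates of each old v_j:
# v1 = 2*w1 + 1*w2 and v2 = 0*w1 + 1*w2.
M = [[2.0, 0.0],
     [1.0, 1.0]]

v1 = tuple(2.0 * a + 1.0 * b for a, b in zip(w1, w2))
v2 = tuple(0.0 * a + 1.0 * b for a, b in zip(w1, w2))

# A vector with old coordinates c = (c1, c2) has new coordinates d = M c.
c = (3.0, -1.0)
d = (M[0][0] * c[0] + M[0][1] * c[1],
     M[1][0] * c[0] + M[1][1] * c[1])

# Check: c1*v1 + c2*v2 and d1*w1 + d2*w2 are the same point in the plane.
lhs = tuple(c[0] * a + c[1] * b for a, b in zip(v1, v2))
rhs = tuple(d[0] * a + d[1] * b for a, b in zip(w1, w2))
print(lhs, rhs)
```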

• Characteristic equation $\det(A - \lambda I) = 0$.

The $n$ roots are the eigenvalues of $A$.

• Covariance matrix $\Sigma$.

When random variables $x_i$ have mean = average value = 0, their covariances $\Sigma_{ij}$ are the averages of $x_i x_j$. With means $\bar{x}_i$, the matrix $\Sigma =$ mean of $(x - \bar{x})(x - \bar{x})^T$ is positive (semi)definite; $\Sigma$ is diagonal if the $x_i$ are independent.
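
As a quick pure-Python illustration (the sample data are invented and already centered), the covariance matrix of a small data set comes out symmetric with a nonnegative diagonal:

```python
# Sample covariance: Sigma_ij = average of (x_i - mean_i)(x_j - mean_j)
# over the samples. Illustrative data, two variables, three samples.
samples = [(1.0, 2.0), (-1.0, 0.0), (0.0, -2.0)]
n = len(samples)

mean = tuple(sum(s[i] for s in samples) / n for i in range(2))
Sigma = [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / n
          for j in range(2)] for i in range(2)]

# Sigma is symmetric and its diagonal holds the (nonnegative) variances.
print(mean, Sigma)
```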

• Distributive Law

$A(B + C) = AB + AC$. Add then multiply, or multiply then add.

• Eigenvalue $\lambda$ and eigenvector $x$.

$Ax = \lambda x$ with $x \neq 0$, so $\det(A - \lambda I) = 0$.
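
A concrete check with a hand-picked symmetric $2 \times 2$ matrix (the numbers are illustrative, not from the text): the eigenvector equation holds and the characteristic determinant vanishes at the eigenvalue.

```python
# A = [[2, 1], [1, 2]] has eigenvalue lam = 3 with eigenvector x = (1, 1).
A = [[2.0, 1.0],
     [1.0, 2.0]]
lam, x = 3.0, (1.0, 1.0)

# Compute A x and compare with lam * x.
Ax = (A[0][0] * x[0] + A[0][1] * x[1],
      A[1][0] * x[0] + A[1][1] * x[1])

# Since A x = lam x with x != 0, det(A - lam*I) must be zero.
det = (A[0][0] - lam) * (A[1][1] - lam) - A[0][1] * A[1][0]
print(Ax, det)
```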

• Four Fundamental Subspaces $C(A)$, $N(A)$, $C(A^T)$, $N(A^T)$.

Use the conjugate transpose $A^H$ in place of $A^T$ when $A$ is complex.

• Gram-Schmidt orthogonalization $A = QR$.

Independent columns in $A$, orthonormal columns in $Q$. Each column $q_j$ of $Q$ is a combination of the first $j$ columns of $A$ (and conversely, so $R$ is upper triangular). Convention: $\mathrm{diag}(R) > 0$.
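
A minimal pure-Python sketch of classical Gram-Schmidt (the helper name and the example columns are my own; classical Gram-Schmidt is also less numerically stable than library QR routines, so treat this as a teaching sketch):

```python
from math import sqrt

def qr_gram_schmidt(cols):
    """Classical Gram-Schmidt on a list of column vectors.
    Returns (Q_cols, R) with orthonormal Q columns and upper-triangular R."""
    n = len(cols)
    q_cols, R = [], [[0.0] * n for _ in range(n)]
    for j, a in enumerate(cols):
        v = list(a)
        for i, q in enumerate(q_cols):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # projection coeff
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]  # subtract it off
        R[j][j] = sqrt(sum(vk * vk for vk in v))             # diag(R) > 0
        q_cols.append([vk / R[j][j] for vk in v])
    return q_cols, R

# Two independent columns of a small illustrative matrix A.
A_cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
Q_cols, R = qr_gram_schmidt(A_cols)
```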

• Left inverse $A^+$.

If $A$ has full column rank $n$, then $A^+ = (A^T A)^{-1} A^T$ has $A^+ A = I_n$.
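
A pure-Python sketch with an invented $3 \times 2$ full-column-rank matrix, inverting the $2 \times 2$ matrix $A^T A$ by the adjugate formula:

```python
# Left inverse A+ = (A^T A)^{-1} A^T; the matrix below is illustrative.
A = [[1.0, 0.0],
     [1.0, 1.0],
     [0.0, 2.0]]          # 3x2, full column rank

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

At = [list(row) for row in zip(*A)]          # transpose, 2x3
AtA = matmul(At, A)                          # 2x2

# Invert the 2x2 matrix AtA with the adjugate formula.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
inv = [[ AtA[1][1] / det, -AtA[0][1] / det],
       [-AtA[1][0] / det,  AtA[0][0] / det]]

A_plus = matmul(inv, At)                     # 2x3 left inverse
I2 = matmul(A_plus, A)                       # should be the 2x2 identity
print(I2)
```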

• Linear transformation $T$.

Each vector $v$ in the input space transforms to $T(v)$ in the output space, and linearity requires $T(cv + dw) = c\,T(v) + d\,T(w)$. Examples: matrix multiplication $Av$, differentiation and integration in function space.

• Lucas numbers

$L_n = 2, 1, 3, 4, \ldots$ satisfy $L_n = L_{n-1} + L_{n-2} = \lambda_1^n + \lambda_2^n$, with $\lambda_1, \lambda_2 = (1 \pm \sqrt{5})/2$ from the Fibonacci matrix $\begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix}$. Compare $L_0 = 2$ with $F_0 = 0$.
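
A short sketch comparing the recurrence with the closed form $\lambda_1^n + \lambda_2^n$, where $\lambda_1, \lambda_2 = (1 \pm \sqrt{5})/2$ (function names are my own):

```python
from math import sqrt

def lucas(n):
    """Lucas number L_n from the recurrence, starting L0 = 2, L1 = 1."""
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Closed form: L_n = phi**n + psi**n with the two Fibonacci eigenvalues.
phi = (1 + sqrt(5)) / 2
psi = (1 - sqrt(5)) / 2
closed = [round(phi ** n + psi ** n) for n in range(10)]
print([lucas(n) for n in range(10)], closed)
```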

• Multiplicities $AM$ and $GM$.

The algebraic multiplicity $AM$ of $\lambda$ is the number of times $\lambda$ appears as a root of $\det(A - \lambda I) = 0$. The geometric multiplicity $GM$ is the number of independent eigenvectors for $\lambda$ (= dimension of the eigenspace).
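
The classic example where the two multiplicities differ is the matrix $\begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}$; a small sketch (the matrix is illustrative and the rank count is hard-coded for this tiny case):

```python
# For A = [[1, 1], [0, 1]]: det(A - lam*I) = (1 - lam)**2, so lam = 1
# has algebraic multiplicity AM = 2. The eigenspace is N(A - I), and
# A - I = [[0, 1], [0, 0]] has rank 1, so GM = n - rank = 2 - 1 = 1.
B = [[0.0, 1.0],
     [0.0, 0.0]]          # A - I, already in echelon form

# Rank of this tiny echelon matrix = number of nonzero rows.
rank = sum(1 for row in B if any(entry != 0.0 for entry in row))
AM, GM = 2, 2 - rank
print(AM, GM)
```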

• Multiplier $\ell_{ij}$.

The pivot row $j$ is multiplied by $\ell_{ij}$ and subtracted from row $i$ to eliminate the $i,j$ entry: $\ell_{ij} = $ (entry to eliminate) / ($j$th pivot).
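
One elimination step on an invented $2 \times 2$ example makes the recipe concrete:

```python
# Eliminate the (2, 1) entry of a small system by subtracting a
# multiple of the pivot row. Illustrative numbers.
A = [[2.0, 1.0],
     [6.0, 8.0]]

# Multiplier l21 = (entry to eliminate) / (1st pivot) = 6 / 2 = 3.
l21 = A[1][0] / A[0][0]
A[1] = [a - l21 * p for a, p in zip(A[1], A[0])]   # row2 -= 3 * row1
print(l21, A)
```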

• Orthogonal subspaces.

Every $v$ in $V$ is orthogonal to every $w$ in $W$.

• Pivot columns of A.

Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.

• Rank one matrix $A = uv^T \neq 0$.

Column and row spaces = lines $cu$ and $cv$.

• Reflection matrix (Householder) $Q = I - 2uu^T$.

Unit vector $u$ is reflected to $Qu = -u$. All $x$ in the plane mirror $u^T x = 0$ have $Qx = x$. Notice $Q^T = Q^{-1} = Q$.
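
A 2-D sketch of the Householder reflection (the unit vector $u$ is chosen arbitrarily for illustration): $u$ flips sign while a vector in the mirror is left fixed.

```python
from math import sqrt

# Build Q = I - 2 u u^T for the unit vector u = (1, 1) / sqrt(2).
u = (1 / sqrt(2), 1 / sqrt(2))
Q = [[(1.0 if i == j else 0.0) - 2 * u[i] * u[j] for j in range(2)]
     for i in range(2)]

def apply(Q, x):
    """Matrix-vector product Q x for this 2x2 case."""
    return tuple(sum(Q[i][j] * x[j] for j in range(2)) for i in range(2))

Qu = apply(Q, u)                       # reflected to -u
mirror = (1 / sqrt(2), -1 / sqrt(2))   # satisfies u^T x = 0
Qx = apply(Q, mirror)                  # unchanged by the reflection
print(Qu, Qx)
```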

• Schwarz inequality

$|v \cdot w| \le \|v\| \, \|w\|$. Then $|v^T A w|^2 \le (v^T A v)(w^T A w)$ for positive definite $A$.
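
A numeric spot-check of both forms with invented vectors and a diagonal positive definite $A$ (a check on one example, not a proof):

```python
from math import sqrt

v, w = (1.0, 2.0), (3.0, -1.0)

# Plain Schwarz: |v . w| <= ||v|| ||w||.
dot = v[0] * w[0] + v[1] * w[1]
lhs = abs(dot)
rhs = sqrt(v[0] ** 2 + v[1] ** 2) * sqrt(w[0] ** 2 + w[1] ** 2)

# Weighted form with a positive definite A (diagonal entries > 0).
A = [[2.0, 0.0], [0.0, 3.0]]
def quad(x, y):
    """The bilinear form x^T A y."""
    return sum(x[i] * A[i][j] * y[j] for i in range(2) for j in range(2))

weighted_ok = quad(v, w) ** 2 <= quad(v, v) * quad(w, w)
print(lhs, rhs, weighted_ok)
```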

• Subspace $S$ of $V$.

Any vector space inside $V$, including $V$ itself and $Z = \{$zero vector only$\}$.

• Vector $v$ in $\mathbf{R}^n$.

Sequence of $n$ real numbers $v = (v_1, \ldots, v_n)$ = point in $\mathbf{R}^n$.

• Wavelets $w_{jk}(t)$.

Stretch and shift the time axis to create $w_{jk}(t) = w_{00}(2^j t - k)$.
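
Taking the Haar step function as a concrete mother wavelet $w_{00}$ (my choice for illustration; the definition above works for any mother wavelet), the stretched-and-shifted copy $w_{1,1}$ lives on the half-interval $[1/2, 1)$:

```python
def w00(t):
    """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
    if 0.0 <= t < 0.5:
        return 1.0
    if 0.5 <= t < 1.0:
        return -1.0
    return 0.0

def w(j, k, t):
    """Stretched and shifted copy w_jk(t) = w00(2**j * t - k)."""
    return w00(2 ** j * t - k)

# w_{1,1} is supported on [1/2, 1): zero before, +1 then -1 inside, zero after.
vals = [w(1, 1, t) for t in (0.25, 0.6, 0.9, 1.5)]
print(vals)
```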
