Cross product u × v in R³.
Vector perpendicular to u and v, length ‖u‖‖v‖|sin θ| = area of parallelogram; u × v = "determinant" of [i j k; u₁ u₂ u₃; v₁ v₂ v₃].
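This entry can be checked numerically; a minimal sketch using NumPy with two made-up vectors (the choice of u and v is illustrative, not from the text):

```python
import numpy as np

u = np.array([1.0, 2.0, 0.0])
v = np.array([0.0, 1.0, 3.0])

w = np.cross(u, v)  # perpendicular to both u and v

# length of u x v equals the area of the parallelogram with sides u and v
area = np.linalg.norm(w)
cos_theta = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
sin_theta = np.sqrt(1 - cos_theta**2)
area_formula = np.linalg.norm(u) * np.linalg.norm(v) * sin_theta
```

Both computations of the area agree, and w is orthogonal to u and to v.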
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
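A quick numerical check of the diagonalization and of Aᵏ = SΛᵏS⁻¹ (a sketch using NumPy; the 2×2 matrix is an arbitrary example with two independent eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

lam, S = np.linalg.eig(A)   # eigenvalues and eigenvector matrix
Lam = np.diag(lam)

# A = S Lam S^-1, so A^k = S Lam^k S^-1 (here k = 3)
A_rebuilt = S @ Lam @ np.linalg.inv(S)
A_cubed = S @ np.diag(lam**3) @ np.linalg.inv(S)
```

Powers of A reduce to powers of the eigenvalues, which is the main payoff of diagonalizing.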
Ellipse (or ellipsoid) xᵀAx = 1.
A must be positive definite; the axes of the ellipse are eigenvectors of A, with lengths 1/√λ. (For ‖x‖ = 1 the vectors y = Ax lie on the ellipse ‖A⁻¹y‖² = yᵀ(AAᵀ)⁻¹y = 1 displayed by eigshow; axis lengths σᵢ.)
Fundamental Theorem.
The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (perpendicular from Ax = 0, with dimensions r and n − r). Applied to Aᵀ, the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.
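The orthogonality of nullspace and row space can be verified numerically; a sketch using NumPy's SVD (the rank-1 example matrix is made up for illustration):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # rank 1, so dim N(A) = 3 - 1 = 2

# SVD: right singular vectors for zero singular values span the nullspace,
# the rest span the row space
U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))        # rank
null_basis = Vt[r:].T             # columns span N(A)
row_basis = Vt[:r].T              # columns span C(A^T)

# every row-space vector is orthogonal to every nullspace vector
cross_terms = row_basis.T @ null_basis
```

The dimensions r and n − r add to n = 3, and the two subspaces are perpendicular.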
Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Kronecker product (tensor product) A ⊗ B.
Blocks aᵢⱼB, eigenvalues λₚ(A)λ𝑞(B).
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
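The normal equations and the orthogonality of the error can be checked directly; a sketch using NumPy with a small made-up fitting problem:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# normal equations: A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# the error e = b - A x_hat is orthogonal to every column of A
e = b - A @ x_hat
```

The same x̂ comes out of `np.linalg.lstsq`, which solves the least squares problem without forming AᵀA.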
Left inverse A⁺.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.
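The left-inverse formula is easy to verify; a sketch using NumPy with a 3×2 matrix of full column rank (the matrix itself is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])   # full column rank n = 2

# left inverse (A^T A)^-1 A^T, defined because A^T A is invertible
A_plus = np.linalg.inv(A.T @ A) @ A.T
```

For a full-column-rank matrix this coincides with the pseudoinverse `np.linalg.pinv(A)`, and A⁺A = I₂ (though AA⁺ is only a projection, not I₃).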
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
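A defective matrix shows the two multiplicities coming apart; a sketch using NumPy with the standard 2×2 Jordan-block example (chosen here for illustration):

```python
import numpy as np

# lambda = 1 has AM = 2 but only one independent eigenvector, so GM = 1
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
lam = 1.0

# AM: the characteristic polynomial is (1 - lambda)^2, a double root
am = 2

# GM: dimension of the eigenspace = n - rank(A - lambda I)
gm = 2 - np.linalg.matrix_rank(A - lam * np.eye(2))
```

Always GM ≤ AM; strict inequality, as here, is exactly the case where A cannot be diagonalized.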
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Outer product uvᵀ.
Column times row = rank-one matrix.
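The rank-one property is immediate to check; a sketch using NumPy with two arbitrary example vectors:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0])

# column (3x1) times row (1x2) gives a 3x2 matrix of rank one
outer = np.outer(u, v)
rank = np.linalg.matrix_rank(outer)
```

Every column of the result is a multiple of u, which is why the rank is one.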
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
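Both the row-reordering action and det P = ±1 can be demonstrated; a sketch using NumPy with one permutation of (0, 1, 2) chosen as an example:

```python
import numpy as np

order = [2, 0, 1]            # an even permutation (one 3-cycle = two exchanges)
P = np.eye(3)[order]         # rows of I in that order
A = np.arange(9.0).reshape(3, 3)

PA = P @ A                   # rows of A in the same order
detP = int(round(np.linalg.det(P)))
```

Two row exchanges return this P to I, so its determinant is +1; an odd permutation would give −1.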
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
Saddle point of f(x₁, ..., xₙ).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xᵢ∂xⱼ = Hessian matrix) is indefinite.
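The classic example f(x, y) = x² − y² makes the definition concrete; a sketch using NumPy (the function is a standard illustration, not taken from the text):

```python
import numpy as np

# f(x, y) = x^2 - y^2 has gradient zero at the origin;
# its (constant) Hessian is diag(2, -2)
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

# indefinite: one positive and one negative eigenvalue -> saddle point
eigs = np.linalg.eigvalsh(H)
indefinite = bool(eigs.min() < 0 < eigs.max())
```

With eigenvalues of both signs, f increases along one eigenvector direction and decreases along the other, which is exactly the saddle behavior.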
Schur complement S = D − CA⁻¹B.
Appears in block elimination on [A B; C D].
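Block elimination and the determinant identity det([A B; C D]) = det(A)·det(S) can be checked numerically; a sketch using NumPy with small made-up blocks:

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0], [0.0]])
C = np.array([[0.0, 1.0]])
D = np.array([[3.0]])

# eliminating the C block leaves the Schur complement in the corner
S = D - C @ np.linalg.inv(A) @ B

M = np.block([[A, B], [C, D]])
```

The determinant of the block matrix factors as det(A) times det(S), mirroring ordinary elimination with a single pivot.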
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = AᵀCA where C has spring constants from Hooke's Law and Ax = stretching.
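The K = AᵀCA construction can be sketched for a small spring system; the two-spring chain, its stretching matrix A, and the spring constants below are all made-up illustrative assumptions, not the text's example:

```python
import numpy as np

# two springs in series, fixed at the top, with two free nodes:
# A maps node movements x to spring stretches; C holds the spring constants
A = np.array([[1.0, 0.0],     # stretch of spring 1 = x1
              [-1.0, 1.0]])   # stretch of spring 2 = x2 - x1
C = np.diag([3.0, 2.0])       # Hooke's-law constants c1, c2

K = A.T @ C @ A               # stiffness matrix K = A^T C A
```

The product AᵀCA is automatically symmetric, and here it is positive definite because A has independent columns and the spring constants are positive.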
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
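The matching of pivot signs and eigenvalue signs (the law of inertia) can be seen on a 2×2 example; a sketch using NumPy with one elimination step on a made-up indefinite symmetric matrix:

```python
import numpy as np

A = np.array([[2.0, 4.0],
              [4.0, 3.0]])    # symmetric, one positive and one negative eigenvalue

# one step of symmetric elimination gives A = L D L^T
l21 = A[1, 0] / A[0, 0]
L = np.array([[1.0, 0.0],
              [l21, 1.0]])
D = np.diag([A[0, 0], A[1, 1] - l21 * A[0, 1]])   # pivots 2 and -5

eigs = np.linalg.eigvalsh(A)
```

One positive pivot and one negative pivot match one positive and one negative eigenvalue, even though the pivots and eigenvalues themselves differ.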
Symmetric matrix A.
The transpose is Aᵀ = A, and aᵢⱼ = aⱼᵢ. A⁻¹ is also symmetric.