Excursions in Modern Mathematics 8th Edition - Solutions by Chapter
Full solutions for Excursions in Modern Mathematics | 8th Edition
- Chapter 1: The Mathematics of Elections
- Chapter 2: The Mathematics of Power
- Chapter 3: The Mathematics of Sharing
- Chapter 4: The Mathematics of Apportionment
- Chapter 5: The Mathematics of Getting Around
- Chapter 6: The Mathematics of Touring
- Chapter 7: The Mathematics of Networks
- Chapter 8: The Mathematics of Scheduling
- Chapter 9: Population Growth Models
- Chapter 10: Financial Mathematics
- Chapter 11: The Mathematics of Symmetry
- Chapter 12: Fractal Geometry
- Chapter 13: Fibonacci Numbers and the Golden Ratio
- Chapter 14: Censuses, Surveys, Polls, and Studies
- Chapter 15: Graphs, Charts, and Numbers
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
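A minimal Python sketch (not from the text): a 4x4 product computed block-by-block, with each matrix cut into 2x2 blocks, agrees with the unpartitioned product. The matrices here are made-up examples.

```python
def matmul(A, B):
    """Plain product of nested-list matrices."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def madd(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def block(M, r, c):
    """The 2x2 block of M at block-row r, block-column c."""
    return [row[2 * c: 2 * c + 2] for row in M[2 * r: 2 * r + 2]]

A = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
B = [[1, 0, 2, 0], [0, 1, 0, 2], [3, 0, 1, 0], [0, 3, 0, 1]]

# Block (i, j) of AB = sum over k of (block i,k of A) * (block k,j of B).
C_blocks = [[madd(matmul(block(A, i, 0), block(B, 0, j)),
                  matmul(block(A, i, 1), block(B, 1, j)))
             for j in range(2)] for i in range(2)]

# Reassemble the 2x2 grid of blocks and compare with the full product.
C_from_blocks = [C_blocks[i][0][r] + C_blocks[i][1][r]
                 for i in range(2) for r in range(2)]
assert C_from_blocks == matmul(A, B)
```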
Cayley-Hamilton theorem. p(λ) = det(A - λI) has p(A) = zero matrix.
Companion matrix. Put c1, ..., cn in row n and put n - 1 ones just above the main diagonal. Then det(A - λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n-1) - λ^n).
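A small Python check (illustrative only, 2x2 case with made-up coefficients): for a root λ of λ^2 = c1 + c2λ, the vector (1, λ) is an eigenvector of the companion matrix.

```python
# 2x2 companion matrix: c1, c2 in the bottom row, a single 1 above the diagonal.
c1, c2 = 2, 1          # characteristic polynomial lam^2 - lam - 2, roots 2 and -1
A = [[0, 1],
     [c1, c2]]

lam = 2.0              # a root: 2^2 = 2 + 1*2
v = [1.0, lam]
Av = [sum(A[i][k] * v[k] for k in range(2)) for i in range(2)]
assert Av == [lam * x for x in v]   # A v = lam v
```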
Complex conjugate z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.
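Python's built-in complex type shows both facts directly:

```python
# Conjugate of z = a + ib is a - ib, and z times its conjugate equals |z|^2.
z = 3 + 4j
assert z.conjugate() == 3 - 4j
assert (z * z.conjugate()).real == abs(z) ** 2   # 9 + 16 = 25
```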
Covariance matrix Σ. When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x - x̄)(x - x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
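A sketch in pure Python (the sample data is made up): the sample covariance matrix built from (x - x̄)(x - x̄)^T averages is symmetric with nonnegative diagonal.

```python
# Sample covariance matrix: Sigma = mean of (x - xbar)(x - xbar)^T.
samples = [[1.0, 2.0], [3.0, 6.0], [5.0, 4.0], [7.0, 8.0]]
n, d = len(samples), len(samples[0])

mean = [sum(x[j] for x in samples) / n for j in range(d)]
Sigma = [[sum((x[i] - mean[i]) * (x[j] - mean[j]) for x in samples) / n
          for j in range(d)] for i in range(d)]

assert Sigma[0][1] == Sigma[1][0]                 # covariance matrix is symmetric
assert all(Sigma[i][i] >= 0 for i in range(d))    # variances are nonnegative
```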
Determinant IAI = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A||B|, |A^T| = |A|, and |A^(-1)| = 1/|A|.
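The product rule |AB| = |A||B| can be checked directly for 2x2 matrices (made-up examples), where |[[a, b], [c, d]]| = ad - bc:

```python
def det2(M):
    """Determinant of a 2x2 nested-list matrix: ad - bc."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def matmul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[2, 1], [5, 3]]
B = [[1, 4], [2, 9]]
assert det2(matmul2(A, B)) == det2(A) * det2(B)   # |AB| = |A||B|
```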
Dot product = Inner product x^T y = x_1 y_1 + ... + x_n y_n.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)_ij = (row i of A) · (column j of B).
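Both facts in a short Python sketch (vectors and matrices are made-up examples):

```python
def dot(x, y):
    """Dot product x^T y = x1*y1 + ... + xn*yn."""
    return sum(a * b for a, b in zip(x, y))

x, y = [1, 2, 3], [4, -2, 0]
assert dot(x, y) == 0          # these two vectors are perpendicular

# Entry rule: (AB)_ij = (row i of A) . (column j of B).
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
AB = [[dot(A[i], [B[0][j], B[1][j]]) for j in range(2)] for i in range(2)]
assert AB == [[19, 22], [43, 50]]
```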
Free columns of A.
Columns without pivots; these are combinations of earlier columns.
Fundamental Theorem. The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0, with dimensions n - r and r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
Gauss-Jordan method. Invert A by row operations on [A I] to reach [I A^(-1)].
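A minimal sketch of the method in Python (the helper `invert` and the test matrix are illustrative; partial pivoting is omitted, so it assumes nonzero pivots appear on the diagonal):

```python
def invert(A):
    """Row-reduce [A | I] to [I | A^-1]; assumes nonzero diagonal pivots."""
    n = len(A)
    # Augment A with the identity matrix.
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        pivot = M[col][col]
        M[col] = [v / pivot for v in M[col]]            # scale pivot row to 1
        for r in range(n):                              # clear the rest of the column
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [v - factor * p for v, p in zip(M[r], M[col])]
    return [row[n:] for row in M]                       # right half is A^-1

A = [[4.0, 7.0], [2.0, 6.0]]
Ainv = invert(A)
expected = [[0.6, -0.7], [-0.2, 0.4]]   # inverse of [[4,7],[2,6]], det = 10
assert all(abs(Ainv[i][j] - expected[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```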
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b - A x̂ is orthogonal to all columns of A.
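A Python sketch of the normal equations (the 3x2 fitting problem is made up; the 2x2 system A^T A x̂ = A^T b is solved here by Cramer's rule for brevity):

```python
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]   # fit b ~ c + d*t at t = 1, 2, 3
b = [1.0, 2.0, 2.0]

# Normal equations: (A^T A) xhat = A^T b.
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]

det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
xhat = [(Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det,
        (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det]

# The error e = b - A xhat is orthogonal to every column of A.
e = [b[k] - (A[k][0] * xhat[0] + A[k][1] * xhat[1]) for k in range(3)]
for j in range(2):
    assert abs(sum(A[k][j] * e[k] for k in range(3))) < 1e-12
```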
Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^(-1). Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
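Both properties can be checked for a rotation matrix (the angle and vector are arbitrary examples):

```python
import math

# A rotation matrix is orthogonal: Q^T Q = I and ||Qx|| = ||x||.
theta = 0.7
c, s = math.cos(theta), math.sin(theta)
Q = [[c, -s], [s, c]]

x = [3.0, 4.0]
Qx = [Q[0][0] * x[0] + Q[0][1] * x[1], Q[1][0] * x[0] + Q[1][1] * x[1]]
assert abs(math.hypot(*Qx) - math.hypot(*x)) < 1e-12    # length preserved

QtQ = [[sum(Q[k][i] * Q[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]
assert all(abs(QtQ[i][j] - (i == j)) < 1e-12 for i in range(2) for j in range(2))
```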
Orthonormal vectors q_1, ..., q_n.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^(-1) and q_1, ..., q_n is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
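The expansion v = Σ (v^T q_j) q_j, checked in R^2 with a standard orthonormal pair (the vector v is a made-up example):

```python
import math

# Orthonormal basis of R^2: q1 = (1,1)/sqrt(2), q2 = (1,-1)/sqrt(2).
r = 1 / math.sqrt(2)
q1, q2 = [r, r], [r, -r]
assert abs(q1[0] * q2[0] + q1[1] * q2[1]) < 1e-12   # q1 . q2 = 0

v = [3.0, 5.0]
c1 = v[0] * q1[0] + v[1] * q1[1]                    # coefficient v^T q1
c2 = v[0] * q2[0] + v[1] * q2[1]                    # coefficient v^T q2
rebuilt = [c1 * q1[i] + c2 * q2[i] for i in range(2)]
assert all(abs(rebuilt[i] - v[i]) < 1e-12 for i in range(2))
```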
Outer product uv^T.
Column times row = rank one matrix.
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
Rotation matrix R.
R = [c -s; s c] rotates the plane by θ and R^(-1) = R^T rotates back by -θ. Eigenvalues are e^(iθ) and e^(-iθ), eigenvectors are (1, ±i). c, s = cos θ, sin θ.
Semidefinite matrix A.
(Positive) semidefinite: all x^T Ax ≥ 0, all λ ≥ 0; A = any R^T R.
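A quick Python check of the R^T R construction (R is a made-up example): x^T (R^T R) x = ||Rx||^2, so the quadratic form is never negative.

```python
import random

# Any A = R^T R is positive semidefinite.
R = [[1.0, 2.0], [0.0, 3.0]]
A = [[sum(R[k][i] * R[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]

random.seed(0)
for _ in range(100):
    x = [random.uniform(-10, 10), random.uniform(-10, 10)]
    quad = sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))
    assert quad >= -1e-9          # x^T A x >= 0 up to roundoff
```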
Skew-symmetric matrix K.
The transpose is -K, since K_ij = -K_ji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, and e^(Kt) is an orthogonal matrix.
Symmetric matrix A.
The transpose is A^T = A, and a_ij = a_ji. A^(-1) is also symmetric.
Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms, ||A + B|| ≤ ||A|| + ||B||.
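The vector version is easy to spot-check in Python (the vector pairs are arbitrary examples):

```python
import math

# Triangle inequality ||u + v|| <= ||u|| + ||v|| for a few 2D vector pairs.
pairs = [([3.0, 4.0], [1.0, 2.0]),
         ([1.0, 0.0], [-1.0, 0.0]),
         ([2.0, -5.0], [-3.0, 7.0])]
for u, v in pairs:
    lhs = math.hypot(u[0] + v[0], u[1] + v[1])
    assert lhs <= math.hypot(*u) + math.hypot(*v) + 1e-12
```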