Affine transformation.
Tv = Av + v0 = linear transformation plus shift.
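A minimal NumPy sketch of a transformation of the form Av + v0 (the matrix A and shift v0 below are my own illustrative choices, not from the text):

```python
import numpy as np

# Illustrative linear part A and shift v0 (chosen for this example).
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v0 = np.array([1.0, -1.0])

def T(v):
    """Affine transformation: linear part A v plus the constant shift v0."""
    return A @ v + v0

# T is not linear unless v0 = 0: T(0) returns the shift itself.
print(T(np.zeros(2)))
```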
Cholesky factorization.
A = C^T C = (L√D)(L√D)^T for positive definite A.
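A quick check with NumPy (example matrix assumed): `np.linalg.cholesky` returns the lower-triangular L with A = L L^T, so C = L^T gives the A = C^T C form of the entry.

```python
import numpy as np

# Small positive definite example matrix (illustrative).
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])

L = np.linalg.cholesky(A)   # lower triangular, A = L L^T
C = L.T                     # then A = C^T C as in the glossary entry
assert np.allclose(C.T @ C, A)
```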
Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
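A numerical sketch of the sample covariance matrix (random data, my own example), checking that it comes out symmetric positive semidefinite:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(3, 1000))            # 3 variables, 1000 samples in columns
xbar = X.mean(axis=1, keepdims=True)      # vector of means
Sigma = (X - xbar) @ (X - xbar).T / X.shape[1]   # mean of (x - xbar)(x - xbar)^T

# Sigma is symmetric positive semidefinite:
assert np.allclose(Sigma, Sigma.T)
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-12)
```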
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.
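A NumPy check of diagonalization (a 2 x 2 example with two different eigenvalues, so diagonalizability is automatic):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # eigenvalues 5 and 2 (distinct)
lam, S = np.linalg.eig(A)            # eigenvectors fill the columns of S
Lambda = np.linalg.inv(S) @ A @ S    # S^-1 A S = Lambda
assert np.allclose(Lambda, np.diag(lam))
```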
Full row rank r = m.
Independent rows; at least one solution to Ax = b; column space is all of R^m. Full rank means full column rank or full row rank.
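A small full-row-rank example in NumPy (the 2 x 3 matrix is illustrative): every right side b then admits at least one solution.

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])      # 2 x 3 with independent rows: r = m = 2
assert np.linalg.matrix_rank(A) == A.shape[0]

b = np.array([3.0, 4.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)   # one particular solution
assert np.allclose(A @ x, b)                # Ax = b is exactly solvable
```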
Hermitian matrix A^H = Ā^T = A.
Complex analog ā_ji = a_ij of a symmetric matrix.
Identity matrix I (or In).
Diagonal entries = 1, off-diagonal entries = 0.
Independent vectors v1, ..., vk.
No combination c1 v1 + ... + ck vk = zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
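Independence can be tested numerically by comparing the rank with the number of columns (example vectors assumed):

```python
import numpy as np

# Columns v1, v2, v3 of V (illustrative example).
V = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Independent columns <=> rank = number of columns <=> Vx = 0 only for x = 0.
assert np.linalg.matrix_rank(V) == V.shape[1]
```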
Iterative method.
A sequence of steps intended to approach the desired solution.
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
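A worked least squares example in NumPy (data chosen for illustration), solving the normal equations and checking the orthogonality of the error:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

xhat = np.linalg.solve(A.T @ A, A.T @ b)   # A^T A xhat = A^T b
e = b - A @ xhat                           # least squares error
assert np.allclose(A.T @ e, 0.0)           # e is orthogonal to all columns of A
```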
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Normal equation A^T A x̂ = A^T b.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − A x̂) = 0.
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
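Building a 3 x 3 permutation matrix in NumPy (the order 2, 0, 1 is an arbitrary example):

```python
import numpy as np

order = [2, 0, 1]
P = np.eye(3)[order]                 # rows of I in that order
A = np.arange(9.0).reshape(3, 3)

assert np.allclose(P @ A, A[order])  # P A puts the rows of A in the same order
assert np.isclose(abs(np.linalg.det(P)), 1.0)   # det P = +1 or -1
```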
Rank r(A).
r = number of pivots = dimension of column space = dimension of row space.
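A quick rank computation (example matrix with one dependent row), confirming that row rank equals column rank:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],       # twice the first row
              [1.0, 0.0, 1.0]])
r = np.linalg.matrix_rank(A)

# Row rank equals column rank:
assert r == np.linalg.matrix_rank(A.T)
```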
Reduced row echelon form R = rref(A).
Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
Schur complement S = D − CA^-1 B.
Appears in block elimination on the block matrix [A B; C D].
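A sketch of the Schur complement appearing after block elimination, using 1 x 1 blocks for simplicity (the numbers are illustrative):

```python
import numpy as np

# Block matrix M = [A B; C D] with 1 x 1 blocks for simplicity.
A = np.array([[2.0]]); B = np.array([[1.0]])
C = np.array([[4.0]]); D = np.array([[5.0]])

S = D - C @ np.linalg.inv(A) @ B     # Schur complement S = D - C A^-1 B

# Eliminating the C block leaves S in the lower-right corner:
M = np.block([[A, B], [C, D]])
E = np.block([[np.eye(1), np.zeros((1, 1))],
              [-C @ np.linalg.inv(A), np.eye(1)]])
assert np.allclose((E @ M)[1:, 1:], S)
```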
Schwarz inequality.
|v·w| ≤ ‖v‖ ‖w‖. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
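A numerical check of both inequalities (the vectors and the positive definite A are illustrative):

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])
w = np.array([-1.0, 0.0, 2.0])
assert abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w)

A = np.diag([2.0, 3.0, 1.0])         # positive definite (positive diagonal)
assert (v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w)
```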
Similar matrices A and B.
Every B = M^-1 A M has the same eigenvalues as A.
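Checking the equal eigenvalues of similar matrices in NumPy (A and the invertible M are example choices):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
M = np.array([[1.0, 1.0],
              [0.0, 1.0]])               # any invertible M works
B = np.linalg.inv(M) @ A @ M             # B is similar to A

assert np.allclose(sorted(np.linalg.eigvals(A).real),
                   sorted(np.linalg.eigvals(B).real))
```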
Singular Value Decomposition (SVD).
A = UΣV^T = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
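An SVD check with NumPy (example matrix assumed); `np.linalg.svd` returns U, the singular values, and V^T:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, s, Vt = np.linalg.svd(A)              # A = U diag(s) V^T
assert np.allclose(U @ np.diag(s) @ Vt, A)

# A v_i = sigma_i u_i for each singular pair:
for i in range(len(s)):
    assert np.allclose(A @ Vt[i], s[i] * U[:, i])
```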
Spanning set v1, ..., vm for a space V.
Combinations of v1, ..., vm fill the space. The columns of A span C(A)!