Discrete Mathematics with Applications 4th Edition - Solutions by Chapter
Full solutions for Discrete Mathematics with Applications | 4th Edition

- Chapter 1.1: Variables
- Chapter 1.2: The Language of Sets
- Chapter 1.3: The Language of Relations and Functions
- Chapter 2.1: Logical Form and Logical Equivalence
- Chapter 2.2: Conditional Statements
- Chapter 2.3: Valid and Invalid Arguments
- Chapter 2.4: Application: Digital Logic Circuits
- Chapter 2.5: Application: Number Systems and Circuits for Addition
- Chapter 3.1: Predicates and Quantified Statements I
- Chapter 3.2: Predicates and Quantified Statements II
- Chapter 3.3: Statements with Multiple Quantifiers
- Chapter 3.4: Arguments with Quantified Statements
- Chapter 4.1: Direct Proof and Counterexample I: Introduction
- Chapter 4.2: Direct Proof and Counterexample II: Rational Numbers
- Chapter 4.3: Direct Proof and Counterexample III: Divisibility
- Chapter 4.4: Direct Proof and Counterexample IV: Division into Cases and the Quotient-Remainder Theorem
- Chapter 4.5: Direct Proof and Counterexample V: Floor and Ceiling
- Chapter 4.6: Indirect Argument: Contradiction and Contraposition
- Chapter 4.7: Indirect Argument: Two Classical Theorems
- Chapter 4.8: Application: Algorithms
- Chapter 5.1: Sequences
- Chapter 5.2: Mathematical Induction I
- Chapter 5.3: Mathematical Induction II
- Chapter 5.4: Strong Mathematical Induction and the Well-Ordering Principle for the Integers
- Chapter 5.5: Application: Correctness of Algorithms
- Chapter 5.6: Defining Sequences Recursively
- Chapter 5.7: Solving Recurrence Relations by Iteration
- Chapter 5.8: Second-Order Linear Homogeneous Recurrence Relations with Constant Coefficients
- Chapter 5.9: General Recursive Definitions and Structural Induction
- Chapter 6.1: Set Theory: Definitions and the Element Method of Proof
- Chapter 6.2: Properties of Sets
- Chapter 6.3: Disproofs, Algebraic Proofs, and Boolean Algebras
- Chapter 6.4: Boolean Algebras, Russell's Paradox, and the Halting Problem
- Chapter 7.1: Functions Defined on General Sets
- Chapter 7.2: One-to-One and Onto, Inverse Functions
- Chapter 7.3: Composition of Functions
- Chapter 7.4: Cardinality with Applications to Computability
- Chapter 8.1: Relations on Sets
- Chapter 8.2: Reflexivity, Symmetry, and Transitivity
- Chapter 8.3: Equivalence Relations
- Chapter 8.4: Modular Arithmetic with Applications to Cryptography
- Chapter 8.5: Partial Order Relations
- Chapter 9.1: Introduction
- Chapter 9.2: Possibility Trees and the Multiplication Rule
- Chapter 9.3: Counting Elements of Disjoint Sets: The Addition Rule
- Chapter 9.4: The Pigeonhole Principle
- Chapter 9.5: Counting Subsets of a Set: Combinations
- Chapter 9.6: r-Combinations with Repetition Allowed
- Chapter 9.7: Pascal's Formula and the Binomial Theorem
- Chapter 9.8: Probability Axioms and Expected Value
- Chapter 9.9: Conditional Probability, Bayes' Formula, and Independent Events
- Chapter 10.1: Graphs: Definitions and Basic Properties
- Chapter 10.2: Trails, Paths, and Circuits
- Chapter 10.3: Matrix Representations of Graphs
- Chapter 10.4: Isomorphisms of Graphs
- Chapter 10.5: Trees
- Chapter 10.6: Rooted Trees
- Chapter 10.7: Spanning Trees and Shortest Paths
- Chapter 11.1: Real-Valued Functions of a Real Variable and Their Graphs
- Chapter 11.2: O-, Ω-, and Θ-Notations
- Chapter 11.3: Application: Analysis of Algorithm Efficiency I
- Chapter 11.4: Exponential and Logarithmic Functions: Graphs and Orders
- Chapter 11.5: Application: Analysis of Algorithm Efficiency II
- Chapter 12.1: Formal Languages and Regular Expressions
- Chapter 12.2: Finite-State Automata
- Chapter 12.3: Simplifying Finite-State Automata
Back substitution. Upper triangular systems are solved in reverse order x_n to x_1.
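A minimal pure-Python sketch of back substitution for an upper triangular system Ux = b (the function name and 2x2 example are illustrative, not from the source):

```python
def back_substitute(U, b):
    """Solve Ux = b for upper triangular U, working from x_n back to x_1."""
    n = len(b)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        # subtract the already-known components x_{i+1}..x_n
        s = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (b[i] - s) / U[i][i]
    return x

# 2x + y = 5 and 3y = 6  ->  y = 2, then x = (5 - 2)/2 = 1.5
print(back_substitute([[2.0, 1.0], [0.0, 3.0]], [5.0, 6.0]))  # [1.5, 2.0]
```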
Basis for V. Independent vectors v_1, ..., v_d whose linear combinations give each vector in V as v = c_1 v_1 + ... + c_d v_d. V has many bases; each basis gives a unique set of c's.
Commuting matrices AB = BA. If diagonalizable, they share n eigenvectors.
Cramer's Rule for Ax = b. B_j has b replacing column j of A; x_j = det(B_j) / det(A).
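Cramer's Rule in the 2x2 case can be sketched directly from the definition (a hypothetical helper, not the book's code):

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def cramer2(A, b):
    """Cramer's Rule: x_j = det(B_j)/det(A), where B_j is A with column j replaced by b."""
    d = det2(A)
    x = []
    for j in range(2):
        Bj = [row[:] for row in A]   # copy A
        for i in range(2):
            Bj[i][j] = b[i]          # replace column j by b
        x.append(det2(Bj) / d)
    return x

# x + y = 3, x - y = 1  ->  x = 2, y = 1
print(cramer2([[1, 1], [1, -1]], [3, 1]))  # [2.0, 1.0]
```

Cramer's Rule is mainly of theoretical interest; elimination is far cheaper for large n.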
Graph G. Set of n nodes connected pairwise by m edges. A complete graph has all n(n - 1)/2 edges between nodes. A tree has only n - 1 edges and no closed loops.
Independent vectors v_1, ..., v_k. No combination c_1 v_1 + ... + c_k v_k gives the zero vector unless all c_i = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
Least squares solution x̂. The vector x̂ that minimizes the error ||e||² solves Aᵀ A x̂ = Aᵀ b. Then e = b - A x̂ is orthogonal to all columns of A.
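For the common case of fitting a line b ≈ c + d·t, the normal equations AᵀA x̂ = Aᵀb are just a 2x2 system. A small sketch under that assumption (function name illustrative):

```python
def lstsq_line(ts, bs):
    """Fit b ≈ c + d*t by least squares: solve the 2x2 normal equations
    A^T A x = A^T b, where A has columns [all ones, t]."""
    n = len(ts)
    # entries of A^T A ...
    s1, st, stt = n, sum(ts), sum(t * t for t in ts)
    # ... and of A^T b
    sb, stb = sum(bs), sum(t * b for t, b in zip(ts, bs))
    det = s1 * stt - st * st
    c = (sb * stt - st * stb) / det
    d = (s1 * stb - st * sb) / det
    return c, d

# points (0,1), (1,3), (2,5) lie exactly on b = 1 + 2t
print(lstsq_line([0, 1, 2], [1, 3, 5]))  # (1.0, 2.0)
```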
Length ||x||. Square root of xᵀx (Pythagoras in n dimensions).
Lucas numbers L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1ⁿ + λ_2ⁿ, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L_0 = 2 with F_0 = 0.
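The recurrence and the closed form λ_1ⁿ + λ_2ⁿ are easy to check side by side (a small sketch, names illustrative):

```python
import math

def lucas(n):
    """Lucas numbers: L_0 = 2, L_1 = 1, L_n = L_{n-1} + L_{n-2}."""
    a, b = 2, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([lucas(n) for n in range(8)])  # [2, 1, 3, 4, 7, 11, 18, 29]

# Closed form: L_n = lam1**n + lam2**n with lam = (1 ± sqrt(5))/2
lam1, lam2 = (1 + math.sqrt(5)) / 2, (1 - math.sqrt(5)) / 2
print(round(lam1**10 + lam2**10))  # 123, which equals lucas(10)
```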
Multiplication Ax = x_1(column 1) + ... + x_n(column n) = combination of the columns of A.
Particular solution x_p. Any solution to Ax = b; often x_p has free variables = 0.
Pivot columns of A. Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
Plane (or hyperplane) in Rⁿ. Vectors x with aᵀx = 0. The plane is perpendicular to a ≠ 0.
Polar decomposition A = QH. Orthogonal Q times positive (semi)definite H.
Projection p = a(aᵀb / aᵀa) onto the line through a. P = aaᵀ / aᵀa has rank 1.
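The projection formula, and the orthogonality of the error e = b - p to a, can be sketched in a few lines (function name illustrative):

```python
def project_onto_line(a, b):
    """p = a * (a^T b / a^T a): projection of b onto the line through a."""
    at_b = sum(ai * bi for ai, bi in zip(a, b))
    at_a = sum(ai * ai for ai in a)
    return [ai * at_b / at_a for ai in a]

p = project_onto_line([1, 1], [3, 1])
print(p)  # [2.0, 2.0]

# the error e = b - p is orthogonal to a: a . e = 0
e = [3 - p[0], 1 - p[1]]
print(sum(ai * ei for ai, ei in zip([1, 1], e)))  # 0.0
```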
Rank one matrix A = uvᵀ ≠ 0. Column and row spaces = lines cu and cv.
Reduced row echelon form R = rref(A). Pivots = 1; zeros above and below pivots; the r nonzero rows of R give a basis for the row space of A.
Saddle point of f(x_1, ..., x_n). A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂x_i∂x_j = Hessian matrix) is indefinite.
Singular Value Decomposition (SVD) A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
Vector addition v + w = (v_1 + w_1, ..., v_n + w_n) = diagonal of the parallelogram on v and w.
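Component-wise addition and the length ||x|| = √(xᵀx) together make a tiny worked example (names illustrative):

```python
import math

def add(v, w):
    """Component-wise vector addition: the diagonal of the parallelogram on v and w."""
    return [vi + wi for vi, wi in zip(v, w)]

def length(x):
    """||x|| = sqrt(x^T x) -- Pythagoras in n dimensions."""
    return math.sqrt(sum(xi * xi for xi in x))

print(add([1, 2], [3, 4]))  # [4, 6]
print(length([3, 4]))       # 5.0
```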