- Chapter 1.1: Variables
- Chapter 1.2: The Language of Sets
- Chapter 1.3: The Language of Relations and Functions
- Chapter 2.1: Logical Form and Logical Equivalence
- Chapter 2.2: Conditional Statements
- Chapter 2.3: Valid and Invalid Arguments
- Chapter 2.4: Application: Digital Logic Circuits
- Chapter 2.5: Application: Number Systems and Circuits for Addition
- Chapter 3.1: Predicates and Quantified Statements I
- Chapter 3.2: Predicates and Quantified Statements II
- Chapter 3.3: Statements with Multiple Quantifiers
- Chapter 3.4: Arguments with Quantified Statements
- Chapter 4.1: Direct Proof and Counterexample I: Introduction
- Chapter 4.2: Direct Proof and Counterexample II: Rational Numbers
- Chapter 4.3: Direct Proof and Counterexample III: Divisibility
- Chapter 4.4: Direct Proof and Counterexample IV: Division into Cases and the Quotient-Remainder Theorem
- Chapter 4.5: Direct Proof and Counterexample V: Floor and Ceiling
- Chapter 4.6: Indirect Argument: Contradiction and Contraposition
- Chapter 4.7: Indirect Argument: Two Classical Theorems
- Chapter 4.8: Application: Algorithms
- Chapter 5.1: Sequences
- Chapter 5.2: Mathematical Induction I
- Chapter 5.3: Mathematical Induction II
- Chapter 5.4: Strong Mathematical Induction and the Well-Ordering Principle for the Integers
- Chapter 5.5: Application: Correctness of Algorithms
- Chapter 5.6: Defining Sequences Recursively
- Chapter 5.7: Solving Recurrence Relations by Iteration
- Chapter 5.8: Second-Order Linear Homogeneous Recurrence Relations with Constant Coefficients
- Chapter 5.9: General Recursive Definitions and Structural Induction
- Chapter 6.1: Set Theory: Definitions and the Element Method of Proof
- Chapter 6.2: Properties of Sets
- Chapter 6.3: Disproofs, Algebraic Proofs, and Boolean Algebras
- Chapter 6.4: Boolean Algebras, Russell's Paradox, and the Halting Problem
- Chapter 7.1: Functions Defined on General Sets
- Chapter 7.2: One-to-One and Onto, Inverse Functions
- Chapter 7.3: Composition of Functions
- Chapter 7.4: Cardinality with Applications to Computability
- Chapter 8.1: Relations on Sets
- Chapter 8.2: Reflexivity, Symmetry, and Transitivity
- Chapter 8.3: Equivalence Relations
- Chapter 8.4: Modular Arithmetic with Applications to Cryptography
- Chapter 8.5: Partial Order Relations
- Chapter 9.1: Introduction
- Chapter 9.2: Possibility Trees and the Multiplication Rule
- Chapter 9.3: Counting Elements of Disjoint Sets: The Addition Rule
- Chapter 9.4: The Pigeonhole Principle
- Chapter 9.5: Counting Subsets of a Set: Combinations
- Chapter 9.6: r-Combinations with Repetition Allowed
- Chapter 9.7: Pascal's Formula and the Binomial Theorem
- Chapter 9.8: Probability Axioms and Expected Value
- Chapter 9.9: Conditional Probability, Bayes' Formula, and Independent Events
- Chapter 10.1: Graphs: Definitions and Basic Properties
- Chapter 10.2: Trails, Paths, and Circuits
- Chapter 10.3: Matrix Representations of Graphs
- Chapter 10.4: Isomorphisms of Graphs
- Chapter 10.5: Trees
- Chapter 10.6: Rooted Trees
- Chapter 10.7: Spanning Trees and Shortest Paths
- Chapter 11.1: Real-Valued Functions of a Real Variable and Their Graphs
- Chapter 11.2: O-, Ω-, and Θ-Notations
- Chapter 11.3: Application: Analysis of Algorithm Efficiency I
- Chapter 11.4: Exponential and Logarithmic Functions: Graphs and Orders
- Chapter 11.5: Application: Analysis of Algorithm Efficiency II
- Chapter 12.1: Formal Languages and Regular Expressions
- Chapter 12.2: Finite-State Automata
- Chapter 12.3: Simplifying Finite-State Automata
Discrete Mathematics with Applications 4th Edition - Solutions by Chapter
Full solutions for Discrete Mathematics with Applications | 4th Edition
Complete solution x = x_p + x_n to Ax = b.
(Particular x_p) + (x_n in the nullspace).
Condition number cond(A).
cond(A) = c(A) = ||A|| ||A^-1|| = σ_max/σ_min. In Ax = b, the relative change ||δx||/||x|| is less than cond(A) times the relative change ||δb||/||b||. Condition numbers measure the sensitivity of the output to changes in the input.
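As a numeric illustration (a minimal pure-Python sketch with a made-up diagonal matrix, where the singular values are just the diagonal entries), the bound ||δx||/||x|| ≤ cond(A) · ||δb||/||b|| is actually attained when b lies along the largest singular direction and the perturbation δb along the smallest:

```python
from math import hypot

def solve_diag(s, b):              # x_i = b_i / s_i for A = diag(s)
    return [bi / si for si, bi in zip(s, b)]

s = [1.0, 1000.0]                  # singular values of A = diag(1, 1000)
cond = max(s) / min(s)             # sigma_max / sigma_min = 1000.0

b  = [0.0, 1000.0]                 # b along the large singular direction
x  = solve_diag(s, b)              # x = (0, 1)
db = [1.0, 0.0]                    # perturbation along the small direction
dx = solve_diag(s, db)             # dx = (1, 0)

rel_x = hypot(*dx) / hypot(*x)     # relative change in the output x
rel_b = hypot(*db) / hypot(*b)     # relative change in the input b
print(cond, rel_x / rel_b)         # the amplification ratio equals cond(A)
```

For a general (non-diagonal) matrix the same statement holds with σ_max/σ_min computed from the singular value decomposition.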
Cross product u × v in R^3:
Vector perpendicular to u and v, length ||u|| ||v|| |sin θ| = area of parallelogram, u × v = "determinant" of [i j k; u1 u2 u3; v1 v2 v3].
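Expanding that symbolic determinant along its first row gives the component formula, which a few lines of pure Python can verify on a simple made-up pair of vectors:

```python
def cross(u, v):
    # cofactor expansion of det([i j k; u1 u2 u3; v1 v2 v3])
    u1, u2, u3 = u
    v1, v2, v3 = v
    return (u2*v3 - u3*v2, u3*v1 - u1*v3, u1*v2 - u2*v1)

def dot(a, b):
    return sum(x*y for x, y in zip(a, b))

u, v = (1, 0, 0), (0, 2, 0)        # sides of a 1-by-2 rectangle in the xy-plane
w = cross(u, v)                    # (0, 0, 2)
print(w, dot(w, u), dot(w, v))     # w is perpendicular to both u and v
```

Here the length of w is 2, matching the area of the parallelogram spanned by u and v.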
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^-1| = 1/|A|.
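Two of these properties, the product rule |AB| = |A||B| and sign reversal under a row exchange, can be spot-checked on 2-by-2 matrices with ad − bc (a small sketch with made-up integer matrices):

```python
def det2(A):                       # 2x2 determinant: ad - bc
    (a, b), (c, d) = A
    return a*d - b*c

def mul2(A, B):                    # 2x2 matrix product
    return [[sum(A[i][k]*B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]
print(det2(mul2(A, B)) == det2(A) * det2(B))   # product rule |AB| = |A||B|
print(det2([A[1], A[0]]) == -det2(A))          # row exchange reverses the sign
```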
Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
Eigenvalue λ and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
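For a 2-by-2 matrix, det(A − λI) = 0 is the quadratic λ² − (trace)λ + det = 0, so the eigenvalues come straight from the quadratic formula. A minimal pure-Python sketch (the symmetric matrix is made up, chosen so the eigenvalues are real):

```python
from math import sqrt

def eig2(A):
    # det(A - lambda I) = lambda^2 - tr*lambda + det for a 2x2 matrix
    (a, b), (c, d) = A
    tr, det = a + d, a*d - b*c
    disc = sqrt(tr*tr - 4*det)      # assumes real eigenvalues
    return (tr + disc) / 2, (tr - disc) / 2

A = [[2, 1], [1, 2]]
lam1, lam2 = eig2(A)
print(lam1, lam2)                   # 3.0 and 1.0

# check Ax = lambda x for x = (1, 1), an eigenvector for lambda = 3
x = (1, 1)
Ax = (A[0][0]*x[0] + A[0][1]*x[1], A[1][0]*x[0] + A[1][1]*x[1])
print(Ax == (lam1*x[0], lam1*x[1]))  # True
```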
Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).
Use A^H for complex A.
Free columns of A.
Columns without pivots; these are combinations of earlier columns.
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
Hypercube matrix P.
Row n + 1 counts corners, edges, faces, ... of a cube in R^n.
Kirchhoff's Laws.
Current Law: net current (in minus out) is zero at each node. Voltage Law: potential differences (voltage drops) add to zero around any closed loop.
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
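As a concrete instance, fitting a line c + dt to a few data points means solving the normal equations A^T A x̂ = A^T b, where the columns of A are all-ones and the t-values. A small pure-Python sketch (the three data points are made up, and the 2-by-2 system is solved by Cramer's rule):

```python
pts = [(0, 1), (1, 2), (2, 4)]          # hypothetical data (t, y)
A = [[1, t] for t, _ in pts]            # columns: ones and t
b = [y for _, y in pts]

def col(M, j): return [row[j] for row in M]
def dot(u, v): return sum(x*y for x, y in zip(u, v))

# normal equations A^T A xhat = A^T b as a 2x2 system G xhat = h
G = [[dot(col(A, i), col(A, j)) for j in range(2)] for i in range(2)]
h = [dot(col(A, i), b) for i in range(2)]
detG = G[0][0]*G[1][1] - G[0][1]*G[1][0]
xhat = [(G[1][1]*h[0] - G[0][1]*h[1]) / detG,     # Cramer's rule
        (G[0][0]*h[1] - G[1][0]*h[0]) / detG]

# the error e = b - A xhat is orthogonal to both columns of A
e = [bi - (A[i][0]*xhat[0] + A[i][1]*xhat[1]) for i, bi in enumerate(b)]
print(xhat, dot(e, col(A, 0)), dot(e, col(A, 1)))  # dot products ~ 0
```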
Left inverse A^+.
If A has full column rank n, then A^+ = (A^T A)^-1 A^T has A^+ A = I_n.
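The defining property A^+ A = I_n can be checked directly on a small tall matrix (a pure-Python sketch with a made-up 3-by-2 matrix of full column rank; the 2-by-2 inverse of A^T A is written out by the adjugate formula):

```python
A = [[1, 0], [0, 1], [1, 1]]               # 3x2, full column rank

def T(M):  return [list(r) for r in zip(*M)]
def mul(X, Y):
    return [[sum(x*y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

G = mul(T(A), A)                           # A^T A = [[2, 1], [1, 2]]
d = G[0][0]*G[1][1] - G[0][1]*G[1][0]      # det(A^T A) = 3
Ginv = [[ G[1][1]/d, -G[0][1]/d],          # 2x2 inverse via the adjugate
        [-G[1][0]/d,  G[0][0]/d]]
Aplus = mul(Ginv, T(A))                    # A^+ = (A^T A)^-1 A^T
print(mul(Aplus, A))                       # ~ [[1, 0], [0, 1]] = I_2
```

Note A A^+ is not the identity; it is the 3-by-3 projection onto the column space of A.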
Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
Orthonormal vectors q1, ..., qn.
Dot products are q_i^T q_j = 0 if i ≠ j and q_i^T q_i = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T q_j) q_j.
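The expansion v = Σ (v^T q_j) q_j is easy to see numerically: project v onto each basis vector and add the pieces back up. A minimal pure-Python sketch using a made-up orthonormal basis of R^2 (a 3-4-5 rotation of the standard basis):

```python
def dot(a, b): return sum(x*y for x, y in zip(a, b))

q1, q2 = (0.6, 0.8), (-0.8, 0.6)            # rotated orthonormal basis
print(dot(q1, q1), dot(q2, q2), dot(q1, q2))  # ~ 1, 1, 0

v = (1.0, 2.0)
c1, c2 = dot(v, q1), dot(v, q2)             # coefficients v^T q_j
recon = tuple(c1*a + c2*b for a, b in zip(q1, q2))
print(recon)                                 # recovers v (up to rounding)
```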
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
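Building the outer product uv^T explicitly shows why the rank is one: every row is a multiple of v, so every 2-by-2 minor vanishes. A short sketch with made-up integer vectors:

```python
u, v = [1, 2, 3], [4, 5]
A = [[ui*vj for vj in v] for ui in u]     # outer product u v^T, 3x2
print(A)                                  # [[4, 5], [8, 10], [12, 15]]

# rank 1: every 2x2 minor of A is zero
minors = [A[i][0]*A[j][1] - A[i][1]*A[j][0]
          for i in range(3) for j in range(i + 1, 3)]
print(minors)                             # [0, 0, 0]
```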
Transpose matrix A^T.
Entries (A^T)_ij = A_ji. A^T is n by m, A^T A is square, symmetric, positive semidefinite. The transposes of AB and A^-1 are B^T A^T and (A^T)^-1.
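The reversal rule (AB)^T = B^T A^T is quick to confirm on small matrices; note the order of the factors flips. A minimal pure-Python sketch with made-up 2-by-2 matrices:

```python
def T(M): return [list(r) for r in zip(*M)]   # transpose: swap rows/columns
def mul(X, Y):
    return [[sum(x*y for x, y in zip(row, col)) for col in zip(*Y)]
            for row in X]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(T(mul(A, B)) == mul(T(B), T(A)))        # True: (AB)^T = B^T A^T
print(T(mul(A, B)) == mul(T(A), T(B)))        # False: order matters
```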
Unitary matrix U^H = Ū^T = U^-1.
Orthonormal columns (complex analog of Q).
Vector v in R^n.
Sequence of n real numbers v = (v1, ..., vn) = point in R^n.