- Chapter 1.1: Statements, Symbolic Representation, and Tautologies
- Chapter 1.2: Propositional Logic
- Chapter 1.3: Quantifiers, Predicates, and Validity
- Chapter 1.4: Predicate Logic
- Chapter 1.5: Logic Programming
- Chapter 1.6: Proof of Correctness
- Chapter 2.1: Proof Techniques
- Chapter 2.2: Induction
- Chapter 2.3: More on Proof of Correctness
- Chapter 2.4: Number Theory
- Chapter 3.1: Recursive Definitions
- Chapter 3.2: Recurrence Relations
- Chapter 3.3: Analysis of Algorithms
- Chapter 4.1: Sets
- Chapter 4.2: Counting
- Chapter 4.3: Principle of Inclusion and Exclusion; Pigeonhole Principle
- Chapter 4.4: Permutations and Combinations
- Chapter 5.1: Relations
- Chapter 5.2: Topological Sorting
- Chapter 5.3: Relations and Databases
- Chapter 5.4: Functions
- Chapter 5.5: Order of Magnitude
- Chapter 5.6: The Mighty Mod Function
- Chapter 5.7: Matrices
- Chapter 6.1: Graphs and Their Representations
- Chapter 6.2: Trees and Their Representations
- Chapter 6.3: Decision Trees
- Chapter 6.4: Huffman Codes
- Chapter 7.1: Directed Graphs and Binary Relations; Warshall's Algorithm
- Chapter 7.2: Euler Path and Hamiltonian Circuit
- Chapter 7.3: Shortest Path and Minimal Spanning Tree
- Chapter 7.4: Traversal Algorithms
- Chapter 7.5: Articulation Points and Computer Networks
- Chapter 8.1: Boolean Algebra Structure
- Chapter 8.2: Logic Networks
- Chapter 8.3: Minimization
- Chapter 9.1: Algebraic Structures
- Chapter 9.2: Coding Theory
- Chapter 9.3: Finite-State Machines
- Chapter 9.4: Turing Machines
- Chapter 9.5: Formal Languages
Mathematical Structures for Computer Science 7th Edition - Solutions by Chapter
Characteristic equation det(A − λI) = 0.
The n roots are the eigenvalues of A.
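As a concrete illustration (my own example, not from the glossary): for a 2×2 matrix, det(A − λI) = 0 expands to λ² − (trace A)λ + det A = 0, so both eigenvalues come from the quadratic formula.

```python
import math

# A = [[a, b], [c, d]]; det(A - lambda*I) = lambda^2 - (a+d)*lambda + (ad - bc)
a, b, c, d = 2.0, 1.0, 1.0, 2.0

trace = a + d           # coefficient of -lambda
det = a * d - b * c     # constant term

# Roots of lambda^2 - trace*lambda + det = 0 via the quadratic formula
disc = math.sqrt(trace * trace - 4 * det)
lam1 = (trace + disc) / 2   # larger eigenvalue
lam2 = (trace - disc) / 2   # smaller eigenvalue

print(lam1, lam2)  # 3.0 1.0 for this symmetric example
```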
Complex conjugate z̄ = a − ib for any complex number z = a + ib. Then z̄z = |z|².
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det(B_j) / det(A).
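A minimal sketch of Cramer's Rule for a 2×2 system (the matrix and right side are my own example):

```python
# Cramer's Rule for Ax = b, 2x2 case: x_j = det(B_j) / det(A)
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

A = [[2.0, 1.0], [1.0, 3.0]]
b = [5.0, 10.0]

# B_j replaces column j of A with b
B1 = [[b[0], A[0][1]], [b[1], A[1][1]]]
B2 = [[A[0][0], b[0]], [A[1][0], b[1]]]

x = [det2(B1) / det2(A), det2(B2) / det2(A)]
print(x)  # [1.0, 3.0]
```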
Exponential e^(At) = I + At + (At)²/2! + ...
has derivative Ae^(At); e^(At)u(0) solves u′ = Au.
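The series definition can be checked numerically. Below is a truncated-series sketch for 2×2 matrices (my own example): for A = [[0, 1], [−1, 0]], the exponential e^(At) is the rotation [[cos t, sin t], [−sin t, cos t]], since A² = −I.

```python
import math

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)] for i in range(2)]

def expm(A, t, terms=30):
    """Truncated series e^(At) = I + At + (At)^2/2! + ... for 2x2 matrices."""
    At = [[A[i][j] * t for j in range(2)] for i in range(2)]
    result = [[1.0, 0.0], [0.0, 1.0]]   # running sum, starts at I
    term = [[1.0, 0.0], [0.0, 1.0]]     # current term (At)^k / k!
    for k in range(1, terms):
        term = mat_mul(term, At)
        term = [[term[i][j] / k for j in range(2)] for i in range(2)]
        result = [[result[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return result

A = [[0.0, 1.0], [-1.0, 0.0]]
E = expm(A, 0.5)
print(E[0][0], math.cos(0.5))  # both approximately 0.8776
```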
Four Fundamental Subspaces C(A), N(A), C(Aᵀ), N(Aᵀ).
Use Aᴴ (conjugate transpose) for complex A.
Fundamental Theorem.
The nullspace N(A) and row space C(Aᵀ) are orthogonal complements in Rⁿ (the perpendicularity comes from Ax = 0), with dimensions n − r and r. Applied to Aᵀ, the column space C(A) is the orthogonal complement of N(Aᵀ) in Rᵐ.
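A tiny numerical check of this orthogonality (the rank-1 matrix is my own example): any solution of Ax = 0 is perpendicular to every row of A.

```python
# Row space / nullspace orthogonality, illustrated on a rank-1 matrix
A = [[1, 2], [2, 4]]
x = [2, -1]  # solves Ax = 0: 1*2 + 2*(-1) = 0 and 2*2 + 4*(-1) = 0

# Ax = 0 says exactly that x is perpendicular to each row of A
for row in A:
    dot = row[0] * x[0] + row[1] * x[1]
    print(dot)  # 0 for both rows
```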
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
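A short sketch of classical Gram-Schmidt producing A = QR (the 3×2 input matrix is my own example; Q is stored as a list of orthonormal columns):

```python
import math

def qr_gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A (list of rows); returns Q, R with A = QR."""
    m, n = len(A), len(A[0])
    cols = [[A[i][j] for i in range(m)] for j in range(n)]  # columns of A
    Q, R = [], [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = cols[j][:]
        for i in range(len(Q)):
            # subtract the projection of column j onto each earlier q_i
            R[i][j] = sum(Q[i][k] * cols[j][k] for k in range(m))
            v = [v[k] - R[i][j] * Q[i][k] for k in range(m)]
        R[j][j] = math.sqrt(sum(vk * vk for vk in v))  # convention: diag(R) > 0
        Q.append([vk / R[j][j] for vk in v])
    return Q, R

Q, R = qr_gram_schmidt([[1.0, 1.0], [1.0, 0.0], [0.0, 1.0]])
print(sum(Q[0][k] * Q[1][k] for k in range(3)))  # q1 . q2 is (numerically) 0
```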
Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).
Inverse matrix A⁻¹.
Square matrix with A⁻¹A = I and AA⁻¹ = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and Aᵀ are B⁻¹A⁻¹ and (A⁻¹)ᵀ. Cofactor formula: (A⁻¹)ᵢⱼ = Cⱼᵢ / det A.
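The cofactor formula written out for a 2×2 example of my own, using exact rational arithmetic so the check A·A⁻¹ = I is exact:

```python
from fractions import Fraction as F

# Cofactor formula (A^-1)_ij = C_ji / det(A), 2x2 case:
# C11 = d, C12 = -c, C21 = -b, C22 = a, so the adjugate is [[d, -b], [-c, a]]
A = [[F(4), F(7)], [F(2), F(6)]]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]  # 10

Ainv = [[A[1][1] / det, -A[0][1] / det],
        [-A[1][0] / det, A[0][0] / det]]

# Check A * Ainv = I exactly
prod = [[sum(A[i][k] * Ainv[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(prod == [[1, 0], [0, 1]])  # True
```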
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
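A worked sketch of the normal equations (the three data points are my own example): fit the line b ≈ c + d·t through (0, 1), (1, 2), (2, 4), then confirm the residual is orthogonal to both columns of A.

```python
from fractions import Fraction as F

A = [[F(1), F(0)], [F(1), F(1)], [F(1), F(2)]]  # columns: ones, and t-values
b = [F(1), F(2), F(4)]

# Normal equations: (A^T A) xhat = A^T b, a 2x2 system solved directly
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)] for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
xhat = [(AtA[1][1] * Atb[0] - AtA[0][1] * Atb[1]) / det,
        (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / det]

# e = b - A*xhat is orthogonal to every column of A
e = [b[k] - A[k][0] * xhat[0] - A[k][1] * xhat[1] for k in range(3)]
print([sum(A[k][j] * e[k] for k in range(3)) for j in range(2)])  # both dot products are 0
```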
Left nullspace N(Aᵀ).
Nullspace of Aᵀ = "left nullspace" of A, because yᵀA = 0ᵀ.
Linear combination cv + dw or Σ cⱼvⱼ.
Vector addition and scalar multiplication.
Multiplier ℓᵢⱼ.
The pivot row j is multiplied by ℓᵢⱼ and subtracted from row i to eliminate the (i, j) entry: ℓᵢⱼ = (entry to eliminate) / (jth pivot).
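One elimination step made concrete (the 2×2 matrix is my own example), using exact fractions:

```python
from fractions import Fraction as F

# Eliminate the (2,1) entry: l_21 = (entry to eliminate) / (1st pivot),
# then subtract l_21 * (row 1) from row 2
A = [[F(2), F(4)], [F(3), F(7)]]

l21 = A[1][0] / A[0][0]  # 3/2
A[1] = [A[1][k] - l21 * A[0][k] for k in range(2)]

print(A[1])  # entry (2,1) is now 0; the second pivot is 7 - (3/2)*4 = 1
```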
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Right inverse A⁺.
If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = Iₘ.
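A sketch for a 2×3 matrix of full row rank (the matrix is my own example); AAᵀ is 2×2 and easily inverted, and the product AA⁺ comes out exactly I₂:

```python
from fractions import Fraction as F

# Right inverse A+ = A^T (A A^T)^(-1), with A having full row rank m = 2
A = [[F(1), F(0), F(1)], [F(0), F(1), F(1)]]

AAT = [[sum(A[i][k] * A[j][k] for k in range(3)) for j in range(2)] for i in range(2)]
det = AAT[0][0] * AAT[1][1] - AAT[0][1] * AAT[1][0]
AATinv = [[AAT[1][1] / det, -AAT[0][1] / det],
          [-AAT[1][0] / det, AAT[0][0] / det]]

# A+ is 3x2: row k of A+ = (column k of A) times (A A^T)^(-1)
Aplus = [[sum(A[i][k] * AATinv[i][j] for i in range(2)) for j in range(2)] for k in range(3)]

# Check A A+ = I_2 exactly
AAplus = [[sum(A[i][k] * Aplus[k][j] for k in range(3)) for j in range(2)] for i in range(2)]
print(AAplus == [[1, 0], [0, 1]])  # True
```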
Saddle point of f(x₁, ..., xₙ).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xᵢ∂xⱼ = Hessian matrix) is indefinite.
Schur complement S = D − CA⁻¹B.
Appears in block elimination on the block matrix [A B; C D].
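A scalar-block illustration of my own: when A, B, C, D are 1×1 blocks, eliminating the (2,1) block leaves the Schur complement S = D − CA⁻¹B as the second pivot, and det [A B; C D] = A·S.

```python
from fractions import Fraction as F

a, b, c, d = F(2), F(3), F(4), F(11)  # 1x1 blocks of [[a, b], [c, d]]

S = d - c * b / a        # Schur complement: subtract (c/a) * (first row) from the second
detM = a * d - b * c     # determinant of the full matrix

print(S, detM, a * S)  # det M = a * S, the product of the pivots
```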
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
Toeplitz matrix.
Constant down each diagonal = time-invariant (shift-invariant) filter.