- 59.59.1: Let P be the poset in the figure. Please calculate: a b c d e f g h...
- 59.59.2: Consider the poset (Z, ≤) (ordinary less-than-or-equal-to). For x, y...
- 59.59.3: Let P = (X, ≤) be a lattice. Prove that P is a linear order if and o...
- 59.59.4: Prove Proposition 59.2.
- 59.59.5: The following statement is false: Every lattice has a maximum eleme...
- 59.59.6: Let P = (X, ≤) be a lattice and let m be an element of the lattice. ...
- 59.59.7: In Theorem 12.3, we showed that ∪ and ∩ satisfy the distributive pr...
- 59.59.8: Consider the poset (Z × Z, ≤) where ≤ is the product order; that is, (x,...
- 59.59.9: Consider the following infinite poset P. The elements of P are vari...
- 59.59.10: Let P be a lattice with minimum element b and maximum element t. a....
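The exercises above all turn on meets and joins. As a quick illustration (my own sketch, not part of the solutions), the product order on Z × Z from Exercise 59.59.8 has coordinatewise min and max as its meet and join, which is one way to see it is a lattice:

```python
# Sketch (not from the text): in the product order on Z x Z,
# (x1, y1) <= (x2, y2) iff x1 <= x2 and y1 <= y2.
# Coordinatewise min/max then serve as meet and join.

def leq(p, q):
    """Product order on Z x Z."""
    return p[0] <= q[0] and p[1] <= q[1]

def meet(p, q):
    """Greatest lower bound: coordinatewise minimum."""
    return (min(p[0], q[0]), min(p[1], q[1]))

def join(p, q):
    """Least upper bound: coordinatewise maximum."""
    return (max(p[0], q[0]), max(p[1], q[1]))

p, q = (1, 4), (3, 2)        # incomparable pair, chosen for illustration
m, j = meet(p, q), join(p, q)
assert leq(m, p) and leq(m, q)    # meet is below both
assert leq(p, j) and leq(q, j)    # join is above both
```

Note that p and q here are incomparable, yet their meet (1, 2) and join (3, 4) still exist, which is exactly what makes the product order a lattice rather than a linear order.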
Solutions for Chapter 59: Lattices
Full solutions for Mathematics: A Discrete Introduction | 3rd Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication ofAB is allowed if the block shapes permit.
Companion matrix.
Put c1, ..., cn in row n and put n − 1 ones just above the main diagonal. Then det(A − λI) = ±(c1 + c2λ + c3λ^2 + ... + cnλ^(n−1) − λ^n).
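A quick numerical check of this determinant formula for n = 3 (the coefficients c1, c2, c3 are made up):

```python
# For the 3-by-3 companion matrix with c1, c2, c3 in row 3 and ones
# above the diagonal, det(A - t*I) = c1 + c2*t + c3*t**2 - t**3.

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

c1, c2, c3 = 2, -5, 4        # arbitrary illustrative coefficients
A = [[0, 1, 0],
     [0, 0, 1],
     [c1, c2, c3]]

for t in (-2, 0, 1, 3):
    AtI = [[A[i][j] - (t if i == j else 0) for j in range(3)]
           for i in range(3)]
    assert det3(AtI) == c1 + c2*t + c3*t**2 - t**3
```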
Covariance matrix Σ.
When random variables x_i have mean = average value = 0, their covariances Σ_ij are the averages of x_i x_j. With means x̄_i, the matrix Σ = mean of (x − x̄)(x − x̄)^T is positive (semi)definite; Σ is diagonal if the x_i are independent.
Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 AS = Λ = eigenvalue matrix.
Diagonalization.
Λ = S^-1 AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
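A 2-by-2 illustration (the matrix and its eigenvectors are my own example), verified with exact arithmetic:

```python
from fractions import Fraction

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Exact inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

# A has eigenvalues 5 and 2 with eigenvectors (1, 1) and (1, -2).
A = [[4, 1], [2, 3]]
S = [[1, 1], [1, -2]]     # eigenvector columns
L = [[5, 0], [0, 2]]      # eigenvalue matrix Lambda

assert matmul(matmul(S, L), inv2(S)) == A        # A = S Lambda S^-1
A3 = matmul(matmul(A, A), A)
L3 = [[5**3, 0], [0, 2**3]]                      # Lambda^3: cube the diagonal
assert matmul(matmul(S, L3), inv2(S)) == A3      # A^3 = S Lambda^3 S^-1
```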
Eigenvalue A and eigenvector x.
Ax = λx with x ≠ 0, so det(A − λI) = 0.
Fibonacci numbers.
0, 1, 1, 2, 3, 5, ... satisfy F_n = F_{n−1} + F_{n−2} = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
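A direct check of the closed form against the recurrence, and of the growth rate:

```python
from math import sqrt

# Eigenvalues of the Fibonacci matrix [[1, 1], [1, 0]]
l1 = (1 + sqrt(5)) / 2    # largest eigenvalue: the growth rate
l2 = (1 - sqrt(5)) / 2

# Build F_0 .. F_19 by the recurrence F_n = F_{n-1} + F_{n-2}
fib = [0, 1]
for _ in range(2, 20):
    fib.append(fib[-1] + fib[-2])

# Closed form (l1**n - l2**n) / (l1 - l2) matches the recurrence
for n, F in enumerate(fib):
    assert round((l1**n - l2**n) / (l1 - l2)) == F

# The ratio F_{n+1} / F_n approaches l1
assert abs(fib[19] / fib[18] - l1) < 1e-6
```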
Gram-Schmidt orthogonalization A = QR.
Independent columns in A, orthonormal columns in Q. Each column q_j of Q is a combination of the first j columns of A (and conversely, so R is upper triangular). Convention: diag(R) > 0.
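A minimal classical Gram-Schmidt sketch (the input columns are made up), checking the properties the entry lists: orthonormal columns in Q, upper triangular R with positive diagonal, and A = QR:

```python
from math import sqrt

def gram_schmidt(cols):
    """Classical Gram-Schmidt on a list of column vectors; returns (Q, R)."""
    n = len(cols)
    Q, R = [], [[0.0] * n for _ in range(n)]
    for j, a in enumerate(cols):
        v = list(a)
        for i, q in enumerate(Q):
            R[i][j] = sum(qk * ak for qk, ak in zip(q, a))   # projection on q_i
            v = [vk - R[i][j] * qk for vk, qk in zip(v, q)]  # subtract it off
        R[j][j] = sqrt(sum(vk * vk for vk in v))   # convention: diag(R) > 0
        Q.append([vk / R[j][j] for vk in v])
    return Q, R

# Two independent columns in R^3 (illustrative values)
A_cols = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
Q, R = gram_schmidt(A_cols)

# Columns of Q are orthonormal
assert abs(sum(a * b for a, b in zip(Q[0], Q[1]))) < 1e-12
assert abs(sum(q * q for q in Q[0]) - 1) < 1e-12
# A = QR: column j of A is a combination of the first j columns of Q
for j in range(2):
    for k in range(3):
        rebuilt = sum(Q[i][k] * R[i][j] for i in range(2))
        assert abs(rebuilt - A_cols[j][k]) < 1e-12
```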
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
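A small worked example (the data points are made up): fitting the best line c + d·t through three points by solving the 2-by-2 normal equations exactly, then checking that the error is orthogonal to the columns of A:

```python
from fractions import Fraction

# Best line c + d*t through the points (0, 1), (1, 1), (2, 3)
A = [[1, 0], [1, 1], [1, 2]]     # columns: all-ones and the t-values
b = [1, 1, 3]

# Form the normal equations A^T A xhat = A^T b
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system exactly by Cramer's rule
det = Fraction(AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0])
xhat = [(AtA[1][1] * Atb[0] - AtA[0][1] * Atb[1]) / det,
        (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / det]

# e = b - A xhat is orthogonal to both columns of A
e = [b[k] - (A[k][0] * xhat[0] + A[k][1] * xhat[1]) for k in range(3)]
for j in range(2):
    assert sum(A[k][j] * e[k] for k in range(3)) == 0
```

Here the best line comes out as c = 2/3, d = 1; the residual e = (1/3, −2/3, 1/3) has exact zero dot product with each column.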
Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
Rank one matrix A = u v^T ≠ 0.
Column and row spaces = lines cu and cv.
Rank r(A).
= number of pivots = dimension of column space = dimension of row space.
Reflection matrix (Householder) Q = I − 2 u u^T.
Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
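A direct check of these three properties (the unit vector u and test vectors are made up):

```python
from math import sqrt

# Q = I - 2 u u^T reflects across the plane u^T x = 0
u = [1 / sqrt(2), 1 / sqrt(2), 0.0]    # unit vector
Q = [[(1.0 if i == j else 0.0) - 2 * u[i] * u[j] for j in range(3)]
     for i in range(3)]

def apply(M, x):
    """Matrix-vector product for 3x3 M."""
    return [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]

# Qu = -u: the normal direction flips
assert all(abs(a + b) < 1e-12 for a, b in zip(apply(Q, u), u))

# A vector in the mirror plane (u^T x = 0) is unchanged
x = [1.0, -1.0, 5.0]
assert abs(sum(ui * xi for ui, xi in zip(u, x))) < 1e-12
assert all(abs(a - b) < 1e-12 for a, b in zip(apply(Q, x), x))

# Q^T = Q^-1 = Q: reflecting twice returns every vector
y = [3.0, 1.0, 4.0]
yy = apply(Q, apply(Q, y))
assert all(abs(a - b) < 1e-12 for a, b in zip(yy, y))
```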
Row picture of Ax = b.
Each equation gives a plane in R^n; the planes intersect at x.
Schwarz inequality.
|v · w| ≤ ||v|| ||w||. Then |v^T A w|^2 ≤ (v^T A v)(w^T A w) for positive definite A.
Similar matrices A and B.
Every B = M^-1 AM has the same eigenvalues as A.
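A 2-by-2 check (A and M are made-up values): for a 2-by-2 matrix the eigenvalues are determined by the trace and determinant, so matching both means B = M^-1 AM and A share eigenvalues:

```python
from fractions import Fraction

def matmul2(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(M):
    """Exact inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [0, 3]]      # eigenvalues 2 and 3
M = [[1, 2], [1, 3]]      # any invertible M
B = matmul2(matmul2(inv2(M), A), M)    # similar to A

trace = lambda X: X[0][0] + X[1][1]
det = lambda X: X[0][0] * X[1][1] - X[0][1] * X[1][0]
assert trace(B) == trace(A) == 5      # same sum of eigenvalues
assert det(B) == det(A) == 6          # same product of eigenvalues
```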
Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!
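A toy illustration of "minimum cost at a corner" (the LP below is entirely made up, and corner enumeration stands in for the simplex path from corner to corner):

```python
import random

# Toy LP: minimize x1 + x2 subject to
#   x1 + 2*x2 >= 4,   3*x1 + x2 >= 6,   x1, x2 >= 0.
TOL = 1e-9    # small tolerance for floating-point boundary points
feasible = lambda x1, x2: (x1 >= -TOL and x2 >= -TOL
                           and x1 + 2 * x2 >= 4 - TOL
                           and 3 * x1 + x2 >= 6 - TOL)
cost = lambda x1, x2: x1 + x2

# Corners of the feasible set, from intersecting pairs of constraints
corners = [(4.0, 0.0), (8 / 5, 6 / 5), (0.0, 6.0)]
assert all(feasible(*c) for c in corners)
best = min(cost(*c) for c in corners)    # 2.8, at the corner (8/5, 6/5)

# No sampled feasible point beats the best corner
random.seed(0)
for _ in range(1000):
    x1, x2 = random.uniform(0, 10), random.uniform(0, 10)
    if feasible(x1, x2):
        assert cost(x1, x2) >= best - 1e-6
```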
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.