- 49.49.1: Let G be the graph in the figure. How many different paths a...
- 49.49.2: Is concatenation a commutative operation?
- 49.49.3: Prove that Kn is connected
- 49.49.4: Let n ≥ 2 be an integer. Form a graph Gn whose vertices are all the t...
- 49.49.5: Consider the following (incorrect) restatement of the definition of...
- 49.49.6: Let G be a graph. A path P in G that contains all the vertices of G...
- 49.49.7: How many Hamiltonian paths (see previous problem for definition) do...
- 49.49.8: Mouse and cheese. A block of cheese is made up of 3 × 3 × 3 cubes as in...
- 49.49.9: Consider the is-connected-to relation on the vertices of a graph. S...
- 49.49.10: Let G be a graph. Prove that G or the complement of G (or both) must be connected.
- 49.49.11: Let G be a graph with n ≥ 2 vertices. Prove that if δ(G) ≥ n/2, then G...
- 49.49.12: Let G be a graph with n ≥ 2 vertices. a. Prove that if G has at least...
- 49.49.13: Let G be a graph and let v, w ∈ V(G). The distance from v to w is ...
- 49.49.14: For those who have studied linear algebra. Let A be the adjacency m...
- 49.49.15: Let n and k be integers with 1 ≤ k < n. Form a graph G whose vertices...
Solutions for Chapter 49: Connection
Full solutions for Mathematics: A Discrete Introduction | 3rd Edition
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
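The rank test in this entry can be checked numerically. A minimal numpy sketch (the matrices and right-hand sides are illustrative):

```python
import numpy as np

# Solvability test: Ax = b is solvable exactly when the augmented
# matrix [A b] has the same rank as A (b is in the column space).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])         # rank 1: second row is twice the first
b_good = np.array([[3.0], [6.0]])  # in the column space of A
b_bad  = np.array([[3.0], [7.0]])  # not in the column space

rank_A = np.linalg.matrix_rank(A)
solvable   = np.linalg.matrix_rank(np.hstack([A, b_good])) == rank_A
unsolvable = np.linalg.matrix_rank(np.hstack([A, b_bad]))  != rank_A
```

Appending a consistent b leaves the rank unchanged; an inconsistent b raises it by one.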
Basis for V.
Independent vectors v1, ..., vd whose linear combinations give each vector in V as v = c1 v1 + ... + cd vd. Each basis gives unique c's. A vector space has many bases!
Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
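Block multiplication can be verified directly: cut A and B into compatible 2×2 blocks and multiply the blocks like scalars. A numpy sketch with illustrative matrices:

```python
import numpy as np

# Block multiplication: with compatible cuts,
# (AB) block (i,j) = sum over k of A_ik @ B_kj.
A = np.arange(16.0).reshape(4, 4)
B = np.arange(16.0).reshape(4, 4)[::-1]

A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

C_blocks = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])
assert np.allclose(C_blocks, A @ B)   # same answer as ordinary AB
```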
Circulant matrix C.
Constant diagonals wrap around as in cyclic shift S. Every circulant is c0 I + c1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors in the Fourier matrix F.
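Both claims in this entry can be tested: build C from powers of the shift S, compare Cx with circular convolution, and compare the eigenvalues of C with the DFT of c. A numpy sketch (c and x are illustrative):

```python
import numpy as np

# Circulant from shifts: C = c0*I + c1*S + ... + c_{n-1}*S^{n-1}.
c = np.array([2.0, 1.0, 0.0, 3.0])
n = len(c)
S = np.roll(np.eye(n), 1, axis=0)    # cyclic shift: (S x)[i] = x[i-1 mod n]
C = sum(c[k] * np.linalg.matrix_power(S, k) for k in range(n))

# C @ x is the circular convolution c * x (computed here via the FFT).
x = np.array([1.0, 2.0, 3.0, 4.0])
circ_conv = np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
assert np.allclose(C @ x, circ_conv)

# Eigenvalues of C are the DFT values of c (eigenvectors: Fourier matrix).
assert np.allclose(np.sort_complex(np.linalg.eigvals(C)),
                   np.sort_complex(np.fft.fft(c)))
```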
Diagonalization Λ = S^{-1} A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^{-1}.
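Diagonalization is easy to check numerically: numpy's eig returns the eigenvalues and the eigenvector matrix S. A sketch with an illustrative matrix:

```python
import numpy as np

# Diagonalization: columns of S are eigenvectors, Lam is diagonal.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)
S_inv = np.linalg.inv(S)

assert np.allclose(A, S @ Lam @ S_inv)                 # A = S Lam S^-1
A5 = S @ np.linalg.matrix_power(Lam, 5) @ S_inv        # A^5 from Lam^5
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```

Powers of A reduce to powers of the diagonal Λ, which is the whole point of diagonalizing.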
Fibonacci numbers 0, 1, 1, 2, 3, 5, ...
They satisfy Fn = Fn-1 + Fn-2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1 1], [1 0]].
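Both formulas can be compared directly: Fn from powers of the Fibonacci matrix against the eigenvalue closed form. A numpy sketch:

```python
import numpy as np

# Powers of [[1,1],[1,0]] generate Fibonacci numbers:
# the top-right entry of F^n is F_n (with F_0 = 0, F_1 = 1).
F = np.array([[1, 1],
              [1, 0]], dtype=np.int64)

def fib(n):
    return np.linalg.matrix_power(F, n)[0, 1]

# Closed form from the eigenvalues lam1, lam2 of F.
lam1 = (1 + 5 ** 0.5) / 2
lam2 = (1 - 5 ** 0.5) / 2
n = 20
closed_form = (lam1 ** n - lam2 ** n) / (lam1 - lam2)

assert fib(n) == 6765
assert abs(closed_form - fib(n)) < 1e-6
```

Since |λ2| < 1, the λ2 term dies out and Fn grows like λ1^n.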
Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0, with dimensions r and n − r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
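The complementary dimensions and the perpendicularity can both be checked with the SVD, whose trailing right singular vectors span N(A). A numpy sketch with an illustrative rank-2 matrix:

```python
import numpy as np

# Row space and nullspace are orthogonal complements in R^n.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],     # twice the first row: rank is 2
              [1.0, 0.0, 1.0]])
n = A.shape[1]

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))         # rank = number of nonzero singular values
null_basis = Vt[r:].T              # rows r..n-1 of V^T span N(A)

assert r + null_basis.shape[1] == n     # dim C(A^T) + dim N(A) = n
assert np.allclose(A @ null_basis, 0)   # every row of A ⟂ the nullspace
```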
Least squares solution x̂.
The vector x̂ that minimizes the error ||e||^2 solves A^T A x̂ = A^T b. Then e = b − A x̂ is orthogonal to all columns of A.
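The normal equations and the orthogonality of the error are easy to verify with numpy's least-squares solver. A sketch fitting a line through three illustrative points:

```python
import numpy as np

# Least squares: fit C + D*t through (0,6), (1,0), (2,0).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

xhat, *_ = np.linalg.lstsq(A, b, rcond=None)
e = b - A @ xhat                              # error vector

assert np.allclose(A.T @ A @ xhat, A.T @ b)   # normal equations A^T A xhat = A^T b
assert np.allclose(A.T @ e, 0)                # e is orthogonal to all columns of A
```

The best line here is 5 − 3t, so xhat = (5, −3).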
Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives Σ ci vi = 0.
Multiplicities AM and GM.
The algebraic multiplicity AM of λ is the number of times λ appears as a root of det(A − λI) = 0. The geometric multiplicity GM is the number of independent eigenvectors for λ (= dimension of the eigenspace).
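The classic example of AM > GM is a Jordan-type matrix: the eigenvalue repeats in the characteristic polynomial but the eigenspace is only one-dimensional. A numpy sketch:

```python
import numpy as np

# A = [[5,1],[0,5]]: det(A - lam*I) = (5 - lam)^2, so AM = 2 for lam = 5.
A = np.array([[5.0, 1.0],
              [0.0, 5.0]])
lam = 5.0
AM = 2   # (5 - lam)^2: lam = 5 is a double root

# GM = dim N(A - 5I) = number of zero singular values of A - 5I.
s = np.linalg.svd(A - lam * np.eye(2), compute_uv=False)
GM = int(np.sum(s < 1e-10))

assert AM == 2 and GM == 1   # shortage of eigenvectors: not diagonalizable
```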
Multiplier ℓij.
The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the i, j entry: ℓij = (entry to eliminate) / (jth pivot).
Nilpotent matrix N.
Some power of N is the zero matrix, N^k = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
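A strictly upper triangular example shows both properties at once. A numpy sketch:

```python
import numpy as np

# Strictly upper triangular N: a power of N is the zero matrix,
# and every eigenvalue is 0.
N = np.array([[0.0, 1.0, 2.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])

assert np.allclose(np.linalg.matrix_power(N, 3), 0)   # N^3 = 0
assert np.allclose(np.linalg.eigvals(N), 0)           # only eigenvalue is 0
```

Each multiplication by N pushes the nonzero band one diagonal further up, so an n×n strictly triangular matrix dies by the nth power.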
Norm ||A||.
The "ℓ2 norm" of A is the maximum ratio ||Ax||/||x|| = σmax. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm ||A||F^2 = Σ Σ aij^2. The ℓ1 and ℓ∞ norms are the largest column and row sums of |aij|.
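All four norms in this entry are available in numpy, so the characterizations can be checked directly. A sketch with an illustrative matrix:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

# l2 norm = largest singular value sigma_max.
sigma_max = np.linalg.svd(A, compute_uv=False)[0]
assert np.isclose(np.linalg.norm(A, 2), sigma_max)

# l1 norm = largest column sum, l-infinity norm = largest row sum of |a_ij|.
assert np.isclose(np.linalg.norm(A, 1), max(abs(A).sum(axis=0)))
assert np.isclose(np.linalg.norm(A, np.inf), max(abs(A).sum(axis=1)))

# Frobenius norm squared = sum of all a_ij^2.
assert np.isclose(np.linalg.norm(A, 'fro') ** 2, (A ** 2).sum())

# ||Ax|| <= ||A|| ||x|| for any x.
x = np.array([1.0, 1.0])
assert np.linalg.norm(A @ x) <= np.linalg.norm(A, 2) * np.linalg.norm(x)
```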
Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓij| ≤ 1. See condition number.
Pivot.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; error e = b − Pb is perpendicular to S. P^2 = P = P^T, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A are a basis for S, then P = A (A^T A)^{-1} A^T.
Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^{-1} = Q.
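All three facts follow from one formula and are quick to check. A numpy sketch with an illustrative mirror direction u:

```python
import numpy as np

# Householder reflection: Q = I - 2 u u^T for a unit vector u.
u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)               # make u a unit vector
Q = np.eye(3) - 2 * np.outer(u, u)

x_plane = np.array([2.0, -1.0, 0.0])    # satisfies u^T x = 0 (in the mirror)

assert np.allclose(Q @ u, -u)           # u is reflected to -u
assert np.allclose(Q @ x_plane, x_plane)  # mirror plane is fixed
assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal
assert np.allclose(Q, Q.T)              # and symmetric, so Q^T = Q^{-1} = Q
```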
Skew-symmetric matrix K.
The transpose is −K, since Kij = −Kji. Eigenvalues are pure imaginary, eigenvectors are orthogonal, e^{Kt} is an orthogonal matrix.
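The pure imaginary eigenvalues can be confirmed numerically (a 3×3 example has eigenvalues 0 and ±ic). A numpy sketch with illustrative entries:

```python
import numpy as np

# Skew-symmetric K: K^T = -K.
K = np.array([[ 0.0,  2.0, -1.0],
              [-2.0,  0.0,  3.0],
              [ 1.0, -3.0,  0.0]])

assert np.allclose(K.T, -K)             # skew-symmetry
eigs = np.linalg.eigvals(K)
assert np.allclose(eigs.real, 0)        # eigenvalues are pure imaginary
```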
Spectral Theorem A = QΛQ^T.
Real symmetric A has real λ's and orthonormal q's.
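numpy's eigh is built for exactly this case: it returns real eigenvalues and an orthonormal eigenvector matrix Q for a symmetric input. A sketch with an illustrative matrix:

```python
import numpy as np

# Spectral theorem for a real symmetric matrix: A = Q Lam Q^T.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, Q = np.linalg.eigh(A)              # real eigenvalues, orthonormal columns

assert np.allclose(Q.T @ Q, np.eye(2))            # orthonormal q's
assert np.allclose(A, Q @ np.diag(lam) @ Q.T)     # A = Q Lam Q^T
assert np.allclose(lam, [1.0, 3.0])               # real eigenvalues
```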
Volume of box.
The rows (or the columns) of A generate a box with volume |det(A)|.
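A sheared box makes the point concretely: shearing the third edge does not change the volume. A numpy sketch:

```python
import numpy as np

# Rows of A span a box in R^3 with volume |det A|.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 0.0],
              [1.0, 1.0, 4.0]])   # third edge is sheared; volume is still 2*3*4

assert np.isclose(abs(np.linalg.det(A)), 24.0)
# det A = det A^T: rows and columns generate boxes of the same volume.
assert np.isclose(np.linalg.det(A), np.linalg.det(A.T))
```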