Solutions for Chapter 53: Planar Graphs
Full solutions for Mathematics: A Discrete Introduction | 3rd Edition

- 53.1: Give an example of a curve that is closed but not simple.
- 53.2: Each of the graphs in the figure is planar. Redraw these graphs wit...
- 53.3: Let G be a planar graph with n vertices, m edges, and c components....
- 53.4: Complete the proof of Corollary 53.5. That is, prove that if G is p...
- 53.5: Let G be a graph with 11 vertices. Prove that G or its complement must be nonpla...
- 53.6: Let G be a 5-regular graph with ten vertices. Prove that G is nonpla...
- 53.7: For which values of n is the n-cube Qn planar? (See Exercise 52.13....
- 53.8: The graph in the figure is known as Petersen's graph. Prove that it ...
- 53.9: Give a short proof that χ(G) ≤ 6 for planar graphs (Proposition 53.11)...
- 53.10: Let G = (V, E) be a planar graph in which every cycle has length 8 ...
- 53.11: A graph is called outerplanar if it can be drawn in the plane so th...
- 53.12: A Platonic graph is a connected planar graph in which all vertices ...
- 53.13: A soccer ball is formed by stitching together pieces of material th...
Affine transformation.
Tv = Av + v0 = linear transformation plus shift.
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
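A small numpy sketch of the solvability test (the matrices here are my own example, not from the glossary): b is in the column space of A exactly when appending b does not raise the rank.

```python
import numpy as np

# A has rank 1; b_in lies in C(A), b_out does not.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])
b_in = np.array([[3.0], [6.0]])     # b_in = A @ [1, 1]^T
b_out = np.array([[3.0], [7.0]])

rank = np.linalg.matrix_rank
print(rank(np.hstack([A, b_in])) == rank(A))    # True  -> solvable
print(rank(np.hstack([A, b_out])) == rank(A))   # False -> no solution
```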
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
Commuting matrices AB = BA.
If diagonalizable, they share n eigenvectors.
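A numpy sketch of shared eigenvectors (the matrices are assumed examples: B is built as a polynomial in A, so it commutes with A by construction):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
B = A @ A + 3 * np.eye(2)            # commutes with A by construction

print(np.allclose(A @ B, B @ A))     # True
_, S = np.linalg.eigh(A)             # eigenvectors of A (orthonormal)
# The same S diagonalizes both A and B:
DA = S.T @ A @ S
DB = S.T @ B @ S
print(np.allclose(DA, np.diag(np.diag(DA))))   # True
print(np.allclose(DB, np.diag(np.diag(DB))))   # True
```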
Cramer's Rule for Ax = b.
B_j has b replacing column j of A; x_j = det B_j / det A.
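A direct numpy check of the rule (the 2-by-2 system is my own example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.empty(2)
for j in range(2):
    B_j = A.copy()
    B_j[:, j] = b                    # replace column j of A by b
    x[j] = np.linalg.det(B_j) / np.linalg.det(A)

print(np.allclose(x, np.linalg.solve(A, b)))   # True
```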
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
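A numpy sketch (the 2-by-2 matrix is an assumed example with distinct eigenvalues, so S is invertible; `Lam` stands for Λ):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, S = np.linalg.eig(A)        # columns of S = eigenvectors
Lam = np.diag(eigvals)

print(np.allclose(np.linalg.inv(S) @ A @ S, Lam))   # True
# A^5 = S Lam^5 S^{-1}:
print(np.allclose(S @ np.diag(eigvals**5) @ np.linalg.inv(S),
                  np.linalg.matrix_power(A, 5)))    # True
```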
Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
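A sketch building the incidence matrix of an assumed small directed graph (edges 0→1, 1→2, 0→2); each row sums to zero, so the all-ones vector is in the nullspace:

```python
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]     # (start node, end node)
n = 3
A = np.zeros((len(edges), n))
for row, (i, j) in enumerate(edges):
    A[row, i], A[row, j] = -1.0, 1.0 # -1 at start, +1 at end

print(A)
print(np.allclose(A @ np.ones(n), 0.0))   # True
```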
Kronecker product (tensor product) A ® B.
Blocks a_ij B, eigenvalues λ_p(A)λ_q(B).
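A numpy check of the eigenvalue rule for A ⊗ B (the matrices are my own example, chosen with real eigenvalues):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 4.0]])
K = np.kron(A, B)                    # blocks a_ij * B

products = sorted(la * lb for la in np.linalg.eigvals(A)
                          for lb in np.linalg.eigvals(B))
print(np.allclose(sorted(np.linalg.eigvals(K).real), products))  # True
```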
Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
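A sketch of the normal equations on made-up data; the residual comes out orthogonal to the column space:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)    # normal equations
e = b - A @ x_hat
print(np.allclose(A.T @ e, 0.0))             # True: e is orthogonal to C(A)
```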
Markov matrix M.
All m_ij ≥ 0 and each column sum is 1. Largest eigenvalue λ = 1. If m_ij > 0, the columns of Mᵏ approach the steady state eigenvector: Ms = s > 0.
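A sketch with an assumed 2-by-2 transition matrix: the columns of Mᵏ converge to the eigenvector for λ = 1.

```python
import numpy as np

M = np.array([[0.8, 0.3],
              [0.2, 0.7]])           # columns sum to 1, entries > 0
Mk = np.linalg.matrix_power(M, 50)

vals, vecs = np.linalg.eig(M)
s = vecs[:, np.argmax(vals)]         # eigenvector for largest eigenvalue 1
s = s / s.sum()                      # normalize so entries sum to 1
print(np.allclose(Mk[:, 0], s) and np.allclose(Mk[:, 1], s))  # True
```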
Outer product uvᵀ.
Column times row = rank one matrix.
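A one-line numpy check of the rank-one property (the vectors are my own example):

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])  # column
v = np.array([[4.0], [5.0]])         # column; v.T is the row
R = u @ v.T                          # 3x2 rank-one matrix

print(np.linalg.matrix_rank(R))      # 1
print(np.array_equal(R, np.outer(u, v)))   # True
```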
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or −1) based on the number of row exchanges to reach I.
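A quick sketch with one assumed row order: PA reorders the rows of A, and det P = ±1.

```python
import numpy as np

order = [2, 0, 1]
P = np.eye(3)[order]                 # rows of I in the chosen order

A = np.arange(9.0).reshape(3, 3)
print(np.array_equal(P @ A, A[order]))          # True
print(np.isclose(abs(np.linalg.det(P)), 1.0))   # True: det P = +1 or -1
```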
Pivot d.
The diagonal entry (first nonzero) at the time when a row is used in elimination.
Projection matrix P onto subspace S.
Projection p = Pb is the closest point to b in S; the error e = b − Pb is perpendicular to S. P² = P = Pᵀ, eigenvalues are 1 or 0, eigenvectors are in S or S⊥. If the columns of A = basis for S then P = A(AᵀA)⁻¹Aᵀ.
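A numpy sketch of P = A(AᵀA)⁻¹Aᵀ on made-up data, checking P² = P = Pᵀ and the perpendicular error:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])           # columns = basis for the subspace S
P = A @ np.linalg.inv(A.T @ A) @ A.T

b = np.array([0.0, 1.0, 3.0])
e = b - P @ b                        # error = b - p
print(np.allclose(P @ P, P), np.allclose(P, P.T))   # True True
print(np.allclose(A.T @ e, 0.0))                    # True: e is perpendicular to S
```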
Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λ_min ≤ q(x) ≤ λ_max.
Those extremes are reached at the eigenvectors x for λ_min(A) and λ_max(A).
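A sketch testing the bounds on random vectors for an assumed symmetric matrix (eigenvalues 1 and 3):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
vals, vecs = np.linalg.eigh(A)       # ascending: [lambda_min, lambda_max]

def q(x):
    return (x @ A @ x) / (x @ x)

rng = np.random.default_rng(0)
xs = rng.standard_normal((100, 2))
print(all(vals[0] - 1e-9 <= q(x) <= vals[-1] + 1e-9 for x in xs))  # True
print(np.isclose(q(vecs[:, 0]), vals[0]))   # True: minimum at the eigenvector
```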
Right inverse A⁺.
If A has full row rank m, then A⁺ = Aᵀ(AAᵀ)⁻¹ has AA⁺ = I_m.
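A sketch with an assumed wide matrix of full row rank:

```python
import numpy as np

A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])      # 2x3, row rank 2
A_plus = A.T @ np.linalg.inv(A @ A.T)

print(np.allclose(A @ A_plus, np.eye(2)))   # True: A A+ = I_2
```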
Row picture of Ax = b.
Each equation gives a plane in Rⁿ; the planes intersect at x.
Singular Value Decomposition (SVD).
A = UΣVᵀ = (orthogonal)(diagonal)(orthogonal). The first r columns of U and V are orthonormal bases of C(A) and C(Aᵀ), with Av_i = σ_i u_i and singular value σ_i > 0. The last columns are orthonormal bases of the nullspaces.
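A numpy sketch of the factorization and the relation Av_i = σ_i u_i (the matrix is my own example):

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [4.0, 5.0]])
U, sigma, Vt = np.linalg.svd(A)
V = Vt.T

print(np.allclose(U @ np.diag(sigma) @ Vt, A))        # True
print(all(np.allclose(A @ V[:, i], sigma[i] * U[:, i])
          for i in range(len(sigma))))                # True
```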
Spanning set v_1, ..., v_m for V.
Combinations of v_1, ..., v_m fill the space. The columns of A span C(A)!
Symmetric factorizations A = LDLᵀ and A = QΛQᵀ.
Signs in Λ = signs in D.
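A sketch computing A = LDLᵀ by plain elimination (the symmetric indefinite matrix is an assumed example that needs no row exchanges); the signs of the pivots in D match the signs of the eigenvalues:

```python
import numpy as np

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, -1.0]])     # symmetric, indefinite

n = len(A)
L, M = np.eye(n), A.copy()
for k in range(n):
    for i in range(k + 1, n):
        L[i, k] = M[i, k] / M[k, k]  # multiplier
        M[i] -= L[i, k] * M[k]       # eliminate below pivot k
D = np.diag(np.diag(M))              # pivots on the diagonal

print(np.allclose(L @ D @ L.T, A))   # True
print(np.array_equal(np.sort(np.sign(np.diag(D))),
                     np.sign(np.sort(np.linalg.eigvalsh(A)))))  # True
```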