- 13.3.1: Use Kruskal's algorithm to find a minimum spanning tree in the conne...
- 13.3.2: Use Kruskal's algorithm to find a minimum spanning tree in the conne...
- 13.3.3: What does Kruskal's algorithm produce when no two weights in a conne...
- 13.3.4: What does Kruskal's algorithm produce if all weights in a connected ...
- 13.3.5: Use Prim's algorithm to find a minimum spanning tree in the connecte...
- 13.3.6: Use Prim's algorithm to find a minimum spanning tree in the connecte...
- 13.3.7: Let G be a connected weighted graph. If every spanning tree of G is...
- 13.3.8: Let G be a connected weighted graph and let w1 < w2 < ... < wk be the k...
- 13.3.9: Let G be a connected weighted graph. Describe, with justification, ...
- 13.3.10: The weights of the edges of a complete graph of order 5 are 1, 2, 2...
- 13.3.11: For each of the weighted graphs G1 and G2 in Figure 13.34, determin...
- 13.3.12: Let G be a connected weighted graph of order n ≥ 4 all of whose edges...
- 13.3.13: Let G be a connected weighted graph and T a minimum spanning tree o...
- 13.3.14: Show, for each integer k ≥ 2, that there exists a connected weighted ...
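The exercises above repeatedly apply Kruskal's algorithm. A minimal sketch in Python, using an assumed example graph (not one of the figures referenced in the exercises): sort the edges by weight, then keep each edge that joins two different components, tracked here with a union-find structure.

```python
# Sketch of Kruskal's algorithm; the edge list below is an assumed
# example, not a graph from the text.

def kruskal(n, edges):
    """Return (total weight, MST edges) for a connected graph on
    vertices 0..n-1; edges are (weight, u, v) triples."""
    parent = list(range(n))

    def find(x):                      # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree, total = [], 0
    for w, u, v in sorted(edges):     # scan edges in order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # keep the edge only if it joins
            parent[ru] = rv           # two different components
            tree.append((u, v, w))
            total += w
    return total, tree

# Example: a weighted 4-cycle with one diagonal.
edges = [(1, 0, 1), (2, 1, 2), (3, 2, 3), (4, 3, 0), (5, 0, 2)]
total, tree = kruskal(4, edges)
print(total)   # 6 = 1 + 2 + 3
```

A spanning tree on 4 vertices always has 3 edges; here the three cheapest edges happen to form no cycle, so they are all kept.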
Solutions for Chapter 13.3: The Minimum Spanning Tree Problem
Full solutions for Discrete Mathematics | 1st Edition
Adjacency matrix of a graph.
Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
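A small sketch of this definition, using an assumed 4-cycle as the example graph:

```python
# Build the adjacency matrix a[i][j] = 1 iff there is an edge i-j
# (assumed 4-node undirected example).

def adjacency_matrix(n, edges, directed=False):
    a = [[0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = 1
        if not directed:          # undirected: edges go both ways
            a[j][i] = 1
    return a

A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
# For an undirected graph A equals its transpose: A = AT.
assert all(A[i][j] == A[j][i] for i in range(4) for j in range(4))
```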
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
Big formula for n by n determinants.
det(A) is a sum of n! terms. For each term: multiply one entry from each row and column of A, rows in order 1, ..., n and column order given by a permutation P. Each of the n! P's has a + or − sign.
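The big formula can be checked directly for small n. The sketch below (assumed 3-by-3 example) sums over all n! permutations, with each sign computed from the permutation's inversion count:

```python
# Big formula: det(A) = sum over permutations P of
# sign(P) * a[0][P(0)] * ... * a[n-1][P(n-1)].
from itertools import permutations

def det_big_formula(a):
    n = len(a)
    total = 0
    for p in permutations(range(n)):
        # sign(P) = (-1)^(number of inversions in P)
        inversions = sum(p[i] > p[j]
                         for i in range(n) for j in range(i + 1, n))
        term = (-1) ** inversions
        for row, col in enumerate(p):
            term *= a[row][col]
        total += term
    return total

A = [[2, 1, 0], [1, 2, 1], [0, 1, 2]]
print(det_big_formula(A))   # 4
```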
Diagonalization Λ = S⁻¹AS.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All Aᵏ = SΛᵏS⁻¹.
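A hand-checkable sketch, using an assumed 2-by-2 symmetric matrix whose eigenvalues (3 and 1) and eigenvectors (1, 1) and (1, −1) are known:

```python
# Verify A = S Λ S^-1 for a concrete 2-by-2 example (assumed, not
# from the text): A has eigenvalues 3 and 1.
A = [[2, 1], [1, 2]]
S = [[1, 1], [1, -1]]          # eigenvector columns (1,1) and (1,-1)
L = [[3, 0], [0, 1]]           # the eigenvalue matrix Λ

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

detS = S[0][0] * S[1][1] - S[0][1] * S[1][0]          # = -2
Sinv = [[S[1][1] / detS, -S[0][1] / detS],
        [-S[1][0] / detS, S[0][0] / detS]]

P = matmul(matmul(S, L), Sinv)
print(P)   # [[2.0, 1.0], [1.0, 2.0]], which is A again
```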
Fourier matrix F.
Entries Fjk = e^(2πijk/n) give orthogonal columns: F̄ᵀF = nI. Then y = Fc is the (inverse) Discrete Fourier Transform with yj = Σk ck e^(2πijk/n).
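A numerical sketch (assumed n = 4) confirming that the columns of F are orthogonal with squared length n:

```python
# Build the n-by-n Fourier matrix F_jk = e^(2*pi*i*j*k/n) and check
# that conj(F)^T F = n*I (assumed n = 4).
import cmath

n = 4
F = [[cmath.exp(2j * cmath.pi * j * k / n) for k in range(n)]
     for j in range(n)]

# Gram matrix G = conj(F)^T F: entry (j,k) is the inner product
# of columns j and k.
G = [[sum(F[m][j].conjugate() * F[m][k] for m in range(n))
      for k in range(n)] for j in range(n)]

for j in range(n):
    for k in range(n):
        expected = n if j == k else 0
        assert abs(G[j][k] - expected) < 1e-9
```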
Free columns of A.
Columns without pivots; these are combinations of earlier columns.
Fundamental Theorem.
The nullspace N(A) and row space C(AT) are orthogonal complements in Rn (perpendicularity from Ax = 0, with dimensions r and n − r). Applied to AT, the column space C(A) is the orthogonal complement of N(AT) in Rm.
Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in Rn.
Kronecker product (tensor product) A ® B.
Blocks aij B, eigenvalues λp(A)λq(B).
|A⁻¹| = 1/|A| and |AT| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, and the volume of a box is |det(A)|.
Linear transformation T.
Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = cT(v) + dT(w). Examples: matrix multiplication Av, differentiation and integration in function space.
Nilpotent matrix N.
Some power of N is the zero matrix, Nᵏ = 0. The only eigenvalue is λ = 0 (repeated n times). Examples: triangular matrices with zero diagonal.
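A minimal check on an assumed 3-by-3 strictly upper triangular example:

```python
# A strictly upper triangular matrix N satisfies N^3 = 0, so its
# only eigenvalue is 0 (assumed small example).
N = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

N2 = matmul(N, N)
N3 = matmul(N2, N)
print(N3)   # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```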
Outer product uvT = column times row = rank one matrix.
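A short sketch with assumed vectors u and v:

```python
# Outer product u v^T: column times row gives a rank-one matrix
# (assumed small example vectors).
u = [1, 2, 3]
v = [4, 5]
outer = [[ui * vj for vj in v] for ui in u]
print(outer)   # [[4, 5], [8, 10], [12, 15]]
# Rank one: each row of the result is u[i] times the single row v.
assert all(outer[i] == [u[i] * vj for vj in v] for i in range(3))
```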
Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |lij| ≤ 1. See condition number.
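A sketch of elimination with this pivoting rule on an assumed 2-by-2 example; the assert confirms every multiplier satisfies |l| ≤ 1:

```python
# Elimination with partial pivoting (assumed example): in each column,
# swap up the row with the largest available pivot.
def eliminate_with_partial_pivoting(a):
    n = len(a)
    a = [row[:] for row in a]                 # work on a copy
    for col in range(n):
        # choose the largest available pivot in this column
        p = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[p] = a[p], a[col]
        for r in range(col + 1, n):
            l = a[r][col] / a[col][col]       # multiplier
            assert abs(l) <= 1                # guaranteed by the swap
            for c in range(col, n):
                a[r][c] -= l * a[col][c]
    return a                                  # upper triangular U

U = eliminate_with_partial_pivoting([[1.0, 2.0], [3.0, 4.0]])
```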
Pivot columns of A.
Columns that contain pivots after row reduction. These are not combinations of earlier columns. The pivot columns are a basis for the column space.
Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.
Solvable system Ax = b.
The right side b is in the column space of A.
Special solutions to As = 0.
One free variable is si = 1, other free variables = 0.
Sum V + W of subspaces.
Space of all (v in V) + (w in W). Direct sum: V ∩ W = {0}.