- 11.R.1RQ: a) Define a tree. b) Define a forest.
- 11.R.2RQ: Can there be two different simple paths between the vertices of a t...
- 11.R.3RQ: Give at least three examples of how trees are used in modeling.
- 11.R.4RQ: a) Define a rooted tree and the root of such a tree. ...
- 11.R.5RQ: a) How many edges does a tree with n vertices have? ...
- 11.R.6RQ: a) Define a full m-ary tree. b) How many vertices do...
- 11.R.7RQ: a) What is the height of a rooted tree? b) What is a...
- 11.R.8RQ: a) What is a binary search tree? b) Describe an algo...
- 11.R.9RQ: a) What is a prefix code? b) How can a prefix code b...
- 11.R.10RQ: a) Define preorder, inorder, and postorder tree traversal. ...
- 11.R.11RQ: a) Explain how to use preorder, inorder, and postorder traversals t...
- 11.R.12RQ: Show that the number of comparisons used by a sorting algorithm to ...
- 11.R.13RQ: a) Describe the Huffman coding algorithm for constructing an optima...
- 11.R.14RQ: Draw the game tree for nim if the starting position consists of two...
- 11.R.15RQ: a) What is a spanning tree of a simple graph? b) Whi...
- 11.R.16RQ: a) Describe two different algorithms for finding a spanning tree in...
- 11.R.17RQ: a) Explain how backtracking can be used to determine whether a simp...
- 11.R.18RQ: a) What is a minimum spanning tree of a connected weighted graph? ...
- 11.R.19RQ: a) Describe Kruskal’s algorithm and Prim’s algorithm for finding mi...
Solutions for Chapter 11.R: Discrete Mathematics and Its Applications 7th Edition
Adjacency matrix of a graph.
Square matrix with a_ij = 1 when there is an edge from node i to node j; otherwise a_ij = 0. A = A^T when edges go both ways (undirected).
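A minimal numpy sketch of this definition; the graph and its edge list are example values, not from the text:

```python
import numpy as np

# Example undirected graph on 4 nodes (edge list chosen for illustration).
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

n = 4
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1                      # undirected: store each edge both ways

print(A)
print(np.array_equal(A, A.T))        # True: A = A^T for an undirected graph
```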
Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
Change of basis matrix M.
The old basis vectors v_j are combinations Σ m_ij w_i of the new basis vectors. The coordinates of c_1 v_1 + ... + c_n v_n = d_1 w_1 + ... + d_n w_n are related by d = M c. (For n = 2: v_1 = m_11 w_1 + m_21 w_2, v_2 = m_12 w_1 + m_22 w_2.)
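A small numpy check of the relation d = Mc for n = 2; the bases and coordinates below are assumed example values:

```python
import numpy as np

W = np.array([[1., 1.],
              [0., 1.]])             # columns are the new basis vectors w_1, w_2
M = np.array([[2., 1.],
              [1., 3.]])             # change of basis matrix

V = W @ M                            # columns are the old basis vectors v_j = sum_i m_ij w_i

c = np.array([4., -1.])              # coordinates in the old basis
d = M @ c                            # coordinates in the new basis

# c_1 v_1 + c_2 v_2 and d_1 w_1 + d_2 w_2 are the same vector.
print(np.allclose(V @ c, W @ d))     # True
```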
Exponential e^{At} = I + At + (At)^2/2! + ...
has derivative A e^{At}; e^{At} u(0) solves u' = Au.
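A sketch of the series definition using a truncated sum and an assumed 2x2 matrix; the finite-difference check illustrates that e^{At} u(0) solves u' = Au:

```python
import numpy as np
from math import factorial

A = np.array([[0., 1.],
              [-1., 0.]])            # example matrix
t, u0 = 0.7, np.array([1., 2.])

def expm_series(A, t, terms=25):
    """Truncated series e^{At} = I + At + (At)^2/2! + ..."""
    return sum(np.linalg.matrix_power(A * t, k) / factorial(k) for k in range(terms))

E = expm_series(A, t)

# d/dt [e^{At} u(0)] should equal A e^{At} u(0); check with a central difference.
h = 1e-6
du = (expm_series(A, t + h) @ u0 - expm_series(A, t - h) @ u0) / (2 * h)
print(np.allclose(du, A @ E @ u0, atol=1e-5))   # True
```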
Fundamental Theorem.
The nullspace N(A) and row space C(A^T) are orthogonal complements in R^n (perpendicular from Ax = 0, with dimensions r and n − r). Applied to A^T, the column space C(A) is the orthogonal complement of N(A^T) in R^m.
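A numerical illustration with an assumed 3x4 rank-2 matrix, taking nullspace and row-space bases from the SVD:

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 1.],
              [1., 3., 1., 2.]])     # third row = first + second, so rank r = 2

U, s, Vt = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))
rowspace  = Vt[:r].T                 # basis of C(A^T), dimension r
nullspace = Vt[r:].T                 # basis of N(A), dimension n - r

print(np.allclose(A @ nullspace, 0))             # Ax = 0 for nullspace vectors
print(np.allclose(rowspace.T @ nullspace, 0))    # the two subspaces are perpendicular
```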
Hankel matrix H.
Constant along each antidiagonal; h_ij depends on i + j.
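A quick numpy construction (size and values assumed) showing that each antidiagonal is constant:

```python
import numpy as np

c = np.arange(1, 8)                  # value on antidiagonal i + j is c[i + j]
H = np.array([[c[i + j] for j in range(4)] for i in range(4)])
print(H)                             # h_ij depends only on i + j
```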
Hilbert matrix hilb(n).
Entries H_ij = 1/(i + j − 1) = ∫_0^1 x^{i−1} x^{j−1} dx. Positive definite but extremely small λ_min and large condition number: H is ill-conditioned.
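A short numpy sketch building hilb(n) and confirming the tiny smallest eigenvalue and large condition number (n = 8 is an arbitrary choice):

```python
import numpy as np

n = 8
i, j = np.indices((n, n))
H = 1.0 / (i + j + 1)                # H_ij = 1/(i + j - 1) with 1-based indices

eigvals = np.linalg.eigvalsh(H)      # H is symmetric positive definite
print(eigvals.min())                 # positive, but extremely small
print(np.linalg.cond(H))             # enormous: H is ill-conditioned
```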
Jordan form J = M^{-1} A M.
If A has s independent eigenvectors, its "generalized" eigenvector matrix M gives J = diag(J_1, ..., J_s). The block J_k is λ_k I_k + N_k where N_k has 1's on diagonal 1. Each block has one eigenvalue λ_k and one eigenvector.
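A small sketch (with an assumed M and a single 2x2 block) illustrating J = M^{-1} A M for a defective matrix:

```python
import numpy as np

J = np.array([[3., 1.],
              [0., 3.]])             # one Jordan block: eigenvalue 3, one eigenvector
M = np.array([[1., 2.],
              [1., 3.]])             # invertible "generalized eigenvector" matrix
A = M @ J @ np.linalg.inv(M)

print(np.allclose(np.linalg.inv(M) @ A @ M, J))  # True: J = M^{-1} A M
print(np.linalg.eigvals(A))                      # both eigenvalues are (numerically) 3
```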
Determinant |A| = det(A). |A^{-1}| = 1/|A| and |A^T| = |A|.
The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n − 1, volume of box = |det(A)|.
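A quick numpy check of the two identities on an assumed invertible matrix:

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 2.]])

detA = np.linalg.det(A)
print(np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / detA))   # |A^{-1}| = 1/|A|
print(np.isclose(np.linalg.det(A.T), detA))                    # |A^T| = |A|
```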
Left inverse A+.
If A has full column rank n, then A+ = (A^T A)^{-1} A^T has A+ A = I_n.
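A sketch with an assumed 4x2 full-column-rank matrix, forming A+ = (A^T A)^{-1} A^T and checking A+ A = I:

```python
import numpy as np

A = np.array([[1., 0.],
              [1., 1.],
              [0., 1.],
              [2., 1.]])             # full column rank n = 2

A_plus = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(A_plus @ A, np.eye(2)))        # A+ A = I_n
print(np.allclose(A_plus, np.linalg.pinv(A)))    # matches the pseudoinverse here
```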
Length ||x||.
Square root of x^T x (Pythagoras in n dimensions).
Lucas numbers L_n = 2, 1, 3, 4, ... satisfy L_n = L_{n-1} + L_{n-2} = λ_1^n + λ_2^n, with λ_1, λ_2 = (1 ± √5)/2 from the Fibonacci matrix [[1, 1], [1, 0]]. Compare L_0 = 2 with F_0 = 0.
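A short check that the recurrence and the closed form λ_1^n + λ_2^n agree (first twelve terms, chosen arbitrarily):

```python
import numpy as np

lam1 = (1 + np.sqrt(5)) / 2
lam2 = (1 - np.sqrt(5)) / 2

L = [2, 1]                           # L_0 = 2, L_1 = 1
for _ in range(10):
    L.append(L[-1] + L[-2])          # L_n = L_{n-1} + L_{n-2}

closed = [round(lam1**n + lam2**n) for n in range(len(L))]
print(L == closed)                   # True
```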
Norm ||A||. The "ℓ^2 norm" of A is the maximum ratio ||Ax||/||x|| = σ_max. Then ||Ax|| ≤ ||A|| ||x|| and ||AB|| ≤ ||A|| ||B|| and ||A + B|| ≤ ||A|| + ||B||. Frobenius norm ||A||_F^2 = Σ Σ a_ij^2. The ℓ^1 and ℓ^∞ norms are largest column and row sums of |a_ij|.
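A numpy sketch of the four norms and the three inequalities, on assumed 2x2 matrices:

```python
import numpy as np

A = np.array([[1., -2.],
              [3.,  4.]])
B = np.array([[0., 1.],
              [2., 1.]])
x = np.array([1., 1.])

l2   = np.linalg.norm(A, 2)          # sigma_max, the l^2 norm
fro  = np.linalg.norm(A, 'fro')      # sqrt of the sum of all a_ij^2
l1   = np.abs(A).sum(axis=0).max()   # largest column sum of |a_ij|
linf = np.abs(A).sum(axis=1).max()   # largest row sum of |a_ij|
print(l2, fro, l1, linf)

print(np.linalg.norm(A @ x) <= l2 * np.linalg.norm(x))          # ||Ax|| <= ||A|| ||x||
print(np.linalg.norm(A @ B, 2) <= l2 * np.linalg.norm(B, 2))    # ||AB|| <= ||A|| ||B||
print(np.linalg.norm(A + B, 2) <= l2 + np.linalg.norm(B, 2))    # ||A+B|| <= ||A|| + ||B||
```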
Orthogonal subspaces. Every v in V is orthogonal to every w in W.
Pivot. The diagonal entry (first nonzero) at the time when a row is used in elimination.
Rank one matrix A = uv^T ≠ 0.
Column and row spaces = lines cu and cv.
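A one-line illustration with assumed u and v:

```python
import numpy as np

u = np.array([1., 2., 3.])
v = np.array([4., 0., -1.])
A = np.outer(u, v)                   # A = u v^T

print(np.linalg.matrix_rank(A))      # 1: columns are multiples of u, rows of v^T
```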
Row space C(A^T) = all combinations of rows of A.
Column vectors by convention.
Singular Value Decomposition
(SVD) A = U Σ V^T = (orthogonal)(diagonal)(orthogonal). First r columns of U and V are orthonormal bases of C(A) and C(A^T), with A v_i = σ_i u_i and singular value σ_i > 0. Last columns are orthonormal bases of the nullspaces.
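A numpy sketch of the decomposition on an assumed 2x3 matrix, checking A v_i = σ_i u_i:

```python
import numpy as np

A = np.array([[3., 1., 1.],
              [-1., 3., 1.]])

U, s, Vt = np.linalg.svd(A)          # A = U Sigma V^T
r = int(np.sum(s > 1e-10))           # rank

for i in range(r):                   # A v_i = sigma_i u_i for the first r pairs
    print(np.allclose(A @ Vt[i], s[i] * U[:, i]))   # True
```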
Triangle inequality ||u + v|| ≤ ||u|| + ||v||.
For matrix norms ||A + B|| ≤ ||A|| + ||B||.
Vector v in R^n.
Sequence of n real numbers v = (v_1, ..., v_n) = point in R^n.