 6.3.1: Draw the decision tree for sequential search on a list of three ele...
 6.3.2: Draw the decision tree for sequential search on a list of six eleme...
 6.3.3: Draw the decision tree for binary search on a sorted list of seven ...
 6.3.4: Draw the decision tree for binary search on a sorted list of four e...
 6.3.5: Consider a search algorithm that compares an item with the last ele...
 6.3.6: Consider a search algorithm that compares an item with an element o...
 6.3.7: a. Given the data 9, 5, 6, 2, 4, 7 construct the binary search tree...
 6.3.8: a. Given the data g, d, r, s, b, q, c, m construct the binary searc...
 6.3.9: a. For a set of six data items, what is the minimum worst-case numb...
 6.3.10: a. For a set of nine data items, what is the minimum worst-case num...
 6.3.11: An inorder tree traversal of a binary search tree produces a listin...
 6.3.12: Construct a binary search tree for In the high and far-off times th...
 6.3.13: Use the theorem on the lower bound for sorting to find lower bounds...
 6.3.14: Contrast the number of comparisons required for selection sort and ...
 6.3.15: One of five coins is counterfeit and is lighter than the other four. ...
 6.3.16: One of five coins is counterfeit and is either too heavy or too lig...
 6.3.17: One of four coins is counterfeit and is either too heavy or too lig...
 6.3.18: One of four coins is counterfeit and is either too heavy or too lig...
 6.3.19: Devise an algorithm to solve the problem of Exercise 18 using three...
 6.3.20: One of eight coins is counterfeit and is either too heavy or too li...
 6.3.21: In the decision tree for the binary search algorithm (and the binar...
 6.3.22: Our existing binary search algorithm (Chapter 3, Example 13) contai...
 6.3.23: To prove that log n! = Θ(n log n), we can use the definition of orde...
Solutions for Chapter 6.3: Decision Trees
Full solutions for Mathematical Structures for Computer Science  7th Edition
ISBN: 9781429215107

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.

Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
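The definition above can be checked numerically. A quick NumPy sketch (ours, not from the text): partition two 4×4 matrices into 2×2 blocks and confirm that multiplying block-by-block reproduces the ordinary product AB.

```python
import numpy as np

# Partition A and B into four 2x2 blocks each and multiply blockwise.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

# Cut between rows 2 and columns 2.
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# Multiply block-by-block, exactly as if the blocks were scalar entries.
C = np.block([
    [A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
    [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22],
])

assert np.allclose(C, A @ B)  # block product agrees with AB
```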

Diagonalizable matrix A.
Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 AS = Λ = eigenvalue matrix.

Diagonalization
A = SΛS^-1. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = SΛ^k S^-1.
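As a sanity check on these formulas, here is a small NumPy sketch (our example matrix, not the book's): diagonalize a 2×2 symmetric matrix and verify A = SΛS^-1 and A^k = SΛ^k S^-1.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, S = np.linalg.eig(A)   # columns of S are eigenvectors of A
Lam = np.diag(eigvals)          # Λ = diagonal eigenvalue matrix

# A = S Λ S^-1
assert np.allclose(S @ Lam @ np.linalg.inv(S), A)

# Powers diagonalize too: A^5 = S Λ^5 S^-1
A5 = S @ np.diag(eigvals ** 5) @ np.linalg.inv(S)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```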

Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra −ℓij in the (i, j) entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
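A minimal NumPy sketch of this definition (indices 0-based here, so "row i, row j" become rows 1 and 0): build E from the identity plus one extra entry and watch it subtract a multiple of one row from another.

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
ell = A[1, 0] / A[0, 0]   # multiplier ℓ = 3: chosen to zero out A[1, 0]

E = np.eye(2)
E[1, 0] = -ell            # identity with one extra entry -ℓ in position (1, 0)

EA = E @ A                # row 1 of A minus ℓ times row 0
assert np.allclose(EA[1], A[1] - ell * A[0])
assert EA[1, 0] == 0.0    # the targeted entry is eliminated
```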

Elimination.
A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers ℓij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
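The A = LU factorization can be sketched directly from this definition. A hedged NumPy example (our matrix; it happens to need no row exchanges, so P = I): eliminate below each pivot, store the multipliers ℓij in L, and check A = LU.

```python
import numpy as np

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])

n = A.shape[0]
U = A.copy()
L = np.eye(n)
for j in range(n - 1):                    # for each pivot column j
    for i in range(j + 1, n):             # eliminate entries below the pivot
        L[i, j] = U[i, j] / U[j, j]       # multiplier ℓij
        U[i] -= L[i, j] * U[j]            # subtract ℓij times row j from row i

assert np.allclose(L @ U, A)              # A = LU
assert np.allclose(U, np.triu(U))         # U is upper triangular
```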

Four Fundamental Subspaces C (A), N (A), C (AT), N (AT).
Use A^H for complex A.

Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of Rm. Full rank means full column rank or full row rank.
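A short NumPy illustration of full row rank (example matrix is ours): a 2×3 matrix with independent rows has r = m, and Ax = b is solvable for any right side b.

```python
import numpy as np

A = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0]])

assert np.linalg.matrix_rank(A) == A.shape[0]   # full row rank: r = m = 2

# With r = m the column space is all of R^2, so every b gives a solution.
b = np.array([5.0, 7.0])
x, *_ = np.linalg.lstsq(A, b, rcond=None)       # least-squares solver
assert np.allclose(A @ x, b)                    # residual is zero: exact solution
```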

Hermitian matrix A^H = conj(A)^T = A.
Complex analog a_ji = conj(a_ij) of a symmetric matrix.

Indefinite matrix.
A symmetric matrix with eigenvalues of both signs (+ and −).

Kronecker product (tensor product) A ⊗ B.
Blocks aij B, eigenvalues λp(A)λq(B).
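Both claims in this entry can be checked with `np.kron` (example matrices are ours): the product is built from blocks aij·B, and every eigenvalue of A ⊗ B is a product of one eigenvalue of A with one of B.

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
B = np.array([[1.0, 1.0],
              [0.0, 4.0]])

K = np.kron(A, B)
assert np.allclose(K[:2, :2], A[0, 0] * B)      # top-left block is a11 * B

# Eigenvalues of K are all products λp(A) * λq(B).
eigK = np.sort(np.linalg.eigvals(K))
prods = np.sort([lp * lq
                 for lp in np.linalg.eigvals(A)
                 for lq in np.linalg.eigvals(B)])
assert np.allclose(eigK, prods)
```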

Multiplication Ax
= x1(column 1) + ... + xn(column n) = combination of columns.
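The "combination of columns" view in one line of NumPy (our small example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([10.0, 1.0])

# Ax as x1*(column 1) + x2*(column 2):
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(combo, A @ x)   # same as the matrix-vector product
```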

Orthogonal matrix Q.
Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves length and angles: ||Qx|| = ||x|| and (Qx)^T(Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
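A NumPy sketch using a rotation matrix (one of the examples named above) to verify each property:

```python
import numpy as np

theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(Q.T @ Q, np.eye(2))        # Q^T = Q^-1

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # ||Qx|| = ||x||
assert np.isclose((Q @ x) @ (Q @ y), x @ y)                  # inner products kept

assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)        # all |λ| = 1
```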

Polar decomposition A = Q H.
Orthogonal Q times positive (semi)definite H.
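One standard route to A = QH is via the SVD (scipy.linalg.polar packages this; the sketch below uses only NumPy, with our own example matrix): from A = UΣV^T take Q = UV^T and H = VΣV^T.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

U, s, Vt = np.linalg.svd(A)
Q = U @ Vt                     # orthogonal factor
H = Vt.T @ np.diag(s) @ Vt     # symmetric positive semidefinite factor

assert np.allclose(Q @ H, A)               # A = QH
assert np.allclose(Q.T @ Q, np.eye(2))     # Q is orthogonal
assert np.allclose(H, H.T)                 # H is symmetric
assert np.all(np.linalg.eigvalsh(H) >= 0)  # H is positive semidefinite
```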

Positive definite matrix A.
Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T Ax > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
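The equivalent tests can be run numerically; a hedged NumPy sketch with one small positive definite matrix (Cholesky stands in for the LDL^T factorization, since `np.linalg.cholesky` succeeds exactly when A is positive definite):

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])

assert np.all(np.linalg.eigvalsh(A) > 0)   # positive eigenvalues

# Spot-check the definition x^T A x > 0 on random nonzero vectors.
rng = np.random.default_rng(1)
for _ in range(100):
    x = rng.standard_normal(2)
    assert x @ A @ x > 0

# Cholesky A = L L^T exists only for positive definite A.
L = np.linalg.cholesky(A)
assert np.allclose(L @ L.T, A)
```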

Random matrix rand(n) or randn(n).
MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal distribution for randn.

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
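Each claim checks out numerically. A NumPy sketch with one unit vector of our choosing:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)                  # make u a unit vector

Q = np.eye(3) - 2.0 * np.outer(u, u)       # Householder reflection

assert np.allclose(Q @ u, -u)              # u reflects to -u

x = np.array([2.0, 1.0, -2.0])             # chosen so that u^T x = 0
assert np.isclose(u @ x, 0.0)
assert np.allclose(Q @ x, x)               # the mirror plane is fixed

assert np.allclose(Q @ Q, np.eye(3))       # Q^T = Q^-1 = Q
```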

Simplex method for linear programming.
The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

Unitary matrix U^H = conj(U)^T = U^-1.
Orthonormal columns (complex analog of Q).

Vector v in R^n.
Sequence of n real numbers v = (v1, ..., vn) = point in R^n.