
Solutions for Chapter 6.3: Decision Trees

Full solutions for Mathematical Structures for Computer Science | 7th Edition

ISBN: 9781429215107

Mathematical Structures for Computer Science, 7th edition, is associated with the ISBN 9781429215107. Chapter 6.3: Decision Trees includes 23 full step-by-step solutions. Since all 23 problems in chapter 6.3 have been answered, more than 21,569 students have viewed full step-by-step solutions from this chapter. This textbook survival guide covers every chapter of the book and its solutions.

Key Math Terms and definitions covered in this textbook
• Associative Law (AB)C = A(BC).

Parentheses can be removed to leave ABC.

• Block matrix.

A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.

• Diagonalizable matrix A.

Must have n independent eigenvectors (in the columns of S; automatic with n different eigenvalues). Then S^-1 A S = Λ = eigenvalue matrix.

• Diagonalization

Λ = S^-1 A S. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
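The diagonalization identity above can be checked numerically. A minimal NumPy sketch (the 2×2 matrix is just a hypothetical example with distinct eigenvalues):

```python
import numpy as np

# Hypothetical symmetric example matrix with two different eigenvalues.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Eigenvector matrix S and eigenvalue matrix Λ (here `Lam`).
eigvals, S = np.linalg.eig(A)
Lam = np.diag(eigvals)

# Diagonalization: Λ = S^-1 A S, equivalently A = S Λ S^-1.
assert np.allclose(np.linalg.inv(S) @ A @ S, Lam)

# Powers follow at once: A^k = S Λ^k S^-1.
k = 3
A_k = S @ np.linalg.matrix_power(Lam, k) @ np.linalg.inv(S)
assert np.allclose(A_k, np.linalg.matrix_power(A, k))
```

The power formula is the practical payoff: raising the diagonal Λ to the k-th power only requires raising each eigenvalue to the k-th power.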

• Elimination matrix = Elementary matrix Eij.

The identity matrix with an extra -l_ij in the i, j entry (i ≠ j). Then Eij A subtracts l_ij times row j of A from row i.

• Elimination.

A sequence of row operations that reduces A to an upper triangular U or to the reduced form R = rref(A). Then A = LU with multipliers l_ij in L, or PA = LU with row exchanges in P, or EA = R with an invertible E.
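The A = LU factorization can be illustrated with a short NumPy sketch. This version assumes every pivot is nonzero (no row exchanges P), which holds for the hypothetical example matrix below:

```python
import numpy as np

def lu_no_pivot(A):
    """Gaussian elimination to A = LU (a sketch that assumes
    every pivot is nonzero, so no row exchanges are needed)."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float).copy()
    for j in range(n):
        for i in range(j + 1, n):
            L[i, j] = U[i, j] / U[j, j]     # multiplier l_ij
            U[i, :] -= L[i, j] * U[j, :]    # subtract l_ij times row j
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
assert np.allclose(L @ U, A)            # A = LU
assert np.allclose(U, np.triu(U))       # U is upper triangular
assert np.allclose(L, np.tril(L))       # multipliers sit below L's diagonal
```

The multipliers l_ij land exactly in the below-diagonal entries of L, which is why elimination and the LU factorization are the same computation viewed two ways.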

• Four Fundamental Subspaces C(A), N(A), C(A^T), N(A^T).

Use A^H for complex A.

• Full row rank r = m.

Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

• Hermitian matrix A^H = Ā^T = A.

Complex analog a_ji = ā_ij of a symmetric matrix.

• Indefinite matrix.

A symmetric matrix with eigenvalues of both signs (+ and - ).

• Kronecker product (tensor product) A ⊗ B.

Blocks a_ij B, eigenvalues λ_p(A) λ_q(B).
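Both claims in the definition, the block structure and the eigenvalue products, can be verified with NumPy's `np.kron` (the two 2×2 matrices are hypothetical examples chosen triangular so their eigenvalues are visible on the diagonal):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])    # eigenvalues 1 and 3
B = np.array([[4.0, 0.0],
              [1.0, 5.0]])    # eigenvalues 4 and 5

K = np.kron(A, B)             # 4x4 matrix built from blocks a_ij * B
assert K.shape == (4, 4)
assert np.allclose(K[:2, 2:], 2.0 * B)   # the (1,2) block equals a_12 * B

# Eigenvalues of A ⊗ B are all products λ_p(A) * λ_q(B): here 4, 5, 12, 15.
prods = sorted(lp * lq
               for lp in np.linalg.eigvals(A)
               for lq in np.linalg.eigvals(B))
assert np.allclose(np.sort(np.linalg.eigvals(K).real), prods)
```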

• Multiplication Ax

= x1 (column 1) + ... + xn (column n) = combination of columns.
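The column picture of Ax can be made concrete in a few NumPy lines (the 3×2 matrix and the vector are hypothetical examples):

```python
import numpy as np

A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])
x = np.array([10.0, 100.0])

# Ax = x1*(column 1) + x2*(column 2): a combination of the columns of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, combo)   # both give [410, 520, 630]
```

Reading Ax column-by-column rather than row-by-row is what makes statements about the column space (all combinations of the columns) immediate.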

• Orthogonal matrix Q.

Square matrix with orthonormal columns, so Q^T = Q^-1. Preserves lengths and angles: ‖Qx‖ = ‖x‖ and (Qx)^T (Qy) = x^T y. All |λ| = 1, with orthogonal eigenvectors. Examples: rotation, reflection, permutation.
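Each property in the definition can be checked on one of the named examples, a rotation. A minimal NumPy sketch (the angle and test vectors are arbitrary choices):

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation matrix

assert np.allclose(Q.T @ Q, np.eye(2))            # orthonormal columns: Q^T = Q^-1

x = np.array([3.0, 4.0])
y = np.array([-1.0, 2.0])
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))  # lengths preserved
assert np.isclose((Q @ x) @ (Q @ y), x @ y)       # inner products (angles) preserved

assert np.allclose(np.abs(np.linalg.eigvals(Q)), 1.0)        # all |λ| = 1
```

For the rotation the eigenvalues are the complex pair e^(±iθ), which is why |λ| = 1 rather than λ = ±1 in general.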

• Polar decomposition A = Q H.

Orthogonal Q times positive (semi)definite H.

• Positive definite matrix A.

Symmetric matrix with positive eigenvalues and positive pivots. Definition: x^T A x > 0 unless x = 0. Then A = LDL^T with diag(D) > 0.
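Positive definiteness is easy to test in practice. A minimal NumPy sketch (the 2×2 matrix is a hypothetical example with eigenvalues 1 and 3):

```python
import numpy as np

A = np.array([[ 2.0, -1.0],
              [-1.0,  2.0]])   # symmetric; eigenvalues 1 and 3

# All eigenvalues positive  <=>  x^T A x > 0 for every nonzero x.
assert np.all(np.linalg.eigvalsh(A) > 0)
x = np.array([1.0, -2.0])
assert x @ A @ x > 0

# Cholesky succeeds exactly when A is positive definite. It returns
# A = L L^T, a close relative of the A = L D L^T form in the definition
# (absorb sqrt(D) into L).
Lch = np.linalg.cholesky(A)
assert np.allclose(Lch @ Lch.T, A)
```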

• Random matrix rand(n) or randn(n).

MATLAB creates a matrix with random entries, uniformly distributed on [0, 1] for rand and standard normal for randn.

• Reflection matrix (Householder) Q = I - 2uu^T.

Unit vector u is reflected to Qu = -u. All x in the mirror plane u^T x = 0 have Qx = x. Notice Q^T = Q^-1 = Q.
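All three facts in the definition follow directly from Q = I - 2uu^T and can be checked numerically (the vectors below are hypothetical examples, with v chosen so that u^T v = 0):

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
u = u / np.linalg.norm(u)                # unit vector
Q = np.eye(3) - 2.0 * np.outer(u, u)     # Householder reflection I - 2uu^T

assert np.allclose(Q @ u, -u)            # u is reflected to -u

v = np.array([2.0, -1.0, 0.0])           # lies in the mirror plane: u^T v = 0
assert np.isclose(u @ v, 0.0)
assert np.allclose(Q @ v, v)             # mirror-plane vectors are unchanged

assert np.allclose(Q, Q.T)               # Q^T = Q (symmetric) ...
assert np.allclose(Q @ Q, np.eye(3))     # ... and Q^-1 = Q (its own inverse)
```

Reflecting twice returns every vector to where it started, which is exactly the statement Q² = I, i.e. Q^T = Q^-1 = Q.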

• Simplex method for linear programming.

The minimum cost vector x* is found by moving from corner to lower-cost corner along the edges of the feasible set (where the constraints Ax = b and x ≥ 0 are satisfied). Minimum cost at a corner!

• Unitary matrix U^H = Ū^T = U^-1.

Orthonormal columns (complex analog of Q).

• Vector v in R^n.

Sequence of n real numbers v = (v1, ..., vn) = point in R^n.
