Solutions for Chapter 6.3: Decision Trees

Full solutions for Mathematical Structures for Computer Science | 7th Edition

ISBN: 9781429215107

Textbook: Mathematical Structures for Computer Science
Edition: 7
Author: Judith L. Gersting
ISBN: 9781429215107

Mathematical Structures for Computer Science was written by Judith L. Gersting and is associated with ISBN 9781429215107. Chapter 6.3: Decision Trees includes 23 full step-by-step solutions. Since all 23 problems in Chapter 6.3 have been answered, more than 12,339 students have viewed full step-by-step solutions from this chapter. This expansive textbook survival guide covers every chapter and its solutions, and was created for Mathematical Structures for Computer Science, 7th edition.

Key Math Terms and definitions covered in this textbook
  • Change of basis matrix M.

    The old basis vectors vj are combinations Σ mij wi of the new basis vectors. The coordinates of c1 v1 + ... + cn vn = d1 w1 + ... + dn wn are related by d = M c. (For n = 2, set v1 = m11 w1 + m21 w2, v2 = m12 w1 + m22 w2.)
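
    A small numpy sketch of the relation d = M c (the 2x2 bases and coordinates below are invented for illustration, not taken from the text):

        import numpy as np

        # New basis w1, w2 as the columns of W; M holds the combination coefficients.
        W = np.array([[1.0, 1.0],
                      [0.0, 1.0]])
        M = np.array([[2.0, 1.0],      # v1 = 2 w1 + 0 w2
                      [0.0, 1.0]])     # v2 = 1 w1 + 1 w2
        V = W @ M                      # old basis vectors as the columns of V

        c = np.array([3.0, -1.0])      # coordinates in the old basis
        x = V @ c                      # the vector c1 v1 + c2 v2
        d = M @ c                      # predicted coordinates in the new basis
        assert np.allclose(W @ d, x)   # d1 w1 + d2 w2 is the same vector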

  • Column space C(A) =

    space of all combinations of the columns of A.

  • Complete solution x = xp + xn to Ax = b.

    (Particular xp) + (xn in the nullspace).
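
    A short numpy sketch of the idea (the matrix A, right-hand side b, and nullspace vector are invented): adding any multiple of a nullspace vector to a particular solution still solves Ax = b.

        import numpy as np

        A = np.array([[1.0, 2.0, 3.0],
                      [0.0, 1.0, 1.0]])
        b = np.array([6.0, 2.0])

        # A particular solution xp (least squares is exact here since b is attainable).
        xp = np.linalg.lstsq(A, b, rcond=None)[0]

        xn = np.array([-1.0, -1.0, 1.0])          # a nullspace vector: A @ xn = 0
        assert np.allclose(A @ xn, 0)

        for t in (0.0, 1.0, -2.5):                # xp + t*xn solves Ax = b for every t
            assert np.allclose(A @ (xp + t * xn), b)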

  • Conjugate Gradient Method.

    A sequence of steps (end of Chapter 9) to solve positive definite Ax = b by minimizing (1/2) x^T A x - x^T b over growing Krylov subspaces.
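
    A minimal sketch of the conjugate gradient iteration in Python/numpy (a generic textbook-style version on an invented 2x2 positive definite system, not code from this book):

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
            """Solve Ax = b for symmetric positive definite A by minimizing
            (1/2) x^T A x - x^T b over growing Krylov subspaces."""
            n = len(b)
            max_iter = max_iter or n
            x = np.zeros(n)
            r = b - A @ x                # residual = negative gradient
            p = r.copy()                 # first search direction
            rs_old = r @ r
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rs_old / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new / rs_old) * p
                rs_old = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])     # positive definite
        b = np.array([1.0, 2.0])
        x = conjugate_gradient(A, b)
        assert np.allclose(A @ x, b)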

  • Exponential e^(At) = I + At + (At)^2/2! + ...

    has derivative A e^(At); e^(At) u(0) solves u' = Au.
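
    A quick sketch of e^(At) u(0) solving u' = Au, using scipy's matrix exponential (the 2x2 matrix and initial condition are invented):

        import numpy as np
        from scipy.linalg import expm

        A = np.array([[0.0, 1.0],
                      [-1.0, 0.0]])              # u' = Au describes rotation in the plane
        u0 = np.array([1.0, 0.0])

        t = 0.5
        u_t = expm(A * t) @ u0                   # u(t) = e^(At) u(0)

        # Check u'(t) = A e^(At) u(0) with a centered finite difference.
        h = 1e-6
        du_dt = (expm(A * (t + h)) @ u0 - expm(A * (t - h)) @ u0) / (2 * h)
        assert np.allclose(du_dt, A @ u_t, atol=1e-5)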

  • Full row rank r = m.

    Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.

  • Gauss-Jordan method.

    Invert A by row operations on [A I] to reach [I A^(-1)].
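
    A small sketch of Gauss-Jordan inversion by row operations on [A I] (plain numpy with partial pivoting; the 2x2 matrix is an invented example):

        import numpy as np

        def gauss_jordan_inverse(A):
            """Row-reduce the augmented matrix [A I] to [I A^(-1)]."""
            n = A.shape[0]
            M = np.hstack([A.astype(float), np.eye(n)])
            for j in range(n):
                p = j + np.argmax(np.abs(M[j:, j]))   # pick the largest pivot in column j
                M[[j, p]] = M[[p, j]]                 # swap it into the pivot row
                M[j] /= M[j, j]                       # scale the pivot row so the pivot is 1
                for i in range(n):
                    if i != j:
                        M[i] -= M[i, j] * M[j]        # clear column j in every other row
            return M[:, n:]                           # the right block is now A^(-1)

        A = np.array([[2.0, 1.0], [5.0, 3.0]])
        assert np.allclose(gauss_jordan_inverse(A) @ A, np.eye(2))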

  • Independent vectors v1, ..., vk.

    No combination c1 v1 + ... + ck vk = zero vector unless all ci = 0. If the v's are the columns of A, the only solution to Ax = 0 is x = 0.
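
    One way to test this numerically (a sketch; the sample vectors are invented): put the v's as the columns of A and check whether the rank equals the number of columns.

        import numpy as np

        v1 = np.array([1.0, 0.0, 2.0])
        v2 = np.array([0.0, 1.0, 1.0])
        v3 = v1 + 2 * v2                    # deliberately a combination of v1 and v2

        A = np.column_stack([v1, v2, v3])
        independent = np.linalg.matrix_rank(A) == A.shape[1]
        print(independent)                  # False: some nonzero x solves Ax = 0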

  • Inverse matrix A^(-1).

    Square matrix with A^(-1) A = I and A A^(-1) = I. No inverse if det A = 0 and rank(A) < n and Ax = 0 for a nonzero vector x. The inverses of AB and A^T are B^(-1) A^(-1) and (A^(-1))^T. Cofactor formula: (A^(-1))ij = Cji / det A.
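
    A numeric check of these identities on small invented matrices (a sketch, not a derivation):

        import numpy as np

        A = np.array([[1.0, 2.0], [3.0, 4.0]])
        B = np.array([[0.0, 1.0], [1.0, 1.0]])

        Ainv = np.linalg.inv(A)
        assert np.allclose(Ainv @ A, np.eye(2)) and np.allclose(A @ Ainv, np.eye(2))

        # (AB)^(-1) = B^(-1) A^(-1)  and  (A^T)^(-1) = (A^(-1))^T
        assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ Ainv)
        assert np.allclose(np.linalg.inv(A.T), Ainv.T)

        # Cofactor formula for a 2x2 matrix: (A^(-1))ij = Cji / det A
        detA = np.linalg.det(A)
        C = np.array([[ A[1, 1], -A[1, 0]],
                      [-A[0, 1],  A[0, 0]]])   # cofactors Cij
        assert np.allclose(Ainv, C.T / detA)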

  • Kirchhoff's Laws.

    Current Law: net current (in minus out) is zero at each node. Voltage Law: Potential differences (voltage drops) add to zero around any closed loop.

  • |A^(-1)| = 1/|A| and |A^T| = |A|.

    The big formula for det(A) has a sum of n! terms, the cofactor formula uses determinants of size n - 1, and the volume of a box = |det(A)|.
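
    A quick numeric check of the first two identities (the 3x3 matrix is invented):

        import numpy as np

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])

        assert np.isclose(np.linalg.det(np.linalg.inv(A)), 1 / np.linalg.det(A))
        assert np.isclose(np.linalg.det(A.T), np.linalg.det(A))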

  • Linear transformation T.

    Each vector v in the input space transforms to T(v) in the output space, and linearity requires T(cv + dw) = c T(v) + d T(w). Examples: matrix multiplication Av, differentiation and integration in function space.

  • Multiplier ℓij.

    The pivot row j is multiplied by ℓij and subtracted from row i to eliminate the (i, j) entry: ℓij = (entry to eliminate) / (jth pivot).
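
    One elimination step as a sketch (invented 2x2 example): the multiplier ℓ21 clears the (2, 1) entry.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [6.0, 8.0]])

        l21 = A[1, 0] / A[0, 0]    # (entry to eliminate) / (first pivot) = 6/2 = 3
        A[1] -= l21 * A[0]         # subtract 3 times the pivot row from row 2
        print(A)                   # [[2. 1.]
                                   #  [0. 5.]]  -- the (2, 1) entry is now zero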

  • Pivot.

    The diagonal entry (first nonzero) at the time when a row is used in elimination.

  • Polar decomposition A = Q H.

    Orthogonal Q times positive (semi)definite H.
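
    One common way to compute the factors (a sketch via the SVD, not necessarily the textbook's route): if A = U S V^T, then Q = U V^T and H = V S V^T. The matrix below is invented.

        import numpy as np

        A = np.array([[1.0, -3.0],
                      [2.0, -1.0]])

        U, s, Vt = np.linalg.svd(A)
        Q = U @ Vt                        # orthogonal factor
        H = Vt.T @ np.diag(s) @ Vt        # positive semidefinite factor

        assert np.allclose(Q @ H, A)
        assert np.allclose(Q.T @ Q, np.eye(2))           # Q is orthogonal
        assert np.all(np.linalg.eigvalsh(H) >= -1e-12)   # H has nonnegative eigenvalues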

  • Saddle point of f(x1, ..., xn).

    A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xi ∂xj = Hessian matrix) is indefinite.
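
    A quick numeric illustration with f(x, y) = x^2 - y^2 (an invented example): the gradient vanishes at the origin and the Hessian there is indefinite.

        import numpy as np

        # f(x, y) = x^2 - y^2 has gradient (2x, -2y), which is zero at the origin.
        H = np.array([[2.0,  0.0],
                      [0.0, -2.0]])      # Hessian of second derivatives at (0, 0)

        eigs = np.linalg.eigvalsh(H)
        indefinite = (eigs.min() < 0) and (eigs.max() > 0)
        print(indefinite)                # True: (0, 0) is a saddle point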

  • Singular matrix A.

    A square matrix that has no inverse: det(A) = 0.

  • Spectrum of A = the set of eigenvalues {λ1, ..., λn}.

    Spectral radius = max of |λi|.
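
    A quick numpy sketch on an invented 2x2 matrix:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [0.0, -3.0]])

        spectrum = np.linalg.eigvals(A)             # the set of eigenvalues (2 and -3 here)
        spectral_radius = np.max(np.abs(spectrum))  # max of |lambda_i| = 3.0
        print(spectrum, spectral_radius)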

  • Standard basis for R^n.

    Columns of the n by n identity matrix (written i, j, k in R^3).

  • Wavelets wjk(t).

    Stretch and shift the time axis to create wjk(t) = w00(2^j t - k).
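
    A minimal sketch using the Haar mother wavelet as w00 (the Haar choice is an assumption made purely for illustration):

        import numpy as np

        def w00(t):
            """Haar mother wavelet: +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
            t = np.asarray(t, dtype=float)
            return np.where((t >= 0) & (t < 0.5), 1.0,
                            np.where((t >= 0.5) & (t < 1.0), -1.0, 0.0))

        def w(j, k, t):
            """wjk(t) = w00(2^j t - k): stretch the time axis by 2^j and shift by k."""
            return w00(2.0**j * t - k)

        t = np.linspace(0, 1, 9)
        print(w(1, 1, t))    # nonzero only where 2t - 1 lies in [0, 1), i.e. t in [0.5, 1)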
