Solutions for Chapter 9.8: Probability Axioms and Expected Value

Textbook: Discrete Mathematics with Applications
Edition: 4
Author: Susanna S. Epp
ISBN: 9780495391326

Chapter 9.8: Probability Axioms and Expected Value includes 23 full step-by-step solutions. Since all 23 problems in this chapter have been answered, more than 56882 students have viewed full step-by-step solutions from it. This expansive textbook survival guide covers the textbook's chapters and their solutions, and was created for Discrete Mathematics with Applications, edition 4, written by Susanna S. Epp (ISBN 9780495391326).

Key Math Terms and definitions covered in this textbook
  • Adjacency matrix of a graph.

    Square matrix with aij = 1 when there is an edge from node i to node j; otherwise aij = 0. A = AT when edges go both ways (undirected).
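
    As a concrete (and entirely made-up) illustration, here is a minimal numpy sketch that builds an adjacency matrix from a small edge list and tests the undirected condition A = AT:

    import numpy as np

    # Hypothetical edge list on 4 nodes (0-indexed), chosen only for illustration.
    edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
    n = 4

    A = np.zeros((n, n), dtype=int)
    for i, j in edges:
        A[i, j] = 1                     # aij = 1: edge from node i to node j

    print(A)
    print(np.array_equal(A, A.T))       # A = AT only when every edge goes both ways (False here)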

  • Augmented matrix [A b].

    Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.
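
    A quick numpy sketch of the rank test, using an arbitrary rank-1 matrix chosen only for illustration:

    import numpy as np

    A = np.array([[1., 2.], [2., 4.], [3., 6.]])    # rank 1: column 2 = 2 * column 1
    b_in  = np.array([[2.], [4.], [6.]])            # in the column space of A
    b_out = np.array([[2.], [4.], [7.]])            # not in the column space

    for b in (b_in, b_out):
        Ab = np.hstack([A, b])                      # augmented matrix [A b]
        print(np.linalg.matrix_rank(Ab) == np.linalg.matrix_rank(A))   # solvable exactly when ranks match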

  • Covariance matrix Σ.

    When random variables Xi have mean = average value = 0, their covariances Σij are the averages of XiXj. With means x̄i, the matrix Σ = mean of (x - x̄)(x - x̄)T is positive (semi)definite; Σ is diagonal if the Xi are independent.
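
    A short numpy sketch, using randomly generated sample data, of forming Σ as the average of (x - x̄)(x - x̄)T and checking the two properties above:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 3))          # samples of a random vector with 3 independent components

    xbar = X.mean(axis=0)
    C = (X - xbar).T @ (X - xbar) / len(X)  # Sigma = mean of (x - xbar)(x - xbar)^T

    print(np.all(np.linalg.eigvalsh(C) >= -1e-12))   # positive semidefinite
    print(np.round(C, 2))                            # close to diagonal: the components are independent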

  • Elimination matrix = Elementary matrix Eij.

    The identity matrix with an extra -ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
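
    A minimal sketch (matrix and multiplier chosen arbitrarily) of building Eij and watching it knock out one entry:

    import numpy as np

    def elimination_matrix(n, i, j, l):
        """Identity with an extra -l in entry (i, j); then E @ A subtracts l * (row j) from row i."""
        E = np.eye(n)
        E[i, j] = -l
        return E

    A = np.array([[2., 1.],
                  [6., 4.]])
    E = elimination_matrix(2, 1, 0, 3.0)    # multiplier 6/2 = 3
    print(E @ A)                            # [[2. 1.] [0. 1.]]: the entry below the pivot is eliminated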

  • Free columns of A.

    Columns without pivots; these are combinations of earlier columns.

  • Full column rank r = n.

    Independent columns, N(A) = {0}, no free variables.

  • Hypercube matrix pl.

    Row n + 1 counts corners, edges, faces, ... of a cube in Rn.

  • Incidence matrix of a directed graph.

    The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries -1 and 1 in columns i and j .
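
    For example (hypothetical edge list), the incidence matrix can be assembled row by row:

    import numpy as np

    edges = [(0, 1), (1, 2), (0, 2)]        # hypothetical directed edges: node i -> node j
    m, n = len(edges), 3

    M = np.zeros((m, n), dtype=int)
    for row, (i, j) in enumerate(edges):
        M[row, i] = -1                      # -1 in column i: the edge leaves node i
        M[row, j] = +1                      # +1 in column j: the edge enters node j

    print(M)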

  • Krylov subspace Kj(A, b).

    The subspace spanned by b, Ab, ..., A^(j-1)b. Numerical methods approximate A^-1 b by xj with residual b - Axj in this subspace. A good basis for Kj requires only multiplication by A at each step.
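
    A minimal sketch of the raw Krylov basis b, Ab, ..., A^(j-1)b for an arbitrary small matrix; practical Krylov methods also orthogonalize these columns as they go:

    import numpy as np

    def krylov_basis(A, b, j):
        """Columns b, Ab, ..., A^(j-1) b; each new column costs one multiplication by A."""
        cols = [b]
        for _ in range(j - 1):
            cols.append(A @ cols[-1])
        return np.column_stack(cols)

    A = np.array([[4., 1., 0.],
                  [1., 3., 1.],
                  [0., 1., 2.]])
    b = np.array([1., 0., 0.])
    print(krylov_basis(A, b, 3))            # basis (not yet orthogonalized) for K_3(A, b)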

  • Least squares solution x̂.

    The vector x̂ that minimizes the error ||e||^2 solves AT A x̂ = ATb. Then e = b - Ax̂ is orthogonal to all columns of A.
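
    A small numpy sketch with made-up data: solve the normal equations and confirm that the error e is orthogonal to the columns of A:

    import numpy as np

    A = np.array([[1., 0.],
                  [1., 1.],
                  [1., 2.]])                    # 3 equations, 2 unknowns: fit a line to 3 points
    b = np.array([1., 2., 2.])

    x_hat = np.linalg.solve(A.T @ A, A.T @ b)   # normal equations AT A xhat = AT b
    e = b - A @ x_hat
    print(x_hat)
    print(np.round(A.T @ e, 12))                # ~ [0 0]: the error is orthogonal to every column of A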

  • Left inverse A+.

    If A has full column rank n, then A+ = (AT A)^-1 AT has A+ A = In.
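
    A quick check, for an arbitrary full-column-rank A, that A+ = (AT A)^-1 AT really is a left inverse:

    import numpy as np

    A = np.array([[1., 0.],
                  [0., 1.],
                  [1., 1.]])                        # full column rank, n = 2
    A_plus = np.linalg.inv(A.T @ A) @ A.T           # A+ = (AT A)^-1 AT
    print(np.round(A_plus @ A, 12))                 # the 2 by 2 identity: A+ A = In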

  • Normal equation AT Ax = ATb.

    Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b - Ax) = 0.

  • Partial pivoting.

    In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓij| ≤ 1. See condition number.
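
    One pivoting step, sketched in numpy for an arbitrary 2 by 2 example: swap the largest entry of the column into the pivot position, and the multipliers stay at most 1 in size:

    import numpy as np

    A = np.array([[1e-4, 1.],
                  [2.,   1.]])
    col = 0

    p = col + np.argmax(np.abs(A[col:, col]))   # row holding the largest available pivot in this column
    A[[col, p]] = A[[p, col]]                   # swap it into the pivot position
    l = A[col + 1:, col] / A[col, col]          # multipliers for the elimination step
    print(np.all(np.abs(l) <= 1))               # True: partial pivoting keeps every multiplier at most 1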

  • Rayleigh quotient q(x) = xTAx / xTx for symmetric A: λmin ≤ q(x) ≤ λmax.

    Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).
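
    A numeric spot check with a randomly generated symmetric matrix:

    import numpy as np

    rng = np.random.default_rng(1)
    B = rng.normal(size=(4, 4))
    A = (B + B.T) / 2                       # a symmetric test matrix

    x = rng.normal(size=4)
    q = x @ A @ x / (x @ x)                 # Rayleigh quotient q(x)

    lam = np.linalg.eigvalsh(A)             # eigenvalues in ascending order
    print(lam[0] <= q <= lam[-1])           # True: lambda_min <= q(x) <= lambda_max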

  • Reflection matrix (Householder) Q = I - 2uuT.

    Unit vector u is reflected to Qu = -u. All x in the plane mirror uTx = 0 have Qx = x. Notice QT = Q^-1 = Q.
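
    A short numpy check of these three properties for an arbitrary unit vector u:

    import numpy as np

    u = np.array([1., 2., 2.]) / 3.          # unit vector ([1, 2, 2] has length 3)
    Q = np.eye(3) - 2 * np.outer(u, u)       # Q = I - 2 u uT

    print(np.allclose(Q @ u, -u))                               # u is reflected to -u
    print(np.allclose(Q, Q.T), np.allclose(Q @ Q, np.eye(3)))   # QT = Q^-1 = Q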

  • Saddle point of f(x1, ..., xn).

    A point where the first derivatives of f are zero and the second derivative matrix (∂2f/∂xi∂xj = Hessian matrix) is indefinite.
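
    As an illustration (the function is my choice, not from the text), f(x, y) = x^2 - y^2 has a saddle at the origin because its Hessian is indefinite:

    import numpy as np

    # f(x, y) = x^2 - y^2: the gradient (2x, -2y) vanishes at the origin.
    H = np.array([[2., 0.],
                  [0., -2.]])               # Hessian of f (constant here)

    print(np.linalg.eigvalsh(H))            # one negative and one positive eigenvalue: indefinite, so a saddle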

  • Schwarz inequality

    |v·w| ≤ ||v|| ||w||. Then |vTAw|^2 ≤ (vTAv)(wTAw) for positive definite A.
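
    A numeric spot check with randomly generated vectors and a positive definite matrix built for the occasion:

    import numpy as np

    rng = np.random.default_rng(2)
    v, w = rng.normal(size=3), rng.normal(size=3)
    print(abs(v @ w) <= np.linalg.norm(v) * np.linalg.norm(w))    # |v.w| <= ||v|| ||w||

    B = rng.normal(size=(3, 3))
    A = B.T @ B + np.eye(3)                                       # a positive definite A
    print((v @ A @ w) ** 2 <= (v @ A @ v) * (w @ A @ w))          # generalized Schwarz inequality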

  • Solvable system Ax = b.

    The right side b is in the column space of A.

  • Vector addition.

    v + w = (v1 + w1, ..., vn + wn) = diagonal of parallelogram.

  • Vector v in Rn.

    Sequence of n real numbers v = (v1, ..., vn) = point in Rn.
