 9.8.1: In any sample space S, what is P(∅)?
 9.8.2: Suppose A, B, and C are mutually exclusive events in a sample space...
 9.8.3: Suppose A and B are mutually exclusive events in a sample space S,C...
 9.8.4: Suppose A and B are events in a sample space S with probabilities 0...
 9.8.5: Suppose A and B are events in a sample space S and suppose that P(A...
 9.8.6: Suppose U and V are events in a sample space S and suppose that P(U...
 9.8.7: Suppose a sample space S consists of three outcomes: 0, 1, and 2. L...
 9.8.8: Redo exercise 7 assuming that P(A) = 0.5 and P(B) = 0.4.
 9.8.9: Let A and B be events in a sample space S, and let C = S − (A ∪ B). Sup...
 9.8.10: Redo exercise 9 assuming that P(A) = 0.7, P(B) = 0.3, and P(A ∩ B) = ...
 9.8.11: Prove that if S is any sample space and U and V are events in S wit...
 9.8.12: Prove that if S is any sample space and U and V are any events in S...
 9.8.13: Use the axioms for probability and mathematical induction to prove ...
 9.8.14: A lottery game offers $2 million to the grand prize winner, $20 to ...
 9.8.15: A company sends millions of people an entry form for a sweepstakes ...
 9.8.16: An urn contains four balls numbered 2, 2, 5, and 6. If a person sel...
 9.8.17: An urn contains five balls numbered 1, 2, 2, 8, and 8. If a person ...
 9.8.18: An urn contains five balls numbered 1, 2, 2, 8, and 8. If a person ...
 9.8.19: When a pair of balanced dice are rolled and the sum of the numbers ...
 9.8.20: Suppose a person offers to play a game with you. In this game, when...
 9.8.21: A person pays $1 to play the following game: The person tosses a fa...
 9.8.22: A fair coin is tossed until either a head comes up or four tails ar...
 9.8.23: A gambler repeatedly bets that a die will come up 6 when rolled. Ea...
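Exercises 14–23 all turn on expected value: the sum of each payoff times its probability. A minimal sketch of that computation, using a hypothetical urn game loosely modeled on exercise 16 (the full problem statements above are truncated, so the payoffs and probabilities here are illustrative assumptions):

```python
from fractions import Fraction

def expected_value(outcomes):
    """Expected value = sum of (payoff * probability) over all outcomes."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Hypothetical game: draw one ball from an urn holding balls numbered
# 2, 2, 5, and 6, and win that many dollars.  Each ball is equally
# likely, so each outcome has probability 1/4.
payoffs = [2, 2, 5, 6]
game = [(v, Fraction(1, 4)) for v in payoffs]
print(expected_value(game))  # 15/4, i.e. $3.75 per play
```

Exact fractions avoid the roundoff that creeps in when probabilities like 1/3 are written as floats.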
Solutions for Chapter 9.8: Probability Axioms and Expected Value
Full solutions for Discrete Mathematics with Applications, 4th Edition
ISBN: 9780495391326

Adjacency matrix of a graph.
Square matrix with aᵢⱼ = 1 when there is an edge from node i to node j; otherwise aᵢⱼ = 0. A = Aᵀ when edges go both ways (undirected).
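A minimal sketch of this construction (the 3-node graph and its edge list are made up for illustration):

```python
import numpy as np

# Directed graph on 3 nodes with edges 0->1, 1->2, 2->0.
edges = [(0, 1), (1, 2), (2, 0)]
n = 3
A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1          # a_ij = 1 when there is an edge from node i to node j

# For an undirected graph every edge goes both ways, so the matrix
# equals its own transpose.
undirected = A + A.T
assert (undirected == undirected.T).all()
```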

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Covariance matrix Σ.
When random variables xᵢ have mean = average value = 0, their covariances Σᵢⱼ are the averages of xᵢxⱼ. With means x̄ᵢ, the matrix Σ = mean of (x − x̄)(x − x̄)ᵀ is positive (semi)definite; Σ is diagonal if the xᵢ are independent.
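A sketch of that definition on simulated data (the sample size and random seed are arbitrary choices): two independent variables give a nearly diagonal Σ, and the eigenvalues come out nonnegative as positive semidefiniteness requires.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two independent zero-mean variables, 10000 samples each.
X = rng.standard_normal((10000, 2))
Xc = X - X.mean(axis=0)              # subtract the sample means
Sigma = (Xc.T @ Xc) / len(Xc)        # mean of (x - mean)(x - mean)^T

# Positive semidefinite: every eigenvalue is >= 0.
assert (np.linalg.eigvalsh(Sigma) >= 0).all()
```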

Elimination matrix = Elementary matrix Eᵢⱼ.
The identity matrix with an extra −ℓᵢⱼ in the i, j entry (i ≠ j). Then Eᵢⱼ A subtracts ℓᵢⱼ times row j of A from row i.
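A small sketch of that definition (the 2×2 matrix and the multiplier 2 are arbitrary; indices here are 0-based as is usual in code):

```python
import numpy as np

def elimination_matrix(n, i, j, m):
    """Identity matrix with an extra -m in entry (i, j), i != j.
    Multiplying E @ A subtracts m times row j of A from row i."""
    E = np.eye(n)
    E[i, j] = -m
    return E

A = np.array([[2.0, 1.0],
              [4.0, 5.0]])
E = elimination_matrix(2, 1, 0, 2.0)   # subtract 2 * row 0 from row 1
# E @ A has rows [2, 1] and [0, 3]: the pivot below position (0, 0) is cleared.
```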

Free columns of A.
Columns without pivots; these are combinations of earlier columns.

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Hypercube matrix pl.
Row n + 1 counts corners, edges, faces, ... of a cube in Rⁿ.

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and 1 in columns i and j.

Krylov subspace Kⱼ(A, b).
The subspace spanned by b, Ab, ..., Aʲ⁻¹b. Numerical methods approximate A⁻¹b by xⱼ with residual b − Axⱼ in this subspace. A good basis for Kⱼ requires only multiplication by A at each step.
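A minimal sketch of building such a basis, one matrix–vector product per step (the 2×2 matrix and vector are arbitrary test data; practical methods would orthogonalize these columns, e.g. by Arnoldi iteration, since the raw powers become nearly dependent):

```python
import numpy as np

def krylov_basis(A, b, j):
    """Columns b, Ab, ..., A^(j-1) b spanning the Krylov subspace K_j(A, b)."""
    cols = [b]
    for _ in range(j - 1):
        cols.append(A @ cols[-1])   # one matrix-vector product per step
    return np.column_stack(cols)

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 0.0])
K = krylov_basis(A, b, 2)           # columns b and Ab
```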

Least squares solution x̂.
The vector x̂ that minimizes the error ‖e‖² solves AᵀAx̂ = Aᵀb. Then e = b − Ax̂ is orthogonal to all columns of A.
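A minimal sketch fitting a line through three made-up points, solving the normal equations directly and checking the orthogonality property (for ill-conditioned problems one would prefer `np.linalg.lstsq`, which uses an SVD instead):

```python
import numpy as np

# Overdetermined system: fit y = c0 + c1*t through points (0,1), (1,2), (2,4).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

# Solve the normal equations A^T A x = A^T b.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# The error e = b - A x_hat is orthogonal to every column of A.
e = b - A @ x_hat
assert np.allclose(A.T @ e, 0)
```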

Left inverse A⁺.
If A has full column rank n, then A⁺ = (AᵀA)⁻¹Aᵀ has A⁺A = Iₙ.

Normal equation AᵀAx̂ = Aᵀb.
Gives the least squares solution to Ax = b if A has full rank n (independent columns). The equation says that (columns of A)·(b − Ax̂) = 0.

Partial pivoting.
In each column, choose the largest available pivot to control roundoff; all multipliers have |ℓᵢⱼ| ≤ 1. See condition number.
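A sketch of elimination with this pivot choice (the helper name and the 2×2 test matrix, with its dangerously small 10⁻⁸ pivot, are illustrative; production code would call a LAPACK-backed routine such as `scipy.linalg.lu`):

```python
import numpy as np

def eliminate_partial_pivoting(A):
    """Gaussian elimination choosing the largest available pivot in each
    column, so every multiplier satisfies |m| <= 1 (better roundoff control).
    Returns the upper triangular result and the row permutation applied."""
    A = np.array(A, dtype=float)
    n = len(A)
    perm = list(range(n))
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))   # largest pivot candidate in column k
        if p != k:
            A[[k, p]] = A[[p, k]]             # swap rows k and p
            perm[k], perm[p] = perm[p], perm[k]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]             # |m| <= 1 by the choice of pivot
            A[i, k:] -= m * A[k, k:]
    return A, perm

# Without the swap, the tiny pivot 1e-8 would produce a huge multiplier.
U, perm = eliminate_partial_pivoting([[1e-8, 1.0], [1.0, 1.0]])
```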

Rayleigh quotient q(x) = xᵀAx / xᵀx for symmetric A: λmin ≤ q(x) ≤ λmax.
Those extremes are reached at the eigenvectors x for λmin(A) and λmax(A).

Reflection matrix (Householder) Q = I − 2uuᵀ.
Unit vector u is reflected to Qu = −u. All x in the mirror plane uᵀx = 0 have Qx = x. Notice Qᵀ = Q⁻¹ = Q.
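A minimal sketch verifying these three properties (the vectors u and x are arbitrary choices):

```python
import numpy as np

def householder(u):
    """Reflection matrix Q = I - 2uu^T for a unit vector u."""
    u = u / np.linalg.norm(u)             # normalize, so u is a unit vector
    return np.eye(len(u)) - 2.0 * np.outer(u, u)

u = np.array([1.0, 0.0])
Q = householder(u)
assert np.allclose(Q @ u, -u)             # u is reflected to -u
x = np.array([0.0, 3.0])                  # x lies in the mirror plane u^T x = 0
assert np.allclose(Q @ x, x)              # mirror-plane vectors are unchanged
assert np.allclose(Q.T @ Q, np.eye(2))    # Q^T = Q^-1 = Q
```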

Saddle point of f(x₁, ..., xₙ).
A point where the first derivatives of f are zero and the second derivative matrix (∂²f/∂xᵢ∂xⱼ = Hessian matrix) is indefinite.

Schwarz inequality
|v·w| ≤ ‖v‖ ‖w‖. Then |vᵀAw|² ≤ (vᵀAv)(wᵀAw) for positive definite A.

Solvable system Ax = b.
The right side b is in the column space of A.

Vector addition.
v + w = (v₁ + w₁, ..., vₙ + wₙ) = diagonal of parallelogram.

Vector v in Rⁿ.
Sequence of n real numbers v = (v₁, ..., vₙ) = point in Rⁿ.