9.8.1E: In any sample space S, what is P(∅)?
 9.8.2E: Suppose A, B, and C are mutually exclusive events in a sample space...
 9.8.3E: Suppose A and B are mutually exclusive events in a sample space S,C...
 9.8.4E: Suppose A and B are events in a sample space S with probabilities 0...
 9.8.5E: Suppose A and B are events in a sample space S and suppose that P(A...
 9.8.6E: Suppose U and V are events in a sample space S and suppose that P(U...
 9.8.7E: Suppose a sample space S consists of three outcomes: 0, 1, and 2. L...
9.8.8E: Redo exercise 7 assuming that P(A) = 0.5 and P(B) = 0.4. Reference: S...
9.8.9E: Let A and B be events in a sample space S, and let C = S − (A ∪ B)....
9.8.10E: Redo exercise 9 assuming that P(A) = 0.7, P(B) = 0.3, and P(A ∩ B) ...
 9.8.11E: Prove that if S is any sample space and U and V are events in S wit...
 9.8.12E: Prove that if S is any sample space and U and V are any events in S...
 9.8.23E: A gambler repeatedly bets that a die will come up 6 when rolled. Ea...
Solutions for Chapter 9.8: Discrete Mathematics with Applications 4th Edition
ISBN: 9780495391326

Associative Law (AB)C = A(BC).
Parentheses can be removed to leave ABC.
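The associative law can be checked numerically; a minimal sketch with small, arbitrarily chosen matrices and a hand-rolled multiply:

```python
def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Arbitrary example matrices, chosen for illustration only.
A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
C = [[2, 0], [1, 1]]

left = matmul(matmul(A, B), C)   # (AB)C
right = matmul(A, matmul(B, C))  # A(BC)
assert left == right             # parentheses can be dropped: ABC
```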

Augmented matrix [A b].
Ax = b is solvable when b is in the column space of A; then [A b] has the same rank as A. Elimination on [A b] keeps equations correct.

Circulant matrix C.
Constant diagonals wrap around as in the cyclic shift S. Every circulant is c0 I + c1 S + ... + c_{n-1} S^{n-1}. Cx = convolution c * x. Eigenvectors are the columns of the Fourier matrix F.
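The identity Cx = c * x can be verified directly with index arithmetic on plain lists; the vectors c and x below are arbitrary examples:

```python
def circulant_times(c, x):
    """Multiply the circulant matrix with first column c by the vector x."""
    n = len(c)
    return [sum(c[(i - j) % n] * x[j] for j in range(n)) for i in range(n)]

def cyclic_convolution(c, x):
    """Cyclic convolution c * x, which equals Cx."""
    n = len(c)
    return [sum(c[k] * x[(i - k) % n] for k in range(n)) for i in range(n)]

c = [1, 2, 3, 4]
x = [5, 6, 7, 8]
assert circulant_times(c, x) == cyclic_convolution(c, x)
```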

Complete solution x = xp + xn to Ax = b.
(Particular xp) + (xn in nullspace).

Cyclic shift S.
Permutation with S21 = 1, S32 = 1, ..., finally S1n = 1. Its eigenvalues are the nth roots e^{2πik/n} of 1; eigenvectors are the columns of the Fourier matrix F.
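A quick check, using cmath, that a Fourier column is an eigenvector of S; n = 4 and k = 1 are arbitrary choices, and the eigenvalue w^(-k) is again an nth root of 1:

```python
import cmath

def cyclic_shift(x):
    """Apply S: the last entry wraps to the front, all others move down one."""
    return [x[-1]] + x[:-1]

n, k = 4, 1
w = cmath.exp(2j * cmath.pi / n)        # primitive nth root of unity
f = [w ** (j * k) for j in range(n)]    # column k of the Fourier matrix F
Sf = cyclic_shift(f)
lam = w ** (-k)                         # eigenvalue: an nth root of 1
assert all(abs(Sf[j] - lam * f[j]) < 1e-12 for j in range(n))
```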

Diagonalization
Λ = S^{-1}AS. Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. Every A^k = S Λ^k S^{-1}.
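A sketch of A^k = S Λ^k S^{-1} for a 2 by 2 example; the matrix A = [[2, 1], [1, 2]] (eigenvalues 3 and 1) and its eigenvector matrix were chosen by hand for illustration:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1], [1, 2]]                  # eigenvalues 3 and 1
S = [[1, 1], [1, -1]]                 # columns are eigenvectors of A
S_inv = [[0.5, 0.5], [0.5, -0.5]]     # inverse of S, computed by hand
Lam3 = [[3 ** 3, 0], [0, 1 ** 3]]     # Lambda^3: cube the eigenvalues

A3_direct = matmul(A, matmul(A, A))   # A^3 by repeated multiplication
A3_diag = matmul(S, matmul(Lam3, S_inv))
assert all(abs(A3_diag[i][j] - A3_direct[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```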

Dot product = Inner product x^T y = x1y1 + ... + xnyn.
Complex dot product is x̄^T y. Perpendicular vectors have x^T y = 0. (AB)ij = (row i of A)·(column j of B).

Echelon matrix U.
The first nonzero entry (the pivot) in each row comes in a later column than the pivot in the previous row. All zero rows come last.
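The pivot-position rule translates into a short checker; `is_echelon` is a hypothetical helper written for this note, not textbook code:

```python
def is_echelon(U):
    """True if each pivot sits right of the previous pivot and zero rows come last."""
    last_pivot = -1
    seen_zero_row = False
    for row in U:
        nonzero = [j for j, v in enumerate(row) if v != 0]
        if not nonzero:
            seen_zero_row = True      # zero rows must all come at the end
            continue
        if seen_zero_row or nonzero[0] <= last_pivot:
            return False
        last_pivot = nonzero[0]
    return True

assert is_echelon([[1, 2, 3], [0, 4, 5], [0, 0, 0]])
assert not is_echelon([[0, 1, 2], [3, 4, 5]])   # second pivot moved left
```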

Fibonacci numbers
0, 1, 1, 2, 3, 5, ... satisfy Fn = Fn-1 + Fn-2 = (λ1^n − λ2^n)/(λ1 − λ2). Growth rate λ1 = (1 + √5)/2 is the largest eigenvalue of the Fibonacci matrix [[1, 1], [1, 0]].
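The closed form (Binet's formula) can be confirmed against direct iteration; a small sketch:

```python
import math

lam1 = (1 + math.sqrt(5)) / 2   # growth rate, largest eigenvalue
lam2 = (1 - math.sqrt(5)) / 2   # the other eigenvalue

def fib(n):
    """Compute F_n by iteration from F_0 = 0, F_1 = 1."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# F_n = (lam1^n - lam2^n) / (lam1 - lam2), up to floating-point rounding.
for n in range(15):
    binet = (lam1 ** n - lam2 ** n) / (lam1 - lam2)
    assert round(binet) == fib(n)
```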

Full column rank r = n.
Independent columns, N(A) = {0}, no free variables.

Gauss-Jordan method.
Invert A by row operations on [A I] to reach [I A^{-1}].
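A pure-Python sketch of the method: rows of the augmented matrix [A I] are reduced until the left block is I. Partial pivoting is added for numerical stability, and the 2 by 2 example is arbitrary:

```python
def gauss_jordan_inverse(A):
    """Invert A by row operations on [A I], reaching [I A^-1]."""
    n = len(A)
    # Build the augmented matrix [A I].
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]          # scale pivot row to pivot 1
        for r in range(n):
            if r != col and M[r][col] != 0:       # clear the rest of the column
                factor = M[r][col]
                M[r] = [v - factor * w for v, w in zip(M[r], M[col])]
    return [row[n:] for row in M]                 # right block is now A^-1

A = [[2.0, 1.0], [1.0, 1.0]]
assert gauss_jordan_inverse(A) == [[1.0, -1.0], [-1.0, 2.0]]
```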

Incidence matrix of a directed graph.
The m by n edge-node incidence matrix has a row for each edge (node i to node j), with entries −1 and +1 in columns i and j.
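A sketch of building such a matrix; the sign convention (−1 at the start node, +1 at the end node) follows the definition above, and the 3-node graph is an arbitrary example:

```python
def incidence_matrix(n_nodes, edges):
    """Row per directed edge (i, j): -1 in column i, +1 in column j."""
    M = []
    for i, j in edges:
        row = [0] * n_nodes
        row[i] = -1
        row[j] = 1
        M.append(row)
    return M

# Example directed graph: edges 0->1, 1->2, 0->2.
A = incidence_matrix(3, [(0, 1), (1, 2), (0, 2)])
assert A == [[-1, 1, 0], [0, -1, 1], [-1, 0, 1]]
# Every row sums to 0, so the all-ones vector lies in the nullspace of A.
assert all(sum(row) == 0 for row in A)
```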

Linearly dependent v1, ..., vn.
A combination other than all ci = 0 gives Σ ci vi = 0.

Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^{-1} and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
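These identities are easy to verify for rotated coordinate axes in R^2; a sketch where q1 and q2 come from an arbitrary rotation angle:

```python
import math

theta = 0.7                                  # arbitrary angle
q1 = [math.cos(theta), math.sin(theta)]      # orthonormal pair in R^2
q2 = [-math.sin(theta), math.cos(theta)]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

assert abs(dot(q1, q2)) < 1e-12              # q_i^T q_j = 0 for i != j
assert abs(dot(q1, q1) - 1) < 1e-12          # unit length

# Expand any v in the orthonormal basis: v = sum (v^T q_j) q_j
v = [3.0, -2.0]
recon = [dot(v, q1) * q1[i] + dot(v, q2) * q2[i] for i in range(2)]
assert all(abs(recon[i] - v[i]) < 1e-12 for i in range(2))
```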

Plane (or hyperplane) in Rn.
Vectors x with a^T x = 0. The plane is perpendicular to a ≠ 0.

Pseudoinverse A+ (Moore-Penrose inverse).
The n by m matrix that "inverts" A from column space back to row space, with N(A+) = N(A^T). A+A and AA+ are the projection matrices onto the row space and column space. rank(A+) = rank(A).

Reflection matrix (Householder) Q = I − 2uu^T.
Unit vector u is reflected to Qu = −u. All x in the plane mirror u^T x = 0 have Qx = x. Notice Q^T = Q^{-1} = Q.
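Both reflection properties can be checked for a concrete unit vector; u = (0.6, 0.8) is an arbitrary choice:

```python
def householder(u):
    """Build Q = I - 2 u u^T for a unit vector u."""
    n = len(u)
    return [[(1.0 if i == j else 0.0) - 2 * u[i] * u[j] for j in range(n)]
            for i in range(n)]

def matvec(Q, x):
    return [sum(Q[i][j] * x[j] for j in range(len(x))) for i in range(len(Q))]

u = [0.6, 0.8]                 # unit vector
Q = householder(u)
Qu = matvec(Q, u)
assert all(abs(Qu[i] + u[i]) < 1e-12 for i in range(2))   # Qu = -u

x = [0.8, -0.6]                # x in the mirror plane: u^T x = 0
Qx = matvec(Q, x)
assert all(abs(Qx[i] - x[i]) < 1e-12 for i in range(2))   # Qx = x
```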

Saddle point of f(x1, ..., xn).
A point where the first derivatives of f are zero and the second-derivative matrix (∂²f/∂xi∂xj = Hessian matrix) is indefinite.
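For f(x, y) = x² − y² the origin is the classic saddle; a minimal check that its Hessian is indefinite (for a 2 by 2 symmetric matrix, a negative determinant means one eigenvalue of each sign):

```python
# Hessian of f(x, y) = x^2 - y^2: second derivatives are constant.
H = [[2, 0], [0, -2]]

# det < 0 for a 2x2 symmetric matrix <=> one positive and one negative
# eigenvalue <=> indefinite, so the critical point is a saddle.
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
assert det < 0
```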

Singular matrix A.
A square matrix that has no inverse: det(A) = 0.

Singular Value Decomposition (SVD) A = UΣV^T = (orthogonal)(diagonal)(orthogonal).
The first r columns of U and V are orthonormal bases of C(A) and C(A^T), with Avi = σi ui and singular value σi > 0. The last columns are orthonormal bases of the nullspaces.
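The relation Avi = σi ui can be checked on a small matrix whose SVD factors are found by inspection; the example A = [[0, 2], [3, 0]] is arbitrary:

```python
# Hand-worked SVD factors for A (found by inspection, for illustration).
A = [[0, 2], [3, 0]]
U = [[0, 1], [1, 0]]           # orthogonal: left singular vectors
sigma = [3, 2]                 # singular values, in decreasing order
V = [[1, 0], [0, 1]]           # orthogonal: right singular vectors

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

# Check A v_i = sigma_i u_i for each singular pair.
for i in range(2):
    v_i = [V[r][i] for r in range(2)]
    u_i = [U[r][i] for r in range(2)]
    assert matvec(A, v_i) == [sigma[i] * u for u in u_i]
```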