- 34.34.1: Find the expected value of the random variables X, Y , and Z in Exe...
- 34.34.2: Let (S, P) be the sample space with S = {a, b, c} and P(s) = 1/3 ...
- 34.34.3: A pair of tetrahedral dice are rolled (see Exercise 30.8). Let X be...
- 34.34.4: You play a game in which you roll a die and you win (in dollars) th...
- 34.34.5: A basket holds 100 chips that are labeled with the integers 1 throu...
- 34.34.6: A coin is flipped 100 times. Let XH be the number of HEADS and XT t...
- 34.34.7: Skeeball is an arcade game in which a player rolls a ball along a r...
- 34.34.8: The term expected value can be a bit deceiving. The following quest...
- 34.34.9: Prove Proposition 34.11.
- 34.34.10: Suppose X is a zero-one random variable. Prove that E(X) = E(X^2).
- 34.34.11: Let X be a binomial random variable as in Example 33.5. Prove that ...
- 34.34.12: Let n be a positive integer. A random integer N between 0 and 2^n - 1...
- 34.34.13: Let X be a random variable whose value is never zero. Prove or disp...
- 34.34.14: In Theorem 34.14 we learn that if X and Y are independent random var...
- 34.34.15: Let X and Y be real-valued random variables defined on a sample spa...
- 34.34.16: Let (S, P) be a sample space and let A ⊆ S be an event. Define a ran...
- 34.34.17: Markov's inequality. Let (S, P) be a sample space and let X : S → N ...
- 34.34.18: Find the variance of the random variables X, Y , and Z in Exercise ...
- 34.34.19: Let (S, P) be the sample space in which S = {1, 2, 3, 4} and P is ...
- 34.34.20: Let X be the number produced in a toss of a tetrahedral die. Calcul...
- 34.34.21: Let X be defined on a sample space (S, P) and suppose that P(s) ≠ ...
- 34.34.22: Suppose X and Y are independent random variables defined on a sampl...
- 34.34.23: A pair of dice are tossed. Let X be the sum of the numbers on the t...
- 34.34.24: Chebyshev's inequality. Let X be a nonnegative-integer-valued random...
- 34.34.25: Let X and Y be random variables defined on a common sample space. T...
Solutions for Chapter 34: Expectation
Full solutions for Mathematics: A Discrete Introduction | 3rd Edition
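Several of the expectation facts above can be checked numerically. Below is a minimal sketch for Exercise 34.34.10 using an invented three-outcome sample space (the outcomes, probabilities, and values of X are hypothetical, chosen only for illustration): since a zero-one random variable takes only the values 0 and 1, and 0^2 = 0 and 1^2 = 1, X^2 equals X pointwise, so E(X) = E(X^2).

```python
from fractions import Fraction

# Hypothetical sample space for illustration: probabilities summing to 1,
# and a zero-one random variable X defined on the outcomes.
P = {"a": Fraction(1, 2), "b": Fraction(1, 3), "c": Fraction(1, 6)}
X = {"a": 1, "b": 0, "c": 1}  # X takes only the values 0 and 1

def expectation(f, P):
    """E(f) = sum over outcomes s of f(s) * P(s)."""
    return sum(f[s] * P[s] for s in P)

E_X = expectation(X, P)
E_X2 = expectation({s: X[s] ** 2 for s in X}, P)

# Since 0^2 = 0 and 1^2 = 1, X^2 = X pointwise, so the two expectations agree.
assert E_X == E_X2
print(E_X)  # 1/2 + 1/6 = 2/3
```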
Affine transformation.
Tv = Av + v0 = linear transformation plus shift.
Block matrix.
A matrix can be partitioned into matrix blocks, by cuts between rows and/or between columns. Block multiplication of AB is allowed if the block shapes permit.
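Block multiplication can be verified directly on a small example (the two 4x4 matrices below are invented for illustration): cutting A and B into 2x2 blocks and multiplying blockwise reproduces the full product AB.

```python
import numpy as np

# Two 4x4 matrices, each cut into four 2x2 blocks.
A = np.arange(16).reshape(4, 4).astype(float)
B = np.arange(16)[::-1].reshape(4, 4).astype(float)

A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

# The usual row-times-column rule, applied block by block.
top = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bottom = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
blockwise = np.vstack([top, bottom])

assert np.allclose(blockwise, A @ B)  # matches ordinary multiplication
```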
Complex conjugate.
z̄ = a - ib for any complex number z = a + ib. Then z z̄ = |z|^2.
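Python's built-in complex type can illustrate this identity (the particular number z = 3 + 4i is an arbitrary example): multiplying z by its conjugate gives |z|^2 with zero imaginary part.

```python
z = 3 + 4j                 # example complex number a + ib
zbar = z.conjugate()       # a - ib

assert zbar == 3 - 4j
# z * zbar = a^2 + b^2 = |z|^2, a real number.
assert (z * zbar).imag == 0.0
assert (z * zbar).real == abs(z) ** 2   # 9 + 16 = 25
```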
Determinant |A| = det(A).
Defined by det I = 1, sign reversal for row exchange, and linearity in each row. Then |A| = 0 when A is singular. Also |AB| = |A| |B| and |A^-1| = 1/|A|.
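These determinant rules can be checked numerically (the 2x2 matrices below are invented examples): the product rule, the sign reversal under a row exchange, and the zero determinant of a singular matrix.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
B = np.array([[0.0, 1.0], [1.0, 4.0]])

# Product rule: |AB| = |A| |B|
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# A row exchange reverses the sign of the determinant.
swapped = A[[1, 0], :]
assert np.isclose(np.linalg.det(swapped), -np.linalg.det(A))

# A singular matrix has determinant 0 (second row = 2 * first row).
S = np.array([[1.0, 2.0], [2.0, 4.0]])
assert np.isclose(np.linalg.det(S), 0.0)
```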
Diagonalization Λ = S^-1 A S.
Λ = eigenvalue matrix and S = eigenvector matrix of A. A must have n independent eigenvectors to make S invertible. All A^k = S Λ^k S^-1.
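A quick numerical check of the diagonalization identity (the 2x2 matrix below is an invented example with two independent eigenvectors): S holds the eigenvectors in its columns, Λ the eigenvalues on its diagonal, and powers of A come from powers of Λ.

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])   # eigenvalues 5 and 2
eigvals, S = np.linalg.eig(A)            # columns of S are eigenvectors
Lam = np.diag(eigvals)

# A = S Lam S^-1 ...
assert np.allclose(A, S @ Lam @ np.linalg.inv(S))

# ... so A^k = S Lam^k S^-1: power the eigenvalues, not the matrix.
k = 5
assert np.allclose(np.linalg.matrix_power(A, k),
                   S @ np.diag(eigvals ** k) @ np.linalg.inv(S))
```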
Dimension of vector space
dim(V) = number of vectors in any basis for V.
Elimination matrix = Elementary matrix Eij.
The identity matrix with an extra -ℓij in the i, j entry (i ≠ j). Then Eij A subtracts ℓij times row j of A from row i.
A = LU. If elimination takes A to U without row exchanges, then the lower triangular L with multipliers ℓij (and ℓii = 1) brings U back to A.
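The factorization can be carried out by hand on a small example (the 3x3 matrix below is invented and happens to need no row exchanges): elimination records each multiplier ℓij in L while reducing A to the upper triangular U, and multiplying L times U recovers A.

```python
import numpy as np

A = np.array([[ 2.0,  1.0, 1.0],
              [ 4.0, -6.0, 0.0],
              [-2.0,  7.0, 2.0]])

# Elimination without row exchanges, storing the multipliers in L.
n = A.shape[0]
U = A.copy()
L = np.eye(n)
for j in range(n):
    for i in range(j + 1, n):
        L[i, j] = U[i, j] / U[j, j]      # multiplier l_ij
        U[i, :] -= L[i, j] * U[j, :]     # subtract l_ij times row j from row i

assert np.allclose(np.tril(L), L)        # L is lower triangular with l_ii = 1
assert np.allclose(np.triu(U), U)        # U is upper triangular
assert np.allclose(L @ U, A)             # L brings U back to A
```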
Full row rank r = m.
Independent rows, at least one solution to Ax = b, column space is all of R^m. Full rank means full column rank or full row rank.
Krylov subspace Kj(A, b).
The subspace spanned by b, Ab, ..., A^(j-1) b. Numerical methods approximate A^-1 b by x_j with residual b - A x_j in this subspace. A good basis for K_j requires only multiplication by A at each step.
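A minimal sketch of the Krylov idea (the 2x2 matrix and right-hand side are invented, and with j = n = 2 the subspace is all of R^2, so the approximation is exact): build the basis with one matrix-vector product per step, then pick the combination x_j in K_j that minimizes the residual.

```python
import numpy as np

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 0.0])

# Krylov basis b, Ab, ..., A^(j-1) b: one multiplication by A per step.
j = 2
basis = [b]
for _ in range(j - 1):
    basis.append(A @ basis[-1])
K = np.column_stack(basis)

# Choose x_j = K y minimizing the residual ||b - A x_j|| over the subspace.
y, *_ = np.linalg.lstsq(A @ K, b, rcond=None)
x_j = K @ y

assert np.allclose(A @ x_j, b)   # exact here, since K_2 = R^2
```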
Left nullspace N(A^T).
Nullspace of A^T = "left nullspace" of A because y^T A = 0^T.
Lucas numbers Ln.
Ln = 2, 1, 3, 4, ... satisfy Ln = L(n-1) + L(n-2) = λ1^n + λ2^n, with λ1, λ2 = (1 ± √5)/2 from the Fibonacci matrix [1 1; 1 0]. Compare L0 = 2 with F0 = 0.
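The recurrence and the eigenvalue formula can be checked against each other in a few lines: computing Ln by the recurrence and by λ1^n + λ2^n gives the same integers (up to rounding of the irrational powers).

```python
from math import sqrt

def lucas(n):
    """Lucas numbers by the recurrence L_n = L_{n-1} + L_{n-2}."""
    a, b = 2, 1          # L0 = 2, L1 = 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Eigenvalues of the Fibonacci matrix [1 1; 1 0].
lam1 = (1 + sqrt(5)) / 2
lam2 = (1 - sqrt(5)) / 2

# The eigenvalue formula L_n = lam1^n + lam2^n matches the recurrence.
for n in range(10):
    assert round(lam1 ** n + lam2 ** n) == lucas(n)

print([lucas(n) for n in range(6)])  # [2, 1, 3, 4, 7, 11]
```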
Markov matrix M.
All mij > 0 and each column sum is 1. Largest eigenvalue λ = 1. If mij > 0, the columns of M^k approach the steady state eigenvector Ms = s > 0.
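The approach to steady state is easy to see by repeated multiplication (the 2x2 Markov matrix below is an invented example): starting from any probability vector, applying M many times converges to the eigenvector s with Ms = s.

```python
import numpy as np

# A Markov matrix: positive entries, each column sums to 1.
M = np.array([[0.8, 0.3],
              [0.2, 0.7]])
assert np.allclose(M.sum(axis=0), 1.0)

# Powers M^k drive any starting distribution toward the steady state.
v = np.array([1.0, 0.0])
for _ in range(100):
    v = M @ v

assert np.allclose(M @ v, v)        # v is (numerically) the eigenvector for lambda = 1
assert np.allclose(v, [0.6, 0.4])   # steady state for this particular M
```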
Matrix multiplication AB.
The i, j entry of AB is (row i of A) · (column j of B) = Σ aik bkj. By columns: column j of AB = A times column j of B. By rows: row i of A multiplies B. Columns times rows: AB = sum of (column k)(row k). All these equivalent definitions come from the rule that AB times x equals A times Bx.
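All four descriptions of AB can be verified on one small example (the 2x2 matrices below are invented): the entry rule, the column picture, the row picture, and the sum of column-times-row outer products all give the same product.

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
AB = A @ B

# Entry rule: (AB)_ij = sum_k a_ik * b_kj
assert AB[0, 1] == sum(A[0, k] * B[k, 1] for k in range(2))

# By columns: column j of AB = A times column j of B
assert np.allclose(AB[:, 1], A @ B[:, 1])

# By rows: row i of AB = (row i of A) times B
assert np.allclose(AB[0, :], A[0, :] @ B)

# Columns times rows: AB = sum of (column k of A)(row k of B)
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(2))
assert np.allclose(AB, outer_sum)
```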
Orthogonal subspaces V and W.
Every v in V is orthogonal to every w in W.
Orthonormal vectors q1, ..., qn.
Dot products are qi^T qj = 0 if i ≠ j and qi^T qi = 1. The matrix Q with these orthonormal columns has Q^T Q = I. If m = n then Q^T = Q^-1 and q1, ..., qn is an orthonormal basis for R^n: every v = Σ (v^T qj) qj.
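The properties of an orthonormal basis can be demonstrated concretely (the rotation-by-45° matrix below is an invented example of a square Q): Q^T Q = I, Q^T = Q^-1, and any vector expands as the sum of its projections onto the columns.

```python
import numpy as np

# Orthonormal basis for R^2: the columns of a rotation matrix.
c = 1 / np.sqrt(2)
Q = np.array([[c, -c],
              [c,  c]])

assert np.allclose(Q.T @ Q, np.eye(2))      # orthonormal columns
assert np.allclose(Q.T, np.linalg.inv(Q))   # square, so Q^T = Q^-1

# Expand any v in this basis: v = sum_j (v^T q_j) q_j
v = np.array([3.0, 4.0])
expansion = sum((v @ Q[:, j]) * Q[:, j] for j in range(2))
assert np.allclose(v, expansion)
```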
Permutation matrix P.
There are n! orders of 1, ..., n. The n! P's have the rows of I in those orders. PA puts the rows of A in the same order. P is even or odd (det P = 1 or -1) based on the number of row exchanges to reach I.
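A permutation matrix is just the identity with reordered rows (the order [2, 0, 1] and matrix A below are invented for illustration): PA applies the same reordering to the rows of A, and det P records whether the permutation is even or odd.

```python
import numpy as np

order = [2, 0, 1]                 # an even permutation (two row exchanges)
P = np.eye(3)[order]              # rows of I in that order
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# PA puts the rows of A in the same order.
assert np.allclose(P @ A, A[order])

# det P = +1 for an even permutation, -1 for an odd one.
assert round(np.linalg.det(P)) == 1
```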
Singular matrix A.
A square matrix that has no inverse: det(A) = 0.
Stiffness matrix K.
If x gives the movements of the nodes, Kx gives the internal forces. K = A^T C A where C has spring constants from Hooke's Law and Ax = stretching.
Symmetric matrix A.
The transpose is A^T = A, and aij = aji. A^-1 is also symmetric.