 4.1: Five men and 5 women are ranked according to their scores on an exa...
 4.2: Let X represent the difference between the number of heads and the ...
 4.3: In 2, if the coin is assumed fair, for n = 3, what are the probabil...
 4.4: The distribution function of the random variable X is given F(x) = ...
 4.5: Suppose the random variable X has probability density function f (x...
 4.6: The amount of time, in hours, that a computer functions before brea...
 4.7: The lifetime in hours of a certain kind of radio tube is a random v...
 4.8: If the density function of X equals f (x) = % c e2x 0 < x < 0 x < 0...
 4.9: A set of five transistors are to be tested, one at a time in a rand...
 4.10: The joint probability density function of X and Y is given by f (x,...
 4.11: Let X1, X2, . . . , Xn be independent random variables, each having...
 4.12: The joint density of X and Y is given by f (x, y) = _ x e(x+y) x > ...
 4.13: The joint density of X and Y is f (x, y) = _ 2 0< x < y, 0 < y < 1 ...
 4.14: If the joint density function of X and Y factors into one part depe...
 4.15: Is consistent with the results of 12 and 13?
 4.16: Suppose that X and Y are independent continuous random variables. S...
 4.17: When a current I (measured in amperes) flows through a resistance R...
 4.18: In Example 4.3b, determine the conditional probability mass functio...
 4.19: Compute the conditional density function of X given Y = y in (a) an...
 4.20: Show that X and Y are independent if and only if (a) pXY (xy) = p...
 4.21: Compute the expected value of the random variable in 1.
 4.22: Compute the expected value of the random variable in 3.
 4.23: Each night different meteorologists give us the probability that it...
 4.24: An insurance company writes a policy to the effect that an amount o...
 4.25: A total of 4 buses carrying 148 students from the same school arriv...
 4.26: Suppose that two teams play a series of games that end when one of ...
 4.27: The density function of X is given by f (x) = _ a + b x2 0 x 1 0 ot...
 4.28: The lifetime in hours of electronic tubes is a random variable havi...
 4.29: Let X1, X2, . . . , Xn be independent random variables having the c...
 4.30: Suppose that X has density function f (x) = _ 1 0< x < 1 0 otherwis...
 4.31: The time it takes to repair a personal computer is a random variabl...
 4.32: If E[X] = 2 and E[X2] = 8, calculate (a) E[(2+4X)2] and (b) E[X2+(X...
 4.33: Ten balls are randomly chosen from an urn containing 17 white and 2...
 4.34: If X is a continuous random variable having distribution function F...
 4.35: The median, like the mean, is important in predicting the value of ...
 4.36: We say that mp is the 100p percentile of the distribution function ...
 4.37: A community consists of 100 married couples. If 50 members of the c...
 4.38: Compute the expectation and variance of the number of successes in ...
 4.39: Suppose that X is equally likely to take on any of the values 1, 2,...
 4.40: Let pi = P{X = i} and suppose that p1 + p2 + p3 = 1. If E[X] = 2, w...
 4.41: Compute the mean and variance of the number of heads that appear in...
 4.42: Argue that for any random variable X E[X2] (E[X])2 When does one ha...
 4.43: A random variable X, which represents the weight (in ounces) of an ...
 4.44: Let Xi denote the percentage of votes cast in a given election that...
 4.45: A product is classified according to the number of defects it conta...
 4.46: Find Corr(X1, X2) for the random variables of 44.
 4.47: Verify Equation 4.7.4.
 4.48: Prove Equation 4.7.5 by using mathematical induction.
 4.49: Let X have variance 2 x and let Y have variance 2 y . Starting with...
 4.50: Consider n independent trials, each of which results in any of the ...
 4.51: In Example 4.5f, compute Cov(Xi, Xj) and use this result to show th...
 4.52: If X1 and X2 have the same probability distribution function, show ...
 4.53: Suppose that X has density function f (x) = ex, x > 0 Compute the m...
 4.54: If the density function of X is f (x) = 1, 0 < x < 1 determine E[et...
 4.55: Suppose that X is a random variable with mean and variance both equ...
 4.56: From past experience, a professor knows that the test score of a st...
 4.57: Let X and Y have respective distribution functions FX and FY , and ...
Solutions for Chapter 4: Random Variables and Expectation
Full solutions for Introduction to Probability and Statistics for Engineers and Scientists  5th Edition
ISBN: 9780123948113
Since 57 problems in Chapter 4: Random Variables and Expectation have been answered, more than 8,794 students have viewed full step-by-step solutions from this chapter. This textbook survival guide was created for the textbook Introduction to Probability and Statistics for Engineers and Scientists, 5th edition (ISBN: 9780123948113). Chapter 4: Random Variables and Expectation includes 57 full step-by-step solutions, and this expansive survival guide covers the remaining chapters and their solutions as well.

2^(k-p) factorial experiment
A fractional factorial experiment with k factors tested in a 2^(-p) fraction, with all factors tested at only two levels (settings) each.

Alpha-error (or alpha-risk)
In hypothesis testing, the error incurred by rejecting a null hypothesis when it is actually true (also called a type I error).

Analysis of variance (ANOVA)
A method of decomposing the total variability in a set of observations, as measured by the sum of the squares of these observations from their average, into component sums of squares that are associated with specific defined sources of variation.

Average
See Arithmetic mean.

Binomial random variable
A discrete random variable that equals the number of successes in a fixed number of Bernoulli trials.
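The binomial pmf, P(X = k) = C(n, k) p^k (1-p)^(n-k), can be evaluated directly with the standard library. A minimal sketch (the n = 10 fair-coin example is illustrative, not from the text):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for a binomial random variable: C(n, k) * p^k * (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability of exactly 5 heads in 10 flips of a fair coin.
print(binomial_pmf(5, 10, 0.5))  # 0.24609375 (i.e., 252/1024)
```

Summing the pmf over k = 0, ..., n returns 1, a quick sanity check on any implementation.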

Bivariate normal distribution
The joint distribution of two normal random variables

Block
In experimental design, a group of experimental units or material that is relatively homogeneous. The purpose of dividing experimental units into blocks is to produce an experimental design wherein variability within blocks is smaller than variability between blocks. This allows the factors of interest to be compared in an environment that has less variability than in an unblocked experiment.

Chance cause
The portion of the variability in a set of observations that is due to only random forces and which cannot be traced to specific sources, such as operators, materials, or equipment. Also called a common cause.

Coefficient of determination
See R 2 .

Conditional probability distribution
The distribution of a random variable given that the random experiment produces an outcome in an event. The given event might specify values for one or more other random variables

Confounding
When a factorial experiment is run in blocks and the blocks are too small to contain a complete replicate of the experiment, one can run a fraction of the replicate in each block, but this results in losing information on some effects. These effects are linked with or confounded with the blocks. In general, when two factors are varied such that their individual effects cannot be determined separately, their effects are said to be confounded.

Continuity correction.
A correction factor used to improve the approximation to binomial probabilities from a normal distribution.
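Concretely, a probability such as P(X <= 12) for a binomial X is approximated by evaluating the normal cdf half a unit past the target value, at 12.5. A sketch under an assumed Bin(20, 0.5) example (illustrative only):

```python
from math import comb, sqrt
from statistics import NormalDist

n, p = 20, 0.5

# Exact binomial probability P(X <= 12).
exact = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(13))

# Normal approximation with continuity correction: cdf at 12.5,
# using mean n*p and standard deviation sqrt(n*p*(1-p)).
approx = NormalDist(n * p, sqrt(n * p * (1 - p))).cdf(12.5)

print(exact, approx)  # the two values agree to about three decimal places
```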

Correlation matrix
A square matrix that contains the correlations among a set of random variables, say, X1, X2, ..., Xk. The main diagonal elements of the matrix are unity and the off-diagonal elements rij are the correlations between Xi and Xj.

Counting techniques
Formulas used to determine the number of elements in sample spaces and events.

Degrees of freedom.
The number of independent comparisons that can be made among the elements of a sample. The term is analogous to the number of degrees of freedom for an object in a dynamic system, which is the number of independent coordinates required to determine the motion of the object.

Event
A subset of a sample space.

Generating function
A function that is used to determine properties of the probability distribution of a random variable. See Moment-generating function.

Generator
Effects in a fractional factorial experiment that are used to construct the experimental tests used in the experiment. The generators also define the aliases.

Geometric random variable
A discrete random variable that is the number of Bernoulli trials until a success occurs.
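Its pmf is P(X = k) = (1-p)^(k-1) * p for k = 1, 2, ..., with mean 1/p. A short sketch (the p = 0.25 value is illustrative; the sums are truncated at k = 199, where the tail is negligible):

```python
def geometric_pmf(k, p):
    """P(X = k): k-1 failures followed by the first success on trial k."""
    return (1 - p) ** (k - 1) * p

p = 0.25
# The probabilities sum to 1 and the mean is 1/p = 4 (up to truncation error).
total = sum(geometric_pmf(k, p) for k in range(1, 200))
mean = sum(k * geometric_pmf(k, p) for k in range(1, 200))
```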

Hat matrix.
In multiple regression, the matrix H = X(X'X)^(-1)X'. This is a projection matrix that maps the vector of observed response values into the vector of fitted values: y-hat = Hy = X(X'X)^(-1)X'y.
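A small numerical sketch with NumPy (the design matrix and responses below are invented for illustration) shows the projection property H H = H and the fitted-value mapping:

```python
import numpy as np

# Design matrix with an intercept column and one predictor (illustrative data).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.1, 1.9, 3.2, 3.8])

# Hat matrix H = X (X'X)^-1 X'.
H = X @ np.linalg.inv(X.T @ X) @ X.T

# Fitted values y-hat = H y.
y_hat = H @ y
print(y_hat)
```

H is symmetric and idempotent (H H = H), and its trace equals the number of columns of X, here 2.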