 3.9.1: Suppose that X1 and X2 are i.i.d. random variables and that each of...
 3.9.2: For the conditions of Exercise 1, find the p.d.f. of the average (X...
 3.9.3: Suppose that three random variables X1, X2, and X3 have a continuou...
 3.9.4: Suppose that X1 and X2 have a continuous joint distribution for whi...
 3.9.5: Suppose that the joint p.d.f. of X1 and X2 is as given in Exercise ...
 3.9.6: Let X and Y be random variables for which the joint p.d.f. is as fo...
 3.9.7: Suppose that X1 and X2 are i.i.d. random variables and that the p.d...
 3.9.8: Suppose that X1,...,Xn form a random sample of size n from the unif...
 3.9.9: Suppose that the n variables X1,...,Xn form a random sample from th...
 3.9.10: For the conditions of Exercise 9, determine the value of Pr(Y1 ≤ 0.1 ...
 3.9.11: For the conditions of Exercise 9, determine the probability that th...
 3.9.12: Let W denote the range of a random sample of n observations from th...
 3.9.13: Determine the p.d.f. of the range of a random sample of n observati...
 3.9.14: Suppose that X1,...,Xn form a random sample of n observations from ...
 3.9.15: Show that if X1, X2,...,Xn are independent random variables and if ...
 3.9.16: Suppose that X1, X2,...,X5 are five random variables for which the ...
 3.9.17: In Example 3.9.10, use the Jacobian method (3.9.13) to verify that ...
 3.9.18: Let the conditional p.d.f. of X given Y be g1(x | y) = 3x²/y³ for 0 ...
 3.9.19: Let X1 and X2 be as in Exercise 7. Find the p.d.f. of Y = X1 + X2.
 3.9.20: If a2 = 0 in Theorem 3.9.4, show that Eq. (3.9.2) becomes the same ...
 3.9.21: In Examples 3.9.9 and 3.9.11, find the marginal p.d.f. of Z1 = X1/X...
Solutions for Chapter 3.9: Random Variables and Distributions
Full solutions for Probability and Statistics, 4th Edition
ISBN: 9780321500465
Probability and Statistics, 4th edition, is associated with ISBN 9780321500465. Chapter 3.9: Random Variables and Distributions includes 21 full step-by-step solutions, and more than 15,828 students have viewed solutions from this chapter. This textbook survival guide also covers the remaining chapters and their solutions.

Acceptance region
In hypothesis testing, a region in the sample space of the test statistic such that if the test statistic falls within it, the null hypothesis cannot be rejected. This terminology is used because rejection of H0 is always a strong conclusion, while acceptance of H0 is generally a weak conclusion.

Addition rule
A formula used to determine the probability of the union of two (or more) events from the probabilities of the events and their intersection(s).
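For two events, the rule reads P(A ∪ B) = P(A) + P(B) − P(A ∩ B). A hedged sketch with invented die-roll events:

```python
# Addition rule with a fair six-sided die (illustrative numbers).
# A: roll is even {2, 4, 6}; B: roll is greater than 3 {4, 5, 6}.
p_a = 3 / 6
p_b = 3 / 6
p_a_and_b = 2 / 6                      # intersection {4, 6}
p_a_or_b = p_a + p_b - p_a_and_b       # union {2, 4, 5, 6}
print(p_a_or_b)
```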

Analysis of variance (ANOVA)
A method of decomposing the total variability in a set of observations, as measured by the sum of squares of these observations from their average, into component sums of squares that are associated with specific, defined sources of variation.
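As a minimal sketch of that decomposition, assuming two small made-up groups, one-way ANOVA satisfies the identity SS_total = SS_between + SS_within:

```python
# One-way ANOVA sum-of-squares decomposition (illustrative data).
g1 = [4.0, 6.0, 8.0]
g2 = [10.0, 12.0, 14.0]
pooled = g1 + g2
grand = sum(pooled) / len(pooled)          # grand average
m1 = sum(g1) / len(g1)
m2 = sum(g2) / len(g2)
ss_total = sum((v - grand) ** 2 for v in pooled)
ss_within = sum((v - m1) ** 2 for v in g1) + sum((v - m2) ** 2 for v in g2)
ss_between = len(g1) * (m1 - grand) ** 2 + len(g2) * (m2 - grand) ** 2
print(ss_total, ss_between + ss_within)    # the two totals agree
```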

Assignable cause
The portion of the variability in a set of observations that can be traced to specific causes, such as operators, materials, or equipment. Also called a special cause.

Attribute
A qualitative characteristic of an item or unit, usually arising in quality control. For example, classifying production units as defective or nondefective results in attributes data.

Bayes’ theorem
An equation for a conditional probability such as P(A | B) in terms of the reverse conditional probability P(B | A): P(A | B) = P(B | A)P(A)/P(B).
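A small numeric sketch, expanding P(B) by the law of total probability; the prior and conditional rates below are assumed for illustration, not taken from the text:

```python
# Bayes' theorem: P(A | B) = P(B | A) * P(A) / P(B),
# with P(B) = P(B | A)P(A) + P(B | not A)P(not A). Illustrative numbers.
p_a = 0.01                 # prior P(A)
p_b_given_a = 0.95         # P(B | A)
p_b_given_not_a = 0.05     # P(B | not A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)
```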

Bivariate normal distribution
The joint distribution of two jointly normal random variables.

Box plot (or box and whisker plot)
A graphical display of data in which the box contains the middle 50% of the data (the interquartile range) with the median dividing it, and the whiskers extend to the smallest and largest values (or some defined lower and upper limits).
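The quantities a box plot displays can be computed directly; this sketch uses Python's statistics module on invented data and the common 1.5 × IQR whisker convention:

```python
import statistics

# Box-plot summary statistics for an illustrative data set.
data = [2, 4, 4, 5, 7, 8, 9, 11, 12, 13, 20]
q1, median, q3 = statistics.quantiles(data, n=4, method="inclusive")
iqr = q3 - q1                      # spans the middle 50% of the data
lower_limit = q1 - 1.5 * iqr       # one common whisker convention
upper_limit = q3 + 1.5 * iqr
print(q1, median, q3, iqr)
```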

Chance cause
The portion of the variability in a set of observations that is due to only random forces and which cannot be traced to specific sources, such as operators, materials, or equipment. Also called a common cause.

Conditional mean
The mean of the conditional probability distribution of a random variable.

Conditional probability distribution
The distribution of a random variable given that the random experiment produces an outcome in an event. The given event might specify values for one or more other random variables.

Continuous distribution
A probability distribution for a continuous random variable.

Cook’s distance
In regression, Cook’s distance is a measure of the influence of each individual observation on the estimates of the regression model parameters. It expresses the distance that the vector of model parameter estimates with the ith observation removed lies from the vector of model parameter estimates based on all observations. Large values of Cook’s distance indicate that the observation is influential.

Correlation coefficient
A dimensionless measure of the linear association between two variables, usually lying in the interval from −1 to +1, with zero indicating the absence of correlation (but not necessarily the independence of the two variables).
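A from-scratch sketch with made-up data; y is an exact linear function of x, so r comes out at the +1 endpoint:

```python
import math

# Pearson correlation coefficient computed from its definition.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [3.0, 5.0, 7.0, 9.0, 11.0]       # y = 2x + 1, perfectly linear
mx = sum(x) / len(x)
my = sum(y) / len(y)
sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
sxx = sum((a - mx) ** 2 for a in x)
syy = sum((b - my) ** 2 for b in y)
r = sxy / math.sqrt(sxx * syy)       # dimensionless, in [-1, +1]
print(r)
```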

Covariance
A measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = E[(X − μX)(Y − μY)].
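A numeric sketch of Cov(X, Y) = E[(X − μX)(Y − μY)] on a small invented data set, averaged population-style:

```python
# Covariance from its definition (illustrative data, population averaging).
x = [2.0, 4.0, 6.0, 8.0]
y = [1.0, 3.0, 2.0, 6.0]
mu_x = sum(x) / len(x)
mu_y = sum(y) / len(y)
cov = sum((a - mu_x) * (b - mu_y) for a, b in zip(x, y)) / len(x)
print(cov)
```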

Cumulative normal distribution function
The cumulative distribution function of the standard normal distribution, often denoted Φ(x) and tabulated in Appendix Table II.
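Instead of a table lookup, Φ(x) can be evaluated through the error function, since Φ(x) = (1 + erf(x/√2))/2; a sketch:

```python
import math

def phi(x: float) -> float:
    """Standard normal c.d.f. via the error function (no table needed)."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(phi(0.0))          # 0.5 by symmetry
print(phi(1.96))         # roughly 0.975
```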

Dependent variable
The response variable in regression or a designed experiment.

Exhaustive
A property of a collection of events that indicates that their union equals the sample space.

Expected value
The expected value of a random variable X is its long-term average or mean value. In the continuous case, the expected value of X is E(X) = ∫ x f(x) dx, with the integral taken over the whole real line, where f(x) is the density function of the random variable X.
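A numeric check on an invented density: for an Exponential(1) density f(x) = e^(−x), the mean is known to be 1, and a Riemann-sum approximation of the integral of x f(x) recovers it:

```python
import math

# Riemann-sum sketch of E(X) = integral of x * f(x) dx for f(x) = exp(-x).
def f(x: float) -> float:
    return math.exp(-x)

dx = 1e-4
# Truncate the integral at x = 50, where the tail is negligible.
ex = sum(i * dx * f(i * dx) * dx for i in range(int(50 / dx)))
print(ex)                 # close to the exact mean, 1
```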

Finite population correction factor
A term in the formula for the variance of a hypergeometric random variable.
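Hedged sketch with made-up numbers: for the count of successes in a sample of n drawn without replacement from a population of N with success fraction p, the variance is n p(1 − p) multiplied by the correction factor (N − n)/(N − 1):

```python
# Finite population correction factor in the hypergeometric variance
# (all values illustrative).
N = 100                    # population size
K = 30                     # successes in the population
n = 10                     # sample size, drawn without replacement
p = K / N
fpc = (N - n) / (N - 1)    # shrinks the binomial variance n*p*(1-p)
var = n * p * (1 - p) * fpc
print(var)
```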