5.4-1: Let X1, X2, X3 be a random sample of size 3 from the distribution w...
5.4-2: Let X1 and X2 have independent distributions b(n1, p) and b(n2, p)....
5.4-3: Let X1, X2, X3 be mutually independent random variables with Poisso...
5.4-4: Generalize Exercise 5.4-3 by showing that the sum of n independent ...
5.4-5: Let Z1, Z2, ..., Z7 be a random sample from the standard normal di...
5.4-6: Let X1, X2, X3, X4, X5 be a random sample of size 5 from a geometri...
5.4-7: Let X1, X2, X3 denote a random sample of size 3 from a gamma distri...
5.4-8: Let W = X1 + X2 + ... + Xh, a sum of h mutually independent and identica...
5.4-9: Let X and Y, with respective pmfs f(x) and g(y), be independent di...
5.4-10: Let X equal the outcome when a fair four-sided die that has its fac...
5.4-11: Let X and Y equal the outcomes when two fair six-sided dice are rol...
5.4-12: Let X and Y be the outcomes when a pair of fair eight-sided dice is...
5.4-13: Let X1, X2, ..., X8 be a random sample from a distribution having ...
5.4-14: The number of accidents in a period of one week follows a Poisson d...
5.4-15: Given a fair four-sided die, let Y equal the number of rolls needed...
5.4-16: The number X of sick days taken during a year by an employee follow...
5.4-17: In a study concerning a new treatment of a certain disease, two gro...
5.4-18: The number of cracks on a highway averages 0.5 per mile and follows...
5.4-19: A doorman at a hotel is trying to get three taxicabs for three diff...
5.4-20: The time X in minutes of a visit to a cardiovascular disease specia...
5.4-21: Let X and Y be independent with distributions N(5, 16) and N(6, 9),...
5.4-22: Let X1 and X2 be two independent random variables. Let X1 and Y = X...
5.4-23: Let X be N(0, 1). Use the mgf technique to show that Y = X² is χ²(1)...
Solutions for Chapter 5.4: Distributions of Functions of Random Variables
Full solutions for Probability and Statistical Inference, 9th Edition
ISBN: 9780321923271
Probability and Statistical Inference, 9th edition, is associated with ISBN 9780321923271. Chapter 5.4: Distributions of Functions of Random Variables includes 40 full step-by-step solutions.

α error (or α risk)
In hypothesis testing, an error incurred by rejecting a null hypothesis when it is actually true (also called a type I error).

Axioms of probability
A set of rules that probabilities defined on a sample space must follow. See Probability.

Binomial random variable
A discrete random variable that equals the number of successes in a fixed number of Bernoulli trials.
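As an illustrative sketch (not from the textbook), this definition can be checked with a short Python simulation; the function name binomial_draw and all parameter values below are my own choices:

```python
import random

rng = random.Random(0)  # fixed seed so the sketch is reproducible

def binomial_draw(n, p):
    # A binomial(n, p) outcome is the number of successes
    # in n independent Bernoulli(p) trials.
    return sum(1 for _ in range(n) if rng.random() < p)

# The mean of b(n, p) is n*p; a quick simulation should come close.
draws = [binomial_draw(20, 0.3) for _ in range(5000)]
sample_mean = sum(draws) / len(draws)
```

With n = 20 and p = 0.3 the simulated mean should land near n*p = 6.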

C chart
An attribute control chart that plots the total number of defects per unit in a subgroup. Similar to a defects-per-unit or U chart.

Central tendency
The tendency of data to cluster around some value. Central tendency is usually expressed by a measure of location such as the mean, median, or mode.

Chi-square test
Any test of significance based on the chi-square distribution. The most common chi-square tests are (1) testing hypotheses about the variance or standard deviation of a normal distribution and (2) testing goodness of fit of a theoretical distribution to sample data.

Conditional probability
The probability of an event given that the random experiment produces an outcome in another event.
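A minimal Python sketch of this definition, using two fair six-sided dice; the events ("first die is even" and "sum is 8") and variable names are illustrative choices of mine, not from the source:

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair six-sided dice, 36 equally likely outcomes.
space = list(product(range(1, 7), repeat=2))
B = [pt for pt in space if pt[0] % 2 == 0]        # first die is even
A_and_B = [pt for pt in B if pt[0] + pt[1] == 8]  # ...and the sum is 8

# P(A | B) = P(A and B) / P(B)
p_cond = Fraction(len(A_and_B), len(space)) / Fraction(len(B), len(space))
```

Here A ∩ B contains (2,6), (4,4), (6,2), so P(A | B) = (3/36)/(18/36) = 1/6.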

Contour plot
A two-dimensional graphic used for a bivariate probability density function that displays curves for which the probability density function is constant.

Cook’s distance
In regression, Cook’s distance is a measure of the influence of each individual observation on the estimates of the regression model parameters. It expresses the distance that the vector of model parameter estimates with the ith observation removed lies from the vector of model parameter estimates based on all observations. Large values of Cook’s distance indicate that the observation is influential.

Covariance
A measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = E[(X − μX)(Y − μY)].
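The defining formula can be checked numerically with a small Python sketch; the helper covariance and the toy data are my own (note it divides by n, the population form matching the definition above):

```python
def covariance(xs, ys):
    # Sample analogue of Cov(X, Y) = E[(X - mx)(Y - my)].
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]          # ys = 2 * xs, so Cov(X, Y) = 2 * Var(X)
cov_xy = covariance(xs, ys)
var_x = covariance(xs, xs)  # covariance of X with itself is Var(X)
```

Because Y = 2X here, the computed covariance is exactly twice the variance of X.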

Covariance matrix
A square matrix that contains the variances and covariances among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are the variances of the random variables, and the off-diagonal elements are the covariances between Xi and Xj. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.

Crossed factors
Another name for factors that are arranged in a factorial experiment.

Cumulative distribution function
For a random variable X, the function of x defined as F(x) = P(X ≤ x) that is used to specify the probability distribution.
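For a concrete discrete case, the cdf of one roll of a fair six-sided die can be sketched in Python; the helper die_cdf is a hypothetical name of mine:

```python
def die_cdf(x):
    # F(x) = P(X <= x) for one roll of a fair six-sided die:
    # a step function that jumps by 1/6 at each face value 1..6.
    return sum(1 for face in range(1, 7) if face <= x) / 6
```

So die_cdf is 0 below 1, rises in steps of 1/6, and equals 1 from 6 onward.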

Cumulative normal distribution function
The cumulative distribution function of the standard normal distribution, often denoted as Φ(x) and tabulated in Appendix Table II.

Defectsperunit control chart
See U chart

Erlang random variable
A continuous random variable that is the sum of a fixed number of independent, exponential random variables.

Error mean square
The error sum of squares divided by its number of degrees of freedom.

Expected value
The expected value of a random variable X is its long-term average or mean value. In the continuous case, the expected value of X is E(X) = ∫ x f(x) dx, where the integral extends over −∞ < x < ∞ and f(x) is the density function of the random variable X.
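The continuous-case integral can be approximated numerically; this is an illustrative sketch assuming an exponential density with rate λ = 2 (mean 1/λ = 0.5), and the helper names are mine:

```python
import math

def expected_value(density, a, b, steps=100000):
    # Approximate E(X) = integral of x * density(x) dx over [a, b]
    # with a midpoint Riemann sum.
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h
        total += x * density(x)
    return total * h

lam = 2.0
exp_density = lambda x: lam * math.exp(-lam * x)  # exponential density
mean_est = expected_value(exp_density, 0.0, 40.0)  # tail past 40 is negligible
```

The estimate should agree with the exact mean 1/λ = 0.5 to several decimal places.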

Fixed factor (or fixed effect).
In analysis of variance, a factor or effect is considered fixed if all the levels of interest for that factor are included in the experiment. Conclusions are then valid about this set of levels only, although when the factor is quantitative, it is customary to fit a model to the data for interpolating between these levels.

Harmonic mean
The harmonic mean of a set of data values is the reciprocal of the arithmetic mean of the reciprocals of the data values; that is, h = [ (1/n) Σ (1/xᵢ) ]⁻¹, where the sum runs over i = 1, …, n.
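A short Python sketch of this formula, cross-checked against the standard library's statistics.harmonic_mean; the function name and toy data are my own:

```python
import statistics

def harmonic_mean(values):
    # Reciprocal of the arithmetic mean of the reciprocals:
    # h = [ (1/n) * sum(1/x_i) ] ** (-1)
    return len(values) / sum(1 / v for v in values)

data = [1, 4, 4]
h = harmonic_mean(data)   # 3 / (1 + 1/4 + 1/4) = 2.0
```

Both computations give 2.0 for this data set.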