 4.1: Five men and 5 women are ranked according to their scores on an exa...
 4.2: Let X represent the difference between the number of heads and the ...
 4.3: In Problem 4.2, if the coin is assumed fair, for n = 3, what are the probabil...
 4.4: The distribution function of the random variable X is given F(x) = ...
 4.5: Suppose the random variable X has probability density function f (x...
 4.6: The amount of time, in hours, that a computer functions before brea...
 4.7: The lifetime in hours of a certain kind of radio tube is a random v...
 4.8: If the density function of X equals f(x) = ce^(−2x) for 0 < x < ∞ and 0 for x < 0...
 4.9: A set of five transistors are to be tested, one at a time in a rand...
 4.10: The joint probability density function of X and Y is given by f (x,...
 4.11: Let X1, X2, . . . , Xn be independent random variables, each having...
 4.12: The joint density of X and Y is given by f(x, y) = xe^(−(x+y)), x > ...
 4.13: The joint density of X and Y is f(x, y) = 2, 0 < x < y, 0 < y < 1 ...
 4.14: If the joint density function of X and Y factors into one part depe...
 4.15: Is Problem 4.14 consistent with the results of Problems 4.12 and 4.13?
 4.16: Suppose that X and Y are independent continuous random variables. S...
 4.17: When a current I (measured in amperes) flows through a resistance R...
 4.18: In Example 4.3b, determine the conditional probability mass functio...
 4.19: Compute the conditional density function of X given Y = y in (a) an...
 4.20: Show that X and Y are independent if and only if (a) pXY (xy) = p...
 4.21: Compute the expected value of the random variable in Problem 4.1.
 4.22: Compute the expected value of the random variable in Problem 4.3.
 4.23: Each night different meteorologists give us the probability that it...
 4.24: An insurance company writes a policy to the effect that an amount o...
 4.25: A total of 4 buses carrying 148 students from the same school arriv...
 4.26: Suppose that two teams play a series of games that end when one of ...
 4.27: The density function of X is given by f(x) = a + bx², 0 ≤ x ≤ 1; 0 ot...
 4.28: The lifetime in hours of electronic tubes is a random variable havi...
 4.29: Let X1, X2, . . . , Xn be independent random variables having the c...
 4.30: Suppose that X has density function f(x) = 1, 0 < x < 1; 0 otherwis...
 4.31: The time it takes to repair a personal computer is a random variabl...
 4.32: If E[X] = 2 and E[X2] = 8, calculate (a) E[(2+4X)2] and (b) E[X2+(X...
 4.33: Ten balls are randomly chosen from an urn containing 17 white and 2...
 4.34: If X is a continuous random variable having distribution function F...
 4.35: The median, like the mean, is important in predicting the value of ...
 4.36: We say that mp is the 100p percentile of the distribution function ...
 4.37: A community consists of 100 married couples. If 50 members of the c...
 4.38: Compute the expectation and variance of the number of successes in ...
 4.39: Suppose that X is equally likely to take on any of the values 1, 2,...
 4.40: Let pi = P{X = i} and suppose that p1 + p2 + p3 = 1. If E[X] = 2, w...
 4.41: Compute the mean and variance of the number of heads that appear in...
 4.42: Argue that for any random variable X, E[X²] ≥ (E[X])². When does one ha...
 4.43: A random variable X, which represents the weight (in ounces) of an ...
 4.44: Let Xi denote the percentage of votes cast in a given election that...
 4.45: A product is classified according to the number of defects it conta...
 4.46: Find Corr(X1, X2) for the random variables of Problem 4.44.
 4.47: Verify Equation 4.7.4.
 4.48: Prove Equation 4.7.5 by using mathematical induction.
 4.49: Let X have variance σx² and let Y have variance σy². Starting with...
 4.50: Consider n independent trials, each of which results in any of the ...
 4.51: In Example 4.5f, compute Cov(Xi, Xj) and use this result to show th...
 4.52: If X1 and X2 have the same probability distribution function, show ...
 4.53: Suppose that X has density function f(x) = e^(−x), x > 0. Compute the m...
 4.54: If the density function of X is f (x) = 1, 0 < x < 1 determine E[et...
 4.55: Suppose that X is a random variable with mean and variance both equ...
 4.56: From past experience, a professor knows that the test score of a st...
 4.57: Let X and Y have respective distribution functions FX and FY , and ...
Solutions for Chapter 4: Random Variables and Expectation
Full solutions for Introduction to Probability and Statistics for Engineers and Scientists, 5th Edition
ISBN: 9780123948113

β-error (or β-risk)
In hypothesis testing, an error incurred by failing to reject a null hypothesis when it is actually false (also called a type II error).

Acceptance region
In hypothesis testing, a region in the sample space of the test statistic such that if the test statistic falls within it, the null hypothesis cannot be rejected. This terminology is used because rejection of H0 is always a strong conclusion and acceptance of H0 is generally a weak conclusion

Analytic study
A study in which a sample from a population is used to make inference to a future population. Stability needs to be assumed. See Enumerative study

Arithmetic mean
The arithmetic mean of a set of numbers x1, x2, …, xn is their sum divided by the number of observations, or x̄ = (1/n) Σᵢ₌₁ⁿ xᵢ. The arithmetic mean is usually denoted by x̄, and is often called the average.
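The definition above can be checked with a minimal sketch; the data values are invented for illustration:

```python
# Arithmetic mean: sum of the observations divided by their count,
# x_bar = (1/n) * sum(x_i). The data below are made up for illustration.
xs = [2.0, 4.0, 6.0, 8.0]
x_bar = sum(xs) / len(xs)
print(x_bar)  # 5.0
```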

Attribute control chart
Any control chart for a discrete random variable. See Variables control chart.

Bias
An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.

Bimodal distribution
A distribution with two modes

Bivariate distribution
The joint probability distribution of two random variables.

Central tendency
The tendency of data to cluster around some value. Central tendency is usually expressed by a measure of location such as the mean, median, or mode.
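The three measures of location named above can be computed with Python's standard `statistics` module; the data set is invented for illustration:

```python
import statistics

# Invented sample; mean, median, and mode are three ways to
# summarize where the data cluster.
data = [2, 3, 3, 5, 7, 10]
mean = statistics.mean(data)      # sum / count
median = statistics.median(data)  # middle value (average of two middles here)
mode = statistics.mode(data)      # most frequent value
print(mean, median, mode)  # 5 4.0 3
```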

Chi-square test
Any test of significance based on the chi-square distribution. The most common chi-square tests are (1) testing hypotheses about the variance or standard deviation of a normal distribution and (2) testing goodness of fit of a theoretical distribution to sample data.
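For the goodness-of-fit case, the test statistic is χ² = Σ (observed − expected)² / expected over the categories. A minimal sketch, with invented counts (the statistic would then be compared against a chi-square critical value):

```python
# Chi-square goodness-of-fit statistic for invented category counts.
observed = [18, 22, 20, 25, 15]
expected = [20, 20, 20, 20, 20]  # counts implied by the hypothesized distribution
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(chi2)  # ≈ 2.9
```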

Control limits
See Control chart.

Correction factor
A term used for the quantity (1/n)(Σᵢ₌₁ⁿ xᵢ)² that is subtracted from Σᵢ₌₁ⁿ xᵢ² to give the corrected sum of squares, defined as Σᵢ₌₁ⁿ (xᵢ − x̄)². The correction factor can also be written as n x̄².
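The identity can be verified numerically; the data values here are invented:

```python
# Check: sum(x_i^2) - (1/n)*(sum x_i)^2 equals sum((x_i - x_bar)^2),
# and the correction factor (1/n)*(sum x_i)^2 equals n * x_bar^2.
xs = [1.0, 3.0, 5.0, 7.0]
n = len(xs)
x_bar = sum(xs) / n
correction = (sum(xs) ** 2) / n            # the correction factor
corrected_ss = sum(x ** 2 for x in xs) - correction
direct_ss = sum((x - x_bar) ** 2 for x in xs)
print(correction, corrected_ss, direct_ss)
```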

Cumulative sum control chart (CUSUM)
A control chart in which the point plotted at time t is the sum of the measured deviations from target for all statistics up to time t
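The plotted statistic is just a running sum of deviations from target. A minimal sketch, with an invented target and measurement stream:

```python
# CUSUM statistic: the point at time t is the cumulative sum of
# (measurement - target) for all observations up to t.
# Target and measurements are invented for illustration.
target = 10.0
measurements = [10.2, 9.8, 10.5, 10.1, 9.9]

cusum = []
running = 0.0
for x in measurements:
    running += x - target
    cusum.append(running)
print(cusum)
```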

Curvilinear regression
An expression sometimes used for nonlinear regression models or polynomial regression models.

Discrete random variable
A random variable with a finite (or countably infinite) range.

Distribution function
Another name for a cumulative distribution function.

Error of estimation
The difference between an estimated value and the true value.

Expected value
The expected value of a random variable X is its long-term average or mean value. In the continuous case, the expected value of X is E[X] = ∫₋∞^∞ x f(x) dx, where f(x) is the density function of the random variable X.
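The integral can be approximated numerically. A sketch assuming an exponential density f(x) = λe^(−λx) with λ = 2 (an invented choice), whose exact mean is 1/λ = 0.5:

```python
import math

# Assumed density for illustration: exponential with rate lam = 2.
lam = 2.0
def f(x):
    return lam * math.exp(-lam * x)

# E[X] = integral of x*f(x) dx, approximated by a Riemann sum
# over the truncated range [0, 20] (the tail beyond 20 is negligible).
dx = 1e-4
mean = sum(x * f(x) * dx for x in (i * dx for i in range(int(20 / dx))))
print(mean)  # approximately 0.5
```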

Experiment
A series of tests in which changes are made to the system under study

F-test
Any test of significance involving the F distribution. The most common F-tests are (1) testing hypotheses about the variances or standard deviations of two independent normal distributions, (2) testing hypotheses about treatment means or variance components in the analysis of variance, and (3) testing significance of regression or tests on subsets of parameters in a regression model.