- 13.1: Many fast-food restaurants use automatic soft-drink dispensing mach...
- 13.2: Lighting Effect on Plant Growth Which type of light results in the ...
- 13.3: A rock climber wishes to test the tensile strength of three differe...
- 13.4: A manufacturing researcher wants to determine if age or gender sign...
- 13.5: The following table summarizes the sample mean earnings, in thousan...
Solutions for Chapter 13: Comparing Three or More Means
Full solutions for Statistics: Informed Decisions Using Data | 4th Edition
Backward elimination
A method of variable selection in regression that begins with all of the candidate regressor variables in the model and eliminates the insignificant regressors one at a time until only significant regressors remain.
Bayes' theorem
An equation for a conditional probability such as P(A|B) in terms of the reverse conditional probability P(B|A).
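As a sketch of how the theorem is applied, the snippet below computes P(A|B) from assumed values of P(B|A), P(A), and P(B); all three numbers are hypothetical, chosen only for illustration.

```python
# Hypothetical inputs for Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B)
p_a = 0.01          # prior probability P(A), assumed
p_b_given_a = 0.9   # P(B|A), assumed
p_b = 0.05          # P(B), assumed

p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 3))  # 0.18
```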
Bias
An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.
Central limit theorem
The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
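A quick simulation sketch of this behavior, using sums of Uniform(0, 1) variables; the sample sizes and seed are arbitrary choices for the illustration.

```python
import math
import random

random.seed(1)
n, reps = 48, 5000
# A sum of n independent Uniform(0,1) variables has mean n/2 and variance n/12.
mu, sigma = n / 2, math.sqrt(n / 12)
sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]
# For a normal distribution, about 68% of values fall within one sd of the mean.
within_1sd = sum(abs(s - mu) <= sigma for s in sums) / reps
print(round(within_1sd, 2))
```

The fraction printed should land near 0.68, consistent with the sums being approximately normal.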
Chance cause
The portion of the variability in a set of observations that is due to only random forces and which cannot be traced to specific sources, such as operators, materials, or equipment. Also called a common cause.
Conditional probability
The probability of an event given that the random experiment produces an outcome in another event.
Conditional variance
The variance of the conditional probability distribution of a random variable.
Confidence interval
If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
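A minimal sketch of such an interval for a population mean, assuming a normal population with known standard deviation; the data values and sigma below are made up for the example.

```python
import math

data = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0]  # assumed sample
sigma = 0.1                             # assumed known population sd
z = 1.96                                # z-value for 95% confidence

# L and U are functions of only the sample data (and the known sigma).
xbar = sum(data) / len(data)
half = z * sigma / math.sqrt(len(data))
L, U = xbar - half, xbar + half
print(round(L, 3), round(U, 3))
```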
Convolution
A method to derive the probability density function of the sum of two independent random variables from an integral (or sum) of probability density (or mass) functions.
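For discrete random variables the integral becomes a sum; as an illustration, this sketch convolves the probability mass functions of two independent fair dice to get the distribution of their total.

```python
# pmf of one fair six-sided die
pmf = {k: 1 / 6 for k in range(1, 7)}

# Convolution sum: P(X + Y = s) = sum over x of P(X = x) * P(Y = s - x)
sum_pmf = {}
for x, px in pmf.items():
    for y, py in pmf.items():
        sum_pmf[x + y] = sum_pmf.get(x + y, 0) + px * py

print(round(sum_pmf[7], 4))  # 0.1667 -- a total of 7 is the most likely outcome
```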
Correlation coefficient
A dimensionless measure of the linear association between two variables, usually lying in the interval from −1 to +1, with zero indicating the absence of correlation (but not necessarily the independence of the two variables).
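A sketch of the sample version of this measure (Pearson's r), written directly from the usual formula; the example data are chosen to be perfectly linear.

```python
import math

def pearson_r(xs, ys):
    # Sample correlation: covariance divided by the product of the sd terms.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))  # 1.0 for a perfect linear fit
```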
Counting techniques
Formulas used to determine the number of elements in sample spaces and events.
Crossed factors
Another name for factors that are arranged in a factorial experiment.
Dependent variable
The response variable in regression or a designed experiment.
Distribution function
Another name for a cumulative distribution function.
Error mean square
The error sum of squares divided by its number of degrees of freedom.
Estimate (or point estimate)
The numerical value of a point estimator.
Finite population correction factor
A term in the formula for the variance of a hypergeometric random variable.
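A small numeric sketch of the factor in its usual form, (N − n)/(N − 1), which scales the variance of a hypergeometric random variable; the population and sample sizes below are assumed.

```python
N, n = 100, 10  # assumed population size and sample size

# Finite population correction factor: shrinks the variance when the
# sample is a non-negligible fraction of the population.
fpc = (N - n) / (N - 1)
print(round(fpc, 3))  # 0.909
```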
Gaussian distribution
Another name for the normal distribution, based on the strong connection of Karl F. Gauss to the normal distribution; often used in physics and electrical engineering applications.
Geometric mean
The geometric mean of a set of n positive data values is the nth root of the product of the data values; that is, g = (x_1 x_2 ⋯ x_n)^(1/n).
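A direct sketch of the definition in Python; the data values are arbitrary positive numbers chosen for the example.

```python
def geometric_mean(xs):
    # nth root of the product of n positive data values
    prod = 1.0
    for x in xs:
        prod *= x
    return prod ** (1 / len(xs))

print(geometric_mean([2, 8]))  # 4.0
```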