- 11.4.1: Complete the proof of Theorem 11.1 by showing part (c).
- 11.4.2: Compute 90% confidence intervals for the parameters a and b from th...
Solutions for Chapter 11.4: Confidence Intervals In Linear Regression
Full solutions for Probability and Statistics with Reliability, Queuing, and Computer Science Applications | 2nd Edition
In hypothesis testing, a region in the sample space of the test statistic such that if the test statistic falls within it, the null hypothesis cannot be rejected. This terminology is used because rejection of H0 is always a strong conclusion and acceptance of H0 is generally a weak conclusion
Additivity property of χ²
If two independent random variables X1 and X2 are distributed as chi-square with ν1 and ν2 degrees of freedom, respectively, then Y = X1 + X2 is a chi-square random variable with ν = ν1 + ν2 degrees of freedom. This generalizes to any number of independent chi-square random variables.
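The additivity property can be checked numerically. The sketch below (with illustrative values ν1 = 3 and ν2 = 5) builds each chi-square variate as a sum of squared standard normals and confirms that the sample mean of Y = X1 + X2 is close to ν1 + ν2 = 8, since the mean of a chi-square random variable equals its degrees of freedom.

```python
import random

# Illustrative check of chi-square additivity: a chi-square variate with
# v degrees of freedom is a sum of v squared standard normals, so
# Y = X1 + X2 should behave as chi-square with v1 + v2 degrees of freedom.
random.seed(42)
v1, v2, n = 3, 5, 100_000

def chi_square(v):
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(v))

samples = [chi_square(v1) + chi_square(v2) for _ in range(n)]
mean_y = sum(samples) / n
print(round(mean_y, 2))  # close to v1 + v2 = 8
```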
The portion of the variability in a set of observations that can be traced to specific causes, such as operators, materials, or equipment. Also called a special cause.
In experimental design, a group of experimental units or material that is relatively homogeneous. The purpose of dividing experimental units into blocks is to produce an experimental design wherein variability within blocks is smaller than variability between blocks. This allows the factors of interest to be compared in an environment that has less variability than in an unblocked experiment.
A chart used to organize the various potential causes of a problem. Also called a fishbone diagram.
The mean of the conditional probability distribution of a random variable.
If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
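As a concrete illustration (with made-up data and an assumed known population standard deviation), the sketch below uses the standard-normal quantile from Python's statistics.NormalDist to form a 95% confidence interval for a mean: L, U = x̄ ∓ z·σ/√n with z the upper α/2 quantile.

```python
import math
from statistics import NormalDist

# Illustrative 95% CI for a mean with known sigma (data values are made up):
# L, U = xbar -/+ z_{alpha/2} * sigma / sqrt(n), so P(L <= mu <= U) = 1 - alpha.
data = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3]
sigma = 0.25                              # assumed known population std dev
alpha = 0.05
n = len(data)
xbar = sum(data) / n
z = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
half = z * sigma / math.sqrt(n)
L, U = xbar - half, xbar + half
print(round(L, 3), round(U, 3))           # → 9.933 10.333
```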
A square matrix that contains the correlations among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are unity and the off-diagonal elements rij are the correlations between Xi and Xj.
In hypothesis testing, this is the portion of the sample space of a test statistic that will lead to rejection of the null hypothesis.
Cumulative normal distribution function
The cumulative distribution of the standard normal distribution, often denoted as Φ(x) and tabulated in Appendix Table II.
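In Python, Φ(x) can be evaluated directly (without a printed table) as the cdf of the stdlib's statistics.NormalDist; a minimal sketch:

```python
from statistics import NormalDist

# Phi(x), the standard normal cdf, via the stdlib instead of a table lookup.
phi = NormalDist(mu=0.0, sigma=1.0).cdf
print(round(phi(0.0), 4))    # → 0.5
print(round(phi(1.96), 4))   # → 0.975
```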
An experiment in which the tests are planned in advance and the plans usually incorporate statistical models. See Experiment
Discrete uniform random variable
A discrete random variable with a finite range and constant probability mass function.
The amount of variability exhibited by data
Distribution free method(s)
Any method of inference (hypothesis testing or confidence interval construction) that does not depend on the form of the underlying distribution of the observations. Sometimes called nonparametric method(s).
A function used in the probability density function of a gamma random variable that can be considered to extend factorials
Gamma random variable
A random variable that generalizes an Erlang random variable to noninteger values of the parameter r
Geometric random variable
A discrete random variable that is the number of Bernoulli trials until a success occurs.
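For a success probability p, the pmf is P(X = k) = (1 − p)^(k−1) p for k = 1, 2, …; the sketch below (with an illustrative p = 0.25) evaluates it and checks that the probabilities sum to 1.

```python
# Geometric pmf P(X = k) = (1 - p)**(k - 1) * p: the probability that the
# first success occurs on Bernoulli trial k. p = 0.25 is an illustrative value.
p = 0.25

def geometric_pmf(k):
    return (1 - p) ** (k - 1) * p

print(round(geometric_pmf(1), 4))                              # → 0.25
print(round(sum(geometric_pmf(k) for k in range(1, 200)), 4))  # → 1.0
```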
The harmonic mean of a set of data values is the reciprocal of the arithmetic mean of the reciprocals of the data values; that is, h = [(1/n) Σᵢ₌₁ⁿ (1/xᵢ)]⁻¹ = n / (1/x1 + 1/x2 + ⋯ + 1/xn).
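Python's statistics module implements this directly; the sketch below checks statistics.harmonic_mean against the reciprocal-of-mean-of-reciprocals formula on made-up data.

```python
from statistics import harmonic_mean

# Harmonic mean: the reciprocal of the arithmetic mean of the reciprocals.
xs = [2.0, 4.0, 8.0]
n = len(xs)
h_formula = 1.0 / (sum(1.0 / x for x in xs) / n)   # n / sum(1/x_i)
print(round(h_formula, 4))           # → 3.4286
print(round(harmonic_mean(xs), 4))   # → 3.4286
```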
In multiple regression, the matrix H = X(X′X)⁻¹X′. This is a projection matrix that maps the vector of observed response values into a vector of fitted values by ŷ = X(X′X)⁻¹X′y = Hy.
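A minimal pure-Python sketch of the hat matrix for simple linear regression (one predictor plus an intercept column, so X′X is 2×2 and easy to invert by hand); the x and y values are made up for illustration. A useful check is that trace(H) equals the number of fitted parameters, here 2.

```python
# Hat matrix H = X (X'X)^{-1} X' computed with tiny hand-rolled matrix helpers.
def transpose(A):
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inv2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

x = [1.0, 2.0, 3.0, 4.0]                 # illustrative predictor values
y = [2.1, 3.9, 6.2, 7.8]                 # illustrative responses
X = [[1.0, xi] for xi in x]              # design matrix with intercept column

Xt = transpose(X)
H = matmul(matmul(X, inv2x2(matmul(Xt, X))), Xt)            # hat matrix
y_hat = [sum(h * yi for h, yi in zip(row, y)) for row in H]  # fitted values Hy

# H is a projection: trace(H) = number of parameters in the model (2 here).
trace = sum(H[i][i] for i in range(len(H)))
print(round(trace, 6))  # → 2.0
```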