- 4.7.1: The average CPU time per request is known to be 4.39 s for a comput...
- 4.7.2: Using the normal tables, plot P(|X| ≥ t) for 0 ≤ t ≤ 3, where X ~ N(0, 1). ...
- 4.7.3: Consider a random variable X with the Cauchy pdf: f(x) = 1/[π(1 + x²)], ...
- 4.7.4: Construct an example of a discrete random variable X that takes on ...
- 4.7.5: In order to represent a nonnegative real number X in a computer wit...
Solutions for Chapter 4.7: Inequalities And Limit Theorems
Full solutions for Probability and Statistics with Reliability, Queuing, and Computer Science Applications | 2nd Edition
α-error (or α-risk)
In hypothesis testing, an error incurred by rejecting a null hypothesis when it is actually true (also called a type I error).
Analytic study
A study in which a sample from a population is used to make inference to a future population. Stability needs to be assumed. See Enumerative study.
Axioms of probability
A set of rules that probabilities defined on a sample space must follow. See Probability.
Bias
An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.
Bivariate distribution
The joint probability distribution of two random variables.
C chart
An attribute control chart that plots the total number of defects per unit in a subgroup. Similar to a defects-per-unit or U chart.
Central limit theorem
The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
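The tendency toward normality described above can be checked empirically. A minimal sketch, using sums of uniform(0, 1) draws as a hypothetical choice of summands (any i.i.d. variables with finite variance would do):

```python
# Empirical check of the central limit theorem: standardized sums of
# n i.i.d. uniform(0, 1) variables should behave approximately N(0, 1).
import random
import math

def standardized_sum(n):
    """Sum of n uniform(0, 1) draws, centered and scaled to mean 0, variance 1."""
    s = sum(random.random() for _ in range(n))
    mean, var = n * 0.5, n / 12.0   # uniform(0, 1) has mean 1/2, variance 1/12
    return (s - mean) / math.sqrt(var)

random.seed(0)
samples = [standardized_sum(30) for _ in range(20000)]

# For a standard normal, P(|Z| <= 1) is about 0.6827; the empirical
# fraction should land close to that value.
frac = sum(abs(z) <= 1 for z in samples) / len(samples)
```

The parameters (30 summands, 20000 replications) are illustrative only; larger n tightens the normal approximation.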
Central tendency
The tendency of data to cluster around some value. Central tendency is usually expressed by a measure of location such as the mean, median, or mode.
Chance cause
The portion of the variability in a set of observations that is due only to random forces and cannot be traced to specific sources, such as operators, materials, or equipment. Also called a common cause.
Conditional mean
The mean of the conditional probability distribution of a random variable.
See Control chart.
Correction factor
A term used for the quantity $(1/n)\left(\sum_{i=1}^{n} x_i\right)^2$ that is subtracted from $\sum_{i=1}^{n} x_i^2$ to give the corrected sum of squares, defined as $\sum_{i=1}^{n} (x_i - \bar{x})^2$. The correction factor can also be written as $n\bar{x}^2$.
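A quick numerical check of this identity, using a small made-up data set:

```python
# The corrected sum of squares sum((x_i - xbar)^2) equals
# sum(x_i^2) minus the correction factor (1/n) * (sum x_i)^2 = n * xbar^2.
xs = [2.0, 4.0, 4.0, 6.0]                 # hypothetical observations
n = len(xs)
xbar = sum(xs) / n

correction = (sum(xs) ** 2) / n           # correction factor, (1/n)(sum x_i)^2
corrected_ss = sum(x * x for x in xs) - correction

# Direct computation of the corrected sum of squares for comparison.
direct = sum((x - xbar) ** 2 for x in xs)
```

Both routes give the same corrected sum of squares, and the correction factor equals $n\bar{x}^2$ as stated.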
Curvilinear regression
An expression sometimes used for nonlinear regression models or polynomial regression models.
Deming, W. Edwards
W. Edwards Deming (1900–1993) was a leader in the use of statistical quality control.
Distribution function
Another name for a cumulative distribution function.
Error mean square
The error sum of squares divided by its number of degrees of freedom.
Error variance
The variance of an error term or component in a model.
Fixed factor (or fixed effect)
In analysis of variance, a factor or effect is considered fixed if all the levels of interest for that factor are included in the experiment. Conclusions are then valid about this set of levels only, although when the factor is quantitative, it is customary to fit a model to the data for interpolating between these levels.
Forward selection
A method of variable selection in regression, where variables are inserted one at a time into the model until no other variables that contribute significantly to the model can be found.
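The one-at-a-time insertion described above can be sketched as a greedy loop. This is a simplified version with an assumed stopping rule (improvement in residual sum of squares above a tolerance, rather than a formal significance test):

```python
# Greedy forward selection sketch: at each step, add the predictor that
# most reduces the residual sum of squares; stop when the best remaining
# predictor no longer improves the fit by more than tol.
import numpy as np

def rss(X, y):
    """Residual sum of squares of the least-squares fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    r = y - X @ beta
    return float(r @ r)

def forward_select(X, y, tol=1e-6):
    n, p = X.shape
    chosen, remaining = [], list(range(p))
    current = float(y @ y)                      # RSS of the empty model
    while remaining:
        scores = [(rss(X[:, chosen + [j]], y), j) for j in remaining]
        best_rss, best_j = min(scores)
        if current - best_rss <= tol:           # no worthwhile improvement
            break
        chosen.append(best_j)
        remaining.remove(best_j)
        current = best_rss
    return chosen
```

In practice the entry criterion is usually a partial F-test or t-test on the candidate coefficient; the RSS threshold here is only a stand-in to keep the sketch short.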
Hat matrix
In multiple regression, the matrix $H = X(X'X)^{-1}X'$. This is a projection matrix that maps the vector of observed response values into the vector of fitted values by $\hat{y} = X(X'X)^{-1}X'y = Hy$.
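A small numerical illustration, with a made-up design matrix and response:

```python
# The hat matrix H = X (X'X)^{-1} X' maps observed responses y to fitted
# values y_hat = H y, and is a projection: symmetric and idempotent.
import numpy as np

X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])          # design matrix with an intercept column
y = np.array([1.0, 2.0, 2.0])      # hypothetical responses

H = X @ np.linalg.inv(X.T @ X) @ X.T
y_hat = H @ y                       # fitted values from the projection
```

Because H is a projection onto the column space of X, H @ H equals H and H equals its transpose; y_hat matches the fitted values from an ordinary least-squares fit.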