- 1.8.1: Give the proof of the relation (Re) in this section.
- 1.8.2: Consider a pool of six I/O (input/output) buffers. Assume that any ...
- 1.8.3: Show that if event B is contained in event A, then P(B) ≤ P(A).
- 1.8.5: In a party of five persons, compute the probability that at least t...
- 1.8.6: A series of n jobs arrive at a multiprocessor computer with n proce...
Solutions for Chapter 1.8: Combinatorial Problems
Full solutions for Probability and Statistics with Reliability, Queuing, and Computer Science Applications | 2nd Edition
β-error (or β-risk)
In hypothesis testing, an error incurred by failing to reject a null hypothesis when it is actually false (also called a type II error).
Acceptance region
In hypothesis testing, a region in the sample space of the test statistic such that if the test statistic falls within it, the null hypothesis cannot be rejected. This terminology is used because rejection of H0 is always a strong conclusion and acceptance of H0 is generally a weak conclusion.
Alternative hypothesis
In statistical hypothesis testing, this is a hypothesis other than the one that is being tested. The alternative hypothesis contains feasible conditions, whereas the null hypothesis specifies conditions that are under test.
Bias
An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.
C chart
An attribute control chart that plots the total number of defects per unit in a subgroup. Similar to a defects-per-unit or U chart.
Cause-and-effect diagram
A chart used to organize the various potential causes of a problem. Also called a fishbone diagram.
Chi-square (or chi-squared) random variable
A continuous random variable that results from the sum of squares of independent standard normal random variables. It is a special case of a gamma random variable.
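The definition above can be illustrated with a minimal simulation sketch: summing the squares of k independent standard normals yields a chi-square variable with k degrees of freedom, whose mean is k. The sample size, seed, and choice of k = 3 here are arbitrary illustrative values, not from the glossary.

```python
import random

# A chi-square variable with k degrees of freedom arises as the sum of
# squares of k independent standard normal variables.
def chi_square_sample(k, rng):
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(k))

rng = random.Random(0)   # fixed seed, illustrative only
k = 3
samples = [chi_square_sample(k, rng) for _ in range(10_000)]
mean = sum(samples) / len(samples)
# The mean of a chi-square(k) variable is k, so `mean` should be close to 3.
```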
Conditional probability mass function
The probability mass function of the conditional probability distribution of a discrete random variable.
Contingency table
A tabular arrangement expressing the assignment of members of a data set according to two or more categories or classification criteria.
Continuity correction
A correction factor used to improve the approximation to binomial probabilities from a normal distribution.
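A small sketch of the standard continuity correction, assuming the usual form Φ((x + 0.5 − np)/√(np(1 − p))) for P(X ≤ x) with X binomial; the parameter values n = 50, p = 0.3, x = 15 are made up for illustration.

```python
import math

def normal_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def binomial_cdf_exact(x, n, p):
    # Exact P(X <= x) for X ~ Binomial(n, p).
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

def binomial_cdf_approx(x, n, p):
    # Normal approximation with continuity correction: the +0.5 term.
    mu, sigma = n * p, math.sqrt(n * p * (1 - p))
    return normal_cdf((x + 0.5 - mu) / sigma)

n, p, x = 50, 0.3, 15      # illustrative values
exact = binomial_cdf_exact(x, n, p)
approx = binomial_cdf_approx(x, n, p)
```

With these values the corrected normal approximation agrees with the exact binomial probability to within about a percentage point.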
Counting techniques
Formulas used to determine the number of elements in sample spaces and events.
Covariance matrix
A square matrix that contains the variances and covariances among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are the variances of the random variables and the off-diagonal elements are the covariances between Xi and Xj. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.
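The structure described above (variances on the main diagonal, covariances off it, symmetry) can be seen in a short pure-Python sketch; the data values are invented for illustration.

```python
# Sample covariance of two equal-length lists (divisor n - 1).
def covariance(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / (n - 1)

# Three random variables, four observations each (made-up data).
data = [
    [2.0, 4.0, 6.0, 8.0],   # X1
    [1.0, 3.0, 2.0, 5.0],   # X2
    [9.0, 7.0, 8.0, 4.0],   # X3
]

# Entry (i, j) is Cov(Xi, Xj); the diagonal holds the variances.
cov = [[covariance(xi, xj) for xj in data] for xi in data]
```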
Discrete uniform random variable
A discrete random variable with a finite range and constant probability mass function.
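As a quick sketch of this definition: on the integers a..b the pmf is the constant 1/(b − a + 1) and the mean is (a + b)/2. The fair-die range below is an illustrative choice.

```python
from fractions import Fraction

a, b = 1, 6                       # e.g., a fair six-sided die (illustrative)
n = b - a + 1
# Constant probability mass function over the finite range a..b.
pmf = {k: Fraction(1, n) for k in range(a, b + 1)}
mean = sum(k * p for k, p in pmf.items())   # equals (a + b) / 2
```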
Error mean square
The error sum of squares divided by its number of degrees of freedom.
Error of estimation
The difference between an estimated value and the true value.
Error propagation
An analysis of how the variance of the random variable that represents the output of a system depends on the variances of the inputs. A formula exists when the output is a linear function of the inputs, and the formula is simplified if the inputs are assumed to be independent.
F distribution
The distribution of the random variable defined as the ratio of two independent chi-square random variables, each divided by its number of degrees of freedom.
Gamma function
A function used in the probability density function of a gamma random variable that can be considered to extend factorials.
Geometric mean
The geometric mean of a set of n positive data values is the nth root of the product of the data values; that is, g = (x1 · x2 ⋯ xn)^(1/n).
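The formula above translates directly to code; the input lists are arbitrary examples.

```python
import math

def geometric_mean(values):
    # nth root of the product of n positive data values.
    n = len(values)
    return math.prod(values) ** (1.0 / n)

g = geometric_mean([2.0, 8.0])   # product 16, square root gives 4.0
```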
Goodness of fit
In general, the agreement of a set of observed values and a set of theoretical values that depend on some hypothesis. The term is often used in fitting a theoretical distribution to a set of observations.