- 10.2.1: Ann cuts an ordinary deck of 52 cards and displays the exposed card...
- 10.2.2: Let the joint probability mass function of random variables X and Y...
- 10.2.3: Roll a balanced die and let the outcome be X. Then toss a fair coin...
- 10.2.4: Thieves stole four animals at random from a farm that had seven she...
- 10.2.5: In n independent Bernoulli trials, each with probability of success...
- 10.2.6: For random variables X, Y , and Z, prove that (a) Cov(X + Y, Z) = C...
- 10.2.7: Show that if X and Y are independent random variables, then for all...
- 10.2.8: For random variables X and Y, show that Cov(X + Y, X − Y) = Var(X) − Var(Y)
- 10.2.9: Prove that Var(X − Y) = Var(X) + Var(Y) − 2 Cov(X, Y)
- 10.2.10: Let X and Y be two independent random variables. (a) Show that XY a...
- 10.2.11: Prove that if X is a random number from the interval [0, 2], then t...
- 10.2.12: Let X and Y be the coordinates of a random point selected uniformly...
- 10.2.13: Mr. Jones has two jobs. Next year, he will get a salary raise of X ...
- 10.2.14: Let X and Y be independent random variables with expected values 1 ...
- 10.2.15: A voltmeter is used to measure the voltage of voltage sources, such...
- 10.2.16: (Investment) Mr. Ingham has invested money in three assets; 18% in ...
- 10.2.17: (Investment) Mr. Kowalski has invested $50,000 in three uncorrelate...
- 10.2.18: Let X and Y have the following joint probability density function f...
- 10.2.19: Find the variance of a sum of n randomly and independently selected...
- 10.2.20: Let X and Y be jointly distributed with joint probability density f...
- 10.2.21: Let X be a random variable. Prove that Var(X) = min over t of E[(X − t)²] ...
- 10.2.22: Let S be the sample space of an experiment. Let A and B be two even...
- 10.2.23: Show that for random variables X, Y , Z, and W and constants a, b, ...
- 10.2.24: Prove the following generalization of Exercise 23: Cov(∑ᵢ₌₁ⁿ aᵢXᵢ, ...
- 10.2.25: A fair die is thrown n times. What is the covariance of the number ...
- 10.2.26: Show that if X1, X2, . . . , Xn are random variables and a1, a2, . ...
- 10.2.27: Let X be a hypergeometric random variable with probability mass fun...
- 10.2.28: Exactly n married couples are living in a small town. What is the v...
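Several of the identities above (Exercises 10.2.8 and 10.2.9) can be checked numerically. A minimal sketch, using a small made-up joint probability mass function chosen only for illustration:

```python
# A small, arbitrary joint pmf p(x, y) for illustration (values sum to 1).
pmf = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}

def E(g):
    """Expected value of g(X, Y) under the joint pmf."""
    return sum(p * g(x, y) for (x, y), p in pmf.items())

def cov(g, h):
    """Cov(g(X,Y), h(X,Y)) = E[gh] - E[g] E[h]."""
    return E(lambda x, y: g(x, y) * h(x, y)) - E(g) * E(h)

X = lambda x, y: x
Y = lambda x, y: y

var_x, var_y, c_xy = cov(X, X), cov(Y, Y), cov(X, Y)

# Exercise 10.2.9: Var(X - Y) = Var(X) + Var(Y) - 2 Cov(X, Y)
var_diff = cov(lambda x, y: x - y, lambda x, y: x - y)
assert abs(var_diff - (var_x + var_y - 2 * c_xy)) < 1e-12

# Exercise 10.2.8: Cov(X + Y, X - Y) = Var(X) - Var(Y)
lhs = cov(lambda x, y: x + y, lambda x, y: x - y)
assert abs(lhs - (var_x - var_y)) < 1e-12
```

The same two-line `cov` helper verifies any of the bilinearity identities in Exercises 10.2.6 and 10.2.23 as well, since covariance here is computed directly from the definition.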
Solutions for Chapter 10.2: Covariance
Full solutions for Fundamentals of Probability, with Stochastic Processes | 3rd Edition
2^(k−p) factorial experiment
A fractional factorial experiment with k factors, each tested at only two levels (settings), run in a 2^(−p) fraction of the full factorial.
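As a concrete sketch, a 2^(3−1) half fraction can be generated by keeping only the runs of the full 2³ design that satisfy a defining relation; I = ABC below is a hypothetical choice for illustration:

```python
from itertools import product

# Full 2^3 factorial: each of the 3 factors at levels -1 and +1 (8 runs).
full = list(product([-1, 1], repeat=3))

# Half fraction defined by I = ABC: keep runs where A*B*C = +1,
# reducing 8 runs to 2^(3-1) = 4.
fraction = [run for run in full if run[0] * run[1] * run[2] == 1]

assert len(fraction) == 4
# Each factor remains balanced: two runs at each level.
for j in range(3):
    assert sum(run[j] for run in fraction) == 0
```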
α-error (or α-risk)
In hypothesis testing, an error incurred by rejecting a null hypothesis when it is actually true (also called a type I error).
All possible (subsets) regressions
A method of variable selection in regression that examines all possible subsets of the candidate regressor variables. Efficient computer algorithms have been developed for implementing all possible regressions.
Alternative hypothesis
In statistical hypothesis testing, this is a hypothesis other than the one being tested. The alternative hypothesis contains feasible conditions, whereas the null hypothesis specifies conditions that are under test.
Arithmetic mean
The arithmetic mean of a set of numbers x₁, x₂, …, xₙ is their sum divided by the number of observations, x̄ = (1/n) ∑ᵢ₌₁ⁿ xᵢ. The arithmetic mean is usually denoted by x̄ and is often called the average.
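The formula is a one-liner; a quick sketch comparing a manual computation against the standard library, on example data:

```python
import statistics

xs = [2.0, 4.0, 6.0, 8.0]      # example data
mean = sum(xs) / len(xs)       # (1/n) * sum of the x_i
assert mean == statistics.mean(xs)
```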
Asymptotic relative efficiency (ARE)
Used to compare hypothesis tests. The ARE of one test relative to another is the limiting ratio of the sample sizes necessary to obtain identical error probabilities for the two procedures.
Attribute
A qualitative characteristic of an item or unit, usually arising in quality control. For example, classifying production units as defective or nondefective results in attributes data.
Chance cause
The portion of the variability in a set of observations that is due only to random forces and cannot be traced to specific sources, such as operators, materials, or equipment. Also called a common cause.
Combination
A subset selected without replacement from a set, used to determine the number of outcomes in events and sample spaces.
Continuous distribution
A probability distribution for a continuous random variable.
Continuous uniform random variable
A continuous random variable whose range is a finite interval and whose probability density function is constant.
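On a finite interval [a, b] that constant density is 1/(b − a), and the mean and variance follow as (a + b)/2 and (b − a)²/12. A small sketch with example endpoints:

```python
def uniform_pdf(x, a, b):
    """Constant density 1/(b - a) on [a, b], zero elsewhere."""
    return 1.0 / (b - a) if a <= x <= b else 0.0

a, b = 2.0, 6.0
# The density is the same constant everywhere on [a, b].
assert uniform_pdf(3.0, a, b) == uniform_pdf(5.5, a, b) == 0.25

# Standard formulas for the mean and variance of Uniform(a, b).
mean = (a + b) / 2
var = (b - a) ** 2 / 12
assert mean == 4.0
assert abs(var - 16 / 12) < 1e-12
```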
Contour plot
A two-dimensional graphic used for a bivariate probability density function that displays curves along which the probability density function is constant.
Cook's distance
In regression, Cook's distance is a measure of the influence of each individual observation on the estimates of the regression model parameters. It expresses the distance that the vector of model parameter estimates with the ith observation removed lies from the vector of model parameter estimates based on all observations. Large values of Cook's distance indicate that the observation is influential.
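For simple linear regression, Cook's distance can be computed from each point's residual eᵢ and leverage hᵢᵢ as Dᵢ = eᵢ² hᵢᵢ / (p s² (1 − hᵢᵢ)²), with p = 2 estimated parameters. A self-contained sketch on made-up data, where the last observation is deliberately off-trend:

```python
def fit(xs, ys):
    """Least-squares intercept and slope for y = b0 + b1*x."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    return ybar - b1 * xbar, b1

def cooks_distances(xs, ys):
    n, p = len(xs), 2
    b0, b1 = fit(xs, ys)
    xbar = sum(xs) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    resid = [y - (b0 + b1 * x) for x, y in zip(xs, ys)]
    s2 = sum(e * e for e in resid) / (n - p)       # residual variance estimate
    ds = []
    for i in range(n):
        h = 1 / n + (xs[i] - xbar) ** 2 / sxx      # leverage of observation i
        ds.append(resid[i] ** 2 * h / (p * s2 * (1 - h) ** 2))
    return ds

# Hypothetical data; the last point is off-trend and at high leverage.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [1.1, 2.0, 2.9, 4.2, 5.0, 9.0]
ds = cooks_distances(xs, ys)
assert max(ds) == ds[-1]   # the influential point dominates
```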
Covariance
A measure of association between two random variables, obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)].
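The defining formula is equivalent to the computational shortcut Cov(X, Y) = E[XY] − μ_X μ_Y. A quick sketch with made-up paired observations, treating each pair as equally likely:

```python
# Hypothetical paired observations, each with probability 1/n.
pairs = [(1, 2), (2, 3), (3, 5), (4, 4)]
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n

# Definition: E[(X - mu_X)(Y - mu_Y)]
cov_def = sum((x - mx) * (y - my) for x, y in pairs) / n
# Shortcut: E[XY] - mu_X * mu_Y
cov_short = sum(x * y for x, y in pairs) / n - mx * my
assert abs(cov_def - cov_short) < 1e-12
```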
Critical value
The value of a statistic corresponding to a stated significance level, as determined from the sampling distribution. For example, if P(Z ≥ z₀.₀₂₅) = P(Z ≥ 1.96) = 0.025, then z₀.₀₂₅ = 1.96 is the critical value of z at the 0.025 level of significance.
Crossed factors
Another name for factors that are arranged in a factorial experiment.
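The 0.025 upper-tail critical value can be recovered numerically from the standard normal CDF, Φ(z) = ½(1 + erf(z/√2)), by bisection. A stdlib-only sketch (a library quantile function such as scipy.stats.norm.ppf would do the same in one call):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def upper_critical(alpha, lo=0.0, hi=10.0, tol=1e-10):
    """Find z with P(Z > z) = alpha by bisection on the CDF."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if 1.0 - norm_cdf(mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

z = upper_critical(0.025)
assert round(z, 2) == 1.96   # the familiar two-sided 5% critical value
```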
Defects-per-unit control chart
See U chart
Degrees of freedom
The number of independent comparisons that can be made among the elements of a sample. The term is analogous to the number of degrees of freedom for an object in a dynamic system, which is the number of independent coordinates required to determine the motion of the object.
Discrete random variable
A random variable with a finite (or countably infinite) range.
Discrete uniform random variable
A discrete random variable with a finite range and constant probability mass function.
Empirical model
A model that relates a response to one or more regressors or factors, developed from data obtained from the system.