
Solutions for Chapter 1.3: Describing Quantitative Data with Numbers

Full solutions for The Practice of Statistics | 5th Edition

ISBN: 9781464108730


The Practice of Statistics (5th edition) is associated with ISBN 9781464108730. This expansive textbook survival guide covers the following chapters and their solutions. Chapter 1.3: Describing Quantitative Data with Numbers includes 35 full step-by-step solutions. Since all 35 problems in Chapter 1.3 have been answered, more than 8360 students have viewed full step-by-step solutions from this chapter.

Key Statistics Terms and definitions covered in this textbook
• 2^k factorial experiment

A full factorial experiment with k factors and all factors tested at only two levels (settings) each.

• α-error (or α-risk)

In hypothesis testing, an error incurred by rejecting a null hypothesis when it is actually true (also called a type I error).

• Addition rule

A formula used to determine the probability of the union of two (or more) events from the probabilities of the events and their intersection(s).

• Bayes’ estimator

An estimator for a parameter obtained from a Bayesian method that uses a prior distribution for the parameter along with the conditional distribution of the data given the parameter to obtain the posterior distribution of the parameter. The estimator is obtained from the posterior distribution.

• Central limit theorem

The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
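The tendency described above can be illustrated with a short simulation (a sketch with assumed parameters): sums of n independent Uniform(0, 1) variables should have mean n/2 and variance n/12 and look roughly normal for moderate n.

```python
# Central limit theorem sketch: sums of n independent Uniform(0, 1)
# variables are approximately normal for large n (hypothetical setup).
import random
import statistics

random.seed(0)
n = 30          # number of summands per simulated sum
reps = 20000    # number of simulated sums

sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]

# Theory: mean = n/2 = 15, variance = n/12 = 2.5.
print(round(statistics.mean(sums), 2))
print(round(statistics.variance(sums), 2))
```

The simulated mean and variance land close to the theoretical values 15 and 2.5, and a histogram of `sums` would show the familiar bell shape.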

• Conditional mean

The mean of the conditional probability distribution of a random variable.

• Confidence interval

If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
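A minimal numerical sketch of the idea: a large-sample 95% interval for a mean uses the normal critical value 1.96 (an assumption; small-sample procedures would use a t critical value, and the data below are hypothetical).

```python
# Large-sample 95% confidence interval for a mean (sketch).
import statistics

data = [4.1, 5.2, 6.3, 4.8, 5.9, 5.5, 4.4, 6.1]  # hypothetical sample
n = len(data)
xbar = statistics.mean(data)
s = statistics.stdev(data)

margin = 1.96 * s / n ** 0.5  # normal critical value, an assumption
L, U = xbar - margin, xbar + margin
print(f"95% CI: ({L:.2f}, {U:.2f})")
```

Here L and U are functions of the sample data only, matching the definition above.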

• Contour plot

A two-dimensional graphic used for a bivariate probability density function that displays curves for which the probability density function is constant.

• Control limits

See Control chart.

• Counting techniques

Formulas used to determine the number of elements in sample spaces and events.
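The two most common counting formulas, permutations and combinations, are available directly in Python's standard library, so the definition can be checked with a tiny example:

```python
# Counting techniques with the standard library.
import math

# Permutations: ordered arrangements of r items chosen from n.
n_permutations = math.perm(5, 2)   # 5 * 4 = 20
# Combinations: unordered selections of r items chosen from n.
n_combinations = math.comb(5, 2)   # 20 / 2! = 10

print(n_permutations, n_combinations)
```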

• Covariance

A measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = E[(X − μX)(Y − μY)].
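The definition can be applied directly to data (a sketch using hypothetical values and the population-style divisor n, not the n − 1 sample version):

```python
# Covariance computed straight from the definition
# Cov(X, Y) = E[(X - mu_X)(Y - mu_Y)].
import statistics

x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]  # y = 2x, hypothetical data

mx, my = statistics.mean(x), statistics.mean(y)
cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
print(cov)  # since y = 2x, this equals 2 * Var(X) = 2.5
```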

• Covariance matrix

A square matrix that contains the variances and covariances among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are the variances of the random variables and the off-diagonal elements are the covariances between Xi and Xj. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.
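The standardization step mentioned above can be sketched numerically with NumPy (simulated data, assumed setup): dividing each entry by the product of the two standard deviations turns the covariance matrix into the correlation matrix.

```python
# Covariance matrix -> correlation matrix by standardizing to unit variances.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))          # 500 observations of 3 variables
X[:, 2] = X[:, 0] + 0.5 * X[:, 1]      # induce some correlation

cov = np.cov(X, rowvar=False)          # 3x3 variance-covariance matrix
d = np.sqrt(np.diag(cov))              # standard deviations
corr = cov / np.outer(d, d)            # correlation matrix

print(np.allclose(np.diag(corr), 1.0))  # correlation matrix has unit diagonal
```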

• Defects-per-unit control chart

See U chart

• Discrete distribution

A probability distribution for a discrete random variable

• Discrete uniform random variable

A discrete random variable with a finite range and constant probability mass function.

• Efficiency

A concept in parameter estimation that uses the variances of different estimators; essentially, an estimator is more efficient than another estimator if it has smaller variance. When estimators are biased, the concept requires modification.
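A classic illustration of this variance comparison (a simulation sketch with assumed parameters): for normal data both the sample mean and sample median estimate the center without bias, but the mean has the smaller variance and so is the more efficient of the two.

```python
# Efficiency sketch: compare the variance of two estimators of the
# center of a Normal(0, 1) population (hypothetical simulation setup).
import random
import statistics

random.seed(1)
means, medians = [], []
for _ in range(5000):
    sample = [random.gauss(0, 1) for _ in range(25)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

var_mean = statistics.variance(means)
var_median = statistics.variance(medians)
print(var_mean < var_median)  # the mean is the more efficient estimator here
```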

• Fixed factor (or fixed effect).

In analysis of variance, a factor or effect is considered fixed if all the levels of interest for that factor are included in the experiment. Conclusions are then valid about this set of levels only, although when the factor is quantitative, it is customary to fit a model to the data for interpolating between these levels.

• Gamma random variable

A random variable that generalizes an Erlang random variable to noninteger values of the parameter r

• Gaussian distribution

Another name for the normal distribution, based on the strong connection of Karl F. Gauss to the normal distribution; often used in physics and electrical engineering applications

• Hat matrix.

In multiple regression, the matrix H = X(X′X)⁻¹X′. This is a projection matrix that maps the vector of observed response values into a vector of fitted values by ŷ = X(X′X)⁻¹X′y = Hy.
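The two defining properties can be verified numerically (a sketch with simulated data): H is idempotent (H·H = H, as any projection matrix must be), and its trace equals the number of regression parameters.

```python
# Hat matrix sketch: H = X (X'X)^{-1} X' maps observed y to fitted
# values y_hat = H y (hypothetical simulated regression data).
import numpy as np

rng = np.random.default_rng(42)
n = 10
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + 1 predictor
y = 2.0 + 3.0 * X[:, 1] + rng.normal(scale=0.1, size=n)

H = X @ np.linalg.inv(X.T @ X) @ X.T
y_hat = H @ y

print(np.allclose(H @ H, H))         # idempotent: H is a projection matrix
print(np.isclose(np.trace(H), 2.0))  # trace equals the number of parameters
```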
