
Solutions for Chapter 13.2: Elementary Statistics 12th Edition

Elementary Statistics | 12th Edition | ISBN: 9780321836960 | Authors: Mario F. Triola

Full solutions for Elementary Statistics | 12th Edition

Textbook: Elementary Statistics
Edition: 12
Author: Mario F. Triola
ISBN: 9780321836960

This textbook survival guide was created for Elementary Statistics, 12th edition, by Mario F. Triola (ISBN: 9780321836960), and covers the textbook's chapters and their solutions. Chapter 13.2 includes 44 full step-by-step solutions, and more than 129692 students have viewed full step-by-step solutions from this chapter.

Key Statistics Terms and definitions covered in this textbook
  • Analysis of variance (ANOVA)

    A method of decomposing the total variability in a set of observations, as measured by the sum of the squares of these observations from their average, into component sums of squares that are associated with specific defined sources of variation.
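
The decomposition described above can be sketched numerically. The three groups below are hypothetical data, not from the textbook; the point is only that the total sum of squares equals the between-groups plus the within-groups components:

```python
import statistics

# Hypothetical example data: three groups of observations.
groups = [[3.0, 4.0, 5.0], [6.0, 7.0, 8.0], [1.0, 2.0, 3.0]]
all_obs = [x for g in groups for x in g]
grand_mean = statistics.mean(all_obs)

# Total sum of squares: squared deviations from the grand mean.
ss_total = sum((x - grand_mean) ** 2 for x in all_obs)

# Between-groups component: each group mean vs. the grand mean.
ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                 for g in groups)

# Within-groups (error) component: observations vs. their own group mean.
ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)

# The decomposition: SS_total = SS_between + SS_within.
print(ss_total, ss_between + ss_within)  # both 44.0 for this data
```
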

  • Average

    See Arithmetic mean.

  • Average run length, or ARL

    The average number of samples taken in a process monitoring or inspection scheme until the scheme signals that the process is operating at a level different from the level in which it began.

  • Backward elimination

    A method of variable selection in regression that begins with all of the candidate regressor variables in the model and eliminates the insignificant regressors one at a time until only significant regressors remain.

  • Bivariate distribution

    The joint probability distribution of two random variables.

  • Central limit theorem

    The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
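
The tendency toward normality can be checked by simulation. This is an illustrative sketch, not from the textbook; the sample size, repetition count, and seed are arbitrary choices:

```python
import random
import statistics

random.seed(0)

# Sum n = 30 uniform(0, 1) draws; repeat many times and inspect the sums.
n, reps = 30, 10_000
sums = [sum(random.random() for _ in range(n)) for _ in range(reps)]

# Each uniform(0, 1) draw has mean 1/2 and variance 1/12, so the sum
# should have mean n/2 = 15 and variance n/12 = 2.5, and by the central
# limit theorem the sums should look approximately normal.
print(statistics.mean(sums), statistics.variance(sums))
```

A histogram of `sums` would show the familiar bell shape even though each individual draw is uniform, not normal.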

  • Confidence coefficient

    The probability 1 − α associated with a confidence interval, expressing the probability that the stated interval will contain the true parameter value.

  • Covariance

    A measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)].
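
A minimal sketch of this computation, using hypothetical paired data (the n − 1 divisor is the usual choice for a sample estimate):

```python
import statistics

# Hypothetical paired observations (not from the textbook).
x = [1.0, 2.0, 3.0, 4.0]
y = [2.0, 4.0, 6.0, 8.0]

mean_x, mean_y = statistics.mean(x), statistics.mean(y)

# Average product of deviations about the means, with the n - 1 divisor.
cov_xy = sum((xi - mean_x) * (yi - mean_y)
             for xi, yi in zip(x, y)) / (len(x) - 1)
print(cov_xy)
```

Since `y` is exactly twice `x` here, the covariance is positive, reflecting that the two variables move together.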

  • Covariance matrix

    A square matrix that contains the variances and covariances among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are the variances of the random variables and the off-diagonal elements are the covariances between Xi and Xj. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.

  • Crossed factors

    Another name for factors that are arranged in a factorial experiment.

  • Cumulative distribution function

    For a random variable X, the function F(x) = P(X ≤ x) that is used to specify the probability distribution.

  • Degrees of freedom

    The number of independent comparisons that can be made among the elements of a sample. The term is analogous to the number of degrees of freedom for an object in a dynamic system, which is the number of independent coordinates required to determine the motion of the object.

  • Empirical model

    A model to relate a response to one or more regressors or factors that is developed from data obtained from the system.

  • Error mean square

    The error sum of squares divided by its number of degrees of freedom.

  • Error sum of squares

    In analysis of variance, this is the portion of total variability that is due to the random component in the data. It is usually based on replication of observations at certain treatment combinations in the experiment. It is sometimes called the residual sum of squares, although this is really a better term to use only when the sum of squares is based on the remnants of a model-fitting process and not on replication.

  • Estimate (or point estimate)

    The numerical value of a point estimator.

  • Fisher’s least significant difference (LSD) method

    A series of pair-wise hypothesis tests of treatment means in an experiment to determine which means differ.

  • Forward selection

    A method of variable selection in regression, where variables are inserted one at a time into the model until no other variables that contribute significantly to the model can be found.

  • Fraction defective control chart

    See P chart.

  • Harmonic mean

    The harmonic mean of a set of data values is the reciprocal of the arithmetic mean of the reciprocals of the data values; that is, h = [(1/n) Σ_{i=1}^{n} (1/x_i)]^(−1).
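
A minimal sketch of this definition, using hypothetical positive data values:

```python
# Hypothetical positive data values (the harmonic mean requires x_i > 0).
data = [2.0, 4.0, 4.0]

# Reciprocal of the arithmetic mean of the reciprocals:
# h = n / (1/x_1 + 1/x_2 + ... + 1/x_n).
h = len(data) / sum(1.0 / x for x in data)
print(h)  # 3.0
```

For this data the reciprocals sum to exactly 1, so h = 3.0; the same value is returned by `statistics.harmonic_mean` in the Python standard library.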
