
Solutions for Chapter 4.1: Elementary Statistics: A Step By Step Approach 9th Edition

Textbook: Elementary Statistics: A Step By Step Approach
Edition: 9
Author: Allan Bluman
ISBN: 9780073534985

This textbook survival guide was created for the textbook Elementary Statistics: A Step By Step Approach, edition 9. Elementary Statistics: A Step By Step Approach was written by Allan Bluman and is associated with ISBN 9780073534985. Chapter 4.1 includes 33 full step-by-step solutions. This expansive textbook survival guide covers the textbook's chapters and their solutions. Since the 33 problems in Chapter 4.1 have been answered, more than 153482 students have viewed full step-by-step solutions from this chapter.

Key Statistics Terms and definitions covered in this textbook
  • β-error (or β-risk)

    In hypothesis testing, an error incurred by failing to reject a null hypothesis when it is actually false (also called a type II error).

  • Acceptance region

    In hypothesis testing, a region in the sample space of the test statistic such that if the test statistic falls within it, the null hypothesis cannot be rejected. This terminology is used because rejection of H0 is always a strong conclusion and acceptance of H0 is generally a weak conclusion
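
    For illustration, a minimal Python sketch (made-up numbers, scipy assumed available) of the acceptance region for a two-sided z-test on a mean with known standard deviation:

      # Sketch: acceptance region for a two-sided z-test of H0: mu = mu0
      # (hypothetical numbers; scipy assumed available)
      from scipy.stats import norm

      alpha = 0.05
      z_crit = norm.ppf(1 - alpha / 2)          # critical value, about 1.96
      acceptance_region = (-z_crit, z_crit)     # H0 is not rejected if z0 falls here

      mu0, sigma, n = 50.0, 4.0, 36             # hypothesized mean, known sd, sample size
      x_bar = 51.2                              # hypothetical sample mean
      z0 = (x_bar - mu0) / (sigma / n ** 0.5)   # test statistic

      if acceptance_region[0] <= z0 <= acceptance_region[1]:
          print(f"z0 = {z0:.2f}: fail to reject H0")
      else:
          print(f"z0 = {z0:.2f}: reject H0")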

  • Additivity property of χ²

    If two independent random variables X1 and X2 are distributed as chi-square with ν1 and ν2 degrees of freedom, respectively, then Y = X1 + X2 is a chi-square random variable with ν = ν1 + ν2 degrees of freedom. This generalizes to any number of independent chi-square random variables.
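
    As a quick illustrative check (a Python sketch with numpy and made-up degrees of freedom), the sum of two independent chi-square samples behaves like a chi-square with the summed degrees of freedom:

      # Sketch: Y = X1 + X2 with X1 ~ chi-square(3) and X2 ~ chi-square(5)
      # should behave like a chi-square with 3 + 5 = 8 degrees of freedom.
      import numpy as np

      rng = np.random.default_rng(0)
      x1 = rng.chisquare(df=3, size=100_000)
      x2 = rng.chisquare(df=5, size=100_000)
      y = x1 + x2

      # A chi-square with nu degrees of freedom has mean nu and variance 2*nu.
      print(y.mean())   # close to 8
      print(y.var())    # close to 16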

  • Analysis of variance (ANOVA)

    A method of decomposing the total variability in a set of observations, as measured by the sum of the squares of these observations from their average, into component sums of squares that are associated with specific defined sources of variation.
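
    A minimal Python sketch (made-up treatment groups, numpy assumed available) of the underlying identity, SS_total = SS_between + SS_within:

      # Sketch: the ANOVA identity SS_total = SS_between + SS_within
      # for three hypothetical treatment groups.
      import numpy as np

      groups = [np.array([18.0, 21.0, 20.0, 19.0]),
                np.array([24.0, 26.0, 25.0, 27.0]),
                np.array([15.0, 14.0, 16.0, 17.0])]

      all_obs = np.concatenate(groups)
      grand_mean = all_obs.mean()

      ss_total = ((all_obs - grand_mean) ** 2).sum()
      ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
      ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)

      print(ss_total, ss_between + ss_within)   # the two totals agree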

  • Analytic study

    A study in which a sample from a population is used to make inference to a future population. Stability needs to be assumed. See Enumerative study

  • Arithmetic mean

    The arithmetic mean of a set of numbers x1, x2, …, xn is their sum divided by the number of observations, or x̄ = (1/n) Σ xi, with the sum taken over i = 1, …, n. The arithmetic mean is usually denoted by x̄ and is often called the average.
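
    A short Python sketch of the formula, using made-up data:

      # Sketch: x_bar = (1/n) * sum(x_i), with made-up observations.
      x = [3.0, 7.0, 5.0, 9.0, 6.0]
      x_bar = sum(x) / len(x)
      print(x_bar)   # 6.0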

  • Bayes’ estimator

    An estimator for a parameter obtained from a Bayesian method that uses a prior distribution for the parameter along with the conditional distribution of the data given the parameter to obtain the posterior distribution of the parameter. The estimator is obtained from the posterior distribution.
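
    A minimal Python sketch of the conjugate beta-binomial case (the prior and the data are made-up assumptions):

      # Sketch: Bayes' estimator (posterior mean) for a binomial proportion p
      # with a conjugate Beta(a, b) prior; prior and data are made-up values.
      a, b = 2.0, 2.0          # assumed prior: Beta(2, 2)
      n, x = 20, 7             # hypothetical data: 7 successes in 20 trials

      # The posterior is Beta(a + x, b + n - x); its mean is the Bayes estimator
      # under squared-error loss.
      p_hat_bayes = (a + x) / (a + b + n)
      p_hat_mle = x / n

      print(p_hat_bayes)   # 9 / 24 = 0.375
      print(p_hat_mle)     # 0.35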

  • Bivariate normal distribution

    The joint distribution of two normal random variables

  • Chi-square (or chi-squared) random variable

    A continuous random variable that results from the sum of squares of independent standard normal random variables. It is a special case of a gamma random variable.
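
    A Python sketch (numpy assumed available, k chosen arbitrarily) illustrating the definition by simulation:

      # Sketch: summing the squares of k independent standard normals
      # yields draws from a chi-square distribution with k degrees of freedom.
      import numpy as np

      rng = np.random.default_rng(1)
      k = 4
      z = rng.standard_normal(size=(100_000, k))
      chi_sq_draws = (z ** 2).sum(axis=1)

      print(chi_sq_draws.mean())   # close to k = 4
      print(chi_sq_draws.var())    # close to 2k = 8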

  • Chi-square test

    Any test of significance based on the chi-square distribution. The most common chi-square tests are (1) testing hypotheses about the variance or standard deviation of a normal distribution and (2) testing goodness of fit of a theoretical distribution to sample data.

  • Combination

    A subset selected without replacement from a set used to determine the number of outcomes in events and sample spaces.
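
    For example, a short Python sketch counting 5-card hands from a 52-card deck, C(n, r) = n! / (r! (n − r)!):

      # Sketch: combinations of n items taken r at a time.
      import math

      print(math.comb(52, 5))   # 2598960 possible 5-card hands
      print(math.factorial(52) // (math.factorial(5) * math.factorial(47)))   # same value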

  • Conditional probability

    The probability of an event given that the random experiment produces an outcome in another event.
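
    A Python sketch of the definition, P(A | B) = P(A and B) / P(B), on the sample space of two fair dice (the events are chosen for illustration):

      # Sketch: conditional probability on the sample space of two fair dice.
      # A = "the sum is 8", B = "the first die shows an even number".
      from fractions import Fraction

      sample_space = [(i, j) for i in range(1, 7) for j in range(1, 7)]
      A = {s for s in sample_space if sum(s) == 8}
      B = {s for s in sample_space if s[0] % 2 == 0}

      p_B = Fraction(len(B), len(sample_space))
      p_A_and_B = Fraction(len(A & B), len(sample_space))
      p_A_given_B = p_A_and_B / p_B   # P(A | B) = P(A and B) / P(B)

      print(p_A_given_B)   # 1/6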

  • Contrast

    A linear function of treatment means with coefficients that total zero. A contrast is a summary of treatment means that is of interest in an experiment.

  • Correlation coefficient

    A dimensionless measure of the linear association between two variables, usually lying in the interval from −1 to +1, with zero indicating the absence of correlation (but not necessarily the independence of the two variables).
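
    A Python sketch (made-up data, numpy assumed available) computing a sample correlation coefficient:

      # Sketch: sample correlation coefficient for two made-up variables.
      import numpy as np

      x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

      r = np.corrcoef(x, y)[0, 1]   # off-diagonal entry of the 2x2 correlation matrix
      print(r)                      # close to +1: strong positive linear association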

  • Correlation matrix

    A square matrix that contains the correlations among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are unity and the off-diagonal elements rij are the correlations between Xi and Xj.

  • Covariance matrix

    A square matrix that contains the variances and covariances among a set of random variables, say, X1, X2, …, Xk. The main diagonal elements of the matrix are the variances of the random variables and the off-diagonal elements are the covariances between Xi and Xj. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.
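
    A Python sketch (made-up data, numpy assumed available) of the relationship between the two matrices:

      # Sketch: covariance matrix vs. correlation matrix for made-up data
      # (rows = observations, columns = random variables X1, X2, X3).
      import numpy as np

      data = np.array([[2.0, 8.1, 1.0],
                       [3.0, 7.9, 1.5],
                       [4.0, 6.2, 2.1],
                       [5.0, 5.8, 2.4],
                       [6.0, 4.1, 3.2]])

      cov = np.cov(data, rowvar=False)        # variances on the diagonal, covariances off it
      corr = np.corrcoef(data, rowvar=False)  # unit diagonal, correlations off it

      # Standardizing each variable to unit variance turns the covariance
      # matrix into the correlation matrix.
      standardized = (data - data.mean(axis=0)) / data.std(axis=0, ddof=1)
      print(np.allclose(np.cov(standardized, rowvar=False), corr))   # True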

  • Decision interval

    A parameter in a tabular CUSUM algorithm that is determined from a trade-off between false alarms and the detection of assignable causes.
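
    A minimal Python sketch of a one-sided upper tabular CUSUM (the target mean, reference value K, decision interval H, and observations are all made-up values):

      # Sketch: one-sided upper tabular CUSUM. C_i = max(0, x_i - (mu0 + K) + C_{i-1});
      # an out-of-control signal is raised when C_i exceeds the decision interval H.
      mu0 = 10.0          # target (in-control) mean
      K = 0.5             # reference (allowance) value
      H = 4.0             # decision interval

      observations = [10.2, 9.8, 10.4, 11.3, 11.0, 11.6, 11.9, 12.1]

      c_plus = 0.0
      for i, x in enumerate(observations, start=1):
          c_plus = max(0.0, x - (mu0 + K) + c_plus)
          flag = "signal" if c_plus > H else "ok"
          print(f"sample {i}: C+ = {c_plus:.2f} ({flag})")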

  • Defect concentration diagram

    A quality tool that graphically shows the location of defects on a part or in a process.

  • Fisher’s least significant difference (LSD) method

    A series of pair-wise hypothesis tests of treatment means in an experiment to determine which means differ.

  • Generating function

    A function that is used to determine properties of the probability distribution of a random variable. See Moment-generating function
