Solutions for Chapter 3.1: MEASURES OF CENTRAL TENDENCY

Full solutions for Statistics: Informed Decisions Using Data | 4th Edition
Textbook: Statistics: Informed Decisions Using Data
Edition: 4
Author: Michael Sullivan, III
ISBN: 9780321757272

Chapter 3.1: MEASURES OF CENTRAL TENDENCY includes 100 full step-by-step solutions, and more than 145125 students have viewed solutions from this chapter. Statistics: Informed Decisions Using Data was written by Michael Sullivan, III and is associated with the ISBN 9780321757272. This expansive textbook survival guide was created for Statistics: Informed Decisions Using Data, edition 4, and covers the following chapters and their solutions.

Key Statistics Terms and definitions covered in this textbook
  • Alternative hypothesis

    In statistical hypothesis testing, this is a hypothesis other than the one that is being tested. The alternative hypothesis contains feasible conditions, whereas the null hypothesis specifies conditions that are under test.

  • Analytic study

    A study in which a sample from a population is used to make inference to a future population. Stability needs to be assumed. See Enumerative study.

  • Attribute control chart

    Any control chart for a discrete random variable. See Variables control chart.

  • Central limit theorem

    The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
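
    As a quick illustration (a minimal simulation sketch, not an example from the textbook), the standardized sum of n independent Uniform(0, 1) random variables behaves more and more like a standard normal as n grows:

```python
import random
import statistics

def standardized_sums(n_terms, n_samples=10_000):
    """Return standardized sums of n_terms independent Uniform(0, 1) draws."""
    # For Uniform(0, 1): mean = 1/2 and variance = 1/12, so the sum of n_terms
    # draws has mean n_terms / 2 and variance n_terms / 12.
    mean_sum = n_terms / 2
    sd_sum = (n_terms / 12) ** 0.5
    sums = (sum(random.random() for _ in range(n_terms)) for _ in range(n_samples))
    return [(s - mean_sum) / sd_sum for s in sums]

for n in (1, 2, 30):
    z = standardized_sums(n)
    # For a standard normal, about 68% of values fall within one standard
    # deviation of zero; the standardized sums approach this as n grows.
    within_one = sum(abs(v) <= 1 for v in z) / len(z)
    print(f"n = {n:>2}: mean = {statistics.mean(z):+.3f}, "
          f"sd = {statistics.stdev(z):.3f}, P(|Z| <= 1) ~ {within_one:.3f}")
```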

  • Chi-square (or chi-squared) random variable

    A continuous random variable that results from the sum of squares of independent standard normal random variables. It is a special case of a gamma random variable.
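
    A short simulation sketch (illustrative only; the degrees of freedom and sample size are arbitrary choices) makes the definition concrete: summing k squared standard normal draws produces a variable whose mean is about k and whose variance is about 2k, as chi-square theory predicts.

```python
import random
import statistics

def chi_square_draws(k, size=50_000):
    """Simulate draws from a chi-square distribution with k degrees of freedom
    as sums of squares of k independent standard normal random variables."""
    return [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(size)]

k = 5
draws = chi_square_draws(k)
# A chi-square random variable with k degrees of freedom has mean k and variance 2k.
print(f"sample mean     ~ {statistics.mean(draws):.2f}   (theory: {k})")
print(f"sample variance ~ {statistics.variance(draws):.2f}   (theory: {2 * k})")
```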

  • Conditional probability density function

    The probability density function of the conditional probability distribution of a continuous random variable.

  • Confounding

    When a factorial experiment is run in blocks and the blocks are too small to contain a complete replicate of the experiment, one can run a fraction of the replicate in each block, but this results in losing information on some effects. These effects are linked with or confounded with the blocks. In general, when two factors are varied such that their individual effects cannot be determined separately, their effects are said to be confounded.

  • Contour plot

    A two-dimensional graphic used for a bivariate probability density function that displays curves for which the probability density function is constant.

  • Control chart

    A graphical display used to monitor a process. It usually consists of a horizontal center line corresponding to the in-control value of the parameter that is being monitored and lower and upper control limits. The control limits are determined by statistical criteria and are not arbitrary, nor are they related to specification limits. If sample points fall within the control limits, the process is said to be in control, or free from assignable causes. Points beyond the control limits indicate an out-of-control process; that is, assignable causes are likely present. This signals the need to find and remove the assignable causes.
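
    The sketch below (hypothetical subgroup means and an assumed process standard deviation, not data from the book) computes three-sigma limits for an x-bar chart and flags points that fall outside them:

```python
import statistics

def xbar_chart_limits(sample_means, sample_size, process_sd):
    """Return (LCL, center line, UCL) for an x-bar chart with 3-sigma limits."""
    center = statistics.mean(sample_means)           # estimate of the in-control value
    sigma_xbar = process_sd / sample_size ** 0.5     # standard error of a subgroup mean
    return center - 3 * sigma_xbar, center, center + 3 * sigma_xbar

# Hypothetical subgroup means (4 observations per subgroup) from a process with sd ~ 1.2.
means = [10.1, 9.8, 10.3, 10.0, 9.7, 12.9, 10.2]
lcl, center, ucl = xbar_chart_limits(means, sample_size=4, process_sd=1.2)
print(f"LCL = {lcl:.2f}, center = {center:.2f}, UCL = {ucl:.2f}")
for i, m in enumerate(means, start=1):
    status = "in control" if lcl <= m <= ucl else "out of control (assignable cause likely)"
    print(f"subgroup {i}: mean = {m:.1f} -> {status}")
```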

  • Correction factor

    A term used for the quantity $(1/n)\left(\sum_{i=1}^{n} x_i\right)^2$ that is subtracted from $\sum_{i=1}^{n} x_i^2$ to give the corrected sum of squares defined as $\sum_{i=1}^{n} (x_i - \bar{x})^2$. The correction factor can also be written as $n\bar{x}^2$.
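
    A small numeric check (illustrative values, not from the text) confirms that subtracting the correction factor from the raw sum of squares gives the corrected sum of squares:

```python
x = [4.0, 7.0, 5.0, 9.0, 10.0]
n = len(x)
xbar = sum(x) / n

raw_ss = sum(v ** 2 for v in x)                  # sum of x_i^2           -> 271.0
correction = sum(x) ** 2 / n                     # (sum x_i)^2 / n        -> 245.0
corrected_ss = sum((v - xbar) ** 2 for v in x)   # sum of (x_i - xbar)^2  -> 26.0

print(raw_ss - correction)   # 26.0, matches the corrected sum of squares
print(corrected_ss)          # 26.0
print(n * xbar ** 2)         # 245.0, the alternative form n * xbar^2
```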

  • Covariance matrix

    A square matrix that contains the variances and covariances among a set of random variables, say, $X_1, X_2, \ldots, X_k$. The main diagonal elements of the matrix are the variances of the random variables and the off-diagonal elements are the covariances between $X_i$ and $X_j$. Also called the variance-covariance matrix. When the random variables are standardized to have unit variances, the covariance matrix becomes the correlation matrix.
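
    The following sketch (simulated data; NumPy is assumed to be available) builds a variance-covariance matrix and the corresponding correlation matrix for three variables:

```python
import numpy as np

rng = np.random.default_rng(0)
# Three variables X1, X2, X3 as the columns of a 200 x 3 data matrix;
# X2 is constructed to be correlated with X1.
x1 = rng.normal(size=200)
x2 = 0.8 * x1 + rng.normal(scale=0.5, size=200)
x3 = rng.normal(size=200)
data = np.column_stack([x1, x2, x3])

cov = np.cov(data, rowvar=False)        # variances on the diagonal, covariances off it
corr = np.corrcoef(data, rowvar=False)  # covariance matrix of the standardized variables
print(np.round(cov, 3))
print(np.round(corr, 3))
```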

  • Critical region

    In hypothesis testing, this is the portion of the sample space of a test statistic that will lead to rejection of the null hypothesis.

  • Discrete random variable

    A random variable with a finite (or countably infinite) range.

  • Dispersion

    The amount of variability exhibited by data.

  • Distribution free method(s)

    Any method of inference (hypothesis testing or confidence interval construction) that does not depend on the form of the underlying distribution of the observations. Sometimes called nonparametric method(s).

  • Error sum of squares

    In analysis of variance, this is the portion of total variability that is due to the random component in the data. It is usually based on replication of observations at certain treatment combinations in the experiment. It is sometimes called the residual sum of squares, although this is really a better term to use only when the sum of squares is based on the remnants of a model-fitting process and not on replication.

  • Finite population correction factor

    A term in the formula for the variance of a hypergeometric random variable.
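
    As a hedged illustration (the formula below is recalled from standard references rather than stated in this glossary), the finite population correction factor (N - n)/(N - 1) multiplies the binomial-style variance np(1 - p) to give the variance of a hypergeometric random variable:

```python
def hypergeometric_variance(N, K, n):
    """Variance of the number of successes when n items are drawn without
    replacement from a population of N items containing K successes."""
    p = K / N
    fpc = (N - n) / (N - 1)        # finite population correction factor
    return n * p * (1 - p) * fpc

# The correction shrinks the variance relative to sampling with replacement.
print(hypergeometric_variance(N=100, K=30, n=10))   # about 1.909
print(10 * 0.3 * 0.7)                               # binomial variance: 2.1
```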

  • Fraction defective

    In statistical quality control, that portion of a number of units or the output of a process that is defective.

  • Generating function

    A function that is used to determine properties of the probability distribution of a random variable. See Moment-generating function.

  • Goodness of fit

    In general, the agreement of a set of observed values and a set of theoretical values that depend on some hypothesis. The term is often used in fitting a theoretical distribution to a set of observations.
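
    A brief sketch (hypothetical die-rolling counts, not from the text) shows the usual chi-square measure of goodness of fit between observed and expected counts:

```python
observed = [18, 22, 16, 25, 21, 18]        # counts for faces 1-6 over 120 rolls
expected = [sum(observed) / 6] * 6         # fair-die hypothesis: 20 rolls per face

chi_square = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(f"chi-square statistic = {chi_square:.2f}")   # 2.70
# With 6 - 1 = 5 degrees of freedom, the 5% critical value is about 11.07, so a
# statistic this small is consistent with the hypothesized fair-die distribution.
```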
