
Contemporary Mathematics 6th Edition - Solutions by Chapter

Contemporary Mathematics | 6th Edition | ISBN: 9780538481267 | Authors: Robert Brechner

Full solutions for Contemporary Mathematics | 6th Edition

ISBN: 9780538481267


The full step-by-step solutions to problems in Contemporary Mathematics were answered by our top Statistics solution expert on 03/13/18, 06:38PM. This textbook survival guide was created for the textbook Contemporary Mathematics, edition 6, and covers all 25 of its chapters. The textbook is associated with ISBN 9780538481267. Since problems from all 25 chapters have been answered, more than 10,221 students have viewed full step-by-step answers.

Key Statistics Terms and definitions covered in this textbook
  • Adjusted R²

    A variation of the R² statistic that compensates for the number of parameters in a regression model. Essentially, the adjustment is a penalty for increasing the number of parameters in the model.

  • Alias

    In a fractional factorial experiment, when certain factor effects cannot be estimated uniquely, they are said to be aliased.
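As a sketch of the penalty described above (the function name and example figures are illustrative, not from the textbook), adjusted R² follows directly from its standard formula:

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R-squared for a model with p predictors fit to n
    observations (standard formula, assumed here)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# Adding predictors always raises plain R-squared, but the
# adjustment can still penalize the larger model:
print(adjusted_r2(0.90, n=20, p=3))   # 0.88125
print(adjusted_r2(0.91, n=20, p=8))   # lower, despite the higher R-squared
```

The second call shows the penalty in action: a slightly better fit with five extra predictors yields a worse adjusted score.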

  • All possible (subsets) regressions

A method of variable selection in regression that examines all possible subsets of the candidate regressor variables. Efficient computer algorithms have been developed for implementing all possible regressions.

  • Bayes’ theorem

An equation for a conditional probability such as P(A | B) in terms of the reverse conditional probability P(B | A).
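Bayes' theorem lends itself to a short numerical sketch (the screening-test numbers below are assumed for illustration, not taken from the textbook):

```python
def bayes(p_b_given_a: float, p_a: float, p_b: float) -> float:
    """P(A | B) = P(B | A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Assumed example: a test with 95% sensitivity and a 5% false-positive
# rate, for a condition with 1% prevalence.
p_a = 0.01
p_b = 0.95 * p_a + 0.05 * (1 - p_a)   # total probability of a positive test
posterior = bayes(0.95, p_a, p_b)
print(round(posterior, 3))            # 0.161: most positives are false alarms
```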

  • Bias

    An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.

  • Bimodal distribution

    A distribution with two modes.

  • Cause-and-effect diagram

A chart used to organize the various potential causes of a problem. Also called a fishbone diagram.

  • Chi-square (or chi-squared) random variable

    A continuous random variable that results from the sum of squares of independent standard normal random variables. It is a special case of a gamma random variable.
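The sum-of-squares characterization can be checked by simulation (a sketch using Python's standard library; the degrees of freedom are chosen arbitrarily):

```python
import random
import statistics

# Summing squares of independent standard normals should reproduce
# the chi-square mean, which equals the degrees of freedom.
random.seed(42)
df = 4
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(df))
           for _ in range(20_000)]
print(statistics.mean(samples))   # close to 4
```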

  • Confidence interval

    If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
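A minimal large-sample sketch of such an interval for a mean, using the normal critical value (the data values are made up for illustration):

```python
from statistics import NormalDist, mean, stdev

# Assumed sample data; a 95% interval uses the 0.975 quantile of N(0, 1).
data = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3, 12.2, 11.7, 12.4, 12.0]
z = NormalDist().inv_cdf(0.975)               # about 1.96
half_width = z * stdev(data) / len(data) ** 0.5
lower, upper = mean(data) - half_width, mean(data) + half_width
print(f"95% CI: ({lower:.3f}, {upper:.3f})")
```

For small samples one would normally swap the normal quantile for a t quantile; the structure of the interval is the same.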

  • Control limits

    See Control chart.

  • Correlation

    In the most general usage, a measure of the interdependence among data. The concept may include more than two variables. The term is most commonly used in a narrow sense to express the relationship between quantitative variables or ranks.

  • Cumulative distribution function

    For a random variable X, the function defined as F(x) = P(X ≤ x) that is used to specify the probability distribution.
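A sketch using the normal distribution's CDF from the standard library:

```python
from statistics import NormalDist

# F(x) = P(X <= x); NormalDist supplies it for the normal case.
F = NormalDist(mu=0, sigma=1).cdf
print(F(0.0))                  # 0.5: half the mass lies below the mean
print(F(1.96) - F(-1.96))      # about 0.95, the familiar two-sided probability
```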

  • Curvilinear regression

    An expression sometimes used for nonlinear regression models or polynomial regression models.

  • Dependent variable

    The response variable in regression or a designed experiment.

  • Designed experiment

    An experiment in which the tests are planned in advance and the plans usually incorporate statistical models. See Experiment.

  • Enumerative study

    A study in which a sample from a population is used to make inferences about the population. See Analytic study.

  • Estimate (or point estimate)

    The numerical value of a point estimator.

  • F distribution

    The distribution of the random variable defined as the ratio of two independent chi-square random variables, each divided by its number of degrees of freedom.
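The ratio definition can be verified by simulation (a sketch; the degrees of freedom are chosen arbitrarily for illustration):

```python
import random
import statistics

# An F variable is (chi2_d1 / d1) / (chi2_d2 / d2); for d2 > 2
# its mean is d2 / (d2 - 2).
random.seed(7)
d1, d2 = 5, 10

def chi2(df: int) -> float:
    """Sample a chi-square variable as a sum of squared standard normals."""
    return sum(random.gauss(0, 1) ** 2 for _ in range(df))

f_samples = [(chi2(d1) / d1) / (chi2(d2) / d2) for _ in range(20_000)]
print(statistics.mean(f_samples))   # close to 10 / 8 = 1.25
```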

  • F-test

    Any test of significance involving the F distribution. The most common F-tests are (1) testing hypotheses about the variances or standard deviations of two independent normal distributions, (2) testing hypotheses about treatment means or variance components in the analysis of variance, and (3) testing significance of regression or tests on subsets of parameters in a regression model.

  • Fisher’s least significant difference (LSD) method

    A series of pair-wise hypothesis tests of treatment means in an experiment to determine which means differ.

  • Forward selection

    A method of variable selection in regression where variables are inserted one at a time into the model until no other variables that contribute significantly to the model can be found.