- Chapter 1: WHOLE NUMBERS
- Chapter 2: FRACTIONS
- Chapter 3: DECIMALS
- Chapter 4: CHECKING ACCOUNTS
- Chapter 5: USING EQUATIONS TO SOLVE BUSINESS PROBLEMS
- Chapter 6: PERCENTS AND THEIR APPLICATIONS IN BUSINESS
- Chapter 7: INVOICES, TRADE DISCOUNTS, AND CASH DISCOUNTS
- Chapter 8: MARKUP AND MARKDOWN
- Chapter 9: PAYROLL
- Chapter 10: SIMPLE INTEREST AND PROMISSORY NOTES
- Chapter 11: COMPOUND INTEREST AND PRESENT VALUE
- Chapter 12: ANNUITIES
- Chapter 13: CONSUMER AND BUSINESS CREDIT
- Chapter 14: MORTGAGES
- Chapter 15: FINANCIAL STATEMENTS AND RATIOS
- Chapter 16: INVENTORY
- Chapter 17: DEPRECIATION
- Chapter 18: TAXES
- Chapter 19: INSURANCE
- Chapter 20: INVESTMENTS
- Chapter 21: BUSINESS STATISTICS AND DATA PRESENTATION
- Section I: THE DECIMAL NUMBER SYSTEM: WHOLE NUMBERS
- Section II: ADDITION AND SUBTRACTION OF WHOLE NUMBERS
- Section III: MULTIPLICATION AND DIVISION OF WHOLE NUMBERS
- Section IV: CASH DISCOUNTS AND TERMS OF SALE
Contemporary Mathematics 6th Edition - Solutions by Chapter
Adjusted R²

A variation of the R² statistic that compensates for the number of parameters in a regression model. Essentially, the adjustment is a penalty for increasing the number of parameters in the model.

Alias

In a fractional factorial experiment, when certain factor effects cannot be estimated uniquely, they are said to be aliased.
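The adjusted-R² penalty can be illustrated with a minimal sketch (the R², n, and p values below are hypothetical):

```python
# Adjusted R^2 penalizes extra parameters:
#   R2_adj = 1 - (1 - R2) * (n - 1) / (n - p - 1)
# where n = sample size, p = number of regressors (excluding the intercept).

def adjusted_r2(r2, n, p):
    """Adjusted R^2 for a model with p regressors fit to n observations."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

r2 = 0.90
print(adjusted_r2(r2, n=20, p=3))   # mild penalty with few parameters
print(adjusted_r2(r2, n=20, p=10))  # same R^2, heavier penalty
```

Note that the same raw R² yields a lower adjusted value as p grows, which is exactly the penalty the definition describes.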
All possible (subsets) regressions
A method of variable selection in regression that examines all possible subsets of the candidate regressor variables. Efficient computer algorithms have been developed for implementing all possible regressions.
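The subset enumeration at the heart of the method can be sketched as follows; the actual model fitting and comparison criterion are omitted, and the regressor names are placeholders:

```python
from itertools import combinations

def all_subsets(regressors):
    """Yield every non-empty subset of the candidate regressors."""
    for k in range(1, len(regressors) + 1):
        for subset in combinations(regressors, k):
            yield subset

candidates = ["x1", "x2", "x3"]
subsets = list(all_subsets(candidates))
print(len(subsets))  # 2^3 - 1 = 7 candidate models to fit and compare
```

With k candidate regressors there are 2^k − 1 non-empty subsets, which is why efficient algorithms matter as k grows.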
Bayes' theorem

An equation for a conditional probability such as P(A | B) in terms of the reverse conditional probability P(B | A).
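A minimal numeric sketch of the reversal, using made-up screening-test probabilities:

```python
def bayes(p_b_given_a, p_a, p_b):
    """P(A | B) = P(B | A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical test: P(pos | disease) = 0.99, P(disease) = 0.01,
# false-positive rate P(pos | no disease) = 0.05.
# Total probability: P(pos) = 0.99 * 0.01 + 0.05 * 0.99
p_pos = 0.99 * 0.01 + 0.05 * 0.99
posterior = bayes(0.99, 0.01, p_pos)
print(posterior)  # P(disease | pos) = 1/6, far below P(pos | disease)
```

The example shows why the two conditional probabilities must not be confused: P(pos | disease) is 0.99 while P(disease | pos) is only about 0.17.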
Bias

An effect that systematically distorts a statistical result or estimate, preventing it from representing the true quantity of interest.
Bimodal distribution

A distribution with two modes.
Cause-and-effect diagram

A chart used to organize the various potential causes of a problem. Also called a fishbone diagram.
Chi-square (or chi-squared) random variable
A continuous random variable that results from the sum of squares of independent standard normal random variables. It is a special case of a gamma random variable.
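The definition can be checked by simulation: summing squares of k independent standard normals produces draws whose sample mean approaches k, the mean of a chi-square distribution with k degrees of freedom. A minimal sketch:

```python
import random

random.seed(1)
k = 3          # degrees of freedom
n = 100_000    # number of simulated draws

# Each draw is the sum of squares of k independent standard normals.
samples = [sum(random.gauss(0, 1) ** 2 for _ in range(k)) for _ in range(n)]
mean = sum(samples) / n
print(mean)  # close to k = 3, the mean of a chi-square(k) variable
```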
Confidence interval

If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
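A common concrete case is the interval for a normal mean with known standard deviation, where L and U are x̄ ∓ z·σ/√n. A minimal sketch with hypothetical sample values:

```python
import math

def z_interval(xbar, sigma, n, z=1.96):
    """100(1 - alpha)% CI for a mean with known sigma (z = 1.96 gives 95%)."""
    half = z * sigma / math.sqrt(n)
    return xbar - half, xbar + half

# Hypothetical sample: mean 50.0, known sigma 4.0, n = 64 observations
lo, hi = z_interval(xbar=50.0, sigma=4.0, n=64)
print(lo, hi)  # 49.02 50.98
```

Here L = 49.02 and U = 50.98 are computed from the sample data alone, as the definition requires.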
See Control chart.
Correlation

In the most general usage, a measure of the interdependence among data. The concept may include more than two variables. The term is most commonly used in a narrow sense to express the relationship between quantitative variables or ranks.
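The narrow sense of the term is the sample (Pearson) correlation coefficient between two quantitative variables; a minimal sketch with made-up data:

```python
import math

def pearson_r(x, y):
    """Sample correlation coefficient between two quantitative variables."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0: perfect linear relationship
```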
Cumulative distribution function
For a random variable X, the function of X defined as P(X ≤ x) that is used to specify the probability distribution.
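The sample analogue of this function is the empirical CDF, which estimates P(X ≤ x) as the fraction of observations at or below x. A minimal sketch with a toy sample:

```python
def ecdf(data):
    """Return F with F(x) = P(X <= x) estimated from a sample."""
    sorted_data = sorted(data)
    n = len(sorted_data)
    def F(x):
        # fraction of observations at or below x
        return sum(1 for v in sorted_data if v <= x) / n
    return F

F = ecdf([2, 5, 5, 9])
print(F(5))  # 0.75: three of the four observations are <= 5
print(F(1))  # 0.0
print(F(9))  # 1.0
```

Like any CDF, F is nondecreasing and runs from 0 to 1 as x sweeps across the data.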
Curvilinear regression

An expression sometimes used for nonlinear regression models or polynomial regression models.
Dependent variable

The response variable in regression or a designed experiment.
Designed experiment

An experiment in which the tests are planned in advance and the plans usually incorporate statistical models. See Experiment.
Enumerative study

A study in which a sample from a population is used to make inference about the population. See Analytic study.
Estimate (or point estimate)
The numerical value of a point estimator.
F distribution

The distribution of the random variable defined as the ratio of two independent chi-square random variables, each divided by its number of degrees of freedom.
F-test

Any test of significance involving the F distribution. The most common F-tests are (1) testing hypotheses about the variances or standard deviations of two independent normal distributions, (2) testing hypotheses about treatment means or variance components in the analysis of variance, and (3) testing significance of regression or tests on subsets of parameters in a regression model.
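Case (1) uses the ratio of the two sample variances as the test statistic; a minimal sketch with made-up samples (the comparison against an F critical value is omitted):

```python
def f_statistic(sample1, sample2):
    """F = s1^2 / s2^2 for testing equality of two normal variances."""
    def var(x):
        # sample variance with n - 1 in the denominator
        n = len(x)
        m = sum(x) / n
        return sum((v - m) ** 2 for v in x) / (n - 1)
    return var(sample1) / var(sample2)

# Hypothetical measurements from two processes
f = f_statistic([4.1, 5.2, 6.3, 5.0], [4.9, 5.1, 5.0, 5.2])
print(f)  # compared against an F critical value with (n1 - 1, n2 - 1) df
```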
Fisher’s least significant difference (LSD) method
A series of pair-wise hypothesis tests of treatment means in an experiment to determine which means differ.
Forward selection

A method of variable selection in regression, where variables are inserted one at a time into the model until no other variables that contribute significantly to the model can be found.
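The greedy insert-one-at-a-time loop can be sketched generically; the `score` function, threshold, and per-variable contributions below are all hypothetical stand-ins for a real fit criterion such as a partial F statistic:

```python
def forward_select(candidates, score, threshold=0.0):
    """Greedy forward selection: repeatedly add the variable that most
    improves score(model), stopping when no remaining variable improves
    it by more than threshold. Higher score is assumed to be better."""
    model = []
    remaining = list(candidates)
    best = score(model)
    while remaining:
        gain, var = max((score(model + [v]) - best, v) for v in remaining)
        if gain <= threshold:
            break  # no significant contributor left
        model.append(var)
        remaining.remove(var)
        best += gain
    return model

# Toy additive score (hypothetical): x1 and x2 help, x3 adds nothing
contrib = {"x1": 0.5, "x2": 0.3, "x3": 0.0}
selected = forward_select(["x1", "x2", "x3"], lambda m: sum(contrib[v] for v in m))
print(selected)  # x3 never enters the model
```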