- 10-4.1: What is the dependent variable?
- 10-4.2: What are the independent variables?
- 10-4.3: What are the multiple regression assumptions?
- 10-4.4: Explain what 4540 and 1290 in the equation tell us.
- 10-4.5: What is the predicted income if a person took 8 math classes and wo...
- 10-4.6: What does a multiple correlation coefficient of 0.77 mean?
- 10-4.7: Compute R².
- 10-4.8: Compute the adjusted R².
- 10-4.9: Would the equation be considered a good predictor of income?
- 10-4.10: What are your conclusions about the relationship among courses take...
- 10-4.1: Explain the similarities and differences between simple linear regr...
- 10-4.2: What is the general form of the multiple regression equation? What ...
- 10-4.3: Why would a researcher prefer to conduct a multiple regression stud...
- 10-4.4: What are the assumptions for multiple regression?
- 10-4.5: How do the values of the individual correlation coefficients compar...
- 10-4.6: Age, GPA, and Income A researcher has determined that a significant...
- 10-4.7: Assembly Line Work A manufacturer found that a significant relation...
- 10-4.8: Fat, Calories, and Carbohydrates A nutritionist established a signi...
- 10-4.9: Aspects of Students' Academic Behavior A college statistics professor...
- 10-4.10: Age, Cholesterol, and Sodium A medical researcher found a significa...
- 10-4.11: Explain the meaning of the multiple correlation coefficient R.
- 10-4.12: What is the range of values R can assume?
- 10-4.13: Define R² and the adjusted R².
- 10-4.14: What are the hypotheses used to test the significance of R?
- 10-4.15: What test is used to test the significance of R?
- 10-4.16: What is the meaning of the adjusted R²? Why is it computed?
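The R² and adjusted-R² questions above use the standard relationships: R² is simply the square of the multiple correlation coefficient R, and adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1). A minimal illustrative sketch (the sample size n = 10 and k = 2 independent variables are assumed for illustration only, not taken from any exercise):

```python
def adjusted_r_squared(r2, n, k):
    """Adjusted R^2 = 1 - (1 - R^2)(n - 1)/(n - k - 1), which penalizes
    R^2 for the number of independent variables k given n observations."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

r2 = 0.77 ** 2  # R = 0.77, so R^2 = 0.5929
print(round(adjusted_r_squared(r2, n=10, k=2), 4))  # 0.4766 with the assumed n and k
```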
Solutions for Chapter 10-4: Correlation and Regression
Full solutions for Elementary Statistics: A Step by Step Approach | 7th Edition
The arithmetic mean of a set of numbers x1, x2, …, xn is their sum divided by the number of observations: x̄ = (x1 + x2 + … + xn)/n. The arithmetic mean is usually denoted by x̄ and is often called the average.
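As an illustrative sketch (not part of the original glossary), the definition above translates directly into code:

```python
def arithmetic_mean(values):
    """Sum of the observations divided by the number of observations."""
    return sum(values) / len(values)

print(arithmetic_mean([2, 4, 6, 8]))  # 5.0
```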
A method of variable selection in regression that begins with all of the candidate regressor variables in the model and eliminates the insignificant regressors one at a time until only significant regressors remain.
Binomial random variable
A discrete random variable that equals the number of successes in a fixed number of Bernoulli trials.
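A brief sketch of the corresponding probability mass function (the function name is mine, not from the text):

```python
from math import comb

def binomial_pmf(n, p, x):
    """P(X = x) for X = number of successes in n Bernoulli trials,
    each with success probability p."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

print(binomial_pmf(4, 0.5, 2))  # 0.375
```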
Central composite design (CCD)
A second-order response surface design in k variables consisting of a two-level factorial, 2k axial runs, and one or more center points. The two-level factorial portion of a CCD can be a fractional factorial design when k is large. The CCD is the most widely used design for fitting a second-order model.
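The run count of a full-factorial CCD follows directly from the definition above (2^k factorial runs, 2k axial runs, plus center points); a small sketch, with k = 3 and 3 center points chosen purely for illustration:

```python
def ccd_runs(k, center_points=1):
    """Number of runs in a full-factorial central composite design:
    2^k factorial runs + 2k axial runs + the chosen center points."""
    return 2 ** k + 2 * k + center_points

print(ccd_runs(3, center_points=3))  # 8 + 6 + 3 = 17
```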
Conditional probability mass function
The probability mass function of the conditional probability distribution of a discrete random variable.
When a factorial experiment is run in blocks and the blocks are too small to contain a complete replicate of the experiment, one can run a fraction of the replicate in each block, but this results in losing information on some effects. These effects are linked with or confounded with the blocks. In general, when two factors are varied such that their individual effects cannot be determined separately, their effects are said to be confounded.
The probability 1 − α associated with a confidence interval, expressing the probability that the stated interval will contain the true parameter value.
A probability distribution for a continuous random variable.
A square matrix that contains the correlations among a set of random variables, say X1, X2, …, Xk. The main diagonal elements of the matrix are unity and the off-diagonal elements rij are the correlations between Xi and Xj.
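A minimal pure-Python sketch of this definition (the function and variable names are mine, not from the text):

```python
import math

def correlation_matrix(columns):
    """Pearson correlation matrix for a list of equal-length variables.

    Diagonal entries are 1; the off-diagonal entry r_ij is the
    correlation between variable i and variable j.
    """
    def corr(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    k = len(columns)
    return [[corr(columns[i], columns[j]) for j in range(k)] for i in range(k)]

# Perfectly correlated columns give r = 1; a reversed column gives r = -1.
m = correlation_matrix([[1, 2, 3], [2, 4, 6], [3, 2, 1]])
```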
In hypothesis testing, this is the portion of the sample space of a test statistic that will lead to rejection of the null hypothesis.
The value of a statistic corresponding to a stated significance level, as determined from the sampling distribution. For example, if P(Z ≥ 1.96) = 0.025, then z0.025 = 1.96 is the critical value of z at the 0.025 level of significance.
Crossed factors
Another name for factors that are arranged in a factorial experiment.
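The z critical value used in the critical value entry above (z0.025 = 1.96) can be reproduced with the standard library's `NormalDist`; a brief sketch:

```python
from statistics import NormalDist

# z such that P(Z >= z) = 0.025, i.e. the 97.5th percentile of the
# standard normal distribution
z_crit = NormalDist().inv_cdf(1 - 0.025)
print(round(z_crit, 2))  # 1.96
```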
An expression sometimes used for nonlinear regression models or polynomial regression models.
Defects-per-unit control chart
See U chart
A matrix that provides the tests that are to be conducted in an experiment.
The amount of variability exhibited by data
Another name for a cumulative distribution function.
Error sum of squares
In analysis of variance, this is the portion of total variability that is due to the random component in the data. It is usually based on replication of observations at certain treatment combinations in the experiment. It is sometimes called the residual sum of squares, although that term is better reserved for the case in which the sum of squares is based on the remnants of a model-fitting process rather than on replication.
Extra sum of squares method
A method used in regression analysis to conduct a hypothesis test for the additional contribution of one or more variables to a model.
The geometric mean of a set of n positive data values is the nth root of the product of the data values; that is, x̄g = (x1 · x2 · … · xn)^(1/n).
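As a final illustrative sketch of the entry above:

```python
def geometric_mean(values):
    """The nth root of the product of n positive data values."""
    product = 1.0
    for v in values:
        product *= v
    return product ** (1.0 / len(values))

print(geometric_mean([2, 8]))     # 4.0
print(geometric_mean([1, 3, 9]))  # ~3.0 (cube root of 27)
```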