- Chapter 1: The Role of Statistics and the Data Analysis Process
- Chapter 2: Collecting Data Sensibly
- Chapter 3: Graphical Methods for Describing Data
- Chapter 4: Numerical Methods for Describing Data
- Chapter 5: Summarizing Bivariate Data
- Chapter 6: Probability
- Chapter 7: Random Variables and Probability Distributions
- Chapter 8: Sampling Variability and Sampling Distributions
- Chapter 9: Estimation Using a Single Sample
- Chapter 10: Hypothesis Testing Using a Single Sample
- Chapter 11: Comparing Two Populations or Treatments
- Chapter 12: The Analysis of Categorical Data and Goodness-of-Fit Tests
- Chapter 13: Simple Linear Regression and Correlation: Inferential Methods
- Chapter 14: Multiple Regression Analysis
- Chapter 15: Analysis of Variance
Introduction to Statistics and Data Analysis (with CengageNOW Printed Access Card) (Available Titles CengageNOW) 3rd Edition - Solutions by Chapter
Binomial random variable
A discrete random variable that equals the number of successes in a fixed number of Bernoulli trials.
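As a small illustration (not part of the original glossary), the binomial probability mass function can be written directly from this definition; the trial count n and success probability p below are arbitrary example values.

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for a binomial random variable:
    k successes in n independent Bernoulli trials, each with success probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example values (arbitrary): n = 5 trials, success probability p = 0.4.
n, p = 5, 0.4
pmf = [binomial_pmf(k, n, p) for k in range(n + 1)]

total = sum(pmf)                                 # probabilities sum to 1
mean = sum(k * pk for k, pk in enumerate(pmf))   # E[X] = n * p
```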
Central limit theorem
The simplest form of the central limit theorem states that the sum of n independently distributed random variables will tend to be normally distributed as n becomes large. It is a necessary and sufficient condition that none of the variances of the individual random variables are large in comparison to their sum. There are more general forms of the central limit theorem that allow infinite variances and correlated random variables, and there is a multivariate version of the theorem.
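The tendency described above is easy to see by simulation; the following sketch (not from the glossary; sample sizes are arbitrary) sums uniform random variables, standardizes the sums, and checks that they behave approximately like a standard normal.

```python
import random
from statistics import mean, stdev

random.seed(0)  # make the simulation reproducible

def standardized_sum(n):
    """Sum of n iid Uniform(0,1) variables, centered and scaled.
    Each uniform has mean 1/2 and variance 1/12, so the sum has
    mean n/2 and variance n/12."""
    s = sum(random.random() for _ in range(n))
    return (s - n / 2) / (n / 12) ** 0.5

# Draw many standardized sums; by the central limit theorem they
# should look approximately N(0, 1).
samples = [standardized_sum(30) for _ in range(2000)]
m, s = mean(samples), stdev(samples)
```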
Central tendency
The tendency of data to cluster around some value. Central tendency is usually expressed by a measure of location such as the mean, median, or mode.
Comparative experiment
An experiment in which the treatments (experimental conditions) that are to be studied are included in the experiment. The data from the experiment are used to evaluate the treatments.
Components of variance
The individual components of the total variance that are attributable to specific sources. This usually refers to the individual variance components arising from a random or mixed model analysis of variance.
Conditional probability
The probability of an event given that the random experiment produces an outcome in another event.
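A concrete way to see this definition is to count outcomes; the two-dice events below are made up for illustration.

```python
from itertools import product
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of rolling two fair dice.
outcomes = list(product(range(1, 7), repeat=2))

# Event A: the total is at least 10.  Event B: the first die shows 6.
A = {o for o in outcomes if sum(o) >= 10}
B = {o for o in outcomes if o[0] == 6}

# P(A | B) = P(A and B) / P(B); with equally likely outcomes this
# reduces to a ratio of counts.
p_given = Fraction(len(A & B), len(B))
```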
Confidence interval
If it is possible to write a probability statement of the form P(L ≤ θ ≤ U) = 1 − α, where L and U are functions of only the sample data and θ is a parameter, then the interval between L and U is called a confidence interval (or a 100(1 − α)% confidence interval). The interpretation is that a statement that the parameter θ lies in this interval will be true 100(1 − α)% of the times that such a statement is made.
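A minimal sketch of the construction, assuming the simplest case of a z-interval for a mean with known population standard deviation; the data and sigma below are invented example values.

```python
from statistics import NormalDist, mean

# Example data (made up for illustration); assume the population
# standard deviation sigma is known, so a z-interval applies.
data = [4.8, 5.1, 5.0, 4.9, 5.3, 4.7, 5.2, 5.0]
sigma = 0.2
n = len(data)
xbar = mean(data)

# For a 95% interval, alpha = 0.05 and z is the 0.975 quantile
# of the standard normal distribution.
z = NormalDist().inv_cdf(0.975)
half_width = z * sigma / n**0.5
lower, upper = xbar - half_width, xbar + half_width
```

Here L = x̄ − z·σ/√n and U = x̄ + z·σ/√n play the roles of L and U in the definition.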
Consistent estimator
An estimator that converges in probability to the true value of the estimated parameter as the sample size increases.
Continuous random variable
A random variable with an interval (either finite or infinite) of real numbers for its range.
Continuous uniform random variable
A continuous random variable with range of a finite interval and a constant probability density function.
Contour plot
A two-dimensional graphic used for a bivariate probability density function that displays curves for which the probability density function is constant.
Convolution
A method to derive the probability density function of the sum of two independent random variables from an integral (or sum) of probability density (or mass) functions.
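For discrete random variables the convolution is a sum, p_Z(z) = Σ_k p_X(k)·p_Y(z − k); the sketch below (an illustration, not from the glossary) uses it to get the distribution of the total of two fair dice.

```python
from fractions import Fraction

# pmf of a single fair die
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(px, py):
    """pmf of Z = X + Y for independent X, Y:
    p_Z(z) = sum over k of p_X(k) * p_Y(z - k)."""
    pz = {}
    for i, pi in px.items():
        for j, pj in py.items():
            pz[i + j] = pz.get(i + j, 0) + pi * pj
    return pz

two_dice = convolve(die, die)
```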
Correction factor
A term used for the quantity (1/n)(Σ xᵢ)², which is subtracted from Σ xᵢ² to give the corrected sum of squares, defined as Σ (xᵢ − x̄)², where each sum runs over i = 1, …, n. The correction factor can also be written as n x̄².
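The identity behind this term is easy to check numerically; the data below are arbitrary example values.

```python
# Verify that (sum of squares) - (correction factor) equals the
# corrected sum of squares, on a small made-up data set.
data = [2.0, 4.0, 4.0, 5.0, 7.0]
n = len(data)
xbar = sum(data) / n

sum_sq = sum(x**2 for x in data)              # uncorrected sum of squares
correction = sum(data)**2 / n                 # correction factor, (Σx)²/n
corrected_ss = sum((x - xbar)**2 for x in data)

# The correction factor can equivalently be written as n * xbar**2.
alt_correction = n * xbar**2
```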
Cumulative distribution function
For a random variable X, the function of x defined as P(X ≤ x) that is used to specify the probability distribution.
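For a discrete random variable, the cumulative distribution function is just a running total of the probability mass function; the fair-die example below is illustrative only.

```python
from fractions import Fraction

# pmf of one roll of a fair die
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    """F(x) = P(X <= x) for the discrete random variable above."""
    return sum(p for k, p in pmf.items() if k <= x)
```

Note that F is a step function: it is flat between the possible values and jumps by the pmf at each one.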
Discrete distribution
A probability distribution for a discrete random variable.
Efficiency
A concept in parameter estimation that uses the variances of different estimators; essentially, an estimator is more efficient than another estimator if it has smaller variance. When estimators are biased, the concept requires modification.
Gaussian distribution
Another name for the normal distribution, based on the strong connection of Karl F. Gauss to the normal distribution; often used in physics and electrical engineering applications.
Generating function
A function that is used to determine properties of the probability distribution of a random variable. See Moment-generating function.
Geometric random variable
A discrete random variable that is the number of Bernoulli trials until a success occurs.
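The pmf follows directly from the definition, P(X = k) = p(1 − p)^{k−1} for k = 1, 2, …; the success probability below is an arbitrary example value, and the infinite support is truncated at a point where the remaining tail is negligible.

```python
p = 0.3  # example success probability (arbitrary)

def geometric_pmf(k, p):
    """P(X = k): the first success occurs on trial k (k = 1, 2, ...)."""
    return p * (1 - p)**(k - 1)

# Truncate the infinite support at a large K; the remaining tail
# probability (1 - p)**K is negligible here.
K = 200
total = sum(geometric_pmf(k, p) for k in range(1, K + 1))
mean = sum(k * geometric_pmf(k, p) for k in range(1, K + 1))  # E[X] = 1/p
```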
Hat matrix
In multiple regression, the matrix H = X(X′X)⁻¹X′. This is a projection matrix that maps the vector of observed response values into the vector of fitted values: ŷ = X(X′X)⁻¹X′y = Hy.
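The projection can be verified on a tiny example; the design matrix and response below are made up (a simple linear regression with an intercept column), and the matrix helpers are written out by hand to keep the sketch self-contained.

```python
def matmul(A, B):
    """Matrix product of two lists-of-rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(col) for col in zip(*A)]

def inv2(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

# Design matrix (intercept column plus x = 0, 1, 2) and an example
# response vector -- both invented for illustration.
X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
y = [[1.0], [2.0], [2.0]]

Xt = transpose(X)
H = matmul(matmul(X, inv2(matmul(Xt, X))), Xt)   # H = X (X'X)^{-1} X'
y_hat = matmul(H, y)                             # fitted values: y_hat = H y
```

Because H is a projection matrix, it is symmetric and idempotent (HH = H), which the test below confirms.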