Theory and Examples. Least squares and regression lines

Chapter 14, Problem 65E



Least squares and regression lines When we try to fit a line y = mx + b to a set of numerical data points (x1, y1), (x2, y2), … , (xn, yn) (Figure 14.48), we usually choose the line that minimizes the sum of the squares of the vertical distances from the points to the line. In theory, this means finding the values of m and b that minimize the value of the function

f(m, b) = \sum_{k=1}^{n} (m x_k + b - y_k)^2.

The minimizing values of m and b are given by

m = \frac{\left(\sum x_k\right)\left(\sum y_k\right) - n \sum x_k y_k}{\left(\sum x_k\right)^2 - n \sum x_k^2}, \qquad b = \frac{1}{n}\left(\sum y_k - m \sum x_k\right),

with all sums running from k = 1 to k = n. Many scientific calculators have these formulas built in, enabling you to find m and b with only a few keystrokes after you have entered the data.
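As a sketch of how those closed-form formulas are applied in practice, the function below computes m and b directly from the sums over the data points (the function name and example points are illustrative, not from the text):

```python
# Sketch: compute the least squares slope m and intercept b for the line
# y = m*x + b, using the closed-form formulas for m and b given above.
def least_squares(points):
    """Return (m, b) minimizing the sum of squared vertical deviations."""
    n = len(points)
    sx = sum(x for x, _ in points)        # sum of x_k
    sy = sum(y for _, y in points)        # sum of y_k
    sxy = sum(x * y for x, y in points)   # sum of x_k * y_k
    sxx = sum(x * x for x, _ in points)   # sum of x_k^2
    m = (sx * sy - n * sxy) / (sx * sx - n * sxx)
    b = (sy - m * sx) / n
    return m, b

# The collinear points (0, 1), (1, 3), (2, 5) lie exactly on y = 2x + 1,
# so the least squares line recovers m = 2 and b = 1.
m, b = least_squares([(0, 1), (1, 3), (2, 5)])
```

For noncollinear data the same call returns the trend line that minimizes the squared vertical deviations, which can then be used to predict y at untried values of x via m*x + b.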

The line y = mx + b determined by these values of m and b is called the least squares line, regression line, or trend line for the data under study. Finding a least squares line lets you

1. summarize data with a simple expression,

2. predict values of y for other, experimentally untried values of x,

3. handle data analytically.

FIGURE 14.48 To fit a line to noncollinear points, we choose the line that minimizes the sum of the squares of the deviations.
