Chapter 12, Problem 12.8
The standard error of the slope estimator β̂₁ in a simple linear regression model, se(β̂₁) = σ/√Sxx, gets smaller as Sxx = Σ(xᵢ − x̄)² increases, that is, as the x-values become more spread out. Why don't we always spread the x-values out as much as possible?
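A quick simulation illustrates the first part of the question. The sketch below (plain NumPy; the parameter values and function name are my own choices, not from the problem) fits the least-squares slope repeatedly to data generated from the same line, once with tightly clustered x-values and once with spread-out x-values, and compares the empirical spread of the slope estimates to the theoretical value σ/√Sxx:

```python
import numpy as np

rng = np.random.default_rng(0)
beta0, beta1, sigma, n, reps = 1.0, 2.0, 1.0, 10, 5000

def slope_se(x):
    """Empirical sampling SD of the least-squares slope vs. the theoretical sigma/sqrt(Sxx)."""
    sxx = np.sum((x - x.mean()) ** 2)
    slopes = []
    for _ in range(reps):
        # generate y from the true line plus N(0, sigma^2) noise
        y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=n)
        # closed-form least-squares slope: Sxy / Sxx
        slopes.append(np.sum((x - x.mean()) * y) / sxx)
    return np.std(slopes), sigma / np.sqrt(sxx)

narrow = np.linspace(4.5, 5.5, n)   # small spread -> small Sxx
wide = np.linspace(0.0, 10.0, n)    # large spread -> large Sxx

for label, x in (("narrow", narrow), ("wide", wide)):
    emp, theo = slope_se(x)
    sxx = np.sum((x - x.mean()) ** 2)
    print(f"{label}: Sxx={sxx:7.2f}  empirical SE={emp:.4f}  theoretical SE={theo:.4f}")
```

The wide design gives a Sxx one hundred times larger and a standard error ten times smaller. As for the "why not always" part, the standard caveat is that pushing all the x-values to two extreme points maximizes Sxx but leaves no observations in between, so the data can no longer reveal curvature and the adequacy of the straight-line model cannot be checked.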