a. Suppose $T(w) = \lambda w$. Prove that $(T - \mu I)^k(w) = (\lambda - \mu)^k w$. b. Suppose $\lambda_1, \dots, \lambda_k$ are distinct scalars and $v_1, \dots, v_k$ are generalized eigenvectors of $T$ with corresponding eigenvalues $\lambda_1, \dots, \lambda_k$, respectively. (See Exercise 14 for the definition.) Prove that $\{v_1, \dots, v_k\}$ is a linearly independent set. (Hint: Let $\ell_i$ be the smallest positive integer so that $(T - \lambda_i I)^{\ell_i}(v_i) = 0$, $i = 1, \dots, k$. Proceed as in the proof of Theorem 2.1 of Chapter 6. If $v_{m+1} = c_1 v_1 + \cdots + c_m v_m$, note that $w = (T - \lambda_{m+1} I)^{\ell_{m+1}-1}(v_{m+1})$ is an eigenvector. Using the result of part a, calculate $(T - \lambda_1 I)^{\ell_1}(T - \lambda_2 I)^{\ell_2} \cdots (T - \lambda_m I)^{\ell_m}(w)$ in two ways.)
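The identity in part a can be sketched by induction on $k$ (assuming only the hypothesis $T(w) = \lambda w$ stated above):

```latex
% Base case k = 1, using T(w) = \lambda w:
(T - \mu I)(w) = T(w) - \mu w = \lambda w - \mu w = (\lambda - \mu)\, w.
% Inductive step: assuming (T - \mu I)^{k-1}(w) = (\lambda - \mu)^{k-1} w,
(T - \mu I)^{k}(w) = (T - \mu I)\bigl((\lambda - \mu)^{k-1} w\bigr)
                   = (\lambda - \mu)^{k-1}\,(T - \mu I)(w)
                   = (\lambda - \mu)^{k}\, w.
```

This is the fact the hint invokes: each factor $(T - \lambda_j I)^{\ell_j}$ acts on the eigenvector $w$ simply by the scalar $(\lambda_{m+1} - \lambda_j)^{\ell_j}$, which is nonzero since the $\lambda_i$ are distinct.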

Slope: ➔ change in y over change in x ➔ the coefficient of x in the y=mx+b formula Y-intercept: ➔ the point where x = 0 least-squares regression line ➔ ŷ=a+bx ◆ a is the y-intercept and b is the slope ➔ Predicted = y-intercept + slope * x If r^2 is greater...
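The ŷ=a+bx formula above can be sketched as a tiny function; the intercept and slope values here are made up purely for illustration:

```python
# Prediction from a least-squares regression line: yhat = a + b*x.
# Hypothetical fitted values: a (y-intercept) = 2.0, b (slope) = 0.5.
a = 2.0  # y-intercept: the predicted y when x = 0
b = 0.5  # slope: change in predicted y per one-unit change in x

def predict(x):
    """Return the predicted response yhat = a + b*x."""
    return a + b * x

print(predict(0))   # at x = 0 the prediction is the intercept: 2.0
print(predict(10))  # 2.0 + 0.5*10 = 7.0
```

Note how the two bullet definitions show up directly: predict(0) returns the y-intercept, and predict(x+1) - predict(x) is always the slope.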