Chapter 6, Problem 7
Let {v1, v2} be a basis for the vector space V, and suppose that T1 : V → V and T2 : V → V are the linear transformations satisfying T1(v1) = v1 − v2, T1(v2) = 2v1 + v2, T2(v1) = v1 + 2v2, T2(v2) = 3v1 − v2. Determine (T2T1)(v) for an arbitrary vector v in V.
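The composition can be checked numerically by working in coordinates relative to {v1, v2}: each transformation becomes a 2×2 matrix whose columns are the coordinates of T(v1) and T(v2), and T2T1 corresponds to the matrix product. A minimal sketch, assuming the operators lost in the garbled statement were minus signs (i.e. T1(v1) = v1 − v2 and T2(v2) = 3v1 − v2):

```python
import numpy as np

# Matrices of T1 and T2 relative to the basis {v1, v2}.
# The minus signs below are an assumed reading of the problem statement:
# T1(v1) = v1 - v2, T1(v2) = 2v1 + v2, T2(v1) = v1 + 2v2, T2(v2) = 3v1 - v2.
# Each column holds the coordinates of the image of a basis vector.
A = np.array([[1, 2],
              [-1, 1]])   # matrix of T1
B = np.array([[1, 3],
              [2, -1]])   # matrix of T2

# The composition T2T1 corresponds to the matrix product B @ A.
C = B @ A
print(C)
# [[-2  5]
#  [ 3  3]]

# So for an arbitrary v = a*v1 + b*v2:
# (T2T1)(v) = (-2a + 5b) v1 + (3a + 3b) v2
a, b = 2, 1
coords = C @ np.array([a, b])
print(coords)  # [1 9]
```

Under these assumed signs, (T2T1)(v) = (−2a + 5b)v1 + (3a + 3b)v2 for v = a·v1 + b·v2; if the lost operators were actually plus signs, the same matrix-product method applies with the corresponding columns changed.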