Chapter 1, Problem 31
Let u, v1, ..., vn be vectors in a given vector space. Let a1, ..., an be scalars. Prove that u · (a1v1 + ··· + anvn) = a1(u · v1) + ··· + an(u · vn).
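The identity follows from the bilinearity of the dot product (additivity plus homogeneity in the second argument, applied term by term). As a quick sanity check, the sketch below verifies it numerically in R^3 with the standard dot product; the particular vectors and scalars are arbitrary illustrative values, not part of the original problem.

```python
# Numeric check of u · (a1 v1 + ... + an vn) = a1 (u · v1) + ... + an (u · vn)
# using the standard dot product on R^3. Values below are illustrative only.

def dot(u, v):
    # Standard dot product: sum of componentwise products.
    return sum(x * y for x, y in zip(u, v))

def scale(a, v):
    # Scalar multiple a * v.
    return [a * x for x in v]

def add(v, w):
    # Componentwise vector sum.
    return [x + y for x, y in zip(v, w)]

u = [1.0, -2.0, 3.0]
vs = [[2.0, 0.0, 1.0], [-1.0, 4.0, 5.0]]   # v1, v2
scalars = [3.0, -0.5]                       # a1, a2

# Left-hand side: dot u with the linear combination a1 v1 + a2 v2.
combo = [0.0, 0.0, 0.0]
for a, v in zip(scalars, vs):
    combo = add(combo, scale(a, v))
lhs = dot(u, combo)

# Right-hand side: the same scalars applied to the individual dot products.
rhs = sum(a * dot(u, v) for a, v in zip(scalars, vs))

assert abs(lhs - rhs) < 1e-9
```

This checks only one instance, of course; the proof itself proceeds by expanding both sides coordinate-free, using u · (v + w) = u · v + u · w and u · (a v) = a (u · v), and inducting on n.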