Modern Algebra MATH 542
Class notes for MATH 542 (Modern Algebra) at the University of Wisconsin - Madison, Fall term. Uploaded by Zechariah Hilpert on September 17, 2015.
Linear Algebra for Math 542
JWR
Spring 2001

Contents

1 Preliminaries
  1.1 Sets and Maps
  1.2 Matrix Theory
2 Vector Spaces
  2.1 Vector Spaces
  2.2 Linear Maps
  2.3 Space of Linear Maps
  2.4 Frames and Matrix Representation
  2.5 Null Space and Range
  2.6 Subspaces
  2.7 Examples
    2.7.1 Matrices
    2.7.2 Polynomials
    2.7.3 Trigonometric Polynomials
    2.7.4 Derivative and Integral
  2.8 Exercises
3 Bases and Frames
  3.1 Maps and Sequences
  3.2 Independence
  3.3 Span
  3.4 Basis and Frame
  3.5 Examples and Exercises
  3.6 Cardinality
  3.7 The Dimension Theorem
  3.8 Isomorphism
  3.9 Extraction
  3.10 Extension
  3.11 One-sided Inverses
  3.12 Independence and Span
  3.13 Rank and Nullity
  3.14 Exercises
4 Matrix Representation
  4.1 The Representation Theorem
  4.2 The Transition Matrix
  4.3 Change of Frames
  4.4 Flags
  4.5 Normal Forms
    4.5.1 Zero-One Normal Form
    4.5.2 Row Echelon Form
    4.5.3 Reduced Row Echelon Form
    4.5.4 Diagonalization
    4.5.5 Triangular Matrices
    4.5.6 Strictly Triangular Matrices
  4.6 Exercises
5 Block Diagonalization
  5.1 Direct Sums
  5.2 Idempotents
  5.3 Invariant Decomposition
  5.4 Block Diagonalization
  5.5 Eigenspaces
  5.6 Generalized Eigenspaces
  5.7 Minimal Polynomial
  5.8 Exercises
6 Jordan Normal Form
  6.1 Similarity Invariants
  6.2 Jordan Normal Form
  6.3 Indecomposable Jordan Blocks
  6.4 Partitions
  6.5 Weyr Characteristic
  6.6 Segre Characteristic
  6.7 Jordan-Segre Basis
  6.8 Improved Rank-Nullity Relation
  6.9 Proof of the Jordan Normal Form Theorem
  6.10 Exercises
7 Groups and Normal Forms
  7.1 Matrix Groups
  7.2 Matrix Invariants
  7.3 Normal Forms
  7.4 Exercises
Index

Chapter 1: Preliminaries

1.1 Sets and Maps

We assume that the reader is familiar with the language of sets and maps. The most important concepts are the following.

Definition 1.1.1. Let V and W be sets and T : V → W be a map between them. The map T is called one-one iff x1 = x2 whenever T(x1) = T(x2). The map T is called onto iff for every y ∈ W there is an x ∈ V such that T(x) = y. A map is called one-one
onto iff it is both one-one and onto.

Remark 1.1.2. Think of the equation y = T(x) as a problem to be solved for x. Then the map T : V → W is one-one (respectively onto, one-one onto) if and only if for every y ∈ W the equation y = T(x) has at most one (respectively at least one, exactly one) solution x ∈ V.

Example 1.1.3. The map R → R, x ↦ x^3 is both one-one and onto, since the equation y = x^3 possesses the unique solution x = y^(1/3) ∈ R for every y ∈ R. In contrast, the map R → R, x ↦ x^2 is not one-one, since the equation 4 = x^2 has two distinct solutions, namely x = 2 and x = -2. It is also not onto, since -4 ∈ R but the equation -4 = x^2 has no solution x ∈ R. The equation -4 = x^2 does have a complex solution x = 2i ∈ C, but that solution is not relevant to the question of whether the map R → R, x ↦ x^2 is onto. The maps C → C, x ↦ x^2 and R → R, x ↦ x^2 are different: they have a different source and target. The map C → C, x ↦ x^2 is onto.

Definition 1.1.4. The composition T ∘ S of two maps S : U → V, T : V → W is the map T ∘ S : U → W defined by

(T ∘ S)(u) = T(S(u)) for u ∈ U.

For any set V the identity map I_V : V → V is defined by I_V(u) = u for u ∈ V. It satisfies the identities I_V ∘ S = S for S : U → V and T ∘ I_V = T for T : V → W.

Definition 1.1.5 (Left Inverse). Let T : V → W. A left inverse to T is a map S : W → V such that S ∘ T = I_V.

Theorem 1.1.6 (Left Inverse Principle). A map is one-one if and only if it has a left inverse.

Proof: If S : W → V is a left inverse to T : V → W, then the problem y = T(x) has at most one solution: if y = T(x1) = T(x2), then S(y) = S(T(x1)) = S(T(x2)), hence x1 = x2 since S(T(x)) = I_V(x) = x. Conversely, if the problem y = T(x) has at most one solution, then any map S : W → V which assigns to y ∈ W a solution x of y = T(x) when there is one is a left inverse to T. It does not matter what value S assigns to y when there is no solution. QED

Remark 1.1.7. If T is one-one but not onto, the left inverse is not unique, provided that its source has at least two distinct elements. This is because when T is not onto, there is a y in the target of T which is not in the range of T. We can always make a given left inverse S
into a different one by changing S(y).

Definition 1.1.8 (Right Inverse). Let T : V → W. A right inverse to T is a map R : W → V such that T ∘ R = I_W.

Theorem 1.1.9 (Right Inverse Principle). A map is onto if and only if it has a right inverse.

Proof: If R : W → V is a right inverse to T : V → W, then x = R(y) is a solution to y = T(x), since T(R(y)) = I_W(y) = y. In other words, if T has a right inverse, it is onto. The examples below should convince the reader of the truth of the converse.

Remark 1.1.10. The assertion that there is a right inverse R : W → V to any onto map T : V → W may not seem obvious to someone who thinks of a map as a computer program: even though the problem y = T(x) has a solution x, it may have many, and how is a computer program to choose? If V ⊆ N one could define R(y) to be the smallest x ∈ V which solves y = T(x). But this will not work if V = Z: in this case there may not be a smallest x. In fact, this converse assertion is generally taken as an axiom (the so-called axiom of choice) and can neither be proved (Cohen showed this in 1963) nor disproved (Gödel showed this in 1939) from the other axioms of mathematics. It can, however, be proved in certain cases: for example, when V ⊆ N we just did this. We shall also see that it can be proved in the case of matrix maps, which are the most important maps studied in these notes.

Remark 1.1.11. If T is onto but not one-one, the right inverse is not unique. Indeed, if T is not one-one, then there will be x1 ≠ x2 with T(x1) = T(x2). Let y = T(x1). Given a right inverse R, we may change its value at y to produce two distinct right inverses, one which sends y to x1 and another which sends y to x2.

Definition 1.1.12 (Inverse). Let T : V → W. A two-sided inverse to T is a map T^(-1) : W → V which is both a left inverse to T and a right inverse to T:

T^(-1) ∘ T = I_V,  T ∘ T^(-1) = I_W.

The word inverse unmodified means two-sided inverse. A map is called invertible iff it has a two-sided inverse. As the notation suggests, the inverse T^(-1) to T is unique when it exists. The following easy proposition explains why this is so.

Theorem 1.1.13 (Unique
Inverse Principle). If a map T has both a left inverse and a right inverse, then it has a two-sided inverse. This two-sided inverse is the only one-sided inverse to T.

Proof: Let S : W → V be a left inverse to T and R : W → V be a right inverse. Then S ∘ T = I_V and T ∘ R = I_W. Compose on the right by R in the first equation to obtain (S ∘ T) ∘ R = I_V ∘ R, and use the second to obtain S ∘ I_W = I_V ∘ R. Now composing a map with the identity on either side does not change the map, so we have S = R. This says that S = R is a two-sided inverse. Now if S1 is another left inverse to T, then this same argument shows that S1 = R, that is, S1 = S. Similarly, R is the only right inverse to T. QED

Definition 1.1.14 (Iteration). A map T : V → V from a set to itself can be iterated: for each non-negative integer p, define T^p : V → V by

T^p = T ∘ T ∘ ⋯ ∘ T  (p factors).

The iterate T^p is meaningful for negative integers p as well, when T is invertible. Note the formulas

T^(p+q) = T^p ∘ T^q,  T^0 = I_V,  T^(pq) = (T^p)^q.

1.2 Matrix Theory

Throughout, F denotes a field, such as the rational numbers Q, the real numbers R, or the complex numbers C. We assume the reader is familiar with the following operations from matrix theory:

F^(p×q) × F^(p×q) → F^(p×q), (X, Y) ↦ X + Y   (Addition)
F × F^(p×q) → F^(p×q), (a, X) ↦ aX   (Scalar Multiplication)
0 = 0_(p×q) ∈ F^(p×q)   (Zero Matrix)
F^(m×n) × F^(n×p) → F^(m×p), (A, B) ↦ AB   (Matrix Multiplication)
F^(m×n) → F^(n×m), A ↦ A^T   (Transpose)
F^(m×n) → F^(n×m), A ↦ A^*   (Conjugate Transpose)
I = I_n ∈ F^(n×n)   (Identity Matrix)
F^(n×n) → F^(n×n), A ↦ A^p   (Power)
F^(n×n) → F^(n×n), A ↦ f(A)   (Polynomial Evaluation)

We shall assume that the reader knows the following fact, which is proved by Gaussian Elimination.

Lemma 1.2.1. Suppose that A ∈ F^(m×n) and n > m. Then there is an X ∈ F^(n×1) with AX = 0 but X ≠ 0.

The equation AX = 0 represents a homogeneous system of m linear equations in n unknowns, so the lemma says that a homogeneous linear system with more unknowns than equations possesses a non-trivial solution. Using this lemma we shall prove the all-important

Theorem 1.2.2 (Dimension Theorem). Let A ∈ F^(m×n) and A : F^(n×1) → F^(m×1) be the
corresponding matrix map, A(X) = AX for X ∈ F^(n×1). Then:

1. If A is one-one, then n ≤ m.
2. If A is onto, then m ≤ n.
3. If A is invertible, then m = n.

Proof of 1: Assume n > m. The lemma gives X ≠ 0 with AX = 0 = A0, so A is not one-one.

Proof of 2: Assume m > n. The lemma applied to the transpose A^T gives H ≠ 0 with HA = 0. (Here H = Z^T ∈ F^(1×m), where Z ≠ 0 solves A^T Z = 0.) Choose Y ∈ F^(m×1) with HY ≠ 0. Then for X ∈ F^(n×1) we have H(AX) = (HA)X = 0. Hence AX ≠ Y for all X ∈ F^(n×1), so A is not onto.

Proof of 3: This follows from 1 and 2. QED

Chapter 2: Vector Spaces

A vector space is simply a space endowed with two operations, addition and scalar multiplication, which satisfy the same algebraic laws as matrix addition and scalar multiplication. The archetypal example of a vector space is the space F^(p×q) of all matrices of size p × q, but there are many other examples. Another example is the space Poly_n(F) of all polynomials with coefficients from F of degree ≤ n. The vector space Poly_2(F) of all polynomials f = f(t) of the form f(t) = a0 + a1 t + a2 t^2 and the vector space F^(1×3) of all row matrices A = (a0 a1 a2) are not the same: the elements of the former space are polynomials and the elements of the latter space are matrices, and a polynomial and a matrix are different things. But there is a correspondence between the two spaces: to specify an element of either space is to specify three numbers a0, a1, a2. This correspondence preserves the vector space operations in the sense that if the polynomial f corresponds to the matrix A and the polynomial g corresponds to the matrix B, then the polynomial f + g corresponds to the matrix A + B and the polynomial bf corresponds to the matrix bA. This is just another way of saying that to add matrices we add their entries and to add polynomials we add their coefficients, and similarly for multiplication by a scalar b. What this means is that calculations involving polynomials can often be reduced to calculations involving matrices. This is why we make the definition of vector space: to help us understand what apparently different mathematical objects have in common.
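The polynomial/row-matrix correspondence described above is easy to make concrete. The sketch below (plain Python; the function names are our own, not notation from the notes) models an element of Poly_2(F) by its coefficient list [a0, a1, a2], exactly the data carried by the row matrix (a0 a1 a2), and checks that the operations agree:

```python
# A polynomial a0 + a1*t + a2*t^2 and a row matrix (a0 a1 a2) carry the same
# data: three numbers.  Modeling both by the list [a0, a1, a2], addition and
# scalar multiplication are computed coefficientwise / entrywise.

def poly_add(f, g):
    """Add two polynomials coefficientwise (= matrix addition on the rows)."""
    return [a + b for a, b in zip(f, g)]

def poly_scale(b, f):
    """Multiply a polynomial by the scalar b (= scalar multiplication)."""
    return [b * a for a in f]

def poly_eval(f, t):
    """Evaluate a0 + a1*t + a2*t^2 at t, to check against function addition."""
    return f[0] + f[1] * t + f[2] * t ** 2

f = [1, 2, 0]            # the polynomial 1 + 2t
g = [0, 1, 3]            # the polynomial t + 3t^2
h = poly_add(f, g)       # corresponds to the matrix sum (1 3 3)
assert poly_eval(h, 2) == poly_eval(f, 2) + poly_eval(g, 2)
```

The final assertion is the point of the correspondence: adding coefficient lists produces the same polynomial as adding the polynomials as functions.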
2.1 Vector Spaces

Definition 2.1.1. A vector space over F (a vector space over R is also called a real vector space, and a vector space over C is also called a complex vector space) is a set V endowed with two operations,

addition: V × V → V, (u, v) ↦ u + v
scalar multiplication: F × V → V, (a, v) ↦ av

and having a distinguished element 0 ∈ V, called the zero vector of the vector space, and satisfying the following axioms:

(u + v) + w = u + (v + w)   (additive associative law)
u + v = v + u   (additive commutative law)
u + 0 = u   (additive identity)
a(u + v) = au + av   (left distributive law)
(a + b)u = au + bu   (right distributive law)
(ab)u = a(bu)   (multiplicative associative law)
1v = v   (multiplicative identity)
0v = 0

for u, v, w ∈ V and a, b ∈ F.

The elements of a vector space are sometimes called vectors. For vectors u and v we introduce the abbreviations

-u = (-1)u   (additive inverse)
u - v = u + (-v)   (subtraction)

A great many other algebraic laws follow from the axioms and definitions, but we shall not prove any of them. This is because, for the vector spaces we study, these laws are as obvious as the axioms.

Example 2.1.2. The archetypal example is V = F^(p×q), the space of all p × q matrices with elements from F, with the operations F^(p×q) × F^(p×q) → F^(p×q), (X, Y) ↦ X + Y of matrix addition and F × F^(p×q) → F^(p×q), (a, X) ↦ aX of scalar multiplication, and zero element 0 = 0_(p×q), the p × q zero matrix.

2.2 Linear Maps

Definition 2.2.1. Let V and W be vector spaces. A linear map from V to W is a map T : V → W, defined on V with values in W, which preserves the operations of addition and scalar multiplication in the sense that

T(u + v) = T(u) + T(v) and T(au) = aT(u)

for u, v ∈ V and a ∈ F.

The archetypal example is given by the following.

Theorem 2.2.2. A map A : F^(n×1) → F^(m×1) is linear if and only if there is a (necessarily unique) matrix A ∈ F^(m×n) such that A(X) = AX for all X ∈ F^(n×1). The linear map A is called the matrix map determined by A.

Proof: First assume A is a matrix map. Then

A(aX + bY) = A(aX + bY) = aAX + bAY = aA(X) + bA(Y),

where we have used the distributive law for matrix multiplication. This proves that A is linear.
Conversely, assume that A is linear. We must find the matrix A. Let I_(nj) = col_j(I_n) denote the j-th column of the n × n identity matrix I_n, so that

X = x1 I_(n1) + x2 I_(n2) + ⋯ + xn I_(nn)

for X ∈ F^(n×1), where xj = entry_j(X) is the j-th entry of X. Let A ∈ F^(m×n) be the matrix whose j-th column is A(I_(nj)): col_j(A) = A(I_(nj)). This formula shows the uniqueness of A. Then for X ∈ F^(n×1) we have

A(X) = A(x1 I_(n1) + x2 I_(n2) + ⋯ + xn I_(nn))
     = x1 A(I_(n1)) + x2 A(I_(n2)) + ⋯ + xn A(I_(nn))
     = x1 col_1(A) + x2 col_2(A) + ⋯ + xn col_n(A)
     = AX.

QED

Example 2.2.3. For a given linear map A, the proof of Theorem 2.2.2 shows how to find the matrix A: substitute in the columns I_(nk) = col_k(I_n) of the identity matrix. Here's an example. Define A : F^(3×1) → F^(2×1) by

A(X) = ( 3x1 + x3 )
       ( x1 - x2  )

for X ∈ F^(3×1), where xj = entry_j(X). We find a matrix A ∈ F^(2×3) such that A(X) = AX:

A( (1 0 0)^T ) = (3 1)^T,  A( (0 1 0)^T ) = (0 -1)^T,  A( (0 0 1)^T ) = (1 0)^T,

so

A = ( 3  0  1 )
    ( 1 -1  0 ).

Proposition 2.2.4. The identity map I_V : V → V of a vector space is linear.

Proposition 2.2.5. A composition of linear maps is linear.

Corollary 2.2.6. The iterates T^p of a linear map T : V → V from a vector space to itself are linear maps.

Definition 2.2.7. Let V and W be vector spaces. An isomorphism from V to W is a linear map T : V → W which is invertible. We say that V is isomorphic to W iff there is an isomorphism from V to W.

Theorem 2.2.8. The inverse of an isomorphism is an isomorphism.

Proof: Exercise.

Proposition 2.2.9. Isomorphisms satisfy the following properties:

identity: The identity map I_V : V → V of any vector space V is an isomorphism.
inverse: If T : V → W is an isomorphism, then so is its inverse T^(-1) : W → V.
composition: If S : U → V and T : V → W are isomorphisms, then so is the composition T ∘ S : U → W.

Corollary 2.2.10. Isomorphism is an equivalence relation. This means that it satisfies the following conditions:

reflexivity: Every vector space is isomorphic to itself.
symmetry: If V is isomorphic to W, then W is isomorphic to V.
transitivity: If U is isomorphic to V and V is isomorphic to W, then U is isomorphic to W.

Footnote: The word isomorphism is commonly used in mathematics, with a variety of analogous but different meanings. It comes from the Greek iso, meaning same, and morphos, meaning
structure. The idea is that isomorphic objects should have the same properties.

2.3 Space of Linear Maps

Let V and W be vector spaces. Denote by L(V, W) the space of linear maps from V to W. Thus T ∈ L(V, W) if and only if

(i) T : V → W,
(ii) T(v1 + v2) = T(v1) + T(v2) for v1, v2 ∈ V,
(iii) T(av) = aT(v) for v ∈ V, a ∈ F.

Linear operations on maps from V to W are defined pointwise. This means:

1. If T, S : V → W, then T + S : V → W is defined by (T + S)(v) = T(v) + S(v).
2. If T : V → W and a ∈ F, then aT : V → W is defined by (aT)(v) = aT(v).
3. 0 : V → W is defined by 0(v) = 0.

Proposition 2.3.1. These operations preserve linearity. In other words:

1. T, S ∈ L(V, W) ⟹ T + S ∈ L(V, W);
2. T ∈ L(V, W), a ∈ F ⟹ aT ∈ L(V, W);
3. 0 ∈ L(V, W).

Here ⟹ means "implies".

Hint for proof: For example, to prove 1, assume that T and S satisfy (ii) and (iii) above and show that T + S also does.

By similar methods one can also prove:

Proposition 2.3.2. These operations make L(V, W) a vector space.

The last two propositions make possible the following.

Corollary 2.3.3. The map F^(m×n) → L(F^(n×1), F^(m×1)), A ↦ A, which assigns to each matrix A the matrix map A determined by A, is an isomorphism.

2.4 Frames and Matrix Representation

The space F^(n×1) of all column matrices of a given size is the standard example of a vector space, but not the only example. This space is well suited to calculations with the computer, since computers are good at manipulating arrays of numbers. Now we'll introduce a device for converting problems about vector spaces into problems in matrix theory.

Definition 2.4.1. A frame for a vector space V is an isomorphism Φ : F^(n×1) → V from the standard vector space F^(n×1) to the given vector space V. The idea is that Φ assigns co-ordinates X ∈ F^(n×1) to a vector v ∈ V via the equation v = Φ(X). These co-ordinates enable us to transform problems about vectors into problems about matrices. The frame is a way of "naming" the vectors v; the "names" are the column matrices X. The following propositions are immediate consequences of the Isomorphism Laws and show that there are lots of frames for a vector space.
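A concrete frame can be sketched in code: the isomorphism sending coordinates [a0, a1, a2] to the polynomial a0 + a1 t + a2 t^2 is a frame for Poly_2(R). The names Phi and Phi_inverse below, and the interpolation trick used to invert, are ours for illustration, not notation from the notes:

```python
# A frame Phi : R^{3x1} -> Poly2(R).  Column matrices are modeled as lists
# of three numbers, polynomials as Python functions of t.

def Phi(X):
    """Send coordinates [a0, a1, a2] to the polynomial a0 + a1*t + a2*t^2."""
    a0, a1, a2 = X
    return lambda t: a0 + a1 * t + a2 * t ** 2

def Phi_inverse(p):
    """Recover the coordinates of a quadratic polynomial by sampling it at
    t = 0, 1, -1: three values determine a0, a1, a2 uniquely."""
    a0 = p(0)
    a2 = (p(1) + p(-1)) / 2 - a0   # average kills the odd term
    a1 = p(1) - a0 - a2
    return [a0, a1, a2]

p = Phi([2, -1, 3])                # the polynomial 2 - t + 3t^2
assert Phi_inverse(p) == [2, -1, 3]
```

Because Phi is invertible and preserves addition and scalar multiplication, it "names" each polynomial by a column of three numbers, which is exactly what Definition 2.4.1 asks of a frame.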
Let Φ : F^(n×1) → V be a frame for the vector space V, Ψ : F^(m×1) → W be a frame for the vector space W, and T : V → W be a linear map. These determine a linear map A : F^(n×1) → F^(m×1) by

A = Ψ^(-1) ∘ T ∘ Φ.   (1)

According to Theorem 2.2.2, a linear map from F^(n×1) to F^(m×1) is a matrix map. Thus there is a matrix A ∈ F^(m×n) with

A(X) = AX   (2)

for X ∈ F^(n×1).

Definition 2.4.2 (Matrix Representation). We call the matrix A determined by (1) and (2) the matrix representing T in the frames Φ and Ψ, and say A represents T in the frames Φ and Ψ. When V = W and Φ = Ψ, we also call the matrix A the matrix representing T in the frame Φ and say that A represents T in the frame Φ.

Equation (1) says that Ψ(AX) = T(Φ(X)) for X ∈ F^(n×1). The following diagram provides a handy way of summarizing this:

        V -----T-----> W
        ^              ^
      Φ |              | Ψ
        |              |
   F^(n×1) ---A---> F^(m×1)

Matrix representation is used to convert problems in linear algebra to problems in matrix theory. The laws in this section justify the use of matrix representation as a computational tool.

Proposition 2.4.3. Fix frames Φ : F^(n×1) → V and Ψ : F^(m×1) → W as above. Then the map

F^(m×n) → L(V, W),  A ↦ T = Ψ ∘ A ∘ Φ^(-1)

is an isomorphism. The inverse of this isomorphism is the map which assigns to each linear map T the matrix A which represents T in the frames Φ and Ψ.

Proof: This isomorphism is the composition of two isomorphisms. The first is the isomorphism F^(m×n) → L(F^(n×1), F^(m×1)), A ↦ A, of Corollary 2.3.3, and the second is the isomorphism L(F^(n×1), F^(m×1)) → L(V, W), A ↦ Ψ ∘ A ∘ Φ^(-1). The rest of the argument is routine. QED

Remark 2.4.4. The proposition asserts two kinds of linearity. In the first place, the expression T(v) = Ψ(A(Φ^(-1)(v))) is linear in v for fixed A. This is the meaning of the assertion that T ∈ L(V, W). In the second place, the expression is linear in A for fixed v. This is the meaning of the assertion that the map A ↦ T is linear.

Exercise 2.4.5. Show that for any frame Φ : F^(n×1) → V, the identity matrix I_n represents the identity transformation I_V : V → V in the frame Φ.

Exercise 2.4.6. Show that for any frame Φ : F^(n×1) → V, the identity matrix I_n represents the identity
transformation I_V : V → V in the frame Φ.

Exercise 2.4.7. Suppose

Θ : F^(p×1) → U,  Φ : F^(n×1) → V,  Ψ : F^(m×1) → W

are frames for vector spaces U, V, W, respectively, and that S : U → V, T : V → W are linear maps. Let A ∈ F^(m×n) represent T in the frames Φ and Ψ, and B ∈ F^(n×p) represent S in the frames Θ and Φ. Show that the product AB ∈ F^(m×p) represents the composition T ∘ S : U → W in the frames Θ and Ψ. In other words, composition of linear maps corresponds to multiplication of the representing matrices.

Exercise 2.4.8. Suppose that T : V → V is a linear map from a vector space to itself, that Φ : F^(n×1) → V is a frame, and that A ∈ F^(n×n) represents T in the frame Φ. Show that for every non-negative integer p, the power A^p represents the iterate T^p in the frame Φ. If T is invertible (so that A is invertible), then this holds for negative integers p as well.

Exercise 2.4.9. Let

f(t) = Σ_(p=0)^N c_p t^p

be a polynomial. We can evaluate f on a linear map T : V → V from a vector space to itself. The result is the linear map f(T) : V → V defined by

f(T) = Σ_(p=0)^N c_p T^p.

Suppose that T, Φ, A are as in Exercise 2.4.8. Show that the matrix f(A) represents the map f(T) in the frame Φ.

Exercise 2.4.10. The dual space of a vector space V is the space V* = L(V, F) of linear maps with values in F. Show that the map

F^(1×n) → (F^(n×1))*,  H ↦ H,

defined by H(X) = HX for X ∈ F^(n×1), is an isomorphism between F^(1×n) and the dual space of F^(n×1). We do not distinguish F^(1×n) and (F^(n×1))*.

Exercise 2.4.11. A linear map T : V → W determines a dual linear map T* : W* → V* via the formula T*(α) = α ∘ T for α ∈ W*. Suppose that A is the matrix representing T in the frames Φ : F^(n×1) → V and Ψ : F^(m×1) → W. Find frames Φ* : F^(n×1) → V* and Ψ* : F^(m×1) → W* such that the matrix representing T* in these frames is the transpose A^T.

2.5 Null Space and Range

Let V and W be vector spaces and T : V → W be a linear map. The null space of the linear map T : V → W is the set N(T) of all vectors v ∈ V which are mapped to 0 by T:

N(T) = {v ∈ V : T(v) = 0}.

The null space is also called the kernel by some authors. The range of T is the set R(T) of all vectors w ∈ W of the form w = T(v) for some v ∈ V:

R(T) =
{T(v) : v ∈ V}.

To decide if a vector v is an element of the null space of T, we first check that it lies in V (if v fails this test, it is not in N(T)) and then apply T to v: if we obtain 0, then v ∈ N(T); otherwise v ∉ N(T). To decide if a vector w is an element of the range of T, we first check that it lies in W (if w fails this test, it is not in R(T)) and then attempt to solve the equation w = T(v) for v ∈ V. If we obtain a solution v ∈ V, then w ∈ R(T); otherwise w ∉ R(T).

Warning: It is conceivable that the formula defining T(v) makes sense for certain v which are not elements of V; in this case the equation w = T(v) may have a solution v, but not a solution with v ∈ V. If this happens, w ∉ R(T).

Theorem 2.5.1 (One-One/Null Space). A linear map T : V → W is one-one if and only if N(T) = {0}.

Proof: If N(T) = {0} and v1 and v2 are two solutions of w = T(v), then T(v1) = w = T(v2), so 0 = T(v1) - T(v2) = T(v1 - v2), so v1 - v2 ∈ N(T) = {0}, so v1 - v2 = 0, so v1 = v2. Conversely, if N(T) ≠ {0}, then there is a v1 ∈ N(T) with v1 ≠ 0, so the equation 0 = T(v) has two distinct solutions, namely v = v1 and v = 0. QED

Remark 2.5.2 (Onto/Range). A map T : V → W is onto if and only if W = R(T).

2.6 Subspaces

Definition 2.6.1. Let V be a vector space. A subspace of V is a subset W ⊆ V which contains the zero vector of V and is closed under the operations of addition and scalar multiplication, that is, which satisfies:

zero: 0 ∈ W;
addition: u + v ∈ W whenever u ∈ W and v ∈ W;
scalar multiplication: au ∈ W whenever a ∈ F and u ∈ W.
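For a matrix map A(X) = AX, membership in the null space is decided by Gaussian elimination, and Theorem 2.5.1 becomes a finite check: the map is one-one exactly when elimination finds no free variables. The sketch below is a minimal pure-Python row reduction (our own helper, not code from the notes), using exact Fraction arithmetic to avoid round-off:

```python
from fractions import Fraction

def null_space_basis(A):
    """Row-reduce A (a list of rows) and read off a basis of
    N(A) = {X : AX = 0}; an empty basis means the matrix map is one-one."""
    m, n = len(A), len(A[0])
    R = [[Fraction(x) for x in row] for row in A]
    pivots = []
    r = 0
    for c in range(n):
        # find a pivot in column c at or below row r
        pr = next((i for i in range(r, m) if R[i][c] != 0), None)
        if pr is None:
            continue                      # column c is a free variable
        R[r], R[pr] = R[pr], R[r]
        R[r] = [x / R[r][c] for x in R[r]]
        for i in range(m):                # clear column c elsewhere
            if i != r and R[i][c] != 0:
                R[i] = [a - R[i][c] * b for a, b in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    # one basis vector per free column: set it to 1, solve for the pivots
    basis = []
    for free in range(n):
        if free in pivots:
            continue
        X = [Fraction(0)] * n
        X[free] = Fraction(1)
        for row, c in zip(R, pivots):
            X[c] = -row[free]
        basis.append(X)
    return basis

# A(X) = AX below is not one-one: the rows are proportional, so N(A) != {0}
A = [[1, 2, 3],
     [2, 4, 6]]
B = null_space_basis(A)
assert len(B) == 2        # two free variables, hence a nontrivial null space
```

Running `null_space_basis` on an identity matrix returns the empty list, matching the theorem: N(A) = {0} if and only if the map is one-one.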