Linear Algebra MATH 115A
These 13 pages of class notes were uploaded by Kaylin Wehner on Friday, September 4, 2015. The notes belong to MATH 115A at the University of California, Los Angeles, taught by Staff in Fall.
SVD AND COMPRESSION

Write A = Σᵢ σᵢ uᵢ vᵢᵀ. The best rank-k approximation to A is

    A_k = Σ_{i=1}^{k} σᵢ uᵢ vᵢᵀ.

Storing A_k requires roughly k(m + n + 1) numbers rather than the mn needed for A itself, so if k is a lot smaller than min(m, n), this compresses the information in A.

Interesting questions: for a matrix A, how fast do the σᵢ fall off? Does it matter where A comes from, e.g. whether A is a random matrix or a particular type of matrix (for example, an image)? The faster the σᵢ fall off, the easier it is to compress the information in A.

SYMMETRIC MATRICES (Aᵀ = A), REVIEWED

Recall: there is an orthonormal basis v₁, ..., vₙ of eigenvectors of A with real eigenvalues λ₁, ..., λₙ, so that

    A = Σᵢ λᵢ vᵢ vᵢᵀ,   and hence   Av = Σᵢ λᵢ ⟨v, vᵢ⟩ vᵢ.

In particular Avⱼ = λⱼ vⱼ, and ⟨vᵢ, vⱼ⟩ = 0 for i ≠ j.

COROLLARY. If λ₁, ..., λ_k are the distinct eigenvalues of A, then A = Σⱼ λⱼ Pⱼ, where Pⱼ is the orthogonal projection onto the eigenspace E_{λⱼ}. (If v₁, ..., v_m is an orthonormal basis of an eigenspace, the orthogonal projection onto it is P(v) = Σᵢ ⟨v, vᵢ⟩ vᵢ, i.e. P = Σᵢ vᵢ vᵢᵀ.) So A is a weighted sum of orthogonal projections.

POLAR DECOMPOSITION

If A is a square matrix, then A = QS, where Q is orthogonal and S is symmetric positive semidefinite. Proof: take an SVD A = UΣVᵀ, with Σ = diag(σ₁, ..., σₙ) and σᵢ ≥ 0. Set Q = UVᵀ and S = VΣVᵀ. Then

    QS = UVᵀ VΣVᵀ = UΣVᵀ = A.

Q is orthogonal, as a product of orthogonal matrices. S is symmetric and has the same eigenvalues as Σ, namely the σᵢ ≥ 0, so S is positive semidefinite. Geometrically: S dilates or shrinks along orthogonal axes, and Q then rotates or reflects.

If A is invertible, the decomposition is unique. Indeed, ‖Au‖ = ‖QSu‖ = ‖Su‖ for all u, because Q preserves lengths; so S is determined by A (even when the σᵢ are not all distinct), and then Q = AS⁻¹. This also gives a formula: SᵀS = (QS)ᵀ(QS) = AᵀA, so S = (AᵀA)^{1/2} and Q = AS⁻¹.

PRINCIPAL COMPONENT ANALYSIS (PCA)

PCA is used to find trends in data. Let A be an m × n matrix of data. Rows = individuals (or experiments, or time periods, or ...); columns = measurements. Aᵢⱼ = the measurement of the j-th quantity for individual i.
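The rank-k approximation and polar decomposition described above can be checked numerically. Below is a minimal NumPy sketch (the library choice and the helper names `rank_k_approx` and `polar` are mine, not part of the notes): it builds A_k from the SVD and forms Q = UVᵀ, S = VΣVᵀ for a random square matrix.

```python
import numpy as np

def rank_k_approx(A, k):
    """Best rank-k approximation A_k = sum of the top k terms sigma_i u_i v_i^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

def polar(A):
    """Polar decomposition A = Q S, Q orthogonal, S symmetric positive semidefinite."""
    U, s, Vt = np.linalg.svd(A)
    Q = U @ Vt                   # orthogonal factor
    S = Vt.T @ np.diag(s) @ Vt   # S = V Sigma V^T = (A^T A)^{1/2}
    return Q, S

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Once k reaches the rank, A_k reproduces A exactly.
assert np.allclose(rank_k_approx(A, 4), A)

Q, S = polar(A)
assert np.allclose(Q @ S, A)                     # A = QS
assert np.allclose(Q.T @ Q, np.eye(4))           # Q orthogonal
assert np.allclose(S, S.T)                       # S symmetric
assert np.all(np.linalg.eigvalsh(S) >= -1e-12)   # S positive semidefinite
print("ok")
```

Note how uniqueness shows up in the code: S is computed from the singular values and right singular vectors alone, matching the formula S = (AᵀA)^{1/2}.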
Example: rows = students, columns = exams; Aᵢⱼ = the score of student i on exam j. Example: rows = cities, columns = days of 2009; Aᵢⱼ = the high temperature in city i on day j. Note: there is usually no reason why A should be square.

Pick a collection of "weights" w₁, ..., wₙ. Given these, Xᵢ = Σⱼ wⱼ Aᵢⱼ is a weighted combination of the measurements for individual i. Assuming the measurements are mean-centered (each column of A has mean 0), the variance of X = (X₁, ..., X_m) is

    Var(X) = (1/m) Σᵢ Xᵢ².

QUESTION: For what weights w do we get the biggest variance? Rescaling w rescales X, so we normalize ‖w‖ = 1 and try to maximize Var(X). Now X = Aw, so Var(X) = (1/m)‖Aw‖². So the problem is: maximize ‖Aw‖² subject to ‖w‖ = 1. This is what we solved with the SVD: let A = Σᵢ σᵢ uᵢ vᵢᵀ with σ₁ ≥ σ₂ ≥ ···. Then w = v₁ is the solution, and it is unique (up to sign) if σ₁ > σ₂. The vector v₁ is called the first principal component of A, and X = Av₁ = σ₁u₁ is the corresponding weighted combination of the data.

SOME METHODS

• To test whether a set S of vectors is a subspace of a vector space: Determine whether S is nonempty (check to see if 0 ∈ S). Determine whether S is closed under scalar multiplication and vector addition.

• To test whether u ∈ span(v₁, ..., vₙ): Determine whether the equation a₁v₁ + ··· + aₙvₙ = u has any solution for the aᵢ's. Such a vector equation in an m-dimensional space generally converts to a system of m linear equations in the n unknowns. This system can then be solved by the methods of Math 33A.

• To test whether v₁, ..., vₙ are linearly dependent: See whether the equation a₁v₁ + ··· + aₙvₙ = 0 has nontrivial solutions for the aᵢ's.

• To calculate the coordinate vector [v]_β, where β consists of u₁, ..., uₙ in that order: Solve the equation a₁u₁ + ··· + aₙuₙ = v for the aᵢ's.

• To determine whether a function T: V → W is linear: Compare T(u + v) with T(u) + T(v), and compare T(av) with aT(v). For starters, check whether T(0_V) = 0_W.

• To calculate the matrix representing T: Apply T to the basis vectors of the domain V, then take coordinate vectors relative to the basis for W. This gives the columns of the matrix.

• To solve the equation T(v) = w for v: Find the representing matrix A relative to some bases β and γ, solve the equation Ax = [w]_γ for x, and decoordinatize. There may be more direct methods, depending on the situation.

• To find a basis for ker T: Find a basis for the nullspace of a representing matrix and decoordinatize it. There may be more direct methods, depending on the situation. If all you need is the nullity of T, then all you need is the dimension of the nullspace for the matrix.

• To find a basis for ran T: Find a basis for the column space of a representing matrix and decoordinatize it. There may be more direct methods, depending on the situation. If all you need is the rank of T, then all you need is the rank of the matrix.

• To test whether a linear transformation T is an isomorphism: Is it one-to-one (what is its nullity)? Is it onto (what is its rank)? Because nullity + rank = dimension of the domain, the two questions are related.

• To test whether two vector spaces over the same field are isomorphic: Do they have the same dimension?

• To find the coordinate vector with respect to a new basis: Forget the old basis and calculate the coordinate vector as before. Or else use the equation [v]_β′ = Q⁻¹[v]_β, where Q = [id]_β′^β, so that column j of Q is [u′ⱼ]_β, where u′ⱼ is the j-th basis vector in β′. The latter method gives the connection between the old coordinate vector and the new one.

• To find the matrix representation of a linear operator with respect to a new basis: Forget the old basis and calculate the representing matrix as before. Or else use the equation [T]_β′ = Q⁻¹[T]_β Q, where Q is as above. The latter method gives the connection between the old matrix and the new one.

Now suppose that we have a linear operator T on an n-dimensional vector space V. In order to understand T, we seek a diagonalization, if one exists.

1. Calculate the representing matrix A for T relative to some convenient basis α, such as the standard basis or the best basis we know of; A is an n × n matrix. If T is L_A (matrix multiplication by A), then of course the representing matrix relative to the standard basis for Fⁿ is simply A.

2. Find the characteristic polynomial p of T, which is p(t) = det(A − tI), and its roots λ₁, ..., λ_k. Let mᵢ be the multiplicity of the root λᵢ. (For large n this step might need to be altered.) This polynomial does not depend on our choice of basis. It is a polynomial of degree n in the variable t. The roots λ₁, ..., λ_k are the eigenvalues of T and of A. A scalar λ is an eigenvalue if and only if the matrix A − λI is singular. Over R we have p(t) = ±(t − λ₁)^{m₁} ··· (t − λ_k)^{m_k} · (irreducible quadratics), and m₁ + ··· + m_k ≤ n. Over C we have simply p(t) = ±(t − λ₁)^{m₁} ··· (t − λ_k)^{m_k}, and m₁ + ··· + m_k = n.

3. For each eigenvalue λᵢ, find a basis for the nullspace of A − λᵢI. By decoordinatizing, find a basis for E_{λᵢ}. (E_{λᵢ} is isomorphic to the nullspace of A − λᵢI under the coordinate map v ↦ [v]_α.) We know that 1 ≤ dim E_{λᵢ} ≤ mᵢ; that is, the geometric multiplicity dim E_{λᵢ} does not exceed the algebraic multiplicity mᵢ. Combine the bases for E_{λ₁}, ..., E_{λ_k}. This gives a maximal linearly independent set of eigenvectors for T. (It is linearly independent by the theorem on independence of eigenvectors for different eigenvalues.) And the number of independent eigenvectors is

    dim E_{λ₁} + ··· + dim E_{λ_k} ≤ m₁ + ··· + m_k ≤ n.   (*)

4. Now there are two cases.

4A. The good case: dim E_{λ₁} + ··· + dim E_{λ_k} = n, i.e. equality holds in (*). This happens if and only if both (a) the characteristic polynomial splits completely into linear factors (no irreducible quadratics), and (b) for each i for which mᵢ > 1, the geometric multiplicity dim E_{λᵢ} equals the algebraic multiplicity mᵢ. For example, (a) always happens if the field is C. And (b) always happens if mᵢ = 1 for each i, i.e. if every root of the characteristic polynomial is a simple root. Then we have a basis β of n eigenvectors. The matrix D = [T]_β is an n × n diagonal matrix, and the n entries on the diagonal are exactly the eigenvalues, each repeated according to its multiplicity (i.e. λᵢ is repeated mᵢ times). The diagonal matrix D will equal Q⁻¹AQ, where Q = [id]_β^α, so that the columns of Q are the coordinate vectors relative to α of the eigenvectors in β, i.e. the column vectors from step 3.

4B. The bad case: dim E_{λ₁} + ··· + dim E_{λ_k} < n. This happens if and only if either (a) the characteristic polynomial does not split completely into linear factors, or (b) for some i, the eigenspace E_{λᵢ} has dimension less than mᵢ. In this case T is not diagonalizable. The maximal linearly independent set from step 3 fails to span V. Even in the bad case, we can form a basis with as many eigenvectors as possible. The resulting matrix may not be diagonal, but it might be better than the one we started with. Beyond that, there is something called Jordan normal form (see Math 115B).

Example. Consider the two matrices

    A = [5 3; 0 5]   and   B = [5 0; 0 5].

A and B have the same characteristic polynomial, namely p(t) = (5 − t)². So in both cases there is a single eigenvalue, namely 5, with multiplicity 2. But the eigenspace of A for 5 is a line (the x-axis), while the eigenspace of B for 5 is the entire plane. Hence A is not diagonalizable (its eigenvectors do not span the plane), while B is a diagonal matrix. This example shows that we cannot always tell from the characteristic polynomial alone whether or not a matrix is diagonalizable. If there are roots that are not simple (i.e. have multiplicity greater than 1), then we must investigate the dimension of the eigenspace before we can be sure whether the matrix is diagonalizable.

By the way, the word "eigen" comes from the German. Sometimes, instead of "eigenvalue," the English phrases "characteristic value" or "proper value" are used, but usually the German word is preferred. Literally, the German "Eigenwert" can be translated as "its own value." The idea is that the eigenvalues and eigenvectors of T are determined by T itself, independent of any choice of basis vectors or representations. Eigenvalues and eigenvectors were introduced here to deal with the question of diagonalization. It should be noted that eigenvectors are useful in other situations as well, although we focus primarily on diagonalization.

• To calculate the usual inner product in Rⁿ: ⟨x, y⟩ = yᵀx. To calculate the usual inner product in Cⁿ: ⟨x, y⟩ = y*x, where y* is the conjugate transpose.

• To find S⊥: Solve the system ⟨v, s⟩ = 0 (for all s ∈ S) for v. Often the theorem that (column space of A)⊥ = nullspace of A* is useful.

• To find the coordinate vector [v]_β relative to an orthonormal ordered basis β = (u₁, ..., uₙ): We can proceed as before, or we can use the Fourier coefficients ⟨v, u₁⟩, ..., ⟨v, uₙ⟩, so that v = ⟨v, u₁⟩u₁ + ··· + ⟨v, uₙ⟩uₙ.

• To find the representing matrix relative to an orthonormal ordered basis β = (u₁, ..., uₙ): We can proceed as in Chapter 5, or we can use ([T]_β)ᵢⱼ = ⟨T(uⱼ), uᵢ⟩.

• To find the orthogonal projection p of a vector v onto an m-dimensional subspace W having basis w₁, ..., w_m:

    p = Σ_{j=1}^{m} (⟨v, wⱼ⟩ / ⟨wⱼ, wⱼ⟩) wⱼ   for an orthogonal basis w₁, ..., w_m.
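As a numerical sanity check on the PCA discussion above, here is a small NumPy sketch (the library choice, the toy data, and all variable names are my own assumptions, not part of the notes): it mean-centers a data matrix, takes the first right singular vector v₁, and confirms that no random unit weight vector achieves a larger value of ‖Aw‖² than v₁ does.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data matrix: 50 individuals (rows) x 3 measurements (columns),
# with deliberately unequal spread across the columns.
A = rng.standard_normal((50, 3)) * np.array([3.0, 1.0, 0.5])
A = A - A.mean(axis=0)   # mean-center each column, as the notes assume

# First right singular vector v1 = first principal component.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
v1 = Vt[0]

best = np.linalg.norm(A @ v1) ** 2   # m * Var(X) achieved by w = v1
assert np.isclose(best, s[0] ** 2)   # equals sigma_1^2, since A v1 = sigma_1 u1

# No random unit weight vector should beat v1.
for _ in range(1000):
    w = rng.standard_normal(3)
    w /= np.linalg.norm(w)
    assert np.linalg.norm(A @ w) ** 2 <= best + 1e-9
print("v1 maximizes the variance")
```

The assertion `best == σ₁²` is exactly the claim in the notes: maximizing ‖Aw‖² over unit vectors w is solved by the first right singular vector, with maximum value σ₁².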