Class Note for EECS 841 with Professor Potetz at KU
EECS 841 Computer Vision
Brian Potetz, Fall 2008
Lecture 30: Graphical Models in Computer Vision

Post-Midterm Grades
[Figure: histogram of post-midterm grade counts, binned >95, >90, >85, >80, >75, >70, >50, >25, >0]

Line Fitting using PCA

    X = [1 1; 2 4; 2 2; 3 1; 2 9; 3 9; 4 1; 5 2];  % data points, one (x,y) pair per row
    x0 = mean(X);                                   % centroid of the data
    A = [X(:,1) - x0(1), X(:,2) - x0(2)];           % centered data
    [U, S] = eig(A' * A);
    % find the larger eigenvalue in S and extract from U the
    % corresponding eigenvector
    [s, i] = max(diag(S));
    a = U(:, i);
    % draw the line through the centroid along the principal direction
    e1 = [x0(1) - 40*a(1), x0(1) + 40*a(1)];
    e2 = [x0(2) - 40*a(2), x0(2) + 40*a(2)];
    plot(e1, e2, 'r');
    % this is the line that minimizes the sum of squared perpendicular
    % distances from the points to the line

Joint Distribution: pixel intensity vs. gradient magnitude

    hist = full(hist2D(GM2, im));   % 2-D joint histogram (hist2D: course helper)
    hist = flipud(hist);
    hist = hist / sum(hist(:));     % normalize to a probability table
    imshow(hist);

Joint Distribution, normalized for display purposes

    hist = full(hist2D(GM2, im));
    hist = flipud(hist);
    hist = hist / sum(hist(:));
    imshow(hist / max(max(hist)));

Conditional Distribution P(G|I)

    PGgivenI = hist ./ repmat(sum(hist, 1), size(hist, 1), 1);
    imshow(PGgivenI);

Conditional Distribution P(I|G)

    PIgivenG = hist ./ repmat(sum(hist, 2), 1, size(hist, 2));
    imshow(PIgivenG);

Marginal P(I)

    plot(sum(hist, 1));

Marginal P(G)

    plot(sum(hist, 2));

Conditional Probability
Assume you are a doctor. This is the sample space of patients you might see on any given day.
Outcomes:
    "Non-smoker, female, diabetic, headache, good insurance, etc."
    "Smoker, male, herniated disk, back pain, mildly schizophrenic, delinquent medical bills, etc."
[Slide credit: Carnegie Mellon]

Conditional Probability
Event: Flu.  P(F) = 0.02
Event: Headache (H).  P(H) = 0.10
P(F) and P(H) alone are not enough: we still need to specify the interaction between flu and headache.
Define P(H|F) = fraction of F's outcomes which are also in H.

Conditional Probability
P(F) = 0.02,  P(H) = 0.10,  P(H|F) = 0.50
[Venn diagram: H = headache, F = flu; region probabilities 0.89 (neither), 0.09 (headache only), 0.01 (both), 0.01 (flu only)]
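For readers who want to check the joint / conditional / marginal manipulations outside MATLAB, here is a minimal plain-Python sketch of the same computations. The 4x3 histogram counts are invented for illustration (the slides build them from an image).

```python
# Sketch of the joint / conditional / marginal manipulations above,
# in plain Python. Rows index gradient magnitude G, columns index
# pixel intensity I; the counts are made up for illustration.
counts = [[4, 1, 0],
          [2, 5, 1],
          [0, 3, 6],
          [0, 1, 7]]

total = sum(sum(row) for row in counts)
joint = [[c / total for c in row] for row in counts]           # P(G, I)

# Marginals: sum the joint over the other variable.
p_I = [sum(joint[g][i] for g in range(4)) for i in range(3)]   # P(I)
p_G = [sum(joint[g][i] for i in range(3)) for g in range(4)]   # P(G)

# Conditionals: divide each joint entry by the matching marginal.
p_G_given_I = [[joint[g][i] / p_I[i] for i in range(3)] for g in range(4)]
p_I_given_G = [[joint[g][i] / p_G[g] for i in range(3)] for g in range(4)]
# Every column of P(G|I) and every row of P(I|G) now sums to 1,
# which is exactly what the repmat divisions accomplish in MATLAB.
```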
Conditional Probability
P(H|F) = fraction of flu-worlds in which the patient also has a headache
       = (probability of worlds with flu and headache) / (probability of worlds with flu)
       = size of the "H and F" region / size of the F region
       = 0.01 / 0.02 = 0.50

Conditional Probability
Definition: if A and B are events in S, and P(B) > 0, then the conditional probability of A given B, written P(A|B), is
    P(A|B) = P(A ∧ B) / P(B)

The Chain Rule
A simple rearrangement of the above equation yields
    P(A ∧ B) = P(A|B) P(B)
This is the main Bayes Net concept.

Probabilistic Inference
H = "Have a headache", F = "Coming down with flu"
P(H) = 0.10,  P(F) = 0.02,  P(H|F) = 0.50
One day you wake up with a headache. You think: "Drat! 50% of flus are associated with headaches, so I must have a 50-50 chance of coming down with flu." Is this reasoning good?

Probabilistic Inference
P(F|H) = P(F ∧ H) / P(H) = P(H|F) P(F) / P(H) = (0.50 × 0.02) / 0.10 = 0.10

What we just did:
    P(A|B) = P(A ∧ B) / P(B) = P(B|A) P(A) / P(B)
This is Bayes' Rule.
Bayes, Thomas (1763). "An essay towards solving a problem in the doctrine of chances." Philosophical Transactions of the Royal Society of London, 53, 370-418.

More General Forms of Bayes Rule
    P(B|A) = P(A|B) P(B) / [ P(A|B) P(B) + P(A|¬B) P(¬B) ]
    P(B|A ∧ C) = P(A|B ∧ C) P(B|C) / P(A|C)
    P(B_i|A) = P(A|B_i) P(B_i) / Σ_k P(A|B_k) P(B_k)

Independence
Definition: two events A and B are statistically independent if
    P(A|B) = P(A)
which is equivalent to
    P(A ∧ B) = P(A) P(B)
Important for Bayes Nets.

Representing P(A,B,C)
How can we represent the functions P(A), P(A ∧ B), P(A ∧ B ∧ C)?

The Joint Probability Table: example P(A,B,C)
Recipe for making a joint distribution of M variables:
1. Make a truth table listing all combinations of values of your variables (if there are M boolean variables, the table will have 2^M rows).
2. For each combination of values, say how probable it is.
3. If you subscribe to the axioms of probability, those numbers must sum to 1.
[Table of A, B, C vs. Prob, shown alongside a Venn diagram whose region probabilities sum to 1]
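The flu/headache inference is small enough to verify numerically; a quick sketch using the slides' numbers:

```python
# Numbers from the flu/headache slides.
p_F = 0.02          # P(flu)
p_H = 0.10          # P(headache)
p_H_given_F = 0.50  # P(headache | flu)

# Bayes' rule: P(F|H) = P(H|F) P(F) / P(H)
p_F_given_H = p_H_given_F * p_F / p_H
print(round(p_F_given_H, 4))  # 0.1 -- a 10% chance of flu, not 50-50
```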
Using the JPT: example
A joint distribution over gender, hours worked, and wealth:

    gender   hours worked   wealth   prob
    Female   < 40.5         poor     0.253122
    Female   < 40.5         rich     0.0245895
    Female   >= 40.5        poor     0.0421768
    Female   >= 40.5        rich     0.0116293
    Male     < 40.5         poor     0.331313
    Male     < 40.5         rich     0.0971295
    Male     >= 40.5        poor     0.134106
    Male     >= 40.5        rich     0.105933

Once you have the JPT, you can ask for the probability of any logical expression by summing the matching rows:
    P(E) = Σ_{rows matching E} P(row)

What is P(Poor ∧ Male)?
    P(Poor ∧ Male) = 0.4654

What is P(Poor)?
    P(Poor) = 0.7604

Inference with the JPT:
    P(E1|E2) = P(E1 ∧ E2) / P(E2)
             = Σ_{rows matching E1 and E2} P(row) / Σ_{rows matching E2} P(row)

What is P(Male|Poor)?
    P(Male|Poor) = 0.4654 / 0.7604 = 0.612

Inference is a big deal
"I've got this evidence. What's the chance that this conclusion is true?"
"I've got a sore neck: how likely am I to have meningitis?"
There's a thriving set of industries growing based around Bayesian inference. Highlights are: medicine, pharma, help desk support, engine fault diagnosis.

Learning a JPT
Build a joint probability table for your attributes in which the probabilities are unspecified. Then fill in each row with
    P(row) = (# records matching row) / (total number of records)
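The census JPT queries can be reproduced mechanically by summing matching rows; a sketch with the table values transcribed from the slides (the helper `prob` is just an illustration of the P(E) rule):

```python
# JPT rows transcribed from the slides: (gender, hours, wealth) -> probability.
jpt = {
    ('F', '<40.5', 'poor'): 0.253122,
    ('F', '<40.5', 'rich'): 0.0245895,
    ('F', '>=40.5', 'poor'): 0.0421768,
    ('F', '>=40.5', 'rich'): 0.0116293,
    ('M', '<40.5', 'poor'): 0.331313,
    ('M', '<40.5', 'rich'): 0.0971295,
    ('M', '>=40.5', 'poor'): 0.134106,
    ('M', '>=40.5', 'rich'): 0.105933,
}

def prob(match):
    """P(E) = sum of P(row) over rows matching the predicate E."""
    return sum(p for row, p in jpt.items() if match(row))

p_poor_male = prob(lambda r: r[2] == 'poor' and r[0] == 'M')  # ~0.4654
p_poor = prob(lambda r: r[2] == 'poor')                       # ~0.7607
p_male_given_poor = p_poor_male / p_poor                      # ~0.612
```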
For example, the entry for the row (A = 1, B = 1, C = 0) is the fraction of all records in which A and B are true but C is false.
[Example truth tables over A, B, C: one with unspecified probabilities, one filled in from the data]

Where are we?
- We have recalled the fundamentals of probability.
- We have become content with what JPTs are and how to use them.
- And we even know how to learn JPTs from data.

Density Estimation
Our Joint Probability Table (JPT) learner is our first example of something called density estimation.
A density estimator learns a mapping from a set of attributes to a probability:
    input attributes → density estimator → P(input)
Can do Bayesian classification: predict the class of a record, e.g. predict cancerous / not cancerous.

Summary: the good news
The JPT allows us to learn P(X) from data.
- Can do inference: P(E1|E2). (Automatic doctor, recommender, etc.)
- Can do anomaly detection: spot suspicious / incorrect records (e.g. credit card fraud).

Summary: the bad news
Density estimation with JPTs is trivial, mindless, and dangerous.

Using a test set

    Set            Size   Log likelihood
    Training Set   196    -466.1905
    Test Set       196    -614.6157

An independent test set with 196 cars has a much worse log likelihood than the training set had: actually, it's a billion quintillion quintillion quintillion quintillion times less likely. Density estimators can overfit, and the JPT estimator is the overfittiest of them all!

Overfitting Density Estimators
[Table: a JPT over mpg, modelyear, and maker learned from data, containing many zero entries]
If a zero entry ever happens, it means there are certain combinations that we learn are impossible.

Is there a better way?
The problem with the JPT is that it just mirrors the training data. In fact, it is just another way of storing the data: we could reconstruct the original dataset perfectly from it! We need to represent the probability function with fewer parameters.
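The danger can be demonstrated on a toy dataset: a JPT learned from a handful of records assigns probability zero to every combination it never saw, so a single unlucky test record drives the test-set log likelihood to minus infinity. A minimal sketch with invented data:

```python
import math
from collections import Counter

# Tiny invented dataset over three boolean attributes (A, B, C).
train = [(1, 0, 0), (1, 0, 0), (0, 1, 1), (1, 1, 0), (0, 1, 1), (1, 0, 1)]
test = [(1, 0, 0), (0, 0, 1)]      # (0, 0, 1) never occurs in training

# Learn the JPT: P(row) = (# records matching row) / (total records).
jpt = {row: c / len(train) for row, c in Counter(train).items()}

def log_likelihood(records):
    """Sum of log P(row); unseen rows have P = 0, i.e. log P = -inf."""
    return sum(math.log(jpt[r]) if r in jpt else -math.inf for r in records)

# The training data looks fine, but one unseen test record is declared
# "impossible" and the whole test-set log likelihood collapses.
train_ll = log_likelihood(train)   # finite
test_ll = log_likelihood(test)     # -inf
```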
Bayes Nets
What are they? Bayesian nets are a framework for representing and analyzing models involving uncertainty.
What are they used for? Intelligent decision aids, data fusion, 3-D feature recognition, intelligent diagnostic aids, automated free text understanding, data mining.
How are they different from other knowledge representation and probabilistic analysis tools? Uncertainty is handled in a mathematically rigorous yet efficient and simple way.

Bayes Net Concepts
1. Chain Rule: P(A ∧ B) = P(A) P(B|A)
2. Conditional Independence: P(A|B ∧ C) = P(A|B)

A Simple Bayes Net
Let's assume that we already have the joint P(Mpg, Horse):
    P(good, low) = 0.36    P(good, high) = 0.04
    P(bad,  low) = 0.12    P(bad,  high) = 0.48
How would you rewrite this using the Chain rule?

Review: Chain Rule
    P(Mpg, Horse) = P(Mpg) P(Horse|Mpg)
with
    P(Mpg):        P(good) = 0.4,  P(bad) = 0.6
    P(Horse|Mpg):  P(low|good) = 0.90,  P(high|good) = 0.10
                   P(low|bad)  = 0.21,  P(high|bad)  = 0.79
For example: P(good ∧ low) = P(good) P(low|good) = 0.4 × 0.90 = 0.36.

How to Make a Bayes Net
    P(Mpg, Horse) = P(Mpg) P(Horse|Mpg)
Draw a node per variable and an arc per conditional dependence:
    Mpg → Horse
The Mpg node stores P(Mpg); the Horse node stores P(Horse|Mpg). Each node is a probability function; each arc denotes conditional dependence.

How to Make a Bayes Net
So what have we accomplished thus far? Nothing! We've just "Bayes-Netified" the JPT using the chain rule. The real excitement starts when we wield conditional independence.
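The two-node factorization can be checked numerically; a short sketch using the CPT numbers from the slides:

```python
# CPTs from the slides.
p_mpg = {'good': 0.4, 'bad': 0.6}
p_horse_given_mpg = {('low', 'good'): 0.90, ('high', 'good'): 0.10,
                     ('low', 'bad'):  0.21, ('high', 'bad'):  0.79}

# Chain rule: P(mpg, horse) = P(mpg) * P(horse | mpg)
def joint(mpg, horse):
    return p_mpg[mpg] * p_horse_given_mpg[(horse, mpg)]

print(round(joint('good', 'low'), 2))  # 0.36

# The factored form is still a valid joint distribution: it sums to 1.
total = sum(joint(m, h) for m in p_mpg for h in ('low', 'high'))
```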
How to Make a Bayes Net
Before we continue, we need a worthier opponent than puny P(Mpg, Horse). We'll use P(Mpg, Horse, Accel). (Note: I made these up.)
    P(good, low,  slow) = 0.37    P(good, low,  fast) = 0.01
    P(good, high, slow) = 0.02    P(good, high, fast) = 0.00
    P(bad,  low,  slow) = 0.10    P(bad,  low,  fast) = 0.12
    P(bad,  high, slow) = 0.15    P(bad,  high, fast) = 0.22

How to Make a Bayes Net
Step 1: rewrite the joint using the Chain rule:
    P(Mpg, Horse, Accel) = P(Mpg) P(Horse|Mpg) P(Accel|Mpg ∧ Horse)
(Obviously, we could have written this 3! = 6 different ways.)

The resulting net: Mpg → Horse, with arcs from both Mpg and Horse into Accel. Node Mpg stores P(Mpg), node Horse stores P(Horse|Mpg), and node Accel stores P(Accel|Mpg ∧ Horse):
    P(Mpg):  P(good) = 0.4,  P(bad) = 0.6
    P(Horse|Mpg):
        P(low|good) = 0.90    P(high|good) = 0.10
        P(low|bad)  = 0.21    P(high|bad)  = 0.79
    P(Accel|Mpg ∧ Horse):
        P(slow|good, low) = 0.97    P(slow|good, high) = 0.15
        P(slow|bad,  low) = 0.90    P(slow|bad,  high) = 0.05
        P(fast|good, low) = 0.03    P(fast|good, high) = 0.85
        P(fast|bad,  low) = 0.10    P(fast|bad,  high) = 0.95
(Note: I made these up too.)

A Miracle Occurs!
You are told by God (or another domain expert) that Accel is independent of Mpg given Horse, i.e.
    P(Accel|Mpg ∧ Horse) = P(Accel|Horse)

How to Make a Bayes Net
Thank you, domain expert! Now I only need to learn 5 parameters instead of 7 from my data. My parameter estimates will be more accurate as a result. The arc from Mpg to Accel disappears, and the Accel node only needs
    P(Accel|Horse):
        P(slow|low) = 0.22    P(slow|high) = 0.64
        P(fast|low) = 0.78    P(fast|high) = 0.36
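The expert's claim pays off concretely: with Accel independent of Mpg given Horse, the joint factors as P(Mpg) P(Horse|Mpg) P(Accel|Horse), which needs 1 + 2 + 2 = 5 free parameters instead of the 7 of a full three-variable JPT. A sketch using the slides' CPT numbers (the factored joint below is illustrative, not the slides' original table):

```python
# Factored model: P(Mpg, Horse, Accel) = P(Mpg) P(Horse|Mpg) P(Accel|Horse)
p_mpg = {'good': 0.4, 'bad': 0.6}
p_horse = {('low', 'good'): 0.90, ('high', 'good'): 0.10,
           ('low', 'bad'):  0.21, ('high', 'bad'):  0.79}
p_accel = {('slow', 'low'): 0.22, ('fast', 'low'): 0.78,
           ('slow', 'high'): 0.64, ('fast', 'high'): 0.36}

def joint(m, h, a):
    return p_mpg[m] * p_horse[(h, m)] * p_accel[(a, h)]

# Free parameters: 1 for P(Mpg), 2 for P(Horse|Mpg), 2 for P(Accel|Horse),
# versus 2**3 - 1 = 7 for the full joint probability table.
n_params = 1 + 2 + 2

# The factored form is still a valid joint over all 8 outcomes.
total = sum(joint(m, h, a)
            for m in ('good', 'bad')
            for h in ('low', 'high')
            for a in ('slow', 'fast'))
```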
Independence
"The Acceleration does not depend on the Mpg once I know the Horsepower."
This can be specified very simply:
    P(Accel|Mpg ∧ Horse) = P(Accel|Horse)
This is a powerful statement! It required extra domain knowledge, a different kind of knowledge than numerical probabilities: it needed an understanding of causation.

Bayes Nets Formalized
A Bayes net (also called a belief network) is an augmented directed acyclic graph, represented by the pair (V, E), where:
- V is a set of vertices.
- E is a set of directed edges joining vertices. No loops of any length are allowed.
Each vertex in V contains the following information: a Conditional Probability Table (CPT) indicating how this variable's probabilities depend on all possible combinations of parental values.

Bayes Nets Summary
Bayes nets are a factorization of the full JPT which uses the chain rule and conditional independence. They can do everything a JPT can do, like quick, cheap lookups of probabilities.

The good news
We can do inference: we can compute any conditional probability
    P(some variable | some other variable values)
    P(E1|E2) = P(E1 ∧ E2) / P(E2)
             = Σ_{joint entries matching E1 and E2} P(joint entry) / Σ_{joint entries matching E2} P(joint entry)
Suppose you have m binary-valued variables in your Bayes net and expression E2 mentions k variables. How much work is the above computation?

The sad, bad news
Doing inference JPT-style, by enumerating all matching entries in the joint, is expensive: exponential in the number of variables. But perhaps there are faster ways of querying Bayes nets? In fact, if I ever ask you to manually do a Bayes net inference, you'll find there are often many tricks to save you time. So we've just got to program our computer to do those tricks too, right?
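Brute-force inference over a Bayes net can be sketched by enumerating every joint entry, which makes the exponential cost explicit: the loop below visits all 2^m assignments. This reuses the factored Mpg/Horse/Accel model with the slides' CPT numbers; the query P(slow | good) is an invented example.

```python
from itertools import product

# Factored net from the slides: P(M, H, A) = P(M) P(H|M) P(A|H).
p_mpg = {'good': 0.4, 'bad': 0.6}
p_horse = {('low', 'good'): 0.90, ('high', 'good'): 0.10,
           ('low', 'bad'):  0.21, ('high', 'bad'):  0.79}
p_accel = {('slow', 'low'): 0.22, ('fast', 'low'): 0.78,
           ('slow', 'high'): 0.64, ('fast', 'high'): 0.36}

def joint(m, h, a):
    return p_mpg[m] * p_horse[(h, m)] * p_accel[(a, h)]

def query(e1, e2):
    """P(E1|E2) by enumerating all joint entries; e1 and e2 are
    predicates over a full assignment (m, h, a). This loop is
    exponential in the number of variables."""
    num = den = 0.0
    for m, h, a in product(('good', 'bad'), ('low', 'high'), ('slow', 'fast')):
        p = joint(m, h, a)                 # one joint entry
        if e2((m, h, a)):
            den += p
            if e1((m, h, a)):
                num += p
    return num / den

# e.g. P(Accel = slow | Mpg = good) = 0.90*0.22 + 0.10*0.64 = 0.262
p_slow_given_good = query(lambda s: s[2] == 'slow', lambda s: s[0] == 'good')
```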
Sadder and worse news
General querying of Bayes nets is NP-complete.

Case Study I: the Pathfinder system
Heckerman (1991), Probabilistic Similarity Networks, MIT Press, Cambridge, MA.
- Diagnostic system for lymph-node diseases.
- 60 diseases and 100 symptoms and test results; 14,000 probabilities.
- Expert consulted to make the net: 8 hours to determine variables, 35 hours for net topology, 40 hours for probability table values.
- Apparently, the experts found it quite easy to invent the causal links and probabilities.
- Pathfinder is now outperforming the world experts in diagnosis, and is being extended to several dozen other medical domains.

Bayes Net Info
GUI packages: Genie (free), Netica, Hugin.
Non-GUI packages: all of the above have APIs; BNT for MATLAB; AUTON code (learning extremely large networks of tens of thousands of nodes).

Bayes Nets in Computer Vision
Yuille's deformable eye template, "the first eye template": an image model of the eye with parts for the iris, pupil, and whites.
[Figure: eye image with the deformable template overlaid]