Advanced Software Engineering CSI 5V93
These 7 pages of class notes were uploaded by Melvin Bednar on Saturday, October 3, 2015. The notes belong to CSI 5V93 at Baylor University, taught by Staff in Fall. Since upload, they have received 13 views. For similar materials see /class/217937/csi-5v93-baylor-university in Computer Science at Baylor University.
Lecture 13: Bayesian learning
CSI 5v93: Introduction to machine learning
Baylor University, Computer Science Department
Dr. Greg Hamerly
http://cs.baylor.edu/~hamerly

Announcements
- Homework 3 is due today.
- Homework 4 will be assigned soon.

Questions

Bayesian learning and naive Bayes
- Bayes' rule and modelling
- naive Bayes
- smoothing
- binning continuous variables
- applications to text
- shaping probabilities
See section 6.6.3 in your book; also handouts from Mitchell.

Bayes' rule
Pr(M|X) = Pr(X|M) Pr(M) / Pr(X)
Breaking it apart: M is the model (class) and X is the input. Pr(M|X) is the posterior, Pr(X|M) the likelihood, Pr(M) the prior, and Pr(X) the evidence.

Bayes' rule
Pr(M|X) = Pr(X|M) Pr(M) / Pr(X)
Questions:
- Why are we modelling Pr(M|X)?
- How does each probability affect Pr(M|X)?

Cancer example from Mitchell's text
- A test has two outcomes: positive (+) or negative (-).
- A patient either has cancer (cancer) or does not (not cancer).
Here are the known probabilities:
P(cancer) = 0.008        P(not cancer) = 0.992
P(+ | cancer) = 0.98     P(- | cancer) = 0.02
P(+ | not cancer) = 0.03 P(- | not cancer) = 0.97
If the test is positive, what is the likely diagnosis: cancer or not?

Using Bayes' rule for classification
Pr(M|X) = Pr(X|M) Pr(M) / Pr(X)
How do we use this to classify an input x? Different methods, depending on our assumptions:
- MAP (maximum a posteriori)
- ML (maximum likelihood)

Naive Bayes classifier
A naive Bayes classifier is based on Bayes' rule. It's a simple, effective tool for learning from data. It's called "naive" because of the...

Naive conditional independence assumption
Assumption: input variable i is independent of input variable j, given the class. This is called conditional independence. In probability terms, if input x has features x_i and x_j:
Pr(x_i, x_j | c) = Pr(x_i | c) Pr(x_j | c)
What does this mean?

Back to Bayes' rule
Pr(M|X) = Pr(X|M) Pr(M) / Pr(X)
The naive assumption affects Pr(X|M), which can be expanded as
Pr(X|M) = Pr(X_1|M) Pr(X_2|M) ... Pr(X_d|M) = prod_{i=1..d} Pr(X_i|M)
So instead of having one multivariate model for d-dimensional data, we have d univariate models.

Naive Bayes probabilities
Starting again with Bayes' rule
Pr(M|X) = Pr(X|M) Pr(M) / Pr(X)
and substituting our new definition
Pr(X|M) = prod_{i=1..d} Pr(X_i|M)
we get the new probability
Pr(M|X) = Pr(M) prod_{i=1..d} Pr(X_i|M) / Pr(X)

Naive Bayes classification
MAP classification:
m_MAP = argmax_{m in M} Pr(M = m | X)
      = argmax_{m in M} Pr(M = m) prod_{i=1..d} Pr(X_i | M = m) / Pr(X)
      = argmax_{m in M} Pr(M = m) prod_{i=1..d} Pr(X_i | M = m)
(Pr(X) is the same for every class m, so it can be dropped from the argmax.)

2-minute journal
Please write a response to the following on a piece of paper and hand it in immediately. Please make it anonymous (no names). Write about:
- major points you learned today
- areas not understood or requiring clarification
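As a small sketch (not part of the slides), the cancer example can be worked out directly with Bayes' rule. The variable names below are illustrative; the probabilities are the ones given in the slide:

```python
# Mitchell's cancer example: apply Bayes' rule to a positive test result.
p_cancer = 0.008
p_no_cancer = 0.992
p_pos_given_cancer = 0.98
p_pos_given_no_cancer = 0.03

# Unnormalized posteriors: Pr(M) * Pr(X|M) for each hypothesis.
post_cancer = p_cancer * p_pos_given_cancer          # 0.00784
post_no_cancer = p_no_cancer * p_pos_given_no_cancer # 0.02976

# Pr(X) is the sum over hypotheses, so the normalized posterior is:
p_x = post_cancer + post_no_cancer
print(post_cancer / p_x)  # ~0.21, so the MAP diagnosis is "no cancer"
```

Even with a positive test, the MAP classification is "no cancer", because the prior Pr(cancer) = 0.008 is so small that it dominates the likelihood ratio.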
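The MAP rule from the "Naive Bayes classification" slide can be sketched as code. This is a minimal illustration, not from the lecture: the toy weather data, function names, and the tiny probability floor for unseen feature values (used in place of the smoothing the slides mention) are all assumptions.

```python
import math
from collections import defaultdict

def train(examples, labels):
    """Estimate Pr(M = m) and Pr(X_i = v | M = m) from counts."""
    priors = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))  # (class, feature i) -> value counts
    for x, y in zip(examples, labels):
        priors[y] += 1
        for i, v in enumerate(x):
            counts[(y, i)][v] += 1
    n = len(labels)
    prior_p = {m: c / n for m, c in priors.items()}
    cond_p = {key: {v: c / priors[key[0]] for v, c in vc.items()}
              for key, vc in counts.items()}
    return prior_p, cond_p

def classify(x, prior_p, cond_p):
    """MAP decision: argmax_m Pr(M=m) * prod_i Pr(X_i | M=m), in log space."""
    best_m, best_score = None, -math.inf
    for m, pm in prior_p.items():
        score = math.log(pm)
        for i, v in enumerate(x):
            # Tiny floor for unseen values; a real implementation would smooth.
            score += math.log(cond_p.get((m, i), {}).get(v, 1e-9))
        if score > best_score:
            best_m, best_score = m, score
    return best_m

# Toy data: features are (outlook, temperature).
examples = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["play", "play", "stay", "stay"]
prior_p, cond_p = train(examples, labels)
print(classify(("sunny", "cool"), prior_p, cond_p))  # prints "play"
```

Working in log space turns the product over features into a sum, which avoids floating-point underflow when d is large.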