# Foundations of Artificial Intelligence (CS 2710)

University of Pittsburgh



This 23-page set of class notes was uploaded by Jonas Bartell on Monday, October 26, 2015. The notes belong to CS 2710 at the University of Pittsburgh, taught by Staff in Fall. Since its upload, it has received 48 views. For similar materials see /class/229388/cs2710-university-of-pittsburgh in Computer Science at the University of Pittsburgh.


Date Created: 10/26/15

## CS 2710 Foundations of AI — Lecture 18: Inference in Bayesian belief networks

Milos Hauskrecht, milos@cs.pitt.edu, 5329 Sennott Square

Tic-tac-toe competition — three winners: Bryan Mills, Yaw Gyamfi, Lei Jin.

### Inference in Bayesian networks

A Bayesian belief network (BBN) models the full joint distribution compactly by taking advantage of the independences between variables, which simplifies the acquisition of a probabilistic model. But we are interested in solving various inference tasks:

- **Diagnostic task** (from effect to cause): $P(\text{Burglary} \mid \text{JohnCalls} = T)$
- **Prediction task** (from cause to effect): $P(\text{JohnCalls} \mid \text{Burglary} = T)$
- **Other probabilistic queries** (queries on joint distributions): $P(\text{Alarm})$

Main issue: can we take advantage of the independences to construct special algorithms that speed up the inference?

Bad news:

- Exact inference in BBNs is NP-hard (Cooper).
- Approximate inference is NP-hard (Dagum and Luby).

But very often we can achieve significant improvements.

### Approach 1: blind enumeration

Assume the Alarm network, and assume we want to compute $P(J{=}T)$. Sum out all uninstantiated variables from the full joint, expressing the joint distribution as a product of conditionals:

$$
P(J{=}T) = \sum_{b \in \{T,F\}} \sum_{e \in \{T,F\}} \sum_{a \in \{T,F\}} \sum_{m \in \{T,F\}} P(J{=}T \mid A{=}a)\, P(M{=}m \mid A{=}a)\, P(A{=}a \mid B{=}b, E{=}e)\, P(B{=}b)\, P(E{=}e)
$$

Computational cost: 15 additions and $16 \cdot 4 = 64$ products.

### Approach 2: interleave sums and products

Combine sums and products in a smart way: multiplications by constants can be taken out of a sum.

$$
P(J{=}T) = \sum_{a} P(J{=}T \mid A{=}a) \sum_{m} P(M{=}m \mid A{=}a) \sum_{b} P(B{=}b) \sum_{e} P(A{=}a \mid B{=}b, E{=}e)\, P(E{=}e)
$$

Computational cost: 9 additions and 16 products.

### Sharing computation across queries

The smart interleaving of sums and products speeds up the computation of joint probability queries. What if we next want to compute $P(B{=}T, J{=}T)$?

$$
P(B{=}T, J{=}T) = \sum_{a} P(J{=}T \mid A{=}a) \sum_{m} P(M{=}m \mid A{=}a)\; P(B{=}T) \sum_{e} P(A{=}a \mid B{=}T, E{=}e)\, P(E{=}e)
$$

There is a lot of shared computation with $P(J{=}T)$; smart caching of results can save time over multiple queries. Caching becomes especially handy for diagnostic queries:

$$
P(B{=}T \mid J{=}T) = \frac{P(B{=}T, J{=}T)}{P(J{=}T)}
$$

— exactly the probabilities we have just computed. More generally, $P(B \mid J{=}T) = \alpha\, P(B, J{=}T)$, and there are other queries for which the ordering of sums and products can be shared. The general technique is **recursive decomposition**: the nested sums over $A$, $M$, $B$, $E$ form a tree, and results are cached in that tree structure.

### Variable elimination

- Recursive decomposition: interleave sums and products *before* inference.
- Variable elimination: similar idea, but interleave sums and products one variable at a time *during* inference.

E.g., the query $P(J{=}T)$ requires eliminating $A$, $B$, $E$, $M$, and this can be done in different orders. Assume the order $M, E, B, A$:

1. Eliminate $M$: $\sum_{m} P(M{=}m \mid A{=}a) = 1$, so $M$ drops out.
2. Eliminate $E$: $\tau_1(A{=}a, B{=}b) = \sum_{e} P(A{=}a \mid B{=}b, E{=}e)\, P(E{=}e)$.
3. Eliminate $B$: $\tau_2(A{=}a) = \sum_{b} P(B{=}b)\, \tau_1(A{=}a, B{=}b)$.
4. Eliminate $A$: $P(J{=}T) = \sum_{a} P(J{=}T \mid A{=}a)\, \tau_2(A{=}a)$.

### Inference algorithms in Bayesian networks

Exact inference algorithms:

- Variable elimination [book]
- Recursive decomposition (Cooper; Darwiche)
- Symbolic inference (D'Ambrosio)
- Belief propagation algorithm (Pearl)
- Clustering and join-tree approach (Lauritzen and Spiegelhalter) [book]
- Arc reversal (Olmsted; Shachter)

Approximate inference algorithms:

- Monte Carlo methods [book]: forward sampling, likelihood sampling
- Variational methods

### Monte Carlo approaches

MC approximation: the probability is approximated using sample frequencies. Example, with $N$ total samples:

$$
P(B{=}T, J{=}T) \approx \frac{N_{B=T,\,J=T}}{N}
$$

**BBN sampling.** One sample gives one assignment of values to all variables. Generate the sample in a top-down manner, following the links: draw $B$ from $P(B)$ and $E$ from $P(E)$, then $A$ from $P(A \mid B{=}b, E{=}e)$, and finally $J$ and $M$ from $P(J \mid A{=}a)$ and $P(M \mid A{=}a)$. The slides step through one sample using the alarm-network CPTs:

| B | E | P(A=T \| B,E) | P(A=F \| B,E) |
|---|---|---|---|
| T | T | 0.95 | 0.05 |
| T | F | 0.94 | 0.06 |
| F | T | 0.29 | 0.71 |
| F | F | 0.001 | 0.999 |

with $P(B{=}T) = 0.001$, $P(E{=}T) = 0.002$, $P(J{=}T \mid A{=}F) = 0.05$, $P(M{=}T \mid A{=}T) = 0.7$, and $P(M{=}T \mid A{=}F) = 0.01$.

**Conditional probabilities.** The MC approximation of a conditional probability again uses sample frequencies:

$$
P(B{=}T \mid J{=}T) \approx \frac{N_{B=T,\,J=T}}{N_{J=T}}
$$

where $N_{J=T}$ counts the samples with $J = T$.

**Rejection sampling:** generate samples from the full joint by sampling the BBN, and use only the samples that agree with the condition; the remaining samples are rejected. Problem: many samples can be rejected.

### Likelihood weighting

Likelihood weighting avoids the inefficiencies of rejection sampling. Idea: generate only samples consistent with the evidence (the conditioning event) — if a variable's value is set by the evidence, it is not sampled. Problem: using simple counts is not enough, since these samples may occur with different probabilities. Solution: with every sample, keep a weight with which it should count towards the estimate:

$$
\tilde{P}(B{=}T \mid J{=}T) = \frac{\sum \text{weights of samples with } B{=}T \text{ and } J{=}T}{\sum \text{weights of samples with any value of } B \text{ and } J{=}T}
$$

The slides walk through two weighted samples with the evidence $J = T$, $M = F$ fixed: $B$, $E$, $A$ are drawn from their CPTs exactly as in plain BBN sampling, while $J$ and $M$ are simply set to their observed values.

If we calculated the estimate of $P(B{=}T \mid J{=}T, M{=}F)$ by dividing raw counts by the total number of samples, the answer would be biased: a sample that is less likely under $P(\mathbf{X} \mid e)$ may be generated more often, so the generated samples are not distributed according to $P(\mathbf{X} \mid e)$. How to make the samples consistent? Weight each sample by the probability with which it agrees with the conditioning evidence, $P(e \mid \text{parents})$. In the example, a sample with $A{=}F$ gets weight $P(J{=}T \mid A{=}F)\, P(M{=}F \mid A{=}F) = 0.0495$, while a sample with $A{=}T$ gets weight $0.27$.

How to compute the weights for a sample? Assume the query $P(B{=}T \mid J{=}T, M{=}F)$. With every sample $i$, keep the weight $w_i$ with which it agrees with the evidence; the estimate is

$$
\tilde{P}(B{=}T \mid J{=}T, M{=}F) = \frac{\sum_i w_i\, \mathbf{1}[B{=}T \text{ in sample } i]}{\sum_i w_i}
$$

where the denominator sums over the samples with any value of $B$ (all of which have $J{=}T$, $M{=}F$).
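The two exact approaches above can be sketched side by side. This is an illustrative sketch, not the lecture's code: the CPT values follow the slides, and $P(J{=}T \mid A{=}T) = 0.9$ is an assumed standard alarm-network value (it is consistent with the weight $0.27 = 0.9 \cdot 0.3$ computed later in the slides).

```python
# Exact inference for P(J=T) in the alarm network, two ways:
# (1) blind enumeration over the full joint, (2) variable elimination
# with order M, E, B, A.
TF = (True, False)
P_B = {True: 0.001, False: 0.999}
P_E = {True: 0.002, False: 0.998}
PA_T = {(True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001}   # P(A=T | B, E)
PJ_T = {True: 0.90, False: 0.05}                      # P(J=T | A); 0.90 assumed
PM_T = {True: 0.70, False: 0.01}                      # P(M=T | A)

def p_a(a, b, e):
    """P(A=a | B=b, E=e) from the table of P(A=T | B, E)."""
    return PA_T[(b, e)] if a else 1.0 - PA_T[(b, e)]

# (1) Blind enumeration: 2^4 = 16 joint terms, 4 products each (64 products).
blind = sum(PJ_T[a] * (PM_T[a] if m else 1.0 - PM_T[a]) * p_a(a, b, e)
            * P_B[b] * P_E[e]
            for b in TF for e in TF for a in TF for m in TF)

# (2) Variable elimination, order M, E, B, A.
# Eliminate M: sum_m P(M=m|A=a) = 1 for every a, so M drops out entirely.
tau1 = {(a, b): sum(p_a(a, b, e) * P_E[e] for e in TF)          # eliminate E
        for a in TF for b in TF}
tau2 = {a: sum(P_B[b] * tau1[(a, b)] for b in TF) for a in TF}  # eliminate B
ve = sum(PJ_T[a] * tau2[a] for a in TF)                         # eliminate A

print(blind, ve)  # both approaches agree: P(J=T) ≈ 0.052139
```

Under these CPTs both computations return the same value, which is the point: interleaving the sums changes the cost, not the answer.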
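Rejection sampling, as described above, can be sketched as follows: sample the full joint top-down along the links and keep only samples that agree with the evidence $J{=}T$. Again a hypothetical sketch with the slide CPTs and the assumed $P(J{=}T \mid A{=}T) = 0.9$.

```python
# Rejection sampling for the diagnostic query P(B=T | J=T).
import random

P_B, P_E = 0.001, 0.002
PA_T = {(True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001}   # P(A=T | B, E)
PJ_T = {True: 0.90, False: 0.05}                      # P(J=T | A); 0.90 assumed
PM_T = {True: 0.70, False: 0.01}                      # P(M=T | A)

def forward_sample(rng):
    """One assignment to all variables, generated top-down following the links."""
    b = rng.random() < P_B
    e = rng.random() < P_E
    a = rng.random() < PA_T[(b, e)]
    j = rng.random() < PJ_T[a]
    m = rng.random() < PM_T[a]
    return b, e, a, j, m

rng = random.Random(0)
accepted = hits = 0
for _ in range(200_000):
    b, _, _, j, _ = forward_sample(rng)
    if j:                 # reject every sample that disagrees with J=T
        accepted += 1
        hits += b
print(accepted, hits / accepted)  # estimate of P(B=T | J=T)
```

Since $P(J{=}T) \approx 0.05$ here, roughly 95% of the generated samples are rejected, which is exactly the inefficiency the lecture points out.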
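The likelihood-weighting estimator above can be sketched the same way: evidence variables are fixed rather than sampled, and each sample carries the weight $w = P(J{=}T \mid A{=}a)\, P(M{=}F \mid A{=}a)$, i.e. the probability of the evidence given its sampled parents (0.27 when $A{=}T$, 0.0495 when $A{=}F$, matching the slides). A sketch under the same assumed CPTs:

```python
# Likelihood weighting for P(B=T | J=T, M=F).
import random

P_B, P_E = 0.001, 0.002
PA_T = {(True, True): 0.95, (True, False): 0.94,
        (False, True): 0.29, (False, False): 0.001}   # P(A=T | B, E)
PJ_T = {True: 0.90, False: 0.05}                      # P(J=T | A); 0.90 assumed
PM_T = {True: 0.70, False: 0.01}                      # P(M=T | A)

rng = random.Random(0)
num = den = 0.0
for _ in range(100_000):
    b = rng.random() < P_B          # sample only the non-evidence variables
    e = rng.random() < P_E
    a = rng.random() < PA_T[(b, e)]
    # J=T and M=F are set, not sampled; the weight is the probability
    # of that evidence given the sampled value of its parent A.
    w = PJ_T[a] * (1.0 - PM_T[a])   # 0.27 if A=T, 0.0495 if A=F
    den += w
    if b:
        num += w
print(num / den)   # weighted estimate of P(B=T | J=T, M=F)
```

No sample is ever rejected here: every draw contributes its weight to the denominator, which is why likelihood weighting wastes none of the work that rejection sampling throws away.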
