Machine Learning

CS 5350, University of Utah
Instructor: Harold Daume
Uploaded by: Marian Kertzmann DVM
About this Document


These 2-page class notes were uploaded by Marian Kertzmann DVM on Monday, October 26, 2015. They belong to CS 5350 (Machine Learning) at the University of Utah, taught by Harold Daume in Fall.

Machine Learning CS 5350 / CS 6350
05 Apr 2007

Bayesian inference

The general problem we face in Bayesian inference is to compute the expectation of some function with respect to a probabilistic model with unknowns. In the simplest case, we want the expectation of a single variable; in more complex cases, we want the expectation of a complex function of all variables.

Suppose p(\theta) is our distribution of interest (some of \theta will be known, some not). We want

    Z = \mathbb{E}_{\theta \sim p}[f(\theta)] = \int \mathrm{d}\theta \, p(\theta) f(\theta).

For some cases this integral will be available in closed form (e.g., HMMs). For many (most) cases, however, it will not.

Let's say that \theta is discrete and univariate. Then we can compute the expectation by just summing over all possible values. Obviously, though, this won't scale well to high-dimensional or non-discrete variables. But let's see what happens if we try.

Integration by Summation

Suppose \theta is univariate, bounded, and continuous; without loss of generality, \theta \in [0, 1]. If we remember how we first learned integration, we can break [0, 1] into R equally sized rectangles. Then we have

    Z \approx \frac{1}{R} \sum_{i=1}^{R} p(i/R) \, f(i/R).

As R \to \infty, the approximation to Z becomes increasingly accurate. One way of thinking about this is that we have a set S containing R-many equally spaced points, and the integral is approximated by

    Z \approx \frac{1}{|S|} \sum_{\theta \in S} p(\theta) f(\theta).

Unfortunately, if \theta is D-dimensional, then we need to sum over R^D values of \theta.

Uniform Sampling

Instead of spacing the points \theta \in S evenly, let's place them randomly. This is the idea of "Monte Carlo" integration, which essentially means randomized integration. Uniform sampling is the simplest case: let S be a uniformly random sample of \theta values. Then we still have

    Z \approx \frac{1}{|S|} \sum_{\theta \in S} p(\theta) f(\theta).

This scales better computationally, but the number of samples required to guarantee a close approximation is still huge.

It's worth thinking about how hard this problem is. Think of a boat on a lake. We want to estimate the volume of the lake but cannot see the bottom. We can drive the boat to any position on the lake and drop an anchor, thereby measuring the depth there. How can we approximate the volume? Uniform sampling says to drive randomly around the lake, dropping the anchor at the flip of a coin. But there are many cases in which we can do better.

Importance Sampling

Here we use prior knowledge in the form of a helper distribution q that we expect to be similar to p and from which we can sample. It must have the same support as p (i.e., it must not be zero where p is nonzero). Then we compute

    Z = \mathbb{E}_{\theta \sim p}[f(\theta)]
      = \int \mathrm{d}\theta \, p(\theta) f(\theta)
      = \int \mathrm{d}\theta \, q(\theta) \frac{p(\theta)}{q(\theta)} f(\theta)
      = \mathbb{E}_{\theta \sim q}\left[ \frac{p(\theta)}{q(\theta)} f(\theta) \right].

So instead of computing an expectation with respect to p, we compute one with respect to q, and then we weight each sample by the ratio p(\theta)/q(\theta).

Rejection Sampling

The idea in rejection sampling is similar to importance sampling. Let q be a proposal distribution that satisfies p(x) \le M q(x) for some M < \infty. Now draw points from q and accept each point x with probability p(x) / (M q(x)). Compute expectations only over the accepted points.
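
To make the summation and uniform-sampling estimators concrete, here is a minimal Python sketch (not part of the original notes) comparing integration by summation with uniform Monte Carlo sampling on a one-dimensional example. The choices of p as a Beta(2, 5) density and f(\theta) = \theta^2 are purely illustrative, as are the function names; with those choices the exact value is Z = \mathbb{E}_p[\theta^2] = 3/28 \approx 0.107, which makes the approximation error easy to check.

import numpy as np
from scipy.stats import beta


def p(theta):
    """Target density: Beta(2, 5) on [0, 1] (illustrative choice, not from the notes)."""
    return beta.pdf(theta, 2, 5)


def f(theta):
    """Function whose expectation under p we want (illustrative choice)."""
    return theta ** 2


def integrate_by_summation(R):
    """Break [0, 1] into R equally sized rectangles: Z ~ (1/R) sum_i p(i/R) f(i/R)."""
    grid = np.arange(1, R + 1) / R
    return np.mean(p(grid) * f(grid))


def uniform_monte_carlo(n_samples, rng):
    """Replace the evenly spaced grid with uniformly random points on [0, 1]."""
    theta = rng.uniform(0.0, 1.0, size=n_samples)
    return np.mean(p(theta) * f(theta))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    exact = 3.0 / 28.0
    print("exact       ", exact)
    print("summation   ", integrate_by_summation(1000))
    print("uniform MC  ", uniform_monte_carlo(1000, rng))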
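
The importance-sampling and rejection-sampling estimators can be sketched on the same toy problem. This too is only an illustration under assumed choices: the proposal q is Uniform(0, 1), so q(\theta) = 1 on the support of p, and the constant M = 2.5 is picked slightly above the maximum of the Beta(2, 5) density (about 2.46) so that p(\theta) \le M q(\theta) everywhere.

import numpy as np
from scipy.stats import beta


def p(theta):
    """Target density: Beta(2, 5) on [0, 1] (illustrative choice, not from the notes)."""
    return beta.pdf(theta, 2, 5)


def f(theta):
    """Function whose expectation under p we want (illustrative choice)."""
    return theta ** 2


M = 2.5  # satisfies p(theta) <= M * q(theta) for q = Uniform(0, 1), since max p is ~2.46


def importance_sampling(n_samples, rng):
    """Draw theta ~ q, weight each sample by p(theta) / q(theta), and average."""
    theta = rng.uniform(0.0, 1.0, size=n_samples)  # q(theta) = 1 on [0, 1]
    weights = p(theta) / 1.0
    return np.mean(weights * f(theta))


def rejection_sampling(n_proposals, rng):
    """Draw theta ~ q, accept with probability p(theta) / (M q(theta)),
    and average f over the accepted points only."""
    theta = rng.uniform(0.0, 1.0, size=n_proposals)
    accept = rng.uniform(0.0, 1.0, size=n_proposals) < p(theta) / M
    return np.mean(f(theta[accept]))


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    print("exact               ", 3.0 / 28.0)
    print("importance sampling ", importance_sampling(5000, rng))
    print("rejection sampling  ", rejection_sampling(5000, rng))

Note the trade-off the notes describe: importance sampling reweights every draw, while rejection sampling discards draws; with this q and M the expected acceptance rate is 1/M = 0.4.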

