# Class Notes for EECS 841 with Professor Potetz at KU

These 5 pages of class notes were uploaded by an elite notetaker on Friday, February 6, 2015. The notes belong to a course at the University of Kansas taught in Fall. Since upload, they have received 13 views.

Date Created: 02/06/15

EECS 841: Computer Vision, Brian Potetz, Fall 2008

## Lecture 34: Graph Cuts & Gibbs Sampling

Suggested reading: Boykov, Veksler & Zabih, "Fast approximate energy minimization via graph cuts," ICCV 2001.

### Common Methods of Statistical Inference in Computer Vision

Optimization problems in computer vision are often highly nonlinear and very large.

- Gradient descent: usable when P(X) is smooth. Be aware of smarter variations of gradient descent, such as L-BFGS, conjugate gradient descent, genetic algorithms, and simulated annealing.
- Belief propagation: powerful, but it can be slow and can fail for graphs with tight loops.
- Graph cuts: restrictive limitations on allowable potential functions.
- Sampling: can be slow, and it is hard to detect convergence.

### Conjugate Gradient Descent

For classical gradient descent, each search direction is orthogonal to the previous one, so the iterates can zig-zag while running along narrow channels. Conjugate gradient descent instead chooses each search direction so that it is orthogonal in the Mahalanobis space (conjugate to the previous directions), which avoids this zig-zagging.

### Recall: Newton's Method

Newton's method finds f(x) = 0. It can be used to find maxima of p(X), which occur when

f(X) = ∂p(X)/∂X = 0.

Using the 2nd derivative of p gives faster convergence and better results, but it requires computing the D×D Hessian matrix of p(X), where D is the dimensionality of X. This is intractable for large D.

### L-BFGS

L-BFGS approximates the Hessian from the last M search directions, where M is usually around 8 to 20. It does not need to explicitly construct the approximate Hessian: it can compute each search direction using only the gradient and the last M directions. Many software packages exist for L-BFGS.

### Graph Cuts

Graph cuts use the max-flow/min-cut theorem to optimize MRFs.

Max flow: given a weighted directed graph with a source and a sink, assign a flow value to each edge so that the total flow from source to sink is maximized, subject to:

1. At each vertex besides the source and sink, flow in = flow out.
2. 0 ≤ flow along an edge ≤ that edge's capacity.

Max flow can be computed by searching for augmenting paths; modern methods are even faster.

Max-flow = min-cut:

- Given any flow F and any cut S, the flow across the cut equals the flow into the target node t. Flow through the pipes cannot exceed their capacity, so Flow(F) ≤ Capacity(S), and therefore min-cut ≥ max-flow.
- Suppose flow F is maximal. Then there must be some cut S composed only of (1) full pipes moving towards t and (2) empty pipes moving away from t, because otherwise an augmenting path could be constructed and F would not be optimal. The capacity of this cut S equals the flow of F.
- Since min-cut ≥ max-flow, and this particular cut's capacity equals the max flow, min-cut = max-flow.

[Figures: a small example flow network, with a cut of capacity 30 and flow values 24 and 28 shown as augmenting paths are added.]

To optimize an MRF labeling with a graph cut, terminal edges carry the cost of pixel p not belonging to class α, and neighbor edges carry the cost of pixels p and q not belonging to the same class.

Graph-based segmentation: Felzenszwalb & Huttenlocher, "Efficient Graph-Based Image Segmentation," IJCV 2004.

### Gibbs Sampling

Goal: given a multidimensional distribution, try to draw samples from it. If a large database of samples can be accumulated, it is easy to find the most likely sample or the expected value of some variable.

Idea: guess x1 using the marginal P(x1); then guess x2 using the conditional P(x2 | x1); then guess x3 using the conditional P(x3 | x1, x2); and so on.

Algorithm:

1. Initialize x^(0) = (x1^(0), ..., xD^(0)).
2. For t = 1, ..., T:
   - Sample x1^(t) ~ p(x1 | x2^(t-1), ..., xD^(t-1))
   - Sample x2^(t) ~ p(x2 | x1^(t), x3^(t-1), ..., xD^(t-1))
   - ...
   - Sample xD^(t) ~ p(xD | x1^(t), ..., x(D-1)^(t))
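The zig-zag weakness of plain gradient descent in a narrow channel can be seen on a small quadratic. This is a minimal sketch, not from the lecture; the function f(x, y) = x² + 10y², the step size, and the iteration count are arbitrary choices:

```python
# Plain gradient descent on the narrow quadratic "channel" f(x, y) = x^2 + 10*y^2.
# The gradient is (2x, 20y); the mismatched curvatures along x and y are what
# cause zig-zagging for poorly chosen step sizes.

def grad_descent(x, y, lr=0.05, steps=200):
    for _ in range(steps):
        gx, gy = 2.0 * x, 20.0 * y       # gradient of f at (x, y)
        x, y = x - lr * gx, y - lr * gy  # step downhill
    return x, y

x, y = grad_descent(3.0, 1.0)  # both coordinates approach the minimum at (0, 0)
```

With a larger step size, the y coordinate overshoots and oscillates from sweep to sweep, which is the zig-zag behavior the notes describe.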
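Conjugate gradient descent, by contrast, makes each direction conjugate to the previous ones and finishes a D-dimensional quadratic in at most D steps. Below is a minimal sketch of linear CG for minimizing ½xᵀAx − bᵀx (equivalently, solving Ax = b); the 2×2 matrix A and vector b are arbitrary examples, not taken from the lecture:

```python
# Linear conjugate gradient for symmetric positive-definite A.
# Each search direction d is A-conjugate to the previous ones (orthogonal in
# the Mahalanobis sense), so a D-dimensional problem needs at most D steps.

def conjugate_gradient(A, b, x0, iters):
    x = x0[:]
    # Initial residual r = b - A x, which is also the first search direction.
    r = [bi - sum(aij * xj for aij, xj in zip(row, x)) for row, bi in zip(A, b)]
    d = r[:]
    for _ in range(iters):
        Ad = [sum(aij * dj for aij, dj in zip(row, d)) for row in A]
        rr = sum(ri * ri for ri in r)
        alpha = rr / sum(di * adi for di, adi in zip(d, Ad))  # exact line search
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        beta = sum(ri * ri for ri in r) / rr   # Fletcher-Reeves update
        d = [ri + beta * di for ri, di in zip(r, d)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = conjugate_gradient(A, b, [0.0, 0.0], iters=2)  # 2 steps suffice in 2-D
```

After two iterations the iterate lands exactly on the solution (1/11, 7/11), whereas plain gradient descent on the same problem would still be zig-zagging toward it.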
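Newton's method for maximization, as recalled in the notes, iterates on the zero of the first derivative. A one-dimensional sketch, where the density p(x) = exp(−(x − 3)²) is a made-up example rather than anything from the lecture:

```python
# Newton's method applied to maximizing p(x) = exp(-(x - 3)^2).
# Working with log p(x) = -(x - 3)^2, the maximum is where f(x) = d/dx log p = 0.

def newton_maximize(x, steps=10):
    for _ in range(steps):
        f = -2.0 * (x - 3.0)   # first derivative of log p
        fprime = -2.0          # second derivative of log p (the 1-D "Hessian")
        x = x - f / fprime     # Newton update
    return x

x_star = newton_maximize(0.0)  # converges to the maximum at x = 3
```

Because log p is quadratic here, one Newton step already lands on the maximum; in D dimensions the same update needs the D×D Hessian, which is the intractability the notes point out.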
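The augmenting-path computation of max flow can be sketched with BFS (the Edmonds-Karp variant). The small capacity matrix below is a made-up example, not the network drawn in the slides:

```python
from collections import deque

# Edmonds-Karp max flow: repeatedly find a shortest augmenting path from s to t
# by BFS in the residual graph, push the bottleneck capacity along it, and stop
# when no augmenting path remains (at which point the flow is maximal).

def max_flow(capacity, s, t):
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        parent = [-1] * n            # BFS tree for the augmenting path
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total             # no augmenting path: flow is maximal
        bottleneck = float("inf")    # smallest residual capacity on the path
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        v = t                        # augment: add forward flow, cancel reverse
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# Nodes: 0 = source, 3 = sink; capacity[u][v] is the capacity of edge u -> v.
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]
value = max_flow(cap, 0, 3)  # 5, matching the capacity of the cut around node 0
```

The returned value equals the capacity of the minimum cut (here, the two edges leaving the source), which is exactly the max-flow = min-cut fact the notes prove.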
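The Gibbs sampling loop in the notes can be sketched on a two-dimensional example. The bivariate Gaussian with correlation rho is a made-up target distribution (the lecture's pseudocode is the general D-dimensional version); its exact conditionals, x1 | x2 ~ N(rho·x2, 1 − rho²) and symmetrically for x2, are what make it convenient here:

```python
import random

# Gibbs sampling from a bivariate Gaussian with unit variances and
# correlation rho, alternately resampling each coordinate from its
# exact conditional given the current value of the other.

def gibbs_bivariate_gaussian(rho, iters, seed=0):
    rng = random.Random(seed)
    sd = (1.0 - rho * rho) ** 0.5
    x1, x2 = 0.0, 0.0                  # step 1: initialize x^(0)
    samples = []
    for _ in range(iters):             # step 2: for t = 1..T
        x1 = rng.gauss(rho * x2, sd)   # sample x1^(t) ~ p(x1 | x2^(t-1))
        x2 = rng.gauss(rho * x1, sd)   # sample x2^(t) ~ p(x2 | x1^(t))
        samples.append((x1, x2))
    return samples

samples = gibbs_bivariate_gaussian(rho=0.8, iters=20000)
mean_x1 = sum(s[0] for s in samples) / len(samples)        # approximates 0
corr = sum(s[0] * s[1] for s in samples) / len(samples)    # approximates rho
```

As the notes say, once a large database of samples is accumulated, expectations (here the mean and correlation) can be read off as simple averages.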
