# Class Note for EECS 841 with Professor Potetz at KU

These 4-page class notes were uploaded by an elite notetaker on Friday, February 6, 2015. They belong to a course at Kansas taught in the Fall and have received 16 views since upload.

## Factor Graphs

Factor graphs are the most general graphical model. They are just like MRFs, except that a factor $\Phi$ does not have to range over a graph clique. A factor graph contains two kinds of nodes: variable nodes and factor nodes.

## Shape from Shading Using LBP

**Image constraint nodes** (single pixel): the potential function is a level set of the shading equation. Any BRDF could be used.

**Integrability constraint nodes** (overcomplete representation): there are linear dependencies between the variables, because real surfaces have zero curl. Writing $p = \partial z / \partial x$ and $q = \partial z / \partial y$, the discrete zero-curl condition is

$$p_{x,y+1} - p_{x,y} = q_{x+1,y} - q_{x,y}.$$

**Smoothness nodes** (single pixel): shape from shading is highly ambiguous, so it requires a strong understanding of what 3D shapes are typical in real scenes. This is known as a statistical prior. For now, we just exploit smoothness.

## Common Methods of Statistical Inference in Computer Vision

- **Gradient descent**: when P(X) is smooth and all else fails.
- **Belief propagation**: powerful, but can be slow and can fail for graphs with tight loops.
- **Graph cuts**: restrictive limitations on allowable potential functions.
- **Sampling**: can be slow; hard to detect convergence.

## Sum-Product Belief Propagation

For any tree-shaped factor graph, exact marginals are computed by passing messages up and then down the tree. In standard sum-product notation, the variable-to-factor and factor-to-variable messages are

$$m_{x \to f}(x) = \prod_{h \in N(x) \setminus \{f\}} m_{h \to x}(x),$$

$$m_{f \to x}(x) = \sum_{\mathbf{y}} f(x, \mathbf{y}) \prod_{y \in N(f) \setminus \{x\}} m_{y \to f}(y),$$

where the sum ranges over the joint states $\mathbf{y}$ of the other variables attached to $f$. This finds all messages in $O(N M^2)$ time, where $M$ is the number of states of each $x_i$ and $N$ is the number of nodes of $X$.

## Loopy Belief Propagation

What if our graph is not a tree? The equations for belief propagation are entirely local; the proof of exactness depends on the tree structure, but the equations can be applied to any graph. The beliefs are computed as

$$b(x) = \frac{1}{Z} \prod_{h \in N(x)} m_{h \to x}(x).$$

Despite this lack of formal justification, loopy belief propagation performs well for many real-world problems. LBP was later justified as minimizing an approximate error metric between the marginals and factored distributions. This approximation is the Bethe free energy, popular in statistical physics.
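As a minimal sketch of the message-passing equations above (not from the notes), the following computes an exact marginal on a tiny tree-shaped factor graph — a 3-variable chain — and checks it against brute-force summation over the joint. The pairwise potentials `f01` and `f12` are made-up illustration values.

```python
import itertools

# Tiny tree-shaped factor graph: a 3-variable chain x0 - f01 - x1 - f12 - x2.
# Each variable has 2 states; the potentials below are arbitrary examples.
STATES = [0, 1]
f01 = {(a, b): 1.0 + 2 * a + b for a in STATES for b in STATES}
f12 = {(b, c): 1.0 + b * c for b in STATES for c in STATES}

def marginal_x1_bp():
    # Leaf-to-root message passing with root x1. The leaves x0 and x2 send
    # uniform messages (empty products = 1), so each factor-to-variable
    # message just sums the potential over the leaf's states.
    m_f01_to_x1 = [sum(f01[(a, b)] for a in STATES) for b in STATES]
    m_f12_to_x1 = [sum(f12[(b, c)] for c in STATES) for b in STATES]
    # Belief at x1 = product of incoming messages, then normalize.
    belief = [m_f01_to_x1[b] * m_f12_to_x1[b] for b in STATES]
    z = sum(belief)
    return [v / z for v in belief]

def marginal_x1_brute_force():
    # Direct summation over the full joint, for comparison.
    totals = [0.0, 0.0]
    for a, b, c in itertools.product(STATES, repeat=3):
        totals[b] += f01[(a, b)] * f12[(b, c)]
    z = sum(totals)
    return [v / z for v in totals]

bp = marginal_x1_bp()
bf = marginal_x1_brute_force()
print(bp, bf)  # the two marginals agree, since the graph is a tree
```

On a loopy graph the same local message updates would be iterated to (hoped-for) convergence rather than passed once up and down.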
The Bethe approximation tends to degrade for factor graphs with many tight loops or nearly-deterministic potential functions. However, good performance has been achieved even under these conditions.

## State of the Art: Stereo

Klaus, Sormann, and Karner, "Segment-based stereo matching using belief propagation and a self-adapting dissimilarity measure," ICPR 2006. [Figures: left and right input images, true depth, and inferred depth.]

## Shape from Shading Results

Compared results: original input; (b) linear constraint nodes; (c) Lee & Kuo (mean squared error 3390); (d) Zheng & Chellappa (mean squared error 4240). Good image re-renderings imply that the Lambertian and integrability constraints were wholly satisfied; improved performance can only be achieved through improving the spatial prior model p(Z).

## Gradient Descent for Computer Vision

Optimization problems in computer vision are often highly nonlinear and very large. Be aware of smarter variations of gradient descent:

- Conjugate gradient descent
- L-BFGS
- Genetic algorithms and simulated annealing

### Conjugate Gradient Descent

For classical gradient descent, each search direction is orthogonal to the previous one, which can zigzag while running along narrow channels. Conjugate gradient descent instead conjugates each search direction so that it is orthogonal in Mahalanobis space.

### Recall Newton's Method

Newton's method finds $f(x) = 0$. It can be used to find maxima of $p(X)$, which occur when

$$f(X) = \frac{\partial}{\partial X} p(X) = 0.$$

Utilizing the second derivative of $p$ gives faster convergence and better results, but requires computing the $D \times D$ Hessian matrix of $p(X)$, where $D$ is the dimensionality of $X$. This is intractable for large $D$.
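A minimal sketch of the conjugate-gradient idea (not from the notes): linear CG with the Fletcher–Reeves update applied to the quadratic $f(x) = \tfrac{1}{2}x^\top A x - b^\top x$ on a deliberately ill-conditioned $2\times 2$ system — the "narrow channel" where plain gradient descent zigzags. The matrix `A` and vector `b` are made-up example values.

```python
# Linear conjugate gradient on f(x) = 0.5 x^T A x - b^T x.
# A is ill-conditioned (curvature 10 vs 1), the regime where plain
# steepest descent zigzags; CG finishes in at most dim(x) = 2 steps.
A = [[10.0, 0.0], [0.0, 1.0]]
b = [10.0, 1.0]  # minimizer is x* = A^{-1} b = [1, 1]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def conjugate_gradient(A, b, steps=2):
    x = [0.0, 0.0]
    r = [bi - ai for bi, ai in zip(b, matvec(A, x))]  # residual = -gradient
    d = r[:]                                          # first direction: steepest descent
    for _ in range(steps):
        Ad = matvec(A, d)
        alpha = dot(r, d) / dot(d, Ad)                # exact line search along d
        x = [xi + alpha * di for xi, di in zip(x, d)]
        r_new = [ri - alpha * adi for ri, adi in zip(r, Ad)]
        beta = dot(r_new, r_new) / dot(r, r)          # Fletcher-Reeves coefficient
        d = [rn + beta * di for rn, di in zip(r_new, d)]  # new A-conjugate direction
        r = r_new
    return x

x = conjugate_gradient(A, b)
print(x)  # reaches the minimizer [1.0, 1.0] in 2 steps
```

Successive directions are $A$-conjugate ($d_i^\top A d_j = 0$), which is exactly the "orthogonal in Mahalanobis space" property mentioned above.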
## L-BFGS

L-BFGS approximates the Hessian from the last $M$ search directions, where $M$ is usually around 8-20. L-BFGS does not need to explicitly construct the approximate Hessian; it can compute each search direction using only the gradient and the last $M$ directions. Many software packages exist for L-BFGS.

## Graph Cuts: Using the Max-Flow Min-Cut Theorem to Optimize MRFs

**Max flow:** given a weighted directed graph with a source and a sink, assign flow values to each edge so that the total flow from source to sink is maximized, subject to:

- at each vertex besides the source and sink, flow in = flow out;
- 0 ≤ flow along each edge ≤ edge capacity.

**Min cut:** given a weighted directed graph with a source and a sink, partition the nodes into sets $S$ and $T$ so that the sum of the edge weights between $S$ and $T$ is minimized.

Min cut seems harder to compute, but by the max-flow min-cut theorem, maximum flow = minimum cut.
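To make the "only the gradient and the last M directions" claim concrete, here is a sketch of the standard L-BFGS two-loop recursion (not from the notes) minimizing a small made-up quadratic with Armijo backtracking; no Hessian is ever formed.

```python
# L-BFGS two-loop recursion sketch: the search direction is built from the
# gradient plus the last M stored (s, y) pairs, never forming the Hessian.
def f(x):
    return 0.5 * (10.0 * x[0] ** 2 + x[1] ** 2)  # example objective, minimum at 0

def grad(x):
    return [10.0 * x[0], x[1]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def lbfgs_direction(g, memory):
    # memory: list of (s, y) pairs, oldest first; s = x_{k+1}-x_k, y = g_{k+1}-g_k
    q = g[:]
    alphas = []
    for s, y in reversed(memory):               # first loop: newest to oldest
        rho = 1.0 / dot(y, s)
        a = rho * dot(s, q)
        alphas.append((a, rho))
        q = [qi - a * yi for qi, yi in zip(q, y)]
    if memory:                                  # initial scaling gamma = s.y / y.y
        s, y = memory[-1]
        gamma = dot(s, y) / dot(y, y)
        q = [gamma * qi for qi in q]
    for (a, rho), (s, y) in zip(reversed(alphas), memory):  # second loop: oldest first
        beta = rho * dot(y, q)
        q = [qi + (a - beta) * si for qi, si in zip(q, s)]
    return [-qi for qi in q]                    # descent direction

x, memory, M = [3.0, -4.0], [], 5
for _ in range(30):
    g = grad(x)
    if dot(g, g) < 1e-18:
        break                                   # converged
    d = lbfgs_direction(g, memory)
    step, fx, slope = 1.0, f(x), dot(g, d)      # Armijo backtracking line search
    while f([xi + step * di for xi, di in zip(x, d)]) > fx + 1e-4 * step * slope:
        step *= 0.5
    x_new = [xi + step * di for xi, di in zip(x, d)]
    g_new = grad(x_new)
    s = [a - c for a, c in zip(x_new, x)]
    y = [a - c for a, c in zip(g_new, g)]
    if dot(y, s) > 1e-12:                       # keep only curvature-positive pairs
        memory.append((s, y))
        memory = memory[-M:]                    # retain at most the last M pairs
    x = x_new
print(x)
```

Each iteration costs only O(MD) vector operations, versus the O(D²) storage (and worse) that an explicit Hessian would need.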
Compute the max flow by searching for augmenting paths. The simplest augmenting-path algorithms are $O(n^2 m)$; modern methods are even faster.
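A minimal augmenting-path sketch (not from the notes): Edmonds-Karp, which finds shortest augmenting paths by BFS in the residual graph, then reads the min cut off the final residual graph to verify the max-flow min-cut theorem. The 4-node graph and capacities are made-up example values.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp: BFS for augmenting paths; also returns the min-cut value."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS in the residual graph for a shortest augmenting path.
        parent = [-1] * n
        parent[source] = source
        queue = deque([source])
        while queue and parent[sink] == -1:
            u = queue.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    queue.append(v)
        if parent[sink] == -1:
            break  # no augmenting path left: the flow is maximum
        # Bottleneck residual capacity along the path.
        bottleneck, v = float("inf"), sink
        while v != source:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        # Push flow along the path (negative entries cancel reverse flow).
        v = sink
        while v != source:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck
    # Nodes reachable from the source in the final residual graph form S;
    # the capacity of the resulting (S, T) cut equals the max flow.
    S, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in range(n):
            if v not in S and capacity[u][v] - flow[u][v] > 0:
                S.add(v)
                queue.append(v)
    cut = sum(capacity[u][v] for u in S for v in range(n) if v not in S)
    return total, cut

cap = [[0, 3, 2, 0],   # node 0 = source
       [0, 0, 1, 2],
       [0, 0, 0, 3],
       [0, 0, 0, 0]]   # node 3 = sink
flow_value, cut_value = max_flow(cap, 0, 3)
print(flow_value, cut_value)  # prints 5 5: max flow equals min cut
```

Graph-cut optimization of MRFs builds exactly such a source/sink network from the potential functions (subject to the restrictions noted above) and reads the labeling off the min cut.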
