Advanced Computer Vision
Advanced Computer Vision CAP 6412
University of Central Florida
This 28-page set of class notes was uploaded by Khalil Conroy on Thursday, October 22, 2015. The notes belong to CAP 6412 at the University of Central Florida, taught by Staff in Fall.
Chapter 8 of Bishop's Book: Graphical Models

Review of Probability

A probability density p(x) is defined over the possible values of x. It is used to find the probability of x falling in some range:

  P(x \in (a, b)) = \int_a^b p(x) dx

For continuous variables, the probability of any single value is technically 0. For discrete variables, p(x) is called a probability mass function.

The density function must satisfy two conditions:

  p(x) \ge 0        \int_{-\infty}^{\infty} p(x) dx = 1

Two important rules:

Sum rule: p(x) = \int p(x, y) dy. Here p(x) is the marginal probability, obtained by marginalizing out y.

Product rule: p(x, y) = p(y|x) p(x).

Graphical Models

Consider the distribution p(a, b, c). By the product rule it can be rewritten as

  p(a, b, c) = p(c|a, b) p(a, b)

which can be rewritten again as

  p(a, b, c) = p(c|a, b) p(b|a) p(a)

Making a Graphical Model

Introduce one node per random variable. Using the factorization p(c|a, b) p(b|a) p(a), add one directed edge per conditioning relationship: each variable behind the conditioning bar gets an edge into the variable it conditions.

What's the distribution for this graph? [Figure: directed graph over x_1, ..., x_7]

  p(x) = p(x_1) p(x_2) p(x_3) p(x_4|x_1, x_2, x_3) p(x_5|x_1, x_3) p(x_6|x_4) p(x_7|x_4, x_5)

Big Advantage: Many Fewer Parameters

Assume each variable can take K states. How many numbers do we need to specify this factorized distribution? What if we just wanted to express it as one giant joint distribution?

What do you do with distributions?
- Draw samples.
- Infer marginal distributions over variables.

Ancestral Sampling
1. Start at the low-numbered nodes and draw samples.
2. Work your way through the graph, sampling from each conditional distribution given the values already sampled for its parents.

Generative Models

The graph, along with the ancestral sampling method, is a model of how the data is generated. The model expresses a causal process for generating the data. Such models are often called generative models: they show how to generate the data.

Example of a Generative Model of Images of Objects

[Figure: generative models of a visual scene — TDP, Hierarchical DP, Transformed DP. From Sudderth, Torralba, Freeman, and Willsky, NIPS 2005.]

Undirected Models

Directed models are useful, but have some complicating limitations:
- Determining conditional independence can be hard.
- Not all relationships can be expressed causally.

We can drop the arrows to make an undirected model, also called a Markov Network or Markov Random Field (MRF).

Undirected Graphical Models

To understand undirected models we need the notion of a clique: a subset of nodes with links between all nodes in the subset. A maximal clique is a clique such that if you add any other node to it, it is no longer a clique.

Conditional Independence

a is conditionally independent of b given c when

  p(a|b, c) = p(a|c)

or, equivalently,

  p(a, b|c) = p(a|c) p(b|c)

Conditional Independence in an MRF

Conditioned on its neighbors, a node is conditionally independent from the rest of the nodes in the graph.

Representing the Distribution in an Undirected Graph

The form of the distribution is

  p(x) = (1/Z) \prod_C \psi_C(x_C)

The distribution is formed from potential functions \psi_C on the maximal cliques of the graph; they represent compatibility between the states of different variables. Z is a normalization constant,

  Z = \sum_x \prod_C \psi_C(x_C)

and is also known as the partition function.

Converting a Directed Model to an Undirected Model

You moralize the graph: "marry the parents" by linking all parents of each node, then drop the arrows. You can then use inference techniques for undirected graphs.

Inference in Graphical Models

Let's say that I've analyzed the problem and have designed a graphical model that relates the observations to quantities that I would like to estimate. I get my observations. How do I estimate the hidden, or latent, quantities?

Inference and Conditional Independence

Conditional independence is important for MRF models because it makes inference much easier. Consider a chain. Task: find the marginal distribution of some variable x_n.

Naive (Slow) Way

Use the sum rule and do the sums:

  p(x_n) = \sum_{x_1} \cdots \sum_{x_{n-1}} \sum_{x_{n+1}} \cdots \sum_{x_N} p(x)

Implication: if you have K states per node and N nodes, this will take on the order of K^N operations.

Taking Advantage of the Structure

The distribution of this chain x_1, ..., x_N is

  p(x) = (1/Z) \psi_{1,2}(x_1, x_2) \psi_{2,3}(x_2, x_3) \cdots \psi_{N-1,N}(x_{N-1}, x_N)

4-Node Example

  p(x_2) = (1/Z) \sum_{x_1} \sum_{x_3} \sum_{x_4} \psi_{1,2}(x_1, x_2) \psi_{2,3}(x_2, x_3) \psi_{3,4}(x_3, x_4)

Start Re-Arranging Sums

  p(x_2) = (1/Z) \sum_{x_1} \psi_{1,2}(x_1, x_2) \sum_{x_3} \psi_{2,3}(x_2, x_3) \sum_{x_4} \psi_{3,4}(x_3, x_4)

Make a Substitution

  m_3(x_3) = \sum_{x_4} \psi_{3,4}(x_3, x_4)

which leads to

  p(x_2) = (1/Z) \sum_{x_1} \psi_{1,2}(x_1, x_2) \sum_{x_3} \psi_{2,3}(x_2, x_3) m_3(x_3)

Do it again:

  m_2(x_2) = \sum_{x_3} \psi_{2,3}(x_2, x_3) m_3(x_3)

to get

  p(x_2) = (1/Z) \sum_{x_1} \psi_{1,2}(x_1, x_2) m_2(x_2)

How many operations did this require?
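The parameter-count question for the seven-node factorization can be checked with a short sketch. It assumes the standard counting convention that a conditional probability table needs (K - 1) free numbers per joint setting of its parents, since each column must sum to 1; the function names are illustrative, not from the notes.

```python
# Parameter count for the factorization
# p(x1) p(x2) p(x3) p(x4|x1,x2,x3) p(x5|x1,x3) p(x6|x4) p(x7|x4,x5),
# where every variable takes K states.

def factor_params(K, n_parents):
    # Free parameters in one conditional table: (K - 1) numbers for each
    # of the K**n_parents joint settings of the parents.
    return (K - 1) * K ** n_parents

def factored_total(K):
    # Number of parents of x1..x7 in the factorization above.
    parent_counts = [0, 0, 0, 3, 2, 1, 2]
    return sum(factor_params(K, n) for n in parent_counts)

def joint_total(K):
    # One giant joint table over 7 variables, minus 1 for normalization.
    return K ** 7 - 1

print(factored_total(2), joint_total(2))    # 21 vs 127
print(factored_total(10), joint_total(10))  # 10917 vs 9999999
```

The gap widens quickly with K: the factorized model grows with the largest parent set (K^3 here), while the joint table grows as K^7.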
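Ancestral sampling on the three-variable model p(c|a, b) p(b|a) p(a) can be sketched as below. The probability-table numbers are made up for illustration; only the sampling order (parents before children) comes from the notes.

```python
import random

# Made-up conditional probability tables for binary a, b, c.
P_A1 = 0.6                                            # P(a = 1)
P_B1_GIVEN_A = {0: 0.3, 1: 0.8}                       # P(b = 1 | a)
P_C1_GIVEN_AB = {(0, 0): 0.1, (0, 1): 0.5,
                 (1, 0): 0.4, (1, 1): 0.9}            # P(c = 1 | a, b)

def sample_bernoulli(p1):
    # Draw 1 with probability p1, else 0.
    return 1 if random.random() < p1 else 0

def ancestral_sample():
    # Visit nodes in topological order: a, then b given a, then c given a, b.
    a = sample_bernoulli(P_A1)
    b = sample_bernoulli(P_B1_GIVEN_A[a])
    c = sample_bernoulli(P_C1_GIVEN_AB[(a, b)])
    return a, b, c

random.seed(0)
print(ancestral_sample())
```

Because each draw only conditions on values already sampled, one pass through the graph yields a full joint sample, which is what makes the directed model a generative model of the data.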
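The sum re-arrangement on the 4-node chain can be checked numerically: the naive triple sum and the message-passing version with m_3 and m_2 must give the same unnormalized marginal of x_2. The random potential tables are placeholders; any nonnegative values would do.

```python
import itertools
import random

K = 3  # states per node
random.seed(1)

def rand_table():
    # A K x K table of nonnegative compatibility values (a pairwise potential).
    return [[random.random() for _ in range(K)] for _ in range(K)]

psi12, psi23, psi34 = rand_table(), rand_table(), rand_table()

def naive_marginal(x2):
    # Sum all K**3 terms over x1, x3, x4 of psi12 * psi23 * psi34.
    return sum(psi12[x1][x2] * psi23[x2][x3] * psi34[x3][x4]
               for x1, x3, x4 in itertools.product(range(K), repeat=3))

def message_passing_marginal(x2):
    # m3(x3) = sum_x4 psi34(x3, x4)
    m3 = [sum(psi34[x3][x4] for x4 in range(K)) for x3 in range(K)]
    # m2(x2) = sum_x3 psi23(x2, x3) * m3(x3)
    m2 = sum(psi23[x2][x3] * m3[x3] for x3 in range(K))
    # Unnormalized p(x2) = m2(x2) * sum_x1 psi12(x1, x2)
    return m2 * sum(psi12[x1][x2] for x1 in range(K))
```

Each substitution is a sum over one variable for each value of its neighbor, i.e. about K^2 operations, so the chain marginal costs on the order of N * K^2 instead of K^N.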