630 Class Note for CSE 598F at PSU
This 93-page set of class notes was uploaded by an elite notetaker on Friday, February 6, 2015. The notes belong to a course at Pennsylvania State University taught in Fall. Since upload, they have received 26 views.
Mean-Shift Tracking
R. Collins, CSE, Penn State. CSE 598G, Spring 2006.

Appearance-Based Tracking
Given the current frame and the previous object location, compute a likelihood over object locations and report the current location. Ingredients:
- a mode-seeking method (e.g., mean shift, Lucas-Kanade, particle filtering);
- an appearance model (color, intensity, or edge histograms).

Mean Shift
The mean-shift algorithm is an efficient approach to tracking objects whose appearance is defined by histograms (not limited to color).

Motivation
- To track non-rigid objects (like a walking person), it is hard to specify an explicit 2D parametric motion model.
- Appearances of non-rigid objects can, however, sometimes be modeled with color distributions.

Credits
Many slides borrowed from www.wisdom.weizmann.ac.il/~deniss/vision_spring04/files/mean_shift/mean_shift.ppt, "Mean Shift: Theory and Applications" by Yaron Ukrainitz and Bernard Sarel, Weizmann Institute, Advanced Topics in Computer Vision; even the slides on my own work. -- Bob

Intuitive Description
Objective: find the densest region of a set of points. Place a window over a region of interest, compute the center of mass of the points inside it, move the window so it is centered on that center of mass, and repeat until the window no longer moves.

What is Mean Shift?
A tool for finding modes in a set of data samples, the samples manifesting an underlying probability density function (PDF) in R^n:
- the PDF lives in a feature space: color space, scale space, actually any feature space you can conceive;
- mean shift performs non-parametric density GRADIENT estimation, the companion of non-parametric density estimation in PDF analysis.

Non-Parametric Density Estimation
Assumption: the data points are sampled from an underlying PDF.
We observe only real data samples drawn from that assumed underlying PDF; density estimation recovers an estimate of the PDF from the samples.

Appearance via Color Histograms
- The color distribution is discretized into a 1D histogram, normalized to unit weight.
- Quantize each channel: R' = R >> (8 - nbits), G' = G >> (8 - nbits), B' = B >> (8 - nbits).
- Total joint histogram size is (2^nbits)^3. Example: 4-bit encoding of the R, G, and B channels yields a histogram of size 16 x 16 x 16 = 4096.

Smaller Color Histograms
Histogram storage can be much, much smaller if we are willing to accept a loss in color resolvability: keep the marginal R, G, and B distributions instead of the joint distribution.
- Quantize as before: R' = R >> (8 - nbits), etc.
- Total histogram size is 3 x 2^nbits. Example: 4-bit encoding of the R, G, and B channels yields a histogram of size 3 x 16 = 48.

Normalized Color
(r, g, b) = (R, G, B) / (R + G + B). Normalized color divides out pixel luminance (brightness), leaving behind only chromaticity (color) information. The result is less sensitive to variations due to illumination and shading.

Intro to Parzen Estimation (aka Kernel Density Estimation)
A mathematical model of how histograms are formed. Assume continuous data points:
- convolve with a box filter of width W (e.g., [1 1 1]);
- take samples of the result with spacing W;
- the resulting value at a point u represents the count of data points falling in the range u - W/2 to u + W/2.

Example Histograms
Binning the same data with increasing W gives increased smoothing.

Why Formulate It This Way?
Because it generalizes from the box filter to other filters, for example the Gaussian, which acts as a smoothing filter.

Kernel Density Estimation (Parzen Windows)
Approximate the probability density by estimating the local density of points, the same idea as a histogram.
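The joint and marginal histogram sizes worked out above (16 x 16 x 16 = 4096 versus 3 x 16 = 48 bins for 4-bit channels) can be sketched in a few lines of NumPy; `quantized_histogram` and `marginal_histograms` are hypothetical helper names, not from the notes.

```python
import numpy as np

def quantized_histogram(pixels, nbits=4):
    """Joint RGB histogram with nbits per channel.

    pixels: (N, 3) uint8 array of RGB values.
    Returns a flat histogram of size (2**nbits)**3, normalized to unit weight.
    """
    shift = 8 - nbits                        # R >> (8 - nbits), as in the notes
    q = (pixels >> shift).astype(np.int64)   # each channel now in [0, 2**nbits)
    nbins = 1 << nbits
    idx = (q[:, 0] * nbins + q[:, 1]) * nbins + q[:, 2]
    hist = np.bincount(idx, minlength=nbins ** 3).astype(float)
    return hist / hist.sum()

def marginal_histograms(pixels, nbits=4):
    """Three per-channel marginals: 3 * 2**nbits bins instead of (2**nbits)**3."""
    shift = 8 - nbits
    nbins = 1 << nbits
    q = (pixels >> shift).astype(np.int64)
    return [np.bincount(q[:, c], minlength=nbins) / len(pixels) for c in range(3)]
```

With nbits=4 the joint histogram has 4096 bins and each marginal has 16, matching the sizes in the notes.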
Convolve the points with a window (kernel) function, e.g., a Gaussian, using a scale parameter (e.g., sigma). (Several slides here, unreadable in this scan, show density estimates of the same samples at different smoothing scales.)

Kernel Density Estimation (Parzen Windows): General Framework
The estimate is a function of some finite number of data points x_1 ... x_n. Kernel properties:
- normalized;
- symmetric;
- exponential weight decay.
In practice one uses kernels of two forms: the same 1D function applied to each dimension, or a function of the vector length only (radially symmetric).

Various Kernels
Examples: the Epanechnikov kernel, the uniform (flat) kernel, and the normal (Gaussian) kernel.

Key Idea
The superposition of kernels centered at each data point is equivalent to convolving the data points with the kernel.

Kernel Density Gradient Estimation
Give up estimating the PDF; estimate ONLY its gradient. Using the kernel form, the gradient of the superposition of kernels centered at each data point is equivalent to convolving the data points with the gradient of the kernel.

Computing the Mean Shift
Simple mean-shift procedure:
1. Compute the mean-shift vector m(x); in its standard form, m(x) = [sum_i x_i g(||(x - x_i)/h||^2) / sum_i g(||(x - x_i)/h||^2)] - x, where g = -k' is the negative derivative of the kernel profile.
2. Translate the kernel window by m(x), and repeat.

Mean-Shift Mode Detection
What happens if we reach a saddle point? Perturb the mode position and check whether we return to it. Updated mean-shift procedure:
- find all modes using the simple mean-shift procedure;
- prune modes by perturbing them (this finds saddle points and plateaus);
- prune nearby modes: take the highest mode in the window.
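The simple mean-shift procedure above (compute the mean-shift vector, translate the window, repeat until the step falls below a lower bound) can be sketched with a Gaussian kernel, for which the kernel-weighted mean of the samples is exactly the translated window center; `mean_shift_mode` is a hypothetical name, not from the notes.

```python
import numpy as np

def mean_shift_mode(points, start, bandwidth=1.0, tol=1e-5, max_iter=500):
    """Seek the nearest density mode from `start` by iterating the mean shift:
    replace the current estimate with the kernel-weighted mean of the samples
    (Gaussian kernel of scale `bandwidth`)."""
    x = np.asarray(start, dtype=float)
    pts = np.asarray(points, dtype=float)
    for _ in range(max_iter):
        d2 = np.sum((pts - x) ** 2, axis=1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)            # kernel weights
        x_new = (w[:, None] * pts).sum(axis=0) / w.sum()  # weighted center of mass
        if np.linalg.norm(x_new - x) < tol:               # step below lower bound
            return x_new
        x = x_new
    return x
```

Started anywhere in a cluster's attraction basin, the iteration climbs to that cluster's mode; the step size shrinks automatically near the maximum, as the notes observe.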
Mean-Shift Properties
- Automatic convergence speed: the mean-shift vector size depends on the gradient itself (adaptive gradient ascent).
- Near maxima, the steps are small and refined.
- Convergence is guaranteed for infinitesimal steps only ("infinitely convergent"); therefore set a lower bound on the step size.
- For the uniform kernel, convergence is achieved in a finite number of steps.
- The normal kernel exhibits a smooth trajectory but is slower than the uniform kernel.

Real Modality Analysis
Tessellate the space and run the procedure in parallel with many windows. The data points are traversed by the windows toward the modes; the window tracks signify the steepest-ascent directions. Adaptive mean shift adjusts the window size as it runs.

Mean-Shift Strengths and Weaknesses
Strengths:
- an application-independent tool;
- suitable for real data analysis;
- does not assume any prior shape (e.g., elliptical) on the data clusters;
- can handle arbitrary feature spaces;
- only ONE parameter to choose, and the window size h has a physical meaning (unlike k in k-means).
Weaknesses:
- the window size (bandwidth) selection is not trivial;
- an inappropriate window size can cause modes to be merged or can generate additional shallow modes; use an adaptive window size.

Clustering
- A cluster is all data points in the attraction basin of a mode.
- Attraction basin: the region for which all trajectories lead to the same mode.
- Synthetic examples show simple and complex modal structures.
- Real example: feature space is the L*u*v color representation; initial windows climb to mode candidates, nearby modes are pruned, and final clusters result. Note: not all trajectories in the attraction basin reach the same mode.

Non-Rigid Object Tracking
Applications: real-time object-based surveillance, driver assistance, video compression.

Mean-Shift Object Tracking: General Framework, Target Representation
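The attraction-basin clustering described above can be sketched by running a Gaussian-kernel mean shift from every sample and merging trajectories that land near the same mode; `mean_shift_cluster`, the merge radius, and the bandwidth values are assumptions for illustration, not from the notes.

```python
import numpy as np

def mean_shift_cluster(points, bandwidth=1.0, tol=1e-4, merge_dist=0.5, max_iter=300):
    """Cluster by attraction basin: run mean shift from every sample and merge
    trajectories that converge to (nearly) the same mode."""
    pts = np.asarray(points, dtype=float)
    modes, labels = [], np.empty(len(pts), dtype=int)
    for i in range(len(pts)):
        x = pts[i].copy()
        for _ in range(max_iter):            # Gaussian-kernel mean-shift ascent
            w = np.exp(-0.5 * np.sum((pts - x) ** 2, axis=1) / bandwidth ** 2)
            x_new = (w[:, None] * pts).sum(axis=0) / w.sum()
            if np.linalg.norm(x_new - x) < tol:
                break
            x = x_new
        for j, m in enumerate(modes):        # does it match an existing mode?
            if np.linalg.norm(x - m) < merge_dist:
                labels[i] = j
                break
        else:                                # no: record a new mode
            modes.append(x)
            labels[i] = len(modes) - 1
    return labels, modes
```

All points in one attraction basin receive the same label; the bandwidth plays exactly the role the notes warn about, since too large a value merges modes and too small a value creates shallow extra ones.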
Mean-Shift Object Tracking: General Framework, Target Localization

Using Mean-Shift for Tracking in Color Images
Two approaches:
1. Create a color likelihood image, with pixels weighted by similarity to the desired color (best for unicolored objects).
2. Represent the color distribution with a histogram; use mean shift to find the region that has the most similar distribution of colors.

Mean-Shift on Weight Images
Ideally we want an indicator function that returns 1 for pixels on the object we are tracking and 0 for all other pixels. Instead, we compute likelihood maps, where the value at a pixel is proportional to the likelihood that the pixel comes from the object we are tracking. Computation of the likelihood can be based on:
- color;
- texture;
- shape (boundary);
- predicted location.
Note: so far we have described mean shift as operating over a set of point samples.

Mean-Shift Tracking
Let the pixels form a uniform grid of data points, each with a weight (pixel value) proportional to the likelihood that the pixel is on the object we want to track. Perform the standard mean-shift algorithm using this weighted set of points.

Nice Property
Running mean shift with kernel K on weight image w is equivalent to performing gradient ascent in a virtual image formed by convolving w with some "shadow" kernel H. Note: the mode we are looking for is the mode of the location (x, y) likelihood, NOT the mode of the color distribution.

Kernel-Shadow Pairs
Given a convolution (shadow) kernel H, what is the corresponding mean-shift kernel K? Perform the change of variables r = ||a - x||^2 and rewrite H(a - x) = h(||a - x||^2) = h(r). Then the kernel K, written K(a - x) = k(r), must satisfy k(r) = -c h'(r): the mean-shift kernel profile is proportional to the negative derivative of the shadow profile. Examples: an Epanechnikov shadow pairs with a flat (uniform) mean-shift kernel; a Gaussian shadow pairs with a Gaussian kernel.

Using Mean-Shift on Color Models
The same two approaches as above: a color likelihood image for unicolored objects, or a histogram color distribution matched with mean shift.

High-Level Overview
Spatially smooth the similarity function by introducing a spatial kernel (Gaussian or box filter).
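Running mean shift on a weight image with a flat (uniform) kernel, as paired with the Epanechnikov shadow above, amounts to repeatedly moving the window to the centroid of the likelihood values it covers. A minimal sketch, where `track_mean_shift` and its parameters are hypothetical names, not from the notes:

```python
import numpy as np

def track_mean_shift(weight_img, start_xy, half_win=10, tol=0.5, max_iter=30):
    """One localization on a likelihood (weight) image: shift a square window
    to the weighted centroid of the likelihoods inside it until it settles."""
    h, w = weight_img.shape
    x, y = map(float, start_xy)
    for _ in range(max_iter):
        x0, x1 = int(max(0, x - half_win)), int(min(w, x + half_win + 1))
        y0, y1 = int(max(0, y - half_win)), int(min(h, y + half_win + 1))
        patch = weight_img[y0:y1, x0:x1]
        total = patch.sum()
        if total == 0:                      # no object likelihood in the window
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx = (xs * patch).sum() / total     # weighted center of mass
        ny = (ys * patch).sum() / total
        if np.hypot(nx - x, ny - y) < tol:  # converged
            return nx, ny
        x, y = nx, ny
    return x, y
```

The mode found is a mode of the (x, y) location likelihood, as the notes stress, not a mode of the color distribution used to build the weight image.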
Then take the derivative of the similarity with respect to the colors. This tells us which colors we need more or less of to make the current histogram more similar to the reference histogram. The result is the weighted mean shift we used before; however, the color weights are now computed on the fly and change from one iteration to the next.

Mean-Shift Object Tracking: Target Representation
The target is represented by its histogram in a quantized color space.

PDF Representation
The target model is centered at location 0; a target candidate is centered at location y. A similarity function f(y) = f(p(y), q) compares the candidate histogram p(y) with the model histogram q.

Smoothness of the Similarity Function
If we use color information only, spatial information is lost: large similarity variations occur for adjacent locations, the gradient of f is not smooth, and optimization is not robust. Solution: mask the target with an isotropic kernel in the spatial domain; f(y) then becomes smooth in y.

Finding the PDF of the Target Model and Candidate
Use a differentiable, isotropic, convex, monotonically decreasing kernel, so that peripheral pixels, which are most affected by occlusion and background interference, receive low weight. Let b(x) in {1, ..., m} be the color-bin index of pixel x. Then:
- probability of feature u in the model: q_u = C sum_i k(||x_i||^2) delta[b(x_i) - u];
- probability of feature u in the candidate: p_u(y) = C_h sum_i k(||(y - x_i)/h||^2) delta[b(x_i) - u];
where C and C_h are normalization factors and the kernel values act as pixel weights.

Similarity Function
Between the target model q and target candidate p(y), use f(y) = sum_u sqrt(p_u(y) q_u) (the Bhattacharyya coefficient).

Target Localization Algorithm: Approximating the Similarity Function
Linearly approximate f around the model location y_0: f(y) ~ (1/2) sum_u sqrt(p_u(y_0) q_u) + (1/2) sum_u p_u(y) sqrt(q_u / p_u(y_0)). The first term is independent of y; the second term is a density estimate as a function of y, with pixel weights w_i = sum_u sqrt(q_u / p_u(y_0)) delta[b(x_i) - u].

Maximizing the Similarity Function
The target representation provides sufficient discrimination, so there is one mode in the searched neighborhood. Applying mean shift: the original mean shift finds the mode of an unweighted density; this extended mean shift finds the mode of the weighted density.

About Kernels and Profiles
We use a special class of radially symmetric kernels, K(x) = c k(||x||^2); k is called the profile of the kernel K.
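The Bhattacharyya coefficient and the derived pixel weights sqrt(q_u / p_u) can be coded directly from the formulas above; `bhattacharyya` and `pixel_weights` are hypothetical helper names, not from the notes.

```python
import numpy as np

def bhattacharyya(p, q):
    """Similarity between two normalized histograms: sum_u sqrt(p_u * q_u)."""
    return float(np.sum(np.sqrt(np.asarray(p) * np.asarray(q))))

def pixel_weights(bin_idx, p, q, eps=1e-12):
    """Per-pixel weights sqrt(q_u / p_u) from the linearized similarity,
    where bin_idx[i] is the color-bin index u = b(x_i) of pixel i."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    w_bins = np.sqrt(q / np.maximum(p, eps))   # guard empty candidate bins
    return w_bins[np.asarray(bin_idx)]
```

Identical histograms give a coefficient of 1; pixels whose color is under-represented in the candidate relative to the model get weights above 1, pulling the mean-shift window toward them, which is exactly the "which colors we need more of" interpretation above.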
Mean-Shift Object Tracking: Choosing the Kernel
With the Epanechnikov kernel as the shadow, the derivative profile is constant, so the mean-shift update reduces to a simple weighted average of pixel locations. (The equations on this slide are unreadable in this scan.)

Adaptive Scale
Problem: the scale of the target changes in time, so the scale h of the kernel must be adapted. Solution: run the localization three times with different h and choose the h that achieves maximum similarity.

Results (from Comaniciu, Ramesh & Meer)
- Feature space: 16 x 16 x 16 quantized RGB. Target manually selected on the 1st frame. Average of 4 mean-shift iterations per frame.
- Further sequences with feature space 128 x 128 quantized RG, including one tracking "the man himself."

Handling Scale Changes: The Scale Selection Problem
- Kernel too big: poor localization, since background is included.
- Kernel too small: in uniformly colored regions the similarity is invariant to h, and a smaller h may achieve better similarity, so nothing keeps h from shrinking too small.
- Hence h must get neither too big nor too small.

Some Approaches to Size Selection
- Choose one scale and stick with it.
- Bradski's CAMSHIFT tracker computes principal axes and scales from the second-moment matrix of the blob. Assumes one blob and little clutter.
- CRM (Comaniciu, Ramesh & Meer) adapt the window size by +/-10% and evaluate using the Bhattacharyya coefficient. Although this does stop the window from growing too big, it is not sufficient to keep the window from shrinking too much.
- Comaniciu's variable-bandwidth methods: computationally complex.
- Rasmussen and Hager: add a border of pixels around the window and require that pixels in the window look like the object while pixels in the border do not (center-surround).

Tracking Through Scale Space: Motivation
(Comparison of the previous +/-10% method with the scale-space method that follows.)

Scale-Space Feature Selection
Form a resolution scale space by convolving the image with Gaussians of increasing variance.
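The CRM +/-10% rule described above can be sketched as a three-way evaluation; the `similarity_at` callback (run localization at a given bandwidth and return the achieved Bhattacharyya similarity) is an assumed interface for illustration, not from the notes.

```python
import numpy as np

def adapt_scale(similarity_at, h, step=0.10):
    """CRM-style scale adaptation sketch: evaluate the similarity achieved at
    bandwidths h*(1-step), h, and h*(1+step), and keep the best one."""
    candidates = [h * (1 - step), h, h * (1 + step)]
    scores = [similarity_at(c) for c in candidates]
    return candidates[int(np.argmax(scores))]
```

As the notes warn, this greedy rule stops the window from growing too big but does not prevent it from steadily shrinking when the similarity is invariant to h inside a uniformly colored region.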
Lindeberg proposes that the natural scale for describing a feature is the scale at which a normalized differential operator for detecting that feature achieves a local maximum, both spatially and in scale. For blob detection the Laplacian operator is used, leading to a search for modes in a LoG (Laplacian-of-Gaussian) scale space. In practice we approximate the LoG operator by a DoG (Difference-of-Gaussians).

Lindeberg's Theory
The Laplacian operator selects blob-like features. A LoG filter with scale sigma applied across scales gives a 3D scale-space representation; the best features are at the locations (x, sigma) that maximize the LoG response. Multi-scale feature selection keeps the strongest responses (e.g., the 250 strongest) of the 3D scale-space function; a large circle denotes a large scale.

Tracking Through Scale Space: Approximating LoG Using DoG
The 2D DoG filter with scale sigma is the difference between a 2D Gaussian with mean 0 and scale 1.6 sigma and a 2D Gaussian with mean 0 and scale sigma. Why DoG?
- Gaussian pyramids are created faster.
- DoG filters at multiple scales form a scale-space filter bank.
- The Gaussian can be used as a mean-shift kernel (a 3D spatial kernel).

Tracking Through Scale Space Using Lindeberg's Theory
Recall the model and candidate histograms, color-bin indices, and pixel weights; the weight image holds the likelihood that each candidate pixel belongs to the target. Combine a 3D spatial kernel with a 1D scale kernel (Epanechnikov) over the 3D scale-space representation. Modes are blobs in the scale-space neighborhood centered at the current location and scale, so we need a mean-shift procedure that finds local modes in (x, sigma).

Outline of Approach
- Build a designer "shadow" kernel that generates the desired DoG scale space when convolved with the weight image w(x).
- Change variables and take derivatives of the shadow kernel to find the corresponding mean-shift kernels, using the kernel-shadow relationship shown earlier.
- Given an initial estimate (x_0, s_0), apply the mean-shift algorithm to find the nearest local mode in scale space.
Note: using mean shift we do NOT have to explicitly generate the scale space.
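The DoG approximation of the LoG can be sketched in 1D for brevity (the notes use 2D filters; the 1.6 sigma factor is the one quoted above). `gaussian_1d` and `dog_1d` are hypothetical names, not from the notes.

```python
import numpy as np

def gaussian_1d(sigma, radius=None):
    """Normalized 1D Gaussian kernel of scale sigma."""
    radius = radius if radius is not None else int(3 * sigma + 0.5)
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def dog_1d(sigma, k=1.6):
    """Difference-of-Gaussians kernel G(k*sigma) - G(sigma), the standard
    approximation of the (scale-normalized) Laplacian of Gaussian."""
    r = int(3 * k * sigma + 0.5)             # common support for both Gaussians
    return gaussian_1d(k * sigma, r) - gaussian_1d(sigma, r)
```

Because both Gaussians are normalized, the DoG sums to zero and has a negative center lobe, the band-pass, blob-selecting shape that makes scale-space modes correspond to blobs at their natural scale.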
Scale-Space Kernel
(Figure: the scale-space kernel factors into spatial kernels combined with a scale kernel; unreadable in this scan.)

Tracking Through Scale Space: Applying Mean-Shift
Use interleaved spatial/scale mean shift:
- spatial stage: fix sigma and look for the best x;
- scale stage: fix x and look for the best sigma;
- iterate the two stages until both x and sigma converge.

Sample Results
Sequences compared with no scaling, with the +/-10% rule, and with scale-space tracking.