# 611 Class Note for CSE 598F at PSU

These 40 pages of class notes were uploaded by an elite notetaker on Friday, February 6, 2015. The notes belong to a course at Pennsylvania State University taught in the Fall. Since upload, they have received 52 views.

*Class notes transcribed from Robert Collins' CSE598G slides: "Mean-Shift Blob Tracking through Scale Space" (R. Collins, CVPR 2003).*

### Abstract

- Mean-shift tracking: choosing the scale of the kernel is an issue.
- Scale-space feature selection provides inspiration.
- Perform mean-shift with a scale-space kernel to optimize for blob location and scale.

### Nice Property

Running mean-shift with kernel $K$ on weight image $w$ is equivalent to performing gradient ascent in a virtual image formed by convolving $w$ with some "shadow" kernel $H$:

$$\Delta x = \frac{\sum_a K(a-x)\,w(a)\,(a-x)}{\sum_a K(a-x)\,w(a)} = c\,\nabla_x \sum_a H(a-x)\,w(a)$$

### Size Does Matter

Mean-shift is related to kernel density estimation (a.k.a. Parzen estimation), so choosing the correct scale of the mean-shift kernel is important. [Slides show tracking failures with a kernel that is too small and with a fixed scale.]

### Some Approaches to Size Selection

- Choose one scale and stick with it.
- Bradski's CAMSHIFT tracker computes principal axes and scales from the second moment matrix of the blob. Assumes one blob and little clutter.
- CRM (Comaniciu, Ramesh, and Meer) adapt the window size by ±10% and evaluate using the Bhattacharyya coefficient. Although this does stop the window from growing too big, it is not sufficient to keep the window from shrinking too much.
- Comaniciu's variable-bandwidth methods: computationally complex.
- Rasmussen and Hager add a border of pixels around the window and require that pixels in the window should look like the object while pixels in the border should not (center-surround).

### Scale-Space Theory

Basic idea: different scales are appropriate for describing different objects in the image, and we may not know the correct scale/size ahead of time. [Slide shows the same image at sizes 512, 256, 64, and 32.]

### Scale Selection

Scale selection principle (T. Lindeberg): in the absence of other evidence, assume that a scale level at which some (possibly non-linear) combination of normalized derivatives assumes a local maximum over scales can be treated as reflecting a characteristic length of a corresponding structure in the data.
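The "Nice Property" above is easy to check numerically. The following is a minimal 1D sketch (my own illustration, not code from the slides; the grid size, blob position, and kernel scale are made-up values): iterating the mean-shift offset with a Gaussian kernel on a synthetic weight image climbs to the mode of the (shadow-smoothed) image.

```python
import numpy as np

# Minimal 1D sketch of one mean-shift step with a Gaussian kernel K.
# Iterating the offset climbs the "virtual image" w convolved with the
# shadow kernel H (a Gaussian shadow pairs with a Gaussian kernel).
def mean_shift_offset(x, positions, weights, sigma=3.0):
    k = np.exp(-0.5 * ((positions - x) / sigma) ** 2)  # K(a - x)
    return np.sum(k * weights * (positions - x)) / np.sum(k * weights)

# Synthetic weight image: a single blob (mode) at position 50.
positions = np.arange(100, dtype=float)
weights = np.exp(-0.5 * ((positions - 50.0) / 6.0) ** 2)

x = 30.0                        # start away from the mode
for _ in range(100):
    x += mean_shift_offset(x, positions, weights)

print(round(x, 3))              # converges to the mode at 50
```

Because both the kernel and the weight image are Gaussian here, each iterate is a convex blend of the current position and the blob center, so convergence is geometric.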
What are normalized derivatives? Example using 2nd-order derivatives: $\sigma^2\,\nabla^2$, the scale-normalized Laplacian operator.

### Laplacian Operator and LOG

$$\nabla^2 f = \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2}$$

The Laplacian of Gaussian (LOG) is just another linear filter:

$$\nabla^2\!\left(G(x,y) * f(x,y)\right) = \left(\nabla^2 G(x,y)\right) * f(x,y)$$

i.e., the Laplacian of a Gaussian-filtered image equals the image filtered with the Laplacian-of-Gaussian (LOG) kernel.

### Approximating LOG with DOG

LOG can be approximated by a Difference of two Gaussians (DOG) at different scales:

$$\nabla^2 G \approx c\left(G_{\sigma_1} - G_{\sigma_2}\right)$$

The best approximation has $\sigma_2/\sigma_1 \approx 1.6$, and it is more convenient if $\sigma_1 = \sigma/\sqrt{1.6}$ and $\sigma_2 = \sigma\sqrt{1.6}$. We will come back to DOG later.

### Local Scale-Space Maxima

Lindeberg proposes that the natural scale for describing a feature is the scale at which a normalized derivative for detecting that feature achieves a local maximum both spatially and in scale. For blob detection, the operator is the normalized Laplacian of Gaussian, $D_{\text{norm}}L = \sigma^2\,\nabla^2 G$: look for extrema in space and scale.

### Why Normalized Derivatives?

Let image $f$ contain a blob of scale $\sigma_0$ centered at $x_0$, so that $(x_0, \sigma_0)$ is a local maximum in the scale-space representation of $f$. Consider a second image $g$ that is a spatially stretched version of $f$; we would expect the corresponding extremum to move in space and scale accordingly. However, the magnitudes of the two extrema differ, which would make it hard to compare feature descriptions at different scales: the amplitude of the raw LOG response decreases with greater smoothing, and the $\sigma^2$ normalization compensates for this.

### Interesting Observation

If we approximate the LOG by a Difference of Gaussians (DOG) filter, we do not have to normalize to achieve constant amplitude across scale.
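Lindeberg's scale-selection principle can be illustrated numerically. The sketch below is my own (not from the slides; the blob size, grid extent, and scale range are made-up): it builds a 2D Gaussian blob and evaluates the scale-normalized LOG response $\sigma^2(\nabla^2 G_\sigma * I)$ at the blob center, whose magnitude peaks near the blob's true scale.

```python
import numpy as np

# Build a 2D Gaussian blob and evaluate the scale-normalized LOG
# response, sigma^2 * (LOG_sigma correlated with the image), at the
# blob center; its magnitude peaks near the blob's true scale (5.0).
x, y = np.meshgrid(np.arange(-40, 41, dtype=float),
                   np.arange(-40, 41, dtype=float))
r2 = x**2 + y**2

sigma0 = 5.0
blob = np.exp(-r2 / (2 * sigma0**2))     # image: one blob at the origin

def normalized_log_response(sigma):
    g = np.exp(-r2 / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    log_kernel = (r2 / sigma**4 - 2.0 / sigma**2) * g  # Laplacian of G
    return sigma**2 * np.sum(log_kernel * blob)

scales = np.arange(2.0, 10.1, 0.25)
responses = [abs(normalized_log_response(s)) for s in scales]
best_scale = scales[int(np.argmax(responses))]
print(best_scale)                        # selects a scale near sigma0
```

In closed form the response magnitude here is proportional to $\sigma^2\sigma_0^2/(\sigma^2+\sigma_0^2)^2$, which is maximized exactly at $\sigma = \sigma_0$ in 2D; without the $\sigma^2$ factor the response would decay monotonically with smoothing and no interior maximum would exist.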
Why does the LOG filter have to be normalized to give an invariant response across scales, while its approximation, the DOG filter, does not? The DOG operator does not approximate the LOG in the traditional sense of $\text{DOG} \approx \text{LOG}$. Rather, the DOG is approximately proportional to the *normalized* LOG, $\text{DOG} \approx c\,\sigma^2\nabla^2 G$, so the DOG already incorporates the $\sigma^2$ normalization. This can be seen by comparing the values of the two functions at the origin. [The slide gives the explicit constant, garbled in these notes.]

### Another Explanation

The relationship between DOG and $\sigma^2\nabla^2 G$ can be understood from the heat diffusion equation, parameterized in terms of $\sigma$ rather than the more usual $t = \sigma^2$:

$$\frac{\partial G}{\partial \sigma} = \sigma\,\nabla^2 G$$

From this we see that $\sigma\nabla^2 G$ can be computed from the finite-difference approximation to $\partial G/\partial\sigma$, using the difference of nearby scales $k\sigma$ and $\sigma$:

$$\sigma\,\nabla^2 G \approx \frac{G(x,y,k\sigma) - G(x,y,\sigma)}{k\sigma - \sigma}$$

and therefore

$$G(x,y,k\sigma) - G(x,y,\sigma) \approx (k-1)\,\sigma^2\,\nabla^2 G$$

(Lowe, IJCV 2004, the SIFT key paper.)

### Anyhow...

Scale-space theory says we should look for modes in a DOG-filtered image volume. Let's just think of the spatial dimensions for now: we want to look for modes in a DOG-filtered image, meaning a weight image convolved with a DOG filter. Insight: if we view the DOG filter as a shadow kernel, we could use mean-shift to find the modes. Of course, we'd have to figure out what mean-shift kernel corresponds to a shadow kernel that is a DOG.

### Kernel-Shadow Pairs

Given a convolution (shadow) kernel $H$, what is the corresponding mean-shift kernel $K$? Perform the change of variables $r = \|a - x\|^2$ and rewrite $H(a-x) \to h(\|a-x\|^2) = h(r)$. Then the profile $k$ of kernel $K$ must satisfy

$$h'(r) = -c\,k(r)$$

Examples: an Epanechnikov shadow gives a flat kernel; a Gaussian shadow gives a Gaussian kernel.

### Kernel Related to DOG Shadow

Applying this relationship to a DOG shadow built with $\sigma_1 = \sigma/\sqrt{1.6}$ and $\sigma_2 = \sigma\sqrt{1.6}$ yields the corresponding mean-shift kernel. [Slide plots the shadow and the derived kernel profiles.]
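The kernel implied by a DOG shadow can be sketched numerically from the relationship above (illustrative code, not from the slides; the base $\sigma$ is arbitrary, and only the $\sqrt{1.6}$ split follows the notes). The derived profile changes sign, which is the negative-weight problem the notes turn to next.

```python
import numpy as np

# Derive the mean-shift kernel profile k(r) implied by a DOG shadow
# profile h(r), using k(r) proportional to -h'(r), with r = squared
# distance. The sqrt(1.6) sigma split follows the slides.
sigma = 4.0
s1, s2 = sigma / np.sqrt(1.6), sigma * np.sqrt(1.6)

def h(r):
    g1 = np.exp(-r / (2 * s1**2)) / (2 * np.pi * s1**2)
    g2 = np.exp(-r / (2 * s2**2)) / (2 * np.pi * s2**2)
    return g1 - g2              # DOG shadow as a function of r

r = np.linspace(0.0, (4 * s2) ** 2, 2000)
k_profile = -np.gradient(h(r), r[1] - r[0])   # numerical -h'(r)

# The profile takes both signs: a DOG shadow forces a mean-shift
# kernel with some negative values.
print(k_profile.min() < 0.0 < k_profile.max())
```

The sign change is structural, not numerical: near the origin the narrow Gaussian's derivative dominates, and in the tails the wide one does, so no choice of the proportionality constant can make the kernel everywhere non-negative.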
Some of the kernel values are negative. Is this a problem? Umm... yes, it is.

### Dealing with Negative Weights

[Slide shows a little demo with negative weights.] Mean-shift will sometimes converge to a valley rather than a peak. The behavior is sometimes even stranger than that: the step size becomes way too big and you end up in another part of the function.

### Why We Might Want Negative Weights

Given an $n$-bucket model histogram $\{m_i \mid i = 1, \dots, n\}$ and a data histogram $\{d_i \mid i = 1, \dots, n\}$, CRM suggest measuring similarity using the Bhattacharyya coefficient

$$\rho = \sum_{i=1}^{n} \sqrt{m_i\,d_i}$$

They use the mean-shift algorithm to climb the spatial gradient of this function by weighting each pixel falling into bucket $i$ by $\sqrt{m_i/d_i}$. Note the similarity to the likelihood-ratio weight

$$w_i = \log\frac{m_i}{d_i}$$

Using the likelihood ratio makes sense probabilistically. For example, using mean-shift with a uniform kernel on weights that are likelihood ratios would be equivalent to using KL divergence to measure the difference between the model histogram $m$ and the data histogram $d$:

$$\sum_{\text{pixels}} \log\frac{m_i}{d_i} = \sum_{\text{buckets } i} n\,d_i\,\log\frac{m_i}{d_i}$$

(the sum over pixels becomes a sum over buckets because $n\,d_i$ pixels fall into bucket $i$).

### Analysis: Scaling the Weights

Recall the mean-shift offset

$$\Delta x = \frac{\sum_a K(a-x)\,w(a)\,(a-x)}{\sum_a K(a-x)\,w(a)}$$

What if $w(a)$ is scaled to $c\,w(a)$? The constant $c$ factors out of both the numerator and the denominator and cancels, so mean-shift is invariant to scaled weights.

### Analysis: Adding a Constant

What if we add a constant to get $w(a) + c$?

$$\Delta x = \frac{\sum_a K(a-x)\,w(a)\,(a-x) + c\sum_a K(a-x)\,(a-x)}{\sum_a K(a-x)\,w(a) + c\sum_a K(a-x)}$$

The added terms do not cancel, so mean-shift is *not* invariant to an added constant. This is annoying: it isn't a good idea to just add a large positive number to our weights to make sure they stay positive. [Slide shows the little demo again, adding a constant.]
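Both analyses are easy to verify numerically. The sketch below is my own illustration (the positions, kernel scale, and ramp-shaped weight image are made-up values, not from the slides): it computes the mean-shift offset for $w$, $c\,w$, and $w + c$ and shows that only the scaling leaves the offset unchanged.

```python
import numpy as np

# Check: the mean-shift offset is invariant to scaling the weights
# (w -> c*w) but NOT to adding a constant (w -> w + c).
positions = np.arange(50, dtype=float)
weights = positions + 1.0       # arbitrary positive weight image
x = 20.0

def offset(w, sigma=4.0):
    k = np.exp(-0.5 * ((positions - x) / sigma) ** 2)
    return np.sum(k * w * (positions - x)) / np.sum(k * w)

base = offset(weights)
scaled = offset(3.7 * weights)   # c factors out of top and bottom
shifted = offset(weights + 5.0)  # the added constant does not cancel

print(abs(base - scaled) < 1e-9, abs(base - shifted) > 0.05)
```

The added constant effectively mixes in a second, unweighted mean-shift term, shrinking the offset toward the kernel's own centroid, which is why it changes the result.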
### Another Interpretation of the Mean-Shift Offset

Thinking of the offset as a weighted center of mass doesn't make sense for negative weights. Instead, think of each pointwise offset $(a - x)$ as a vector with a direction and a magnitude, and interpret the mean-shift offset

$$\Delta x = \frac{\sum_a K(a-x)\,w(a)\,(a-x)}{\sum_a K(a-x)\,w(a)}$$

as an estimate of the average vector. A negative weight now just means a vector in the opposite direction. The numerator is then interpreted as a sum of directions and magnitudes, but the denominator should just be a sum of magnitudes, which should all be positive.

### Absolute Value in the Denominator

We now see how to modify the mean-shift equation to make sense for negative weights. The numerator of that equation votes for both the magnitude and direction of the pointwise offset vectors, so the negative weights should stay. However, the denominator normalizes by the overall total magnitude of the votes, and therefore we must count only the magnitude (the absolute value) of each term. The modified equation is

$$\Delta x = \frac{\sum_a K(a-x)\,w(a)\,(a-x)}{\sum_a \left|K(a-x)\,w(a)\right|}$$

With this modification, mean-shift remains well behaved on images that contain negative pixel values. Or does it?

### Back to the Demo

There can be oscillations when there are negative weights. I'm not sure what to do about that.

### Outline of Scale-Space Mean Shift

General idea:

- Build a "designer" shadow kernel that generates the desired DOG scale space when convolved with the weight image $w(x)$.
- Change variables and take derivatives of the shadow kernel to find the corresponding mean-shift kernels, using the relationship shown earlier.
- Given an initial estimate $(x_0, s_0)$, apply the mean-shift algorithm to find the nearest local mode in scale space.
- Note that using mean-shift we do NOT have to explicitly generate the scale space.
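A two-point example (made-up numbers, my own illustration) shows why the absolute value in the denominator matters: with mixed-sign weights the signed denominator nearly cancels and the plain step explodes, while normalizing by the total magnitude keeps the step inside the data.

```python
import numpy as np

# Two points, mixed-sign weights, flat kernel K = 1: the signed
# denominator nearly cancels, so the plain step explodes, while the
# absolute-value denominator keeps the step bounded by the data extent.
positions = np.array([0.0, 1.0])
weights = np.array([1.0, -0.9])
x = 0.5

num = np.sum(weights * (positions - x))   # numerator keeps the signs

plain_step = num / np.sum(weights)            # denominator ~ 0.1
abs_step = num / np.sum(np.abs(weights))      # denominator = 1.9

print(abs(plain_step) > 5.0, abs(abs_step) < 0.6)
```

This is exactly the "step size becomes way too big" failure described above: the plain step lands far outside the interval spanned by the two points, while the modified step stays within it.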
### Scale-Space Kernel

[Slide gives the explicit scale-space shadow kernel $H(x, \sigma)$ built from DOG filters; the formula is garbled in these notes.]

### Mean-Shift through Scale Space

1. Input: weight image $W(a)$, with current location $x_0$ and scale $s_0$.
2. Holding the scale $s$ fixed, perform spatial mean-shift.
3. Let $x$ be the location computed from step 2. Holding $x$ fixed, perform mean-shift along the scale axis.
4. Repeat steps 2 and 3 until convergence.

[The slide's update equations for steps 2 and 3 are garbled in these notes; they are the mean-shift updates restricted to the spatial axes and to the scale axis, respectively.]

### Second Thoughts

Rather than being strictly correct about the kernel $K$, note that it is approximately Gaussian. [Plot: blue = kernel associated with the shadow kernel of a DOG with sigma $\sigma$; red = Gaussian kernel with sigma $\sigma\sqrt{1.6}$.] So why not avoid the issues with a negative kernel by just using a Gaussian to find the spatial mode? (scaledemo.m: interleave Gaussian spatial mode finding with 1D DOG mode finding.)

### Summary

Mean-shift tracking: choosing the scale of the kernel is an issue; scale-space feature selection provides inspiration; perform mean-shift with a scale-space kernel to optimize for blob location and scale.

Contributions:

- A natural mechanism for choosing scale WITHIN the mean-shift framework.
- Building designer kernels for efficient hill-climbing on implicitly defined convolution surfaces.
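The interleaved scheme suggested in "Second Thoughts" can be sketched as follows. This is a reconstruction under assumptions, not Collins' scaledemo.m: it uses a Gaussian kernel for spatial mode finding and a brute-force 1D search over the normalized-LOG response for the scale mode, on a synthetic blob whose position, size, and the search grid are all made-up values.

```python
import numpy as np

# Interleaved tracker sketch: Gaussian-kernel spatial mean-shift at a
# fixed scale, alternated with a 1D search for the scale maximizing
# the normalized-LOG response magnitude at the current position.
grid = np.arange(80, dtype=float)
xx, yy = np.meshgrid(grid, grid)

true_center = (40.0, 30.0)      # (row, col)
true_scale = 6.0
W = np.exp(-(((yy - true_center[0]) ** 2 + (xx - true_center[1]) ** 2)
             / (2 * true_scale**2)))      # weight image: one blob

def spatial_step(pos, sigma):
    """One spatial mean-shift step with a Gaussian kernel, scale fixed."""
    k = np.exp(-(((yy - pos[0]) ** 2 + (xx - pos[1]) ** 2)
                 / (2 * sigma**2)))
    kw = k * W
    return np.array([np.sum(kw * yy), np.sum(kw * xx)]) / np.sum(kw)

def best_scale(pos, scales):
    """1D scale search: maximize |sigma^2 * LOG response| at pos."""
    r2 = (yy - pos[0]) ** 2 + (xx - pos[1]) ** 2
    def resp(s):
        g = np.exp(-r2 / (2 * s**2)) / (2 * np.pi * s**2)
        return abs(s**2 * np.sum((r2 / s**4 - 2.0 / s**2) * g * W))
    return max(scales, key=resp)

pos, scale = np.array([25.0, 20.0]), 3.0
for _ in range(10):
    for _ in range(30):          # step 2: spatial mode at fixed scale
        pos = spatial_step(pos, scale)
    scale = best_scale(pos, np.arange(2.0, 12.1, 0.5))  # step 3: scale

print(np.round(pos, 1), scale)   # near the true center and scale
```

Note the scale step here explicitly evaluates responses along the scale axis rather than mean-shifting through an implicit scale space, which sidesteps the negative-kernel issue in the same spirit as the "just use a Gaussian" suggestion.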
