by: Allie West II


About this Document

Jane Mulligan
Class Notes





These 8-page class notes were uploaded by Allie West II on Thursday, October 29, 2015. They belong to CSCI 5722 at the University of Colorado at Boulder, taught by Jane Mulligan in Fall. Since upload, they have received 34 views. For similar materials see /class/231980/csci-5722-university-of-colorado-at-boulder in Computer Science at the University of Colorado at Boulder.





Date Created: 10/29/15
Templates and Classifiers

Last day: structure from motion (feature-based, dense). Today: templates and classifiers (Forsyth & Ponce Ch. 22; Hebert notes, CMU).

Recognition by template matching
- [Slide figures: training images ("Pauline", "Sven") and a test image.]

Recognition by finding patterns
- We have seen very simple template matching (under filters).
- Some objects behave like quite simple templates, e.g. frontal faces.
- Strategy: find image windows, correct the lighting, and pass them to a statistical test (a classifier) that accepts faces and rejects non-faces.

Templates for recognition
- Some objects can be identified by simple tests on image windows (faces, stop signs).
- Template matching: take all windows of a particular shape and test whether the relevant object is present, possibly searching over scale (size) and orientation.
- More complicated shapes and objects can be identified by looking at relationships among groups of templates.

Classifiers
- How do we test whether an object is present? A classifier takes a feature set as input and produces a class label.
- Build it from a training set of (feature, label) examples (x_i, y_i): find a rule that takes a plausible measurement x_i and computes its label y_i.

Basic ideas in classifiers
- Loss: some errors may be more expensive than others. E.g. for a fatal disease that is easily cured by a cheap medicine with no side-effects, false positives in diagnosis are better than false negatives.
- We discuss two-class classification. L(1->2) is the loss caused by calling a 1 a 2.
- Total risk of using a classifier: R = Pr{1->2} L(1->2) + Pr{2->1} L(2->1). We want to minimize R.
- Generally, classify as 1 if the expected loss of saying 1 is smaller than that of saying 2: for an observation x, choose 1 if p(1|x) L(1->2) > p(2|x) L(2->1), and choose 2 otherwise.
- Crucial notion: the decision boundary, the set of points where the expected loss is the same for either choice.

Probabilistic formulation
- Compare p(object1 | feature) with p(object2 | feature). The decision boundary that minimizes the Bayes risk has the form p(object1 | feature) > lambda * p(object2 | feature), where lambda is set by the losses. These class-conditional models are learned from training data.
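As a sanity check on the risk formulas, the two-class minimum-expected-loss rule can be sketched in a few lines of Python (the posteriors and loss values below are made-up illustrations, not numbers from the notes):

```python
def bayes_decision(p1, p2, loss_1_as_2, loss_2_as_1):
    """Minimum-expected-loss decision for two classes.

    p1, p2      : posteriors p(1|x), p(2|x)
    loss_1_as_2 : L(1->2), cost of calling a true 1 a 2
    loss_2_as_1 : L(2->1), cost of calling a true 2 a 1
    """
    risk_say_1 = p2 * loss_2_as_1  # saying "1" only hurts when the truth is 2
    risk_say_2 = p1 * loss_1_as_2  # saying "2" only hurts when the truth is 1
    return 1 if risk_say_1 <= risk_say_2 else 2

# Symmetric losses: just pick the larger posterior.
print(bayes_decision(0.7, 0.3, 1.0, 1.0))   # -> 1
# Disease example (class 1 = sick): a false negative is 10x worse, so we
# diagnose "sick" even though the posterior slightly favours "healthy".
print(bayes_decision(0.4, 0.6, 10.0, 1.0))  # -> 1
```

With equal losses this reduces to picking the larger posterior; raising L(1->2) shifts the decision boundary, which is exactly the fatal-disease point above.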
By Bayes' rule, p(object | feature) ~ p(feature | object) p(object), so it suffices to learn the class-conditional density p(feature | object) and the prior p(object). The Bayes risk is the expected loss E[L(1,2)] over the label set {1, 2}.

Issues
- How to represent and learn p(feature | object), or the decision boundary directly?
- How to approach the Bayes risk given a small number of samples?
- What features to use, and how to reduce the feature space?

Evaluating classifier performance
- Receiver Operating Characteristic (ROC): each threshold lambda in the test p(object | feature) > lambda * p(background | feature) gives one operating point (detection rate vs. false positive rate).
- Detection rate: Prob(a feature from the object is correctly classified as object).
- False positive rate: Prob(a feature from the background is classified as object).
- Four cases: true positive (sensitivity), false positive, true negative (specificity), false negative.
- The ROC curve tells us what happens as we vary the test threshold.

Approaches
- Every single pattern classification / learning approach has been applied to this problem; pick your favorite: naive Bayes, boosting, neural networks, SVMs, nearest neighbors, PCA/LDA/ICA dimensionality reduction, etc.

Histogram-based classifiers
- Use a histogram to represent the class-conditional densities, i.e. p(x|1), p(x|2), etc.
- Advantage: the estimates become quite good with enough data.
- Disadvantage: the histogram becomes big in high dimension, but maybe we can assume feature independence.

Histogram approach: example (Bernt Schiele, IJCV 36(1), pp. 31-50, January 2000)
- Features: magnitudes of 1st derivatives of a Gaussian and the Laplacian at 3 different scales, giving a 6-component feature vector m; representation: p(m | o), a histogram of features from the training data (24 levels per axis).
- Scan a measurement window over all possible locations; p(m | obj) is computed by looking up histogram tables filled at training time.
- Return the object that maximizes p(object_n | image) ~ prod_k p(m_k | object_n) p(object_n).
- For complex scenes, scan the image and evaluate the "probability" at scanned window locations; object n is in the image if many windows "vote" for it, e.g. by evaluating sum_k p(object_n | w_k).
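A minimal sketch of a histogram-based classifier on a single 1-D feature (the data, the 24-level quantization, and the priors are all hypothetical stand-ins for the multi-dimensional feature histograms in the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 1-D training features: object features cluster high, background low.
obj_train = rng.normal(0.7, 0.1, 1000)
bg_train = rng.normal(0.3, 0.1, 1000)

bins = np.linspace(0.0, 1.0, 25)  # 24 levels per axis, as in the slides

def fit_hist(samples):
    # class-conditional density p(x | class) as a normalized histogram
    counts, _ = np.histogram(samples, bins=bins, density=True)
    return counts

p_obj, p_bg = fit_hist(obj_train), fit_hist(bg_train)

def classify(x, prior_obj=0.5):
    # look up the bin containing x and compare p(x|class) * p(class)
    i = int(np.clip(np.searchsorted(bins, x) - 1, 0, len(bins) - 2))
    return "object" if p_obj[i] * prior_obj > p_bg[i] * (1 - prior_obj) else "background"

print(classify(0.75))  # -> object
print(classify(0.25))  # -> background
```

With more feature dimensions the table grows exponentially, which is exactly the disadvantage noted above, and why the later examples assume feature independence.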
[Example from Bernt Schiele: object database and recognition results.]

More complicated features
- A feature is a set of coefficients c_1, ..., c_N.

Estimating the probabilities (example from Henry Schneiderman)
- Collect the values of the features computed from the training data in histograms; the histograms approximate the probabilities p(c | object) and p(c | non-object).
- Assume independence between features, so the joint probability is the product of per-feature histogram lookups.
- Given the features computed from a window, threshold the likelihood ratio.
- Training data: on the order of 2,000 original images with 1,000 synthetic variations per original, i.e. millions of examples.
- Detection: compute the values of all the features in the window; for each feature, compute the probabilities of coming from the object or non-object class; aggregate into a likelihood ratio.
- Search in position and scale: move the window to all possible positions and all possible scales; at each position/scale, evaluate the classifier and return a detection if the ratio is above threshold.

Feature selection
- Each feature is a set of N variables (wavelet coefficients). Find the feature set that best classifies the dataset.
- Problem: if N is large, the feature is very discriminative (it is equivalent to the entire window if N is the total number of variables), but representing the corresponding distribution is very expensive; if N is small, the feature is not discriminative, but classification is very fast.

Solution: classifier cascade
- Standard problem: we can have either discriminative or efficient features, but not both, so we cannot do the classification in one shot. (PCA, which captures most of the variance, comes later.)
- Apply first a classifier with simple features: fast, and it will eliminate the most obvious non-object locations. Then apply a classifier with more complex features: more expensive, but applied only to the locations that survived the previous stage.
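The position search can be sketched as a brute-force sliding window; the scoring function here is a made-up stand-in for the aggregated likelihood ratio, and in practice one would also resize the image to search over scale:

```python
import numpy as np

def scan(image, win, score, thresh):
    """Slide a win x win window over the image; report windows whose
    score (e.g. a log-likelihood ratio) exceeds the threshold."""
    H, W = image.shape
    hits = []
    for y in range(H - win + 1):
        for x in range(W - win + 1):
            if score(image[y:y + win, x:x + win]) > thresh:
                hits.append((y, x))
    return hits

# Toy "likelihood ratio": bright patches count as object. This is an
# illustrative stand-in, not Schneiderman's histogram-based ratio.
toy_score = lambda patch: patch.mean()

img = np.zeros((8, 8))
img[2:5, 2:5] = 1.0                  # a bright 3x3 'object'
print(scan(img, 3, toy_score, 0.9))  # -> [(2, 2)]
```

A cascade would run a cheap score first and call an expensive one only on the surviving (y, x) locations.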
Cascade example
- Apply a classifier with very simple and fast features; this eliminates most of the image.
- Apply a classifier with more complex features on what is left, then one with more and more complex features on what survives.

Using weak features
- Don't try to design strong features from the beginning: just use really stupid but really fast features, and a lot of them.
- Weak learner: a very fast but very inaccurate classifier.
- Example: multiply the input window by a very simple box operator and threshold the output.
- [Example from Paul Viola; distributed by Intel as part of the OpenCV library.]

Feature selection for boosting
- The box operators are defined over all possible shapes and positions within the window; for a 24x24 window, 45,396 combinations!
- How to select the "useful" features, and how to combine them into classifiers?

Boosting (repeat T times)
- Input: training examples x_i with labels "face" or "non-face", and weights w_i, initially uniform.
- Choose the feature (weak classifier) h_t with minimum weighted error eps_t = sum_i w_i |h_t(x_i) - y_i|.
- Update the weights: w_i is increased if x_i is misclassified, decreased if x_i is correctly classified.
- Compute a weight alpha_t for classifier h_t (alpha_t is large if eps_t is small).
- Final classifier: H(x) = sign(sum_t alpha_t h_t(x)).

Discriminative approaches
- The automatic selection process selects natural features.
- It is difficult to represent the distribution in high-dimensional feature spaces; instead, model the decision boundary directly.
- General idea: much less training data is needed to construct the decision boundary than the full distributions.
- Maximize the separation between the classes for better generalization, with fewer parameters (e.g. two Gaussians with equal covariance).

Nearest neighbors
- Does not require recovery of distributions or decision surfaces.
- Asymptotically at most twice the Bayes risk.
- The choice of distance metric is critical, and indexing may be difficult.

Large feature spaces: PCA
- x is a feature vector of high dimension.
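The boosting loop above can be sketched with 1-D threshold stumps standing in for the box-filter weak learners (the data and thresholds are toy values, and this follows the generic AdaBoost weight update rather than Viola's exact implementation):

```python
import math

def adaboost(xs, ys, thresholds, T):
    """AdaBoost over 1-D threshold stumps h(x) = pol * sign(x - t).
    ys are +1/-1 (think "face"/"non-face"); xs and thresholds are toy data."""
    n = len(xs)
    w = [1.0 / n] * n                       # uniform initial weights
    ensemble = []
    stump = lambda x, t, pol: pol * (1 if x > t else -1)
    for _ in range(T):
        # choose the weak classifier with minimum weighted error
        err, t, pol = min(
            (sum(wi for wi, x, y in zip(w, xs, ys) if stump(x, t, p) != y), t, p)
            for t in thresholds for p in (1, -1))
        err = max(err, 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)  # large alpha when err is small
        ensemble.append((alpha, t, pol))
        # raise weights of misclassified examples, lower the correct ones
        w = [wi * math.exp(-alpha * y * stump(x, t, pol))
             for wi, x, y in zip(w, xs, ys)]
        s = sum(w)
        w = [wi / s for wi in w]
    return lambda x: 1 if sum(a * stump(x, t, p) for a, t, p in ensemble) >= 0 else -1

H = adaboost([0, 1, 2, 3, 4, 5], [-1, -1, -1, 1, 1, 1],
             [0.5, 1.5, 2.5, 3.5, 4.5], T=3)
print([H(x) for x in (0, 5)])  # -> [-1, 1]
```

The weight update is the key step: misclassified examples gain weight, so later rounds are forced to concentrate on them.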
- Indexing in a high-dimensional space is difficult, and most of the dimensions are probably not useful.
- Principal components: the dominant eigenvectors v_1, ..., v_k of the scatter matrix sum_i x_i x_i^T; most of the information is contained in the space they span.

Principal Component Analysis (PCA)
- Project first into the lower-dimensional space spanned by the principal components: indexing happens in a much lower-dimensional space, and this acts as feature selection.
- The features are linear and independent.
- Recipe: collect a set of pictures of m objects; compute the eigenvector representation; for each object, compute the coefficients in the space spanned by the k eigenvectors associated with the k largest eigenvalues; for a new image p, compute its coefficients and identify the object by minimum distance in coefficient space.

PCA for recognition
- Assume centred features x_i, i = 1..N.
- Project the training features onto the principal directions: a_i = V_k^T x_i.
- Input: a query feature vector x; project it into principal component space, a = V_k^T x.
- Find the object whose feature vector is closest: n = argmin_j ||a - a_j||.

Appearance-based matching
- [Nayar et al. '96, Columbia.]

Difficulties with PCA
- The projection may suppress important detail: the smallest-variance directions may not be unimportant.
- The method does not take the discriminative task into account: we want features that allow good discrimination, which is not the same as largest variance.

Linear Discriminant Analysis (LDA)
- Choose linear functions of the features that allow good discrimination; assume the class-conditional covariances are the same.
- Want a linear feature that maximizes the spread of the class means for a fixed within-class variance.
- Problems with PCA-style features: variation in appearance due to illumination and expression; the feature distribution best for compression may not be the best choice for discrimination.
- LDA: find the projection direction v that separates the two classes best, i.e. maximize J(v) = (distance between projected class means)^2 / (within-class scatter of the projection). This reduces to a generalized eigenvalue problem.
- A similar application of LDA to faces: Fisherfaces.
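The PCA recognition recipe can be sketched directly with an eigendecomposition of the scatter matrix (the 20-object, 5-D "database" below is random made-up data, not anything from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the image database: 20 'objects', 5-D features,
# with most of the variance along the first axis.
X = rng.normal(size=(20, 5))
X[:, 0] *= 10.0

mean = X.mean(axis=0)
Xc = X - mean                              # centre the features
# principal directions = dominant eigenvectors of the scatter matrix Xc^T Xc
evals, V = np.linalg.eigh(Xc.T @ Xc)
Vk = V[:, np.argsort(evals)[::-1][:2]]     # keep the k = 2 largest eigenvalues
codes = Xc @ Vk                            # a_i = Vk^T x_i for each object

def recognize(x):
    a = (x - mean) @ Vk                    # project the query into PCA space
    return int(np.argmin(np.sum((codes - a) ** 2, axis=1)))  # nearest neighbour

print(recognize(X[3]))  # -> 3 (an exact training view maps back to itself)
```

Indexing then happens in the 2-D coefficient space rather than the original feature space, which is the whole point of the projection.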
[Fisherfaces example from Belhumeur et al., Yale / Columbia face data.]
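For the two-class case with equal covariances, the Fisher direction maximizing J(v) has the closed form v = S_w^{-1} (m_A - m_B). A sketch on made-up 2-D data (class layout, names, and the midpoint decision rule are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
# Two hypothetical classes with equal (spherical) covariance, as LDA assumes.
A = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
B = rng.normal([4.0, 0.0], 1.0, size=(50, 2))
mA, mB = A.mean(axis=0), B.mean(axis=0)

# within-class scatter S_w, then the closed-form Fisher direction
Sw = (A - mA).T @ (A - mA) + (B - mB).T @ (B - mB)
v = np.linalg.solve(Sw, mA - mB)
mid = (mA + mB) / 2.0

def classify(x):
    # project onto v and compare against the projected midpoint of the means
    return "A" if (np.asarray(x) - mid) @ v > 0 else "B"

print(classify([0.0, 0.5]))   # -> A
print(classify([4.0, -0.5]))  # -> B
```

Unlike PCA, the direction v is chosen to separate the classes, not to capture variance: here the large within-class spread plays no role beyond the S_w normalization.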

