309 Class Note for CS 59000 with Professor Qi at Purdue

These 28 pages of class notes were uploaded by an elite notetaker on Friday, February 6, 2015, for a fall-term course at Purdue University. Since the upload, they have received 13 views.

Date Created: 02/06/15
CS 59000 Statistical Machine Learning, Lecture 12
Yuan (Alan) Qi, Purdue CS

Outline

- Review of probit regression, Laplace approximation, BIC
- Bayesian logistic regression
- Kernel methods
- Kernel ridge regression
- Kernel principal component analysis

Probit Regression

The probit function is the cumulative distribution function of a standard Gaussian:

  \Phi(a) = \int_{-\infty}^{a} \mathcal{N}(\theta \mid 0, 1) \, d\theta

and the model uses it as the activation: p(t = 1 \mid a) = \Phi(a).

Labeling Noise Model

  p(t = 1 \mid \mathbf{x}) = (1 - \epsilon)\,\sigma(\mathbf{x}) + \epsilon\,(1 - \sigma(\mathbf{x})) = \epsilon + (1 - 2\epsilon)\,\sigma(\mathbf{x})

where \sigma(\mathbf{x}) is the activation function with input vector \mathbf{x} and \epsilon is the probability of a flipped label. This model is robust to outliers and labeling errors.

Generalized Linear Models

The conditional mean is y = \mathbb{E}[t \mid \mathbf{x}], and the generalized linear model takes

  y = f(\mathbf{w}^T \boldsymbol{\phi})

where f is the activation function and f^{-1} is the link function.

Canonical Link Function

If we choose the canonical link function, the gradient of the error function takes a simple form:

  \nabla_{\mathbf{w}} E(\mathbf{w}) = \frac{1}{s} \sum_{n=1}^{N} (y_n - t_n)\,\boldsymbol{\phi}_n

For the Gaussian, s = \beta^{-1}, whereas for the logistic model s = 1.

Examples:

  Distribution        Link name         Link function                       Mean function
  Normal              Identity          X\beta = \mu                        \mu = X\beta
  Exponential         Inverse           X\beta = \mu^{-1}                   \mu = (X\beta)^{-1}
  Gamma               Inverse           X\beta = \mu^{-1}                   \mu = (X\beta)^{-1}
  Inverse Gaussian    Inverse squared   X\beta = \mu^{-2}                   \mu = (X\beta)^{-1/2}
  Poisson             Log               X\beta = \ln \mu                    \mu = \exp(X\beta)
  Binomial/Multinom.  Logit             X\beta = \ln(\mu / (1 - \mu))       \mu = \exp(X\beta) / (1 + \exp(X\beta))

Laplace Approximation for Posterior

A Gaussian approximation around the mode \mathbf{z}_0:

  \ln f(\mathbf{z}) \simeq \ln f(\mathbf{z}_0) - \frac{1}{2}(\mathbf{z} - \mathbf{z}_0)^T A (\mathbf{z} - \mathbf{z}_0), \qquad A = -\nabla\nabla \ln f(\mathbf{z}) \big|_{\mathbf{z} = \mathbf{z}_0}

so that

  f(\mathbf{z}) \simeq f(\mathbf{z}_0) \exp\left( -\frac{1}{2}(\mathbf{z} - \mathbf{z}_0)^T A (\mathbf{z} - \mathbf{z}_0) \right)

and the normalized approximation is

  q(\mathbf{z}) = \frac{|A|^{1/2}}{(2\pi)^{M/2}} \exp\left( -\frac{1}{2}(\mathbf{z} - \mathbf{z}_0)^T A (\mathbf{z} - \mathbf{z}_0) \right) = \mathcal{N}(\mathbf{z} \mid \mathbf{z}_0, A^{-1})

Evidence Approximation

  Z = \int f(\mathbf{z}) \, d\mathbf{z} \simeq f(\mathbf{z}_0) \int \exp\left( -\frac{1}{2}(\mathbf{z} - \mathbf{z}_0)^T A (\mathbf{z} - \mathbf{z}_0) \right) d\mathbf{z} = f(\mathbf{z}_0) \, \frac{(2\pi)^{M/2}}{|A|^{1/2}}

Applied to the model evidence p(D) = \int p(D \mid \theta) \, p(\theta) \, d\theta:

  \ln p(D) \simeq \ln p(D \mid \theta_{\mathrm{MAP}}) + \ln p(\theta_{\mathrm{MAP}}) + \frac{M}{2} \ln 2\pi - \frac{1}{2} \ln |A|

where the last three terms form the Occam factor and

  A = -\nabla\nabla \ln p(D \mid \theta_{\mathrm{MAP}}) \, p(\theta_{\mathrm{MAP}}) = -\nabla\nabla \ln p(\theta_{\mathrm{MAP}} \mid D)

Bayesian Information Criterion

A coarser approximation of the Laplace approximation:

  \ln p(D) \simeq \ln p(D \mid \theta_{\mathrm{MAP}}) - \frac{1}{2} M \ln N

A more accurate evidence approximation is often needed.

Bayesian Logistic Regression

With a Gaussian prior p(\mathbf{w}) = \mathcal{N}(\mathbf{w} \mid \mathbf{m}_0, \mathbf{S}_0), the log posterior is

  \ln p(\mathbf{w} \mid \mathbf{t}) = -\frac{1}{2}(\mathbf{w} - \mathbf{m}_0)^T \mathbf{S}_0^{-1} (\mathbf{w} - \mathbf{m}_0) + \sum_{n=1}^{N} \left\{ t_n \ln y_n + (1 - t_n) \ln(1 - y_n) \right\} + \text{const}

and the Laplace approximation is

  q(\mathbf{w}) = \mathcal{N}(\mathbf{w} \mid \mathbf{w}_{\mathrm{MAP}}, \mathbf{S}_N), \qquad \mathbf{S}_N^{-1} = \mathbf{S}_0^{-1} + \sum_{n=1}^{N} y_n (1 - y_n) \, \boldsymbol{\phi}_n \boldsymbol{\phi}_n^T

Kernel Methods

Predictions are linear combinations of a kernel function evaluated at the training data points. A kernel function corresponds to an inner product in a feature space:

  k(\mathbf{x}, \mathbf{x}') = \phi(\mathbf{x})^T \phi(\mathbf{x}')

Linear kernel: k(\mathbf{x}, \mathbf{x}') = \mathbf{x}^T \mathbf{x}'. Stationary kernels: k(\mathbf{x}, \mathbf{x}') = k(\mathbf{x} - \mathbf{x}').

Fast Evaluation of Inner Products of Feature Mappings by Kernel Functions

With the feature mapping \phi(\mathbf{x}) = (x_1^2, \sqrt{2}\, x_1 x_2, x_2^2)^T:

  k(\mathbf{x}, \mathbf{z}) = (\mathbf{x}^T \mathbf{z})^2 = (x_1 z_1 + x_2 z_2)^2 = x_1^2 z_1^2 + 2 x_1 z_1 x_2 z_2 + x_2^2 z_2^2 = \phi(\mathbf{x})^T \phi(\mathbf{z})

Computing the inner product explicitly requires computing six feature values and 9 multiplications; the kernel function needs only 2 multiplications and a squaring.

Kernel Trick

1. Reformulate an algorithm such that the input vector \mathbf{x} enters only in the form of inner products \mathbf{x}^T \mathbf{x}'.
2. Replace the input \mathbf{x} by its feature mapping \phi(\mathbf{x}).
3. Replace the inner product by a kernel function: k(\mathbf{x}, \mathbf{x}') = \phi(\mathbf{x})^T \phi(\mathbf{x}').

Examples: kernel PCA, kernel Fisher discriminant, support vector machines.

Dual Representation for Ridge Regression

  J(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \left( \mathbf{w}^T \phi(\mathbf{x}_n) - t_n \right)^2 + \frac{\lambda}{2} \mathbf{w}^T \mathbf{w}

Setting the gradient with respect to \mathbf{w} to zero,

  \mathbf{w} = -\frac{1}{\lambda} \sum_{n=1}^{N} \left( \mathbf{w}^T \phi(\mathbf{x}_n) - t_n \right) \phi(\mathbf{x}_n) = \sum_{n=1}^{N} a_n \phi(\mathbf{x}_n) = \Phi^T \mathbf{a}

with dual variables a_n = -\frac{1}{\lambda} \left( \mathbf{w}^T \phi(\mathbf{x}_n) - t_n \right). Substituting back,

  J(\mathbf{a}) = \frac{1}{2} \mathbf{a}^T \Phi \Phi^T \Phi \Phi^T \mathbf{a} - \mathbf{a}^T \Phi \Phi^T \mathbf{t} + \frac{1}{2} \mathbf{t}^T \mathbf{t} + \frac{\lambda}{2} \mathbf{a}^T \Phi \Phi^T \mathbf{a}

Kernel Ridge Regression

Using the kernel trick, K_{nm} = \phi(\mathbf{x}_n)^T \phi(\mathbf{x}_m) = k(\mathbf{x}_n, \mathbf{x}_m):

  J(\mathbf{a}) = \frac{1}{2} \mathbf{a}^T K K \mathbf{a} - \mathbf{a}^T K \mathbf{t} + \frac{1}{2} \mathbf{t}^T \mathbf{t} + \frac{\lambda}{2} \mathbf{a}^T K \mathbf{a}

Minimizing over the dual variables gives

  \mathbf{a} = (K + \lambda I_N)^{-1} \mathbf{t}

  y(\mathbf{x}) = \mathbf{w}^T \phi(\mathbf{x}) = \mathbf{a}^T \Phi \, \phi(\mathbf{x}) = \mathbf{k}(\mathbf{x})^T (K + \lambda I_N)^{-1} \mathbf{t}

Constructing Kernels

Given valid (positive semidefinite) kernels k_1 and k_2, the following are also valid kernels:

  k(\mathbf{x}, \mathbf{x}') = c \, k_1(\mathbf{x}, \mathbf{x}')
  k(\mathbf{x}, \mathbf{x}') = f(\mathbf{x}) \, k_1(\mathbf{x}, \mathbf{x}') \, f(\mathbf{x}')
  k(\mathbf{x}, \mathbf{x}') = q(k_1(\mathbf{x}, \mathbf{x}'))   (q a polynomial with nonnegative coefficients)
  k(\mathbf{x}, \mathbf{x}') = \exp(k_1(\mathbf{x}, \mathbf{x}'))
  k(\mathbf{x}, \mathbf{x}') = k_1(\mathbf{x}, \mathbf{x}') + k_2(\mathbf{x}, \mathbf{x}')
  k(\mathbf{x}, \mathbf{x}') = k_1(\mathbf{x}, \mathbf{x}') \, k_2(\mathbf{x}, \mathbf{x}')
  k(\mathbf{x}, \mathbf{x}') = k_3(\phi(\mathbf{x}), \phi(\mathbf{x}'))
  k(\mathbf{x}, \mathbf{x}') = \mathbf{x}^T A \mathbf{x}'   (A positive semidefinite)
  k(\mathbf{x}, \mathbf{x}') = k_a(\mathbf{x}_a, \mathbf{x}_a') + k_b(\mathbf{x}_b, \mathbf{x}_b')
  k(\mathbf{x}, \mathbf{x}') = k_a(\mathbf{x}_a, \mathbf{x}_a') \, k_b(\mathbf{x}_b, \mathbf{x}_b')

Consider the Gaussian kernel

  k(\mathbf{x}, \mathbf{x}') = \exp\left( -\|\mathbf{x} - \mathbf{x}'\|^2 / 2\sigma^2 \right)

Since \|\mathbf{x} - \mathbf{x}'\|^2 = \mathbf{x}^T \mathbf{x} + \mathbf{x}'^T \mathbf{x}' - 2\mathbf{x}^T \mathbf{x}', we can write

  k(\mathbf{x}, \mathbf{x}') = \exp(-\mathbf{x}^T \mathbf{x} / 2\sigma^2) \, \exp(\mathbf{x}^T \mathbf{x}' / \sigma^2) \, \exp(-\mathbf{x}'^T \mathbf{x}' / 2\sigma^2)

which is valid by the rules above.

Combining Generative and Discriminative Models by Kernels

Since each modeling approach has distinct advantages, how can they be combined? Use generative models to construct kernels, and use these kernels in discriminative approaches.

Measuring Probability Similarity by Kernels

Simple inner product:

  k(\mathbf{x}, \mathbf{x}') = p(\mathbf{x}) \, p(\mathbf{x}')

For a mixture distribution:

  k(\mathbf{x}, \mathbf{x}') = \sum_i p(\mathbf{x} \mid i) \, p(\mathbf{x}' \mid i) \, p(i)

For infinite mixture models:

  k(\mathbf{x}, \mathbf{x}') = \int p(\mathbf{x} \mid \mathbf{z}) \, p(\mathbf{x}' \mid \mathbf{z}) \, p(\mathbf{z}) \, d\mathbf{z}

For models with latent variables, e.g. hidden Markov models:

  k(\mathbf{X}, \mathbf{X}') = \sum_{\mathbf{Z}} p(\mathbf{X} \mid \mathbf{Z}) \, p(\mathbf{X}' \mid \mathbf{Z}) \, p(\mathbf{Z})

Fisher Kernels

Fisher score:  g(\theta, \mathbf{x}) = \nabla_\theta \ln p(\mathbf{x} \mid \theta)
Fisher information matrix:  F = \mathbb{E}_{\mathbf{x}}\left[ g(\theta, \mathbf{x}) \, g(\theta, \mathbf{x})^T \right]
Fisher kernel:  k(\mathbf{x}, \mathbf{x}') = g(\theta, \mathbf{x})^T F^{-1} g(\theta, \mathbf{x}')
Sample average:  F \simeq \frac{1}{N} \sum_{n=1}^{N} g(\theta, \mathbf{x}_n) \, g(\theta, \mathbf{x}_n)^T

Principal Component Analysis (PCA)

Assume \sum_n \mathbf{x}_n = 0. We have

  S \mathbf{u}_i = \lambda_i \mathbf{u}_i, \qquad S = \frac{1}{N} \sum_{n=1}^{N} \mathbf{x}_n \mathbf{x}_n^T

where \mathbf{u}_i is a normalized eigenvector, \mathbf{u}_i^T \mathbf{u}_i = 1.

Feature mapping: replacing \mathbf{x}_n by \phi(\mathbf{x}_n) gives the covariance

  C = \frac{1}{N} \sum_{n=1}^{N} \phi(\mathbf{x}_n) \, \phi(\mathbf{x}_n)^T

and the eigenproblem in feature space C \mathbf{v}_i = \lambda_i \mathbf{v}_i.

Dual variables: \mathbf{v}_i = \frac{1}{N \lambda_i} \sum_n \phi(\mathbf{x}_n) \left( \phi(\mathbf{x}_n)^T \mathbf{v}_i \right); assuming \lambda_i > 0, we can write \mathbf{v}_i = \sum_{n=1}^{N} a_{in} \phi(\mathbf{x}_n).

Eigenproblem in feature space:

  \frac{1}{N} \sum_{n=1}^{N} \phi(\mathbf{x}_n) \phi(\mathbf{x}_n)^T \sum_{m=1}^{N} a_{im} \phi(\mathbf{x}_m) = \lambda_i \sum_{n=1}^{N} a_{in} \phi(\mathbf{x}_n)

Multiplying both sides by \phi(\mathbf{x}_l)^T:

  \frac{1}{N} \sum_{n=1}^{N} k(\mathbf{x}_l, \mathbf{x}_n) \sum_{m=1}^{N} a_{im} \, k(\mathbf{x}_n, \mathbf{x}_m) = \lambda_i \sum_{n=1}^{N} a_{in} \, k(\mathbf{x}_l, \mathbf{x}_n)

In matrix form, K^2 \mathbf{a}_i = \lambda_i N K \mathbf{a}_i, whose solutions of interest satisfy

  K \mathbf{a}_i = \lambda_i N \mathbf{a}_i

Normalization condition:

  1 = \mathbf{v}_i^T \mathbf{v}_i = \sum_{n=1}^{N} \sum_{m=1}^{N} a_{in} a_{im} \, \phi(\mathbf{x}_n)^T \phi(\mathbf{x}_m) = \mathbf{a}_i^T K \mathbf{a}_i = \lambda_i N \mathbf{a}_i^T \mathbf{a}_i

Projection coefficient:

  y_i(\mathbf{x}) = \phi(\mathbf{x})^T \mathbf{v}_i = \sum_{n=1}^{N} a_{in} \, \phi(\mathbf{x})^T \phi(\mathbf{x}_n) = \sum_{n=1}^{N} a_{in} \, k(\mathbf{x}, \mathbf{x}_n)

General case for nonzero mean: the centered kernel matrix is

  \tilde{K} = K - \mathbf{1}_N K - K \mathbf{1}_N + \mathbf{1}_N K \mathbf{1}_N

where \mathbf{1}_N denotes the N \times N matrix in which every element takes the value 1/N.

Kernel PCA on synthetic data: [figure omitted: contour plots of projection coefficients in feature space, Gaussian kernel k(\mathbf{x}, \mathbf{x}') = \exp(-\|\mathbf{x} - \mathbf{x}'\|^2 / 0.1)]

Limitations of kernel PCA; discussion.
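The evidence formula Z \simeq f(\mathbf{z}_0) (2\pi)^{M/2} / |A|^{1/2} from the notes can be checked numerically in one dimension. The sketch below, which is not from the lecture, Laplace-approximates the normalizer of an unnormalized gamma density, chosen because its exact normalizer \Gamma(a)/b^a is known in closed form; the shape and rate values a, b are arbitrary choices.

```python
import math

# Laplace approximation of a normalizing constant, following
# Z ~= f(z0) * (2*pi)^(M/2) / |A|^(1/2) with M = 1.
# Toy target (assumed for illustration): unnormalized gamma density
# f(z) = z^(a-1) * exp(-b*z), exact normalizer Gamma(a) / b^a.
a, b = 10.0, 2.0                          # assumed shape and rate
z0 = (a - 1.0) / b                        # mode: d/dz ln f(z) = 0
A = (a - 1.0) / z0**2                     # A = -d^2/dz^2 ln f(z) at z0
f_z0 = z0**(a - 1.0) * math.exp(-b * z0)  # f evaluated at the mode
Z_laplace = f_z0 * math.sqrt(2.0 * math.pi / A)
Z_exact = math.gamma(a) / b**a
print(Z_laplace, Z_exact)                 # agree to about 1% here
```

For large a the density is nearly Gaussian around its mode, so the approximation is tight; for small a it degrades, which is one motivation for the "more accurate evidence approximation" remark in the notes.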
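The fast-evaluation example, \phi(\mathbf{x}) = (x_1^2, \sqrt{2} x_1 x_2, x_2^2)^T with k(\mathbf{x}, \mathbf{z}) = (\mathbf{x}^T \mathbf{z})^2, can also be verified directly; this is a small sketch, with the test vectors chosen arbitrarily.

```python
import numpy as np

# Check the identity k(x, z) = (x^T z)^2 = phi(x)^T phi(z)
# for the feature map phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2).
def phi(x):
    return np.array([x[0]**2, np.sqrt(2.0) * x[0] * x[1], x[1]**2])

def k(x, z):
    return float(x @ z) ** 2

x = np.array([1.0, 2.0])   # arbitrary test vectors
z = np.array([3.0, -1.0])
print(phi(x) @ phi(z))     # explicit inner product in feature space
print(k(x, z))             # kernel evaluation: same value, fewer operations
```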
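The kernel ridge regression solution \mathbf{a} = (K + \lambda I_N)^{-1} \mathbf{t} and predictor y(\mathbf{x}) = \mathbf{k}(\mathbf{x})^T \mathbf{a} fit in a few lines. The sketch below is an illustration rather than the lecture's code; the Gaussian kernel width, the ridge parameter, and the noisy-sine toy data are all assumptions.

```python
import numpy as np

# Kernel ridge regression sketch: dual variables a = (K + lam*I)^(-1) t,
# prediction y(x) = k(x)^T a, with a Gaussian kernel.
def gauss_kernel(X, Z, sigma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # squared distances
    return np.exp(-d2 / (2.0 * sigma**2))

def krr_fit(X, t, lam=0.1):
    K = gauss_kernel(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), t)  # dual variables a

def krr_predict(X_train, a, X_new):
    return gauss_kernel(X_new, X_train) @ a              # y(x) = k(x)^T a

rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(50, 1))                 # assumed toy data
t = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
a = krr_fit(X, t)
pred = krr_predict(X, a, X)
print(np.abs(pred - t).mean())  # small residual on the training points
```

Note that the dual solve costs O(N^3) in the number of training points rather than scaling with the feature dimension, which is the point of the dual representation.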
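The kernel PCA steps above (center the Gram matrix, solve K \mathbf{a}_i = \lambda_i N \mathbf{a}_i, rescale so \mathbf{a}_i^T K \mathbf{a}_i = 1, then project) can likewise be sketched. Again this is a hedged illustration, with an assumed Gaussian kernel width and random data.

```python
import numpy as np

# Kernel PCA sketch following the notes: center K, take the top
# eigenvectors, rescale to satisfy a_i^T K a_i = 1, and compute
# projection coefficients y_i(x_n) = sum_m a_im k(x_n, x_m).
def kernel_pca(X, n_components=2, sigma=1.0):
    N = len(X)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * sigma**2))                   # Gaussian Gram matrix
    one_N = np.full((N, N), 1.0 / N)
    Kc = K - one_N @ K - K @ one_N + one_N @ K @ one_N   # centered K~
    evals, evecs = np.linalg.eigh(Kc)                    # ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]           # reorder descending
    top = evals[:n_components]
    a = evecs[:, :n_components] / np.sqrt(top)           # a_i^T K a_i = 1
    lam = top / N                                        # lambda_i = eval_i / N
    return Kc @ a, lam                                   # projections, eigenvalues

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 3))                         # assumed random data
Y, lam = kernel_pca(X)
print(Y.shape)  # one row of projection coefficients per data point
```

As the notes' "general case" slide requires, the centering is done on the Gram matrix itself, since the feature vectors \phi(\mathbf{x}_n) are never computed explicitly.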
