by: Nick Rowe


# Computing For Life Sciences CS 59000

Nick Rowe
Purdue

Yuan Qi


- COURSE: CS 59000
- PROF.: Yuan Qi
- TYPE: Class Notes
- PAGES: 56


These 56-page class notes were uploaded by Nick Rowe on Saturday, September 19, 2015. They belong to CS 59000 at Purdue University, taught by Yuan Qi in Fall. Since the upload, they have received 21 views. For similar materials see /class/208074/cs-59000-purdue-university in Computer Science at Purdue University.


Date Created: 09/19/15
**CS 59000 Statistical Machine Learning — Lecture 2**
Yuan Qi, Purdue University, Fall 2008

#### Review: Polynomial Curve Fitting

$$y(x, \mathbf{w}) = w_0 + w_1 x + w_2 x^2 + \dots + w_M x^M = \sum_{j=0}^{M} w_j x^j$$

Sum-of-squares error function:

$$E(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \{ y(x_n, \mathbf{w}) - t_n \}^2$$

[Figures: 1st-, 3rd-, and 9th-order polynomial fits to the training data; the 9th-order fit overfits.]

Root-mean-square (RMS) error:

$$E_{\mathrm{RMS}} = \sqrt{2 E(\mathbf{w}^\star)/N}$$

[Figure: training and test $E_{\mathrm{RMS}}$ as a function of the polynomial order $M$.]

Polynomial coefficients:

| | $M=0$ | $M=1$ | $M=3$ | $M=9$ |
|---|---|---|---|---|
| $w_0^\star$ | 0.19 | 0.82 | 0.31 | 0.35 |
| $w_1^\star$ | | -1.27 | 7.99 | 232.37 |
| $w_2^\star$ | | | -25.43 | -5321.83 |
| $w_3^\star$ | | | 17.37 | 48568.31 |
| $w_4^\star$ | | | | -231639.30 |
| $w_5^\star$ | | | | 640042.26 |
| $w_6^\star$ | | | | -1061800.52 |
| $w_7^\star$ | | | | 1042400.18 |
| $w_8^\star$ | | | | -557682.99 |
| $w_9^\star$ | | | | 125201.43 |

#### Regularization

Penalize large coefficient values:

$$\tilde{E}(\mathbf{w}) = \frac{1}{2} \sum_{n=1}^{N} \{ y(x_n, \mathbf{w}) - t_n \}^2 + \frac{\lambda}{2} \|\mathbf{w}\|^2$$

[Figures: 9th-order fits with $\ln\lambda = -18$ and $\ln\lambda = 0$; plot of training and test $E_{\mathrm{RMS}}$ vs. $\ln\lambda$.]

Coefficients of the 9th-order polynomial under increasing regularization:

| | $\ln\lambda = -\infty$ | $\ln\lambda = -18$ | $\ln\lambda = 0$ |
|---|---|---|---|
| $w_0^\star$ | 0.35 | 0.35 | 0.13 |
| $w_1^\star$ | 232.37 | 4.74 | -0.05 |
| $w_2^\star$ | -5321.83 | -0.77 | -0.06 |
| $w_3^\star$ | 48568.31 | -31.97 | -0.05 |
| $w_4^\star$ | -231639.30 | -3.89 | -0.03 |
| $w_5^\star$ | 640042.26 | 55.28 | -0.02 |
| $w_6^\star$ | -1061800.52 | 41.32 | -0.01 |
| $w_7^\star$ | 1042400.18 | -45.95 | -0.00 |
| $w_8^\star$ | -557682.99 | -91.53 | 0.00 |
| $w_9^\star$ | 125201.43 | 72.68 | 0.01 |

#### Training Data: The More, the Better

[Figures: 9th-order polynomial fits for data set sizes $N = 15$ and $N = 100$; overfitting diminishes as $N$ grows.]

#### Review: Probability Theory

With $N$ trials, count $n_{ij}$ in cell $(i, j)$, column total $c_i$, and row total $r_j$:

- Joint probability: $p(X = x_i, Y = y_j) = n_{ij}/N$
- Marginal probability: $p(X = x_i) = c_i/N$
- Conditional probability: $p(Y = y_j \mid X = x_i) = n_{ij}/c_i$

The rules of probability:

- Sum rule: $p(X) = \sum_Y p(X, Y)$
- Product rule: $p(X, Y) = p(Y \mid X)\, p(X)$

Bayes' theorem:

$$p(Y \mid X) = \frac{p(X \mid Y)\, p(Y)}{p(X)}, \qquad p(X) = \sum_Y p(X \mid Y)\, p(Y)$$

posterior $\propto$ likelihood $\times$ prior

#### Probability Density and Cumulative Distribution Functions

$$p(x \in (a, b)) = \int_a^b p(x)\, dx, \qquad P(z) = \int_{-\infty}^{z} p(x)\, dx$$

Transformed densities: with $x = g(y)$,

$$p_y(y) = p_x(g(y))\, |g'(y)|$$

#### Expectations

$$\mathbb{E}[f] = \sum_x p(x)\, f(x), \qquad \mathbb{E}[f] = \int p(x)\, f(x)\, dx$$

Conditional expectation (discrete): $\mathbb{E}_x[f \mid y] = \sum_x p(x \mid y)\, f(x)$

Approximate expectation (discrete and continuous):

$$\mathbb{E}[f] \simeq \frac{1}{N} \sum_{n=1}^{N} f(x_n)$$

#### Variances and Covariances

$$\mathrm{var}[f] = \mathbb{E}\big[(f(x) - \mathbb{E}[f(x)])^2\big] = \mathbb{E}[f(x)^2] - \mathbb{E}[f(x)]^2$$

$$\mathrm{cov}[x, y] = \mathbb{E}_{x,y}\big[(x - \mathbb{E}[x])(y - \mathbb{E}[y])\big] = \mathbb{E}_{x,y}[xy] - \mathbb{E}[x]\,\mathbb{E}[y]$$

For vectors:

$$\mathrm{cov}[\mathbf{x}, \mathbf{y}] = \mathbb{E}_{\mathbf{x},\mathbf{y}}\big[(\mathbf{x} - \mathbb{E}[\mathbf{x}])(\mathbf{y}^{\mathrm T} - \mathbb{E}[\mathbf{y}^{\mathrm T}])\big] = \mathbb{E}_{\mathbf{x},\mathbf{y}}[\mathbf{x}\mathbf{y}^{\mathrm T}] - \mathbb{E}[\mathbf{x}]\,\mathbb{E}[\mathbf{y}^{\mathrm T}]$$

#### The Gaussian Distribution

$$\mathcal{N}(x \mid \mu, \sigma^2) = \frac{1}{(2\pi\sigma^2)^{1/2}} \exp\left\{ -\frac{(x - \mu)^2}{2\sigma^2} \right\}$$

with $\mathcal{N}(x \mid \mu, \sigma^2) > 0$ and $\int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\, dx = 1$.

Gaussian mean and variance:

$$\mathbb{E}[x] = \int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\, x\, dx = \mu$$

$$\mathbb{E}[x^2] = \int_{-\infty}^{\infty} \mathcal{N}(x \mid \mu, \sigma^2)\, x^2\, dx = \mu^2 + \sigma^2$$

$$\mathrm{var}[x] = \mathbb{E}[x^2] - \mathbb{E}[x]^2 = \sigma^2$$

#### The Multivariate Gaussian

$$\mathcal{N}(\mathbf{x} \mid \boldsymbol{\mu}, \boldsymbol{\Sigma}) = \frac{1}{(2\pi)^{D/2}\, |\boldsymbol{\Sigma}|^{1/2}} \exp\left\{ -\frac{1}{2} (\mathbf{x} - \boldsymbol{\mu})^{\mathrm T} \boldsymbol{\Sigma}^{-1} (\mathbf{x} - \boldsymbol{\mu}) \right\}$$
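The regularized sum-of-squares fit above has a closed-form solution, $\mathbf{w} = (\lambda\mathbf{I} + \boldsymbol{\Phi}^{\mathrm T}\boldsymbol{\Phi})^{-1}\boldsymbol{\Phi}^{\mathrm T}\mathbf{t}$, which is easy to try on toy data. A minimal sketch with NumPy, assuming the usual $\sin(2\pi x)$ toy target with Gaussian noise of standard deviation 0.3 (the slides do not state the data-generating details; those numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: t = sin(2*pi*x) + Gaussian noise (assumed setup)
N = 10
x = np.linspace(0, 1, N)
t = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=N)

def design_matrix(x, M):
    """Polynomial features: columns x^0, x^1, ..., x^M."""
    return np.vander(x, M + 1, increasing=True)

def fit(x, t, M, lam=0.0):
    """Minimize the (regularized) sum-of-squares error in closed form:
    w = (lam*I + Phi^T Phi)^{-1} Phi^T t."""
    Phi = design_matrix(x, M)
    A = lam * np.eye(M + 1) + Phi.T @ Phi
    return np.linalg.solve(A, Phi.T @ t)

def e_rms(x, t, w):
    """Root-mean-square error: E_RMS = sqrt(2 E(w) / N)."""
    Phi = design_matrix(x, len(w) - 1)
    return np.sqrt(np.mean((Phi @ w - t) ** 2))

w_unreg = fit(x, t, M=9)                  # ln(lambda) = -inf: overfits
w_reg = fit(x, t, M=9, lam=np.exp(-18))   # ln(lambda) = -18: tamer fit

print(e_rms(x, t, w_unreg), e_rms(x, t, w_reg))
```

The unregularized 9th-order fit drives the training error toward zero (it can interpolate all 10 points), while the ridge penalty trades a little training error for much smaller coefficients, mirroring the coefficient tables above.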
#### Gaussian Parameter Estimation

Likelihood function for i.i.d. data $\mathbf{x} = (x_1, \dots, x_N)^{\mathrm T}$:

$$p(\mathbf{x} \mid \mu, \sigma^2) = \prod_{n=1}^{N} \mathcal{N}(x_n \mid \mu, \sigma^2)$$

Maximum (log) likelihood:

$$\ln p(\mathbf{x} \mid \mu, \sigma^2) = -\frac{1}{2\sigma^2} \sum_{n=1}^{N} (x_n - \mu)^2 - \frac{N}{2} \ln \sigma^2 - \frac{N}{2} \ln(2\pi)$$

$$\mu_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} x_n, \qquad \sigma^2_{\mathrm{ML}} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{\mathrm{ML}})^2$$

Properties of $\mu_{\mathrm{ML}}$ and $\sigma^2_{\mathrm{ML}}$:

$$\mathbb{E}[\mu_{\mathrm{ML}}] = \mu \quad \text{(unbiased)}, \qquad \mathbb{E}[\sigma^2_{\mathrm{ML}}] = \left( \frac{N-1}{N} \right) \sigma^2 \quad \text{(biased)}$$

#### Curve Fitting Revisited

Model the target as Gaussian around the polynomial:

$$p(t \mid x, \mathbf{w}, \beta) = \mathcal{N}(t \mid y(x, \mathbf{w}), \beta^{-1})$$

Maximum likelihood:

$$p(\mathbf{t} \mid \mathbf{x}, \mathbf{w}, \beta) = \prod_{n=1}^{N} \mathcal{N}(t_n \mid y(x_n, \mathbf{w}), \beta^{-1})$$

$$\ln p(\mathbf{t} \mid \mathbf{x}, \mathbf{w}, \beta) = -\frac{\beta}{2} \sum_{n=1}^{N} \{ y(x_n, \mathbf{w}) - t_n \}^2 + \frac{N}{2} \ln \beta - \frac{N}{2} \ln(2\pi)$$

Determine $\mathbf{w}_{\mathrm{ML}}$ by minimizing the sum-of-squares error; then

$$\frac{1}{\beta_{\mathrm{ML}}} = \frac{1}{N} \sum_{n=1}^{N} \{ y(x_n, \mathbf{w}_{\mathrm{ML}}) - t_n \}^2$$

Predictive distribution:

$$p(t \mid x, \mathbf{w}_{\mathrm{ML}}, \beta_{\mathrm{ML}}) = \mathcal{N}(t \mid y(x, \mathbf{w}_{\mathrm{ML}}), \beta_{\mathrm{ML}}^{-1})$$

#### MAP: A Step Towards Bayes

Place a Gaussian prior on the weights:

$$p(\mathbf{w} \mid \alpha) = \mathcal{N}(\mathbf{w} \mid \mathbf{0}, \alpha^{-1}\mathbf{I}) = \left( \frac{\alpha}{2\pi} \right)^{(M+1)/2} \exp\left\{ -\frac{\alpha}{2} \mathbf{w}^{\mathrm T}\mathbf{w} \right\}$$

$$p(\mathbf{w} \mid \mathbf{x}, \mathbf{t}, \alpha, \beta) \propto p(\mathbf{t} \mid \mathbf{x}, \mathbf{w}, \beta)\, p(\mathbf{w} \mid \alpha)$$

Determine $\mathbf{w}_{\mathrm{MAP}}$ by minimizing the regularized sum-of-squares error:

$$\frac{\beta}{2} \sum_{n=1}^{N} \{ y(x_n, \mathbf{w}) - t_n \}^2 + \frac{\alpha}{2} \mathbf{w}^{\mathrm T}\mathbf{w}$$

#### Bayesian Curve Fitting

Integrate over $\mathbf{w}$ rather than committing to a point estimate:

$$p(t \mid x, \mathbf{x}, \mathbf{t}) = \int p(t \mid x, \mathbf{w})\, p(\mathbf{w} \mid \mathbf{x}, \mathbf{t})\, d\mathbf{w} = \mathcal{N}(t \mid m(x), s^2(x))$$

where

$$m(x) = \beta\, \boldsymbol{\phi}(x)^{\mathrm T} \mathbf{S} \sum_{n=1}^{N} \boldsymbol{\phi}(x_n)\, t_n, \qquad s^2(x) = \beta^{-1} + \boldsymbol{\phi}(x)^{\mathrm T} \mathbf{S}\, \boldsymbol{\phi}(x)$$

$$\mathbf{S}^{-1} = \alpha \mathbf{I} + \beta \sum_{n=1}^{N} \boldsymbol{\phi}(x_n)\, \boldsymbol{\phi}(x_n)^{\mathrm T}, \qquad \boldsymbol{\phi}(x) = (x^0, \dots, x^M)^{\mathrm T}$$

[Figure: Bayesian predictive distribution $p(t \mid x, \mathbf{x}, \mathbf{t}) = \mathcal{N}(t \mid m(x), s^2(x))$ as a band around the fitted curve.]

#### Model Selection via Cross-Validation

[Figure: $S$-fold cross-validation with $S = 4$; each run holds out a different fold for validation and trains on the rest.]

#### Curse of Dimensionality

Polynomial curve fitting in $D$ dimensions (3rd order):

$$y(\mathbf{x}, \mathbf{w}) = w_0 + \sum_{i=1}^{D} w_i x_i + \sum_{i=1}^{D} \sum_{j=1}^{D} w_{ij}\, x_i x_j + \sum_{i=1}^{D} \sum_{j=1}^{D} \sum_{k=1}^{D} w_{ijk}\, x_i x_j x_k$$

The number of coefficients grows rapidly with $D$. [Figure: Gaussian densities in higher dimensions.]

#### Decision Theory

- Inference step: determine $p(t \mid \mathbf{x})$ or $p(\mathbf{x}, t)$.
- Decision step: for a given $\mathbf{x}$, determine the optimal $t$.

Minimum misclassification rate:

$$p(\text{mistake}) = p(\mathbf{x} \in \mathcal{R}_1, \mathcal{C}_2) + p(\mathbf{x} \in \mathcal{R}_2, \mathcal{C}_1) = \int_{\mathcal{R}_1} p(\mathbf{x}, \mathcal{C}_2)\, d\mathbf{x} + \int_{\mathcal{R}_2} p(\mathbf{x}, \mathcal{C}_1)\, d\mathbf{x}$$

Minimum expected loss. Example: classify medical images as "cancer" or "normal", with loss matrix (rows: truth, columns: decision):

| | decide cancer | decide normal |
|---|---|---|
| truth: cancer | 0 | 1000 |
| truth: normal | 1 | 0 |

$$\mathbb{E}[L] = \sum_k \sum_j \int_{\mathcal{R}_j} L_{kj}\, p(\mathbf{x}, \mathcal{C}_k)\, d\mathbf{x}$$

The regions $\mathcal{R}_j$ are chosen to minimize $\sum_k L_{kj}\, p(\mathcal{C}_k \mid \mathbf{x})$.

Reject option: [Figure: posterior class probabilities with a reject region where the largest posterior falls below a threshold $\theta$.]

#### Decision Theory for Regression

- Inference step: determine $p(\mathbf{x}, t)$.
- Decision step: for a given $\mathbf{x}$, make an optimal prediction $y(\mathbf{x})$ for $t$.

Loss function:

$$\mathbb{E}[L] = \iint L(t, y(\mathbf{x}))\, p(\mathbf{x}, t)\, d\mathbf{x}\, dt$$

The squared loss function: minimize

$$\mathbb{E}[L] = \iint \{ y(\mathbf{x}) - t \}^2\, p(\mathbf{x}, t)\, d\mathbf{x}\, dt$$

whose minimizer is the conditional mean, $y(\mathbf{x}) = \mathbb{E}_t[t \mid \mathbf{x}]$.

#### Entropy

$$H[x] = -\sum_x p(x) \log_2 p(x)$$

An important quantity in coding theory, statistical physics, and machine learning. In coding theory: for a discrete variable $x$ with 8 possible states, how many bits are needed to transmit the state of $x$?
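The bias of the ML variance estimator, $\mathbb{E}[\sigma^2_{\mathrm{ML}}] = \frac{N-1}{N}\sigma^2$, is easy to verify by simulation. A minimal Monte Carlo sketch (the parameter values and trial count are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2, N = 2.0, 4.0, 5

# Draw many datasets of size N and compute both ML estimators for each.
trials = 200_000
X = rng.normal(mu, np.sqrt(sigma2), size=(trials, N))
mu_ml = X.mean(axis=1)
var_ml = ((X - mu_ml[:, None]) ** 2).mean(axis=1)  # divides by N, not N-1

print(mu_ml.mean())   # ~ mu = 2.0                    (unbiased)
print(var_ml.mean())  # ~ (N-1)/N * sigma2 = 3.2      (biased low)
```

With $N = 5$, the average ML variance settles near $3.2$ rather than the true $4.0$, which is exactly the $\frac{N-1}{N}$ shrinkage that motivates the usual $N-1$ correction.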
All states equally likely:

$$H[x] = -8 \times \frac{1}{8} \log_2 \frac{1}{8} = 3 \text{ bits}$$

A non-uniform distribution over states $a, \dots, h$:

| state | a | b | c | d | e | f | g | h |
|---|---|---|---|---|---|---|---|---|
| probability | 1/2 | 1/4 | 1/8 | 1/16 | 1/64 | 1/64 | 1/64 | 1/64 |
| code | 0 | 10 | 110 | 1110 | 111100 | 111101 | 111110 | 111111 |

$$H[x] = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{4}\log_2\frac{1}{4} - \frac{1}{8}\log_2\frac{1}{8} - \frac{1}{16}\log_2\frac{1}{16} - \frac{4}{64}\log_2\frac{1}{64} = 2 \text{ bits}$$

$$\text{average code length} = \frac{1}{2}\times 1 + \frac{1}{4}\times 2 + \frac{1}{8}\times 3 + \frac{1}{16}\times 4 + 4\times\frac{1}{64}\times 6 = 2 \text{ bits}$$

In how many ways can $N$ identical objects be allocated among bins? $W = N! / \prod_i n_i!$; entropy is maximized when all bins are equally likely, $p_i = 1/M$ for each of the $M$ bins. [Figure: two probability histograms with entropies $H = 1.77$ and $H = 3.09$.]

#### Differential Entropy

Put bins of width $\Delta$ along the real line:

$$\lim_{\Delta \to 0} \left\{ -\sum_i p(x_i)\,\Delta \ln p(x_i) \right\} = -\int p(x) \ln p(x)\, dx$$

Differential entropy is maximized, for fixed $\sigma^2$, when $p(x) = \mathcal{N}(x \mid \mu, \sigma^2)$, in which case

$$H[x] = \frac{1}{2}\left(1 + \ln(2\pi\sigma^2)\right)$$

#### Conditional Entropy

$$H[\mathbf{y} \mid \mathbf{x}] = -\iint p(\mathbf{y}, \mathbf{x}) \ln p(\mathbf{y} \mid \mathbf{x})\, d\mathbf{y}\, d\mathbf{x}$$

$$H[\mathbf{x}, \mathbf{y}] = H[\mathbf{y} \mid \mathbf{x}] + H[\mathbf{x}]$$

#### The Kullback–Leibler Divergence

$$\mathrm{KL}(p \| q) = -\int p(\mathbf{x}) \ln q(\mathbf{x})\, d\mathbf{x} - \left( -\int p(\mathbf{x}) \ln p(\mathbf{x})\, d\mathbf{x} \right) = -\int p(\mathbf{x}) \ln \left\{ \frac{q(\mathbf{x})}{p(\mathbf{x})} \right\} d\mathbf{x}$$

Sampling approximation:

$$\mathrm{KL}(p \| q) \simeq \frac{1}{N} \sum_{n=1}^{N} \left\{ -\ln q(\mathbf{x}_n \mid \boldsymbol{\theta}) + \ln p(\mathbf{x}_n) \right\}$$

Properties: $\mathrm{KL}(p \| q) \geq 0$, and in general $\mathrm{KL}(p \| q) \neq \mathrm{KL}(q \| p)$.

How to prove $\mathrm{KL}(p \| q) \geq 0$? Hint: convex functions and Jensen's inequality.
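The coding example and the KL properties above can be checked numerically. A small sketch (function names are my own; only the probabilities and code lengths come from the example):

```python
import numpy as np

def entropy(p):
    """H[x] = -sum p log2 p, with 0 log 0 taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def kl(p, q):
    """KL(p||q) = sum p ln(p/q), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

# Eight equally likely states: 3 bits.
print(entropy([1/8] * 8))  # 3.0

# The non-uniform distribution from the slides: entropy 2 bits,
# matched exactly by the average length of the variable-length code.
p = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]
code_lengths = [1, 2, 3, 4, 6, 6, 6, 6]
print(entropy(p))                                       # 2.0
print(sum(pi * li for pi, li in zip(p, code_lengths)))  # 2.0

# KL divergence is nonnegative and asymmetric.
q = [1/8] * 8
print(kl(p, q), kl(q, p))  # both positive, and not equal
```

The average code length equals the entropy here because every probability is a power of two, so the code lengths $-\log_2 p_i$ are integers.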
