by: Mr. Daryl Wiegand

## Popular in Operations And Info Mgmt

This 23-page set of Class Notes was uploaded by Mr. Daryl Wiegand on Monday, September 28, 2015. The Class Notes belong to OPIM913 at the University of Pennsylvania, taught by Staff in Fall. Since its upload, it has received 12 views. For similar materials see /class/215420/opim913-university-of-pennsylvania in Operations And Info Mgmt at the University of Pennsylvania.

Date Created: 09/28/15
### OPIM 913 Linear Optimization, Lecture 10: Regression

Outline:

- Means and Medians
- Least Squares Regression
- Least Absolute Deviation (LAD) Regression
- LAD via LP
- Average Complexity of the Primal-Dual Simplex Method

#### Means and Medians

Consider 1995 adjusted gross incomes on individual tax returns, $b_1, b_2, b_3, \ldots, b_m$; for example: 25462, 45110, 15505, 33265, 75420, ...

Table: 1995 Individual Income Tax Returns. All figures are estimates based on samples; money amounts are in thousands of dollars.

| Size of adjusted gross income | Number of returns | Adjusted gross income |
|---|---:|---:|
| All returns | 118,218,327 | 4,189,353,615 |
| No adjusted gross income | 944,141 | -65,253,648 |
| \$1 under \$5,000 | 14,646,131 | 37,604,828 |
| \$5,000 under \$10,000 | 13,982,404 | 104,503,355 |
| \$10,000 under \$15,000 | 13,562,088 | 169,317,443 |
| \$15,000 under \$20,000 | 11,385,632 | 198,418,324 |
| \$20,000 under \$25,000 | 9,970,099 | 223,400,219 |
| \$25,000 under \$30,000 | 7,847,862 | 215,200,244 |
| \$30,000 under \$40,000 | 12,380,339 | 430,491,242 |
| \$40,000 under \$50,000 | 9,098,750 | 405,538,597 |
| \$50,000 under \$75,000 | 13,679,023 | 828,349,278 |
| \$75,000 under \$100,000 | 5,374,489 | 458,505,550 |
| \$100,000 under \$200,000 | 4,074,852 | 532,030,480 |
| \$200,000 under \$500,000 | 1,007,135 | 292,117,517 |
| \$500,000 under \$1,000,000 | 175,374 | 120,347,093 |
| \$1,000,000 or more | 85,998 | 227,582,937 |
| Taxable returns | 89,252,989 | 4,007,580,441 |
| Nontaxable returns | 28,965,338 | 181,773,174 |

Median: half of 118,218,327 returns is 59,109,163.5, and the cumulative count of returns passes this value inside the \$20,000-under-\$25,000 bracket, so the median income is roughly \$22,500.

Mean:

$$\bar{x} = \frac{1}{m}\sum_{i=1}^{m} b_i = \frac{4{,}189{,}353{,}615{,}000}{118{,}218{,}327} \approx 35{,}437.$$

#### Mean's Connection with Optimization

$$\bar{x} = \operatorname*{argmin}_x \sum_{i=1}^{m}(x-b_i)^2$$

Proof. Let $f(x)=\sum_{i=1}^{m}(x-b_i)^2$. Then $f'(x)=2\sum_{i=1}^{m}(x-b_i)$, and $\lim_{x\to\pm\infty} f(x)=\infty$, so the critical point is a minimum. Setting $f'(\bar{x})=0$ gives $\sum_i (\bar{x}-b_i)=0$, i.e. $\bar{x}=\frac{1}{m}\sum_i b_i$.

#### Median's Connection with Optimization

$$\tilde{x} = \operatorname*{argmin}_x \sum_{i=1}^{m}\lvert x-b_i\rvert$$

Proof. Let $f(x)=\sum_{i=1}^{m}\lvert x-b_i\rvert$. Then

$$f'(x)=\sum_{i=1}^{m}\operatorname{sgn}(x-b_i), \qquad \operatorname{sgn}(x)=\begin{cases}1, & x>0,\\ 0, & x=0,\\ -1, & x<0,\end{cases}$$

so $f'(x)$ equals (the number of $b_i$'s smaller than $x$) minus (the number of $b_i$'s larger than $x$). If $m$ is odd, this derivative changes sign exactly at the middle value, so the minimizer is the median.

#### A Regression Model for Algorithm Efficiency

Let $T$ be the number of iterations, $m$ the number of constraints, and $n$ the number of variables.

Model:

$$T \approx 2^{\alpha}(m+n)^{\beta}.$$

Linearization: take logs,

$$\log T = \alpha\log 2 + \beta\log(m+n) + \epsilon,$$

where $\epsilon$ is the error, i.e. the difference from the model's prediction. Solve several ($k$) instances:

$$\begin{aligned}
\log T_1 &= \alpha\log 2 + \beta\log(m_1+n_1) + \epsilon_1\\
\log T_2 &= \alpha\log 2 + \beta\log(m_2+n_2) + \epsilon_2\\
&\ \ \vdots\\
\log T_k &= \alpha\log 2 + \beta\log(m_k+n_k) + \epsilon_k
\end{aligned}$$

In matrix notation: $b = Ax + \epsilon$. Goal: find $x = (\alpha, \beta)$ that makes the error $\epsilon$ small.
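The two characterizations above (the mean minimizes the sum of squared deviations, the median the sum of absolute deviations) can be checked numerically. Below is a minimal Python sketch using the five sample incomes quoted at the start of this section; the integer grid search is purely illustrative and not part of the lecture.

```python
# Sample incomes quoted at the start of the Means and Medians section.
b = [25462, 45110, 15505, 33265, 75420]

def sum_sq(x):
    """Least-squares objective: sum_i (x - b_i)^2."""
    return sum((x - bi) ** 2 for bi in b)

def sum_abs(x):
    """Least-absolute-deviation objective: sum_i |x - b_i|."""
    return sum(abs(x - bi) for bi in b)

# Brute-force search over integer candidate centers.
candidates = range(0, 100001)
x_sq = min(candidates, key=sum_sq)
x_abs = min(candidates, key=sum_abs)

mean = sum(b) / len(b)           # 38952.4
median = sorted(b)[len(b) // 2]  # 33265

print(x_sq)   # 38952 -- the mean, rounded to the integer grid
print(x_abs)  # 33265 -- exactly the median
```

For odd $m$ the absolute-deviation minimizer is exactly the middle order statistic, while the squared-deviation minimizer is the mean (here rounded to the integer grid).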
#### Least Squares Regression

Definition (Euclidean distance): $\lVert x\rVert_2 = \sqrt{\sum_i x_i^2}$.

Least squares regression:

$$\hat{x} = \operatorname*{argmin}_x \lVert b - Ax\rVert_2^2$$

Calculus (p. 193): write $f(x) = \lVert b-Ax\rVert_2^2 = \sum_i \bigl(b_i - \sum_j a_{ij}x_j\bigr)^2$. Then

$$\frac{\partial f}{\partial x_k} = -2\sum_i \Bigl(b_i - \sum_j a_{ij}\hat{x}_j\Bigr) a_{ik} = 0, \qquad k = 1,2,\ldots,n.$$

Rearranging,

$$\sum_i a_{ik} b_i = \sum_i \sum_j a_{ik} a_{ij}\hat{x}_j, \qquad k = 1,2,\ldots,n.$$

In matrix notation: $A^T b = A^T A\,\hat{x}$. Assuming $A^T A$ is invertible (which holds iff $A$ has full column rank),

$$\hat{x} = (A^T A)^{-1} A^T b.$$

#### Least Absolute Deviation Regression

Definition (Manhattan distance): $\lVert x\rVert_1 = \sum_i \lvert x_i\rvert$.

Least absolute deviation (LAD) regression:

$$\hat{x} = \operatorname*{argmin}_x \lVert b - Ax\rVert_1$$

Calculus: no explicit formula this time, but there is an iterative scheme. Write $f(x) = \lVert b-Ax\rVert_1 = \sum_i \bigl|b_i - \sum_j a_{ij}x_j\bigr|$ and put $\epsilon_i = \bigl|b_i - \sum_j a_{ij}x_j\bigr|$. Then

$$\frac{\partial f}{\partial x_k} = -\sum_i \frac{b_i - \sum_j a_{ij}x_j}{\epsilon_i}\, a_{ik} = 0, \qquad k = 1,2,\ldots,n.$$

Rearranging,

$$\sum_i \frac{a_{ik} b_i}{\epsilon_i} = \sum_i\sum_j \frac{a_{ik} a_{ij}\hat{x}_j}{\epsilon_i}, \qquad k = 1,2,\ldots,n.$$

In matrix notation: $A^T E b = A^T E A\,\hat{x}$, where $E = \operatorname{Diag}(1/\epsilon_i)$. Assuming $A^T E A$ is invertible,

$$\hat{x} = (A^T E A)^{-1} A^T E b, \qquad \text{where } E \text{ depends on } \hat{x}.$$

#### Least Absolute Deviation Regression, Continued

This is an implicit equation. It can be solved using successive approximations:

$$x^0 = 0, \qquad x^{k+1} = \bigl(A^T E(x^k) A\bigr)^{-1} A^T E(x^k)\, b, \qquad \hat{x} = \lim_{k\to\infty} x^k.$$

#### Least Absolute Deviation Regression via Linear Programming

$$\min_x\ \sum_i \Bigl|\, b_i - \sum_j a_{ij}x_j \Bigr|$$

Equivalent linear program (p. 196):

$$\min \sum_i t_i \qquad \text{subject to}\quad -t_i \le b_i - \sum_j a_{ij}x_j \le t_i, \quad i = 1,2,\ldots,m.$$

AMPL model:

```
param m;
param n;
set I := 1..m;
set J := 1..n;
param A {I,J};
param b {I};

var x {J};
var t {I};

minimize sumdev: sum {i in I} t[i];

subject to lowerbound {i in I}: -t[i] <= b[i] - sum {j in J} A[i,j]*x[j];
subject to upperbound {i in I}: b[i] - sum {j in J} A[i,j]*x[j] <= t[i];
```

#### Primal-Dual Simplex Method

A thought experiment:

- $\mu$ starts at $\infty$.
- In reducing it, there are $n + m$ barriers.
- At each iteration, one barrier is passed; the others move about randomly.
- To get $\mu$ to zero, we must on average pass half the barriers.
- Therefore, on average, the algorithm should take about $(m+n)/2$ iterations.

Fitting the model to the data below:

- Least squares regression: $T \approx 0.488\,(m+n)^{1.052}$.
- Least absolute deviation regression: $\hat{x} = (-0.9508,\ 1.0491)$, so $T = 2^{-0.9508}(m+n)^{1.0491} \approx 0.517\,(m+n)^{1.0491}$.

#### Primal-Dual Simplex Method: Data

| Name | m | n | iters | Name | m | n | iters |
|---|---:|---:|---:|---|---:|---:|---:|
| 25fv47 | 777 | 1545 | 5089 | nesm | 646 | 2740 | 5829 |
| 80bau3b | 2021 | 9195 | 10514 | recipe | 74 | 136 | 80 |
| adlittle | 53 | 96 | 141 | sc105 | 104 | 103 | 92 |
| afiro | 25 | 32 | 16 | sc205 | 203 | 202 | 191 |
| agg2 | 481 | 301 | 204 | sc50a | 49 | 48 | 46 |
| agg3 | 481 | 301 | 193 | sc50b | 48 | 48 | 53 |
| bandm | 224 | 379 | 1139 | scagr25 | 347 | 499 | 1336 |
| beaconfd | 111 | 172 | 113 | scagr7 | 95 | 139 | 339 |
| blend | 72 | 83 | 117 | scfxm1 | 282 | 439 | 531 |
| bnl1 | 564 | 1113 | 2580 | scfxm2 | 564 | 878 | 1197 |
| bnl2 | 1874 | 3134 | 6381 | scfxm3 | 846 | 1317 | 1886 |
| boeing1 | 298 | 373 | 619 | scorpion | 292 | 331 | 411 |
| boeing2 | 125 | 143 | 168 | scrs8 | 447 | 1131 | 783 |
| bore3d | 138 | 188 | 227 | scsd1 | 77 | 760 | 172 |
| brandy | 123 | 205 | 585 | scsd6 | 147 | 1350 | 494 |
| czprob | 689 | 2770 | 2635 | scsd8 | 397 | 2750 | 1548 |
| d6cube | 403 | 6183 | 5883 | sctap1 | 284 | 480 | 643 |
| degen2 | 444 | 534 | 1421 | sctap2 | 1033 | 1880 | 1037 |
| degen3 | 1503 | 1818 | 6398 | sctap3 | 1408 | 2480 | 1339 |
| e226 | 162 | 260 | 598 | seba | 449 | 896 | 766 |
| etamacro | 334 | 542 | 1580 | share1b | 107 | 217 | 404 |
| fffff800 | 476 | 817 | 1029 | share2b | 93 | 79 | 189 |
| finnis | 398 | 541 | 680 | shell | 487 | 1476 | 1155 |
| fit1d | 24 | 1026 | 925 | ship04l | 317 | 1915 | 597 |
| fit1p | 627 | 1677 | 15284 | ship04s | 241 | 1291 | 560 |
| forplan | 133 | 415 | 576 | ship08l | 520 | 3149 | 1091 |
| ganges | 1121 | 1493 | 2716 | ship08s | 326 | 1632 | 897 |
| greenbea | 1948 | 4131 | 21476 | ship12l | 687 | 4224 | 1654 |
| grow15 | 300 | 645 | 681 | ship12s | 417 | 1996 | 1360 |
| grow22 | 440 | 946 | 999 | sierra | 1212 | 2016 | 793 |
| grow7 | 140 | 301 | 322 | standata | 301 | 1038 | 74 |
| israel | 163 | 142 | 209 | standmps | 409 | 1038 | 295 |
| kb2 | 43 | 41 | 63 | stocfor1 | 98 | 100 | 81 |
| lotfi | 134 | 300 | 242 | stocfor2 | 2129 | 2015 | 2127 |
| maros | 680 | 1062 | 2998 | | | | |

Table 1: Number of iterations for the primal-dual simplex method on the NETLIB test problems.

#### Primal-Dual Simplex Method: Plots

Figure 1: a log-log plot of $T$ versus $m+n$, with the $L^1$ and $L^2$ regression lines. (Figure not reproduced.)

### OPIM 913 Optimization, Lecture 2: Simplex Method for Linear Programming

#### An Example

$$\begin{array}{rrcl}
\text{maximize} & x_1 + 3x_2 - 3x_3 & &\\
\text{subject to} & 3x_1 - x_2 + 2x_3 &\le& 7\\
& -2x_1 - 4x_2 + 4x_3 &\le& 3\\
& x_1 + 2x_3 &\le& 4\\
& -2x_1 + 2x_2 - x_3 &\le& 8\\
& 3x_1 &\le& 5\\
& x_1, x_2, x_3 &\ge& 0
\end{array}$$

Rewrite with slack variables:

$$\begin{aligned}
z   &= \hphantom{7} \;\; x_1 + 3x_2 - 3x_3\\
w_1 &= 7 - 3x_1 + x_2 - 2x_3\\
w_2 &= 3 + 2x_1 + 4x_2 - 4x_3\\
w_3 &= 4 - x_1 - 2x_3\\
w_4 &= 8 + 2x_1 - 2x_2 + x_3\\
w_5 &= 5 - 3x_1
\end{aligned}$$

with $x_1, x_2, x_3, w_1, \ldots, w_5 \ge 0$.

Notes:

- This layout is called a *dictionary*.
- Setting $x_1$, $x_2$, and $x_3$ to 0, we can read off the values of the other variables: $w_1 = 7$, $w_2 = 3$, etc. This specific solution is called a *dictionary solution*.
- All the variables in the current dictionary solution are nonnegative. Such a solution is called *feasible*.
- The initial dictionary solution need not be feasible; we were just lucky above.
- Dependent variables, on the left, are called *basic variables*.
- Independent variables, on the right, are called *nonbasic variables*.

#### Simplex Method: First Iteration

Here's the problem again. If $x_2$ increases, the objective goes up. How much can $x_2$ increase? Until $w_4$ decreases to zero. End result: $x_2 > 0$ whereas $w_4 = 0$. That is, $x_2$ must become basic and $w_4$ must become nonbasic. Algebraically, rearrange the equations to, in the words of Jean-Luc Picard, "make it so." This is a *pivot*. The new dictionary is

$$\begin{aligned}
z   &= 12 + 4x_1 - \tfrac{3}{2}x_3 - \tfrac{3}{2}w_4\\
x_2 &= 4 + x_1 + \tfrac{1}{2}x_3 - \tfrac{1}{2}w_4\\
w_1 &= 11 - 2x_1 - \tfrac{3}{2}x_3 - \tfrac{1}{2}w_4\\
w_2 &= 19 + 6x_1 - 2x_3 - 2w_4\\
w_3 &= 4 - x_1 - 2x_3\\
w_5 &= 5 - 3x_1
\end{aligned}$$

#### Simplex Method: Second Pivot

Recall the dictionary after the first pivot. Now let $x_1$ increase. Of the basic variables, $w_5$ hits zero first (at $x_1 = 5/3$). So $x_1$ enters and $w_5$ leaves the basis. The new dictionary is

$$\begin{aligned}
z   &= \tfrac{56}{3} - \tfrac{3}{2}x_3 - \tfrac{3}{2}w_4 - \tfrac{4}{3}w_5\\
x_1 &= \tfrac{5}{3} - \tfrac{1}{3}w_5\\
x_2 &= \tfrac{17}{3} + \tfrac{1}{2}x_3 - \tfrac{1}{2}w_4 - \tfrac{1}{3}w_5\\
w_1 &= \tfrac{23}{3} - \tfrac{3}{2}x_3 - \tfrac{1}{2}w_4 + \tfrac{2}{3}w_5\\
w_2 &= 29 - 2x_3 - 2w_4 - 2w_5\\
w_3 &= \tfrac{7}{3} - 2x_3 + \tfrac{1}{3}w_5
\end{aligned}$$

It's optimal: no nonbasic variable has a positive coefficient in the $z$ row (in the applet, no pink highlights remain).

#### Agenda

- Discuss unboundedness (today).
- Discuss initialization/infeasibility, i.e., what to do if the initial dictionary is not feasible (today).
- Discuss degeneracy (next lecture).

#### Unboundedness

Consider a dictionary in which we could increase either of two nonbasic variables to increase the objective. (Applet dictionary not reproduced.) Consider increasing one of them: which basic variable decreases to zero first? If none does, i.e., the entering variable appears with a nonnegative coefficient in every row of the dictionary, then the entering variable can grow without bound, and the objective grows with it. This is how we detect unboundedness with the simplex method.

#### Initialization

Consider the following problem:

$$\begin{array}{rrcl}
\text{maximize} & 3x_1 + 4x_2 & &\\
\text{subject to} & -4x_1 - 2x_2 &\le& -8\\
& 2x_1 &\le& 2\\
& 3x_1 + 2x_2 &\le& \cdots\\
& x_1 - 3x_2 &\le& 1\\
& -3x_2 &\le& 2\\
& x_1, x_2 &\ge& 0
\end{array}$$

#### Phase-I Problem

Modify the problem by (a) subtracting a new variable $x_0$ from the left-hand side of each constraint and (b) replacing the objective function with $-x_0$:

$$\begin{array}{rrcl}
\text{maximize} & -x_0 & &\\
\text{subject to} & -x_0 - 4x_1 - 2x_2 &\le& -8\\
& -x_0 + 2x_1 &\le& 2\\
& -x_0 + 3x_1 + 2x_2 &\le& \cdots\\
& -x_0 + x_1 - 3x_2 &\le& 1\\
& -x_0 - 3x_2 &\le& 2\\
& x_0, x_1, x_2 &\ge& 0
\end{array}$$

This is clearly feasible: pick $x_0$ large, $x_1 = 0$, and $x_2 = 0$.

- If the optimal solution has objective value 0, then the original problem is feasible, and the final Phase-I basis can be used as an initial Phase-II basis (ignoring $x_0$ thereafter).
- If the optimal solution has objective value $< 0$, then the original problem is infeasible.

#### Initialization: First Pivot

The applet depiction shows both the Phase-I and the Phase-II objectives. (Applet dictionary not reproduced.)
The dictionary is infeasible even for Phase I; one pivot is needed to get feasible. The entering variable is $x_0$. The leaving variable is the one whose current value is most negative, i.e., $w_1$. After the first pivot, the dictionary is feasible.

#### Initialization: Second Pivot

Going into the second pivot, focus on the yellow highlights in the applet: let $x_1$ enter; then $w_5$ must leave. After the second pivot, there is more to go.

#### Initialization: Third Pivot

Going into the third pivot, $x_2$ must enter and $x_0$ must leave. After the third pivot, the dictionary is optimal for Phase I (no yellow highlights remain) with objective value 0; therefore the original problem is feasible.

#### Phase II

Recall the last dictionary. For Phase II, (a) ignore the column containing $x_0$, and (b) ignore the Phase-I objective row. Then $w_5$ must enter and $w_4$ must leave. After that pivot, the dictionary is optimal.
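The pivots narrated in Lecture 2's example can be replayed mechanically. Below is a small, self-contained tableau implementation of the simplex method; it is a sketch, not the course applet's code. The largest-coefficient entering rule, the exact `Fraction` arithmetic, and the function name `simplex` are choices made here, and the constraint data is the three-variable example as written above.

```python
from fractions import Fraction as F

def simplex(c, A, b):
    """Largest-coefficient simplex for: max c'x s.t. Ax <= b, x >= 0 (all b_i >= 0).

    Tableau columns: [rhs | x_1..x_n | w_1..w_m]; the z row carries the objective.
    """
    m, n = len(A), len(c)
    tab = [[b[i]] + A[i] + [F(1 if i == j else 0) for j in range(m)] for i in range(m)]
    z = [F(0)] + [-ci for ci in c] + [F(0)] * m
    basis = [n + i for i in range(m)]
    while True:
        j = min(range(n + m), key=lambda k: z[1 + k])  # most negative reduced cost
        if z[1 + j] >= 0:
            break  # optimal: no entering candidate remains
        rows = [i for i in range(m) if tab[i][1 + j] > 0]
        if not rows:
            raise ValueError("unbounded")  # entering variable can grow forever
        r = min(rows, key=lambda i: tab[i][0] / tab[i][1 + j])  # ratio test
        piv = tab[r][1 + j]
        tab[r] = [v / piv for v in tab[r]]
        for i in range(m):
            if i != r:
                f = tab[i][1 + j]
                tab[i] = [vi - f * vr for vi, vr in zip(tab[i], tab[r])]
        f = z[1 + j]
        z = [vz - f * vr for vz, vr in zip(z, tab[r])]
        basis[r] = j
    x = [F(0)] * n
    for i, k in enumerate(basis):
        if k < n:
            x[k] = tab[i][0]
    return z[0], x

# The Lecture 2 example (signs as in the dictionary above).
c = [F(1), F(3), F(-3)]
A = [[F(3), F(-1), F(2)],
     [F(-2), F(-4), F(4)],
     [F(1), F(0), F(2)],
     [F(-2), F(2), F(-1)],
     [F(3), F(0), F(0)]]
b = [F(7), F(3), F(4), F(8), F(5)]

value, x = simplex(c, A, b)
print(value, x)
```

Running it performs exactly the two pivots described ($x_2$ in / $w_4$ out, then $x_1$ in / $w_5$ out) and stops with $z = 56/3$ at $(x_1, x_2, x_3) = (5/3, 17/3, 0)$, matching the final dictionary.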
