This 17-page set of class notes was uploaded by Dorris Borer on Monday, September 28, 2015. The notes belong to ESE 605 at the University of Pennsylvania, taught by Staff in Fall 2015.
ESE 605: Modern Convex Optimization

1. Introduction
- mathematical optimization
- least-squares and linear programming
- convex optimization
- example
- course goals and topics
- nonlinear optimization
- brief history of convex optimization

Thanks to Professor Stephen Boyd (Stanford University) for permission to use and modify his slides.

Mathematical optimization

mathematical optimization problem:

    minimize   f_0(x)
    subject to f_i(x) <= b_i,  i = 1, ..., m

- x = (x_1, ..., x_n): optimization variables
- f_0 : R^n -> R: objective function
- f_i : R^n -> R, i = 1, ..., m: constraint functions

an optimal solution x* has the smallest value of f_0 among all vectors that satisfy the constraints

Examples

portfolio optimization
- variables: amounts invested in different assets
- constraints: budget, max/min investment per asset, minimum return
- objective: overall risk or return variance

device sizing in electronic circuits
- variables: device widths and lengths
- constraints: manufacturing limits, timing requirements, maximum area
- objective: power consumption

data fitting
- variables: model parameters
- constraints: prior information, parameter limits
- objective: measure of misfit or prediction error

Solving optimization problems

general optimization problem
- very difficult to solve
- methods involve some compromise, e.g., very long computation time, or not always finding the solution

exceptions: certain problem classes can be solved efficiently and reliably
- least-squares problems
- linear programming problems
- convex optimization problems

Least-squares

    minimize ||Ax - b||_2^2

solving least-squares problems
- analytical solution: x* = (A^T A)^{-1} A^T b
- reliable and efficient algorithms and software
- computation time proportional to n^2 k (A in R^{k x n}); less if structured
- a mature technology

using least-squares
- least-squares problems are easy to recognize
- a few standard techniques increase flexibility (e.g., including weights, adding regularization terms)

Linear programming

    minimize   c^T x
    subject to a_i^T x <= b_i,  i = 1, ..., m

solving linear programs
- no analytical formula
for the solution
- reliable and efficient algorithms and software
- computation time proportional to n^2 m if m >= n; less with structure
- a mature technology

using linear programming
- not as easy to recognize as least-squares problems
- a few standard tricks used to convert problems into linear programs (e.g., problems involving l_1- or l_inf-norms, piecewise-linear functions)

Convex optimization problem

    minimize   f_0(x)
    subject to f_i(x) <= b_i,  i = 1, ..., m

- objective and constraint functions are convex:

      f_i(αx + βy) <= α f_i(x) + β f_i(y)   if α + β = 1, α >= 0, β >= 0

- includes least-squares problems and linear programs as special cases

solving convex optimization problems
- no analytical solution
- reliable and efficient algorithms
- computation time roughly proportional to max{n^3, n^2 m, F}, where F is the cost of evaluating the f_i's and their first and second derivatives
- almost a technology

using convex optimization
- often difficult to recognize
- many tricks for transforming problems into convex form
- surprisingly many problems can be solved via convex optimization

Example

m lamps illuminating n (small, flat) patches

intensity I_k at patch k depends linearly on the lamp powers p_j:

    I_k = sum_{j=1..m} a_kj p_j,   a_kj = r_kj^{-2} max{cos θ_kj, 0}

problem: achieve desired illumination I_des with bounded lamp powers

    minimize   max_{k=1,...,n} |log I_k - log I_des|
    subject to 0 <= p_j <= p_max,  j = 1, ..., m

how to solve?
1. use uniform power p_j = p, vary p
2. use least-squares: minimize sum_{k=1..n} (I_k - I_des)^2, then round: set p_j = p_max if p_j > p_max, p_j = 0 if p_j < 0
3. use weighted least-squares: minimize sum_{k=1..n} (I_k - I_des)^2 + sum_{j=1..m} w_j (p_j - p_max/2)^2, iteratively adjusting the weights w_j until 0 <= p_j <= p_max
4. use linear programming:

       minimize   max_{k=1,...,n} |I_k - I_des|
       subject to 0 <= p_j <= p_max,  j = 1, ..., m

   which can be solved via linear programming

of course, these are approximate (suboptimal) solutions

5. use convex optimization: the problem is equivalent to

       minimize   f_0(p) = max_{k=1,...,n} h(I_k / I_des)
       subject to 0 <= p_j <= p_max,  j = 1, ..., m

   with h(u) = max{u, 1/u}

   f_0 is convex because a maximum of convex functions is convex; an exact solution
is obtained with effort equal to a modest factor times the least-squares effort

additional constraints: does adding (1) or (2) below complicate the problem?
1. no more than half of the total power is in any 10 lamps
2. no more than half of the lamps are on (p_j > 0)

answer: with (1), the problem is still easy to solve; with (2), it is extremely difficult

Another example: robust LP

    minimize   c^T x
    subject to prob(a_i^T x <= b_i) >= η,  i = 1, ..., m

- assume a_i is Gaussian with mean ā_i and covariance Σ_i (a_i ~ N(ā_i, Σ_i))
- is this problem hard or easy? answer: for η >= 1/2 it is easy; for η < 1/2 it is very hard
- moral: (untrained) intuition doesn't always work; without the proper background, very easy problems can appear quite similar to very difficult problems

[The slides here reproduce the title page of Y. Brenier, U. Frisch, M. Hénon, G. Loeper, S. Matarrese, R. Mohayaee and A. Sobolevskii, "Reconstruction of the early Universe as a convex optimization problem", Mon. Not. R. Astron. Soc. (2003), arXiv:astro-ph/0304214: the deterministic past history of the Universe is reconstructed from the present mass density field via a Monge-Ampère-Kantorovich optimal mass transportation method; after discretization into N point masses, it becomes an assignment problem solvable by effective algorithms with no more than O(N^3) time complexity.]

Course goals and topics

goals
1. recognize/formulate problems (such as the illumination problem) as convex optimization problems
2. develop code for problems of moderate size (1000 lamps, 5000 patches)
3. characterize the optimal solution (optimal power distribution), give limits of performance, etc.

topics
1. convex sets, functions, optimization problems
2. examples and applications
3. algorithms

Nonlinear optimization

traditional techniques for general nonconvex problems involve compromises

local optimization methods (nonlinear programming)
- find a point that minimizes f_0 among feasible points near it
- fast, can handle large problems
- require an initial guess
- provide no information about distance to the global optimum

global optimization methods
- find the (global) solution
- worst-case complexity grows exponentially with problem size

these algorithms are often based on solving convex subproblems

Brief history of convex optimization

theory (convex analysis): ca. 1900-1970

algorithms
- 1947: simplex algorithm for linear programming (Dantzig)
- 1960s: early interior-point methods (Fiacco & McCormick, Dikin)
- 1970s: ellipsoid method and other subgradient methods; Kantorovich wins
the Nobel Prize in Economics in 1975 (Dantzig does not); Shor develops the ellipsoid algorithm; a 25-year-old mathematician named Leonid Khachiyan proves that the ellipsoid algorithm solves LPs with polynomial time complexity, and the result appears on the cover of the NY Times
- 1980s: polynomial-time interior-point methods for linear programming; Karmarkar's interior-point polynomial-time algorithm for LP (1984) works better than simplex; Nemirovski and Yudin prove that the information complexity of convex programs is far better than that of general nonlinear programs
- late 1980s-now: polynomial-time interior-point methods for nonlinear convex optimization (Nesterov & Nemirovski, 1994); Nesterov and Nemirovski extend Karmarkar's approach to general convex problems

applications
- before 1990: mostly in operations research, few in engineering
- since 1990: many new applications in engineering (control, signal processing, communications, circuit design); new problem classes (semidefinite and second-order cone programming, robust optimization, geometric programming, sum-of-squares programming)
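The least-squares facts quoted in these notes can be checked numerically. The sketch below uses synthetic data (the matrix, right-hand side, and variable names are illustrative, not from the course): it verifies the analytical solution x* = (A^T A)^{-1} A^T b against a library solver, and illustrates the "adding regularization terms" trick mentioned under "using least-squares".

```python
import numpy as np

# Least-squares: minimize ||Ax - b||_2^2.
# Analytical solution x* = (A^T A)^{-1} A^T b when A has full column rank.
# Synthetic data, for illustration only.
rng = np.random.default_rng(0)
k, n = 20, 5                                  # A in R^{k x n}, k >= n
A = rng.standard_normal((k, n))
b = rng.standard_normal(k)

# Analytical solution via the normal equations (A^T A) x = A^T b.
x_analytic = np.linalg.solve(A.T @ A, A.T @ b)

# A library least-squares solver should agree.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(x_analytic, x_lstsq)

# "Adding regularization terms": minimize ||Ax - b||^2 + lam * ||x||^2,
# solved by (A^T A + lam I) x = A^T b. Regularization shrinks x at the
# cost of a (slightly) larger residual.
lam = 0.1
x_ridge = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)
assert np.linalg.norm(x_ridge) <= np.linalg.norm(x_analytic)
assert np.linalg.norm(A @ x_ridge - b) >= np.linalg.norm(A @ x_analytic - b)
```

The other flexibility trick from the notes, "including weights", amounts to scaling the rows of A and b by the square roots of the weights before solving the same normal equations.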