From: Gerhard Heinzel
Subject: Re: Optimization method?
Date: Thu, 14 Sep 2000 10:17:01 +0200
Newsgroups: sci.math.num-analysis

On Thu, 14 Sep 2000, Rod wrote:
>
> Daniel Thuresson wrote in message
> news:39C06F27.7E02D53B@mvd.chalmers.se...
> > Hi
> >
> > I wonder if anyone could suggest an optimization method for a
> > problem. I have 6 design variables and the objective function is
> > rather computationally heavy, non-linear, possibly non-smooth and
> > probably also has several local minima. Due to the large
> > computational effort for the objective function, I would like to
> > use as few iterations as possible. I was thinking that some sort
> > of gradient-free algorithm might be the best choice. Any
> > suggestions?
> >
> > Thanks
> > Daniel
>
> There is something called the simplex method which is gradient-free.
> I found a reference to it on the web a couple of years ago, so you
> should find it somewhere. It may overcome your problems, but I
> wouldn't describe it as a "fast" algorithm.

You will not find any really "fast" algorithm if your function is
non-smooth and has several local minima. Indeed, you can be happy if
you find the best minimum at all. There is literature on these
problems under the name "global optimization", but those algorithms
are typically pretty slow.

For 6 variables the Nelder-Mead simplex is indeed a reasonable choice
to start with, IF it is occasionally restarted with freshly (ideally
randomly) initialized starting vectors. Don't stop after the first
"convergence". The algorithm does not need derivatives. Don't confuse
it with the "simplex" method of linear programming.

It is useful to estimate the starting value as precisely as you can.
Also scale the parameters so that they have comparable orders of
magnitude. If the magnitude of one parameter can vary a lot, it may
help to use its logarithm in the fitting algorithm (provided, of
course, that its sign is fixed).

As for global optimization, I found the following useful:

M. Ali, A. Törn, S. Viitanen: "A numerical comparison of some
modified Controlled Random Search algorithms", Turku Centre for
Computer Science, TUCS Technical Report No. 98 (1997).
(also available from http://www.tucs.abo.fi/)

Some more addresses:

http://solon.cma.univie.ac.at/~neum/glopt/software_g.html
http://www.cs.sandia.gov/opt/survey/stats.html
http://www.bell-labs.com/user/audris/global/global.html
http://www-unix.mcs.anl.gov/otc/Guide/faq/nonlinear-programming-faq.html

Good luck

Gerhard

==============================================================================

From: "Alan Miller"
Subject: Re: Optimization method?
Date: Thu, 14 Sep 2000 12:32:45 GMT
Newsgroups: sci.math.num-analysis

There is a version of the Nelder-Mead simplex algorithm at my ozemail
web site. This version emanated from Rothamsted (England) many years
ago. John Nelder was Head of the Statistics Dept. there after Sir
Ronald Fisher.

It is safer than most implementations I have seen in that it does not
stop as soon as the convergence criterion is satisfied, but attempts
to confirm that a minimum has been found.

--
Alan Miller, Retired Scientist (Statistician)
CSIRO Mathematical & Information Sciences
Alan.Miller -at- vic.cmis.csiro.au
http://www.ozemail.com.au/~milleraj
http://users.bigpond.net.au/amiller/

[original post was quoted here --djr]
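
[To make Gerhard's restart advice concrete, here is a minimal sketch
in Python. It assumes SciPy's scipy.optimize.minimize with
method="Nelder-Mead"; the quadratic test objective, the range of the
random starts and the restart count are illustrative stand-ins, not
values from the thread.]

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Stand-in for the expensive, possibly non-smooth objective;
    # replace with the real function.
    return np.sum((x - 1.0)**2) + 0.1 * np.sum(np.abs(x))

rng = np.random.default_rng(0)
best = None
for restart in range(10):                 # don't stop after the first "convergence"
    x0 = rng.uniform(-5.0, 5.0, size=6)   # fresh, randomly initialized start
    res = minimize(objective, x0, method="Nelder-Mead",
                   options={"xatol": 1e-8, "fatol": 1e-8, "maxfev": 5000})
    if best is None or res.fun < best.fun:
        best = res

print("best x:", best.x)
print("best f:", best.fun)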
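
[Along the same lines, a minimal sketch of the log-transform trick:
the optimizer works with log10 of a parameter that is known to be
positive and may vary over many orders of magnitude. The parameter
names and the objective are made up for illustration.]

import numpy as np
from scipy.optimize import minimize

def misfit(rate, scale):
    # Illustrative stand-in; "rate" may range over many decades.
    return (np.log10(rate) - 3.0)**2 + (scale - 2.0)**2

def objective(p):
    log_rate, scale = p                   # optimizer sees log10(rate)
    return misfit(10.0**log_rate, scale)  # map back before evaluating

res = minimize(objective, np.array([0.0, 1.0]), method="Nelder-Mead")
print("rate:", 10.0**res.x[0], "scale:", res.x[1], "f:", res.fun)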
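
[Finally, one way to approximate the safeguard Alan Miller describes:
after the minimizer reports convergence, re-run it from the reported
minimum with a fresh simplex and accept the first result only if the
re-run cannot improve on it. This is an assumption about the idea,
not a transcription of the Rothamsted code.]

import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Illustrative stand-in for the real objective.
    return np.sum((x - 1.0)**2)

res = minimize(objective, np.zeros(6), method="Nelder-Mead")

# Confirm: restart from the reported minimum; Nelder-Mead builds a
# fresh simplex around this point, so a premature stop gets another
# chance to move.
check = minimize(objective, res.x, method="Nelder-Mead")
if check.fun < res.fun:
    res = check
print(res.x, res.fun)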