From: Paul Victor Birke
Subject: Re: Minimization when function is evaluated by Monte Carlo?
Date: Thu, 04 May 2000 02:31:04 GMT
Newsgroups: sci.math.num-analysis
Summary: [missing]

Alan Miller wrote:
>
> Paul Victor Birke wrote in message <39100CC1.BEBC283D@home.com>...
> >
> >Alan Miller wrote:
> >>>snippings<<
> >> An algorithm which is well suited to the present problem is the old
> >> but reliable Nelder-Mead simplex algorithm.
> >**********
> >Dear Alan
> >
> >Not reliable at all beyond 4 variables!
>
> RUBBISH
>
> This looks like an argument based upon a sample of
> one problem and/or poor code.
>
> I have used the algorithm sporadically for about 25 years
> with a great deal of success.
>
> >Paul
> >
> >See the modified simplex method of Virginia Torczon from the
> >early 1990s.
>
> I know nothing about this version.  Do you have more details?
************************************************
Dear Alan

It has a history of being untrustworthy; I found this out in 1981.
Try it on tough problems, not just easy ones!

However, please refer to Dennis, J.E. and Torczon, V., "Direct Search
Methods on Parallel Machines", SIAM Journal on Optimization, Vol. 1,
No. 4, pp. 448-474, Nov. 1991, where there are some good details.

Paul
==============================================================================
From: spellucci@mathematik.tu-darmstadt.de (Peter Spellucci)
Subject: Re: Minimization when function is evaluated by Monte Carlo?
Date: 3 May 2000 11:26:48 GMT
Newsgroups: sci.math.num-analysis

In article , "Alan Miller" writes:
snip
|> Simulated annealing and genetic algorithms require huge numbers of
|> function evaluations. They are applicable to optimization when there
|> are multiple minima and you want to find the global minimum.

This is true if high precision is required, but since the function
here will be very noisy, high precision makes no sense anyway.

|> An algorithm which is well suited to the present problem is the old
|> but reliable Nelder-Mead simplex algorithm. At each step, it
|> heads away (and usually downhill) from the worst point in the
|> current simplex, and then eliminates that worst point.
|> If there is noise in the estimate of the function value, which is the
|> case here, it may in fact be heading away from the 2nd or 3rd
|> worst point, but it will still usually be heading downhill.

But Nelder-Mead may fail completely; see the papers on it in SIOPT.
It is possible to modify Nelder-Mead to make it convergent for smooth
functions (locally, to a point satisfying the second-order necessary
conditions) at the cost of some additional computation and the
introduction of line searches (from the center of gravity of the
simplex to the vertices, in case the simplex stalls). But it is
completely superseded by methods like Powell's COBYLA2, which,
together with the accompanying paper, can be downloaded from
http://plato.la.asu.edu/guide.html. These methods use linear
interpolation on a simplex, with adaptive moves of that simplex, and
so on.

Noise on the function makes linear interpolation unsafe again as soon
as the squared diameter of the simplex, multiplied by the order of
magnitude of the second derivatives of the function, reaches the level
of the noise. This can happen quite early (a back-of-the-envelope
check of this threshold is sketched after this post).

Therefore I think that using a stochastic optimizer for a function
with stochastic noise is the most appropriate idea. But, admittedly,
Nelder-Mead is easy to use, so little is lost in trying it (a minimal
usage sketch also follows below).

Peter
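************************************************
Trying Nelder-Mead on a noisy objective is indeed cheap. A minimal
sketch using scipy's Nelder-Mead implementation; the quadratic test
function, noise level, and tolerances are made-up illustrations, not
anything from the thread:

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def noisy_objective(x, sigma=0.01):
    # Smooth quadratic with minimum at [1, 1, 1, 1], plus
    # Monte-Carlo-style noise of standard deviation sigma
    # (a hypothetical stand-in for a simulation output).
    return np.sum((x - 1.0) ** 2) + sigma * rng.standard_normal()

# Nelder-Mead uses only function values, no derivatives, which is
# why it is a natural first try on a noisy simulation.  Tolerances
# are kept coarse: driving them below the noise level is pointless.
result = minimize(noisy_objective, x0=np.zeros(4),
                  method="Nelder-Mead",
                  options={"xatol": 1e-2, "fatol": 1e-2,
                           "maxiter": 2000})
print(result.x)  # should land near [1, 1, 1, 1], up to the noise

Once the simplex is small enough that function differences are
dominated by noise, further shrinking gains nothing, which is
exactly the threshold discussed in the post above.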
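************************************************
The threshold above (squared simplex diameter times the size of the
second derivatives reaching the noise level) can be turned into a
quick numeric check. The numbers below are illustrative assumptions
only:

import math

sigma = 1e-3      # assumed std. deviation of the Monte Carlo noise
f_second = 10.0   # assumed size of the second derivatives of f

# Linear interpolation error on a simplex of diameter h scales like
# h**2 * |f''|.  It is swamped by noise once h**2 * |f''| ~ sigma,
# i.e. below the critical diameter h_crit = sqrt(sigma / |f''|).
h_crit = math.sqrt(sigma / f_second)
print("interpolation noise-dominated below h ~ %.3g" % h_crit)

So with noise of size 1e-3 and curvature of size 10, the simplex
stops carrying information below a diameter of about 0.01; this is
what "quite early" means above.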
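************************************************
The post does not name a particular stochastic optimizer. One classic
choice for noisy function values is simultaneous perturbation
stochastic approximation (SPSA), which estimates a descent direction
from just two noisy evaluations per step. A bare-bones sketch with
textbook gain sequences; the test problem and all constants here are
made up:

import numpy as np

def spsa_minimize(f, x0, iters=500, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, seed=0):
    # Bare-bones SPSA: per step, estimate the gradient from two
    # noisy function values along a random +/-1 perturbation.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, iters + 1):
        ak = a / k ** alpha          # decaying step size
        ck = c / k ** gamma          # decaying perturbation size
        delta = rng.choice([-1.0, 1.0], size=x.shape)
        g_hat = (f(x + ck * delta)
                 - f(x - ck * delta)) / (2.0 * ck * delta)
        x -= ak * g_hat
    return x

# Same made-up noisy quadratic as before:
noise = np.random.default_rng(1)
f = lambda x: np.sum((x - 1.0) ** 2) + 0.01 * noise.standard_normal()
print(spsa_minimize(f, np.zeros(4)))  # moves toward [1, 1, 1, 1]

The decaying step sizes average the noise out over the iterations,
which is what makes this class of methods suited to stochastic
objectives.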