From: Gerhard Heinzel
Subject: Re: initial guesses for local optimization
Date: Mon, 28 Jun 1999 05:33:22 +0200
Newsgroups: sci.math.num-analysis
To: Bukic
On 27 Jun 1999, Bukic wrote:
> Suppose I want to find the global minimum of an n-dimensional
> function over the hyper-rectangle defined by the lower bounds
> xlb(1:n) and xub(1:n) (in Fortran 95 notation). I have a subroutine
> which will find the local minimum given an initial guess. If
For exactly this problem I am successfully using two "global optimization"
algorithms, "Direct Search Simulated Annealing" and "Controlled Random
Search". They are described in:
M. Ali, A. T\"orn, S. Viitanen: `A Direct Search Simulated Annealing
algorithm for optimization involving continuous variables', Turku Center for
Computer Science, TUCS Technical Report No. 97 (1997).
M. Ali, A. T\"orn, S. Viitanen: `A numerical comparison of some
modified Controlled Random Search algorithms', Turku Center for
Computer Science, TUCS Technical Report No. 98 (1997).
These papers are available from http://www.tucs.abo.fi/
>
> Another question -- can someone suggest a Fortran subroutine
> which provides a good test case for global optimization -- one
> with several local minima and a known global minimum?
Some test cases are given in the Technical Report TR37 by the same authors.
Good luck,
Gerhard
==============================================================================
From: JeffLeader@MindSpring.com (Jeffery J. Leader)
Subject: Re: initial guesses for local optimization
Date: Mon, 28 Jun 1999 02:01:54 GMT
Newsgroups: sci.math.num-analysis
bukic@aol.com (Bukic) wrote:
>I have a subroutine
>which will find the local minimum given an initial guess. If
>I am going to select m initial guesses before finding any
>local minima, how should I do so?
[...]
>Actually, once one has found the local minimum from the first
>guess, the next guess used should exploit the information
>learned about the function. How?
If you know nothing else, I'd think you'd want to use either Multiple
Random Start (MRS)--pick a point at random, do the search, pick a new
point, repeat--or Modified Multiple Random Start (MMRS)--pick a point
at random, do the search, pick a new point, and do the search if the
function value at that new point is less than the current best guess
at the minimum value, else reject it and choose a new point (I assume
your real concern is doing at most m searches, not choosing m points
per se). Watch out for issues with your PRNG, especially in high
dimensions, of course. In fact, if m is small you may wish to use a
quasi-Monte Carlo method, that is, choose the (at least) m points so
that they follow some reasonable (though deterministic rather than
random) pattern of spreading out over the hyper-box.
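As a rough sketch of the two strategies (not the poster's code; it assumes a
local minimizer with the hypothetical signature local_min(f, x0) -> (x, fx),
the role the original Fortran subroutine would play):

```python
import random

def multiple_random_start(f, local_min, bounds, m, seed=0):
    """MRS: run the local minimizer from m uniformly random points in
    the hyper-rectangle `bounds` = [(lo, hi), ...] and keep the best."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(m):
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        x, fx = local_min(f, x0)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

def modified_mrs(f, local_min, bounds, m, seed=0, max_samples=10000):
    """MMRS: start a (costly) local search only when f at the random
    point already beats the best minimum found so far; m counts local
    searches, not sampled points.  max_samples caps total sampling."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    searches = 0
    for _ in range(max_samples):
        if searches >= m:
            break
        x0 = [rng.uniform(lo, hi) for lo, hi in bounds]
        if best_x is not None and f(x0) >= best_f:
            continue  # reject this start point, draw another
        x, fx = local_min(f, x0)
        searches += 1
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f
```

Note the budget interpretation in MMRS: rejected points cost one cheap
function evaluation each, while only accepted points trigger a full local
search.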
If you have more information about the function, including knowing how many
local minima you're apt to have, you may well be able to do better
than this. For general problems in more than several dimensions MRS
and MMRS are very competitive and, perhaps surprisingly, MMRS is not
clearly better in practice. (YMMV!)
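One standard way to realize the quasi-Monte Carlo idea mentioned above is a
Halton low-discrepancy sequence; a minimal sketch (the function names are
illustrative, not from any posting here):

```python
def halton(index, base):
    """Radical-inverse of `index` in `base` (a value in [0, 1))."""
    result, f = 0.0, 1.0 / base
    i = index
    while i > 0:
        result += f * (i % base)
        i //= base
        f /= base
    return result

def halton_points(m, bounds, bases=(2, 3, 5, 7, 11, 13)):
    """First m points of a Halton sequence, scaled from the unit cube
    to the hyper-rectangle given by bounds = [(lo, hi), ...]."""
    n = len(bounds)
    assert n <= len(bases), "supply more prime bases for higher dimensions"
    pts = []
    for k in range(1, m + 1):  # start at 1; index 0 maps to the origin
        u = [halton(k, bases[d]) for d in range(n)]
        pts.append([lo + ud * (hi - lo) for ud, (lo, hi) in zip(u, bounds)])
    return pts
```

Such points cover the box far more evenly than m independent random draws,
which is exactly what one wants when m is small; in high dimensions,
however, Halton points in successive large prime bases become correlated,
so scrambled or other low-discrepancy sequences are preferable there.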
==============================================================================
From: spellucci@mathematik.tu-darmstadt.de (Peter Spellucci)
Subject: Re: initial guesses for local optimization
Date: 28 Jun 1999 10:44:41 GMT
Newsgroups: sci.math.num-analysis
In article <19990627193901.12465.00000841@ng-cr1.aol.com>,
bukic@aol.com (Bukic) writes:
|> Suppose I want to find the global minimum of an n-dimensional
|> function over the hyper-rectangle defined by the lower bounds
|> xlb(1:n) and xub(1:n) (in Fortran 95 notation). I have a subroutine
|> which will find the local minimum given an initial guess. If
|> I am going to select m initial guesses before finding any
|> local minima, how should I do so?
There are several techniques for random choice of initial points, some
already mentioned in others' responses. For global optimization in general
you have the choice between deterministic and stochastic approaches. Visit
Arnold Neumaier's page, which collects much information on the subject:
http://solon.cma.univie.ac.at/~neum/globt.html
There is an introduction to the field by Janos Pinter:
http://catt.bix.okstate.edu/itorms/pinter
Floudas and Pardalos wrote a book full of test cases for global optimization:
Floudas, Christodoulos A.; Pardalos, Panos M.
A collection of test problems for constrained global optimization algorithms.
Lecture Notes in Computer Science, 455. Berlin: Springer-Verlag, 1990.
XIV, 180 pp. ISBN 3-540-53032-0.
for more information see also
http://plato.la.asu.edu/gom.html
http://www.cs.sandia.gov/opt/survey/
hope that helps
peter