From: Karl Hahn
Subject: Re: minimising a function of many many variables
Date: Wed, 03 Jan 2001 17:15:25 -0500
Newsgroups: sci.math
Summary: Optimization techniques for large numbers of variables

Andreas Faatz wrote:

> I wonder if there exists any mathematical software tool which is
> able to minimise a function of 100000 or 200000 variables.
> Any hints?

There are two algorithms I know of that are capable of this. Both are
modeled on natural-world phenomena, and neither guarantees THE minimum.
They only look for, and find, a very good minimum.

The first is called annealing, because it is modeled on the physics of
molten glass cooling. Somehow, if the glass is cooled slowly enough, it
finds a configuration that nearly minimizes internal stress energy.

Imagine a goat in the mountains. He wants to find the lowest elevation
he can, because that's where the best grazing is. A very dumb goat
would just walk downhill until he couldn't do so anymore. He would
arrive at a local minimum. But there might be better minima elsewhere.

Here's where the annealing process comes in. The goat is willing to
try new starting points nearby. Most of these starting points will
likely be higher than where he is now. From a bunch of them selected
at random, he picks one using a weighted random function. The
probability weight of the jth choice is

    e^(-E_j / (kT))

where E_j is the "energy" of the jth point ("energy" here being the
quantity to be minimized), T is the goat's "temperature", and k is the
constant that converts that temperature into the units of the quantity
being minimized. The goat moves to the selected starting point and
walks downhill again until he can no longer do so. After that you
reduce the goat's "temperature" a wee bit and repeat the algorithm.

Almost by magic, repeating this procedure finds a point that is
minimal over a very wide region. Unfortunately, because it uses random
variables, it may find a different minimum in one run than in another.
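For concreteness, here is a minimal Python sketch of the loop just
described. The local-descent rule, the step sizes, the cooling
schedule, and the toy test function are all illustrative assumptions,
not part of the original post:

    import math
    import random

    def walk_downhill(f, x, step, patience=30):
        # The "dumb goat": try random small steps, keep only those
        # that go downhill, stop after `patience` failures in a row.
        fx = f(x)
        fails = 0
        while fails < patience:
            y = [xi + random.gauss(0.0, step) for xi in x]
            fy = f(y)
            if fy < fx:
                x, fx, fails = y, fy, 0
            else:
                fails += 1
        return x

    def anneal(f, x0, t=1.0, t_min=1e-3, cooling=0.95,
               step=0.1, n_candidates=20):
        x = walk_downhill(f, x0, step)
        best = x
        while t > t_min:
            # Scatter candidate restart points near the current
            # position and let each one walk downhill.
            cands = [walk_downhill(f, [xi + random.gauss(0.0, 5 * step)
                                       for xi in x], step)
                     for _ in range(n_candidates)]
            energies = [f(c) for c in cands]
            # Boltzmann weights e^(-E_j/(kT)); k is absorbed into T,
            # and energies are shifted by their minimum so that exp()
            # cannot overflow.
            e0 = min(energies)
            weights = [math.exp(-(e - e0) / t) for e in energies]
            x = random.choices(cands, weights=weights, k=1)[0]
            if f(x) < f(best):
                best = x
            t *= cooling  # cool the goat a wee bit and repeat
        return best

    # Toy usage: a bumpy function of 100 variables, minimum at the
    # origin.
    f = lambda x: sum(xi * xi - math.cos(3.0 * xi) + 1.0 for xi in x)
    x0 = [random.uniform(-2.0, 2.0) for _ in range(100)]
    print(f(anneal(f, x0)))

Note that this follows the goat story (a Boltzmann-weighted choice
among restart points) rather than the more common Metropolis
formulation, which accepts or rejects a single random step with
probability min(1, e^(-dE/(kT))).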
Another algorithm that solves this problem, again not perfectly, but
very well, is a genetic algorithm. This works on nearly the same
principle as the annealing, except in this one you have a population
of goats and they have kids at each iteration. Each goat has kids that
are randomly scattered near the parent goat. Then each goat in the
population either survives or is eliminated according to a random
function weighted by how well that goat minimizes your function. The
weightings are chosen so that the average probability of a goat
surviving is N/n, where N is the total population you would like to
maintain and n is the current population. How wide an area this
algorithm searches depends on how far the kids are scattered from
their parents.

Software exists for both of these algorithms, although I don't know
whether you will be able to find a free package that handles as many
independent variables as you ask for.

Hope this helps.

Karl Hahn

==============================================================================

From: foltinek@math.utexas.edu (Kevin Foltinek)
Subject: Re: minimising a function of many many variables
Date: 03 Jan 2001 18:06:32 -0600
Newsgroups: sci.math

In article <3A53A47D.F7C4E20B@cisco.com> Karl Hahn writes:

> [Minimizing a function - goat in the mountains analogy]
>
> Another algorithm that solves this problem, again not
> perfectly, but very well, is a genetic algorithm. This
> works on nearly the same principle as the annealing, except
> in this one you have a population of goats and they have
> kids at each iteration. Each goat has kids that are randomly
> scattered near the parent goat.

This isn't really a genetic algorithm (GA). A GA selects pairs of
goats to be parents; the kids are then distributed in some manner that
depends on the locations of the two parents. (Continuing the goat
analogy, the kids might be distributed along the straight line
connecting the parents.)

There is some "art" in GA: constructing a good encoding of the
independent variables, and deciding how the kids will be distributed.
Both should be done with some consideration for what you already know
about the structure of the problem.

What you described (each goat has nearby kids) is more of an
exhaustive search (with random perturbations rather than truly
exhaustive searching) than a GA.

> [snip the "thinning the herd" process]

A third component of GA is random mutation: there is some small but
non-zero probability that a given kid will land not exactly on the
straight line between its parents but some distance away. I think the
importance of this mutation process in GA is not universally agreed
upon, and it may in fact vary with the nature of the problem; but it
generally improves performance, and the probability of mutation should
be kept rather small (otherwise you are again doing essentially an
exhaustive search).

Kevin.
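To make Kevin's corrections concrete, here is a minimal Python sketch
of a GA in this spirit: parents chosen in pairs, kids distributed
along the line between them, and a small mutation probability. The
population sizes, the rates, and the truncation-style thinning are
illustrative assumptions; neither post specifies them:

    import random

    def ga_minimize(f, dim, n_keep=50, n_kids=100, generations=300,
                    p_mutate=0.05, mut_scale=0.5, box=5.0):
        # Random initial herd scattered over a box.
        pop = [[random.uniform(-box, box) for _ in range(dim)]
               for _ in range(n_keep)]
        for _ in range(generations):
            kids = []
            for _ in range(n_kids):
                a, b = random.sample(pop, 2)  # pick a pair of parents
                t = random.random()
                # The kid lies on the line joining its parents.
                kid = [ai + t * (bi - ai) for ai, bi in zip(a, b)]
                # A rare mutation knocks one coordinate off that line.
                if random.random() < p_mutate:
                    kid[random.randrange(dim)] += random.gauss(
                        0.0, mut_scale)
                kids.append(kid)
            # Thin the herd back to n_keep. Plain truncation is used
            # here for brevity; the randomized, fitness-weighted
            # survival rule from Karl's post is the usual alternative.
            pop = sorted(pop + kids, key=f)[:n_keep]
        return pop[0]

    # Toy usage: sphere function in 100 variables, minimum at the
    # origin.
    f = lambda x: sum(xi * xi for xi in x)
    print(f(ga_minimize(f, dim=100)))

The encoding here is just the raw real vector; as Kevin notes,
choosing a good encoding and a good kid distribution is where the
"art" lies.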