From: jpc@a.cs.okstate.edu (John Chandler)
Newsgroups: sci.math
Subject: Re: unconstrained multivariate optimization
Date: 13 Sep 1995 16:31:16 GMT

In article <435ssf$qnv@clarknet.clark.net>, Randy Poe wrote:
>In article <434c8n$da2@sparcserver.lrz-muenchen.de>, u7f01bf@sun2.lrz-muenchen.de says...
>>
>>jamesk@cs.ust.hk (James Kwok) writes:
>>
>>>Hi,
>>
>>>I want to optimize a multivariate nonlinear function
>>>f(x_1,x_2,...,x_n), where n is large.  Obviously, I can
>>>first set arbitrary initial values for x_1,x_2,...,x_n,
>>>then fix x_2 up to x_n and find the best value for x_1,
>>>then fix x_1, x_3 up to x_n and find the best value for x_2,
>>>..., and then loop over x_1 to x_n again.
>>
>> I don't think that will work very well.  You need an
>>algorithm for multidimensional minimization; see
>>Numerical Recipes in C, Chap. 10.
>>Use conjugate gradients if the function is differentiable,
>>downhill simplex if not.
>>
>
>The suggestions offered by people are good ones.  However,
>just for information, the algorithm described by the
>original poster is the "cyclic coordinate" method.  It
>isn't the worst thing you can do.  If combined with a
>modern line-search technique (one which searches for
>sufficient descent, not full-fledged minimization each
>time), I believe it can be proved to converge.

Michael Powell devised a problem, twenty years ago or so,
for which it does not converge.

By itself, this method is very slow on many problems having a
narrow "valley" in the function being minimized, such as
Rosenbrock's test problem:

   f(x,y) = 100*(y - x^2)^2 + (x - 1)^2,   start at (-1.2, 1.0)

The cyclic method can be accelerated greatly by moving along the
resultant of one complete cycle of moves, not including the first
component of the first cycle.  This tactic, together with an
"anti-zigzagging" tactic, is used in STEPIT, a robust and portable
FORTRAN direct search routine used fairly widely since 1965,
available via anonymous ftp as file stepit.f in directory /pub/jpc
at a.cs.okstate.edu .  STEPIT has no trouble with Powell's problem.

Direct search methods by Powell (The Computer Journal 7 (1964)
155-162) and Brent (Algorithms for Minimization Without Derivatives,
Prentice-Hall, 1973) are available in files va04a.f and praxis.f .
These methods have superlinear convergence near the minimum.
VA04A is not robust.  PRAXIS is robust but not scale-invariant,
as pointed out by Fletcher.  For many problems, PRAXIS is the most
efficient direct search software known.

-- 
John Chandler
jpc@a.cs.okstate.edu
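
P.S.  For anyone who wants to experiment before fetching the FORTRAN
files, here is a rough sketch, in Python rather than FORTRAN, of the
cyclic coordinate method plus a search along the resultant of each
cycle, tried on Rosenbrock's problem.  The step-doubling line search,
the tolerances, and the stopping test are placeholders chosen for
illustration only; they are not the tactics STEPIT actually uses, and
the anti-zigzagging tactic is omitted entirely.

def rosenbrock(p):
    x, y = p
    return 100.0 * (y - x * x) ** 2 + (x - 1.0) ** 2

def line_min(f, x, d, step=0.1, grow=2.0, shrink=0.5, tries=30):
    # Crude 1-D search along +/- d: halve the step until some move
    # improves f, then keep doubling that move while it keeps helping.
    fx = f(x)
    for _ in range(tries):
        for sign in (1.0, -1.0):
            s = sign * step
            trial = [xi + s * di for xi, di in zip(x, d)]
            ft = f(trial)
            if ft < fx:
                for _ in range(50):
                    s *= grow
                    nxt = [xi + s * di for xi, di in zip(x, d)]
                    fn = f(nxt)
                    if fn >= ft:
                        break
                    trial, ft = nxt, fn
                return trial, ft
        step *= shrink
    return x, fx

def cyclic_coordinate(f, x0, cycles=500, ftol=1e-12):
    # Cyclic coordinate moves, plus a search along the resultant
    # (net displacement) of each complete cycle.
    x = list(x0)
    fx = f(x)
    for _ in range(cycles):
        x_start, f_start = list(x), fx
        for i in range(len(x)):
            e_i = [1.0 if j == i else 0.0 for j in range(len(x))]
            x, fx = line_min(f, x, e_i)
        r = [xi - si for xi, si in zip(x, x_start)]
        if any(ri != 0.0 for ri in r):
            x, fx = line_min(f, x, r)
        if f_start - fx < ftol:
            break
    return x, fx

if __name__ == "__main__":
    x, fx = cyclic_coordinate(rosenbrock, [-1.2, 1.0])
    print("x =", x, " f(x) =", fx)

Even with the resultant move, this sketch converges only linearly at
best as it works its way down the valley; the superlinear behavior
near the minimum belongs to the Powell and Brent routines mentioned
above.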