From: Victor Eijkhout
Subject: Re: preconditioning
Date: 19 Jan 2000 11:30:44 -0500
Newsgroups: sci.math.num-analysis,comp.ai.neural-nets

Jive Dadson writes:

> say, how to implement cheaply a linear function that is an
> approximation of the inverse Hessian of the objective function.

I hope your Hessian is positive definite?  If not, your problems will
be a lot harder.

1/ See if taking the diagonal of the matrix helps you any.  (A sketch
   of this is appended below.)

2/ If your matrix is sparse, maybe you can get lucky with an
   incomplete factorisation.  I recently wrote an overview of ILU
   techniques:

   http://www.netlib.org/cgi-bin/search.pl?query=eijkhout&boolean=or&gams=&prec=&lang=&startat=11

   Number 12; see number 13 for a new ILU method.  Like all the
   previous ones, it sometimes works and sometimes doesn't.  I like to
   think that it works more often than some others.  (A sketch of ILU
   preconditioning is also appended after this post.)

3/ If your Hessian is not sparse, I can't help you much further.
   You'll have to go back to the theory to come up with something; I'm
   not sure an algebraic technique will help you much in that case.

-- 
Victor Eijkhout
"There ought to be limits to freedom" [G.W. Bush, reacting to www.gwbush.com]
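
To make point 1/ concrete, here is a minimal sketch of diagonal (Jacobi)
preconditioning, assuming the Hessian is available as a dense NumPy array
and is positive definite.  The names H, g, and diagonal_preconditioner are
illustrative only, not taken from any particular library or from the post.

import numpy as np

def diagonal_preconditioner(H):
    """Return a function that applies inv(diag(H)) to a vector.

    Keeping only the diagonal and inverting it entrywise is the
    cheapest approximation to the inverse Hessian; it only makes
    sense if the diagonal entries are positive.
    """
    d = np.diag(H).copy()
    if np.any(d <= 0.0):
        raise ValueError("non-positive diagonal: Hessian not positive definite")
    return lambda v: v / d

# Illustrative use: precondition a gradient before taking a step.
H = np.array([[4.0, 1.0],
              [1.0, 3.0]])          # toy positive definite Hessian
g = np.array([1.0, -2.0])           # toy gradient
M = diagonal_preconditioner(H)
step = -M(g)                        # cheap stand-in for -inv(H) @ g
print(step)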
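
For point 2/, the following sketch shows the general idea of an incomplete
factorisation used as a preconditioner in a Krylov solver.  It uses SciPy's
spilu as a stand-in for a generic ILU; this is not the specific method from
the overview cited above, and the matrix A is a made-up sparse example.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Toy sparse symmetric positive definite matrix: 1-D Laplacian.
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorisation; dropping small entries keeps the factors sparse.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)

# Wrap the ILU solve as a linear operator so it can act as a preconditioner.
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

# Preconditioned conjugate gradients.
x, info = spla.cg(A, b, M=M)
print("converged" if info == 0 else "cg returned info=%d" % info)

Whether this pays off depends, as the post says, on the matrix: the
incomplete factors are only useful if the drop tolerance discards little
that matters for the spectrum of M^{-1}A.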