1989 On the Limited Memory BFGS Method for Large Scale Optimization


Subject Headings: Limited-Memory BFGS Algorithm.

Notes

Cited By

Quotes

Author Keywords

large scale nonlinear optimization, limited memory methods, partitioned quasi-Newton method, conjugate gradient method

Abstract

We study the numerical performance of a limited memory quasi-Newton method for large scale optimization, which we call the L-BFGS method. We compare its performance with that of the method developed by Buckley and LeNir (1985), which combines cycles of BFGS steps and conjugate direction steps. Our numerical tests indicate that the L-BFGS method is faster than the method of Buckley and LeNir, and is better able to use additional storage to accelerate convergence. We show that the L-BFGS method can be greatly accelerated by means of a simple scaling. We then compare the L-BFGS method with the partitioned quasi-Newton method of Griewank and Toint (1982a). The results show that, for some problems, the partitioned quasi-Newton method is clearly superior to the L-BFGS method. However, we find that for other problems the L-BFGS method is very competitive due to its low iteration cost. We also study the convergence properties of the L-BFGS method, and prove global convergence on uniformly convex problems.
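The L-BFGS direction can be computed with the standard two-loop recursion over the last m correction pairs (s_i, y_i), and the "simple scaling" mentioned above refers to choosing the initial matrix as H_k^0 = (s^T y / y^T y) I from the most recent pair. The following Python sketch is illustrative only: the function names, the toy convex quadratic, the memory size m = 5, and the backtracking (Armijo) step rule are assumptions, not the authors' implementation; the paper itself uses a line search satisfying the Wolfe conditions.

import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    # Two-loop recursion: returns -H_k * grad built from the stored pairs
    # s_i = x_{i+1} - x_i and y_i = g_{i+1} - g_i (newest pairs last).
    q = grad.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q = q - alpha * y
    # Simple scaling of the initial matrix: H_k^0 = gamma_k * I with
    # gamma_k = s^T y / y^T y for the most recent pair.
    gamma = (np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
             if s_list else 1.0)
    r = gamma * q
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return -r  # quasi-Newton search direction

# Toy usage on a uniformly convex quadratic (hypothetical example).
H = np.linspace(0.1, 1.0, 100)          # diagonal of the quadratic's Hessian
f = lambda x: 0.5 * np.dot(x, H * x)
grad_f = lambda x: H * x

x, m, s_list, y_list = np.ones(100), 5, [], []
for k in range(30):
    g = grad_f(x)
    d = lbfgs_direction(g, s_list, y_list)
    t = 1.0
    # Simple backtracking (Armijo) step; adequate for this convex toy problem.
    while f(x + t * d) > f(x) + 1e-4 * t * np.dot(g, d):
        t *= 0.5
    x_new = x + t * d
    s_list.append(x_new - x)
    y_list.append(grad_f(x_new) - g)
    if len(s_list) > m:                  # keep only the m most recent pairs
        s_list.pop(0)
        y_list.pop(0)
    x = x_new
print(f(x))                              # should be close to 0

Storing only the m most recent pairs keeps the cost per iteration at O(mn) operations, which is the low iteration cost the abstract contrasts with the partitioned quasi-Newton method.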

References

Dong C. Liu, and Jorge Nocedal (1989). "On the Limited Memory BFGS Method for Large Scale Optimization." In: Mathematical Programming. doi:10.1007/BF01589116