Abstract
Four steplength algorithms are presented for minimizing a class of nondifferentiable functions which includes functions arising from l1 and l∞ approximation problems and penalty functions arising from constrained optimization problems. Two algorithms are given for the case when derivatives are available wherever they exist and two for the case when they are not available. We take the view that although a simple steplength algorithm may be all that is required to meet convergence criteria for the overall algorithm, from the point of view of efficiency it is important that the step achieve as large a reduction in the function value as possible, given a certain limit on the effort to be expended. The algorithms include the facility for varying this limit, producing anything from an algorithm requiring a single function evaluation to one doing an exact linear search. They are based on univariate minimization algorithms which we present first. These are normally at least quadratically convergent when derivatives are used and superlinearly convergent otherwise, regardless of whether or not the function is differentiable at the minimum.
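As a rough illustration of the kind of routine the abstract describes (not the paper's algorithms), the sketch below shows a derivative-free steplength procedure that minimizes phi(t) = f(x + t*d) along a search direction, with a caller-chosen evaluation budget ranging from a cheap trial step to a near-exact linear search. All names (`steplength`, `bracket_minimum`, `golden_section`, `max_evals`) and the bracketing/golden-section strategy are illustrative assumptions, not taken from the paper; the budget is treated as an approximate cap.

```python
import math

GOLD = (math.sqrt(5.0) - 1.0) / 2.0  # golden-section ratio, about 0.618


def bracket_minimum(phi, t0=0.0, step=1.0, max_evals=20):
    """Expand the step until phi starts increasing; return a bracket and evals used."""
    a, fa = t0, phi(t0)
    b, fb = t0 + step, phi(t0 + step)
    evals = 2
    if fb > fa:                      # first step went uphill: search the other way
        a, b, fa, fb = b, a, fb, fa
        step = -step
    c, fc = b + step, phi(b + step)
    evals += 1
    while fc < fb and evals < max_evals:
        step *= 2.0                  # keep doubling while the function decreases
        a, fa = b, fb
        b, fb = c, fc
        c, fc = b + step, phi(b + step)
        evals += 1
    return (min(a, c), max(a, c)), evals


def golden_section(phi, a, b, evals_used, max_evals):
    """Refine the bracket by golden-section search until the budget runs out."""
    x1, x2 = b - GOLD * (b - a), a + GOLD * (b - a)
    f1, f2 = phi(x1), phi(x2)
    evals_used += 2
    while evals_used < max_evals:
        if f1 < f2:                  # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - GOLD * (b - a)
            f1 = phi(x1)
        else:                        # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + GOLD * (b - a)
            f2 = phi(x2)
        evals_used += 1
    return 0.5 * (a + b)


def steplength(f, x, d, max_evals=10):
    """Return a step t reducing f(x + t*d); larger budgets give a more exact search."""
    phi = lambda t: f([xi + t * di for xi, di in zip(x, d)])
    if max_evals < 5:                # cheapest option: accept a unit step if it helps
        return 1.0 if phi(1.0) < phi(0.0) else 0.0
    (a, b), used = bracket_minimum(phi, max_evals=max_evals)
    return golden_section(phi, a, b, used, max_evals)


if __name__ == "__main__":
    # Example: an l1 objective, nondifferentiable at its minimizer.
    f = lambda z: sum(abs(zi) for zi in z)
    x, d = [3.0, -2.0], [-1.0, 1.0]
    for budget in (1, 10, 40):
        t = steplength(f, x, d, max_evals=budget)
        print(budget, t, f([xi + t * di for xi, di in zip(x, d)]))
```

Raising `max_evals` trades extra function evaluations for a larger reduction along the direction, which is the effort/reduction trade-off the abstract refers to.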
| Original language | English (US) |
|---|---|
| Pages (from-to) | 309-331 |
| Number of pages | 23 |
| Journal | Computing |
| Volume | 23 |
| Issue number | 4 |
| DOIs | |
| State | Published - Dec 1979 |
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Numerical Analysis
- Computer Science Applications
- Computational Theory and Mathematics
- Computational Mathematics