Abstract
We derive a quadratically convergent algorithm for minimizing a nonlinear function subject to nonlinear equality constraints. We show, following Kaufman [4], how to efficiently compute the derivative of a basis of the subspace tangent to the feasible surface. The derivation minimizes the use of Lagrange multipliers, producing multiplier estimates as a by-product of other calculations. An extension of Kantorovich's theorem shows that the algorithm maintains quadratic convergence even if the basis of the tangent space changes abruptly from iteration to iteration. The algorithm and its quadratic convergence are known, but the derivation is new and simple, and it suggests several new modifications of the algorithm.
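The class of methods the abstract describes can be illustrated by applying Newton's method to the first-order optimality (KKT) system of an equality-constrained problem, which is the standard route to the quadratic convergence analyzed in the paper. The sketch below is illustrative only and is not the paper's algorithm: the concrete problem (minimize x + y on the unit circle), the starting point, and all function names are assumptions, and it iterates on the full KKT system rather than working in a tangent-space basis.

```python
# Illustrative sketch (not the paper's method): Newton's method on the
# KKT system of   min x + y   s.t.   x^2 + y^2 = 1.
# Stationarity of the Lagrangian L = x + y + lam*(x^2 + y^2 - 1) plus the
# constraint gives a 3x3 nonlinear system; Newton's iteration on it
# converges quadratically near the solution x = y = -1/sqrt(2).

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def kkt_newton(x, y, lam, iters=8):
    """Newton iteration on grad L = 0, c(x, y) = 0."""
    for _ in range(iters):
        # Residual: gradient of the Lagrangian, then the constraint value.
        F = [1 + 2 * lam * x, 1 + 2 * lam * y, x * x + y * y - 1]
        # Jacobian of the KKT residual (the KKT matrix).
        J = [[2 * lam, 0.0, 2 * x],
             [0.0, 2 * lam, 2 * y],
             [2 * x, 2 * y, 0.0]]
        dx, dy, dlam = solve3(J, [-F[0], -F[1], -F[2]])
        x, y, lam = x + dx, y + dy, lam + dlam
    return x, y, lam

x, y, lam = kkt_newton(-0.8, -0.6, 0.5)
# Converges to x = y = -1/sqrt(2) with multiplier lam = 1/sqrt(2).
```

As in the paper's setting, the multiplier estimate `lam` emerges as a by-product of the same linear solve that produces the step, rather than being computed separately.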
Original language | English (US) |
---|---|
Pages (from-to) | 162-171 |
Number of pages | 10 |
Journal | Mathematical Programming |
Volume | 33 |
Issue number | 2 |
State | Published - Nov 1985 |
Keywords
- Constrained Optimization
- Newton's Method
- Quadratic Convergence
ASJC Scopus subject areas
- Software
- General Mathematics