Newton's method for constrained optimization

Jonathan Goodman

Research output: Contribution to journal › Article › peer-review


We derive a quadratically convergent algorithm for minimizing a nonlinear function subject to nonlinear equality constraints. We show, following Kaufman [4], how to compute efficiently the derivative of a basis of the subspace tangent to the feasible surface. The derivation minimizes the use of Lagrange multipliers, producing multiplier estimates as a by-product of other calculations. An extension of Kantorovich's theorem shows that the algorithm maintains quadratic convergence even if the basis of the tangent space changes abruptly from iteration to iteration. The algorithm and its quadratic convergence are known, but the derivation is new, simple, and suggests several new modifications of the algorithm.
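To make the setting concrete, the sketch below shows the classical full-space Newton (SQP) iteration on the first-order optimality (KKT) conditions for an equality-constrained problem. This is not the paper's tangent-basis formulation; the toy problem, function names, and starting point are illustrative assumptions chosen only to exhibit the quadratic convergence discussed in the abstract.

```python
# Minimal sketch (not the paper's algorithm): Newton's method applied to the
# KKT conditions of
#     minimize f(x)   subject to   c(x) = 0,
# on a toy problem with f(x) = x1^2 + x2^2 and c(x) = x1^2 + x2 - 1.
# With L(x, lam) = f(x) - lam * c(x), the KKT residual is
#     F(x, lam) = (grad_x L(x, lam), c(x)),
# and each step solves the 3x3 Newton system J d = -F.

def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))  # pivot row
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            r = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= r * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def newton_kkt(x1, x2, lam, iters=6):
    """Run a few Newton steps on the KKT system; converges quadratically near a solution."""
    for _ in range(iters):
        # Residual F = (grad_x L, c).
        F = [2*x1 - 2*lam*x1, 2*x2 - lam, x1**2 + x2 - 1]
        # Jacobian of F: Hessian of L in the x-block, bordered by the
        # constraint gradient (with a minus sign in the lambda column).
        J = [[2 - 2*lam, 0.0, -2*x1],
             [0.0,       2.0, -1.0],
             [2*x1,      1.0,  0.0]]
        d = solve3(J, [-Fi for Fi in F])
        x1, x2, lam = x1 + d[0], x2 + d[1], lam + d[2]
    return x1, x2, lam

# From the starting guess (1, 1) with lam = 0, the iteration converges to the
# constrained minimizer (1/sqrt(2), 1/2) with multiplier lam = 1.
x1, x2, lam = newton_kkt(1.0, 1.0, 0.0)
```

The paper's contribution is a different route to the same step: working in a basis of the tangent space of the constraint surface, which avoids forming the full bordered system and yields multiplier estimates as a by-product.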

Original language: English (US)
Pages (from-to): 162-171
Number of pages: 10
Journal: Mathematical Programming
Issue number: 2
State: Published - Nov 1985


Keywords

  • Constrained Optimization
  • Newton's Method
  • Quadratic Convergence

ASJC Scopus subject areas

  • Software
  • General Mathematics


