A sequential quadratic programming algorithm for nonconvex, nonsmooth constrained optimization

Frank E. Curtis, Michael L. Overton

Research output: Contribution to journal › Article › peer-review


We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations in which the objective and constraint functions are locally Lipschitz and continuously differentiable on open dense subsets of ℝⁿ. Our method is based on a sequential quadratic programming (SQP) algorithm that uses an ℓ₁ penalty to regularize the constraints. A process of gradient sampling (GS) is employed to make the search direction computation effective in nonsmooth regions. We prove that our SQP-GS method is globally convergent to stationary points with probability one and illustrate its performance with a MATLAB implementation.
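To give a feel for the gradient-sampling ingredient alone, here is a minimal unconstrained sketch in Python. This is not the paper's SQP-GS algorithm: it omits the constraints and the ℓ₁-penalty SQP subproblem, and it replaces the quadratic-programming step that finds the minimum-norm element of the sampled gradients' convex hull with a simple Frank-Wolfe loop. All function names, parameters, and tolerances below are illustrative assumptions.

```python
import numpy as np

def gradient_sampling_step(f, grad, x, eps=0.5, m=20, rng=None):
    """One illustrative gradient-sampling step (not the paper's SQP-GS).

    Samples gradients in an eps-ball around x, approximates the
    minimum-norm element of their convex hull with a small Frank-Wolfe
    loop (an assumption for simplicity; a QP solver is typical), then
    takes an Armijo backtracking step along its negative.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n = x.size
    # Sample m points in the eps-ball around x, plus x itself.
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, n) for _ in range(m)]
    G = np.array([grad(p) for p in pts])  # stacked sampled gradients

    # Frank-Wolfe iteration for the min-norm point of conv{rows of G}.
    g = G.mean(axis=0)
    for _ in range(200):
        v = G[np.argmin(G @ g)]        # vertex minimizing <v, g>
        d = v - g
        denom = d @ d
        if denom < 1e-16:
            break
        t = min(1.0, max(0.0, -(g @ d) / denom))  # exact step for ||.||^2
        g = g + t * d

    d = -g                             # approximate descent direction
    # Armijo backtracking line search on f.
    t, fx = 1.0, f(x)
    while f(x + t * d) > fx - 1e-4 * t * (g @ g) and t > 1e-12:
        t *= 0.5
    return x + t * d

# Toy example: minimize the nonsmooth f(x) = |x_0| + 2|x_1|.
f = lambda x: abs(x[0]) + 2.0 * abs(x[1])
grad = lambda x: np.array([np.sign(x[0]), 2.0 * np.sign(x[1])])
x = np.array([1.0, -1.5])
for k in range(50):
    # Shrink the sampling radius as the iterates settle down.
    x = gradient_sampling_step(f, grad, x, eps=max(1e-3, 0.5 ** k))
print(f(x))  # objective has decreased from f(x0) = 4.0
```

In the paper itself the sampled gradients enter an SQP subproblem with the ℓ₁ penalty on the constraints, and the convergence-with-probability-one guarantee depends on that structure; the sketch above only illustrates why sampling nearby gradients makes the search direction meaningful at points of nondifferentiability.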

Original language: English (US)
Pages (from-to): 474-500
Number of pages: 27
Journal: SIAM Journal on Optimization
Issue number: 2
State: Published - 2012


Keywords

  • Constrained optimization
  • Exact penalization
  • Gradient sampling
  • Nonconvex optimization
  • Nonsmooth optimization
  • Sequential quadratic programming

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science


