Abstract
We consider optimization problems with objective and constraint functions that may be nonconvex and nonsmooth. Problems of this type arise in important applications, many having solutions at points of nondifferentiability of the problem functions. We present a line search algorithm for situations when the objective and constraint functions are locally Lipschitz and continuously differentiable on open dense subsets of ℝⁿ. Our method is based on a sequential quadratic programming (SQP) algorithm that uses an ℓ₁ penalty to regularize the constraints. A process of gradient sampling (GS) is employed to make the search direction computation effective in nonsmooth regions. We prove that our SQP-GS method is globally convergent to stationary points with probability one and illustrate its performance with a MATLAB implementation.
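The gradient-sampling idea the abstract refers to can be illustrated, for the unconstrained case, with a minimal sketch: sample gradients at randomly perturbed points near the current iterate and take the negative of the minimum-norm element of their convex hull as a search direction. All names and parameters below are illustrative assumptions, not the paper's SQP-GS implementation, and the small quadratic subproblem is solved here with a simple Frank-Wolfe loop rather than the paper's QP machinery.

```python
import numpy as np

def min_norm_in_hull(G, iters=200):
    """Approximate the minimum-norm element of conv{rows of G}
    via Frank-Wolfe on the simplex (illustrative stand-in for a
    QP subproblem solver)."""
    m = G.shape[0]
    H = G @ G.T                      # Gram matrix of sampled gradients
    w = np.full(m, 1.0 / m)          # start at the simplex barycenter
    for k in range(iters):
        i = np.argmin(H @ w)         # vertex minimizing the linearization
        gamma = 2.0 / (k + 2.0)      # standard Frank-Wolfe step size
        w *= 1.0 - gamma
        w[i] += gamma
    return G.T @ w                   # convex combination of gradients

def gs_direction(x, grad, radius=0.1, samples=10, rng=None):
    """One gradient-sampling direction: gradients at x and at nearby
    randomly perturbed points stand in for the Clarke subdifferential."""
    rng = np.random.default_rng(0) if rng is None else rng
    pts = [x] + [x + radius * rng.standard_normal(x.size)
                 for _ in range(samples)]
    G = np.array([grad(p) for p in pts])
    return -min_norm_in_hull(G)      # approximate steepest descent

# Smooth sanity check: f(x) = 0.5*||x||^2 has grad f(x) = x, so the
# direction at x = (1, 0) should point roughly toward -x.
d = gs_direction(np.array([1.0, 0.0]), lambda x: x)
```

Near a point of nondifferentiability (e.g. f(x) = |x| at 0), the sampled gradients of ±1 place 0 inside the convex hull, so the computed direction shrinks; this is the stationarity signal GS exploits. The actual SQP-GS method of the paper embeds such sampled gradients in an SQP step with an ℓ₁ penalty on the constraints.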
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 474-500 |
| Number of pages | 27 |
| Journal | SIAM Journal on Optimization |
| Volume | 22 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2012 |
Keywords
- Constrained optimization
- Exact penalization
- Gradient sampling
- Nonconvex optimization
- Nonsmooth optimization
- Sequential quadratic programming
ASJC Scopus subject areas
- Software
- Theoretical Computer Science