Optimality necessary conditions in singular stochastic control problems with nonsmooth data

K. Bahlali, F. Chighoub, B. Djehiche, B. Mezerdi

Research output: Contribution to journal › Article › peer-review


The present paper studies the stochastic maximum principle in singular optimal control, where the state is governed by a stochastic differential equation with nonsmooth coefficients and both classical and singular controls are allowed. The proof of the main result is based on approximating the initial problem by a sequence of control problems with smooth coefficients. We then apply Ekeland's variational principle to this approximating sequence in order to establish necessary conditions satisfied by a sequence of near-optimal controls. Finally, we prove the convergence of the scheme, using Krylov's inequality in the nondegenerate case and the Bouleau-Hirsch flow property in the degenerate one. The adjoint process obtained is given by means of distributional derivatives of the coefficients.
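For orientation, a standard formulation of the kind of singular control problem described above can be sketched as follows; the notation is illustrative and the paper's exact assumptions on the (possibly nonsmooth) coefficients differ:

```latex
% State dynamics driven by a classical control u_t and a singular
% control xi_t (a process of bounded variation, nondecreasing here):
%   dx_t = b(t, x_t, u_t)\,dt + \sigma(t, x_t, u_t)\,dW_t + G_t\,d\xi_t,
%   x_0 = x.
%
% Cost functional to be minimized over admissible pairs (u, xi):
%   J(u, \xi) = \mathbb{E}\Big[ g(x_T)
%       + \int_0^T h(t, x_t, u_t)\,dt
%       + \int_0^T k_t\,d\xi_t \Big].
```

In this generic setting, "nonsmooth data" refers to coefficients such as $b$ and $\sigma$ that need only be Lipschitz (hence differentiable merely in the distributional sense), which is why the adjoint process in the paper is expressed through distributional derivatives.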

Original language: English (US)
Pages (from-to): 479-494
Number of pages: 16
Journal: Journal of Mathematical Analysis and Applications
Issue number: 2
State: Published - Jul 15 2009

Keywords
  • Adjoint process
  • Distributional derivative
  • Maximum principle
  • Singular control
  • Stochastic control
  • Stochastic differential equation
  • Variational principle

ASJC Scopus subject areas

  • Analysis
  • Applied Mathematics


