Full metadata record
DC Field / Value / Language
dc.contributor.author: K. Bahlali
dc.contributor.author: B. Djehiche
dc.description.abstract: This paper studies the stochastic maximum principle in singular optimal control, where the state is governed by a stochastic differential equation with nonsmooth coefficients and the control has both a classical and a singular component. The proof of the main result is based on approximating the initial problem by a sequence of control problems with smooth coefficients. We then apply Ekeland's variational principle to this approximating sequence in order to establish necessary conditions satisfied by a sequence of near-optimal controls. Finally, we prove the convergence of the scheme, using Krylov's inequality in the nondegenerate case and the Bouleau–Hirsch flow property in the degenerate one. The adjoint process obtained is expressed in terms of distributional derivatives of the coefficients.
dc.subject [en_US]: Stochastic differential equation; Stochastic control; Maximum principle; Singular control; Distributional derivative; Adjoint process; Variational principle
dc.title [en_US]: Optimality necessary conditions in singular stochastic control problems with nonsmooth data
Appears in Collections: Publications Internationales

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.