EXISTENCE THEORY FOR STOCHASTIC OPTIMAL CONTROL SYSTEMS

W.G. NICHOLS (Department of Mathematics, University of South Florida, Tampa, Florida 33620 (U.S.A.))
CHRIS P. TSOKOS (Department of Mathematics, University of South Florida, Tampa, Florida 33620 (U.S.A.))

Kybernetes

ISSN: 0368-492X

Article publication date: 1 March 1975

Abstract

The aim of the present paper is to present certain existence theorems for stochastic control systems whose state variables, χ(t;ω), are continuous functions from the set R+ = {t : t ≥ 0} into the space L2(Ω, A, μ). That is, for each t ∈ R+, χ(t;ω) is a vector‐valued random variable whose second absolute moment exists. The admissible controls, u = u(t), are taken as measurable functions of t only. The initial time is assumed to be fixed, while the terminal time tf(ω) is allowed to vary with ω ∈ Ω. The usual space constraints and boundary conditions are also allowed to vary with ω ∈ Ω. The cost functional is taken to be a continuous functional over a suitable class of continuous functions.
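In the abstract's notation, the setting can be sketched as follows (a paraphrase of the stated hypotheses, not the authors' exact formulation; the labels J and F for the cost functional are illustrative and do not appear in the abstract):

```latex
% State: a continuous map from R+ into the space of second-order
% random vectors, i.e. finite second absolute moment for each t.
\chi(\,\cdot\,;\omega)\colon \mathbb{R}_{+} = \{t : t \ge 0\}
  \longrightarrow L_{2}(\Omega, \mathcal{A}, \mu),
\qquad \mathbb{E}\,\lVert \chi(t;\omega)\rVert^{2} < \infty.

% Admissible controls depend measurably on t alone (not on omega):
u = u(t), \qquad u \text{ measurable on } \mathbb{R}_{+}.

% Random horizon: the initial time is fixed, the terminal time,
% constraints, and boundary conditions may vary with omega:
t \in [t_{0},\, t_{f}(\omega)], \qquad \omega \in \Omega.

% Cost: a continuous functional F over a suitable class of
% continuous state trajectories:
J(u) = F\bigl(\chi(\,\cdot\,;\omega)\bigr).
```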

Citation

NICHOLS, W.G. and TSOKOS, C.P. (1975), "EXISTENCE THEORY FOR STOCHASTIC OPTIMAL CONTROL SYSTEMS", Kybernetes, Vol. 4 No. 3, pp. 143-148. https://doi.org/10.1108/eb005388

Publisher: MCB UP Ltd

Copyright © 1975, MCB UP Limited
