Stochastic Control

Stochastic control is a subfield of control theory that deals with the existence of uncertainty, either in the observations or in the noise that drives the evolution of the system. It aims to design the time path of the controlled variables that performs the desired control task with minimum cost, in some defined sense, despite the presence of this noise. The system designer assumes that random noise with a known probability distribution affects the evolution and observation of the state variables. The context may be either discrete time or continuous time.
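As a minimal sketch of the discrete-time case, consider a linear system driven by additive Gaussian noise with a quadratic cost (the classic linear-quadratic setting). All matrices and parameter values below are illustrative assumptions, not part of the original text; the backward Riccati recursion computes the feedback gains, and the forward pass simulates the controlled system under the random noise.

```python
import numpy as np

# Dynamics: x_{t+1} = A x_t + B u_t + w_t,  with w_t ~ N(0, W).
# Cost:     sum_t (x_t' Q x_t + u_t' R u_t), minimized over the horizon.
A = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (double integrator)
B = np.array([[0.0], [1.0]])             # control input matrix
Q = np.eye(2)                            # state cost weight
R = np.array([[1.0]])                    # control cost weight
W = 0.01 * np.eye(2)                     # process-noise covariance (known to designer)
T = 50                                   # time horizon

# Backward Riccati recursion for the optimal feedback gains u_t = -K_t x_t.
# With additive noise of known distribution and full state observation,
# the gains coincide with those of the noise-free problem.
P = Q.copy()
gains = []
for _ in range(T):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ A - A.T @ P @ B @ K
    gains.append(K)
gains.reverse()

# Forward simulation: random noise with known covariance W drives the
# evolution of the state, and the feedback policy counteracts it.
rng = np.random.default_rng(0)
x = np.array([[5.0], [0.0]])
for K in gains:
    u = -K @ x                                            # optimal control
    w = rng.multivariate_normal(np.zeros(2), W).reshape(2, 1)
    x = A @ x + B @ u + w                                 # noisy state update

final_norm = float(np.linalg.norm(x))
print(final_norm)  # state is driven near the origin despite the noise
```

The design choice illustrated here is certainty equivalence: because the noise is additive with known distribution and enters linearly, the optimal gains can be computed as if the system were deterministic, and the noise only affects the realized trajectory and cost.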