Stochastic nonlinear control: A unified framework for stability, dissipativity, and optimality

Author(s)
Rajpurohit, Tanmay
Associated Organization(s)
Organizational Unit
Daniel Guggenheim School of Aerospace Engineering
Abstract
In this work, we develop connections between stochastic stability theory and stochastic optimal control. First, we develop Lyapunov and converse Lyapunov theorems for stochastic semistable nonlinear dynamical systems. Semistability is the property whereby the solutions of a stochastic dynamical system almost surely converge to (not necessarily isolated) equilibrium points that are Lyapunov stable in probability and are determined by the system initial conditions. We then develop a unified framework for optimal nonlinear analysis and feedback control of nonlinear stochastic dynamical systems. Specifically, we provide a simplified and tutorial framework for stochastic optimal control, focusing on connections between stochastic Lyapunov theory and stochastic Hamilton-Jacobi-Bellman theory. In particular, we show that asymptotic stability in probability of the closed-loop nonlinear system is guaranteed by a Lyapunov function that is the solution to the steady-state form of the stochastic Hamilton-Jacobi-Bellman equation, thereby guaranteeing both stochastic stability and optimality. Moreover, extensions to stochastic finite-time and partial-state stability and optimal stabilization are also addressed. Finally, we extend the notion of dissipativity for deterministic dynamical systems to controlled Markov diffusion processes and show the utility of the general concept of dissipation for stochastic systems.
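The steady-state stochastic Hamilton-Jacobi-Bellman equation and the stochastic dissipation inequality referenced in the abstract can be sketched as follows for a controlled Itô diffusion. This is the standard textbook setting, and the symbols (drift F, diffusion D, running cost L, value function V, storage function V_s, supply rate s) are generic placeholders rather than the dissertation's own notation.

% Assumed setting: controlled Ito diffusion driven by a Wiener process w(t),
% with infinite-horizon expected cost J
\[
  \mathrm{d}x(t) = F(x(t),u(t))\,\mathrm{d}t + D(x(t),u(t))\,\mathrm{d}w(t),
  \qquad
  J(x_0,u(\cdot)) = \mathbb{E}^{x_0}\!\left[\int_0^{\infty} L(x(t),u(t))\,\mathrm{d}t\right].
\]
% Steady-state stochastic HJB equation for the value function V;
% V'(x) denotes the (row) gradient of V and V''(x) its Hessian
\[
  0 = \min_{u \in U}\Big[\,L(x,u) + V'(x)F(x,u)
      + \tfrac{1}{2}\operatorname{tr}\!\big(D(x,u)^{\mathrm{T}} V''(x)\, D(x,u)\big)\Big].
\]
% Stochastic dissipation inequality for a storage function V_s and
% supply rate s(u, y), taken in expectation over sample paths
\[
  \mathbb{E}\big[V_{\mathrm{s}}(x(t))\big]
  \le \mathbb{E}\big[V_{\mathrm{s}}(x(t_0))\big]
    + \mathbb{E}\!\left[\int_{t_0}^{t} s\big(u(\sigma),y(\sigma)\big)\,\mathrm{d}\sigma\right],
  \qquad t \ge t_0.
\]

If a positive-definite V solves the HJB equation and the minimizing feedback renders the infinitesimal generator of the closed-loop diffusion applied to V negative definite, then V doubles as a stochastic Lyapunov function, which is precisely the stability-optimality connection the abstract describes.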
Date
2018-02-06
Resource Type
Text
Resource Subtype
Dissertation