Stochastic Averaging and Extremum Seeking treats methods inspired by attempts to understand the seemingly non-mathematical question of bacterial chemotaxis, and the application of those methods in other settings. The text presents significant generalizations of existing stochastic averaging theory, developed from scratch because the algorithms that are effective for these systems violate the assumptions of the previous theory. Four main topics are covered.
Stochastic averaging theorems are developed for the analysis of continuous-time nonlinear systems with random forcing, removing prior restrictions on the growth of the nonlinearities and on the finiteness of the time interval. The new theorems serve not only as approximation tools but also as a basis for stability guarantees.
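To convey the flavor of these results, the schematic below sketches a generic stochastic averaging setup; the notation is illustrative and does not reproduce the book's precise assumptions or theorem statements.

```latex
% Illustrative stochastic averaging setup (notation assumed, not the book's)
\[
  \frac{dX^{\varepsilon}(t)}{dt} = f\bigl(X^{\varepsilon}(t),\,Y(t/\varepsilon)\bigr),
  \qquad
  \frac{d\bar{X}(t)}{dt} = \bar{f}\bigl(\bar{X}(t)\bigr),
  \qquad
  \bar{f}(x) \triangleq \int f(x,y)\,\mu(dy).
\]
```

Here Y is an ergodic perturbation process with invariant measure μ, and a small ε enforces time-scale separation; theorems of this type state that, for small ε, trajectories of the original system remain close to those of the average system, and that stability of the average system implies a corresponding (weak or practical) stability of the original one.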
Stochastic extremum-seeking algorithms are introduced for optimization of systems without available models. Both gradient- and Newton-based algorithms are presented, offering the user a choice between simplicity of implementation (gradient) and the ability to achieve a known, user-assignable convergence rate (Newton).
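As a rough illustration of the gradient-based idea (a minimal sketch under assumed parameter values and an assumed map, not the book's exact scheme or tuning), the simulation below perturbs a scalar input with a bounded stochastic dither, demodulates the measured payoff, and integrates the result to climb toward the unknown maximizer.

```python
import numpy as np

# Minimal sketch of gradient-based stochastic extremum seeking on a static map.
# The map J, the OU-driven dither, and all parameter values are illustrative
# assumptions; this is not the book's exact algorithm.

rng = np.random.default_rng(0)

def J(theta):
    return 1.0 - (theta - 2.0) ** 2          # unknown map with maximizer theta* = 2

dt, T = 1e-3, 100.0
a, k = 0.3, 2.0                              # dither amplitude, adaptation gain
q = 25.0                                     # OU "speed": dither is fast relative to k
h = 5.0                                      # washout (high-pass) filter cutoff

theta_hat = 0.0                              # estimate of the maximizer
eta = 0.0                                    # OU state driving the dither sin(eta)
xi = J(theta_hat)                            # low-pass state tracking the DC of y

for _ in range(int(T / dt)):
    # Ornstein-Uhlenbeck process; sin(eta) is a bounded stochastic dither
    eta += -q * eta * dt + np.sqrt(q * dt) * rng.normal()
    s = np.sin(eta)

    y = J(theta_hat + a * s)                 # measure the payoff at the dithered input
    xi += h * (y - xi) * dt                  # low-pass filter; (y - xi) removes the DC
    theta_hat += k * s * (y - xi) * dt       # demodulate and integrate: gradient ascent

print(f"theta_hat ~ {theta_hat:.2f}  (true maximizer: 2.0)")
```

The washout filter and a dither that is fast relative to the adaptation gain reflect the usual time-scale separation in extremum seeking; a Newton-based variant would additionally estimate the curvature of the map and invert it so that the convergence rate no longer depends on the unknown Hessian.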
The design of algorithms for non-cooperative/adversarial games is described, together with an analysis of their convergence to Nash equilibria. The algorithms are illustrated on models of economic competition and on problems of deployment of teams of robotic vehicles.
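To illustrate the game setting (again a minimal sketch with an assumed quadratic game, not the book's algorithms or examples), each player below runs its own extremum-seeking loop on its measured payoff alone, with an independent dither, and the action estimates settle near the Nash equilibrium.

```python
import numpy as np

# Minimal sketch of Nash equilibrium seeking in a two-player quadratic game.
# The payoffs, the dithers, and all parameter values are illustrative
# assumptions; each player measures only its own payoff value.

rng = np.random.default_rng(1)

def payoffs(u):
    u1, u2 = u
    return np.array([-u1**2 + u1 * u2 + u1,     # player 1's payoff
                     -u2**2 + u1 * u2 + u2])    # player 2's payoff; Nash eq. at (1, 1)

dt, T = 1e-3, 200.0
a, k = 0.3, 2.0                     # dither amplitude, adaptation gain
q = np.array([25.0, 35.0])          # distinct OU speeds keep the dithers uncorrelated
h = 5.0                             # washout filter cutoff

u_hat = np.zeros(2)                 # action estimates
eta = np.zeros(2)                   # OU states driving the dithers
xi = payoffs(u_hat)                 # low-pass states tracking each payoff's DC

for _ in range(int(T / dt)):
    eta += -q * eta * dt + np.sqrt(q * dt) * rng.normal(size=2)
    s = np.sin(eta)

    y = payoffs(u_hat + a * s)      # each player probes with its own dither
    xi += h * (y - xi) * dt
    u_hat += k * s * (y - xi) * dt  # demodulate with own dither only

print("u_hat ~", np.round(u_hat, 2), "(Nash equilibrium at [1, 1])")
```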
Bacterial locomotion, such as chemotaxis in E. coli, is explored with the aim of identifying two simple feedback laws for climbing nutrient gradients. Stochastic extremum seeking is shown to be a biologically plausible interpretation of chemotaxis. For the same chemotaxis-inspired stochastic feedback laws, the book also provides a detailed convergence analysis for models of nonholonomic robotic vehicles operating in GPS-denied environments.
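The simulation below is a crude run-and-tumble caricature of such gradient climbing (the concentration field, the tumble rates, and all parameters are assumptions, and this is not one of the book's feedback laws): an agent moving at constant speed tumbles less often when its nutrient reading is improving, which biases its random walk toward the source.

```python
import numpy as np

# Crude run-and-tumble caricature of climbing a nutrient gradient.
# The concentration field, the tumble rates, and all parameters are
# illustrative assumptions, not the book's feedback laws.

rng = np.random.default_rng(2)

def concentration(p):
    return np.exp(-0.05 * np.sum((p - np.array([10.0, 5.0])) ** 2))  # source at (10, 5)

dt, T = 0.01, 400.0
v = 0.5                        # constant forward speed
lam_up, lam_down = 0.2, 2.0    # tumble rates: reading improving vs. worsening

p = np.zeros(2)                # position
psi = 0.0                      # heading
c_prev = concentration(p)

for _ in range(int(T / dt)):
    p = p + v * np.array([np.cos(psi), np.sin(psi)]) * dt   # run along the heading
    c = concentration(p)
    lam = lam_up if c > c_prev else lam_down                # feedback on the reading only
    if rng.random() < lam * dt:                             # tumble: pick a random heading
        psi = rng.uniform(0.0, 2.0 * np.pi)
    c_prev = c

print("final position ~", np.round(p, 1), "(source at [10, 5])")
```

Even this caricature uses only the scalar reading at the agent's current position, mirroring the GPS-denied setting in which the book's source-seeking laws operate.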
The book contains block diagrams and several simulation examples, including examples arising from bacterial locomotion, multi-agent robotic systems, and economic market models.
Stochastic Averaging and Extremum Seeking will be informative for control engineers with backgrounds in electrical, mechanical, chemical, and aerospace engineering, as well as for applied mathematicians. Economics researchers, biologists, biophysicists, and roboticists will find the application examples instructive.
Contents
Stochastic Averaging for Asymptotic Stability
Stochastic Averaging for Practical Stability
Single-parameter Stochastic Extremum Seeking
Stochastic Source Seeking for Nonholonomic Vehicles
Stochastic Source Seeking with Tuning of Forward Velocity
Multi-parameter Stochastic Extremum Seeking and Slope Seeking
Stochastic Nash Equilibrium Seeking for Games with General Nonlinear Payoffs
Nash Equilibrium Seeking for Quadratic Games and Application to Oligopoly Markets and Vehicle Deployment
Newton-based Stochastic Extremum Seeking
About the authors
Miroslav Krstic is the author of several books on adaptive control, stochastic nonlinear control, extremum seeking, and control of PDEs. Several of these books have had a high impact in the control field, inspiring many researchers to work on the topics they cover and to apply their tools in research and in practice.
Shujun Liu is a young researcher in mathematics and control theory in China with strong connections to the leading research groups in control theory at the Chinese Academy of Sciences. Her doctoral work on stochastic stability and stabilization has had considerable influence on a number of research groups in China that have taken up this topic following her initial work with her doctoral advisor, Professor Jifeng Zhang.
Much of the material in this book was developed while the first author was a postdoctoral scholar with the second author at the University of California, San Diego.