This book focuses on two-time-scale Markov chains in discrete time. Our motivation stems from existing and emerging applications in optimization and control of complex systems in manufacturing, wireless communication, and financial engineering. Much of our effort in this book is devoted to designing system models arising from various applications, analyzing them via analytic and probabilistic techniques, and developing feasible computational schemes. Our main concern is to reduce the inherent system complexity. Although each of the applications has its own distinct characteristics, all of them are closely related through the modeling of uncertainty due to jump or switching random processes.

One of the salient features of this book is the use of multiple time scales in Markov processes and their applications. Intuitively, not all parts or components of a large-scale system evolve at the same rate. Some of them change rapidly and others vary slowly. The different rates of variation allow us to reduce complexity via decomposition and aggregation. It would be ideal if we could divide a large system into its smallest irreducible subsystems, completely separable from one another, and treat each subsystem independently. However, this is often infeasible in reality due to various physical constraints and other considerations. Thus, we have to deal with situations in which the systems are only nearly decomposable, in the sense that there are weak links among the irreducible subsystems, which dictate the occasional regime changes of the system. An effective way to treat such near decomposability is time-scale separation. That is, we set up the systems as if there were two time scales, fast vs. slow. Following the time-scale separation, we use singular perturbation methodology to treat the underlying systems.
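To make the notion of near decomposability concrete, a standard two-time-scale formulation (the notation below is illustrative and not necessarily the one adopted in the book) splits the one-step transition matrix of the chain into a dominant block-diagonal part and a small perturbation:

\[
  P^{\varepsilon} \;=\; \widetilde{P} + \varepsilon Q,
  \qquad
  \widetilde{P} \;=\; \operatorname{diag}\bigl(\widetilde{P}^{1},\ldots,\widetilde{P}^{l}\bigr),
\]

where each \(\widetilde{P}^{k}\) is the transition matrix of an irreducible subchain governing the fast, within-subsystem motion, \(Q\) is a generator-type matrix providing the weak links among the subsystems, and \(0 < \varepsilon \ll 1\) measures how rarely regime changes occur. Aggregating the states of each block into a single state then yields a reduced-order chain that tracks only the slow regime switches.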
Contents
Prologue and Preliminaries.
Introduction, Overview, and Examples.
Mathematical Preliminaries.
Asymptotic Properties.
Asymptotic Expansions.
Occupation Measures.
Exponential Bounds.
Interim Summary and Extensions.
Applications.
Stability of Dynamic Systems.
Filtering.
Markov Decision Processes.
LQ Controls.
Mean-Variance Controls.
Production Planning.
Stochastic Approximation.