This is the first comprehensive book on information geometry, written by the founder of the field. It begins with an elementary introduction to dualistic geometry and proceeds to a wide range of applications covering information science, engineering, and neuroscience. It consists of four parts, which on the whole can be read independently. A manifold with a divergence function is introduced first, leading directly to the dualistic structure that lies at the heart of information geometry. This part (Part I) can be understood without any knowledge of differential geometry. Part II then gives an intuitive explanation of modern differential geometry, although most of the book remains accessible without it. Part III concisely presents the information geometry of statistical inference, including time series analysis and semiparametric estimation (the Neyman–Scott problem). The applications addressed in Part IV include current topics in machine learning, signal processing, optimization, and neural networks. The book is interdisciplinary, connecting mathematics, information sciences, physics, and neuroscience, and invites readers to a new world of information and geometry. It is highly recommended for graduate students and researchers seeking new mathematical methods and tools useful in their own fields.
Table of contents
1. Manifold, Divergence and Dually Flat Structure
2. Exponential Families and Mixture Families of Probability Distributions
3. Invariant Geometry of Manifold of Probability Distributions
4. α-Geometry, Tsallis q-Entropy and Positive-Definite Matrices
5. Elements of Differential Geometry
6. Dual Affine Connections and Dually Flat Manifold
7. Asymptotic Theory of Statistical Inference
8. Estimation in the Presence of Hidden Variables
9. Neyman–Scott Problem
10. Linear Systems and Time Series
11. Machine Learning
12. Natural Gradient Learning and its Dynamics in Singular Regions
13. Signal Processing and Optimization
Index