The author considers the problem of sequential probability forecasting in the most general setting, where the observed data may exhibit an arbitrary form of stochastic dependence. All the results presented are theoretical, but they concern the foundations of problems in applied areas such as machine learning, information theory and data compression.
Table of Contents
Introduction.- Notation and Definitions.- Prediction in Total Variation: Characterizations.- Prediction in KL-Divergence.- Decision-Theoretic Interpretations.- Middle-Case: Combining Predictors Whose Loss Vanishes.- Conditions Under Which One Measure Is a Predictor for Another.- Conclusion and Outlook.
About the Author
Dr. Daniil Ryabko (HDR) holds a full-time position at INRIA; he has recently been on research assignments in Belize and Madagascar.