Statistical learning theory aims at analyzing complex data with necessarily approximate models. This book is intended for an audience with a graduate background in probability theory and statistics. It will be useful to any reader wondering why it may be a good idea, as is often done in practice, to use a notoriously "wrong" (i.e. over-simplified) model to predict, estimate or classify. This point of view takes its roots in three fields: information theory, statistical mechanics, and PAC-Bayesian theorems. Results on the large deviations of trajectories of Markov chains with rare transitions are also included. They are meant to provide a better understanding of stochastic optimization algorithms commonly used to compute estimators. The author focuses on non-asymptotic bounds on the statistical risk, allowing one to choose adaptively between rich and structured families of models and corresponding estimators. Two mathematical objects pervade the book: entropy and Gibbs measures. The goal is to show how to turn them into versatile and efficient technical tools that will stimulate further studies and results.
Olivier Catoni
Statistical Learning Theory and Stochastic Optimization [PDF ebook]
Ecole d’Ete de Probabilites de Saint-Flour XXXI – 2001
Language English ● Format PDF ● ISBN 9783540445074 ● Editor Jean Picard ● Publisher Springer Berlin Heidelberg ● Published 2004 ● Downloadable 3 times ● Currency EUR ● ID 6376548 ● Copy protection Adobe DRM
Requires a DRM-capable ebook reader