when certain parameters in the problem tend to limiting values (for example, when the sample size increases indefinitely, the intensity of the noise approaches zero, etc.).

To address the problem of asymptotically optimal estimators, consider the following important case. Let X_1, X_2, …, X_n be independent observations with the joint probability density f(x, θ) (with respect to the Lebesgue measure on the real line), which depends on the unknown parameter θ ∈ Θ ⊂ R^1. It is required to derive the best (asymptotically) estimator θ_n*(X_1, …, X_n) of the parameter θ.

The first question which arises in connection with this problem is how to compare different estimators or, equivalently, how to assess their quality: in terms of the mean square deviation from the parameter, or perhaps in some other way. The presently accepted approach to this problem, resulting from A. Wald's contributions, is as follows: introduce a nonnegative function w_n(θ_1, θ_2), θ_1, θ_2 ∈ Θ (the loss function); given two estimators θ_n^1 and θ_n^2, the estimator for which the expected loss (risk) E_θ w_n(θ_n^j, θ), j = 1 or 2, is smaller is called the better with respect to w_n at the point θ (here E_θ is the expectation evaluated under the assumption that the true value of the parameter is θ). Obviously, such a method of comparison is not without its defects.
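The comparison of estimators by risk described above can be illustrated with a small numerical sketch (not from the book): a Monte Carlo approximation of the risk E_θ w_n(θ_n, θ) under squared-error loss w_n(t, θ) = (t − θ)^2, assuming, for concreteness, i.i.d. observations from N(θ, 1) and two candidate estimators, the sample mean and the sample median. All names and the choice of model here are illustrative assumptions.

```python
import random

def risk(estimator, theta, n, trials=20000, seed=0):
    # Monte Carlo approximation of the risk E_theta w(T_n, theta)
    # under squared-error loss w(t, theta) = (t - theta)**2,
    # with X_1, ..., X_n drawn i.i.d. from N(theta, 1).
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.gauss(theta, 1.0) for _ in range(n)]
        total += (estimator(xs) - theta) ** 2
    return total / trials

def sample_mean(xs):
    return sum(xs) / len(xs)

def sample_median(xs):
    ys = sorted(xs)
    m = len(ys) // 2
    return ys[m] if len(ys) % 2 else 0.5 * (ys[m - 1] + ys[m])

# For N(theta, 1) the mean has risk about 1/n while the median has
# risk about pi/(2n), so the mean is the better estimator (in the
# sense of the text) at every theta under this loss.
```

With n = 50, for instance, the approximated risk of the mean is close to 1/50 = 0.02 and smaller than that of the median, so the mean is "the better with respect to w_n at the point θ" in the terminology above.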
R.Z. Has’minskii & I.A. Ibragimov
Statistical Estimation
Asymptotic Theory
Language: English ● Translated by S. Kotz ● Publisher: Springer New York ● Published 2013 ● ISBN 9781489900272