Mathematical Algorithms for Linear Regression discusses numerous fitting principles related to discrete linear approximations, the corresponding numerical methods, and FORTRAN 77 subroutines. The book explains linear Lp regression, the method of least squares, the Gaussian elimination method, the modified Gram-Schmidt method, the method of least absolute deviations, and the method of least maximum absolute deviation. By using the fitting principle, the investigator can determine which observations should be classified as outliers (those with large errors) and which should not. The text describes the elimination of outliers and the selection of variables when too many, or all available, variables have been measured. Clusterwise linear regression applies when only a few of the relevant variables have been collected or are collectible, assuming that their number is small in relation to the number of observations. The book also examines linear Lp regression with nonnegative parameters, the Kuhn-Tucker conditions, the Householder transformations, and the branch-and-bound method. The text points out that the method of least squares is mainly used for models with nonlinear parameters or for orthogonal distances. The book can serve and benefit mathematicians, students, and professors of calculus, statistics, or advanced mathematics.
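As a rough illustration (not taken from the book itself, and using generic symbols m, n, a_ij, x_j, y_i rather than the book's notation), the fitting principles listed above can be viewed as instances of a single linear Lp criterion: given m observations with values a_i1, ..., a_in and responses y_i, choose parameters x_1, ..., x_n to minimize

\[ \left( \sum_{i=1}^{m} \Big| \, y_i - \sum_{j=1}^{n} a_{ij} x_j \, \Big|^{p} \right)^{1/p}, \qquad 1 \le p < \infty . \]

Here p = 2 corresponds to the method of least squares, p = 1 to the method of least absolute deviations, and the limiting case p → ∞ to the method of least maximum absolute deviation (Chebyshev fit).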
Helmuth Spath
Mathematical Algorithms for Linear Regression [PDF ebook]
Buy this ebook and get one more free!
Language English ● Format PDF ● ISBN 9781483264547 ● Editor Werner Rheinboldt ● Publisher Elsevier Science ● Published 2014 ● Downloadable 3 times ● Currency EUR ● ID 5734815 ● Copy protection Adobe DRM
Requires a DRM-capable ebook reader