This book studies mathematical theories of machine learning. The first part explores the optimality and adaptivity of step-size choices for gradient descent to escape strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms for finding local minima in non-convex optimization and, to some degree, for obtaining global minima, based on Newton's second law without friction. In the third part, the authors study subspace clustering with noisy and missing data, a problem well motivated by practical applications in which data are subject to stochastic Gaussian noise and/or have uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection.
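As a small illustration of the saddle-point phenomenon described above (this sketch is not from the book; the test function and step size are chosen purely for illustration): fixed-step gradient descent on a simple non-convex function, started near but not exactly at a strict saddle point, escapes the saddle and converges to a minimizer.

```python
def grad(x, y):
    # Gradient of f(x, y) = (x**2 - 1)**2 / 4 + y**2 / 2,
    # which has a strict saddle at (0, 0) and minimizers at (+/-1, 0).
    return x**3 - x, y

def gradient_descent(x, y, eta=0.1, steps=1000):
    # Plain gradient descent with a fixed step size eta.
    for _ in range(steps):
        gx, gy = grad(x, y)
        x, y = x - eta * gx, y - eta * gy
    return x, y

# Starting slightly off the saddle (0, 0), the iterates drift away from it
# along the unstable direction and settle at the minimizer (1, 0).
x, y = gradient_descent(1e-6, 0.5)
```

Initialized exactly at the saddle, the same iteration would stay there forever; the book's first part studies how step-size choices govern this escape behavior in general.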
Table of Contents
Chapter 1. Introduction.- Chapter 2. General Framework of Mathematics.- Chapter 3. Problem Formulation.- Chapter 4. Development of Novel Techniques of the CoCoSSC Method.- Chapter 5. Further Discussions of the Proposed Method.- Chapter 6. Related Work on Geometry of Non-Convex Programs.- Chapter 7. Gradient Descent Converges to Minimizers.- Chapter 8. A Conservation Law Method Based on Optimization.- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations.- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series.- Chapter 11. Conclusion.
About the Authors
Bin Shi is a Ph.D. candidate in the School of Computing and Information Sciences at Florida International University (FIU), under the supervision of Professor Sitharama S. Iyengar. He received a B.S. in Applied Mathematics from Ocean University of China in 2006, a Master's in Pure Mathematics from Fudan University in 2011, and a Master's in Theoretical Physics from the University of Massachusetts Dartmouth in 2015. His research focuses on the theory of machine learning, especially statistical machine learning and optimization, as well as theoretical computer science.
Dr. S.S. Iyengar is a Distinguished University Professor, the Ryder Professor of Computer Science, and Director of the School of Computing and Information Sciences at Florida International University (FIU), Miami. He is also the founding director of the Discovery Lab. Prior to joining FIU, Dr. Iyengar was the Roy Paul Daniels Distinguished Professor and Chairman of the Computer Science department at Louisiana State University (LSU) for over 20 years. He has also worked as a visiting scientist at Oak Ridge National Laboratory and the Jet Propulsion Laboratory, served as Satish Dhawan Professor at IISc and Homi Bhabha Professor at IGCAR, Kalpakkam, and at the University of Paris, and has visited Tsinghua University and the Korea Advanced Institute of Science and Technology (KAIST), among other institutions. Professor Iyengar is an IEEE Distinguished Visitor, SIAM Distinguished Lecturer, and ACM National Lecturer, and has won many other awards, including the Distinguished Research Master's Award, the Hub Cotton Award for Faculty Excellence (LSU), the Rain Maker Award (LSU), the Florida Information Technology Award (IT2), and the Distinguished Research Award from the Tunisian Mathematical Society. During the last four decades, he has supervised over 55 Ph.D. students, 100 Master's students, and many undergraduate students who are now faculty at major universities worldwide or scientists and engineers at national labs and industries around the world. He has published more than 500 research papers and has authored, co-authored, or edited 22 books, published by MIT Press, John Wiley & Sons, CRC Press, Prentice Hall, Springer-Verlag, IEEE Computer Society Press, and others. One of his books, 'Introduction to Parallel Algorithms', has been translated into Chinese.