Thursday, June 16, 2016

Optimization Methods for Large-Scale Machine Learning

The preprint also includes a section on large-scale L1 regularization, a topic of interest in compressive sensing (a minimal sketch follows the excerpt below). Of note:
There are important methods that are not included in our presentation, such as the alternating direction method of multipliers (ADMM) [54, 61, 64] and the expectation-maximization (EM) method and its variants [45, 153], but our study covers many of the core algorithmic frameworks in optimization for machine learning, with emphasis on methods and theoretical guarantees that have the largest impact on practical performance.
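To make the L1 regularization angle concrete, here is a minimal sketch of the iterative shrinkage-thresholding algorithm (ISTA) for the L1-regularized least-squares problem. This is an illustration of the problem class, not the specific method developed in the preprint's section, and the names `ista`, `soft_threshold`, and `lam` are my own.

```python
import numpy as np

def soft_threshold(x, tau):
    # Soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, n_iters=500):
    # ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)       # gradient of the least-squares term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Each iteration takes a gradient step on the smooth term and then applies the shrinkage operator, which is what drives coordinates of x toward exact zeros, the sparsity that compressive sensing exploits.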

Optimization Methods for Large-Scale Machine Learning by Léon Bottou, Frank E. Curtis, and Jorge Nocedal

This paper provides a review and commentary on the past, present, and future of numerical optimization algorithms in the context of machine learning applications. Through case studies on text classification and the training of deep neural networks, we discuss how optimization problems arise in machine learning and what makes them challenging. A major theme of our study is that large-scale machine learning represents a distinctive setting in which the stochastic gradient (SG) method has traditionally played a central role while conventional gradient-based nonlinear optimization techniques typically falter. Based on this viewpoint, we present a comprehensive theory of a straightforward, yet versatile SG algorithm, discuss its practical behavior, and highlight opportunities for designing algorithms with improved performance. This leads to a discussion about the next generation of optimization methods for large-scale machine learning, including an investigation of two main streams of research on techniques that diminish noise in the stochastic directions and methods that make use of second-order derivative approximations.
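As a companion to the abstract, here is a minimal sketch of the mini-batch stochastic gradient iteration under stated assumptions: `grad_fi(x, i)` is a user-supplied per-sample gradient, and the fixed step size `lr` and the `batch_size` parameter are illustrative choices rather than the paper's notation. Averaging the gradient over a larger batch is one simple instance of the noise-diminishing techniques the abstract mentions.

```python
import numpy as np

def sgd(grad_fi, x0, n_samples, lr=0.01, batch_size=1, n_epochs=10, seed=0):
    # Mini-batch SG: x <- x - lr * (average of sampled per-sample gradients).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        for _ in range(max(1, n_samples // batch_size)):
            # Draw a mini-batch of sample indices uniformly at random.
            batch = rng.integers(0, n_samples, size=batch_size)
            # Averaging over the batch reduces the variance (noise) of the step.
            g = np.mean([grad_fi(x, i) for i in batch], axis=0)
            x -= lr * g
    return x

# Example: least squares, with f_i(x) = 0.5 * (a_i @ x - b_i)^2.
A = np.random.default_rng(1).normal(size=(100, 5))
b = A @ np.ones(5)
x_hat = sgd(lambda x, i: (A[i] @ x - b[i]) * A[i], np.zeros(5), 100, lr=0.05)
```

Taking `batch_size` equal to `n_samples` recovers the full-batch gradient step, the conventional regime that the abstract notes typically falters at large scale.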




