ensmallen
flexible C++ library for efficient mathematical optimization
Machine learning is a field that is inextricably intertwined with optimization. Countless machine learning techniques depend on the optimization of a given objective function; for instance, classifiers such as logistic regression, metric learning methods like NCA, manifold learning algorithms like MVU, and the entire field of deep learning. Given the attention focused on these problems, fast and practical optimizers are increasingly important.
Therefore, there is a real need for a robust, flexible framework in which new optimizers can be easily developed, and, equally, one in which new objective functions can be easily implemented and optimized with a variety of possible optimizers. However, the current landscape of optimization frameworks for machine learning is not particularly comprehensive. Tools such as Caffe, TensorFlow, and Keras contain optimization frameworks, but they are limited to SGD-type optimizers and can only optimize deep neural networks or related structures; expressing arbitrary machine learning objective functions can be difficult or, in some cases, impossible. Other libraries, like scikit-learn, do have optimizers, but generally not in a coherent framework, and the implementations are often specific to an individual machine learning algorithm. At a higher level, many scientific computing environments, such as SciPy and MATLAB, provide generic optimizers, but these are typically not suitable for large-scale machine learning tasks where, e.g., calculating the full gradient over all of the data may not be feasible.