Neural networks dominate the modern machine learning landscape, but their training and performance remain sensitive to empirically chosen hyperparameters. Automated machine learning (AutoML) techniques aim to automate and optimize this empirical parameter search, in order to maximize performance under limited computational resources.
Current AutoML efforts focus on configurations common to deep learning models, such as architecture, loss function, learning rate, and optimization algorithm. Nevertheless, the training data and its quality are treated as constant, in contrast to their importance in determining the quality of the trained model. In this talk, Moses will propose an integrated approach to AutoML that adds, alongside the off-the-shelf hyperparameter optimizer, parameterization over the metadata population selected for training.
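To make the idea concrete, here is a minimal, hypothetical sketch (not the actual allegro.ai system): a plain random-search optimizer whose search space includes a data-population parameter (the fraction of training samples drawn from a "clean" source) alongside a classic hyperparameter (learning rate). The toy `train_and_score` function stands in for a real training run.

```python
import random

def train_and_score(learning_rate, clean_fraction):
    # Stand-in for a real training run: in this toy objective the score
    # peaks at lr = 0.1 and a 70/30 mix of clean vs. noisy data.
    return 1.0 - abs(learning_rate - 0.1) - abs(clean_fraction - 0.7)

def random_search(trials=200, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        params = {
            # Classic training-task hyperparameter.
            "learning_rate": rng.uniform(0.001, 0.5),
            # Data-population parameter: which data is selected for training
            # is searched over, rather than held constant.
            "clean_fraction": rng.uniform(0.0, 1.0),
        }
        score = train_and_score(**params)
        if best is None or score > best[0]:
            best = (score, params)
    return best

best_score, best_params = random_search()
print(best_params)
```

The point of the sketch is only that the data-selection knob enters the same search loop as any other hyperparameter; in a real system the optimizer would be an off-the-shelf tool and `clean_fraction` would be replaced by richer metadata-driven selection criteria.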
Guttmann is a computer vision and deep learning expert, with a two-decade track record of leading and driving the execution of large-scale, complex products across a multitude of disciplines. Before co-founding allegro.ai, he founded an innovative semi-automatic 3D conversion company, built face recognition technologies, and implemented wavelet-based compression on embedded systems for several companies.