Advanced Optimization Techniques, Application, Deep Learning, Experiment Management, Modeling Best Practices, Natural Language
Efficient BERT: Find your optimal model with Multimetric Bayesian Optimization
SigOpt ML Engineer Meghana Ravikumar shares how concurrently tuning metrics such as model accuracy and the number of model parameters allows us to distill BERT and assess the trade-offs between model size and performance.
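Concurrently tuning two competing metrics produces a Pareto frontier of configurations rather than a single winner. The sketch below illustrates the idea with hypothetical (accuracy, parameter-count) results; the numbers and names are made up for illustration and are not from the webinar.

```python
# Hypothetical tuning results: each candidate model has an accuracy
# (to maximize) and a parameter count in millions (to minimize).
candidates = [
    {"name": "a", "accuracy": 0.91, "params_m": 110},
    {"name": "b", "accuracy": 0.89, "params_m": 66},
    {"name": "c", "accuracy": 0.88, "params_m": 90},
    {"name": "d", "accuracy": 0.86, "params_m": 40},
]

def dominates(p, q):
    """p dominates q if p is at least as accurate and at least as small,
    and strictly better on at least one of the two metrics."""
    better_or_equal = p["accuracy"] >= q["accuracy"] and p["params_m"] <= q["params_m"]
    strictly_better = p["accuracy"] > q["accuracy"] or p["params_m"] < q["params_m"]
    return better_or_equal and strictly_better

def pareto_front(points):
    """Keep only the points that no other point dominates."""
    return [q for q in points if not any(dominates(p, q) for p in points)]

frontier = pareto_front(candidates)
print([p["name"] for p in frontier])  # "c" is dominated by "b"
```

Each surviving point represents a different, defensible trade-off between model size and performance; choosing among them is a business decision, not an optimization one.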
SigOpt Webinar: Warm Start Tuning with Prior Beliefs
SigOpt is designed to accelerate hyperparameter optimization by intelligently trading off exploration and exploitation with an ensemble of Bayesian and global optimization algorithms that do not require access to proprietary prior insights on a model or parameters. But our users often have deep insights on their models and parameters that could guide this tuning process. Michael McCourt, Head of Research, walks through how SigOpt's newest feature, Prior Beliefs, incorporates this type of prior knowledge into SigOpt's hyperparameter optimization process.
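One way to see the effect of a prior belief is to compare sampling a hyperparameter from an informed distribution against an uninformed one. This is a minimal, standalone sketch of the idea, not the SigOpt Prior Beliefs API; the "good learning rates cluster around 1e-3" belief and the numeric ranges are assumptions for illustration.

```python
import math
import random

random.seed(0)

def sample_with_prior(n):
    # Hypothetical prior belief: good learning rates cluster around 1e-3.
    # Encode it as a log-normal distribution centered at log(1e-3).
    return [math.exp(random.gauss(math.log(1e-3), 0.5)) for _ in range(n)]

def sample_uniform(n, low=1e-5, high=1e-1):
    # Uninformed baseline: log-uniform over the whole search range.
    return [math.exp(random.uniform(math.log(low), math.log(high))) for _ in range(n)]

prior_samples = sample_with_prior(1000)
uniform_samples = sample_uniform(1000)

# Count how many samples land in the believed-good region [3e-4, 3e-3].
near = lambda xs: sum(1 for x in xs if 3e-4 <= x <= 3e-3)
print(near(prior_samples), near(uniform_samples))
```

Under the prior, far more of the evaluation budget lands in the region the practitioner already believes is promising, which is exactly the head start this kind of feature is meant to provide.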
All Model Types, Multimetric Optimization, Training & Tuning
SigOpt Webinar: Introducing Metric Management
SigOpt provides a “human-in-the-loop” process so that you can guide the platform’s discovery of a more effective set of parameters for your model, based on external business factors or domain-specific expertise. This process is called Metric Management, and is enabled by a variety of tools built into our API and web interface.
Advanced Optimization Techniques, Deep Learning, Training & Tuning
Tuning for Systematic Trading: Training, Tuning and Metric Strategy
In deep learning, it can be particularly tough to select the right metric and know when a model has converged during training. In this talk, we discuss ways to monitor convergence, automate early stopping, and set the right metric strategy for deep learning training and tuning jobs.
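A common way to automate early stopping is a patience rule: stop once the validation loss has failed to improve by some margin for several consecutive epochs. A minimal sketch, with illustrative loss values and hypothetical `patience`/`min_delta` settings:

```python
def early_stop(losses, patience=3, min_delta=1e-3):
    """Return the epoch index at which training stops: the first epoch at
    which the validation loss has failed to improve on the best value by
    min_delta for `patience` consecutive epochs (or the last epoch if it
    never triggers)."""
    best = float("inf")
    stale = 0
    for epoch, loss in enumerate(losses):
        if loss < best - min_delta:
            best = loss
            stale = 0
        else:
            stale += 1
            if stale >= patience:
                return epoch
    return len(losses) - 1

# Illustrative validation-loss curve: improvement stalls after epoch 3.
val_losses = [1.0, 0.7, 0.55, 0.54, 0.56, 0.55, 0.57, 0.50]
print(early_stop(val_losses))
```

Note the trade-off the talk alludes to: with this curve, stopping at the patience threshold forfeits the late improvement at the final epoch, which is why patience and the metric itself must be chosen together.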
Advanced Optimization Techniques, All Model Types, Multimetric Optimization
Tuning for Systematic Trading: Intuition behind Bayesian optimization with and without multiple metrics
To kick off our first session of Tuning for Systematic Trading, SigOpt ML Engineer Tobias Andreasen walks us through an introduction to Bayesian optimization and some advanced features that can help you optimize model performance.
Advanced Optimization Techniques, Data Augmentation, Deep Learning
SigOpt at MLconf: Optimized Image Classification on the Cheap
Abstract: In this talk, we anchor on building an image classifier trained on the Stanford Cars dataset to evaluate two approaches to transfer learning, fine-tuning and feature extraction, and the impact of hyperparameter optimization on these techniques.
Advanced Hyperparameter Optimization for Deep Learning with MLflow
Building on the “Best Practices for Hyperparameter Tuning with MLflow” talk, we will present advanced topics in HPO for deep learning, including early stopping, multi-metric optimization, and robust optimization.
Using Optimal Learning to Tune Machine Learning Models
Using Bayesian Optimization to Tune Machine Learning Models: In this talk, we briefly introduce Bayesian Global Optimization as an efficient way to optimize machine learning model parameters, especially when evaluating different parameters is time-consuming or expensive.
Bayesian Optimization methods used by SigOpt, coupled with the incredibly scalable deep learning architecture provided by the Nervana Cloud and neon, allow anyone to easily tune their models to quickly achieve higher accuracy.
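The efficiency of Bayesian optimization comes from an acquisition function that balances exploiting points the surrogate model predicts are good against exploring points it is uncertain about. A minimal sketch of one standard acquisition function, expected improvement, computed in closed form from a Gaussian posterior; this is a generic textbook formulation, not SigOpt's proprietary ensemble, and the numeric inputs are illustrative.

```python
import math

def norm_pdf(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)

def norm_cdf(z):
    """Standard normal cumulative distribution, via erf."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))

def expected_improvement(mu, sigma, best):
    """Expected improvement (maximization) of a candidate whose surrogate
    posterior is N(mu, sigma^2), relative to the best observed value."""
    if sigma == 0:
        return max(mu - best, 0.0)
    z = (mu - best) / sigma
    return (mu - best) * norm_cdf(z) + sigma * norm_pdf(z)

# A confident, slightly-better point (exploitation) vs. an uncertain,
# slightly-worse point (exploration) against a best-seen value of 0.80:
print(expected_improvement(0.82, 0.01, 0.80))
print(expected_improvement(0.75, 0.15, 0.80))
```

Running this shows that a highly uncertain point can have a larger expected improvement than a confidently, marginally better one, which is why Bayesian optimization needs far fewer evaluations than grid or random search when each evaluation is expensive.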