Experiments: Automate Intelligent Hyperparameter Optimization

Nick Payton and Tobias Andreasen
Deep Learning, Experiment Management, Hyperparameter Optimization, Machine Learning

SigOpt includes Runs, a hosted API to track a single training run, log modeling attributes, and organize your modeling workflow. It also includes a Dashboard to analyze your models and organize your projects as you execute these Runs. The third primary component of SigOpt is Experiments, which is a hosted API to automate intelligent, sample-efficient hyperparameter optimization. Here is the simple process: 

  1. Annotate your code with SigOpt Runs
  2. Create the experiment by defining the parameters or hyperparameters to optimize, the number of training runs, and parallel width for the job
  3. Call your annotated code in an optimization loop (`%%optimize` in a Jupyter notebook, `sigopt run` on the command line)
  4. Let SigOpt’s algorithm intelligently maximize or minimize the metric(s)
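The loop in steps 1–4 follows a suggest–observe pattern: ask the optimizer for a parameter configuration, train and score a model with it, and report the metric back. The sketch below illustrates that pattern with a hypothetical stand-in optimizer that uses plain random search, so it runs offline without a SigOpt account; the `StandInOptimizer` class and `evaluate_model` function are illustrative assumptions, not part of the SigOpt client.

```python
import random

class StandInOptimizer:
    """Hypothetical stand-in for a hosted optimization service.

    It mimics the suggest/observe cycle with uniform random search so the
    loop structure is runnable offline; the real service would propose
    points intelligently instead.
    """

    def __init__(self, parameters):
        self.parameters = parameters   # name -> (min, max) bounds
        self.observations = []         # (assignments, value) pairs

    def suggest(self):
        # Sample uniformly within the declared bounds.
        return {name: random.uniform(lo, hi)
                for name, (lo, hi) in self.parameters.items()}

    def observe(self, assignments, value):
        self.observations.append((assignments, value))

    def best(self):
        # Best observation seen so far (maximization).
        return max(self.observations, key=lambda obs: obs[1])

def evaluate_model(assignments):
    # Placeholder objective standing in for "train a model and score it";
    # it peaks at learning_rate=0.1 and batch_exp=6.
    lr, be = assignments["learning_rate"], assignments["batch_exp"]
    return -((lr - 0.1) ** 2) - ((be - 6) ** 2)

optimizer = StandInOptimizer({
    "learning_rate": (0.001, 1.0),
    "batch_exp": (4, 10),
})

# Steps 2-4: define a budget of training runs, then loop
# suggest -> train -> observe until the budget is exhausted.
for _ in range(30):
    assignments = optimizer.suggest()
    value = evaluate_model(assignments)
    optimizer.observe(assignments, value)

best_assignments, best_value = optimizer.best()
print(best_assignments, best_value)
```

Swapping the stand-in for the hosted service changes only where suggestions come from and where observations are reported; the shape of the loop stays the same.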

As this optimization loop proceeds, SigOpt intelligently applies an ensemble of Bayesian and other global optimization algorithms to quickly learn about your parameter space and maximize or minimize your objective metric(s). Our solution automatically populates the results of each suggestion in our Dashboard, as in this example experiment.

At the end of the job, SigOpt will have uncovered the best performing configuration of your model.

Learn how easy it is to use Runs and Experiments together by checking out this blog post: Take the Pain out of Training and Tuning.

If you have the infrastructure to train multiple models at the same time, you can parallelize SigOpt to return up to 100 suggestions at once. And if you have specific experimentation needs, address them with SigOpt’s advanced features, which make our platform the most robust solution available for hyperparameter optimization.
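Fanning the suggest–observe cycle out across workers can be sketched as follows. This is a minimal offline illustration, not the SigOpt client: the `suggest` and `train_and_score` functions are hypothetical stand-ins (uniform sampling and a toy objective), and a thread pool plays the role of your training infrastructure evaluating a batch of open suggestions concurrently.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def suggest(bounds):
    # Hypothetical stand-in: the hosted service would hand back a batch
    # of open suggestions; here we sample uniformly at random.
    return {name: random.uniform(lo, hi) for name, (lo, hi) in bounds.items()}

def train_and_score(assignments):
    # Placeholder training job; returns a score to maximize.
    lr = assignments["learning_rate"]
    return -(lr - 0.1) ** 2

bounds = {"learning_rate": (0.001, 1.0)}
batch = [suggest(bounds) for _ in range(8)]  # parallel width of 8

# Evaluate the whole batch concurrently, then pair each suggestion
# with its score so the results can be reported back as observations.
with ThreadPoolExecutor(max_workers=8) as pool:
    scores = list(pool.map(train_and_score, batch))

observations = list(zip(batch, scores))
best = max(observations, key=lambda obs: obs[1])
```

The parallel width (8 here) would match however many training jobs your infrastructure can run at once.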

If you want to test Experiments before applying it to your own modeling project, use this example notebook to execute a simple hyperparameter optimization job with SigOpt. 

If you want to try out the product, sign up for free access, execute a run to track your training, and launch an experiment to automate hyperparameter optimization. If you want to learn more about our products, track industry news, and hear from our research team, follow our blog or subscribe to our YouTube channel.

Nick Payton, Head of Marketing & Partnerships
Tobias Andreasen, Machine Learning Specialist

Want more content from SigOpt? Sign up now.