Metric Thresholds, a New Feature to Supercharge Multimetric Optimization
Today, we are excited to announce the general availability of Metric Thresholds, a new feature that supercharges the performance of Multimetric optimization. Metric Thresholds helps you discover better models that meet your problem-specific needs by allowing you to define “thresholds” on your metrics.
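As a rough sketch of what defining a threshold looks like with the SigOpt Python client (the parameter names, metric names, and threshold value below are illustrative, not taken from the announcement):

```python
from sigopt import Connection

conn = Connection(client_token="YOUR_SIGOPT_API_TOKEN")

# A multimetric experiment: maximize accuracy while constraining
# inference time to stay under a problem-specific 50 ms threshold.
experiment = conn.experiments().create(
    name="Multimetric with Metric Thresholds",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="depth", type="int", bounds=dict(min=2, max=12)),
    ],
    metrics=[
        dict(name="accuracy", objective="maximize"),
        dict(name="inference_time_ms", objective="minimize", threshold=50.0),
    ],
    observation_budget=60,
)
```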
SigOpt Partners with Two Sigma to Extend its Leadership in Model Experimentation and Optimization Solutions
SigOpt, Inc. (“SigOpt”), a leading provider of solutions that maximize the performance of machine learning, deep learning and simulation models, announced a strategic partnership with Two Sigma, a leading systematic investment manager.
Categories
Activity Recognition
Advanced Optimization Techniques
All Model Types
Applied
Applied AI Insights
Augmented ML Workflow
Company News
Convolutional Neural Networks
Data Augmentation
Deep Learning
Focus Area
Machine Learning
Materials Science
Methodology
Model Type
Modeling Best Practices
Multimetric Optimization
Preference Optimization
Reinforcement Learning
Research
SigOpt 101
Simulations & Backtests
Training & Tuning
Authors
Alexandra Johnson
Barrett Williams
Ben Hsu
Dan Anderson
Harvey Cheng
Ivy Zhou
Meghana Ravikumar
Michael McCourt
Nick Payton
Nicki Vance
Olivia Kim
Patrick Hayes
Ruben Martinez-Cantin
Sarth Frey
Scott Clark
Taylor Jackle Spriggs
Brady Neal
David Kozak
Eric Bai
Francisco Merino-Casallo
Gustavo Malkomes
Ian Dewancker
Javier García-Barcos
Jungtaek Kim
Katharina Eggensperger
Marcel Wursch
Marcus Liwicki
Michele Alberti
Mislav Balunovich
Paul Leu
Raul Astudillo
Rolf Ingold
Roman Garnett
Sathish Nagappan
Shali Jiang
Vinaychandran Pondenkandath
Advanced Optimization Techniques, Applied, Applied AI Insights, Convolutional Neural Networks, Research
University of Fribourg uses SigOpt to develop a spectral initialization strategy to increase classification accuracy on medical images by 2.2%
From June 9-15 this year, leaders in the machine learning community from around the world met at the Long Beach Convention Center for the 36th International Conference on Machine Learning.
We collaborate with the University of Pittsburgh on fabricating nanostructured glass with ultrahigh transmittance, ultralow haze, and superomniphobicity.
Highlight: Bayesian Optimization of Composite Functions
We are excited to release a new SigOpt feature into alpha testing: Training Monitor. This feature was designed to better empower our neural network developers.
Congratulations, Roman Garnett, on your NSF CAREER Award
Machine learning infrastructure tools can help bridge the gap between the modeler and the cluster by inserting an abstraction layer between the model builder and whatever systems are used to communicate with the cluster.
We are pleased to announce the addition of Projects, a feature that allows you and your team to organize experiments for easier access, viewing, and sharing.
Integration of in vitro and in silico Models Using Bayesian Optimization With an Application to Stochastic Modeling of Mesenchymal 3D Cell Migration
As part of our mission to make hyperparameter optimization accessible to modelers everywhere, we want to share our naming choices with other developers of HPO APIs.
In this latest Academic Interview, Brady Neal of Mila (the Quebec Artificial Intelligence Institute) discusses the paper “A Modern Take on the Bias-Variance Tradeoff in Neural Networks” and its implications for deep learning practitioners.
The NeurIPS conference is being held in Montreal this year, where leaders in the community will present innovative research across tutorials, the main conference proceedings, and workshops.
Highlight: Bayesian Optimization of High Transparency, Low Haze, and High Oil Contact Angle Rigid and Flexible Optoelectronic Substrates
Our research team at SigOpt has been very fortunate to be able to collaborate with outstanding researchers around the world, including through our internship program.
Highlight: A Nonstationary Designer Space-Time Kernel
This post introduces the article A Nonstationary Designer Space-Time Kernel by Michael McCourt, Gregory Fasshauer, and David Kozak, appearing at the upcoming NeurIPS 2018 spatiotemporal modeling workshop.
Highlight: Automating Bayesian Optimization with Bayesian Optimization
This post introduces the article Automating Bayesian Optimization with Bayesian Optimization by Gustavo Malkomes and Roman Garnett, appearing in the upcoming NeurIPS 2018 proceedings.
Highlight: Efficient Nonmyopic Batch Active Search
This post introduces the article Efficient Nonmyopic Batch Active Search by Shali Jiang, Gustavo Malkomes, Matthew Abbott, Benjamin Moseley and Roman Garnett, appearing in the upcoming NeurIPS 2018 proceedings.
In this post, we interview the DeepDIVA team from the University of Fribourg to learn more about their framework and what it means for the broader community.
Advanced Optimization Techniques, All Model Types
Multimetric Updates in the Experiment Insights Dashboard
Competitions like the one held at the International Conference on Machine Learning help explore the complexities of developing effective ML pipelines under severe time restrictions and without expert intuition regarding the desired or expected output.
The research team at SigOpt works to provide the best Bayesian optimization platform for our customers. In our spare time, we also engage in research projects in a variety of other fields.
SigOpt 101, Simulations & Backtests
SigOpt Enters Strategic Investment Agreement with In-Q-Tel
This is the first edition of our new quarterly newsletter. In these updates, we will discuss newly released features, showcase content we have produced or published and work that has cited us, and share interesting machine learning research that our research team has found. We hope you find these valuable and informative!
Today we announce the availability of SigOpt on AWS Marketplace. Now, with a single click, data scientists and researchers can access SigOpt’s powerful optimization-as-a-service platform designed to automatically fine-tune their machine learning (ML) and artificial intelligence (AI) workloads.
We have revisited one of the traditional acquisition functions of Bayesian optimization, the upper confidence bound, and used a slightly different interpretation of it to better inform the optimization. In our article, we present a number of examples of this clustering-guided UCB method applied to various optimization problems.
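For context, the standard upper confidence bound scores a candidate point as its posterior mean plus a multiple of its posterior standard deviation; here is a minimal sketch of that textbook acquisition function (not the clustering-guided variant the article develops):

```python
import numpy as np

def upper_confidence_bound(mu, sigma, kappa=2.0):
    # Larger kappa favors exploration (uncertain regions); smaller kappa
    # favors exploitation (regions already believed to be good).
    return mu + kappa * sigma

# Choose the next point to evaluate from a candidate set, given the
# Gaussian process posterior mean and stddev at each candidate.
mu = np.array([0.2, 0.5, 0.4])
sigma = np.array([0.30, 0.05, 0.20])
next_index = int(np.argmax(upper_confidence_bound(mu, sigma)))
```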
All Model Types, Machine Learning, Training & Tuning
In this blog post, we are going to show solutions to some of the most common problems we’ve seen people run into when implementing hyperparameter optimization.
In this post, we will show you how Bayesian optimization was able to dramatically improve the performance of a reinforcement learning algorithm in an AI challenge.
Advanced Optimization Techniques, All Model Types
Building a Better Mousetrap via Multicriteria Bayesian Optimization
In this post, we want to analyze a more complex situation in which the parameters of a given model produce a random output, and our multicriteria problem involves maximizing the mean while minimizing the variance of that random variable.
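One simple way to hand this to a multicriteria optimizer is to evaluate the stochastic model several times per parameter setting and report the sample mean and sample variance as two separate metrics; a sketch, where run_model is a hypothetical stand-in for any stochastic model evaluation:

```python
import numpy as np

def evaluate(parameters, n_repeats=10):
    # run_model is a hypothetical stochastic evaluation of the model.
    outputs = np.array([run_model(parameters) for _ in range(n_repeats)])
    # Two metrics for the multicriteria problem:
    # maximize the mean, minimize the variance.
    return {"mean": outputs.mean(), "variance": outputs.var(ddof=1)}
```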
All Model Types, Augmented ML Workflow
Solving Common Issues in Distributed Hyperparameter Optimization
SigOpt’s API for hyperparameter optimization leaves us well-positioned to build exciting features for anyone who wants to perform Bayesian hyperparameter optimization in parallel.
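The heart of the parallel workflow is that each worker independently requests a suggestion, evaluates it, and reports an observation, while the server coordinates which suggestions are outstanding. A minimal sketch of one worker's loop (train_and_evaluate and the IDs are placeholders):

```python
from sigopt import Connection

conn = Connection(client_token="YOUR_SIGOPT_API_TOKEN")
experiment_id = "YOUR_EXPERIMENT_ID"

# Each worker runs this same loop against the shared experiment.
for _ in range(20):  # this worker's share of the observation budget
    suggestion = conn.experiments(experiment_id).suggestions().create()
    value = train_and_evaluate(suggestion.assignments)  # hypothetical objective
    conn.experiments(experiment_id).observations().create(
        suggestion=suggestion.id,
        value=value,
    )
```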
In this post we will discuss the topic of multicriteria optimization, when you need to optimize a model for more than a single metric, and how to use SigOpt to solve these problems.
Machine Learning, Training & Tuning
Bayesian Optimization for Collaborative Filtering with MLlib
In this post we will show how to tune an MLlib collaborative filtering pipeline using Bayesian optimization via SigOpt. Code examples from this post can be found in our GitHub repo.
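As a sketch of the kind of objective such a tuning loop evaluates (the column names and hyperparameters here are illustrative; see the repo for the post's actual code):

```python
from pyspark.ml.evaluation import RegressionEvaluator
from pyspark.ml.recommendation import ALS

def rating_rmse(train_df, test_df, assignments):
    # Fit an ALS collaborative filtering model with the suggested
    # hyperparameters and score it by RMSE on held-out ratings.
    als = ALS(
        rank=int(assignments["rank"]),
        regParam=assignments["reg_param"],
        userCol="userId",
        itemCol="itemId",
        ratingCol="rating",
        coldStartStrategy="drop",
    )
    predictions = als.fit(train_df).transform(test_df)
    evaluator = RegressionEvaluator(
        metricName="rmse", labelCol="rating", predictionCol="prediction"
    )
    return evaluator.evaluate(predictions)
```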
Today is a big day at SigOpt. Since the seed round we secured last year, we’ve continued to build toward our mission to ‘optimize everything,’ and are now helping dozens of companies amplify their research and development and drive business results with our cloud Bayesian optimization platform.
In this post, we review the history of some of the tools implemented within SigOpt, and then we discuss the original solution to this black box optimization problem, known as full factorial experiments or grid search. Finally, we compare both naive and intelligent grid-based strategies to the latest advances in Bayesian optimization, delivered via the SigOpt platform.
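For readers unfamiliar with the baseline, a full factorial experiment simply evaluates every combination of a fixed grid of values; a minimal sketch with a hypothetical evaluate objective:

```python
import itertools

grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "depth": [2, 4, 8, 16],
}

# Full factorial / grid search: score all 3 * 4 = 12 combinations and keep
# the best. The cost grows exponentially with the number of parameters,
# which is what motivates the Bayesian alternatives discussed in the post.
candidates = [
    dict(zip(grid, values)) for values in itertools.product(*grid.values())
]
best = max(candidates, key=evaluate)  # evaluate is a hypothetical objective
```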
Advanced Optimization Techniques, Deep Learning
Deep Neural Network Optimization with SigOpt and Nervana Cloud
In this post, we will detail how to use SigOpt with the Nervana Cloud and show how SigOpt and Nervana are able to reproduce, and beat, the state-of-the-art performance reported in two papers.
Applied AI Insights, Simulations & Backtests
Winning on Wall Street: Tuning Trading Models with Bayesian Optimization
SigOpt helps customers build better machine learning and financial models by giving users a path to efficiently maximize the key metrics that define their success. In this post we demonstrate the relevance of model tuning on a basic prediction strategy for investing in bond futures.
This blog post provides solutions for comparing different optimization strategies for any optimization problem, using hyperparameter tuning as the motivating example.
In this post on integrating SigOpt with machine learning frameworks, we will show you how to use SigOpt and TensorFlow to efficiently search for an optimal configuration of a convolutional neural network (CNN).
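The original post predates Keras shipping inside TensorFlow, but the shape of the search is the same; a sketch in today's API, with hypothetical hyperparameter names standing in for one SigOpt suggestion:

```python
import tensorflow as tf

def build_cnn(assignments, input_shape=(28, 28, 1), n_classes=10):
    # Build a small CNN whose architecture and optimizer are controlled
    # by the suggested hyperparameters.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(
            filters=int(assignments["filters"]),
            kernel_size=int(assignments["kernel_size"]),
            activation="relu",
            input_shape=input_shape,
        ),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(int(assignments["dense_units"]), activation="relu"),
        tf.keras.layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(assignments["learning_rate"]),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model
```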
Advanced Optimization Techniques, Deep Learning
Unsupervised Learning with Even Less Supervision Using Bayesian Optimization
In this post on integrating SigOpt with machine learning frameworks, we will show you how to use SigOpt and XGBoost to efficiently optimize an unsupervised learning algorithm’s hyperparameters to increase performance on a classification task.
SigOpt allows experts to build the next great model and apply their domain expertise instead of searching in the dark for the best experiment to run next. With SigOpt you can conquer this tedious, but necessary, element of development and unleash your experts on designing better products with less trial and error.
In this first post on integrating SigOpt with machine learning frameworks, we’ll show you how to use SigOpt and scikit-learn to train and tune a model for text sentiment classification in under 50 lines of Python.
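A sketch in the same spirit (the post's exact pipeline and parameter names may differ): the objective evaluated for each SigOpt suggestion is the cross-validated accuracy of a bag-of-words sentiment classifier.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def sentiment_accuracy(assignments, texts, labels):
    # One evaluation of a SigOpt suggestion: build the pipeline with the
    # suggested hyperparameters and return cross-validated accuracy.
    pipeline = make_pipeline(
        TfidfVectorizer(ngram_range=(1, int(assignments["max_ngram"]))),
        LogisticRegression(C=10 ** assignments["log10_C"], max_iter=1000),
    )
    return cross_val_score(pipeline, texts, labels, cv=3).mean()
```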
SigOpt gives customers the ability to attach uncertainty to their observations; using this knowledge, we can balance observed results against their variance to make predictions and identify the true behavior behind the noise.
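Concretely, the API accepts a standard deviation alongside each reported value; a minimal sketch with the SigOpt Python client (the IDs and numbers are placeholders):

```python
from sigopt import Connection

conn = Connection(client_token="YOUR_SIGOPT_API_TOKEN")

# Report both the measured value and its estimated noise so the optimizer
# can weigh noisy observations appropriately.
conn.experiments("YOUR_EXPERIMENT_ID").observations().create(
    suggestion="SUGGESTION_ID",
    value=0.87,          # e.g. the mean metric over repeated runs
    value_stddev=0.03,   # estimated standard deviation of that measurement
)
```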
Gaussian processes are powerful because they allow you to exploit previous observations about a system to make informed, and provably optimal, predictions about unobserved behavior. They do this by defining an expected relationship between all possible situations; this relationship is called the covariance and is the topic of this post.
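The most common choice is the squared-exponential (RBF) covariance, under which the expected similarity of two outputs decays with the squared distance between their inputs; a minimal sketch:

```python
import numpy as np

def rbf_covariance(x1, x2, length_scale=1.0, variance=1.0):
    # k(x, x') = variance * exp(-(x - x')^2 / (2 * length_scale^2)):
    # nearby inputs are modeled as strongly correlated, distant inputs
    # as nearly independent.
    sq_dists = np.subtract.outer(x1, x2) ** 2
    return variance * np.exp(-0.5 * sq_dists / length_scale**2)

# Covariance matrix among five observed locations on the real line.
x = np.linspace(0.0, 2.0, 5)
K = rbf_covariance(x, x)
```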
Rescale gives users the dynamic computational resources to run their simulations and SigOpt provides tools to optimize them. By using Rescale’s platform and SigOpt’s tuning, efficient cloud simulation is easier than ever.
At SigOpt, we can help you raise this metric automatically and optimally whether you are optimizing an A/B test, machine learning system, or physical experiment.