Want to learn more about the technologies behind SigOpt? We’ve got you covered.
Get started with the basics behind SigOpt’s methods.
An introductory document discussing the key concepts behind Bayesian optimization.
A primer on Gaussian processes and how they power SigOpt’s optimization.
Learn about covariance kernels and how they apply to Gaussian processes (a worked sketch of both follows below).
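To make these concepts concrete, here is a minimal, self-contained sketch (not SigOpt's implementation) of Gaussian process regression with a squared-exponential covariance kernel; the training points, lengthscale, and noise level are all illustrative choices.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential (RBF) covariance: k(x, x') = exp(-(x - x')^2 / (2 l^2))
    sq_dists = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * sq_dists / lengthscale**2)

# Toy 1-D training data (illustrative only)
X = np.array([-2.0, -1.0, 0.0, 1.5])
y = np.sin(X)
X_star = np.linspace(-3, 3, 7)  # points where we want predictions

noise = 1e-6  # small jitter for numerical stability
K = rbf_kernel(X, X) + noise * np.eye(len(X))
K_star = rbf_kernel(X_star, X)

# GP posterior mean and variance at the test points
alpha = np.linalg.solve(K, y)
mean = K_star @ alpha
cov = rbf_kernel(X_star, X_star) - K_star @ np.linalg.solve(K, K_star.T)
std = np.sqrt(np.clip(np.diag(cov), 0, None))

for x, m, s in zip(X_star, mean, std):
    print(f"x={x:+.2f}  mean={m:+.3f}  std={s:.3f}")
```

Bayesian optimization builds directly on this posterior: the predicted mean and uncertainty feed an acquisition function (such as expected improvement) that chooses the next point to evaluate.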
Discussions of applying SigOpt's methods to machine learning and hyperparameter optimization.
Using SigOpt and reinforcement learning to play games with OpenAI Gym.
A short example using SigOpt and scikit-learn to build and tune a text sentiment classifier; the suggest/observe loop it relies on is sketched below.
An example using SigOpt and xgboost, together with unsupervised feature learning, to build an optical character recognition model.
An example using SigOpt and TensorFlow to build a CNN for optical character recognition.
Optimizing deep neural nets with SigOpt and Nervana Cloud.
An example using SigOpt and MLlib to optimize parameters of the alternating least squares algorithm.
A comparison of different hyperparameter optimization methods.
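The examples above all share the same suggest/observe pattern. The sketch below shows that loop for a small scikit-learn text classifier, assuming the classic SigOpt Python client (sigopt.Connection); the parameter name, bounds, dataset, trial budget, and API token are illustrative placeholders rather than the actual example's settings.

```python
# A hedged sketch of the SigOpt suggest/observe loop, assuming the classic
# Python client. Dataset and parameter choices are placeholders.
from sigopt import Connection
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

data = fetch_20newsgroups(subset="train", categories=["sci.med", "sci.space"])
X = TfidfVectorizer().fit_transform(data.data)

conn = Connection(client_token="YOUR_SIGOPT_API_TOKEN")  # placeholder token
experiment = conn.experiments().create(
    name="Text classifier tuning (sketch)",
    parameters=[dict(name="log10_C", type="double",
                     bounds=dict(min=-4.0, max=2.0))],
)

for _ in range(20):
    # Ask SigOpt for the next configuration to try
    suggestion = conn.experiments(experiment.id).suggestions().create()
    C = 10 ** suggestion.assignments["log10_C"]
    model = LogisticRegression(C=C, max_iter=1000)
    accuracy = cross_val_score(model, X, data.target, cv=3).mean()
    # Report the result so the next suggestion can improve on it
    conn.experiments(experiment.id).observations().create(
        suggestion=suggestion.id, value=accuracy,
    )
```

The same loop applies unchanged whether the model inside it is xgboost, a TensorFlow CNN, or an MLlib ALS recommender; only the evaluation step differs.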
Discussions of problem types where SigOpt can be used.
A short example using SigOpt to tune a system with multiple independent objectives (see the Pareto sketch below).
Applications of Bayesian optimization.
A system for more efficient multicriteria optimization.
An example of deploying and scaling parameter optimization methods.
Using SigOpt to tune a financial trading model.
Using SigOpt to tune a model for predicting basketball scores.
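For problems with multiple independent objectives, there is generally no single best configuration; one instead seeks the Pareto frontier of trade-offs. Below is a minimal, library-free sketch of that idea; the objective values are made up for illustration.

```python
# A minimal sketch of Pareto efficiency: keep the configurations that no
# other configuration beats on every objective (both maximized here).
def pareto_frontier(points):
    """Return the points not dominated by any other point.

    points: list of (objective_1, objective_2) tuples, both maximized.
    """
    frontier = []
    for p in points:
        dominated = any(
            q[0] >= p[0] and q[1] >= p[1] and q != p for q in points
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Illustrative trade-off between, say, accuracy and throughput
observations = [(0.90, 120), (0.92, 80), (0.88, 200), (0.92, 60), (0.85, 150)]
print(pareto_frontier(observations))
# -> [(0.90, 120), (0.92, 80), (0.88, 200)]
```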
Comparisons to Other Methods
An overview of how we rigorously compare SigOpt to other optimization methods.
An article outlining a strategy for aggregating performance comparisons across Bayesian optimization methods.
An article presenting an overview of the evaluation framework we use to benchmark our optimization engine.
A more abstract framework for comparing optimization methods; a toy version of such a comparison is sketched below.
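As a flavor of what such comparisons involve, here is a toy harness that runs two simple baselines many times on the Branin test function and aggregates the best values seen; the methods, budget, and aggregation rule are placeholders, not our actual evaluation framework.

```python
# A toy benchmarking harness: run two optimizers repeatedly on a known test
# function and compare aggregated best-seen values. All settings are
# illustrative placeholders.
import math
import random

def branin(x, y):
    # Standard Branin test function (minimized; global optimum ~0.3979)
    a, b, c = 1.0, 5.1 / (4 * math.pi**2), 5 / math.pi
    r, s, t = 6.0, 10.0, 1 / (8 * math.pi)
    return a * (y - b * x**2 + c * x - r) ** 2 + s * (1 - t) * math.cos(x) + s

def random_search(budget, rng):
    best = float("inf")
    for _ in range(budget):
        best = min(best, branin(rng.uniform(-5, 10), rng.uniform(0, 15)))
    return best

def local_search(budget, rng, step=1.0):
    # A crude hill-climbing baseline: perturb the best point found so far
    x, y = rng.uniform(-5, 10), rng.uniform(0, 15)
    best = branin(x, y)
    for _ in range(budget - 1):
        nx = min(max(x + rng.gauss(0, step), -5), 10)
        ny = min(max(y + rng.gauss(0, step), 0), 15)
        v = branin(nx, ny)
        if v < best:
            best, x, y = v, nx, ny
    return best

rng = random.Random(0)
trials, budget = 50, 100
for name, method in [("random", random_search), ("local", local_search)]:
    results = [method(budget, rng) for _ in range(trials)]
    print(f"{name}: median best = {sorted(results)[trials // 2]:.4f}")
```

Aggregating over many independent trials, rather than comparing single runs, is what makes this style of comparison meaningful for stochastic optimizers.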