The Conference on Neural Information Processing Systems (NeurIPS) was held this December in Vancouver, Canada. SigOpt was excited to return, this time as a Platinum sponsor, to learn about the latest research and continue supporting the community. We value every chance we get to connect with users and researchers across industries. Here are some highlights from this year’s event:

SigOpt gave a demonstration at the industry Expo on Intuition Behind Advanced Algorithmic Features: 

There were many fantastic papers at NeurIPS and we really enjoyed learning about them. In case you missed it, here are a few that we found interesting: 

1). Cost Effective Active Search
“I am particularly excited about a recent paper from my collaborators at Washington University in St. Louis and Carnegie Mellon (CMU): Shali Jiang, Roman Garnett, and Benjamin Moseley. To understand their contribution, we need to recall that:

    1. Active learning is a general framework for sequential data acquisition, i.e., intelligent sampling from a large unlabeled pool of data.
    2. Active search is an instance of active learning where the observations can be either positive or negative, and the goal is to find as many positive observations as possible under a given budget.
    3. Previous efforts have focused on the budgeted setting, where the user must specify the total number of allowed observations (or decisions) beforehand.

In this paper, the authors seek to answer a slightly different question: how can we perform intelligent sampling to quickly find a predefined number of positive observations?

Although these two problems resemble one another, they call for completely different algorithmic solutions, and that’s what this paper is about! The authors propose a novel approach that reduces the cost (total number of observations) by 56–73% compared to a greedy approach for solving this problem, along with a new theoretical investigation of the challenges of solving this problem optimally.”

—Gustavo Malkomes, SigOpt Research Engineer
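To make the setting concrete, here is a minimal sketch of the greedy baseline the paper improves on: score every point in an unlabeled pool with a model’s estimated probability of being positive, query the most promising points first, and count how many observations it takes to reach a target number of positives. The pool, labels, and scoring model below are all hypothetical; this is not the authors’ method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unlabeled pool: each point has a hidden true label and a
# noisy model-estimated probability of being positive.
true_labels = rng.random(500) < 0.1                      # ~10% positives
est_probs = true_labels * 0.5 + rng.random(500) * 0.5    # positives score higher

def greedy_cost_to_target(probs, labels, target):
    """Query points in order of estimated probability until `target`
    positives are found; return the number of observations spent."""
    order = np.argsort(-probs)        # most promising first
    found = cost = 0
    for i in order:
        cost += 1
        found += int(labels[i])
        if found >= target:
            return cost
    return cost                       # pool exhausted before hitting target

cost = greedy_cost_to_target(est_probs, true_labels, target=10)
print(f"greedy baseline found 10 positives after {cost} queries")
```

The paper’s contribution is a nonmyopic policy that reaches the target with substantially fewer queries than this one-step greedy rule.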

2). Fast and Accurate Least-Mean-Squares Solvers 
“It is both surprising and refreshing to see a paper with ‘Least-Mean-Squares Solvers’ in its title at NeurIPS.

    1. The main technique presented in this paper leverages the Carathéodory theorem to compress the input data (more precisely, the covariance matrix). The authors provide a new algorithm that improves the asymptotic performance of computing Carathéodory sets.
    2. The paper showed impressive empirical results, reducing the run time of LMS solvers by almost two orders of magnitude on large datasets.

I especially enjoyed that the authors combine old mathematical concepts (the Carathéodory theorem dates back to 1907) with modern ideas (matrix sketching) in an innovative way to improve performance on a well-known problem.”

—Harvey Cheng, SigOpt Research Engineer
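The compression rests on Carathéodory’s theorem: any weighted mean of n points in R^d can be reproduced by at most d + 1 of those points with new nonnegative weights. Below is a minimal sketch of the classical reduction, which preserves the weighted mean at every step; the paper’s contribution is a faster algorithm for computing such sets, which this sketch does not attempt.

```python
import numpy as np

def caratheodory(P, u, tol=1e-10):
    """Reduce weighted points P (n x d) with convex weights u to at most
    d + 1 points whose weighted mean is unchanged (Caratheodory's theorem)."""
    u = np.array(u, dtype=float)
    idx = np.arange(len(u))
    d = P.shape[1]
    while len(idx) > d + 1:
        # Find v != 0 with sum_j v_j (p_j - p_0) = 0; one exists because
        # more than d + 1 points are affinely dependent.
        A = (P[idx[1:]] - P[idx[0]]).T
        v = np.linalg.svd(A)[2][-1]               # null-space direction
        w = np.concatenate(([-v.sum()], v))       # sum(w) = 0, sum(w_j p_j) = 0
        pos = w > tol
        alpha = np.min(u[idx][pos] / w[pos])      # largest step keeping u >= 0
        u[idx] = u[idx] - alpha * w               # mean and total weight unchanged
        idx = idx[u[idx] > tol]                   # drop the zeroed-out point
    return idx, u[idx]

rng = np.random.default_rng(1)
P = rng.normal(size=(50, 2))
u = np.full(50, 1 / 50)                           # uniform convex weights
idx, w = caratheodory(P, u)
print(len(idx), np.allclose(P[idx].T @ w, P.T @ u))
```

Because each update adds a vector with zero total weight and zero weighted sum, both invariants survive every iteration, which is why the final 3 points reproduce the original mean of all 50.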

3). Certifying Geometric Robustness of Neural Networks
“As neural networks become more prevalent in safety-critical systems, guaranteeing their performance, and their robustness to attacks, becomes a necessity. This work from ETH Zurich researcher (and SigOpt research intern alumnus) Mislav Balunovic and his team presents a strategy to certify a neural network as robust to a variety of input transformations (e.g., rotations, translations).

The authors present a new strategy for constructing a convex relaxation of the set of images that can be generated through such transformations; a proposed neural network is then tested against this relaxation and certified as robust if it makes the correct prediction for all images in the relaxation.

By developing a strategy for finding the asymptotically optimal convex relaxation (the smallest one possible), they are able to verify images more confidently and certify the geometric robustness of a given neural network significantly more accurately than the state of the art.”

—Michael McCourt, SigOpt Research Engineer
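The paper’s relaxations are tailored to geometric transformations, but the certification idea can be illustrated with a much simpler convex relaxation: interval bound propagation. In this sketch (the tiny network and input box are hypothetical, not from the paper), a prediction is certified if the target logit’s lower bound exceeds every other logit’s upper bound over the whole input region.

```python
import numpy as np

def interval_forward(lo, hi, layers):
    """Propagate an axis-aligned input box through affine + ReLU layers,
    returning elementwise bounds on the output logits."""
    for i, (W, b) in enumerate(layers):
        Wp, Wn = np.maximum(W, 0), np.minimum(W, 0)
        lo, hi = Wp @ lo + Wn @ hi + b, Wp @ hi + Wn @ lo + b
        if i < len(layers) - 1:                   # ReLU on hidden layers only
            lo, hi = np.maximum(lo, 0), np.maximum(hi, 0)
    return lo, hi

def certified(lo, hi, layers, target):
    """Robust if the target logit's lower bound beats every other
    logit's upper bound for all inputs in the box."""
    out_lo, out_hi = interval_forward(lo, hi, layers)
    return out_lo[target] > np.delete(out_hi, target).max()

# Hypothetical 2-2-2 network and perturbation region.
layers = [(np.array([[1.0, -1.0], [0.5, 0.5]]), np.zeros(2)),
          (np.array([[2.0, 0.0], [0.0, 1.0]]), np.array([0.0, -1.0]))]
x, eps = np.array([1.0, 0.2]), 0.05
print(certified(x - eps, x + eps, layers, target=0))   # True
```

Intervals are the loosest common relaxation; the paper’s contribution is computing much tighter (asymptotically optimal) relaxations of the transformed-image set, which certifies far more inputs.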

SigOpt Research Engineer Harvey Cheng at a poster session presenting one of SigOpt’s workshop papers:

We were fortunate to have four workshop papers accepted:

1). Applying Bayesian Optimization to Understand Tradeoffs for Antireflective Optical Designs
Sajad Haghanifar, Michael McCourt, Bolong Cheng, Paul Leu
About the paper: Continuing from earlier articles presented at last year’s NeurIPS workshop on molecules and materials and this year’s ICML workshop on climate change, this article discusses the use of sample-efficient multi-objective optimization to expose the tradeoffs present in additive manufacturing of antireflective glass. Our strategy, derived from active search and Bayesian optimization, compares favorably to a standard genetic algorithm. Furthermore, the Pareto-efficient frontier uncovered by our algorithm reveals certain ideal properties associated with low-reflectivity surfaces which help inform fabrication practices.
Workshop: Second Workshop on Machine Learning and the Physical Sciences
Blog: Reducing Reflection in Additive Manufacturing of Optoelectronics
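The tradeoff analysis rests on Pareto efficiency: a design is on the frontier if no other design is at least as good on every objective and strictly better on one. A minimal sketch, with hypothetical objective values standing in for measured quantities such as reflectivity and fabrication time (both minimized):

```python
import numpy as np

def pareto_frontier(Y):
    """Indices of non-dominated rows of Y, where every objective is minimized."""
    Y = np.asarray(Y, dtype=float)
    frontier = []
    for i, y in enumerate(Y):
        # y is dominated if some row is <= in all objectives and < in at least one.
        dominated = np.any(np.all(Y <= y, axis=1) & np.any(Y < y, axis=1))
        if not dominated:
            frontier.append(i)
    return frontier

# Hypothetical designs scored on (reflectivity, fabrication time).
designs = [[0.02, 5.0], [0.05, 2.0], [0.03, 4.0], [0.06, 3.0], [0.02, 6.0]]
print(pareto_frontier(designs))   # [0, 1, 2]
```

Designs 3 and 4 are dominated (something else is better on both objectives), so the frontier exposes only the genuine tradeoffs between the remaining designs.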

2). Automated Model Search Using Bayesian Optimization and Genetic Programming 
Louis Schlessinger, Gustavo Malkomes, Roman Garnett
About the paper: This paper proposes an algorithm that uses Bayesian optimization and genetic programming to traverse the infinite search space of probabilistic models. Results suggest that the surrogate-assisted evolutionary kernel construction algorithm can discover underlying structure in different latent functions.
Workshop: Workshop on Meta-Learning

3). Accelerating Psychometric Screening Tests With Bayesian Active Differential Selection 
Trevor Larsen, Gustavo Malkomes, Dennis Barbour
About the paper: To rapidly screen for a change in a patient’s estimated psychometric function, we use Bayesian active model selection to perform an automated pure-tone audiogram test that quickly determines whether the current audiogram differs from the previous one. Results show that, with only a few tones, we can detect with high confidence whether the patient’s audiometric function has changed between the two test sessions.
Workshop: ML4H: Machine Learning for Health

4). Parallel Robust Bayesian Optimization with Off-Policy Evaluations 
Javier Garcia-Barcos, Ruben Martinez-Cantin
About the paper: We combine unscented Bayesian optimization, an existing method for handling input noise, with stochastic policies for data acquisition within BO. Combining the two methods yields improved robustness and efficiency, producing the first parallel and robust Bayesian optimization method. These methods are evaluated on benchmark functions previously used in the literature.
Workshop: Safety and Robustness in Decision Making
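At the heart of unscented Bayesian optimization is the unscented transform, which approximates the expectation of a function under Gaussian input noise using 2d + 1 deterministically chosen sigma points. A minimal sketch of that transform alone (the test function and noise covariance are illustrative, not from the paper):

```python
import numpy as np

def unscented_expectation(f, x, cov, kappa=1.0):
    """Approximate E[f(x + eps)], eps ~ N(0, cov), using 2d + 1 sigma points."""
    d = len(x)
    L = np.linalg.cholesky((d + kappa) * cov)   # columns are sigma-point offsets
    points = [x] + [x + L[:, i] for i in range(d)] + [x - L[:, i] for i in range(d)]
    weights = [kappa / (d + kappa)] + [1.0 / (2 * (d + kappa))] * (2 * d)
    return sum(w * f(p) for w, p in zip(weights, points))

# Sanity check on a quadratic, where the transform is exact:
# E[||x + eps||^2] = ||x||^2 + trace(cov) = 5 + 0.2 = 5.2.
x, cov = np.array([1.0, 2.0]), 0.1 * np.eye(2)
print(unscented_expectation(lambda z: z @ z, x, cov))   # ~5.2
```

Evaluating the surrogate through this transform makes the acquisition favor points whose neighborhoods are good on average, which is what gives the method its robustness to input noise.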

This was our most comprehensive participation and attendance at NeurIPS thus far. Whether you came by the demo during Expo Day, our booth in the exhibit hall, or the workshop sessions, thank you for taking the time to meet us! If you have any questions about SigOpt or would like to connect with us, please don’t hesitate to reach out. We hope to see you at our next event!

Tiffany Huynh, Field Marketing
Michael McCourt, Research Engineer
Harvey Cheng, Research Engineer
Barrett Williams, Product Marketing Lead