The Next Battleground for Deep Learning Performance
The frameworks are in place and the hardware infrastructure is robust, but what now holds machine learning performance back has far less to do with system-level capabilities and far more to do with intensive model optimization.
On the Radar: SigOpt for machine learning algorithm optimization
The potential of machine learning systems, such as those based on deep learning, depends on organizations having the skill to develop their models and fine-tune the multitude of model configuration parameters (known as hyperparameters).
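To make the idea concrete, here is a minimal sketch of hyperparameter tuning via random search. The objective function, parameter names, and search ranges are all illustrative stand-ins (a real workflow would train and score an actual model, and a tool like SigOpt would propose the next configuration more intelligently than random sampling):

```python
import random

# Hypothetical objective: pretend validation loss as a function of two
# hyperparameters. In practice this would train and evaluate a model.
def validation_loss(learning_rate, hidden_units):
    return (learning_rate - 0.01) ** 2 + (hidden_units - 128) ** 2 / 1e6

def random_search(trials=50, seed=0):
    """Sample hyperparameter configurations and keep the best one."""
    rng = random.Random(seed)
    best_loss, best_params = float("inf"), None
    for _ in range(trials):
        params = {
            "learning_rate": 10 ** rng.uniform(-4, -1),  # log-uniform scale
            "hidden_units": rng.randrange(16, 513),
        }
        loss = validation_loss(**params)
        if loss < best_loss:
            best_loss, best_params = loss, params
    return best_loss, best_params

best_loss, best_params = random_search()
```

Even this naive loop illustrates the point: the model code stays fixed while the configuration space is searched, and the quality of that search is where much of the remaining performance lives.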
On Aug 10, @fjhickernell, leader of the SigOpt sponsored QMCPy project, gives a digital tutorial at the #MCQMC 2020 meeting. Register now, and learn more about why Quasi-Monte Carlo is an important tool for efficient computations: https://t.co/iGHm3y17ig
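For readers unfamiliar with why Quasi-Monte Carlo helps, here is a toy sketch (not from QMCPy; the integrand and function names are illustrative). It estimates an integral by averaging a function over a base-2 Halton (van der Corput) low-discrepancy sequence, whose points fill the unit interval far more evenly than pseudo-random draws:

```python
def halton(index, base):
    # Radical inverse of `index` in the given base: the building block
    # of a Halton low-discrepancy sequence.
    result, fraction = 0.0, 1.0
    while index > 0:
        fraction /= base
        result += fraction * (index % base)
        index //= base
    return result

def qmc_estimate(f, n=1024):
    # Average f over the first n base-2 Halton points to approximate
    # the integral of f on [0, 1].
    return sum(f(halton(i, 2)) for i in range(1, n + 1)) / n

estimate = qmc_estimate(lambda x: x * x)  # true integral is 1/3
```

With 1024 evenly spread points the estimate lands within a fraction of a percent of 1/3, whereas plain Monte Carlo error shrinks only like 1/sqrt(n); that faster convergence is the efficiency the tutorial refers to.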
The modern #MachineLearning stack is anything but simple. As models have gotten more complex, so have processes that go into creating them. Is your team overlooking any of these core components? https://t.co/1133ATedGk
Kicking off in one hour at 9am PT! The @usenix #opml20 Session 7 on Model Training. Catch SigOpt's Tobias Andreasen with other great speakers from @GoogleAI and @Intuit. Don't forget to join the Slack channel for live discussion: https://t.co/6vfFQvfCrU