AI’s carbon footprint will be an issue for enterprises

George Lawton, TechTarget

Parameter tuning and AI’s carbon footprint

One of the biggest challenges in improving the performance of AI models lies in tuning the parameters, or weights, of the artificial neurons in a neural network. And this aspect of training is not limited to AI — the same type of tuning is required for many business problems, including optimizing business models and simulations, operations research, logistics and programming by example.

Programming by example, or PBE, is a technique for training an AI through examples, such as providing input/output pairs that show how data should be structured. All these problems can lead to combinatorial explosion: each additional parameter multiplies the number of possible configurations that data scientists must test.
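The combinatorial explosion is easy to quantify: with a handful of candidate values per parameter, an exhaustive search must evaluate every combination, and the count multiplies with each new parameter. A short sketch in Python, using hypothetical parameter names and values for illustration:

```python
# Sketch: how the tuning search space grows as parameters are added.
# The parameters and candidate values below are illustrative assumptions.
from itertools import product

param_grid = {
    "learning_rate": [0.001, 0.01, 0.1],     # 3 candidates
    "batch_size": [32, 64, 128, 256],        # 4 candidates
    "num_layers": [2, 4, 8],                 # 3 candidates
}

# An exhaustive search evaluates every combination of values.
configs = list(product(*param_grid.values()))
print(len(configs))       # 3 * 4 * 3 = 36 configurations

# Adding one more parameter with 5 candidate values multiplies the count by 5.
print(len(configs) * 5)   # 180 configurations
```

Each configuration here would correspond to a full model training run, which is why the growth matters for compute cost.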

Parameters can have a big impact on the performance of a model, and, as a result, that model’s impact on a business, said Scott Clark, co-founder and CEO of SigOpt Inc., which makes software for tuning deep learning parameters. Because parameter tuning often requires evaluating a wide variety of configurations, it can be computationally expensive. To the extent data centers are not run with sustainable energy, AI’s carbon footprint will get bigger.

But not all parameter tuning methods are equal. The more exhaustive or naive they are, the more computationally intensive: an exhaustive search tries every combination of values, while a naive search samples them at random, and either can require significant time to find a good solution. The more intelligent and adaptive the parameter tuning methods are, the less computationally intensive.
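The cost difference between exhaustive and budget-limited search can be sketched concretely. The toy objective and parameter ranges below are assumptions for illustration, standing in for an expensive train-and-validate run:

```python
# Sketch: exhaustive grid search vs. fixed-budget random search.
# The objective function and ranges are illustrative assumptions.
import random

def objective(lr, depth):
    # Toy stand-in for model validation loss (lower is better).
    return (lr - 0.05) ** 2 + (depth - 6) ** 2 * 0.001

# Exhaustive grid: every combination is trained and evaluated.
lrs = [i / 1000 for i in range(1, 101)]   # 100 candidate learning rates
depths = list(range(1, 13))               # 12 candidate depths
grid_evals = [objective(lr, d) for lr in lrs for d in depths]
print(len(grid_evals))                    # 1200 training runs

# Naive random search: a fixed, much smaller evaluation budget.
random.seed(0)
rand_evals = [
    objective(random.uniform(0.001, 0.1), random.randint(1, 12))
    for _ in range(60)
]
print(len(rand_evals))                    # 60 training runs
```

Random search caps the compute bill, but it spends its budget blindly; adaptive methods aim to spend a similar budget more intelligently.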

Researchers and vendors are designing a new category of AI tools for parameter optimization that rely on Bayesian optimization algorithms instead of naive random search or exhaustive grid search. Bayesian optimization uses a combination of statistics and approximations to efficiently search for the best combination of parameters. SigOpt research claims its approach can slash compute time and power consumption by 95% compared with the standard practice of randomly searching configurations.
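The core idea, which the toy sketch below illustrates, is that each new evaluation is chosen using what previous evaluations revealed, balancing exploitation of promising regions against exploration of unknown ones. This is a crude nearest-neighbor surrogate written for illustration, not SigOpt's algorithm or a real Gaussian-process implementation:

```python
# Minimal sketch of adaptive, Bayesian-style tuning: use past results to
# pick the next configuration instead of sampling blindly.
# Objective, ranges and the exploration weight are illustrative assumptions.
import random

def objective(x):
    # Toy stand-in for an expensive train-and-validate run (lower is better).
    return (x - 0.3) ** 2

random.seed(1)
# Seed the model with a few random evaluations.
observed = [(x, objective(x)) for x in (random.random() for _ in range(3))]

for _ in range(12):
    candidates = [random.random() for _ in range(50)]

    def acquisition(x):
        # Crude surrogate: predict a candidate's loss from its nearest
        # observed neighbor, minus a bonus for distance from known points
        # (exploitation vs. exploration trade-off).
        dist, y_near = min((abs(x - xo), yo) for xo, yo in observed)
        return y_near - 0.5 * dist

    x_next = min(candidates, key=acquisition)      # most promising candidate
    observed.append((x_next, objective(x_next)))   # one expensive evaluation

best_x, best_y = min(observed, key=lambda p: p[1])
print(best_x)  # tends to land near the optimum at 0.3
```

With only 15 expensive evaluations, the adaptive loop concentrates its budget around the optimum, which is the mechanism behind the claimed compute savings over random search.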
