Artificial intelligence (AI) can distinguish a dog from a cat, but the billions of calculations needed to do so demand a great deal of energy. The human brain performs the same task using only a tiny fraction of that energy. Could this inspire us to develop more energy-efficient AI systems?

Researchers from the University of Massachusetts calculated that training a single model for natural language processing emits 284 metric tons of carbon dioxide, equivalent to the lifetime emissions of five cars, including their manufacture. Some AI models developed by tech giants, whose training runs are not reported in the scientific literature, may emit far more.
With the help of this compression strategy, the number of calculations can already be reduced by 30% to 50%. More application-specific techniques can then improve efficiency further. In this way, more than 90% of the energy can be saved by optimizing the AI model alone, before hardware is even considered.
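The article does not specify which compression technique is meant. As an illustration only, here is a minimal sketch of magnitude-based weight pruning, one common way to remove a target fraction of a model's weights and thereby cut its calculations; the function name and the `sparsity` parameter are our own, not taken from the source:

```python
import numpy as np

def prune_by_magnitude(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the smallest-magnitude weights of a layer.

    `sparsity` is the fraction of weights to remove, e.g. 0.4
    for the 30%-50% reduction range mentioned above.
    (Illustrative sketch; not the technique named in the article.)
    """
    # Find the magnitude below which the smallest `sparsity`
    # fraction of weights falls, then mask those weights to zero.
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask

# Example: prune 40% of a randomly initialized layer's weights.
rng = np.random.default_rng(0)
layer = rng.normal(size=(256, 256))
pruned = prune_by_magnitude(layer, sparsity=0.4)
print(f"Kept {np.count_nonzero(pruned) / pruned.size:.0%} of weights")
```

In practice, the pruned model is usually fine-tuned afterwards to recover accuracy, and the energy saving only materializes if the hardware or runtime can skip the zeroed weights.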

Source: “We Need To Talk About An Energy Label For AI”, Forbes
