Human learning is way more power-efficient… for now.


While I was eating dinner, a simple thought occurred to me: is machine learning more power-efficient than human learning… from scratch?

A neural network called AlexNet was trained for five to six days on two GTX 580 3 GB GPUs, each of which draws 363 W.

So, simply calculating (taking 5.5 days as the midpoint): 2 \times 363W \times 5.5 \times 24h \approx 96kWh.
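The same back-of-the-envelope arithmetic as a tiny Python sketch (the 5.5-day midpoint of the reported "five to six days" is my assumption):

```python
# Rough energy estimate for the original AlexNet training run.
GPU_POWER_W = 363      # per-GPU draw, the figure this post uses
NUM_GPUS = 2
TRAINING_DAYS = 5.5    # midpoint of the reported "five to six days"

training_kwh = GPU_POWER_W * NUM_GPUS * TRAINING_DAYS * 24 / 1000
print(f"Training energy: {training_kwh:.1f} kWh")  # ~95.8 kWh
```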

I found a site that lists recommended energy intakes for newborn babies. After a funny discussion with my mom, we came to the conclusion that I might have been able to perform ImageNet by the time I was 3 years old, so taking that data gives the total food energy consumed over those first three years.

Comparing the two, I concluded that humans are about 300 times more power-efficient at learning from scratch. Of course, I used an old GPU for the training figures, so the numbers might differ nowadays (I didn't quickly find power figures for other setups, but I knew I would find these in Alex's paper. Maybe someone can work this out for other nets?).
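If you want to redo the comparison with your own intake data, here is a minimal sketch. The original figure from the site isn't reproduced here, so the example call simply backs the human-side energy out of the ~300x conclusion above; substitute whatever intake data you trust:

```python
KCAL_PER_KWH = 860.0  # 1 kWh is roughly 860 kcal

def efficiency_ratio(machine_kwh: float, human_kcal: float) -> float:
    """How many times more energy the machine used than the human."""
    return machine_kwh / (human_kcal / KCAL_PER_KWH)

# Example call: 275 kcal is NOT the site's data, just the human-side
# figure implied by the ~300x conclusion. Plug in your own numbers.
print(f"{efficiency_ratio(machine_kwh=95.8, human_kcal=275.0):.0f}x")
```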

Let's assume that energy efficiency follows the same exponential as Moore's law, so that power consumption halves every 18 months. The GTX 580 launched on 9 November 2010, so it's almost six years old now, i.e. roughly four halvings, and the consumption should be down to 363\times 2^{-4}\approx 22.7W per GPU. At that level the whole training run would take about 6kWh in total. Projecting these numbers forward, human-level power efficiency should be reached around the year 2022, six years from now. And with better algorithms it's probably going to be even sooner. Isn't this exciting?
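And the projection itself, as a sketch under the same assumptions (the 18-month halving, the ~96 kWh baseline, and the ~300x target ratio from above):

```python
from datetime import date
from math import log2

BASELINE_KWH = 95.8          # two GTX 580s for ~5.5 days, computed above
HALVING_YEARS = 1.5          # assumed Moore's-law-style halving period
LAUNCH = date(2010, 11, 9)   # GTX 580 launch date

# Energy for the same run on late-2016 hardware: almost four halvings.
elapsed_years = (date(2016, 11, 1) - LAUNCH).days / 365.25
today_kwh = BASELINE_KWH * 2 ** -(elapsed_years / HALVING_YEARS)
print(f"~{today_kwh:.0f} kWh today")  # ~6 kWh, as above

# Year when the run would cost no more than the human-side budget.
years_to_parity = log2(300) * HALVING_YEARS  # ~12.3 years after launch
print(f"parity around {LAUNCH.year + years_to_parity:.0f}")  # ~2022
```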
