When Google bought DeepMind in January 2014 for a reported $500 million, many questioned why the search giant would spend so much on a company that had produced nothing of note until that point. Later that year DeepMind was named 'Company of the Year' by the Cambridge Computer Laboratory, and since then its trajectory has been steadily upward.
We recently discussed how it is making significant impacts in the NHS and how it may be doing some useful work with Moorfields Eye Hospital. But DeepMind hasn't been resting on its laurels: it is now beginning to make inroads into energy saving in Google's data centers.
Given that Google handles around 3.5 billion searches every day, it is no surprise that data centers are a major energy consumer; the IT sector as a whole is estimated to account for roughly 2% of global greenhouse gas emissions. This creates an issue for the company, given its mantra of 'don't be evil'. Few would argue that releasing harmful gases into the atmosphere on this scale is anything but harmful, even if it is a byproduct of something useful.
DeepMind has taken on the challenge of optimizing these data centers, and its initial work has cut energy overhead by 15%, including a 40% reduction in the energy used for cooling.
It is one of the best uses of artificial intelligence because, although human intuition suggests obvious fixes such as moving a data center to a colder climate or underground to save on cooling costs, the problem is predominantly a data-driven one. Mustafa Suleyman, co-founder of DeepMind, put it this way: 'It's one of those perfect examples of a setting where humans have a really good intuition they've developed over time but the machine-learning algorithm has so much more data that describes real-world conditions.'
The data he is alluding to spans five years of operations at Google's 15 huge data centers around the world. That, combined with the roughly 1.2 trillion Google searches conducted every year, is far too much for humans to study alone. It is something Mustafa is very aware of: 'it's able to learn from all sorts of niche little edge cases seen in the data that a human wouldn't be able to identify. So it's able to tune the settings much more subtly and much more accurately.'
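To make the idea of 'learning from past conditions to tune settings' concrete, here is a deliberately minimal sketch. It is not DeepMind's actual system (which reportedly uses neural networks trained on sensor logs to predict power usage effectiveness); it simply reuses the cooling setpoint that worked best under similar historical conditions. Every field name and number in it is invented for illustration.

```python
# Toy data-driven cooling controller: given current conditions, find the
# most similar entries in the historical log and reuse the setpoint that
# achieved the lowest recorded cooling energy. All values are invented.

HISTORY = [
    # (outside_temp_C, server_load_kW, cooling_setpoint_C, energy_kWh)
    (18.0, 900.0, 24.0, 310.0),
    (18.5, 920.0, 26.0, 285.0),
    (25.0, 950.0, 22.0, 420.0),
    (25.5, 940.0, 24.0, 390.0),
    (31.0, 980.0, 21.0, 510.0),
    (30.5, 990.0, 23.0, 495.0),
]

def choose_setpoint(outside_temp, load, k=3):
    """Pick the setpoint from the k most similar past conditions
    that yielded the lowest cooling energy."""
    def distance(row):
        temp, kw, _, _ = row
        # Scale the load term so both features contribute comparably.
        return abs(temp - outside_temp) + abs(kw - load) / 50.0

    neighbours = sorted(HISTORY, key=distance)[:k]
    cheapest = min(neighbours, key=lambda row: row[3])
    return cheapest[2]
```

For example, `choose_setpoint(25.2, 945.0)` returns `24.0`, the setpoint that was cheapest under the most similar logged conditions. A real controller would replace this lookup with a learned model and would also enforce hard safety constraints on temperatures, which this sketch omits.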
So far the experiment has been rolled out gradually, starting two years ago with just over 1% of servers and now covering a 'double-digit percentage' controlled through DeepMind's AI approach.
This is just the start of the potential, especially given the moves companies are making from on-premises systems to the cloud. For instance, Netflix, which accounted for 36.5% of the world's bandwidth in 2015, has moved to an Amazon Web Services cloud system. Amazon expects to have 17 data centers within the next year, holding a huge amount of information, including data from the two largest on-demand video sites, Netflix and Amazon Prime. If DeepMind, or a similar AI-focused company, could make comparable savings at these data centers, or others across the world, the impact could be huge.
This is only the beginning of how AI could support climate change initiatives. As we collect more information on its effects across industries, land use, and energy use, it will only become more powerful.