Could Big Data Predict Earthquakes?

Can Big Data do what many believe to be impossible?


This week we have all been shocked at the scale of death and devastation in Nepal after the 7.8 magnitude earthquake that is expected to have claimed 10,000 lives. The tremors were strong enough to cause avalanches in the Himalayas, topple housing in the capital Kathmandu, and be felt 900 miles away in India.

At times like this everybody asks whether more could have been done to predict this earthquake earlier, in order to help evacuate people and prevent so many casualties. It is generally accepted that there is no way to predict earthquakes accurately, but with Big Data and the technologies that let us analyze huge amounts of information, accurate prediction is beginning to look like a distinct possibility in the future.

This is because earthquakes do not just happen. They may begin hundreds of miles underground, but they do not occur in isolation. The problem for those trying to predict earthquakes is that each individual change is slight and can be explained away on its own; taken together, though, the cumulative effect of these changes can give researchers a good indication that something is about to happen.
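The idea of slight, individually explainable signals adding up to a meaningful warning can be sketched in a few lines of Python. This is purely a hypothetical illustration, not any researcher's actual model; the signal names, baselines, and readings below are invented for the example.

```python
# Hypothetical illustration: each precursor signal is only slightly
# above its baseline and easy to dismiss alone, but a combined score
# across independent signals can stand out.

def anomaly_score(reading, baseline_mean, baseline_std):
    """How many standard deviations a reading sits above its baseline."""
    return max(0.0, (reading - baseline_mean) / baseline_std)

def combined_indicator(signals):
    """Sum per-signal anomaly scores into one cumulative figure.

    `signals` maps a signal name to (reading, baseline_mean, baseline_std).
    """
    return sum(anomaly_score(*values) for values in signals.values())

# Each reading is only ~1.5 sigma above baseline -- unremarkable alone.
signals = {
    "ground_deformation_mm": (4.5, 3.0, 1.0),
    "radon_emission_bq_m3": (130.0, 100.0, 20.0),
    "surface_temp_anomaly_c": (1.8, 1.0, 0.5),
}

score = combined_indicator(signals)
print(round(score, 2))
```

No single signal here would trigger an alert, but the cumulative score across all three is well above what any one of them shows, which is the intuition behind combining many slight changes.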

The ability to predict when these earthquakes are likely to happen should be seen as important work, not only because around 13,000 people die every year in earthquakes, but also because an estimated $12 billion is lost to them. If even half of that loss could be invested in the technology needed to predict earthquakes, the chances of saving thousands of lives would be even greater.

Terra Seismic is a company trying to prove that earthquakes are predictable. It claims that, through the use of satellite data, it can predict major earthquakes anywhere in the world with 90% accuracy. A prime example of this in action was its February 22 prediction of an earthquake of around magnitude 6.5 on the Indonesian island of Sumatra; on March 3 the island was hit by a 6.4 quake.

The company also claims to have predicted some of the world’s largest earthquakes, such as the 8.1 quake that hit Chile in 2010.

By bringing several disparate data sources together and running Python-based software on Apache servers, they are able to make accurate predictions in regions where sensors can pick up the information needed.
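A minimal Python sketch of what "bringing disparate data sources together" can mean in practice: merging records from independent feeds by region and flagging regions where anomalies from more than one source coincide. This is an assumption-laden toy, not Terra Seismic's actual pipeline; the feed names and regions are invented for illustration.

```python
from collections import defaultdict

def merge_feeds(*feeds):
    """Merge lists of (region, anomaly_flag) records from separate sources,
    counting how many sources report an anomaly for each region."""
    merged = defaultdict(int)
    for feed in feeds:
        for region, flag in feed:
            merged[region] += int(flag)
    return merged

def regions_to_watch(merged, threshold=2):
    """Return regions where anomalies from independent sources coincide."""
    return sorted(region for region, count in merged.items() if count >= threshold)

# Hypothetical feeds: each is one independent data source.
satellite_thermal = [("sumatra", True), ("kathmandu", True), ("chile", False)]
gps_deformation = [("sumatra", True), ("kathmandu", False), ("chile", True)]
ionospheric = [("sumatra", True), ("kathmandu", True)]

watch = regions_to_watch(merge_feeds(satellite_thermal, gps_deformation, ionospheric))
print(watch)  # regions flagged by at least two independent sources
```

The design point is that no single feed is trusted on its own: a region is only flagged when independent sources agree, which is the same cumulative logic described above applied across data sources rather than signal types.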

The problem with this, as we have seen with the Nepal earthquake, is that the countries with this kind of technology are often the richest. Those who can afford such infrastructure are also the best prepared for earthquakes and will naturally suffer fewer deaths and less disruption.

Recent events have shown how important this could be, and it should be a matter of international urgency to make sure the capability to track this kind of information is in place.

