The Analytics Evolution: From Volume of Data to Analytical Speed

We take a look at the importance of speed when it comes to data


We’ve known for some time now that Big Data is the cornerstone for companies that want to anticipate the needs of their customers.

E-commerce, now in its third decade, has developed significantly over the last couple of years; the insights available at the start of Tesco’s Clubcard scheme in 1994 look prehistoric compared with what’s on offer now.

Back in 1994, Tesco wanted to get rid of some of the data it held because it was simply too expensive to store. Thanks to the rise of platforms such as Hadoop, there has been considerable progress in this space of late, making it cheaper than ever to analyse data.

There are currently around 70 billion automatic auctions happening each day. Lasting only a few milliseconds, these auctions decide which companies will advertise to a specific website user by analysing the type of person they are, based on data collected about their recent browsing history. This is why we’re forever inundated with adverts that resemble the last couple of websites we’ve visited.
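The mechanics behind these auctions can be sketched as a simplified second-price auction, the format ad exchanges commonly use: the highest bidder wins the ad slot but pays the second-highest bid. All advertiser names, bid values, and the profile format below are hypothetical illustrations, not any real exchange’s API.

```python
import time

def run_auction(user_profile, bids):
    """Pick the winning advertiser in a simplified second-price auction.

    `bids` maps advertiser name -> bid (in pence) offered for showing
    an ad to this user profile. The highest bidder wins but pays the
    second-highest bid.
    """
    if not bids:
        return None
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    # Winner pays the runner-up's bid (or their own if unopposed)
    price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, price

# Hypothetical bids keyed off a user's recent browsing history
profile = {"recent_sites": ["travel-blog", "flight-comparison"]}
bids = {"AirlineA": 42, "HotelB": 35, "InsurerC": 20}

start = time.perf_counter()
result = run_auction(profile, bids)
elapsed = time.perf_counter() - start

print(result)  # AirlineA wins, paying HotelB's bid of 35
```

The auction logic itself runs in a tiny fraction of the time budget; in practice the latency comes from collecting bids over the network, which is why real exchanges enforce strict response deadlines on bidders.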

In an industry where speed is about far more than data, F1 team Lotus recently signed a deal with EMC to revamp its data systems.

F1 is one of the world’s most technically complex sports, and data has become fundamental to the success of its top teams.

Discussing the speed at which Lotus’ data now operates, Michael Taylor, Lotus F1 Team’s IT director, states:

'The team is now able to analyse car aerodynamics in two hours, rather than the two weeks it took previously'

Through platforms such as Hadoop and NoSQL, data can now be processed and analysed at speeds that were unimaginable to Tesco’s chairman Lord MacLaurin back in 1994. Given advancing consumer expectations, as well as the platforms mentioned above, who knows what speed data will be processed at in five years’ time.

