The Evolution of Big Data: 2010 to 2016

With each passing year, big data gets bigger

Although the use of large data sets for enterprise purposes dates back to the 1930s, when the United States Social Security Administration introduced its numbering scheme, the second decade of the 21st century has been one of the most active periods of big data development and advancement.

Third Apache Hadoop Summit - 2010

In late June 2010, the Third Hadoop Summit was hosted in a Hyatt conference room in Santa Clara, California. A major organizer of the summit was internet giant Yahoo, which had taken a chance on Hadoop by deploying a cluster of 300 machines connected under the framework. This particular summit was momentous in the world of big data because it served as the catalyst for bringing online analytical processing to Hadoop at Yahoo, an initiative that would later become known as OLAP in Hadoop. At the summit, Yahoo also announced that its datasets had grown to 70 petabytes, while Facebook announced a 40-petabyte milestone. Hadoop has since become a mainstay of big data storage.

Big Data Expansion Capacity Study - 2011

A breakthrough study published by Science Magazine provided an unprecedented look at the staggering growth of big data since 1986. In 1986, a little over 99 percent of data stored for commercial purposes was kept in analog media formats. According to Winshuttle, it took a little over 10 years for that statistic to invert, meaning that 99 percent of enterprise data had switched to digital storage by 2007. By May of 2011, computer scientists estimated that the enterprise world was saving and storing 7.4 exabytes of data per year, more than all the data saved by individuals for private purposes in 2011.

The Official Launch of IPv6 - 2012

The Internet Society declared June 6, 2012 the formal launch day of IPv6, the new version of the Internet Protocol designed to accommodate the expansion of the online realm for decades to come. The developers of IPv6, the Internet Engineering Task Force, explained that they could not have delivered this protocol update without big data as a support tool for overcoming the core limitation of IPv4: a ceiling of roughly 4.3 billion addressable internet locations.
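As a back-of-the-envelope illustration (a sketch added here, not part of the original announcement), the 4.3 billion figure is simply the size of a 32-bit address space, while IPv6 widens addresses to 128 bits:

    # IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
    ipv4_space = 2 ** 32    # 4,294,967,296 -- the ~4.3 billion ceiling cited above
    ipv6_space = 2 ** 128   # roughly 3.4 x 10^38 possible addresses

    print(f"IPv4 address space: {ipv4_space:,}")
    print(f"IPv6 address space: {ipv6_space:.3e}")
    print(f"IPv6 is {ipv6_space // ipv4_space:,} times larger")  # a factor of 2**96

Any Python 3 interpreter will run this; it prints the two address-space sizes and their ratio, which is why IPv6 is expected to last for decades.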

Big Data and the Internet of Things - 2015

The concept of smart cities dates back to the early 21st century, when it was mostly limited to municipal services offered on digital platforms such as mobile SMS and interactive websites. A 2015 symposium by market research giant Gartner explained how smart cities are becoming smarter through the analysis of real-time data collected by digital sensors strategically installed across municipal districts. This marriage of big data analytics and the Internet of Things (IoT) was hailed as a glimpse of many developments to come.

When Big Data Failed U.S. Politics - 2016

Big data has been actively used in prediction modeling since 2012, and in 2016 it was widely trusted to predict a landslide victory for Hillary Clinton, who was widely expected to become the first woman President of the United States. The most reliable forecasts failed to predict Donald Trump's surprise victory under the Electoral College rules, calling the validity of big data into question. Many computer scientists, however, were not convinced that big data had failed the statisticians; they argued it was the other way around. In the end, the 2016 U.S. presidential election showed that there is still much to learn about how to handle big data.
