
Big Data Top Trends 2017

What will be the new trends in big data over the next 12 months?

18 Nov

2016 is coming to an end, much to the relief of many. Having looked at the trends in big data for the past couple of years, we thought we would continue the tradition into what will be an incredibly unpredictable year, both in terms of data and the world in general.

Machine Learning, AI, And IoT To Become Common

One thing that we are almost certain about is that the use of machine learning, artificial intelligence, and the internet of things will increase significantly in 2017. We have seen devices like the Amazon Echo sell over 4 million units, with Google and Microsoft also launching similar connected, AI-focussed devices for the home. As these numbers increase even further, the concept of having a device you can speak to in order to carry out certain tasks is going to become normalized.

However, it is not only in the home that these technologies are going to become popular; companies are also going to adopt them in increasing numbers as their importance becomes more apparent. We have already seen the impact they can have in industries like oil and gas, where sensors have cut costs, increased efficiency, and improved safety, all whilst the industry has been struggling internationally.

This kind of success is going to be seen in more industries in 2017, with a growing number of companies looking at these new technologies and recognizing how they can have a profound impact on them. That impact won't necessarily be limited to how companies improve by using these technologies; we are also likely to see more companies move into the area themselves. We have already seen Samsung announce a deal to buy Harman to get into the connected car market, and we are likely to see more deals like this.

Increased AI Accuracy

2016 has seen AI take centre stage, but often not in a good way. First, we saw Microsoft's attempt to create an AI Twitter bot quickly descend into racist, homophobic, and anti-Semitic rants before it was taken offline. Then we saw how, after media pressure, Facebook handed its news agenda over to AI, which, despite Mark Zuckerberg's claims, has been blamed for failing to weed out fake news masquerading as the real thing. In fact, according to the Pew Research Center, 62% of Americans get all or some of their news from social media, of which Facebook is the biggest contributor. A study conducted by Buzzfeed found that highly partisan sites often spread false or deliberately misleading content, with three popular right-wing sites guilty of this 38% of the time and three left-wing sites guilty 19% of the time.

What this shows us is that although AI has developed at breakneck speed over the past couple of years, it is still not at the level it needs to be for several essential purposes. These failures are going to spur companies to improve the performance of their AI systems almost as a matter of necessity. Take the Facebook example: in 2013, there were 4.75 billion pieces of content shared on the site every day. That number has likely grown since, and a team of humans would simply not be able to deal with anything close to that volume.

Therefore, AI will become more accurate because companies have been backed into a corner and now need to improve what they're doing as a matter of urgency.

Companies Will Need To Prepare To Operate At Speed

Speed in this context is not limited to the raw speed of technologies, like in-memory or even quantum computing, but also covers the speed and volume at which data is going to be created.

As smart elements like the IoT, connected cars, and wearables gain traction in our society, the amount of data we will be able to collect is going to increase substantially too. Companies will need to prepare for this, as current systems will either need to be fully replaced to deal with the new influx of data or scaled to cope with these changes.

2017 is likely to be the time for this scaling, as the connected device market is going to grow even more quickly than in 2016 but is unlikely to peak fully in the next 12 months; that is more likely in the next 24-48 months. It is therefore going to be a key year for many companies to evaluate their data capabilities and refine them before the flow of data increases even faster.

Less Industry Specialization

The data skills gap is something that has been much discussed ever since the use of data in businesses became so important. The predictions made several years ago have largely come true and there are several leading data leaders that we have spoken to who have found it difficult to find the talent needed for their teams. However, there are several who are taking a different approach to hiring that is likely to gain ground in 2017.

This form of hiring will not look at the industry experience that people have, but will instead focus almost entirely on their actual data analysis skills.

One of the reasons this approach hasn't already been widely adopted is that conventional wisdom holds that a data scientist should have an in-depth knowledge of the business they work in. However, several companies we have spoken to over the last 6 months, having hired based not on industry experience but on data experience, have found that their teams have become more knowledgeable and productive, whilst it is also simpler to fill vacant positions.

One of the key reasons for this is that every new team member who comes from a different industry looks at problems in a different way, rather than simply taking their pre-existing knowledge of the industry and imposing it on what's in front of them. 2017 is likely to see this approach grow in popularity, with less specialization but more data knowledge.

Government Scrutiny On Data

One thing that is almost guaranteed over the next 12 months is that governments are going to begin looking at how data is being used by companies and government departments.

We have seen, with the hacking of the DNC during the US election, the loss of data from the Office of Personnel Management, and even Hillary Clinton's use of a private email server, that data security is now in the national consciousness. In the UK, Chancellor Philip Hammond has announced £1.9 billion to help protect British companies against hackers, and this kind of policy is being mirrored in many countries across the world, with Donald Trump also showing signs that he is going to look at US cyber security too.

With the growing threat from cyber attacks, governments are having their hands forced into taking action; the big questions for 2017 will be what form this takes and how it will impact data programmes.
