The 3 Golden Rules For Choosing A Big Data Solution

We pinpoint the three areas to look at when making your choice


It is now well established that companies that adopted data analytics early are outperforming their competitors, in some sectors by a documented margin of 2:1.

Those that haven’t should make it a business priority within the next 12 months; otherwise they may find that their competitors are too far ahead to catch.

The problem many of these companies face is that they do not know which system they should use to start their data programmes. There are hundreds on the market, ranging from barebones free options to multi-million pound systems.

Given this huge variety, and the massive disparity in what could be invested, it is important for companies to stay informed about their options and which would be the best fit for them.

We believe there are three key points to look at when new systems are being considered.

Future Proof

Big Data and analytics are developing at breakneck speed, and what can be done with data is constantly changing. Right now we are seeing companies move from traditional Hadoop systems to Spark, marking a shift towards the high-speed analysis that in-memory systems make possible.

The two questions to ask about the technology are: what do you need now, and what will you need in the future?

If the amount of data you hold is already huge and you want to analyze it quickly, you will need a very fast, and expensive, system. Ideally it would be in-memory capable, which demands high-speed hardware and relatively complex systems from day one.

If the programme you envisage does not need this straight away, but may in the future, then it is worth looking at a system that can be upgraded easily and cheaply. Spending a few thousand more on a system that can add the capabilities you will need later, rather than saving money initially on one that can’t, will ultimately save you money.

Affordable and Useful

A Big Data programme is only as good as the company it works within. Although it may seem progressive and positive to jump in with both feet, invest in the best system straight away and then attempt to make sense of everything it finds, the reality is quite different.

The multi-million dollar systems available today exist because the companies that started with a single data scientist and one simple system grew to the point of needing them.

Starting small and demonstrating good ROI before investing more heavily is ultimately how businesses will get the most benefit from Big Data initiatives. If a company jumps straight in and loses hundreds of thousands of dollars in the first couple of years without making any meaningful impact, support for the programme will dissipate quickly.

Even companies of a similar size can vary hugely in their data maturity, and thus in what they need from their systems. A company just starting out in data analysis is unlikely to need a system that can digest gigabytes of data in milliseconds. It is far more beneficial to invest in a system that can scale easily in the future but doesn’t bankrupt the company in the present.

How Educated Are You?

How much of the data analysis can be done in house? Do you have the skills to write algorithms or run complex queries? There are systems that cater to both skilled and unskilled users; choosing the correct one is vital to the success of your business.

Essentially, if you already have an excellent data scientist within the company, data science can run for next to nothing. Beyond their salary and a basic system, they can run a programme on Hadoop alone, meaning the only additional cost is storage.

On the other hand, if these kinds of skills are scarce within your company, solutions exist that will run the queries for you, mine your data, and even help to gather and store it.

The second option is more expensive, but it sidesteps the problem of finding a qualified and skilled data scientist at a time when they are scarce and command increasingly high wages. These options often allow for customization and run in the cloud, removing the need to find space to house servers or equipment.

Ultimately, the most important thing to remember when choosing a Big Data solution is that there is no one size fits all. It is like golf clubs: you can buy the most expensive set, but give Rory McIlroy the most basic and he will still beat you. The best advice we can offer is to put scalability at the heart of your choices, because your data usage will increase and you will need systems that can keep up without breaking the bank.
