Data-driven decisions can transform every corner of a business, from cutting crucial seconds off factory processes to deciding the direction of an organization.
But one consequence of the data revolution is that it has forced the role of data scientist onto people who are not naturally number-literate.
It’s all very well investing in enterprise-grade data analytics software, but how can you make sure it’s being used to its fullest potential?
By looking at the data, of course.
This is really a form of operational analytics – using data to assess how well your operation is performing, rather than how customers use your website.
In broad strokes, it means looking at your usage dashboard to see who is using your Business Intelligence (BI) platform, how often, and what for, in order to identify any potential areas for improvement.
Essentially, it means evaluating how people are using your analytics software the same way you evaluate any other part of the business.
Eliminating cost wastage
The first angle of attack is assessing if the software is being used as much as it should be and making sure you’re not losing any money on it.
Start by monitoring who is actually using the software. Presumably, you will have had a plan in place for who should be using it, and roughly how often, so a good place to start is comparing actual usage against that plan.
Is the sales manager logging in a couple of times a week to monitor performance? Is the VP logging in at all? Is the resident data analyst logging in several times a day, as they should be?
Most importantly, is the shortcut just gathering dust on anyone’s desktop?
The next step is to compare the usage with the licence you’re paying for. Most SaaS data analytics software has tiered licensing, and if someone isn’t using it as much as you had expected, you could be paying for more than you need.
Equally, if someone’s usage doesn’t justify the licence tier they’re on, that could be a sign the software isn’t being used as fully as it should be.
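The usage-versus-plan comparison above can be sketched as a small audit script. Everything here is illustrative – the user names, planned login frequencies, licence tiers, and costs are assumptions, not features of any particular BI platform:

```python
# Hypothetical usage audit: compare actual BI logins against the plan
# and surface users whose licence may be costing more than it's worth.
# All names, frequencies, and costs below are illustrative.

PLANNED_LOGINS_PER_WEEK = {
    "data_analyst": 25,   # several times a day
    "sales_manager": 2,   # a couple of times a week
    "vp_operations": 1,
}

LICENCE_COST = {"viewer": 10, "analyst": 50}  # monthly cost per seat

def audit_usage(actual_logins, licences, threshold=0.5):
    """Flag users whose actual logins fall below `threshold` of the plan."""
    flags = []
    for user, planned in PLANNED_LOGINS_PER_WEEK.items():
        actual = actual_logins.get(user, 0)
        if actual < planned * threshold:
            tier = licences.get(user, "viewer")
            flags.append((user, actual, planned, LICENCE_COST[tier]))
    return flags

actual = {"data_analyst": 24, "sales_manager": 0, "vp_operations": 1}
licences = {"data_analyst": "analyst", "sales_manager": "analyst",
            "vp_operations": "viewer"}

for user, got, want, cost in audit_usage(actual, licences):
    print(f"{user}: {got}/{want} logins per week - paying {cost}/month")
```

In practice the login counts would come from your platform’s usage dashboard or audit log, but the comparison logic is the same: plan, actual, threshold, cost.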
Which leads us to…
Or in other words – are people making the most of the data available to them? Are they taking shallow figures for operational reports, or are they actually diving deep?
What data is being used?
Take a look at what data is being used, how often, and by whom. Check every level – from the dataset all the way down to specific tables and columns.
This should teach you two things:
● Whether people are drilling down to the appropriate level of detail
● Whether you have any redundant data
If you have any data lying around untouched, it could potentially be slowing the system down. It could also be an unnecessary cost, depending on where it comes from. Perhaps you’re sourcing it from a pay-per-use cloud service like AWS, and you could actually reduce the amount of data you’re importing, because it’s simply not being used.
Or perhaps it’s useful data that’s been unjustifiably overlooked.
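One way to sketch the table-level check described above is to scan a query log and count how often each known table appears. The table names and log format here are assumptions for illustration – most BI platforms expose this via their own audit or metadata views:

```python
# Hypothetical query-log scan: count references to known tables across
# BI queries and flag tables that are never touched. Table names and
# the log format are illustrative.
import re
from collections import Counter

ALL_TABLES = {"sales", "customers", "web_traffic", "legacy_orders"}

def table_usage(query_log):
    """Count the number of queries that reference each known table."""
    counts = Counter()
    for query in query_log:
        for table in ALL_TABLES:
            if re.search(rf"\b{table}\b", query, re.IGNORECASE):
                counts[table] += 1
    return counts

queries = [
    "SELECT region, SUM(amount) FROM sales GROUP BY region",
    "SELECT * FROM sales JOIN customers ON sales.cust_id = customers.id",
    "SELECT page, visits FROM web_traffic",
]

counts = table_usage(queries)
unused = ALL_TABLES - set(counts)
print("unused tables:", sorted(unused))  # tables no query ever touched
```

A table that never appears in the log is a candidate for either retirement (cutting cost and clutter) or promotion (if it’s useful data that’s been overlooked).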
How is data being used?
Find out which types of analyses and reports are most popular in your company and assess how far they go.
The key question here is whether people are actually using the software to discover things they otherwise wouldn’t have – things that can be converted into actionable improvements – or whether they’re just harvesting numbers for operational reports, wasting the opportunity that good BI software presents.
Acting on analytics
The trick, of course, is acting on the areas of improvement you’ve discovered. Broadly speaking, these fall into two categories: system and operator.
Data analytics software tends to be flexible enough to stretch across almost any business sector, but there may still be changes you can make as an administrator. Perhaps you need to change permission settings, or perhaps you’re gathering the wrong data.
Or maybe you’ve got the wrong platform for your users. Perhaps, in your enthusiasm for data, you bought into a system that appeals to the mathematics enthusiast, but not everyone in your organization gets the same thrill from tabulated data that you do.
Perhaps you need something with Natural Language Processing (NLP) that employs BI bots, allowing users to ask questions in plain English and receive answers in plain English, rather than having to trawl through pages and pages of graphs and tables.
Or maybe some training is required. In a world built on data, it’s easy to forget that some people still don’t think in numbers. An analytics dashboard can look like The Matrix to an untrained eye. And fewer eyes are trained than a number-head might think.
This could still be a case for switching to a more intuitive system, but it could also reveal a skills gap that can only be filled with training. There are plenty of resources on the UK Data Service’s website to help with learning how to deal with data. Alternatively, a whole industry in data training has sprung up in the wake of Big Data.
After all, not everyone’s a data scientist, but effective use of data is an increasingly important competitive differentiator.
Making data accessible to non-scientists
Data is only valuable if it’s being used effectively. And as much as we like to concentrate on the process – because the technology is sexy, and numbers make everything seem clearer – it’s how we interact with data that really makes a difference.
Wired presciently published a piece in 2013 about how the Big Data Revolution was being slowed by the paucity of available data scientists, predicting that the real revolution would occur when data was made accessible to non-data scientists – i.e. packaged in a format accessible to people for whom graphs and tables are kryptonite.
Applying analytical techniques to your analytics systems can yield a number of systemic improvements, but the biggest opportunity is identifying which people would benefit from learning more about how to slice data, or having the data presented in a more intuitive way.
Data is only as good as the people interpreting it.
(Until machine learning allows quantum computers to vastly surpass anything we could do with a dataset.)