Does The WannaCry Ransomware Attack Show Why We Need To Be More Open?

The huge ransomware attack has exposed serious failures in how governments secure their systems and handle the vulnerabilities they discover

15 May 2017

As this article is being written, around 200,000 computers across the world have been affected by the WannaCry attack, which locks people out of their files by encrypting them and demanding payment for the key. Ransomware has been growing over the past year, as we noted in the article ‘Humans Are Still The Weak Link In Data Security’, and many of these infections will have come down to human error. However, it seems that the human error in this case goes well beyond opening an email attachment or visiting an unsecured site.

In the case of the NHS in the UK, where over 60 hospitals were hit and forced to cancel operations and non-emergency care, a number of huge errors have been pointed out, the most shocking being that many of the hospitals were still running Windows XP, an operating system originally released in 2001 and for which support ended completely more than three years ago. Responsibility appears to lie with ministers, who refused to fund an update of the system in 2015 and who have since ignored warnings from several major data security bodies, including the National Cyber Security Centre and the National Crime Agency.

The issue the UK government and NHS now face is that upgrading every system to bring it up to standard would cost hundreds of millions of pounds, which the NHS simply cannot afford with funding falling by around 20% in real terms between 2015 and 2020. The UK has already seen a huge impact on its healthcare capabilities, with the British Red Cross describing the situation in January 2017 as a ‘humanitarian crisis.’ The NHS is also not a single centralized system, but a huge number of interconnected systems, reaching from the smallest rural GP surgeries all the way up to the largest hospitals. As one user on LinkedIn put it, implementing this upgrade would be like changing the wheel on a bus while it is still moving.

A failure of this magnitude is not really about the NHS and the government needing to upgrade their systems after 16 years, but about their failure to update those systems effectively in all the years since. Many small incremental updates make a system considerably easier to protect; what has been left instead is a system so out of date that it needs a complete overhaul.

This is perhaps one of the single worst examples of a government living in the past, 16 years in the past in terms of operating systems and data security. The NHS holds the most sensitive information about every person who has ever entered the UK healthcare system, yet data protection was not considered important enough to upgrade to something less than 16 years old, or at least to something still supported by the company that created it. Given that Theresa May, the UK Prime Minister, has for several years been pushing to collect more information on British citizens’ internet browsing history despite widespread condemnation from tech companies, this abject data security failure is indicative of why people do not trust governments with their data.

However, the ultimate failure did not come from the UK (although the actions of the UK government were at best ill thought out and at worst pure idiocy) but from the NSA in the US.

The attack itself is not especially complex. Marcus Hutchins, the UK-based malware blogger who found a temporary kill switch for the ransomware, said: ‘It’s not a massively sophisticated attack. What is new is the use of a worm to propagate through systems.’ What allowed this worm to travel so far and so quickly was a security flaw in the Windows operating system that the NSA had found. Rather than sharing that information with Microsoft so the flaw could be fixed, the NSA kept it to itself, and once the details were stolen and shared online, the exploit was available to any criminal who wanted to use it.
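To illustrate the kind of kill switch Hutchins found: the ransomware reportedly tried to contact a hard-coded, unregistered web domain before doing anything else, and stopped if the request succeeded, so registering that domain effectively switched large parts of the attack off. The Python sketch below is a simplified, hypothetical illustration of that check pattern only; the domain, names and behaviour here are placeholders, not the actual WannaCry code.

```python
import urllib.request

# Placeholder only: the real ransomware used a specific hard-coded domain,
# which is deliberately not reproduced here.
KILL_SWITCH_DOMAIN = "http://example-unregistered-domain.invalid/"

def kill_switch_active(url: str = KILL_SWITCH_DOMAIN, timeout: float = 5.0) -> bool:
    """Return True if the hard-coded domain responds.

    In the reported WannaCry logic, a successful response meant 'stop here';
    an unreachable domain meant 'carry on spreading'.
    """
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True   # Domain resolves and responds: the kill switch is 'on'.
    except Exception:
        return False  # Domain unreachable: in the malware, spreading would continue.

if __name__ == "__main__":
    if kill_switch_active():
        print("Kill-switch domain reachable - the worm logic would halt.")
    else:
        print("Kill-switch domain unreachable - the worm logic would continue.")
```

Registering the domain made this check succeed on newly infected machines that could reach it, which is also why it was only a temporary fix: later variants could simply remove the check.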

There are naturally two sides to every story, and in the case of law enforcement it is clear that agencies have valid reasons to want access to systems. Apple’s refusal to give the FBI access to a phone used by one of the terrorists in the 2015 San Bernardino shootings is indicative of this, with the company declining either to create a backdoor into the phone or to provide a simpler way of unlocking it. The dispute eventually became moot when the FBI found a third party with a way around the iPhone’s security, but it is clear why law enforcement wanted access, and many argue it should have been granted. There are obvious situations where access to secure technologies is key to national security, but given what we have seen before in terms of leaks and misuse of data, many working in tech do not trust government organizations to handle that access, or the data behind it, with due diligence.

Perhaps, then, it could be argued that the single biggest error that allowed this ransomware attack to affect so many is that governments and private companies have not been co-operating as effectively as they should. Ultimately the NSA should have told Microsoft about the security flaw it found, but given that several technology companies have fought against giving government agencies any access, it is understandable why it didn’t. If there were more controlled co-operation, where companies could trust agencies not to abuse data and agencies knew they had legitimate routes to access rather than needing to find suspect means, the scale of this attack could have been significantly smaller.
