One of the most important developments in data over the past five years has been its clear potential in the medical industry. Making more money for businesses and creating a better customer experience is one thing; the ability to actually save lives is quite another.
This move has hit a significant number of roadblocks, though, for an obvious reason: medical data is incredibly personal. It may be most useful when people have maximum access to it, but the companies with the power to really make a difference are precisely the companies most people do not want holding their medical data.
Companies like Facebook and Google are generally held in poor esteem by the public when it comes to data, because it is widely known that they make their billion-dollar profits from it. It is therefore essential that the foundational moves in this area be transparent and squeaky clean; failure in the early stages could have a huge impact on future initiatives.
Unfortunately, one of the early success stories, the collaboration between London’s Royal Free Hospital and DeepMind, seems to have had some major issues.
The hospital and Google’s deep learning company failed to follow data sharing protocols when they worked together to create Streams, an app designed to detect acute kidney injury. According to the Information Commissioner’s Office (ICO), the rules were broken because the app continued to undergo testing after patient data was transferred and patients were not adequately informed that their data was being used for this testing.
Elizabeth Denham, the Information Commissioner, commented on the findings, saying:
‘Our investigation found a number of shortcomings in the way patient records were shared for this trial…Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening. We’ve asked the Trust to commit to making changes that will address those shortcomings, and their co-operation is welcome.’
It is important for companies to stick to these rules in the early stages of anything, because ultimately they are the guinea pigs upon which all future collaborations will be judged. If DeepMind and London’s Royal Free Hospital have got this wrong in a very public way, the chances are that other hospitals and other companies will think again before moving into a similar area.
The use of data in medicine will be perhaps one of the most important developments of the next decade, but to achieve that kind of success, the foundations upon which others build must be stable. What we have seen here is certainly a failure in that regard.
However, when brand new projects like these are initiated, there is always going to be a certain degree of ‘greyness’, simply because it is an untested area. The ICO’s actual criticism took the form of recommendations and punished neither DeepMind nor the hospital. Instead, it offered what were closer to guidelines to help the app continue development, given its importance.
Therefore, rather than a data ‘failure’, it is perhaps better to see this as a learning curve and a case study. There is little doubt that both the hospital and DeepMind did things wrong, but rather than smashing the whole idea to pieces, the ICO took a more reasoned approach, offering advice rather than punishment. This will hopefully lead to a more positive outcome; had the authorities been less lenient, the result could have been considerably worse for the future of these kinds of collaborations.