The EU has arguably had a bigger impact on the use of data than any other international organization over the past few years, with the EU-US Privacy Shield and the Right to be Forgotten just two examples from the past couple of years alone. However, a new law called the General Data Protection Regulation (GDPR), adopted in April 2016, may shake up the data and analytics departments of companies even further.
When the law takes effect in 2018, its biggest impact will be on how companies use algorithms on their customers. The basic principle is that companies will not be allowed to profile people based on automated algorithms alone. In theory this seems like a good idea, because algorithms have repeatedly been shown to double down on existing prejudices. This is hardly surprising given that Microsoft's 'Tay' Twitter account, left to its own devices, became a racist, Hitler-loving sex pest based on automated algorithms alone.
However, the issues with algorithmic decisions can have far more dire consequences, with examples of them being used to reinforce and double down on existing problems, such as racial profiling in policing and targeting women with stereotypically female job roles. These have real repercussions: a historically higher percentage of minorities being arrested feeds directly into modern algorithms, skewing their outcomes. Essentially, bad data in creates bad data out.
The new EU law hopes to change this. According to Goodman and Flaxman, it 'stat[es] that a data subject has the right to "an explanation of the decision reached after [algorithmic] assessment"'. The law also bans decisions 'based solely on automated processing, including profiling, which produces an adverse legal effect concerning the data subject or significantly affects him or her.'
By hinging on automated algorithmic decisions that could adversely impact people, the law leaves one of its most important elements open to interpretation: how, for instance, does one define an adverse impact?
One company likely to be significantly affected is Amazon, which did £5.3bn in trade in the UK alone in 2014. It hosts third-party sellers in over 100 countries across the world, including every EU country, which is one potential issue. It also liberally uses recommendation engines to help its customers decide what to buy, to determine what content is advertised to them, and even to make certain items follow them around through advertising afterwards.
In an article for ProPublica, Julia Angwin and Surya Mattu discussed this issue, finding that Amazon's algorithm was having an adverse impact on both customers and Amazon's third-party sellers:
‘In an instant, Amazon’s software sifted through dozens of combinations of price and shipping, some of which were cheaper than what one might find at a local store. TheHardwareCity.com, an online retailer from Farmers Branch, Texas, with a 95 percent customer satisfaction rating, was selling Loctite for $6.75 with free shipping. Fat Boy Tools of Massillon, Ohio, a competitor with a similar customer rating was nearly as cheap: $7.27 with free shipping.
The computer program brushed aside those offers, instead selecting the vial of glue sold by Amazon itself for slightly more, $7.80. This seemed like a plausible choice until another click of the mouse revealed shipping costs of $6.51. That brought the total cost, before taxes, to $14.31, or nearly double the price Amazon had listed on the initial page.’
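The arithmetic behind the quoted example can be sketched in a few lines. The figures and seller names are taken directly from the passage above; the comparison logic itself is purely illustrative and is not Amazon's actual ranking code:

```python
# Offers from the ProPublica example: listed price vs. true total cost.
offers = [
    {"seller": "TheHardwareCity.com", "price": 6.75, "shipping": 0.00},
    {"seller": "Fat Boy Tools", "price": 7.27, "shipping": 0.00},
    {"seller": "Amazon", "price": 7.80, "shipping": 6.51},
]

def total_cost(offer):
    """Pre-tax amount the customer actually pays: price plus shipping."""
    return offer["price"] + offer["shipping"]

# Cheapest offer once shipping is included.
cheapest = min(offers, key=total_cost)

# Amazon's own offer, as selected by the algorithm in the example.
amazon = next(o for o in offers if o["seller"] == "Amazon")

ratio = total_cost(amazon) / total_cost(cheapest)
print(f"Cheapest by total cost: {cheapest['seller']} at ${total_cost(cheapest):.2f}")
print(f"Amazon total: ${total_cost(amazon):.2f} ({ratio:.2f}x the cheapest)")
```

Ranking offers by listed price while burying the shipping cost is exactly how a $7.80 selection ends up costing $14.31, roughly twice what the cheapest third-party seller was charging.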
This automated algorithmic decision could therefore have seen the customer paying more than double the cheapest available price, while deliberately taking trade away from third-party sellers. It would almost certainly be in breach of the GDPR, which in this example seems perfectly reasonable.
However, automated algorithms also underpin programmatic advertising, social media recommendations and even basic personalized marketing, so the law will impact almost every business currently using data. This is where the law and its ultimate enforcement come into question: on the surface it seems like a great idea to stop inherent prejudices, but is it worth it if it hangs a millstone around the neck of future progress?