You Look Like A Criminal! Predicting Crime With Algorithms.

Can you really predict if someone is going to commit a crime?

Some authorities are using facial recognition, predictive analytics and machine learning to predict who will commit a crime. Yet even if an algorithm could deduce the likelihood of an individual offending, apprehending a suspect before a crime is committed surely cannot lead to a conviction, as no offence will actually have taken place. Yes, this is all very Minority Report.

Nevertheless, companies are currently working on these technologies to catch the bad guys before they even strike.

Cloud Walk tracks people's locations to note where they go and, using this location data, rates how likely they are to commit a crime. According to a spokesperson talking to the Financial Times, authorities in China can track both location data and purchases: a person buying just a kitchen knife would not be considered suspicious, but if the same person later buys a hammer and a sack, their rating goes up and they are flagged as a potential criminal.
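The FT report gives no detail of Cloud Walk's actual model, but the behaviour described resembles a simple weighted score over a purchase history. Here is a purely hypothetical sketch in Python; the item weights and threshold are invented for illustration and have nothing to do with Cloud Walk's real system:

```python
# Hypothetical illustration only: a toy rule-based risk score over a
# purchase history, loosely mirroring the behaviour described above.
# The weights and threshold are invented, not Cloud Walk's.
ITEM_WEIGHTS = {
    "kitchen knife": 1,  # innocuous on its own
    "hammer": 2,
    "sack": 3,
}
FLAG_THRESHOLD = 5  # arbitrary cut-off for this sketch

def risk_score(purchases: list[str]) -> int:
    """Sum the weights of any watched items in a purchase history."""
    return sum(ITEM_WEIGHTS.get(item, 0) for item in purchases)

def should_flag(purchases: list[str]) -> bool:
    return risk_score(purchases) >= FLAG_THRESHOLD

print(should_flag(["kitchen knife"]))                    # False: not suspicious alone
print(should_flag(["kitchen knife", "hammer", "sack"]))  # True: the rating goes up
```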

Researchers at Shanghai Jiao Tong University have carried out a study linking criminality to facial images. The algorithms were trained on headshots of over 1,000 faces taken from government IDs, 700 of them belonging to convicted criminals. These supervised learning systems can then separate the criminal and non-criminal groups and identify recurring traits in both sets. Fed new images, the system could predict with 90% accuracy who is more likely to be a criminal.
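The study's code isn't reproduced here, but the setup it describes is standard supervised binary classification. The following minimal sketch, using scikit-learn on synthetic stand-in data rather than any real ID photos, shows the shape of such a pipeline and why a high accuracy figure can be misleading:

```python
# Minimal sketch of the kind of supervised pipeline the study describes:
# binary classification of labelled face images. Synthetic random data
# stands in for the ID photos; nothing here reproduces the paper's model.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 128))    # stand-in for flattened face features
y = rng.integers(0, 2, size=1000)   # stand-in labels: 1 = "criminal" class

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# On real but systematically biased labels, a model can score "well"
# while merely learning how the two sets of photos were collected.
print(f"held-out accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```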

Using technology in this way is hugely questionable ethically, and it has the potential to be extremely dangerous and damaging for many minority groups.

Risk assessment software is increasingly used across the USA, and it has become an all too common story for algorithms to predict that minorities are the most likely to re-offend. As a report by ProPublica highlighted, these methods are deeply flawed at predicting who will actually go on to commit a crime: black defendants were labelled more likely to commit future crime than white defendants, and the bias is stark when a black defendant is rated higher risk than a white person with a long string of convictions, as in the case ProPublica outlined. These risk assessments are used by courts in decisions such as setting bond amounts and even sentencing.
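ProPublica's finding boils down to a concrete measurement: among defendants who did not re-offend, black defendants were far more likely to have been labelled high risk. A sketch of that false-positive-rate comparison, run on invented records rather than any real data:

```python
# Sketch of the disparity check at the heart of the ProPublica analysis:
# compare false positive rates (non-reoffenders wrongly labelled high
# risk) across groups. The records below are invented for illustration.
from collections import defaultdict

# (group, labelled_high_risk, actually_reoffended)
records = [
    ("black", True, False), ("black", True, False), ("black", False, False),
    ("black", True, True),  ("white", False, False), ("white", False, False),
    ("white", True, False), ("white", True, True),
]

false_positives = defaultdict(int)  # high-risk label, but did not reoffend
non_reoffenders = defaultdict(int)  # everyone who did not reoffend

for group, high_risk, reoffended in records:
    if not reoffended:
        non_reoffenders[group] += 1
        if high_risk:
            false_positives[group] += 1

for group in non_reoffenders:
    rate = false_positives[group] / non_reoffenders[group]
    print(f"{group}: false positive rate = {rate:.0%}")
```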

Israeli startup Faception advertises itself as a facial personality profiling company, offering computer vision and machine learning technology that deciphers facial images and predicts a person's personality. Opening up a can of worms of ethical questions, Faception is reportedly already working with homeland security agencies to help identify terrorists.

Using different classifiers, the technology claims to evaluate a person's facial mapping and identify certain traits with 80% accuracy. It is a technological revamp of centuries-old physiognomy practices that are better left in the 19th century. However, the company are also showcasing the technology in other ways, thankfully, such as poker.

Given the obvious bias and potential misuse of this kind of technology, it is difficult to see what benefits could be derived from it, although there may be some if it is used correctly. Even with regulation by impartial bodies and the elimination of bias, there would still be many moral and ethical hurdles to overcome.

What are your thoughts on this? 
