In 2015, 990 people were shot dead by police in the US. Of these, 102 were unarmed black people. It feels like every week a new shocking video appears online depicting police officers showing a callous disregard for life. This has resulted in widespread civil unrest, protests, and killings of police officers in the streets.
Ending this cycle of violence is to the benefit of everyone, and police may have a new weapon in their efforts to ensure that the issue never rears its ugly head again. One tool that has long been at the police’s disposal and could serve this cause is data analytics.
Preventing crime before it happens has long been the aim of law enforcement, and a number of police forces are using the wealth of data around criminals and their activity to predict potential hotspots so that they can allocate resources accordingly. The LAPD, for example, has applied an algorithm used to predict aftershocks during earthquakes to crime, feeding it with crime data to establish patterns. According to Rayid Ghani, director of the Center for Data Science and Public Policy at the University of Chicago, the police can also apply similar analytics techniques to preventing violent incidents of their own making.
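The aftershock analogy can be made concrete with a toy self-exciting model: each past incident temporarily raises the expected rate of incidents nearby, decaying over time. This is only an illustrative sketch, not the LAPD’s actual algorithm; every parameter here is made up.

```python
import math

def intensity(t, past_events, background=0.1, boost=0.5, decay=1.0):
    # Expected event rate at time t = baseline rate plus decaying
    # contributions from every earlier event (exponential kernel,
    # chosen here purely for simplicity).
    return background + sum(
        boost * math.exp(-decay * (t - t_i))
        for t_i in past_events if t_i < t
    )

# The rate shortly after a burst of incidents is elevated...
print(round(intensity(5.0, [4.0, 4.5]), 3))   # 0.587
# ...and decays back toward the background rate later.
print(round(intensity(20.0, [4.0, 4.5]), 3))  # 0.1
```

A hotspot map is then just this intensity evaluated across a grid of locations, with patrols sent where the predicted rate is highest.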
Police departments collect a tremendous amount of data through complaints, dispatch logs, incident reports, and so forth. Ghani’s idea is to take this data and use it to help police departments predict which officers are at risk of adverse incidents. The current system, he argues, is reactive, and it is very clearly not working. Far better to focus on: “Can I detect these things early? And if I can detect them early, can I direct intervention to them — training, counseling.”
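An early-warning system of the kind Ghani describes might be sketched roughly as follows. Everything here, the feature names, the weights, and the threshold, is hypothetical and illustrative, not a description of any real department’s system.

```python
from dataclasses import dataclass

@dataclass
class OfficerRecord:
    officer_id: str
    complaints_12mo: int         # citizen complaints in the last 12 months
    use_of_force_12mo: int       # use-of-force reports in the last 12 months
    high_stress_dispatches: int  # e.g. domestic-violence or suicide calls

def risk_score(rec: OfficerRecord) -> float:
    # A weighted sum of record counts; the weights are illustrative
    # and would in practice be learned from historical outcomes.
    return (1.0 * rec.complaints_12mo
            + 2.0 * rec.use_of_force_12mo
            + 0.5 * rec.high_stress_dispatches)

def flag_for_intervention(records, threshold=5.0):
    # Return officers whose score crosses the threshold, for review
    # and possible training or counseling, not automatic discipline.
    return [r.officer_id for r in records if risk_score(r) >= threshold]

officers = [
    OfficerRecord("A-100", complaints_12mo=0, use_of_force_12mo=1, high_stress_dispatches=2),
    OfficerRecord("B-200", complaints_12mo=3, use_of_force_12mo=2, high_stress_dispatches=4),
]
print(flag_for_intervention(officers))  # ['B-200']
```

The point of such a system is exactly the shift Ghani describes: from reacting after an adverse incident to directing support before one happens.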
The most important thing with these algorithms, as with all of those being used to predict crimes, is the elimination of bias. Data mining looks for patterns in data. When it comes to crime, race is represented disproportionately in the data fed into a data-mining algorithm, which could lead the algorithm to infer that race is the determining factor when the real driver is poverty. The same is true when it comes to fighting police misconduct: much of it goes unreported, and there are many false claims, so it is impossible to get clean data. Eliminating such bias is vital if you are going to get a real picture of the problem.
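The confounding problem described above can be seen in a toy example with entirely made-up numbers: the outcome depends only on poverty, but because poverty is unevenly distributed across groups, a naive group-level comparison looks like a group effect, and stratifying by the real driver makes it vanish.

```python
# Each row: (group, poor, outcome_count, total)
records = [
    ("A", True,  30, 100),  # poor members of A: 30% outcome rate
    ("A", False, 10, 100),  # non-poor members of A: 10%
    ("B", True,  60, 200),  # poor members of B: 30%, same as A
    ("B", False,  5,  50),  # non-poor members of B: 10%, same as A
]

def rate(rows):
    # Aggregate outcome rate over a subset of rows.
    hits = sum(r[2] for r in rows)
    total = sum(r[3] for r in rows)
    return hits / total

# Naive comparison ignores poverty and finds a spurious "group effect".
rate_a = rate([r for r in records if r[0] == "A"])  # 40/200 = 0.20
rate_b = rate([r for r in records if r[0] == "B"])  # 65/250 = 0.26

# Stratified by poverty, the group difference disappears entirely.
poor_a = rate([r for r in records if r[0] == "A" and r[1]])  # 0.30
poor_b = rate([r for r in records if r[0] == "B" and r[1]])  # 0.30
```

An algorithm trained on the unstratified data would learn group membership as a predictor, which is exactly the kind of bias that has to be controlled for.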
There have been a number of other initiatives that have looked to help combat police misconduct. The city of Indianapolis, for one, has partnered with Code for America to launch Project Comport — an open-data platform for sharing information on complaints and use-of-force incidents that should make the kind of analysis Ghani is talking about far easier to carry out. Stanford University has also carried out big data research into police conduct toward African Americans in traffic and pedestrian stops in Oakland, finding a huge disparity in the way they were treated compared with whites. Stanford researchers analyzed 28,119 stop reports, along with officer body camera footage and community survey data. Such analysis could potentially have a profound impact on reducing the number of shootings, as many of them are the result of such traffic stops not being dealt with correctly.
Among Stanford’s findings were that African American men are four times more likely to be searched than whites during a traffic stop. African Americans were also more likely to be handcuffed, even if they were not ultimately arrested. Officers also brought up the subject of parole or probation more often when they stopped black people, while they were far more likely to explain the reason for a stop to a white person. The report made specific recommendations for police agencies to consider, including increasing the amount of data collection, as well as better focusing efforts on altering the mindsets, policies, and systems in law enforcement that are the driving force behind racial disparities.
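A finding such as the search-rate disparity boils down to a simple calculation over stop records. This is a minimal sketch of that calculation over hypothetical records, not the Stanford team’s actual pipeline, and the numbers are invented purely for illustration.

```python
from collections import defaultdict

# Hypothetical stop records: (race, was_searched)
stops = [
    ("black", True), ("black", False), ("black", True), ("black", False),
    ("white", False), ("white", False), ("white", False), ("white", True),
]

counts = defaultdict(lambda: [0, 0])  # race -> [searched, total stops]
for race, searched in stops:
    counts[race][0] += int(searched)
    counts[race][1] += 1

search_rate = {race: s / n for race, (s, n) in counts.items()}
ratio = search_rate["black"] / search_rate["white"]
print(f"search-rate ratio: {ratio:.1f}")  # 2.0 with these toy numbers
```

The real analysis also has to control for factors like location and the stated reason for the stop, which is where the confounding issues discussed earlier come back in.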
For a data-driven solution to work, the most important thing is that different police departments share their data and take a unified approach. There are some 18,000 different law enforcement agencies in the US, and they need to be pooling as much data as possible to enable the best possible analysis. Such insights are also worth nothing if no action is taken. Oakland PD, for one, is implementing Stanford’s recommendations. Oakland Police Assistant Chief Paul Figueroa noted, ‘This report provides a roadmap forward for the Oakland Police Department and police agencies across the country. This critical work moves from data collection to action. Oakland has already implemented many of the recommendations in the report and will move quickly to implement the remaining items.’ It’s clear there is a problem, and if anything is going to change, it’s vital that other departments follow Oakland PD’s example and work together to ensure data is used to its full potential.