I’m a sociologist who studies how police use data. Relying on algorithms can further bias and inequality — but it doesn’t have to be that way.


Sarah Brayne is an assistant professor of sociology at The University of Texas at Austin; prior to joining the faculty at UT-Austin, Brayne was a postdoctoral researcher at Microsoft Research.
Brayne is also the founder and director of the Texas Prison Education Initiative, a group of faculty and students who volunteer to teach college classes in prisons throughout Texas.
The following article is based on her forthcoming book, “Predict and Surveil: Data, Discretion, and the Future of Policing.”
In it, she writes that many politicians and reformers have called for more and better data on policing — but it’s important to remember that the notion of objective big data is a false premise.
Algorithms are shaped by the social world — and, dangerously, they can make biases invisible in the form of numbers.
But data can help identify where non-police responses would maximize public safety — for example, when a mobile mental health crisis team might be more helpful than an armed officer.

Business leaders and social scientists know: bad data in, bad data out. As the country struggles with demands for systemic change in policing, would-be reformers are suggesting that “data-driven policing” should be part of the solution. It won’t solve the problem.

In the weeks since George Floyd was killed by then-Minneapolis police officer Derek Chauvin, protestors, politicians, and pundits have called for an end to policing as we know it. Amid urgent conversations about how to skillfully defund, shrink, or abolish the police, reformers have sought the supposed objectivity of big data. Their logic is to strip discretion from biased front-line officers and replace it with “neutral” “data-driven” policing to solve the all-too-human problems of unjust, violent, and racist policing.

As a sociologist, I have spent nearly a decade studying how police already use big data in American cities. That’s hundreds of hours interviewing cops, shadowing crime analysts, and riding in police cruisers to see how officers deploy data in the field.

My on-the-ground approach helped me understand the practice of data-driven policing, including how cops use predictive algorithms and emerging surveillance technologies — many initially designed for military applications — to decide who, when, where, and how to police.

The same politicians and reformers who call for more data, better data, and bigger databases to rein in police brutality optimistically suggest applying them to any number of other problems in policing: Defunding the police and need to cut costs? Data can help allocate resources most efficiently. Need to reduce racial bias in officer decision-making? Automate it. Want to reduce the categorical suspicion of young Black men and more accurately predict crimes? Try predictive algorithms.

The promise of both efficiency and accountability has already proven nearly irresistible to police departments, yet I have seen firsthand how systems that could help with, say, officer accountability, are not used for that purpose.

The bigger problem is that the objectivity of big data is a false premise. Algorithms are shaped by the social world in which they are created and used. That may be great if you’re a …

Source: Business Insider
