About the project

Cop Out: Automation in the Criminal Legal System

Systems of policing and punishment in the U.S. are undergoing a major change: police, judges, prosecutors and other legal authorities are increasingly using algorithmic technologies when making decisions that have critical consequences for the lives of individuals and communities.

Algorithms are ubiquitous in the criminal legal system. As we all move through the world, a mosaic of overlapping surveillance technologies collects and stores information about us. Law enforcement agencies, adjudicators and corrections officials use algorithms to filter the billions of pieces of data these surveillance technologies gather and to make calculations about which neighborhoods to police, what offense to charge someone with, whether bail should be set, how long a person should be incarcerated and when they will be free again.

These algorithms don’t produce “neutral” or “objective” calculations. They are built with real-world data, which records and reflects the criminal legal system’s biases and abuses. That means, for example, that police relying on the output of a crime forecasting algorithm will go to the same streets and target the same people they have in the past. And in most cases, that produces a feedback loop resulting in a persistent and disproportionate police presence in communities of color.

The companies that build the algorithms behind policing technologies take for granted existing approaches to policing and punishment, approaches that communities are actively contesting. What should the police’s role be? What is the real meaning of public safety? Are there better ways to prevent and redress harm? Those are questions that must always be part of ongoing democratic deliberation about justice. But policing algorithms codify assumptions about justice, thereby preserving the status quo.

Most of us are unaware of the ways that algorithmic tools, trained on information gathered through continuous mass surveillance, are influencing legal system actors who have enormous power over our lives. This project’s goal is to illuminate that fact and to begin the process of making an opaque and disorienting system legible.

Download Documents

This website is accompanied by an essay expanding on the harms of algorithms in policing and punishment and an appendix containing an annotated bibliography of some of the most important scholarship on the use of algorithms in the criminal legal system.


Jameson Spivack

This project would not be possible without the tireless research and advocacy of our friends and allies who also seek a more just and equitable American criminal legal system. This includes lawyers, advocates, journalists, scholars and activists, as well as everyone else who filed public records requests, investigated how law enforcement uses algorithmic technologies and wrote about their legal, ethical and moral implications. It also includes those who have been subjected to carceral technologies and have bravely shared their stories with the public. Their work forms the basis of this project.

Special thanks go to Andrew Ferguson, Jerome Greco and Freddy Martinez for their critical guidance and expertise. The entire team at the Center, past and present, also deserves recognition for the countless ways they helped out with this project: Alvaro Bedoya, Katie Evans, Clare Garvie, Cynthia Khoo, Laura Moy, Harrison Rudolph, Korica Simon, Emily Tucker, David Vladeck, Nina Wang and Serena Zets. We are also grateful for the Center’s research assistants and summer fellows; our copy editor, Joy Metcalf; our illustrator, Lara Zigic; and our design and web development firm, nclud.

The Center on Privacy & Technology at Georgetown Law is supported by the Ford Foundation, the Kresge Foundation, the Open Society Foundations, the MacArthur Foundation, Luminate, the Media Democracy Fund, the Seldin/Haring-Smith Foundation and Georgetown University Law Center.