The AI Police
A new company makes artificial intelligence software that’s in use at a handful of police departments. Can it make law enforcement more transparent?
There’s a story Brett Goldstein ’96 likes to tell. It starts on a Friday night in 2010 with him sitting in a darkened Crown Victoria on a Chicago street, poring over maps. Goldstein was a commander at the Chicago Police Department, in charge of a small unit that used data analysis to predict where certain types of crime were likely to occur at any given time. Earlier that day, his computer models had forecast a heightened probability of violence on a particular South Side block. Now that he and his partner were there, Goldstein was doubting himself.
“It didn’t look like it should be a target for a shooting,” he recalled. “The houses looked great. Everything was well manicured. You expect, if you’re in this neighborhood, you’re looking for abandoned buildings, you’re looking for people selling dope. I saw none of that.”
Still, they staked it out. Goldstein’s wife had just given birth to their second child, and he was exhausted after a day in the office; he started to doze off. His partner argued that the data must be wrong. At 11 p.m., they left.
Several hours later, Goldstein woke up to the sound of his BlackBerry buzzing. There had been a shooting—on the block where he’d been camped out. “This sticks with me because we thought we shouldn’t be there, but the computer thought we should be there,” said Goldstein. He took the near-miss as vindication of his vision for the future of law enforcement. “I do believe in a policeman’s gut. But I also believe in augmenting his or her gut,” he said.
Seven years after his evening on the South Side, Goldstein threw on a gray suit and some aerodynamic sunglasses and headed out from his hotel in Midtown Manhattan to New Jersey. This spring, he founded CivicScape, a technology company that sells crime-predicting software to police departments. Nine cities are either using the software or in the process of implementing it, including four of the country’s 35 largest cities by population. Departments pay from $30,000 a year in cities with fewer than 100,000 people to $155,000 a year in cities with populations that exceed 1 million. Goldstein wanted to check in on the two clients that were furthest along: the police departments in the New Jersey towns of Camden and Linden.
Goldstein likes to harp on his own lack of charisma, but he’s well suited to be a pitchman to police departments. In Chicago, he rose from patrol officer to the city’s chief data officer over a seven-year government career, and he regularly drops a few war stories from the streets into his conversations with cops. He’s also peddling something every department is after nowadays: technological sophistication. The criminal justice system produces reams of data, and new computing methods promise to turn any pool of numbers into something useful. Today, almost every major police department in the country is using or has used some form of commercial software that makes predictions about crime, either to determine which blocks warrant a heightened police presence or, in some cases, which people are most likely to be involved in it. Technology is transforming the craft of policing.
Not everyone is rubbing their hands in anticipation. Many police officers still see so-called predictive policing software as mumbo jumbo. Critics outside law enforcement argue that it’s actively destructive. The historical information these programs use to predict patterns of crime isn’t a neutral recounting of objective fact; it’s a reflection of socioeconomic disparities and the aggressive policing of black neighborhoods. Computer scientists have held up predictive policing as a poster child for how automated decision-making can be misused. Others mock it as pseudoscience. “Systems that manufacture unexplained ‘threat’ assessments have no valid place in constitutional policing,” wrote a coalition of civil rights and technology organizations, including the ACLU, the Brennan Center for Justice, and the Center for Democracy & Technology, in a statement last summer.
A numbing progression of police shootings over the past several years serves as a reminder of what’s at stake when police officers see certain communities as disproportionately threatening. Over the course of eight days in late June, juries failed to convict officers who had killed black men in Minnesota, Ohio, and Wisconsin. In each case, the officer’s defense relied on his perception of danger. The worst-case scenario for predictive policing software is that it deploys officers to target areas with their guard already up, priming them to turn violent in what would otherwise be routine encounters.