
New Orleans Program Offers Lessons In Pitfalls Of Predictive Policing

Jay Stanley,
Senior Policy Analyst,
老澳门开奖结果 Speech, Privacy, and Technology Project
March 15, 2018

Should police gather up statistical information from a variety of sources and then use computer algorithms to try to predict who is likely to be involved in violent crime in the future? Just such an attempt has been underway in New Orleans, as the Verge reported on Feb. 27 and the New Orleans Times-Picayune described in a story on March 1.

In the wake of these reports, the Times-Picayune reported that New Orleans has decided to end its partnership with the data mining company Palantir, whose software the program used. Nevertheless, there are several important lessons that we can draw from the city's experience.

The New Orleans Police Department (NOPD) program, according to the Verge, centered around

an intelligence technique called social network analysis (or SNA) to draw connections between people, places, cars, weapons, addresses, social media posts, and other indicia in previously siloed databases…. After entering a query term – like a partial license plate, nickname, address, phone number, or social media handle or post – NOPD's analyst would review the information scraped by Palantir's software and determine which individuals are at the greatest risk of either committing violence or becoming a victim, based on their connection to known victims or assailants.

The data on individuals came from information scraped from social media as well as NOPD databases for ballistics, gangs, probation and parole information, jailhouse phone calls, calls for service, the central case management system (i.e., every case NOPD had on record), and the department's repository of field interview cards.
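To make that workflow concrete, here is a minimal, purely illustrative sketch of what this kind of social network analysis can look like in code. It is not Palantir's software or the NOPD's actual system; the library (networkx), the sample records, and the distance-based scoring rule are all assumptions chosen for illustration.

```python
# Illustrative sketch only -- not Palantir's software or NOPD's actual system.
# It shows, in miniature, what "social network analysis" over linked records
# can mean: build a graph from previously siloed data sources, then flag
# people by how closely they are connected to known victims or assailants.
# All names and records below are hypothetical.
import networkx as nx

# Records that, in a real deployment, would come from separate databases
# (field interview cards, jailhouse calls, ballistics, social media, etc.).
links = [
    ("Person A", "Person B"),  # e.g., co-listed on a field interview card
    ("Person B", "Person C"),  # e.g., frequent jailhouse phone contact
    ("Person C", "Person D"),  # e.g., same address on file
    ("Person D", "Person E"),
]
known_involved = {"Person A"}  # previously identified victim or assailant

graph = nx.Graph()
graph.add_edges_from(links)

# Score everyone else by graph distance to anyone "known": the closer the
# connection, the higher the assigned "risk" -- which is exactly why thin or
# biased input data translates directly into arbitrary or biased scores.
scores = {}
for person in graph.nodes:
    if person in known_involved:
        continue
    distance = min(
        nx.shortest_path_length(graph, person, k) for k in known_involved
    )
    scores[person] = 1.0 / distance

for person, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{person}: risk score {score:.2f}")
```

Even in this toy version, the output depends entirely on which encounters happened to be recorded and which people were already labeled "known" – a point that matters throughout what follows.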

Police officials and Palantir executives disputed the characterizations in this report, but the NOPD did acknowledge to the Times-Picayune that they had created a "risk assessment database" – a "database of about 1 percent of the city's population who are more likely to be perpetrators or victims of gun violence," as the paper put it.

Predictive policing and police "threat scores" are deeply problematic applications of "big data" analytics. We identified eight big problems with one such program in 2016. The reports on the New Orleans program, however, add important new information to our picture of how such a program has actually played out in the messy reality of a big American city.

Transparency, Oversight, and Community Support

First, as the Verge exhaustively demonstrates, top New Orleans political and community leaders, including city council members, were not told about this program, let alone asked whether they thought it was a good idea. Because the program was presented as a philanthropic gift to the city by Palantir, the Verge notes, and because the mayor of New Orleans holds strong unilateral powers, the agreement between Palantir and the city never had to pass through a public procurement process, which would have required the sign-off of the city council and provided an opportunity for the program to be publicly debated.

In another indicator of the lack of transparency around this program, the Times-Picayune obtained information about police use of data, including a heat list called a "gang member scorecard," but the police department refused the paper's requests for interviews about the programs. The department relented after the Verge piece was published, mostly in order to dispute some of what the Verge reported – but it still refused to divulge the factors used to identify and rank those listed on the scorecard.

The solution for this kind of problem is for cities like New Orleans to enact legislation that we at the 老澳门开奖结果 are advocating through our effort called Community Control Over Police Surveillance, or CCOPS. CCOPS is a push to enact legislation at the state and local level that would prohibit police departments or other agencies from acquiring or using surveillance technologies without public input and the approval of elected representatives. If New Orleans had had such a statute in place, this technology could not have been deployed before the community knew about it.

It is true that in this case the program was not a secret. Palantir and city officials talked about it from time to time. I was aware of it myself, having heard a presentation on it at a "big data" conference. Does that mean that CCOPS is not the remedy here? To the contrary. The fact that, as the Verge exhaustively shows, city officials and communities didn't know about it highlights the need for regular processes by which to put programs like this before elected officials and the communities they serve. New Orleans city council members and other community leaders have a lot on their plates and generally do not focus on big data issues. They can't be expected to have known about it from discussions within that specialized community.

CCOPS also requires law enforcement to secure city council approval of operational policies for surveillance technologies. That helps communities verify that those technologies and policies don't run afoul of civil rights and civil liberties principles – a significant risk with predictive policing software.

We couldn't summarize it any better than former NOPD crime analyst Jeff Asher, who told the Times-Picayune that this kind of technology "needs oversight, it needs transparency, it needs community support." None of those mutually reinforcing and necessary (but not sufficient) conditions can be met without the right institutional structure in place – the kind of structure that CCOPS requires.

Fortunately, there are strong indications that this wisdom is beginning to sink in. The Verge quotes one of Palantir's own employees, Courtney Bowman, as saying, "These sorts of programs only work if the community is comfortable with the degree to which this type of information is being applied and if they're aware of how the information is being used." And while police departments around America continue to acquire and deploy sensitive new technologies in secret, a number of the savvier police chiefs I've spoken with appreciate the need to get community buy-in before they deploy controversial new technologies.

Stop and frisk

One of the key data inputs for this predictive policing program, as mentioned above, was the NOPD database of field interview cards (FICs). Under the FIC program, which may well be unconstitutional, NOPD officers were instructed to fill out information on every encounter with citizens, even where there was no arrest. Police officials tout the intelligence benefits of this data.

Although the FIC program has been improved in recent years, there is every reason to believe there's a heavy racial bias in whose information is entered into this database. Certainly there has been a strong racial bias in stops made under the infamous "stop and frisk" program, as well as similar programs in Milwaukee and other cities. The police may stop and hassle people for no reason and collect information on them, but they're a lot less likely to do that in affluent White neighborhoods. Police know that affluent Whites are just not part of the "mistreatable class."

And of course it's not just FIC data that has a racial bias; so does arrest, conviction, and much other data tied to the criminal justice system. As everyone knows, when it comes to computer algorithms, bad data produces bad results – "garbage in, garbage out." Or in this case, "racism in, racism out." Tying the FIC database to the city's risk assessment program also increases the NOPD's incentives to perpetuate the collection of more and more data.
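To illustrate the "garbage in, garbage out" point, here is a small hypothetical simulation – not based on any real NOPD data – showing how a measure built from recorded police contacts inherits the bias in where those contacts happen: two neighborhoods with identical underlying behavior end up with very different paper trails simply because one is stopped far more often.

```python
# Hypothetical illustration of "racism in, racism out": two neighborhoods with
# identical underlying behavior, but one is patrolled (and stopped) far more
# often. Any "risk" measure built on recorded police contacts then ranks
# residents of the over-policed neighborhood higher, purely as an artifact of
# where the data was collected. All numbers are made up for illustration.
import random

random.seed(0)

# Chance that a given incident results in a recorded stop / field interview card.
STOP_RATES = {"Neighborhood A": 0.50, "Neighborhood B": 0.10}

def recorded_contacts(neighborhood: str, true_incidents: int = 10) -> int:
    """Count how many of the same number of incidents end up in the database."""
    rate = STOP_RATES[neighborhood]
    return sum(1 for _ in range(true_incidents) if random.random() < rate)

for hood in STOP_RATES:
    # Identical underlying behavior (10 incidents each), very different records.
    print(hood, "recorded contacts:", recorded_contacts(hood))
```

The database never records the stops that were never made, so downstream scores cannot correct for the skew on their own.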

Carrots and sticks

Another lesson from the New Orleans experience has to do with the outputs of predictive risk assessments. Four years ago, when we first heard about this kind of law enforcement application of data analytics in the form of Chicago's "heat list," I wrote:

Overall, the key question is this: will being flagged by these systems lead to good things in a person's life, like increased support, opportunities, and chances to escape crime – or bad things, such as surveillance and prejudicial encounters with the police?

The Verge reports that New Orleans settled on both – a "carrot and stick" approach called the "CeaseFire" program. The New Orleans police

used the list of potential victims and perpetrators of violence generated by Palantir to target individuals for the city's CeaseFire program…. In the program, law enforcement informs potential offenders with criminal records that they know of their past actions and will prosecute them to the fullest extent if they re-offend. If the subjects choose to cooperate, they are "called in" to a required meeting as part of their conditions of probation and parole and are offered job training, education, potential job placement, and health services.

That鈥檚 not quite how things worked out, however. The Verge quotes a community activist named Robert Goodman who worked with people identified as at risk in the CeaseFire program.

Over time, Goodman noticed more of an emphasis on the "stick" component of the program and more control over the non-punitive aspects of the program by city hall that he believes undermined the intervention work.

The numbers tell the story. According to the Times-Picayune, there was a sharp drop in "call-ins," in which those identified as high risk are offered social services and support.

Records show the city hosted 10 call-ins from October 2012 through November 2015, bringing in 285 participants. Since November 2015, only one call-in has been held – in March 2017, records show.

On the other hand, the Verge reports:

By contrast, law enforcement vigorously pursued its end of the program. From November 2012, when the new Multi-Agency Gang Unit was founded, through March 2014, racketeering indictments escalated: 83 alleged gang members in eight gangs were indicted in the 16-month period.

The fact that New Orleans emphasized the stick over the carrot suggests further reasons for skepticism about such programs. As I have discussed, how we evaluate the potential pitfalls and benefits of such programs is much different when data analytics are used to offer benefits to people than when they are used to impose adverse consequences. The consequences of inaccurate identifications are far greater when people are hurt, for example. Unfortunately, when analytics programs such as this are introduced in the national context and culture of a justice system that functions as a racist "New Jim Crow," this kind of outcome – a tiny, shriveled carrot and a big, brutal stick – is all too predictable.

Social media and "risk assessments"

In the wake of the publication of the Verge's report, New Orleans officials and Palantir disputed the portrayal of the program in that piece. Police officials, including the NOPD's top data analyst, spoke to the Times-Picayune, and Palantir's Courtney Bowman reached out to us at the 老澳门开奖结果.

For example, the Verge report repeatedly asserts that the New Orleans program made use of "social media posts" and "information scraped from social media." The use of social media as part of any law enforcement risk assessment program is a big concern because it might chill speech, organization, and dissent. Bowman told us that "there was no bulk social media data collection or scraping" as part of this program, and that social media was only "used on an ad hoc basis," for example when collected in the course of specific criminal investigations (a use we do not object to). Police officials told the Times-Picayune, meanwhile, that their "gang scorecard" was simply a (non-Palantir) spreadsheet with names sorted by "the number of gun related events," and said it hadn't been used. Bowman told us that "opaque algorithms, statistical models, machine learning, or AI were never elements of this effort," and emphasized that (as with its other clients) Palantir did not actually collect or store any data itself.

I'm glad to hear about these limitations on what the program involved. None of that makes any difference, however, for the points I make above. If officials believe that the details of what they are doing have been exaggerated or misunderstood, they have only themselves to blame for failing to build the transparency and trust that would have made the program's contours clear to all. And what police officials did acknowledge building – that "risk assessment database" – is troubling enough.

Finally, what are we to make of today's news that the city has terminated its agreement with Palantir? First, it's a reminder that the benefits of this kind of approach to policing are unproven. Transparency about the uses of analytics in law enforcement is important because the field is so new – but that very novelty also means that we don't know how experiments in this area will turn out. Sometimes new technologies are the subject of a lot of hype, excitement, and sales pitches and are eagerly adopted by police departments, but then die on the vine because they don't actually prove to be effective or practical. That happened with face recognition right after 9/11, for example. Here the mayor's office told the Times-Picayune that the contract is not being renewed because "This technology is no longer being utilized in day-to-day operations." That fact may speak louder than any statement the New Orleans police may issue. The incoming mayor told the paper that "this particular model of policing will no longer come under review."

At the same time, as with face recognition, we have to assume that police departments – perhaps including New Orleans – will continue to experiment with data analytics in ways that will raise civil liberties issues. If they do, we can only hope they absorb the lessons above.
