
Power Loves the Dark

LA Protests After Grand Jury Decides Not to Indict Officer Darren Wilson
Matthew Harwood,
Former Managing Editor,
ACLU
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
May 19, 2016

This piece originally appeared at TomDispatch.

Can’t you see the writing on the touchscreen? A techno-utopia is upon us. We’ve gone from smartphones at the turn of the twenty-first century to smart fridges and smart cars. The revolutionary changes to our everyday life will no doubt keep barreling along. By 2018, so predicts Gartner, an information technology research and advisory company, more than three million employees will work for “robo-bosses,” and soon enough we — or at least the wealthiest among us — will be shopping in fully automated supermarkets and sleeping in robotic hotels.

With all this techno-triumphalism permeating our digitally saturated world, it’s hardly surprising that law enforcement would look to technology — “smart policing,” anyone? — to help reestablish public trust after the 2014 death of Michael Brown in Ferguson, Missouri, and the deaths of other unarmed black men killed by cops in Anytown, USA. The idea that technology has a decisive role to play in improving policing was, in fact, a central plank of President Obama’s policing reform task force.

In its report, released last May, the Task Force on 21st Century Policing emphasized the crucial role of technology in promoting better law enforcement, highlighting the use of police body cameras in creating greater openness. “Implementing new technologies,” it stated, “can give police departments an opportunity to fully engage and educate communities in a dialogue about their expectations for transparency, accountability, and privacy.”

Indeed, the report emphasized ways in which the police could engage communities, work collaboratively, and practice transparency in the use of those new technologies. Perhaps it won’t shock you to learn, however, that the on-the-ground reality of twenty-first-century policing looks nothing like what the task force was promoting. Police departments nationwide have been adopting powerful new technologies that are remarkably capable of intruding on people’s privacy, and much of the time these are being deployed in secret, without public notice or discussion, let alone permission.

And while the task force’s report says all the right things, a little digging reveals that the feds not only aren’t putting the brakes on improper police use of technology, but are encouraging it — even subsidizing the misuse of the very technology the task force believes will keep cops honest. To put it bluntly, a techno-utopia isn’t remotely on the horizon, but its flipside may be.

Getting Stung and Not Even Knowing It

Stingray surveillance van

Shemar Taylor was charged with robbing a pizza delivery driver at gunpoint. The police got a warrant to search his home and arrested him after learning that the cell phone used to order the pizza was located in his house. How the police tracked down the location of that cell phone is what Taylor’s attorney wanted to know.

The Baltimore police detective called to the stand in Taylor’s trial was evasive. “There’s equipment we would use that I’m not going to discuss,” he said. When Judge Barry Williams ordered him to discuss it, he still refused, insisting that his department had signed a nondisclosure agreement with the FBI.

“You don’t have a nondisclosure agreement with the court,” replied the judge, threatening to hold the detective in contempt if he did not answer. And yet he refused again. In the end, rather than reveal the technology that had located Taylor’s cell phone to the court, prosecutors decided to withdraw the evidence, jeopardizing their case.

And don’t imagine that this courtroom scene was unique or even out of the ordinary these days. In fact, it was just one sign of a striking nationwide attempt to keep an invasive, constitutionally questionable technology from being scrutinized, whether by courts or communities.

The technology at issue is known as a “Stingray,” a brand name for what’s generically called a cell-site simulator or IMSI catcher. This device, originally developed for overseas battlefields, mimics a cell tower and gets nearby cell phones to connect to it. It operates a bit like the children’s game Marco Polo. “Marco,” the cell-site simulator shouts out, and every cell phone on that network in the vicinity replies, “Polo, and here’s my ID!”

Thanks to this call-and-response process, the Stingray knows both what cell phones are in the area and where they are. In other words, it gathers information not only about a specific suspect, but also about any bystanders in the area. While the police may indeed use this technology to pinpoint a suspect’s location, by casting such a wide net there is also the potential for many kinds of constitutional abuses — for instance, sweeping up the identities of every person attending a demonstration or a political meeting. Some Stingrays are capable of collecting not only cell phone ID numbers but also the numbers those phones have dialed and even phone conversations. In other words, the Stingray is a technology that potentially opens the door for law enforcement to sweep up information that not so long ago wouldn’t have been available to them.
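The dragnet character of that call-and-response exchange can be sketched in a few lines of code. This is purely an illustration, not how any real device works: the phone IDs and the `stingray_sweep` function are invented for the example, and an actual cell-site simulator performs its tower impersonation at the radio level.

```python
# Toy model of the "Marco Polo" exchange described above. IDs and the
# function itself are invented for illustration; a real cell-site
# simulator impersonates a carrier's tower over the air.

def stingray_sweep(phones_in_range, suspect_id):
    """Broadcast 'Marco'; every phone in range answers 'Polo' with its ID."""
    collected = list(phones_in_range)  # the device logs every responder
    # Everyone who is not the suspect is an innocent bystander,
    # yet their IDs end up in the log all the same.
    bystanders = [p for p in collected if p != suspect_id]
    return collected, bystanders

collected, bystanders = stingray_sweep(
    ["IMSI-1001", "IMSI-1002", "IMSI-SUSPECT", "IMSI-1004"],
    suspect_id="IMSI-SUSPECT",
)
print(len(collected))   # 4 phones identified
print(len(bystanders))  # 3 of them belong to bystanders
```

The point of the sketch is simply that finding one suspect necessarily means cataloguing everyone else in range.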

All of this raises the sorts of constitutional issues that might normally be settled through the courts and public debate... unless, of course, the technology is kept largely secret, which is exactly what’s been happening.

After the use of Stingrays was first revealed in 2011, the American Civil Liberties Union (ACLU) and other activist groups attempted to find out more about how the technology was being used, only to quickly run into heavy resistance from police departments nationwide. Served with “open-records requests” under Freedom of Information Act-like state laws, departments almost uniformly resisted disclosing information about the devices and their uses. In doing so, they regularly cited nondisclosure agreements they had signed with the Harris Corporation, maker of the Stingray, and with the FBI, prohibiting them from telling anyone (including other government outfits) about how — or even that — they use the devices.

Sometimes such evasiveness reaches near-comical levels. For example, police in the city of Sunrise, Florida, served with an open-records request, refused to confirm or deny that they had any Stingray records at all. Under cover of a controversial national security court ruling, the CIA and the NSA sometimes resort to just this evasive tactic (known as a “Glomar response”). The Sunrise Police Department, however, is not the CIA, and no provision in Florida law would allow it to take such a tack. When the ACLU pointed out that the department had already posted purchase records for Stingrays on its public website, it generously provided duplicate copies of those very documents and then tried to charge the ACLU $20,000 for additional records.

In a no-less-bizarre incident, the Sarasota Police Department was about to turn some Stingray records over to the ACLU in accordance with Florida’s open-records law, when the U.S. Marshals Service swooped in and seized the records first, claiming ownership because it had deputized one local officer. And excessive efforts at secrecy are not unique to Florida, as those charged with enforcing the law commit themselves to Stingray secrecy in a way that makes them lawbreakers.

And it’s not just the public that’s being denied information about the devices and their uses; so are judges. Often, the police get a judge’s sign-off for surveillance without even bothering to mention that they will be using a Stingray. In fact, officers regularly avoid describing the technology to judges, claiming that they simply can’t violate those FBI nondisclosure agreements.

More often than not, police use Stingrays without bothering to get a warrant, instead seeking a court order based on a lesser legal standard. This is part of the charm of a new technology for the authorities: nothing is settled on how to use it. Appellate judges in Tallahassee, Florida, for instance, revealed that local police had used the tool more than 200 times without a warrant. In Sacramento, California, police admitted that they had, in more than 500 investigations, used Stingrays without telling judges or prosecutors. That was “an estimated guess,” since they had no way of knowing the exact number because they had conveniently deleted records of Stingray use after passing evidence discovered by the devices on to detectives.

Much of this blanket of secrecy, spreading nationwide, has indeed been orchestrated by the FBI, which has required local departments eager for the hottest new technology around to sign those nondisclosure agreements. One agreement, in Oklahoma, explicitly instructs the local police to find “additional and independent investigative means” to corroborate Stingray evidence. In short, they are to cover up the use of Stingrays by pretending their information was obtained some other way — the sort of dangerous constitutional runaround that is known euphemistically in law enforcement circles as “parallel construction.”

Now that information about the widespread use of this new technology is coming out — as in the Shemar Taylor trial in Baltimore — judges are beginning to rule that Stingray use does indeed require a warrant. They are also insisting that police must accurately inform judges when they intend to use a Stingray and disclose its privacy implications.

Garbage In, Garbage Out

Predictive Policing

And it’s not just the Stingray that’s taking local police forces into new and unknown realms of constitutionally questionable but deeply seductive technology. Consider the hot new trend of “predictive policing.” Its products couldn’t be high-techier. They go by a variety of names like PredPol (yep, short for predictive policing) and HunchLab (and there’s nothing wrong with a hunch, is there?). What they all promise, however, is the same thing: supposedly bias-free policing built on the latest in computer software and capable of leveraging big data in ways that — so their salesmen will tell you — can coolly determine where crime is most likely to occur next.

Such technology holds out the promise of allowing law enforcement agencies to deploy their resources to areas that need them most without that nasty element of human prejudice getting involved. “Predictive methods allow police to work more proactively with limited resources,” notes the RAND Corporation. But the new software offers something just as potentially alluring as efficient policing — exactly what the president’s task force called for. According to market leader PredPol, its technology “provides officers an opportunity to interact with residents, aiding in relationship building and strengthening community ties.”

How idyllic! In post-Ferguson America, that’s a winning sales pitch for decision-makers in blue. Not so surprisingly, then, PredPol is now in use across the United States, and investment capital just keeps pouring into the company. In 2013, SF Weekly reported that over 150 departments across the nation were already using predictive policing software, and those numbers can only have risen as the potential for cashing in on the craze has attracted tech heavy hitters, including Palantir, the co-creation of PayPal co-founder Peter Thiel.

Like the Stingray, the software for predictive policing is yet another spillover from the country’s distant wars. PredPol was, according to SF Weekly, initially designed for “tracking insurgents and forecasting casualties in Iraq,” and was financed by the Pentagon. One of the company’s advisors used to work for In-Q-Tel, the CIA’s venture capital firm.

Civil libertarians and civil rights activists, however, are less than impressed with what’s being hailed as breakthrough police technology. We tend to view it instead as a set of potential new ways for the police to continue a long history of profiling and pre-convicting poor and minority youth. We also question whether the technology even performs as advertised. As we see it, the old saying “garbage in, garbage out” is likely to best describe how the new software will operate, or as the RAND Corporation puts it, “predictions are only as good as the underlying data used to make them.”

If, for instance, the software depends on historical crime data from a racially biased police force, then it’s just going to send a flood of officers into the very same neighborhoods they’ve always over-policed. And if that happens, of course, more personnel will find more crime — and presto, you have the potential for a perfect feedback loop of prejudice, arrests, and high-tech “success.” To understand what that means, keep in mind that, without a computer in sight, nearly four times as many blacks as whites are arrested for marijuana possession, even though usage rates among the two groups are roughly equal.
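The feedback loop can be made concrete with a toy simulation. Every number and neighborhood name below is an invented assumption, not real crime data: two areas have the identical underlying offense rate, but one starts with more recorded arrests simply because it was historically over-policed, and the software allocates patrols in proportion to past arrests.

```python
# Toy "garbage in, garbage out" feedback loop. All figures are invented
# for illustration. Both neighborhoods have the SAME true offense rate;
# only the historical arrest counts differ.

def run_feedback_loop(arrests, rounds=10, patrols=100):
    rates = {hood: 0.05 for hood in arrests}   # identical true rates
    for _ in range(rounds):
        total = sum(arrests.values())
        for hood in arrests:
            # The software dispatches patrols in proportion to past
            # arrests, and more patrols record proportionally more
            # arrests, which feed the next round of predictions.
            share = arrests[hood] / total
            arrests[hood] += int(patrols * share * rates[hood] * 100)
    return arrests

history = run_feedback_loop({"Northside": 60, "Southside": 40})
# The initially over-policed neighborhood racks up an ever larger
# absolute lead, even though behavior is identical in both.
print(history["Northside"] > history["Southside"])  # True
```

Nothing in the loop ever consults actual offending, only past arrest records, which is exactly the worry about software trained on historically biased data.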

If you leave aside issues of bias, there’s still a fundamental question to answer about the new technology: Does the software actually predict crime or, for that matter, reduce it? Of course, the companies peddling such products insist that it does, but no independent analyses or reviews had verified its effectiveness until last year — or so it seemed at first.

In December 2015, the Journal of the American Statistical Association published a study that brought joy to the predictive crime-fighting industry. The study’s researchers concluded that a predictive policing algorithm outperformed human analysts in indicating where crime would occur, which in turn led to real crime reductions after officers were dispatched to the flagged areas. Only one problem: five of the seven authors held PredPol stock, and two were co-founders of the company. On its website, PredPol identifies the research as a “UCLA study,” but only because PredPol co-founder Jeffery Brantingham is an anthropology professor there.

Predictive policing is a brand-new area where question marks abound. Transparency should be vital in assessing this technology, but the companies generally won’t allow communities targeted by it to examine the code behind it. “We wanted a greater explanation for how this all worked, and we were told it was all proprietary,” Kim Harris, a spokeswoman for Bellingham, Washington’s Racial Justice Coalition, told the Marshall Project after the city purchased such software last August. “We haven’t been comforted by the process.”

The Bellingham Police Department, which purchased predictive software made by Bair Analytics with a $21,200 Justice Department grant, didn’t need to go to the city council for approval and didn’t hold community meetings to discuss the development or explain how the software worked. Because the code is proprietary, the public is unable to independently verify that it doesn’t have serious problems.

Even if the data underlying most predictive policing software accurately anticipates where crime will indeed occur — and that’s a gigantic if — questions of fundamental fairness still arise. Innocent people living in or passing through identified high-crime areas will have to deal with an increased police presence, which, given recent history, will likely mean more questioning or stopping and frisking — and arrests for things like marijuana possession for which more affluent citizens are rarely brought in. Moreover, the potential inequality of all this may only worsen as police departments bring online other new technologies like facial recognition.

We’re on the verge of “big data policing,” warns law professor Andrew Ferguson, which will “turn any unknown suspect into a known suspect,” allowing an officer to “search for information that might justify reasonable suspicion” and lead to stop-and-frisk incidents and aggressive questioning. Just imagine having a decades-old criminal record and facing police armed with such powerful, invasive technology.

This could lead to a Faustian bargain in which the public increasingly forfeits its freedoms in certain areas out of fears for its safety. “The Soviet Union had remarkably little street crime when they were at their worst of their totalitarian, authoritarian controls,” says MIT sociologist Gary Marx. “But, my god, at what price?”

To Record and Serve... Those in Blue


On a June night in 2013, Augustin Reynoso discovered that his bicycle had been stolen from a CVS in the Los Angeles suburb of Gardena. A store security guard called the police while Reynoso’s brother Ricardo Diaz Zeferino and two friends tried to find the missing bike in the neighborhood. When the police arrived, they promptly ordered his two friends to put their hands up. Zeferino ran over, protesting that the police had the wrong men. At that point, they told him to raise his hands, too. He then lowered and raised his hands as the police yelled at him. When he removed his baseball hat, lowered his hands, and began to raise them again, he was shot to death.

The police insisted that Zeferino’s actions were “threatening” and that the shooting was therefore justified. They had two videos of it taken by police car cameras — but refused to release them.

Although police departments nationwide have been fighting any spirit of new openness, car and body cameras have at least offered the promise of bringing new transparency to the actions of officers on the beat. That’s why the ACLU and many civil rights groups, as well as President Obama, have spoken out in favor of the technology’s potential to improve police-community relations — but only, of course, if the police are obliged to release videos in situations involving allegations of abuse. And many departments are fighting that fiercely.

In Chicago, for instance, the police notoriously opposed the release of dashcam video in the shooting death of Laquan McDonald, citing the supposed imperative of an “ongoing investigation.” After more than a year of such resistance, a judge finally ordered the video made public. Only then did the scandal of seeing Officer Jason Van Dyke unnecessarily fire 16 shots into the 17-year-old’s body explode into national consciousness.

In Zeferino’s case, the police settled a lawsuit with his family for $4.7 million and yet continued to refuse to release the videos. It took two years before a judge ordered their release, allowing the public to judge the shooting for itself.

Despite this, in April 2015 the Los Angeles Board of Police Commissioners approved a body-camera policy that failed to ensure future transparency, while protecting and serving the needs of the Los Angeles Police Department (LAPD). In doing so, it ignored the sort of best practices advocated by the White House, the president’s task force on policing, and even the Police Executive Research Forum, one of the profession’s most respected think tanks.

On the possibility of releasing videos of alleged police misconduct and abuse, the new policy remained silent, but LAPD officials, including Chief Charlie Beck, didn’t. They made it clear that such videos would generally be exempt from California’s public records law and wouldn’t be released without a judge’s order. Essentially, the police reserved the right to release video only when and how they saw fit. This self-serving policy comes from the most lethal large police department in the country.

Other departments around the country have made similar moves to ensure control over body camera videos. Texas and South Carolina, among other states, have even changed their open-records laws to give the police power over when such footage should (or should not) be released. In other words, when a heroic cop saves a drowning child, you’ll see the video; when that same cop guns down a fleeing suspect, don’t count on it.

Curiously, given the stated positions of the president and his task force, the federal government seems to have no fundamental problem with that. In May 2015, for example, the Justice Department announced competitive grants for the purchase of police body cameras, officially tying funding to good body-cam-use policies. The LAPD applied. Despite letters from groups like the ACLU pointing out just how poor its version of body-cam policy was, the Justice Department awarded it funding to purchase approximately 700 cameras — accountability and transparency be damned.

To receive public money for a tool theoretically meant for transparency and accountability and turn it into one of secrecy and impunity, with the feds’ complicity and financial backing, sends an unmistakable message on how new technology is likely to affect America’s future policing practices. Think of it as a door slowly opening onto a potential policing dystopia.

Hello Darkness, Power’s Old Friend

Keep in mind that this article barely scratches the surface when it comes to the increasing number of ways in which the police’s use of technology has infiltrated our everyday lives.

In states and cities across America, some public bus and train systems have begun to add to video surveillance the surreptitious recording of the conversations of passengers, a potential body blow to the concept of a private conversation in public space. And whether or not the earliest versions of predictive policing actually work, the law enforcement community is already moving to technology that will try to predict who will commit crimes in the future. In Chicago, the police are using social networking analysis and prediction technology to draw up “heat lists” of those who might perpetrate violent crimes someday and pay them visits now. You won’t be shocked to learn which side of the tracks such future perpetrators live on. The rationale behind all this, as always, is “public safety.”

Nor can anyone begin to predict how law enforcement will avail itself of science-fiction-like technology in the decade to come, much less decades from now, though cops on patrol may very soon know a lot about you and your past. They will be able to cull such information from a multitude of databases at their fingertips, while you will know little or nothing about them — a striking power imbalance in a situation in which one person can deprive the other of liberty or even life itself.

With little public debate, often in almost total secrecy, increasing numbers of police departments are wielding technology to empower themselves rather than the communities they protect and serve. At a time when trust in law enforcement is dangerously low, police departments should be embracing technology’s democratizing potential rather than its ability to give them almost superhuman powers at the expense of the public trust.

Unfortunately, power loves the dark.
