
Four Problems with the ShotSpotter Gunshot Detection System

A new investigation points to yet another problematic outcome of the technology's use and the company's lack of transparency.
Jay Stanley,
Senior Policy Analyst,
ACLU Speech, Privacy, and Technology Project
August 24, 2021

(Updated below)

A report on the ShotSpotter gunshot detection system issued today by the City of Chicago's Inspector General (IG) is the latest indication of deep problems with the gunshot detection company and its technology, including its methodology, effectiveness, impact on communities of color, and relationship with law enforcement. The report questioned the "operational value" of the technology and found that it increases the incidence of stop-and-frisk tactics by police officers in some neighborhoods.

The IG's report follows a similarly critical report by the Northwestern School of Law's MacArthur Justice Center and devastating reporting by Vice News and the Associated Press. Last week, the AP told the story of Michael Williams, a man who spent a year in jail on murder charges based on evidence from ShotSpotter before having his charges dismissed when prosecutors admitted they had insufficient evidence against him.

ShotSpotter installs 20 to 25 microphones per square mile in the cities where it operates, and uses those microphones to try to identify and locate the sound of gunshots. In the past, we have scrutinized this company and its technology from a privacy perspective. Placing live microphones in public places raises significant privacy concerns. After looking at the details of ShotSpotter's system, we didn't think it posed an active threat to privacy, but we were concerned about the precedent it set (and others agreed).

But aural privacy, it turns out, is not the main problem with ShotSpotter. There are several other very significant civil liberties problems with the technology.

First, as the MacArthur Justice Center found, ShotSpotter is deployed overwhelmingly in communities of color, which already disproportionately bear the brunt of a heavy police presence. The police say they pick neighborhoods for deployment based on where the most shootings are, but there are several problems with that:

  • ShotSpotter false alarms send police on numerous trips (in Chicago, more than 60 times a day) into communities for no reason, on high alert and expecting to confront a potentially dangerous situation. Given the already tragic number of shootings of Black people by police, that is a recipe for trouble.
  • Indeed, the Chicago IG's analysis of Chicago police data found that the "perceived aggregate frequency of ShotSpotter alerts" in some neighborhoods leads officers to engage in more stops and pat-downs.
  • The placement of sensors in some neighborhoods but not others means that the police will detect more incidents (real or false) in places where the sensors are located. That can distort gunfire statistics and create a circular statistical justification for over-policing in communities of color.

Second, ShotSpotter鈥檚 methodology is used to provide evidence against defendants in criminal cases, but isn鈥檛 transparent and hasn鈥檛 been peer-reviewed or otherwise independently evaluated. That simply isn鈥檛 acceptable for data that is used in court.

The company's sensors automatically send audio files to human analysts when those sensors detect gunshot-like sounds. Those analysts then decide whether the sounds are gunshots or other loud noises such as firecrackers, car backfires, or construction noise. They also triangulate the timing of when sounds reach different microphones to try to establish a location for the noise, and if it is believed to be the sound of a gunshot, they try to determine how many shots were fired and what kind of gun was involved (such as a pistol versus a fully automatic weapon).
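ShotSpotter's actual pipeline is proprietary and has not been independently evaluated, but the underlying time-difference-of-arrival idea described above can be sketched in a few lines. Everything below is illustrative: the sensor coordinates, the 400-meter grid, and the brute-force search are made-up assumptions for demonstration, not the company's method.

```python
import math

SPEED_OF_SOUND = 343.0  # meters per second, roughly, at 20 C

# Hypothetical sensor positions (x, y) in meters -- purely illustrative.
SENSORS = [(0.0, 0.0), (400.0, 0.0), (0.0, 400.0), (400.0, 400.0)]

def arrival_times(source, t0=0.0):
    """Time at which each sensor would hear a sound emitted at `source` at t0."""
    return [t0 + math.dist(source, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(times, step=1.0):
    """Brute-force grid search: find the point whose predicted
    time differences of arrival best match the observed ones.
    Differences are taken relative to the first sensor so the
    unknown emission time t0 cancels out."""
    observed = [t - times[0] for t in times]
    best, best_err = None, float("inf")
    y = 0.0
    while y <= 400.0:
        x = 0.0
        while x <= 400.0:
            predicted = arrival_times((x, y))
            diffs = [t - predicted[0] for t in predicted]
            err = sum((a - b) ** 2 for a, b in zip(observed, diffs))
            if err < best_err:
                best, best_err = (x, y), err
            x += step
        y += step
    return best

# Simulate a shot at a known point, then recover its location
# from arrival times alone.
shot = (123.0, 287.0)
estimate = locate(arrival_times(shot))
```

A real system must also contend with echoes off buildings, overlapping sounds, and imperfect clock synchronization, which is part of why independent testing of the methodology matters.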

ShotSpotter portrays all of this as a straightforward and objective process, but it is anything but. Vice News and the AP note examples of the company's analysts changing their judgments on all of the above types of results (which ShotSpotter disputes). In addition, the company uses AI algorithms to assist in the analysis, and as with all AI algorithms, that raises questions about reliability, transparency, and the reproducibility of results. The company refused a request by the independent security technology research publication IPVM to carry out independent tests of its methodologies.

Further calling into question the appropriateness of ShotSpotter evidence for use in court is a third problem: the company's apparent tight relationship with law enforcement. A ShotSpotter expert admitted in a 2016 trial, for example, that the company reclassified sounds from a helicopter to a bullet at the request of a police department customer, saying such changes occur "all the time" because "we trust our law enforcement customers to be really upfront and honest with us." ShotSpotter also uses reports from police officers as "ground truth" in training its AI algorithm not to make errors. A close relationship between ShotSpotter and police isn't surprising; police departments are the company's customers and the company needs to keep them happy. But that isn't compatible with the use of its tool as "objective data" to convict people of crimes.

Finally, still up for debate is whether ShotSpotter's technology is even effective. We can argue over a technology's civil liberties implications until the end of time, but if it's not effective there's no reason to bother. A number of cities have stopped using the technology after deciding that ShotSpotter creates too many false positives (reporting gunshots where there were none) and false negatives (missing gunshots that did take place). The MacArthur Justice Center's report found that in Chicago, initial police responses to 88.7 percent of ShotSpotter alerts found no incident involving a gun. The company disputes whether this means its technology is inaccurate, pointing out that someone can shoot a gun but leave no evidence behind. But a review of the accuracy debate by IPVM concluded that "while public data does not enable a definitive estimation of false alerts," the problem "is likely significantly greater than what ShotSpotter insinuates" because the company "uses misleading assumptions and a misleading accuracy calculation" in its advertised accuracy rates.

Given all of these problems, communities and the police departments serving them should reject this technology, at least until these problems are addressed, including through full transparency into its operation and efficacy.

Update (10/14/21):

ShotSpotter CEO Ralph Clarke reached out to us to vigorously dispute the sources of information that we relied upon for this post, and also pushed back on the company's critics in a piece published last month in the Buffalo News. Most recently, his company filed a defamation suit against Vice News; its complaint is a voluminous argument for the company's technology. Two points in particular seem worth highlighting:

First, pressed on reports that the company has changed its evaluation of the details of gunshots in court, Clarke told me that the company provides two kinds of data about gunshots: an initial, real-time alert sent to police shortly after a gunshot is detected, and a much more thorough "detailed forensic report" that is prepared for court cases. Clarke said that what has been reported as ShotSpotter "changing its story" reflects the differences between the real-time alerts and the detailed forensic reports.

Second, one of the elements in the reporting on ShotSpotter that alarmed me the most was the reference to the company's use of AI as part of its system. The use of evidence in court derived from AI algorithms raises severe issues of transparency, accuracy, and fairness. Clarke said the company has algorithms that "do the math" in triangulating the location of gunshots based on the timing of acoustic data from its sensors, but, pressed on what that meant, he said they are not opaque deep-learning black boxes, simply algorithms doing math that could otherwise be done by hand. Clarke said a more complex AI algorithm is used to filter the "pops, booms, and bangs" picked up by the company's sensors, passing along the sounds believed to be gunshots for review by human analysts. That is less of a concern; inaccuracies in such an algorithm might result in some missed gunshots but aren't going to lead to unfair evidentiary judgments.

Clarke also pushed back on criticisms of ShotSpotter's efficacy and cost-benefit value. Those involve complex assessments of real-world data as well as value judgments that experts and communities will have to monitor, evaluate, and debate. As always, we don't think any police technology should be deployed or used unless affected communities clearly want it.
