At ShotSpotter, we appreciate and respect the ACLU’s mission to defend and preserve individual rights and liberties. With its influence and prominent platform for doing good, though, comes the responsibility to be accurate and fair. Unfortunately, the August 2021 article titled “Four Problems with the ShotSpotter Gunshot Detection System” contained a number of false assertions that we correct today so readers can have a clear view of our technology, who we are as a company, and the value we bring to the world.
ShotSpotter’s Impact on Communities of Color
Our coverage areas are determined by police using objective historical data on shootings and homicides to identify areas most impacted by gun violence. The sad fact is that gun violence disproportionately impacts communities of color. According to the CDC, gun violence is the number one cause of death among young Black men between the ages of 18 and 44, and it is the number one risk to Black and Brown communities nationwide.
“Our minority communities are most adversely affected by violent crime. That’s a fact, whether people want to admit it or not,” Fresno Police Chief Paco Valderrama said in a recent interview. “A technology like ShotSpotter helps us protect these communities, helps us respond more adequately, faster, with more information. I find it really difficult to argue against that.”
Responses to a 2019 ShotSpotter Survey Evaluation Report in Cincinnati indicate that residents in the Price Hill neighborhood find it hard to argue against the value of ShotSpotter technology, with 89% agreeing they would recommend ShotSpotter to another neighborhood.
It is important to remember that more than 80% of gunfire incidents nationwide are not reported to the police, a staggering statistic. We believe responsible city leadership cannot ignore criminal gunfire; it must acknowledge and respond to these incidents. All residents who live in communities experiencing persistent gunfire deserve rapid police response, which gunshot detection technology like ShotSpotter enables.
ShotSpotter as Evidence in Court Cases
The ACLU also sought to raise questions about the credibility of ShotSpotter evidence in court – questions that fall apart in the face of the facts.
To start, ShotSpotter evidence and expert witness testimony have been successfully used in over 200 court cases in 20 states. Our detailed forensic reports (DFRs) are prepared by experienced forensic engineers and take eight hours on average to compile. Neither police nor prosecutors have input into how forensic reports are written and prepared or what they say. Our technology provides the facts about where and when guns were fired, regardless of whether the facts support a conviction or an acquittal.
Our evidence has repeatedly survived scrutiny by courts in at least 14 Frye hearings and 2 Daubert hearings. For context, a Frye hearing applies the Frye standard, the general acceptance test used in United States courts to determine the admissibility of an expert’s scientific testimony. A Daubert hearing applies the Daubert standard, under which a trial judge assesses whether an expert witness’s scientific testimony is based on scientifically valid reasoning that can properly be applied to the facts at issue.
ShotSpotter would never fabricate evidence, and any assertion along those lines would not only be an outrageous and false allegation but would also describe a criminal act. It is worth noting that a growing number of media outlets, including the Associated Press, the Daily Mail, The Register, the University of Illinois at Chicago Law Review, Data Science Central, and HotHardware have recently retracted, corrected, or clarified such a demonstrably false claim.
ShotSpotter’s Use of Algorithms
The ACLU rightly corrected prior false statements about ShotSpotter’s use of a machine classification algorithm in an update to the article after learning the facts about how it works. We once again share the facts here.
ShotSpotter uses two primary algorithms in the real-time analysis of sounds: first, an algorithm to determine the location of “pops, booms, and bangs,” and second, a machine learning algorithm for filtering out non-gunshot sounds.
The first algorithm determines the location of sounds heard in an area based on the speed of sound and the times at which the sound reaches different sensors. This “time of arrival” technique is based on widely accepted scientific principles and has been used since World War I, as explained in this whitepaper.
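The time-of-arrival idea can be sketched numerically. The following is a minimal illustration under simplifying assumptions, not ShotSpotter’s implementation: it assumes known 2D sensor positions, a fixed speed of sound, and exact arrival times, then solves the linearized range equations for the source location and emission time.

```python
import numpy as np

C = 343.0  # approximate speed of sound in air, m/s (varies with temperature)

def locate_source(sensors, t, c=C):
    """Linearized time-of-arrival localization.

    Each sensor i gives ||s - p_i||^2 = c^2 (t_i - t0)^2, where s is the
    unknown source position and t0 the unknown emission time. Subtracting
    the i = 0 equation from each other equation leaves a system that is
    linear in (x, y, t0); at least 4 sensors are needed for 3 unknowns.
    """
    p = np.asarray(sensors, dtype=float)
    t = np.asarray(t, dtype=float)
    A = np.column_stack([
        2 * (p[1:] - p[0]),            # coefficients of x and y
        -2 * c**2 * (t[1:] - t[0]),    # coefficient of t0
    ])
    b = (np.sum(p[1:]**2, axis=1) - np.sum(p[0]**2)
         - c**2 * (t[1:]**2 - t[0]**2))
    x, y, t0 = np.linalg.lstsq(A, b, rcond=None)[0]
    return x, y, t0

# Synthetic check: four sensors at the corners of a 600 m square,
# a known source, and exact (noise-free) arrival times.
sensors = [(0.0, 0.0), (600.0, 0.0), (0.0, 600.0), (600.0, 600.0)]
src, emit_t = (150.0, 420.0), 1.0
times = [emit_t + np.hypot(sx - src[0], sy - src[1]) / C for sx, sy in sensors]
est_x, est_y, est_t0 = locate_source(sensors, times)
```

With noisy real-world timestamps one would use more sensors and a least-squares fit over the overdetermined system, which the same `lstsq` call already handles.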
The second algorithm used to classify sounds is the one that tends to generate alarm and which we think merits further discussion.
There is a misconception that ShotSpotter relies on this machine learning classification algorithm to alert police departments of gunfire. In reality, ShotSpotter’s machine learning system is used to filter out obviously non-gunshot impulsive sounds such as aerial fireworks and helicopters. This frees up human reviewers at our Incident Review Centers to focus on accurately classifying a smaller number of sounds and alerting law enforcement only when they are confident a sound is gunfire.
The human reviewers at our IRCs undergo a rigorous training and certification process as well as in-service training. We hold each reviewer to a 99.9% aggregate classification accuracy on a monthly basis. We have effective controls and processes to monitor and maintain our accuracy levels and heavily invest in the IRC’s tools, training, and management. These highly trained reviewers – not machine-learning algorithms – make the decision on whether or not to publish an alert to law enforcement.
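The division of labor described above can be sketched as a simple two-stage pipeline. This is purely illustrative: the class names, score field, and threshold are our own assumptions for the sketch, not ShotSpotter internals. The point it captures is that the classifier only discards obviously non-gunshot sounds, while a human reviewer makes every publish decision.

```python
from dataclasses import dataclass

@dataclass
class Impulse:
    """A detected impulsive sound (hypothetical representation)."""
    audio_id: str
    ml_gunshot_score: float  # classifier output in [0.0, 1.0]

def triage(impulses, reviewer_confirms, discard_below=0.1):
    """Two-stage review: the model only *filters out* clearly
    non-gunshot sounds; a human makes every alert decision."""
    alerts = []
    for imp in impulses:
        if imp.ml_gunshot_score < discard_below:
            continue  # e.g. fireworks, helicopters: never reaches a reviewer
        if reviewer_confirms(imp):  # human decision, not the model's
            alerts.append(imp.audio_id)
    return alerts

# Example: the low-score sound is filtered out automatically; of the
# remaining two, the (stubbed) human reviewer confirms only one.
impulses = [Impulse("a", 0.02), Impulse("b", 0.90), Impulse("c", 0.50)]
alerts = triage(impulses, lambda imp: imp.audio_id != "c")
```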
ShotSpotter and Privacy
The ACLU has previously reviewed ShotSpotter’s impact on privacy and concluded: “we didn’t think it posed an active threat to privacy.” In part, that was because ShotSpotter voluntarily submitted to a third-party privacy audit from NYU’s Policing Project. The report stated:
“Although ultimately concluding that the risk of voice surveillance was extremely low in practice, we offer ShotSpotter Technologies, Inc. (SST) a variety of recommendations on how to make its gunshot detection product even more privacy protective. As detailed in our report, SST has adopted nearly all of our recommendations verbatim, with only slight modifications or qualifications based on how ShotSpotter functions.”
In addition, this audit received widespread acclaim from members of academia and privacy commissions completely unaffiliated with the audit.
“I think the ShotSpotter audit is a really welcome development,” said Catherine Crump, director of the Samuelson Law Technology & Public Policy Clinic at the University of California at Berkeley School of Law. “There has been far too little attention paid to the details of how surveillance technologies operate: what data they collect, how that data is shared, how that data is kept.”
Brian Hofer, a privacy advocate and chair of the Oakland-based Privacy Advisory Commission, added: “We haven’t had a vendor that’s gone so far out of its way to do everything correctly. [ShotSpotter] didn’t just do a privacy audit or just talk to the ACLU or just talk to experts. After, they amended their practices and really made these significant steps in the right direction.” He added that he hopes other companies come to see a privacy audit as a competitive advantage as well.
Actions speak louder than words, and ours show that we take privacy seriously. In addition to the privacy audit, the technology has received unanimous approval from the municipal privacy commissions in San Francisco and Oakland, with Oakland holding the distinction of having the strongest surveillance oversight law in the country.
ShotSpotter’s Effectiveness, Accuracy, and Customer Retention
Like any business, our customers’ feedback is the best barometer of ShotSpotter’s effectiveness. When it comes to gunshot detection technology, accuracy is paramount. We are proud to report a 97% accuracy rate, including a 0.5% false-positive rate, for real-time detections across all customers over the last three years. Our accuracy rate is derived directly from police department customer feedback that is reported to ShotSpotter, and has been independently confirmed by Edgeworth Analytics, a data science firm in Washington, D.C. Additionally, our 98% customer retention rate speaks for itself, verifying that our system not only works well, but benefits customer communities.
Critics often rely on the misconception that the amount of evidence collected at the scene of a shooting somehow equates to our technology’s accuracy. Nothing could be further from the truth. Police work frequently confronts real-world challenges that affect how much evidence is collected and, with it, the accuracy of reporting. Myriad factors come into play here: weapons that do not eject shell casings, casings picked up by perpetrators, officers struggling to find shell casings in the dark, suspects immediately fleeing the scene to avoid arrest, police data reporting procedures, and many other issues.
Additionally, we’d like to clarify that there is zero evidence that ShotSpotter alerts result in police arriving on a scene “hyped up” or behaving any differently than they would to a 911 call. In fact, ShotSpotter equips police officers with valuable information prior to arriving at the scene of a gunshot incident, such as the number of rounds fired and whether there are automatic weapons or multiple shooters. More information leads to more effective preparation and responses.
ShotSpotter’s Impact — Saving Lives and Reducing Gun Violence
The most important aspect of ShotSpotter that the ACLU piece ignored is that ShotSpotter saves lives. Period.
Academic research shows that ShotSpotter alerts result in faster response times by police officers, as well as a reduction in transport time to the hospital. Police agencies and cities consistently report that ShotSpotter helps them find and aid victims when no one called 911. In Oakland, CA, 101 gunshot wound victims were located and aided by police officers due to ShotSpotter alerts in 2020. In Pittsburgh, PA, ShotSpotter notifications saved the lives of 13 people between 2019 and 2020. In 2021, the U.S. Conference of Mayors recognized West Palm Beach, Florida’s use of the ShotSpotter system as a “best practice” for enabling quick emergency response times. Furthermore, scrolling Twitter any day of the week reveals reports of local police responding to alerts to find victims of gun violence.
Moreover, two independent studies show a reduction in gun violence when comparing ShotSpotter coverage areas to non-coverage areas. For example, in St. Louis County, Missouri, the eight police beats using ShotSpotter saw a 30% decline in gun-related assaults compared to beats without the technology. And Cincinnati, Ohio saw a 46% reduction in violent crime where ShotSpotter was deployed.
More examples of ShotSpotter’s impact on public safety are found on our Results page.
The Big Picture
ShotSpotter respects the ACLU and has always sought to have an open dialogue. In this case, we strongly disagree with the ACLU’s conclusion about our technology – first and foremost because it was not based on the facts. And robust public debate needs to be informed by the facts and nothing else, a principle with which we know the ACLU agrees. The truth is this: ShotSpotter detects unreported gunfire to enable a rapid police response, which is why over 120 cities are using our technology to save lives, improve evidence collection, and keep their communities safe.