On August 19, 2021, the Associated Press published an article titled: “Police Jailed a Man for Murder; Algorithm was Key Evidence”. Unfortunately, it contained a false narrative claiming that ShotSpotter and its “secret” algorithms are somehow primarily responsible for the unjust charging, arrest and extended jail time of a defendant, Mr. Michael Williams, in a Chicago homicide case. Such bogus claims could not be further from the truth.

While we did our best to explain the facts to the AP – and they included a lengthy rebuttal from us in the context of the story – I wanted to provide the facts about four major flaws in the story.

At the outset, I want to state that the shooting death of Safarian Herring — a homicide that remains unresolved — was an unspeakable, senseless tragedy, one that has become all too commonplace in communities across the country. We mourn his loss.

Yet, the facts matter, and the injustices borne from this incident will only be perpetuated if lies are permitted to cover up the facts.

Problem One: The AP wrongly blamed ShotSpotter as being primarily responsible for Mr. Williams’s incarceration.

The facts show that neither ShotSpotter’s initial alert nor its forensic report caused Mr. Williams’s arrest or incarceration, and we believe our forensic report may have contributed to his release in July 2021.

ShotSpotter indeed alerted police to the shooting event at issue in May 2020, but the alert did not lead to a police response or to an arrest of anyone at the scene. Mr. Williams was arrested approximately three months later, after a police investigation. Moreover, according to the AP’s own reporting, a request for a ShotSpotter forensic report was not made until February 1, 2021, approximately six months after Mr. Williams’s arrest. Our forensic reports make clear that this court-admissible evidence applies to instances of outdoor gunfire, not indoor or in-car weapon discharges.

In other words, if the prosecution’s theory relied on ShotSpotter detecting a shot fired from within the car, that theory conflicted with how ShotSpotter would likely have had to testify about the forensic report, given its qualification language. We believe this likely contributed to the charges against Mr. Williams being dropped.

Problem Two: ShotSpotter doesn’t identify, charge, or arrest anyone.

ShotSpotter’s real-time alerts and forensic reports have never, will never, and technologically cannot identify shooters. Our technology can only identify shootings. Our sensors have no video capability, and we do not analyze fingerprints or DNA samples from guns or casings.

Our technology is narrowly focused and extremely precise and powerful in identifying exactly three things: (1) where a shooting occurs; (2) when it occurs; and (3) how it occurs (e.g., round count) combined with the recorded audio snippet of the shooting.

ShotSpotter is studiously neutral in our work and relies only on facts, data and science. While prosecutors primarily request forensic reports, we are beholden to neither the prosecution nor the defense. Our duty is to the court, and our reports are restricted to where and when a shooting occurred and how many rounds were fired. Our forensic experts, who bring strong qualifications and experience, also testify to the preservation and chain of custody of the recorded audio files, which are often played in court so that defense counsel, prosecutors, judges, and jurors can hear the gunshot audio for themselves. To assure the integrity of our reports, we have implemented hashing technology that prevents any undetected alteration of the audio and metadata coming from the sensors, protecting the chain of custody and preserving the digital evidence.
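For readers curious how hashing protects digital evidence in general, the sketch below shows the standard technique: a cryptographic digest is computed when the evidence is captured, and any later alteration to the file changes the digest and is therefore detectable. This is a minimal, generic illustration using SHA-256, not a description of ShotSpotter’s actual implementation; the byte string and function names are hypothetical.

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte string (e.g., a recorded audio snippet)."""
    return hashlib.sha256(data).hexdigest()

def verify_integrity(data: bytes, expected: str) -> bool:
    """Recompute the digest and compare against the one stored at capture time."""
    return sha256_digest(data) == expected

# At capture time: compute and store the digest alongside the audio and metadata.
audio = b"\x00\x01\x02\x03"  # stand-in for raw sensor audio bytes
stored_digest = sha256_digest(audio)

# At trial time: recompute over the presented file and compare.
unaltered = verify_integrity(audio, stored_digest)            # unchanged file matches
tampered = verify_integrity(audio + b"\xff", stored_digest)   # any edit breaks the match
```

Because even a one-byte change produces a completely different digest, a verified match gives the court strong assurance that the audio played is the audio the sensors recorded.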

Problem Three: The facts do not support the false secret algorithm narrative and headline.

It is a well-established, public fact that every single published ShotSpotter real-time alert and every forensic report that is ultimately produced involves human analysis. Our classification algorithms do not publish anything — humans do. This makes our work product better for our customers and for the courts.

Our real-time alerting and classification process is driven by a human reviewer using several inputs. That reviewer plays back recordings from multiple sensors, analyzes acoustic waveforms, reviews sensor participation, and checks the machine-determined probability that the sound was gunfire. The probability assigned to the incident does not by itself determine whether the reviewer publishes or dismisses an alert. This work demands diligence and excellence from our reviewers, who meet our high standards through rigorous training and constant team monitoring.

Humans prepare our forensic reports, which require an average of eight hours to produce. The only thing our classifier algorithm does independently is dismiss non-gunshot detections (e.g., fireworks, helicopters). The benefit is that it filters out a significant volume of non-gunshot incidents, allowing reviewers to focus their time on sounds more likely to be gunfire.
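The division of labor described above — a machine pre-filter that dismisses clear non-gunshot classes while routing everything else to a human reviewer — can be sketched as simple triage logic. This is an illustrative sketch only: the class labels, data shapes, and function names are hypothetical, not ShotSpotter’s actual schema or algorithm.

```python
from dataclasses import dataclass

# Classes the machine classifier may dismiss on its own (hypothetical labels).
AUTO_DISMISS = {"fireworks", "helicopter"}

@dataclass
class Detection:
    sensor_id: str
    predicted_class: str  # output of an upstream acoustic classifier

def triage(detections):
    """Auto-dismiss clear non-gunshot classes; route everything else to a human reviewer."""
    for_review, dismissed = [], []
    for d in detections:
        (dismissed if d.predicted_class in AUTO_DISMISS else for_review).append(d)
    return for_review, dismissed

queue = [
    Detection("S1", "possible_gunfire"),
    Detection("S2", "fireworks"),
    Detection("S3", "helicopter"),
]
for_review, dismissed = triage(queue)
```

The design point is that the machine only ever removes obvious non-events; it never publishes an alert on its own, so every published alert reflects a human judgment.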

Problem Four: The article falsely represented ShotSpotter’s efficacy through gross omissions of facts about ShotSpotter.

While the AP acknowledged that we have 120 customers, the story failed to mention the tremendous wealth of success stories and positive results our service has generated for agencies across the country. We know our solution saves lives; without it, the vast majority of gunfire incidents would draw no police response, gun violence would become increasingly normalized, and deterrence would be limited. Indeed, the Brookings Institution independently reported that 88 percent of gunfire incidents go unreported.

As you know, our solution allows police to rapidly and precisely respond to criminal gunfire, leading to deterrence, increased evidence recovery to enhance investigations, and, most importantly, saving the lives of gunshot wound victims. To name just one example, the Oakland Police Department publicly reported that in 2020 they were able to get first responder trauma intervention to victims in more than 100 shooting cases exclusively because of ShotSpotter alerts.

In addition, it is false to say that ShotSpotter has not been tested in court. ShotSpotter evidence and expert witness testimony have been successfully admitted in over 200 court cases in 20 states. ShotSpotter evidence has prevailed in 13 successful Frye challenges and one successful Daubert challenge in courts throughout the United States. As in any court proceeding, that evidence and testimony are open to cross-examination, whether from the defense or the prosecution. In this way, ShotSpotter is like any other type of evidence introduced in a criminal case. Just as with fingerprints, both sides have the opportunity to inspect the evidence and engage in cross-examination.

We are working hard to ensure our message is heard accurately and in context. We know the power of our product and how it helps you to save lives, reduce gun violence, and make your communities safer.

Very truly yours,

Ralph Clark