In a deep dive, ProPublica tracks how Tracy Harpster, a deputy chief from Ohio, turned a pompous yet ridiculous claim of knowing “what a guilty father, mother or boyfriend sounds like” into a cottage industry and junk science.
Harpster tells police and prosecutors around the country that they can do the same. Such linguistic detection is possible, he claims, if you know how to analyze callers’ speech patterns — their tone of voice, their pauses, their word choice, even their grammar. Stripped of its context, a misplaced word as innocuous as “hi” or “please” or “somebody” can reveal a murderer on the phone.
It’s not as if there were any empirical basis for this absurd claim. After all, different people have different speech patterns. Different people react to situations differently. Different people use different words, different characterizations, different linguistic tics. But Harpster decided that he possessed some magic power to discern innocent 911 callers from guilty ones. And prosecutors paid for it.
People who call 911 don’t know it, but detectives and prosecutors are listening in, ready to assign guilt based on the words they hear. For the past decade, Harpster has traveled the country quietly sowing his methods into the justice system case by case, city by city, charging up to $3,500 for his eight-hour class, which is typically paid for with tax dollars.
What is this voodoo that Harpster, who came up with the idea after attending a 10-week FBI training course at Quantico, was selling?
Based on patterns he heard in the tapes, Harpster said he was able to identify certain indicators that correlated with guilt and others with innocence. For instance, “Huh?” in response to a dispatcher’s question is an indicator of guilt in Harpster’s system. So is an isolated “please.” He identified 20 such indicators and then counted how often they appeared in his sample of guilty calls.
Huh? Put into action, it looked like this.
The widow said the word “blood,” for example, and that’s a guilty indicator. (“Bleeding,” however, is not.) She said “somebody” at different points, which shows a lack of commitment. “Witnesses to a crime scene should be able to report their observations clearly,” Harpster and his co-author, Adams, wrote. She was inappropriately polite because she said “I’m sorry” and “thank you.” She interrupted herself, which “wastes valuable time and may add confusion.” She tried to divert attention by saying, “God, who would do this?” Harpster and Adams commented: “This is a curious and unexpected question.”
Was there any possibility that this could be true, that there were some universal word choices that differentiated guilty from innocent callers?
Then, in a 2020 study, experts from the bureau’s Behavioral Analysis Unit finally tried to see whether the methods had any actual merit. They tested Harpster’s guilty indicators against a sample of emergency calls, mostly from military bases, to try to replicate what they called “groundbreaking 911 call analysis research.”
Instead, they ended up warning against using that research to bring actual cases. The indicators were so inconsistent, the experts said, that some went “in the opposite direction of what was previously found.”
Yet police and prosecutors kept trying to find some validity, any validity, in this cool new tool to prove who was guilty.
Academic researchers at Villanova and James Madison universities have come to similar conclusions. Every study, five in total, clashed with Harpster’s. The verdict: There was no scientific evidence that 911 call analysis worked.
It’s one thing for police to consider the content of a 911 call in the context of all the other evidence, witness statements for example, and to believe that something sounded “off” about the call, its word choices, responses and collateral statements. Cops have long believed they have magical abilities to “sense” when someone is lying or hiding something. Sometimes they were right. Other times they weren’t. But at no time was this elevated to the lofty position of “science,” given a name like “911 Call Analysis” and the pretense that there are actual rules that apply and can be replicated. Of course it couldn’t be replicated, because it wasn’t science at all.
Junk science can catch fire in the legal system once so-called experts are allowed to take the stand in a single trial. Prosecutors and judges in future cases cite the previous appearance as precedent. But 911 call analysis was vexing because it didn’t look like Harpster had ever actually testified.
Initially, Harpster avoided judicial scrutiny by training police and prosecutors to use his method, then having them regurgitate its conclusions as if they were fact rather than expert opinion, circumventing any test of its admissibility as a scientific method while getting the bought-and-paid-for results before the jury. Eventually, a Michigan judge, and former prosecutor, opened the courtroom door.
The judge in Riley’s case, a former prosecutor named John McBain, was more credulous. He let Merritt testify as an expert and accepted 911 call analysis on its face. McBain explained his reasoning: Harpster’s course is recognized by the Michigan Commission on Law Enforcement Standards. This, McBain said, was proof of 911 call analysis’ value.
Whether admitted as junk science or presented as police fact evidence to circumvent its Daubert/Frye problem, it remains in use by police and prosecutors. And like most pseudo-scientific evidence, it is persuasive with juries, as it lifts the burden of difficult decisions from their shoulders by putting the gloss of science on this wholly baseless analysis.
Next, prime jurors during jury selection and opening arguments about how a normal person should and shouldn’t react in an emergency. Give them a transcript of the 911 call and then play the audio. “When they hear it,” a prosecutor in Louisiana once told Harpster, “it will be like a Dr. Phil ‘a-ha’ moment.” Finally, remind jurors about the indicators during closing arguments. “Reinforce all the incriminating sections of the call,” another prosecutor wrote, “omissions, lack of emotion, over emotion, failure to act appropriately.”
“Juries love it, it’s easy for them to understand,” Harpster once explained to a prosecutor, “unlike DNA which puts them to sleep.”
In the meantime, Harpster has turned his magic into a lucrative training business.
All the while, he has maintained a steady stream of training sessions, often at police conferences. Those conferences, I discovered, appear to be one of the most efficient platforms for spreading junk science. Harpster spoke at more than 130 between 2006 and 2017, according to his resume.
One weekend in October 2019, he addressed more than 100 Arizona police officers and prosecutors at the Orleans Hotel and Casino in Las Vegas. They worked at some of the most powerful agencies in the state, including a local FBI office and the state attorney general’s office.
Good work if you can get it. And while Harpster’s junk science has apparently become widely known and used within law enforcement circles, it has barely made a ripple outside of them, where it could be ripped to shreds for the utter junk it is. Until ProPublica’s deep dive.