There’s rarely time to write about every cool science-y story that comes our way. So this year, we’re once again running a special Twelve Days of Christmas series of posts, highlighting one science story that fell through the cracks in 2020, each day from December 25 through January 5. Today: the structure of folklore can help explain how unrelated facts and false information connect into a compelling narrative framework that can then go viral as a conspiracy theory.
Mark Twain is often credited with the saying, “A lie can travel halfway around the world while the truth is still putting on its shoes.” Twain never actually said it; it appears to be a mutated version of something essayist Jonathan Swift once wrote—a misattribution that aptly illustrates the point. The same is true of a good conspiracy theory, composed of unrelated facts and false information that somehow get connected into a loose narrative framework, which then spreads rapidly as perceived “truth.” According to a June paper published in PLOS ONE, the structure of folklore can yield insights into precisely how these connections get made, and hence into the origins of conspiracy theories.
“We tell stories all the time, and we use them to explain and to signal our various cultural ideologies, norms, beliefs, and values,” co-author Timothy Tangherlini, a self-described computational folklorist at the University of California, Berkeley, told Ars. “We’re trying to get people either to acknowledge them or align with them.” In the case of conspiracy theories, those stories can have serious real-world consequences. “Stories have been impactful throughout human history,” he said. “People take real world action on these. A lot of genocide can be traced back to certain stories and ‘rumors,’ as well as conspiracy theories.”
Tangherlini and his co-authors at the University of California, Los Angeles, combined their knowledge of folklore with machine learning to analyze some 18,000 posts from Reddit and Voat discussion boards between April 2016 and February 2018, pertaining to the thoroughly debunked conspiracy theory dubbed “Pizzagate.” They then used that data to produce a graphic representation of the emerging narratives, with multiple layers representing the various subplots. Relationships between key people (“actants”), places, things, organizations, and other elements were indicated by connecting lines within and among those layers.
Granted, there’s a lot of noise in social media forums, with plenty of irrelevant pieces. But the AI enabled Tangherlini et al. to tease out the hidden narratives that fed into the Pizzagate conspiracy theory and to determine the difference between the storytelling elements of a debunked conspiracy and those of a fact-based, real-world conspiracy.
They found that conspiracy theories tend to form around certain narrative threads that connect various characters, places, and things across discrete domains of interaction that are otherwise not aligned. It’s a fragile construct: cut one of those crucial threads, and the story loses cohesion, and hence its viral power. This is not true of a factual conspiracy, which typically can hold up even if certain elements of the story are removed.
Pizzagate, for example, emerged during the 2016 presidential election, after the March spear-phishing hack of the personal emails of then-Democratic candidate Hillary Clinton’s campaign manager, John Podesta. Wikileaks published the emails in November 2016, and false rumors (or “creative interpretations,” if one is feeling charitable) began swirling that the Podesta emails contained coded messages about an alleged human trafficking and child sex ring. (Meanwhile, mainstream liberals were obsessing over Podesta’s apparently controversial recipe for risotto.)
The rumors soon blossomed into a full-scale conspiracy theory connecting high-ranking Democratic party officials and several US restaurants, most notably the Comet Ping Pong pizzeria in Washington, DC. The hoax spread like wildfire on 4chan, 8chan, subreddits (/r/The_Donald and /r/pizzagate), Twitter, and various alt-right and conservative media outlets, including InfoWars. (InfoWars host Alex Jones would eventually apologize to Comet Ping Pong’s owner, James Alefantis, in February 2017 for spreading the conspiracy theory, under threat of a libel lawsuit.)
Alefantis and several staff members received multiple death threats from true believers as the conspiracy hoax spread far and wide. The mania culminated on December 4, 2016, when 28-year-old Edgar Maddison Welch of North Carolina came to DC and fired three shots from an AR-15-style rifle into the pizzeria—convinced he would be a hero for rescuing the alleged child sex slaves being held in the restaurant’s non-existent basement. Mercifully, no one was injured, and Welch surrendered to police. He was found guilty of assault and firearm charges and sentenced to 4-1/2 years in prison, apologizing during sentencing for his “foolish and reckless” behavior.
Per Tangherlini et al.’s analysis, the Pizzagate conspiracy centered on Hillary Clinton, clearly a major player in Democratic politics in 2016—that would be one domain of interaction. As a mom, she might belong to a casual dining/going out for pizza domain, which (in the minds of conspiracy theorists) links her to Alefantis and Comet Ping Pong. John Podesta and his brother Tony belong to yet another domain (the Podesta family), and also like pizza, which would link them to Alefantis and the casual dining domain. And of course, Podesta’s affiliation with Clinton puts him in the Democratic politics domain.
“You’ve got these three domains that wouldn’t really interact, but they have alignments between them and those became important” in the minds of conspiracy theorists, Tangherlini said. This then mushrooms into coded messages in Podesta’s emails, child sex trafficking, and so forth, fueled by the Wikileaks component. The narrative frameworks around conspiracy theories typically build up and stabilize fairly quickly, compared to factual conspiracies, which often take years to emerge, according to Tangherlini. Pizzagate stabilized within one month of the Wikileaks dump and remained relatively consistent for the next three years.
The good news is that as quickly and easily as a conspiracy theory forms, it can also fall apart, separating back into discrete non-interacting domains. In the case of Pizzagate, remove the Wikileaks element, and the other connections simply don’t hold up. “It’s a classic network thing,” said Tangherlini. “Which nodes and edges do I have to delete to get it to fall apart? In this conspiracy, the Wikileaks email dump and how theorists creatively interpret the content of what was in the emails are the only glue holding the conspiracy together.”
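The deletion test Tangherlini describes can be sketched with a toy graph. To be clear, this is a hypothetical illustration, not the paper’s actual network: the node names and edges below are loosely based on the actants discussed above, and the code simply counts connected components before and after deleting a node.

```python
from collections import deque

# Hypothetical toy network, loosely based on the actants discussed above
# (NOT the paper's actual data): edges are the narrative links conspiracy
# theorists drew between people, places, and domains.
edges = [
    ("clinton", "podesta"),            # Democratic-politics domain
    ("podesta", "tony_podesta"),       # Podesta-family domain
    ("podesta", "wikileaks"),          # the hacked-email dump
    ("wikileaks", "comet_ping_pong"),  # "coded message" reading of the emails
    ("comet_ping_pong", "alefantis"),  # casual-dining domain
]

def components(edges, removed=frozenset()):
    """Count connected components after deleting the nodes in `removed`."""
    adj = {}
    for a, b in edges:
        if a in removed or b in removed:
            continue
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)
    seen, count = set(), 0
    for start in adj:
        if start in seen:
            continue
        count += 1                     # found a new component; flood-fill it
        queue = deque([start])
        while queue:
            node = queue.popleft()
            if node not in seen:
                seen.add(node)
                queue.extend(adj[node] - seen)
    return count

print(components(edges))                         # 1: a single coherent story
print(components(edges, removed={"wikileaks"}))  # 2: the story falls apart
```

In graph terms, the “wikileaks” node here is an articulation point (cut vertex): delete it and the narrative splits into disconnected pieces, which mirrors the fragility the researchers describe.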
That said, it’s also fairly easy for a conspiracy theory to gain a second life with new interconnected circles. “It’s not like you need a lot of actants and relationships to put them back together,” Tangherlini said. Last June, Pizzagate found renewed popularity with young people on TikTok, where the hashtag garnered nearly 80 million views.
Tangherlini et al. tested all of this against a factual conspiracy: the 2013 Fort Lee lane closure scandal—aka “Bridgegate”—that helped tank former New Jersey Governor Chris Christie’s presidential aspirations. On September 9, 2013, two of the three toll lanes reserved for local traffic in Fort Lee, New Jersey, were closed without notice during the morning rush hour. (The other lanes at that toll plaza feed onto the upper level of the George Washington Bridge, which connects Fort Lee to New York City.) The resulting gridlock delayed school transportation and hampered the ability of police, paramedics, and firefighters to respond to emergency calls. The issue wasn’t resolved until Friday, September 13, after Port Authority Executive Director Patrick Foye directly intervened.
Initially, PA Deputy Executive Director Bill Baroni (a Christie appointee) told staffers it was part of a traffic flow study, and that giving advance notice would have adversely impacted the findings. But eventually hundreds of emails and internal documents came to light suggesting that the closures were orchestrated by Christie loyalists—Baroni; PA director of interstate capital projects David Wildstein (a former Christie high school chum); and Christie’s deputy chief of staff, Bridget Anne Kelly—apparently as political retaliation against Fort Lee’s mayor, Democrat Mark Sokolich, after Sokolich declined to endorse Christie in the 2013 New Jersey gubernatorial election.
Wildstein, Baroni, and Kelly were all found guilty of felony conspiracy in November 2016. Christie himself denied any involvement in the closures and pronounced himself “embarrassed and humiliated” by his staff’s behavior in a January 2014 press conference. An official misconduct case was filed against Christie, but prosecutors ultimately dropped the complaint because they didn’t believe Christie’s guilt could be proven beyond a reasonable doubt. Kelly’s and Baroni’s convictions were later overturned by the US Supreme Court. (Wildstein entered into a plea agreement in exchange for testifying against Kelly and Baroni, and got probation.)
“Bridgegate fascinated me because, well, why would you do that?” Tangherlini said. “The stakes are so low and the impact is potentially so high. People were stuck in traffic for days.” So is that factual conspiracy the same thing as a conspiracy theory from a narrative structure perspective? The answer is no. The team couldn’t find any set of nodes or edges in the network—no key story element—that they could delete to make the network fall apart.
Tangherlini attributes this to the fact that even though all the major figures in Bridgegate had multiple points of connection, they all belonged to the same domain of interaction: New Jersey politics. “We’re not aligning disparate domains,” he said. “The narrative framework is robust to deletion. That might actually be one of the telltales between an actual conspiracy and a conspiracy theory.”
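That robustness, too, can be sketched as a toy graph. Again, this is a hypothetical illustration with names taken from the Bridgegate account above and edges invented for the sake of the example: because every actant sits in the same densely connected domain, no single deletion disconnects the network.

```python
# Hypothetical toy network for the Bridgegate case (names from the account
# above, edges invented for illustration): one domain, densely connected.
edges = [
    ("christie", "baroni"), ("christie", "wildstein"), ("christie", "kelly"),
    ("baroni", "wildstein"), ("baroni", "kelly"), ("wildstein", "kelly"),
    ("baroni", "port_authority"), ("wildstein", "port_authority"),
    ("kelly", "port_authority"),
]

def is_connected(edges, removed=frozenset()):
    """Union-find check: do the surviving nodes form a single component?"""
    nodes = {n for edge in edges for n in edge} - set(removed)
    parent = {n: n for n in nodes}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for a, b in edges:
        if a not in removed and b not in removed:
            parent[find(a)] = find(b)      # merge the two components
    return len({find(n) for n in nodes}) <= 1

nodes = {n for edge in edges for n in edge}
# Deleting any single actant leaves the narrative network intact:
print(all(is_connected(edges, removed={n}) for n in nodes))  # True
```

Unlike the conspiracy-theory sketch, this graph has no articulation point: every pair of actants is linked by multiple independent paths, which is what “robust to deletion” means in network terms.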