Conspiracy theories existed long before social media, but 21st-century technologies — and the algorithmic models and financial incentives that govern them — have only made the phenomenon harder to combat, according to two Harvard fellows who have studied and battled both misinformation and disinformation.
“What social media has done is made it really hard for us to have a shared reality,” said Jesselyn Cook, a 2025 Nieman Fellow and journalist who recently published a book about the QAnon movement. “All of us are traversing different versions of it online … this digital technology has accelerated our descent into suspicion and distrust.”
Cook was joined on a panel by Nieman Fellow Ben Reininga, a former head of editorial at Snapchat who is also a fellow at the Berkman Klein Center. Their appearance was part of a speaker series hosted by the Berkman Klein Center for Internet & Society’s Institute for Rebooting Social Media.
Launched in 2021 and led by two faculty directors — Jonathan Zittrain ’95, the George Bemis Professor of International Law and a professor of computer science, and James Mickens, an associate professor of computer science — the Institute for Rebooting Social Media is a three-year effort to accelerate addressing social media’s most urgent problems, including misinformation, privacy breaches, harassment, and content governance.
Zittrain moderated the Nov. 20 event, called “The Impact of Conspiracy Theories: At Home and Online.” During the conversation, Reininga discussed his experiences with content moderation at Snapchat while Cook discussed her book, “The Quiet Damage: QAnon and the Destruction of the American Family,” which highlights how people’s participation in the QAnon movement affected them and their families. The pair also discussed ways to mitigate the harmful effects of social media, including the spread of conspiracy theories.
As some of the falsehoods from the 2024 presidential campaign may foretell — pet-eating immigrants, for instance — that won’t be easy, they said.
“Facts can’t fix this,” Cook said.
Reininga noted a report earlier this year from the Carnegie Endowment for International Peace that showed fact-checking and labeling untrue content have only modest effects. In fact, the pair said, top-down labeling contributes to the very mentality that makes people more susceptible to conspiracy theories: They feel they are being dictated to by “elites.”
Reininga said some of the work he did at Snapchat showed that news delivered by professional journalists who were not identified as such — for instance “a guy called Jonathan” — was better received than news from traditional media outlets.
“It’s almost like some of the institutional markers or the thing that makes a news organization look like a polished news organization have gone from being a [symbol] for trust … to actually a negative relationship,” Reininga said. “People don’t trust the New York Times because it looks like the New York Times.”
On the other hand, he acknowledged, “If you trust a guy called Jonathan to tell you your news, you’ll trust anyone else.”
In her work, Cook said she learned that what followers of QAnon and its conspiracy theories had in common wasn’t a particular socioeconomic, educational, or ideological background, but a feeling of being unfulfilled or victimized. Family members of believers who had success in helping them let go of their conspiracy beliefs did not try to educate them, she said.
“Try to put aside the ‘what’ of the belief … and try to understand the ‘why,’” she said. “You’re going to have a lot more progress [that way] instead of trying to cram facts down their throat. ‘How can I help you meet this need in an offline, productive, healthy way?’”
Cook said she had seen in her work that community-type content moderation — such as X’s Community Notes feature, which allows users to add context to posts — is more effective than any direct action taken by a platform.
But Reininga pointed out that user-flagged content at Snapchat often involved accounts featuring queer or interracial couples.
“You see people’s prejudices,” he said. “If you let the Snapchat community — or any community — decide what was or wasn’t acceptable, you’d never see two men holding hands … You can’t just let the people decide.”
Cook and Reininga agreed that traditional media have a role to play in combatting falsehoods. Cook said media outlets should focus more of their resources and reporting on the broad swath of Middle America rather than on the large coastal cities that dominate much of the news. Reininga said the media need to engage people more on their platforms in the first place.
“Most of them really struggle to reach people on the platforms,” he said. “I’m very interested in an idea that’s less about combatting misinformation and more about promulgating good information — not in the positive sense, but in the rigorous sense. In other words, put more news on the platforms.”
Reininga said he had recently looked at a dataset of 100,000 TikTok posts tagged as relating to the election and was surprised by how few had anything substantive to say.
“Most of them were memes and jokes,” he said. “Sure, misinformation is a problem, but I’m interested in the idea that another problem is you’re getting news from a place where there isn’t any.”
Above all, the economics of social media must change for any real progress to be made, the speakers agreed. Platforms and content creators are paid more by advertisers when posts go viral, regardless of their truth, so creators are incentivized to produce and amplify eye-catching content.
“It’s not so much about what we’re allowed to say or what we’re not, it’s more how this content is treated — what is eligible for monetization and algorithmic amplification,” Cook said. “I wouldn’t mind seeing a little bit more overly aggressive rules in place for dialing down that amplification and seeing how this content performs on its own without this unnatural boost.”
Reininga agreed, saying “If you give people an incentive to post a certain sort of content, it is almost impossible to create a moderation infrastructure that will stop them.”
Zittrain summed up the dynamic succinctly during the event’s Q&A session.
“It’s the power of incentives and the power of economics,” he said. “Markets eat laws or norms for breakfast.”