Social media’s business model of personalized virality is incompatible with democracy, agreed experts at a recent Harvard Law School discussion on the state of democracy. Social media doesn’t merely threaten democracy in the future, they said, but may have already done considerable damage. As ethicist Tristan Harris argued during the online event, “You have to appeal to the [Facebook] algorithm to get elected; you have to appeal to the algorithm to get attention. The algorithm has primacy over media, over news, over newspaper publishers, over each of us, and it controls what we do.”

The panel, “Social Media and Democracy,” was the third session of the Harvard Law lecture series, “Democracy.” In his opening remarks, Harvard Law Professor Lawrence Lessig noted how profoundly the climate has changed since the optimistic early days of the internet. “When I was at Stanford [in 2009], if we’d had a panel about the internet and democracy, it would have been an unambiguous celebration about the potential the internet was producing. The first internet president had been elected, and many people attributed the opportunity to motivate and rally people to the potential of the technology. It’s hard to recover that sense of unambiguous good.”

Stanford Law School Professor Nate Persily, co-editor of “Social Media and Democracy: The State of the Field, Prospects for Reform,” argued that social media (particularly Facebook and Google) has fostered a more toxic political dialogue that has infected politics at large, accelerating polarization and the decline of legacy media.

“A political communication system that privileges virality [also] privileges certain kinds of candidacies, strategies and communication — those that appeal to outrage and emotion,” he said, adding that the anonymity of online communication has encouraged hate speech, while Facebook groups and similar venues have stoked conspiracy theories. Meanwhile, the lack of national control or regulation of social media has not only allowed Russian agents to influence U.S. elections but has also given unprecedented power to two companies.

“Google and Facebook are different to any media companies that pre-existed them,” he said. “They have more power over the information ecosystem than any institution since the pre-Reformation Catholic Church. Their algorithms and their content moderation policies are taking the form of law.”

He said that virality — the ability of any information, true or false, to spread quickly — is ironically both the most democratic feature of social media and its biggest threat to democracy. “We’ve eliminated the monopoly of the three white guys who determine what’s news at 5 p.m. every day.” Yet this standardized news has often been replaced by disinformation and hate speech. “I think the genie is out of the bottle there, and that’s why we’ve turned to the rules Facebook and Google have to apply — because governments aren’t really stepping up.”

Tristan Harris is co-founder and president of the Center for Humane Technology, which was recently featured in Netflix’s “The Social Dilemma,” and he invoked that film’s characterization of social media as “simultaneous utopia and dystopia.” Yet he argued that the dystopian threats can outweigh the benefits. “We have an incompatibility with social media’s business model of personalized virality and democracy.” Specifically, he decried the intersection of “godlike technology” and free speech. “We’re talking about free speech when we have a trillion-dollar market cap AI pointed at your brain, designed to find the next perfect boogeyman to light up your nervous system. That is not the same thing as what we used to call free speech when we were hanging out on the town square.”

Social media, he said, erodes both education and social solidarity. “[It] can’t not polarize the population. No matter where you stand — if masks are your thing, or vaccines, or critical race theory — it doubles down on your perspective or reminds you why the other side is wrong.” These extremes, he added, are in turn adopted by political campaigns.

As a concrete example, he cited Facebook whistleblower Frances Haugen, whose recent testimony — that Facebook was amplifying hate speech and misinformation for its own gain — was initially met with bipartisan support. “There was a particular Republican senator who never comes down positively on these issues, and I saw him come down very positively. Frances was invited to meet with that senator. Then a story went viral and you know what it said? That she was a fake whistleblower, she was a Democratic plant, she was funded by a billionaire and that this is all a setup by the government to control and silence her free speech. Now, why would that story go viral? Because it’s an incendiary topic.” The senator then called off the meeting because the story had inflamed his base.

Thus, he said, politicians as well as political commentators now tailor their speech according to what will get shared. “If [the algorithm] is rewarding the worst instincts of our nature, and the most divisive fault line in society, and you run society through that for ten years, it is no surprise that you don’t just get a broken fragmented society, but one that cannot make democratic decisions in the face of more crises.” This, in turn, fuels the drive toward authoritarianism.

Though Lessig invited him to offer a rebuttal, Columbia Law School Professor Jamal Greene, a co-chair of Facebook’s oversight board, said he concurred with most of Harris’ points. But he outlined the particular challenges faced by the board, which hears appeals of content moderation decisions and makes policy recommendations to the company. Misinformation, he noted, is a tough thing to define, much less stamp out. “We like to be able to express ourselves, often in ways that an algorithm would catch as misinformation. Designing systems that reliably take down misinformation is not the same as [for example] designing systems that reliably take down human nipples. The spread of misinformation on social media is symptomatic of a larger problem that extends to cable news and lots of other places.

“There’s lots of good reasons not to trust social media companies to police misinformation; there’s lots of reasons not to trust governments [either],” Greene said. “Building out trusted institutions that are able to do that is the challenge.”

One of the audience questions came from former Facebook Vice President Elliot Schrage ’86, who asked how the panelists would modify the application of the First Amendment to public forums. Replied Persily, “I would chip away at Citizens United in terms of the rights of these social media monopolies. I worry that lots of regulations that I would propose could be seen as violating those corporations’ First Amendment rights, and that’s where I would change things.” Countered Greene, “I’m not sure how much First Amendment law needs to change. If I were pushed on this question, I would encourage more context sensitivity when it comes to the First Amendment, so we care as much about what the government is trying to accomplish as about whose individual rights are challenged.”