Does Facebook or TikTok have the right to make editorial decisions about the content on its platforms?

On February 26, the United States Supreme Court will take up this deceptively simple question, one that could nonetheless have a monumental impact on the way the internet looks and works moving forward. The suit, NetChoice v. Paxton, is one of three significant sets of cases involving social media that the Court will decide this term, as Noah Feldman, the Felix Frankfurter Professor of Law, told Harvard Law Today last October.

In 2021, Texas Governor Greg Abbott signed House Bill 20, which banned social media companies with more than 50 million monthly active users from restricting posts based on viewpoint. The new law also required platforms to publicly disclose how and when they moderate content and to create processes through which users can challenge specific moderation decisions.

“Social media websites have become our modern-day public square,” said Abbott upon signing the bill. “They are a place for healthy public debate where information should be able to flow freely — but there is a dangerous movement by social media companies to silence conservative viewpoints and ideas. That is wrong, and we will not allow it in Texas.”

Tech companies including Meta and Google objected to the new law. NetChoice, a trade group that represents the two tech giants and other internet companies, quickly filed a lawsuit arguing that the Texas statute violates the First Amendment by stripping platforms of their editorial discretion and requiring them to publish unwanted content.

The companies also complained that the law makes it difficult to moderate their platforms at all, an opinion shared by Feldman during September’s Rappaport Forum about the cases. The law, he said, would require platforms to engage in “far, far less moderation of things like hate speech and misinformation, and possibly even ordinary everyday offensiveness than [social media companies] practice under current circumstances.”

A tangled path to the Court

The case followed a tangled path to the Supreme Court. The district court sided with NetChoice, issuing a preliminary injunction that blocked enforcement of HB 20. But a three-judge panel of the Fifth Circuit Court of Appeals stayed the injunction, allowing the law to take effect. NetChoice then successfully petitioned the Supreme Court to block implementation of the law during litigation, as part of the Court’s so-called emergency, or shadow, docket.

According to Rebecca Tushnet, the Frank Stanton Professor of the First Amendment at Harvard, that initial intervention by the Court indicates that it saw significant issues with the Texas law. “That doesn’t mean it will ultimately strike down the law,” she says, “but if it does turn out to violate the First Amendment, there’s no way to get back the suppressed speech rights of the platforms, so the stay made sense.”

Afterward, when the Fifth Circuit vacated the injunction a second time, now ruling on the merits, it explicitly rejected the rationale of a parallel case from the Eleventh Circuit Court of Appeals. There, the court had struck down a similar law in Florida, reasoning that content moderation by private companies is a protected exercise of editorial judgment under the Constitution.

Given the gravity of the question at hand — and the competing opinions from different courts — the Supreme Court agreed to hear the full case this term.

The arguments

Tushnet says that, until now, social media companies have mostly been allowed to moderate their platforms as they see fit, subject to rare constraints such as antitrust law. It is an argument echoed by lawyers for NetChoice in their briefs to the Court.

“Until the Fifth Circuit’s decision below upheld Texas House Bill 20,” reads NetChoice’s petition for certiorari, “no judicial opinion in our Nation’s history had held that the First Amendment permits government to compel websites to publish and disseminate speech against their will.”

NetChoice contends that social media companies should not be compelled to carry speech they do not wish to disseminate, and that removing editorial control over their platforms violates the First Amendment. In essence, the companies “are arguing that Texas and Florida, seeking to punish the big tech companies for their perceived liberal bias, passed laws that drastically interfere with platforms’ editorial discretion, including their ability to moderate hate speech and, for example, to keep sexually explicit content off YouTube Kids,” says Tushnet.

However, Texas insists it has a legitimate interest in protecting its residents’ ability to participate in public life through free expression. Large social media platforms have become the equivalent of digital town squares, the state argues, and can therefore be required to allow a wide array of speech.

In a reply brief, the state also contends that the Court’s precedents allow it to protect individuals against private companies that limit the free expression of others. “…Recognizing that a small number of modern communications platforms effectively control access to the modern, digital public square, HB 20 provides a remedy to individuals who are denied equal access to that square because those platforms disagree with their point of view. HB 20 also requires those platforms to share purely factual and uncontroversial information with consumers about how the platforms moderate their spaces.”

But in Tushnet’s view, the state’s claims don’t pass muster. “Texas is arguing that the big platforms should be treated as public utilities and forced to carry the content Texas wants them to carry, except that they would still be allowed to ban topics in their entirety,” she says. “I’m sorry, Texas’s argument doesn’t make much sense in conventional regulatory terms.”

Tushnet warns that it’s difficult to predict how the Court will ultimately decide the case. What is clear, she says, is that the stakes are extraordinarily high: “If the states can impose this kind of regulation on platforms, we will know that the past 70 years or so of First Amendment jurisprudence can no longer be relied on in any real way.”

