The US Supreme Court Holds the Future of the Internet in Its Hands

If the court backs provocative laws from Texas and Florida that limit social platforms’ ability to moderate content, life online could become radically different.

The US Supreme Court seems torn over whether to trigger a radical transformation of the internet. The nation’s highest court heard arguments Monday over state laws in Florida and Texas that restrict how platforms like Facebook and YouTube moderate speech. If the court lets them take effect, social media feeds could look very different, with platforms forced to carry unsavory or hateful content that today is blocked or removed.

The high stakes gave long-standing questions about free speech and online regulation new urgency in Monday’s arguments. Are social platforms akin to newspapers, which have First Amendment protections that give them editorial control over content? Or are they common carriers, like phone providers or telegraph companies, required to transmit protected speech without interference?

A ruling is expected by June, when the court typically issues many of its decisions, and could have sweeping effects on how social sites like Facebook, YouTube, X, and TikTok do business beyond Florida and Texas. “These cases could shape free speech online for a generation,” says Alex Abdo, litigation director of the Knight First Amendment Institute at Columbia University, which filed a brief in the case but did not take sides.

Florida and Texas passed the laws under debate in 2021, not long after social media platforms booted former president Donald Trump following the January 6 insurrection. Conservatives had long argued that their viewpoints were unfairly censored on major platforms, and laws restricting how strictly companies could moderate content were pitched as a way to restore fairness online.

The laws were quickly put on hold after two tech-industry trade associations representing social platforms, NetChoice and the Computer & Communications Industry Association, challenged them. If the Supreme Court now allows the laws to stand, the governments of Florida and Texas would gain new power over social platforms and the content posted on them, a major shift from today, when platforms set their own terms of service and generally hire moderators to police content.

Polar Opposites

Monday’s arguments, spanning nearly four hours, underscored the legal confusion that still surrounds regulating the internet. Justices raised questions about how social media companies should be categorized and treated under the law, and the states and the platforms’ trade groups offered opposing views of social media’s role in mass communication.

The laws themselves leave gaps as to how exactly their mandates would be enforced. The questions posed by the justices showed the court’s frustration at being “caught between two polar opposite positions, both of which have significant costs and benefits for freedom of speech,” says Cliff Davidson, a Portland-based attorney at Snell & Wilmer.

David Greene, senior staff attorney and civil liberties director at the digital rights group Electronic Frontier Foundation, which filed a brief urging the court to strike down the laws, says there are clear public benefits to allowing social platforms to moderate content without government interference. “When platforms have First Amendment rights to curate the user-generated content they publish, they can create distinct forums that accommodate diverse viewpoints, interests, and beliefs,” he says.

Greene argues that the laws raise “significant First Amendment and human rights concerns” and are “profound intrusions into social media sites’ ability to decide for themselves what speech they will publish and how they will present it to users.”

Florida’s law prevents social media companies from permanently banning candidates for office or “journalistic enterprises” from their platforms, even when they post content a platform would typically bar. Texas’ law says companies cannot moderate content based on the viewpoint it expresses, potentially neutering moderation policies such as bans on hate speech. Lower courts have been split on the rules: A federal appeals court deemed Florida’s law unconstitutional, but a different appeals court upheld Texas’.

Florida’s law is worded broadly, and justices wondered Monday whether it could cover platforms like Uber, Etsy, and Gmail, which serve far different purposes than social media services like Facebook, YouTube, and TikTok. The Texas law is narrower, applying only to social media companies with more than 50 million monthly active users.

The trade associations argue that social platforms are like newspapers, entitled to publish content without government interference. The states counter that social media companies have become the public squares of the 21st century and act more like telephone networks carrying messages between people, so they should be required to remain neutral.

In the US, Section 230 of the Communications Decency Act has long shielded social media companies from liability for content they host. The state laws would impose penalties for moderating content, creating new liabilities for companies and compromising the long-standing immunity Section 230 provides.

Tricky Decision

Some argue the fate of the two laws will have consequences that reach far beyond social platform moderation. If the court upholds injunctions on the Texas law, it could set a precedent that stifles the ability of Congress or state governments to write better laws regulating social platforms, a group of liberal law professors argues in a brief filed in support of Texas.

“Rather than lining up to give Meta, YouTube, X, and TikTok capacious constitutional immunity, the people who are worried about these laws should be focusing their energies on getting Congress to pass more sensible regulations instead,” Zephyr Teachout, a professor at Fordham Law School who joined the brief, wrote in The Atlantic.

The justices expressed skepticism about the laws during the arguments and “seemed to recognize that it would be unconstitutional to hand to the government the unfettered power to dictate what can be said on private platforms,” Abdo, of the Knight Institute, says. “We’re hopeful that the court will strike down the must-carry provisions but chart a path forward for reasonable social media transparency laws.”

Lasting solutions to long-running debates over how to regulate online speech will require more than a single court decision.