Supreme Court sends cases over tech firms' First Amendment rights back to lower courts – but appears willing to stop states from obstructing online content moderation

The U.S. Supreme Court has sent back to the lower courts the question of whether states can prevent social media companies like Facebook and X (formerly Twitter) from moderating and controlling what users can post on their platforms.

Laws in Florida and Texas sought to restrict social media platforms' internal policies and algorithms, which determine which posts are promoted and widely shared, and which are made less visible or even removed.

In a unanimous decision issued on July 1, 2024, the Supreme Court remanded the two cases, Moody v. NetChoice and NetChoice v. Paxton, to the 11th and 5th U.S. Circuit Courts of Appeals. The court faulted the lower courts for failing to consider the full scope of the laws' application, and it instructed them to weigh the Constitution's limits on government interference with private speech.

Different views on social media sites

In their oral arguments in February 2024, the two sides described competing visions of how social media fits into the often overwhelming flood of information that defines modern digital society.

The states said the platforms were mere communication channels carrying others' speech, much like traditional telephone companies, which are required to connect all calls and may not discriminate among their users. The states contended that the platforms must likewise transmit all user posts without discriminating against users based on their speech.

The states argued that the content moderation rules imposed by the social media companies were not instances of the platforms themselves speaking, or choosing not to speak. Rather, the states said, the rules governed the platforms' conduct, leading them to censor certain views by dictating who could speak on which topics, and conduct of that kind is not protected by the First Amendment.

In contrast, the social media platforms, represented by NetChoice, a technology industry trade group, argued that their policies about what is appropriate on their sites are protected by the First Amendment's guarantee of freedom of speech from government interference. The companies say their platforms are not public forums subject to government regulation, but private services that may make their own editorial judgments about what does and does not appear on their sites.

They argued that their moderation policies are an exercise of their own free speech rights, and that they should be free to develop and enforce rules about what speech is acceptable on their platforms.

A reorientation by the Supreme Court

All the litigants, NetChoice, Texas and Florida, framed the issue around the laws' effect on the platforms' content moderation policies, specifically whether the platforms were engaged in protected speech. The 11th U.S. Circuit Court of Appeals upheld a lower court's injunction against the Florida law, finding that the platforms' content moderation policies were speech and that the law was unconstitutional.

The 5th U.S. Circuit Court of Appeals reached the opposite conclusion, finding that the platforms were not engaged in speech; rather, the platforms' algorithms governed the platforms' conduct, which was not protected by the First Amendment. The 5th Circuit ruled that this conduct was censorship and overturned a lower court's injunction against the Texas law.

However, the Supreme Court reframed the inquiry. The court found that the lower courts had not considered the full range of activities the laws cover. So while a First Amendment inquiry was appropriate, the lower courts' decisions and the parties' arguments were incomplete. The court added that neither the parties nor the lower courts had thoroughly analyzed whether and how the state laws affected other parts of the platforms' products, such as Facebook's direct messaging applications, or whether the laws had any effect at all on email providers or online marketplaces.

The Supreme Court directed the lower courts to analyze the laws and their effects much more carefully, and it provided some guidelines.

Principles of the First Amendment

The court held that content moderation policies reflect the platforms' constitutionally protected editorial choices, at least with regard to what the court called the laws' "core applications", such as Facebook's news feed and YouTube's homepage.

The Supreme Court asked the lower courts to consider two core First Amendment principles. First, the amendment protects speakers from being compelled to communicate messages they would rather leave unsaid. The editorial discretion of companies, including social media companies, that compile and curate the speech of others is an activity protected by the First Amendment.

The second principle is that the amendment prevents the government from controlling private speech, even when it does so in the interest of balancing the marketplace of ideas. Neither state nor federal government may manipulate that marketplace to promote a more balanced range of viewpoints.

The court also confirmed that these principles apply to digital media in the same way they apply to traditional media.

In her 96-page opinion, Justice Elena Kagan wrote that the First Amendment's protections do not lapse when it comes to social media. For now, it appears that social media platforms will continue to moderate their own content.

image credit : theconversation.com