The Supreme Court on Tuesday issued a brief order blocking a Texas law that would have effectively seized control of content moderation at major social networks such as Facebook, Twitter, and YouTube.
The Texas law imposed such heavy requirements on these sites, including disclosure requirements that could be literally impossible to meet, that it posed an existential threat to the entire social media industry. Facebook, for example, removes billions of pieces of content from its platform every year. The Texas law would have required Facebook to publish a written explanation of each of those decisions.
At the very least, the law would have prevented major social media sites from engaging in the most basic forms of content moderation, such as suppressing literal Nazi posts advocating mass genocide, or banning stalkers and harassers who target their former romantic partners.
The vote in NetChoice v. Paxton was 5-4, although Justice Elena Kagan most likely dissented for procedural reasons unrelated to the merits of the case.
The law effectively prohibits major social networking sites from banning a user, regulating or restricting a user’s content, or even altering algorithms that display content to other users because of a user’s “point of view.”
In practice, this rule would make content moderation impossible. Suppose, for example, that a Twitter user named @HitlerWasRight sent a tweet calling for the systematic execution of the entire Jewish people. Under the Texas law, Twitter could not remove that tweet, or ban that user, unless it did the same to any user who took the opposite view: that is, that Jewish people should be allowed to go on living.
Texas Gov. Greg Abbott (R) said when he signed the law that he did so to thwart a “dangerous movement by social media companies to silence conservative viewpoints and ideas.” The evidence that social media companies target conservatives in any systematic way is pretty meager, although some high-profile Republicans, such as former President Donald Trump, have been banned from some platforms: Trump was banned by Twitter and Facebook after he appeared to cheer on the January 6 attack on the United States Capitol.
The Court did not explain its reasoning, which is common when it is asked to temporarily block a law. And Tuesday’s order is only temporary: the Court will likely have to make a final decision on the fate of the Texas law at a future date.
But the majority’s decision is consistent with current law.
With few exceptions, it is well established that the First Amendment does not allow the government to force a media company, or anyone else for that matter, to publish content that it does not want to publish. As recently as its 2019 decision in Manhattan Community Access Corp. v. Halleck, the Court reaffirmed that “when a private entity provides a forum for speech,” it may “exercise editorial discretion over the speech and speakers in the forum.”
Although the idea that a corporation like Twitter or Facebook has First Amendment rights has been criticized from the left since the Supreme Court’s campaign finance decision in Citizens United v. FEC (2010), the rule that corporations enjoy free speech protections long predates Citizens United. Newspapers, book publishers, and other media corporations have long been allowed to assert their First Amendment rights in the courts.
The most surprising thing about Tuesday’s order is that Kagan, a liberal appointed by President Barack Obama, dissented from the order suspending the Texas law.
Although Kagan did not explain why she dissented, she has been an open critic of the Court’s growing practice of deciding important cases on its “shadow docket,” an expedited process in which cases are decided without full briefing or oral argument. NetChoice arose on the Court’s shadow docket, so it is possible that Kagan dissented to remain consistent with her earlier criticisms of that docket.
Meanwhile, the Court’s three most conservative members, Justices Clarence Thomas, Samuel Alito, and Neil Gorsuch, joined a dissent by Alito that would have left the Texas law in effect.
Alito’s dissent suggests that two limited exceptions to the First Amendment should be significantly extended.
Alito wrote that the question of whether a state government can effectively take control of a social media company’s content moderation is unresolved, and he pointed to two cases that created limited exceptions to the general rule that the government may not require a company to host speech it does not wish to host.
The first, Pruneyard Shopping Center v. Robins (1980), upheld a California law that required shopping centers open to the public to allow people to collect signatures for a petition on the mall owner’s property. The second, Turner Broadcasting v. FCC (1994), upheld a federal law requiring cable companies to carry local television stations.
But to the extent that Pruneyard could be read to permit the Texas law, the Court has since repudiated that reading of the decision. In Pacific Gas & Electric Co. v. Public Utilities Commission (1986), four justices stated that Pruneyard “does not undercut the proposition that forced associations that burden protected speech are impermissible.” A social media company, in other words, may refuse to associate with a user who posts offensive content.
Meanwhile, Justice Thurgood Marshall wrote that Pruneyard should apply only when a law is minimally “intrusive” upon a business: a standard met by a law allowing a petitioner to collect signatures on one’s property, but not by the Texas law, which would fundamentally alter the operations of social media companies and prevent them from removing the most offensive content.
Similarly, Turner held that cable companies are subject to greater regulation than most media companies because they often have sole physical control over the cables that carry television stations into individual homes. That is not true of social media websites. While some social media platforms may enjoy market dominance, they have no physical control over the infrastructure that brings the internet into people’s homes and offices.
The controlling Supreme Court case on how the First Amendment applies to the internet is Reno v. ACLU (1997), which held that “our cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied to” the internet.
Had Alito’s approach prevailed, the Texas law would probably have turned every major social media platform into 4chan, a toxic dump of racial slurs, misogyny, and targeted harassment that the platforms could do nothing to control. It could also have put every social media company at the whim of all 50 states, which could impose 50 different content moderation regimes. What are Twitter or Facebook supposed to do, after all, if California, Nebraska, or Wyoming approves a social media regulation that contradicts the Texas law?
For now, that outcome has been avoided. But because NetChoice reached the Court on its shadow docket, and because the majority resolved the case in a brief order without any explanation of its reasoning, the question of whether the First Amendment permits the government to regulate social media content moderation technically remains open. Still, the fact that a majority of the Court intervened to block this law bodes well for the social media industry as its challenge to the Texas law moves forward.
The Court’s order in NetChoice is temporary. It preserves the status quo until the Court can issue a final decision on how the First Amendment applies to social media.
But this question is unlikely to remain open for long. Two federal appeals courts have reached conflicting rulings on the legality of Texas-style laws, so the Supreme Court will likely have to step in soon to resolve that split.