
Texas social media law brings content debate to Supreme Court

The Supreme Court recently blocked a Texas social media law that would have limited content removal, but that's likely not the end of the court's involvement in the issue.

Regulators are treading carefully to rein in the power of social media platforms and control the spread of misinformation, but it's a tricky issue that may eventually be settled by the Supreme Court.

While some policymakers want social media platforms such as Twitter and Meta, the owner of Instagram and Facebook, to take down information considered untruthful about issues such as COVID-19, others argue social media companies don't have the right to determine what is or isn't fact and remove content. Lawmakers in Texas passed a law seeking to prevent social media companies from taking down content, but the Supreme Court last month stepped in to block its enforcement for now.

First Amendment law protects freedom of expression and prohibits government interference unless direct harm is caused, meaning policymakers don't have much leeway when it comes to content regulation. But deciding how much power social media companies should wield when it comes to moderating content on their platforms is an issue that will likely play out in the courts, and ultimately the Supreme Court, said Kevin Klowden, executive director of the Milken Institute's Center for Regional Economics and California Center.


"It has to play out in the courts -- these are real fundamental issues," Klowden said.

Regulating social media companies

To protect speech on their platforms, social media companies claim First Amendment rights, arguing that they exercise editorial rights similar to newspapers, said Nabiha Syed, CEO of The Markup, a publication that looks at the impacts of technology on society, and a fellow at Yale Law School. Syed was speaking during a Harvard T.H. Chan School of Public Health panel called "Dismantling disinformation."

However, while a newspaper is responsible for the content it publishes, social media platforms are not: Section 230 of the Communications Decency Act shields platforms from liability for content their users post. The result, Syed said, is that platforms have "unfettered First Amendment rights," giving them wide berth to operate how they want.

Syed said that while policymakers should look at placing checks on social media companies, she believes the other extreme lies in laws like the ones passed in Texas and Florida, which would limit the ability of social media firms to remove content.

"The most important reality of this moment is that it's not going to be either one; they're both chaos in their own ways," Syed said of both approaches to regulating social media companies. "We do have to craft a new version going forward, a new balance."

Regulating social media content

Regulating content is not an area the government should be involved in except in specific circumstances, such as national security concerns related to disinformation, said Renée DiResta, research manager at Stanford Internet Observatory, during the Harvard panel.

Disinformation is content spread with the intent to deceive, and it's something the U.S. Department of Homeland Security monitors, particularly when it comes from countries such as Iran, China and Russia. Misinformation, on the other hand, is incorrect information presented as factual, regardless of intent.

Yet even monitoring disinformation raises eyebrows. The Department of Homeland Security's recently created Disinformation Governance Board was paused after it received substantial backlash from Republican lawmakers who took issue with the scope of the board's work when it came to monitoring content.

Though there is justification for a government response to a problem like disinformation, government involvement in content regulation and handling misinformation becomes riskier after that, DiResta said.

"The government should not be regulating the content on social media platforms, there are some real legal minefields associated with that," she said.

While regulating content may be too much government intervention, DiResta said some federally proposed bills provide a good start to reining in the power of social media giants without venturing into content regulation.

DiResta pointed to the proposed Platform Accountability and Transparency Act as an example. Introduced last year by Sens. Chris Coons (D-Del.), Rob Portman (R-Ohio) and Amy Klobuchar (D-Minn.), the bill would require social media platforms to provide data access to third-party researchers, which she said could help the public understand the impact and potential harm caused by social media.

"In many ways, the questions people have -- 'Is my viewpoint being censored, are there unfair, disproportionate takedowns, do recommendation engines radicalize people' -- these are things where in order to answer those questions, in order to address those concerns, we need access," to social media data, DiResta said. "That's where I feel that bill is foundational."

The Supreme Court's role

While some proposed bills like the Platform Accountability and Transparency Act would provide some insight into social media platforms, Klowden said he believes that ultimately, the issue of moderating content will be decided in court.

Indeed, Syed said she expects the Supreme Court to take up the problem. The question for the court will be, "What is the responsibility of a private company when it comes to speech?" she said.

The Supreme Court's recent 5-4 decision to block the Texas social media law was significant, Klowden said. But such a close vote, he said, raises the question not of whether the law will ultimately be upheld, but of how much of it will be.

Because the Texas social media law is broadly written, it could affect more than social media companies, Klowden said. Any firm that operates an online forum could face new content rules if that type of law advances, he said.

"Fundamentally, there is that belief that if you do not actually have that ability to mitigate this and control this and moderate this, all these companies will wind up vulnerable to a whole other set of lawsuits," Klowden said.

Makenzie Holland is a news writer covering big tech and federal regulation. Prior to joining TechTarget, she was a general reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.
