Texas social media law brings content debate to Supreme Court

Regulators are cautious about limiting the power of social media platforms and controlling the spread of misinformation, but it’s a tricky issue that could eventually be settled by the Supreme Court.

While some policymakers want social media platforms like Twitter, as well as Instagram and Facebook owner Meta, to remove information deemed misleading on issues such as COVID-19, others say social media companies have no right to determine what is or is not fact and delete content accordingly. Texas lawmakers have proposed legislation to stop social media companies from removing content, but the Supreme Court intervened last month to block its progress for now.

The First Amendment protects free speech and prohibits government interference unless direct harm is caused, meaning policymakers don’t have much leeway when it comes to regulating content. But deciding how much power social media companies should wield in moderating content on their platforms is a matter that will likely play out in court, and ultimately the Supreme Court, said Kevin Klowden, executive director of the Milken Institute’s Center for Regional Economics and California Center.

“It has to be played out in court – these are real fundamental issues,” Klowden said.

Regulating social media companies

To protect speech on their platforms, social media companies claim First Amendment rights, arguing that they exercise editorial judgment similar to that of newspapers, said Nabiha Syed, CEO of The Markup, a publication that examines the impact of technology on society, and a fellow at Yale Law School. Syed was speaking at a Harvard T.H. Chan School of Public Health panel called “Dismantling disinformation.”

However, while a newspaper is responsible for the content it publishes, social media platforms are not: Section 230 of the Communications Decency Act shields them from liability for content posted on their platforms. That protection, on top of “absolute First Amendment rights,” gives the platforms broad leeway to operate as they wish, Syed said.

Syed said that while some policymakers want to police social media companies’ content decisions, the other extreme is laws like those proposed in Texas and Florida that would limit the companies’ ability to remove content.

“The most important reality of this moment is that it will be neither; they are both chaotic in their own way,” Syed said of the two approaches to regulating social media companies. “We have to work out a new version in the future, a new balance.”

Regulating social media content

Content regulation is not an area where the government should be involved, except in specific circumstances, such as national security concerns related to disinformation, said Renée DiResta, head of research at the Stanford Internet Observatory, during the Harvard panel.

Disinformation is the dissemination of content with the intent to mislead, and it’s something the U.S. Department of Homeland Security monitors, especially from countries like Iran, China and Russia. Misinformation, on the other hand, is the dissemination of incorrect information presented as factual, regardless of intent.

Yet even disinformation monitoring raises eyebrows. The Department of Homeland Security’s recently created Disinformation Governance Board was paused after significant backlash from Republican lawmakers who challenged the scope of the board’s content monitoring work.

While there is a rationale for a government response to an issue like disinformation, government involvement becomes riskier the further it extends into regulating content and policing misinformation, DiResta said.

“Government shouldn’t be regulating content on social media platforms; there are real legal minefields associated with that,” she said.

While content regulation may represent too much government intervention, DiResta said some federally proposed bills are a good start toward limiting the power of social media giants without venturing into content regulation.

DiResta said the proposed Platform Accountability and Transparency Act, for example, could help the public understand the impact and potential harm caused by social media. The bill, introduced last year by Sens. Chris Coons (D-Del.), Rob Portman (R-Ohio) and Amy Klobuchar (D-Minn.), would require social media platforms to provide data access to third-party researchers.

“In many ways, for the questions people are asking – ‘Is my viewpoint being censored, are there unfair and disproportionate takedowns, are recommendation engines radicalizing people?’ – we need access” to social media data, DiResta said. “That’s where I think this bill is fundamental.”

The role of the Supreme Court

While some proposed bills like the Platform Accountability and Transparency Act would provide insight into social media platforms, Klowden said he believes the issue of content moderation will ultimately be decided in court.

Indeed, Syed said she expects the Supreme Court to take up the issue. The question for the court will be “what is a private company’s responsibility for speech,” she said.

The Supreme Court’s recent 5-4 decision to block the Texas social media law was significant, Klowden said. But such a narrow split, he said, raises the question not of whether the law will ultimately be upheld, but to what extent.

The Texas social media law could affect more than just social media companies because it is broadly drafted, Klowden said. Any company that operates an online forum could face new content rules if this type of law advances, he said.

“Fundamentally, there’s this belief that if you don’t really have the ability to mitigate this, control this and moderate this, all of these companies will find themselves vulnerable to a whole other round of lawsuits,” Klowden said.

Makenzie Holland is a news writer covering big tech and federal regulation. Before joining TechTarget, she was a general assignment reporter for the Wilmington StarNews and a crime and education reporter at the Wabash Plain Dealer.
