A new bipartisan bill introduced on Wednesday could mark Congress’ first step toward combating the algorithmic amplification of harmful content. The NUDGE Social Media Act, written by Sens. Amy Klobuchar (D-MN) and Cynthia Lummis (R-WY), would direct the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine to study “content-neutral” ways to add friction to sharing content online.
The bill asks researchers to identify a number of ways to slow the spread of harmful content and misinformation, whether by asking users to read an article before sharing it (as Twitter has done) or other measures. The Federal Trade Commission would then codify the recommendations and require social media platforms like Facebook and Twitter to implement them.
“For too long, tech companies have said, ‘Trust us, we’ve got this,'” Klobuchar said in a statement Thursday. “But we know that social media platforms have repeatedly put profits before people, with algorithms pushing dangerous content that hooks users and spreads misinformation.”
For years, Democrats have sought ways to tackle online misinformation, while Republicans have criticized such efforts as threats to free speech. But prompted by the testimony of Facebook whistleblower Frances Haugen in 2021, members of both parties have begun working together on ways to regulate the algorithms implicated in both child safety concerns and misinformation. Lummis’ support for the bill signals a significant step forward in that effort.
“The NUDGE Act is a good step toward fully addressing Big Tech’s overreach,” Lummis said in a statement Thursday. “By empowering the [NSF] and [NASEM] to study the addictiveness of social media platforms, we will begin to fully understand the impact that the designs of these platforms and their algorithms have on our society. From there, we can build safeguards to protect Wyoming children from the negative effects of social media.”
Last March, Reps. Anna Eshoo (D-CA) and Tom Malinowski (D-NJ) introduced their Protecting Americans from Dangerous Algorithms Act, which also focused on algorithmic amplification. Unlike Klobuchar’s bill, the House measure would amend Section 230 of the Communications Decency Act, stripping a platform of legal immunity when it is found to have amplified content that violates civil rights.
Changes to Section 230’s liability protections have been the biggest hurdle for lawmakers seeking to combat harmful algorithmic amplification. Technology and public interest groups like Public Knowledge have already spoken out in favor of the Klobuchar measure, noting that because it leaves Section 230 untouched, it is one of the best models for regulating algorithms.
“Public Knowledge supports this legislation because it encourages informed decision-making to address a known problem: the promotion of misinformation,” Greg Guice, director of government affairs at Public Knowledge, said Thursday. “Most importantly, the bill does all of this without linking compliance to Section 230 immunity.”
There is little time for Congress to pass technology legislation before the midterm elections heat up later this year. In an interview with The Verge last month, Klobuchar was optimistic about lawmakers’ ability to pass sweeping bipartisan bills before the end of the year.
Speaking on the NUDGE Social Media Act, Klobuchar said, “This bill will help combat these practices, including by implementing changes that increase transparency and improve user experience.” She continued, “It’s high time to enact meaningful reforms that tackle head-on the harms of social media to our communities.”