Kelly bill would allow lawsuits over social media algorithms that promote violence, extremism

Sen. Mark Kelly, D-Ariz., listens to testimony during a Sept. 20, 2023, Senate Environment and Public Works hearing on drinking water infrastructure for tribal communities. (Photo by Lux Butler/Cronkite News)

By Isabella Gomez | Cronkite News

WASHINGTON – Lawmakers have struggled for years to regulate social media platforms in ways that tamp down misinformation and extremism. 

Much of the criticism has been aimed at algorithms that feed users more and more of whatever they click on – the “rabbit hole” effect blamed for fueling conspiracy theories, depression, eating disorders, suicide and violence.

Federal law shields social media platforms and internet providers from lawsuits over content posted by users. First Amendment rights make it hard for Congress to regulate speech.

The latest effort comes from Sens. Mark Kelly, an Arizona Democrat, and John Curtis, a Utah Republican, who have teamed up to fight political violence since the Sept. 10 murder of conservative activist Charlie Kirk.

Under the Algorithm Accountability Act they unveiled Nov. 19, social media companies would lose that legal immunity if they use an algorithm to promote content that results in harm.

The bill would amend Section 230, the provision of the Communications Decency Act of 1996 that protects tech companies from lawsuits over user-generated content.

“Too many families have been hurt by social media algorithms designed with one goal: make money by getting people hooked,” Kelly said in a statement. “Over and over again, these companies refuse to take responsibility when their platforms contribute to violence, crime, or self-harm. We’re going to change that and finally allow Americans to hold companies accountable.” 

Reps. Mike Kennedy, R-Utah, and April McClain Delaney, D-Md., filed an identical measure in the House on Nov. 21.

Unlike broader reforms of Section 230, the Algorithm Accountability Act targets the recommendation process that puts content in front of users, rather than the content itself. 

The bill wouldn’t directly limit what platforms such as Facebook, TikTok, X and Instagram distribute, but would require them to “exercise reasonable care” in designing how they organize the content they serve up to users. 

“Algorithms make us see the world as more aggressive and more conflictual than it actually is. It makes us see the other side as more extreme and more of a threat,” said Yphtach Lelkes, a professor of communication and political science at the University of Pennsylvania.

A study published Nov. 24 in the journal JAMA Network Open found that even a one-week “detox” from social media – cutting back on screen time – reduced anxiety, depression and insomnia in young adults.

“I think it’s coming from a good place,” Lelkes said of the Kelly-Curtis bill. But, he said, tech companies are motivated to keep users engaged, and tamping down harmful, inappropriate and “outrageous” speech could reduce usage.

“How do you get companies to promote public good over this need to keep people online as long as possible?” he said.

Tech companies warn that encouraging them to tweak algorithms to deemphasize certain political viewpoints means that instead of being neutral about user content, they are picking winners and losers.

That could mean trouble for organizing and recommending content in ways “the government would prefer you not to,” said Zach Lilly, director of government affairs at the trade association NetChoice. “That’s where you start to ask those First Amendment questions.”

Lilly also argued that algorithms are designed to personalize content to a user’s interests.

Eliminating that, he said, “would result in such an extreme reduction in the ability to post our own content online without platforms feeling the need to remove our content for fear of liability.”

Smaller tech companies would find it especially costly to comply, he said. 

Curtis and other leaders in Utah have been outspoken about political violence since the murder of Kirk during a campus appearance in their state.

Kelly has also been a leading voice on the topic. His wife, former Rep. Gabby Giffords, was shot in the head in 2011 while meeting with congressional constituents in Tucson. The gunman also shot 18 others, killing six, including a federal judge and a 9-year-old girl. Giffords narrowly survived.

Lelkes said it’s important to keep in mind that algorithms alone don’t cause people to “end up in rabbit holes” online. 

“It’s more like they were already extreme and ended up in these places,” he said.

Meanwhile, artificial intelligence makes algorithms ever more effective at serving up content that keeps users from putting down their phones.

“AI systems are making decisions that are impacting people’s lives with little to no transparency about how those decisions are made, or accountability,” said Caitriona Fitzgerald, deputy director of the Electronic Privacy Information Center.

“We are playing catch-up,” she said. “We didn’t have the rules of the road in place that would have in any way limited the growth or the trajectory of AI.”
