
Rep. Jimmy Patronis Introduces Bill to Repeal Section 230, Targeting Big Tech Liability

Florida Rep. Jimmy Patronis has introduced legislation in the U.S. House of Representatives that would fully repeal Section 230 of the Communications Decency Act, a legal provision that has long shielded online platforms from liability for user-generated content.

The bill, H.R. 7045, is formally titled the Promoting Responsible Online Technology and Ensuring Consumer Trust Act, or the PROTECT Act. It was introduced earlier this week and referred to committee for consideration. If enacted, the measure would eliminate Section 230 in its entirety, removing the special legal immunity that protects social media companies, content platforms, and other interactive computer services from civil liability related to content posted by users.

In announcing the legislation, Patronis framed the effort as a response to what he described as growing harm to children caused by unregulated online platforms.

“As a father of two young boys, I refuse to stand by while Big Tech poisons our kids without consequence,” Patronis said in a statement released by his office. He argued that technology companies are able to profit while avoiding responsibility for what he described as serious mental health consequences facing minors, including addiction, self-harm, and suicide.

Section 230, enacted in 1996 as part of the Communications Decency Act, generally protects platforms from being treated as the publisher or speaker of content provided by users. The provision has been widely credited with enabling the growth of the modern internet, while also drawing sustained criticism from lawmakers across the political spectrum.

Patronis’s bill would repeal Section 230 outright and make extensive conforming amendments across multiple federal statutes. The repeal would take effect immediately upon enactment, according to the bill text.

In his remarks, Patronis compared online platforms to regulated media, arguing that similar liability standards should apply.

“If a billboard or TV channel couldn’t publish bullying or violent materials without liability, why can Big Tech?” he said. “Let’s end the double standard.”

The congressman cited emerging risks tied to artificial intelligence and automated systems, including chatbots and algorithm-driven content that, he claimed, can encourage self-harm or normalize dangerous behavior. He said platforms design their products to maximize engagement, even when that engagement harms children.

“When children are told by an algorithm or a chatbot that the world would be better without them, and no one is being held responsible, something is deeply broken,” Patronis said.

The PROTECT Act joins a growing list of proposals aimed at dismantling or sunsetting Section 230. Two other repeal efforts are already pending in Congress: H.R. 6746, which would terminate Section 230 at the end of 2026, and S. 3546, a Senate bill that would repeal the provision two years after enactment.

Lawmakers on both sides of the aisle have criticized Section 230, though often for different reasons. Some argue it allows platforms to profit from illegal or harmful content, while others contend it gives companies too much discretion to moderate speech, including political viewpoints.

Industry attorneys and digital rights advocates have warned that repealing Section 230 could dramatically reshape the internet. They have also raised concerns that repeal efforts could lead to targeted carve-outs, similar to those created under FOSTA-SESTA, which removed liability protections for platforms accused of facilitating prostitution or sex trafficking.

Such carve-outs, critics argue, could expose platforms that host user-generated content, including adult websites, to sweeping legal risk and an influx of civil lawsuits.

Despite those warnings, Patronis has positioned the PROTECT Act as a necessary step to restore accountability.

“Our most vulnerable should no longer suffer while corporations make millions profiting from their decline,” he said.

Section 230 of the Communications Decency Act is widely regarded as one of the foundational laws that allows the internet to function as it does today. Enacted in 1996, the provision established a clear legal distinction between platforms that host content and the users who create it. That distinction enabled online services to scale, innovate, and exist at all.

At its core, Section 230 states that online platforms are not to be treated as publishers or speakers of information provided by their users. In practical terms, this means that a website or app generally cannot be legally held responsible for what millions of users post, upload, or say on its service.

Without this protection, the modern internet would be almost impossible to operate. Platforms that rely on user-generated content would face constant legal exposure for posts they did not create and could not realistically pre-screen. Every comment, photo, video, review, or message would carry potential liability, forcing companies either to heavily censor content before publication or to shut down interactive features altogether.

Section 230 enabled the rise of social media, search engines, comment sections, online marketplaces, review sites, forums, and video-sharing platforms. Services like social networks, message boards, and community-driven websites could not exist at scale if each user’s post exposed the platform to lawsuits. Even small websites would be vulnerable, as a single defamatory or unlawful post could trigger costly legal action.

The law also protects moderation. Section 230 allows platforms to remove or restrict content they consider harmful, offensive, or inappropriate without becoming legally responsible for everything they choose to leave up. This provision was designed to encourage good-faith moderation rather than discourage it. Without it, platforms would face a legal incentive to either remove large amounts of lawful speech or avoid moderating altogether.

Critically, Section 230 does not protect platforms from liability for their own actions. Companies can still be held responsible for their own illegal conduct, defective products, intellectual property violations, and federal crimes. The law applies narrowly to third-party content, not content created or materially contributed to by the platform itself.

The economic impact of Section 230 has been substantial. By reducing legal risk, the law lowered barriers to entry for startups and small companies, allowing innovation without the need for massive legal teams. Many of today’s largest technology companies, as well as countless small businesses and independent creators, emerged in an environment shaped by Section 230’s protections.

Supporters argue that repealing or weakening Section 230 would fundamentally alter the internet, shifting it from an open, participatory space to one dominated by heavily filtered, risk-averse platforms. Critics counter that the law has not kept pace with modern technology and has allowed harmful content to spread without sufficient accountability.

As for H.R. 7045, the bill is still in the early stages of the legislative process. We’ll keep you updated as more information becomes available.
