
TikTok Loses Section 230 Protection in “Blackout Challenge” Death Suit


A federal appeals court has dealt a significant blow to TikTok’s legal defenses, ruling that Section 230 of the Communications Decency Act does not shield the social media giant from liability in a lawsuit filed by the mother of a 10-year-old girl who died participating in the dangerous “Blackout Challenge.”

The lawsuit, filed by Tawainna Anderson, alleges that TikTok’s algorithm promoted the challenge on her daughter Nylah’s “For You” page (FYP), leading to her tragic death. The “Blackout Challenge” encourages users to choke themselves until they pass out.

The Third Circuit Court of Appeals reversed a lower court’s decision, finding that TikTok’s algorithm, which curates and recommends content, constitutes the platform’s own “expressive activity” and is therefore not protected by Section 230. The ruling marks a sharp departure from previous interpretations of Section 230, which has typically shielded online platforms from liability for content posted by third parties.

Judge Patty Shwartz, writing for the court, reasoned that TikTok’s algorithm goes beyond simply hosting third-party content: it actively selects and organizes videos, creating a curated experience for each user. That curation, the court found, transforms TikTok into an “affirmative promoter” of the harmful content.

In a concurring opinion, Judge Paul Matey criticized TikTok’s reading of Section 230, writing that the statute does not license companies’ “casual indifference to the death of a 10-year-old girl.” He emphasized that TikTok’s knowledge of the “Blackout Challenge” and its dangers, coupled with its alleged inaction, should hold the company accountable.

The ruling allows Anderson to pursue claims against TikTok based on its algorithm’s role in promoting the deadly challenge. It also sets a precedent for future cases involving platform algorithms and their potential liability for harmful content.

This decision could have far-reaching consequences for social media companies and the interpretation of Section 230. It suggests that platforms may be held responsible for the content they promote through their algorithms, particularly when that content poses a risk to minors.

Section 230 of the Communications Decency Act has been a cornerstone of internet freedom, protecting online platforms from liability for content posted by their users. However, this ruling highlights the ongoing debate surrounding the scope of Section 230 and its application in the evolving digital landscape.

The case now returns to the district court, where Anderson’s remaining claims will be considered. Legal experts and the tech industry alike are expected to watch it closely, as the outcome could significantly shape the future of online content moderation and platform liability.

