A former content moderator for Chaturbate has filed a class-action lawsuit against the adult livestreaming platform and its parent companies, claiming he developed post-traumatic stress disorder (PTSD) due to prolonged exposure to disturbing content without access to mental health safeguards.
Neal Barber, who was employed in 2020 by Bayside Support Services and Multi Media LLC—the owner of Chaturbate.com—filed the suit on July 22 in a U.S. District Court. Barber alleges that during his time as a content reviewer, he was subjected daily to “extreme, violent, graphic, and sexually explicit content” and that the companies failed to implement basic industry protections against psychological harm.
“Without these safeguards, Mr. Barber eventually developed full-blown PTSD, which he is currently still being treated for,” Barber’s attorney Chris Hamner said in a statement.
The lawsuit marks the first known legal action against Chaturbate over content moderation practices. It seeks class-action status on behalf of others employed in similar roles over the past four years.
According to the complaint, Barber worked as a “Customer Service Risk Supervisor,” a job title Chaturbate allegedly uses for its moderation team. His responsibilities included reviewing live-streamed shows and user reports, which often involved content depicting child exploitation, self-harm, extreme violence, and non-consensual acts.
“These moderators serve as the first line of defense,” the lawsuit states, “against illegal, unsafe, and abusive content, and without them, the site would become unmanageable and legally vulnerable.”
The suit argues that, despite this, Chaturbate failed to provide its moderators with essential industry-standard safeguards against psychological harm.
“The class action we have filed seeks redress for Mr. Barber and other content moderators like him,” said Hamner, citing a “breach of duty of care” as the basis for the claim.
Barber says the absence of protective measures resulted in severe psychological distress, including “vivid nightmares, panic attacks, emotional detachment, and long-term trauma consistent with PTSD.”
The complaint claims these symptoms stem from a work environment where exposure to harmful content was routine and unavoidable, with little to no emotional support infrastructure in place.
Barber’s case adds to a growing body of litigation brought by content moderators across various platforms, many of whom allege serious mental health consequences from prolonged exposure to graphic material. Although lawsuits focused explicitly on adult content moderation are rare, several key cases, most notably those involving Facebook and YouTube, have established legal precedent.
Barber’s lawsuit is notable for being the first of its kind to target an adult-only content platform directly, and it highlights how even platforms built entirely around sexually explicit content may fall short in supporting those tasked with monitoring it.
Although many moderation lawsuits have involved broader social platforms like Facebook and YouTube, the Chaturbate case stands out for its focus on an ecosystem where adult content is the norm, rather than the exception. If granted class-action status, it could pave the way for future lawsuits by moderators in the adult industry, a sector that has so far seen limited legal scrutiny.
As with the Facebook and YouTube cases, the Chaturbate suit argues that the employer failed to provide even basic workplace protections known to mitigate psychological harm, a key component in prior settlements. The absence of industry-wide standards for moderation in adult content may now face legal and public pressure for reform.
Multi Media LLC has acknowledged the lawsuit but has declined to comment directly.
“The company has not been served nor has it reviewed the complaint and therefore cannot comment on the matter at this time,” a spokesperson said. “With that said, it takes content moderation very seriously, deeply values the work of its moderators, and remains committed to supporting the team responsible for this critical work.”
Chaturbate is no stranger to legal challenges. Earlier in 2025, the company agreed to a $675,000 settlement with the state of Texas over violations related to age verification. In May, it faced a separate lawsuit in Kansas after a woman alleged her teenage son accessed the site via an unlocked laptop.