Elon Musk’s Platform X Faces Renewed Legal Case Over Child Exploitation Video

San Francisco – Elon Musk’s social media platform X (formerly Twitter) is once again under legal scrutiny in a disturbing case involving child sexual abuse material (CSAM). While the lawsuit was originally filed in 2022, before Musk's acquisition of the company, a U.S. federal appeals court has now partially revived the case, allowing specific claims to proceed.

According to a report by a leading news agency, the lawsuit alleges that X has become a haven for child exploitation, despite the protections federal law offers online platforms.

Court Restores Key Claim Against X

The U.S. Court of Appeals for the Ninth Circuit, based in San Francisco, upheld the lower court’s dismissal of some claims but restored a critical allegation: that X failed to report a sexually explicit video featuring two minor boys to the National Center for Missing and Exploited Children (NCMEC) in a timely manner.

The explicit video—which reportedly originated from a case of online child grooming and extortion—remained on Twitter for nine days, accumulating over 167,000 views before it was removed, according to court documents.

Background of the Case

One of the plaintiffs, who was a minor at the time, stated he was manipulated into sharing a nude photo on Snapchat with someone he believed to be a 16-year-old girl from his school. The recipient was, in fact, a child pornography trafficker, who blackmailed the boy into sending more explicit images. These materials were then used to create a video that was later uploaded and circulated on Twitter.

The lawsuit argues that Twitter (now X) did not act swiftly to remove the content despite being made aware of its existence, and delayed reporting it to NCMEC, in violation of federal child-safety reporting requirements.

Court's Interpretation of Section 230

Judge Daniel Bress, writing for the three-judge panel, emphasized that while Section 230 of the Communications Decency Act generally protects platforms from liability for user-generated content, it does not shield them from negligence claims once they become aware of illegal content.

“When a platform has actual knowledge of child sexual abuse material, it has a duty to act,” said Judge Bress.

The court also ruled that X must face additional claims related to its internal systems, which allegedly make it unnecessarily difficult to report child pornography.

Legal Representation and Advocacy

Attorney Dani Pinter from the National Center on Sexual Exploitation, representing the plaintiffs, welcomed the court's decision, stating:

“This ruling allows us to move forward with legal discovery and investigation. Justice and accountability are essential for the survivors and to ensure such negligence is not repeated.”

Conclusion

As X continues to face intense scrutiny over user safety, child protection, and content moderation, this case could set a precedent for how social media platforms are held accountable for their response to CSAM reports. Amid growing calls for responsible digital governance, platforms face mounting pressure to prioritize child safety alongside free speech.
