Section 230 won't shield TikTok, Snapchat, Meta from lawsuit

Main: U.S. District Judge Yvonne Gonzalez Rogers denied the majority of a motion to dismiss filed by various social media platforms Tuesday. As a result, tech companies will face claims that their platforms are “defective.” (Screengrab via YouTube). Inset: This combination of photos shows logos of Snapchat and TikTok. (AP Photo, File)

Meta, Google, TikTok, and Snapchat must face a major lawsuit over their allegedly “defective” platforms that plaintiffs say cause millions of kids to become addicted. The case, which is awaiting class action certification, survived a motion to dismiss Tuesday despite the companies’ argument that they are entitled to immunity under federal law.

U.S. District Judge Yvonne Gonzalez Rogers, a Barack Obama appointee, ruled Tuesday that Section 230 of the Communications Decency Act of 1996 does not shield the social media giants from products liability claims.

The lawsuit, filed by the Social Media Victims Law Center, alleges that the social media companies target children with platforms purposely designed to prey upon kids’ limited impulse control. The complaint charges that the alleged harms range from excessive screen time to the promotion of inappropriate sexual content, dangerous child-adult connections, geolocation, and more.

The defendant companies argued, much as they have in many other cases, that Section 230 bars the plaintiffs’ claims in their entirety because they are not “publishers” of third-party content within the legal meaning of the term.

Rogers rejected what she called the defendants’ “all-or-nothing” position and called Section 230 “more nuanced” than the companies contended. She said the lawsuit raised issues about “a wide array of conduct” that would constitute a failure to create safe products or to warn about defects. As examples, the judge cited the failure to provide effective parental controls or options for users to restrict their own time on the platforms, the lack of robust age verification, the difficulty users face in reporting predator accounts, the use of appearance-altering filters, and the organization of notifications “in a way that promotes addiction.”

Rogers said in her 52-page ruling that the defendant platforms did not sufficiently respond to the plaintiffs’ allegations, offering as an example Snapchat’s unconvincing argument that it is not a social media platform at all but rather just “a camera application.”

Most, though not all, of the plaintiffs’ claims against the tech companies survived. Rogers granted the defense motion to dismiss as to claims about some algorithm and notification features.
