Judge tosses social platforms’ Section 230 blanket defense in child safety case


This week, some of the biggest tech companies learned that Section 230 immunity won't shield them from many of the claims alleging that their social media platforms' designs are defective and harm children and teen users.

On Tuesday, US District Judge Yvonne Gonzalez Rogers ruled that discovery can proceed in a lawsuit consolidating hundreds of individual cases of children and teens allegedly harmed by social media use across 30 states. The plaintiffs' complaint alleged that the tech companies negligently operated platforms with numerous design defects, including a lack of parental controls, insufficient age verification, needlessly complicated account-deletion processes, appearance-altering filters, and requirements forcing users to log in before reporting child sexual abuse material (CSAM), and that the companies failed to warn young users and their parents about those defects.

Defendants are the companies operating "the world's most used social media platforms: Meta's Facebook and Instagram, Google's YouTube, ByteDance's TikTok, and Snapchat." All of these companies moved to dismiss the multidistrict litigation entirely, hoping that the First Amendment and Section 230 immunity would effectively bar all of the plaintiffs' claims, including, apparently, claims that their motions to dismiss never addressed.
