Can you have child safety and Section 230, too?

Casey Newton · 14 min read
Technology · Regulation · Politics

AI Summary

The newsletter analyzes recent jury verdicts against Meta and YouTube that found them liable for child safety harms through platform design features like infinite scroll and push notifications. The author argues these rulings create a path around Section 230 protections by targeting design rather than content, while defending the distinction between regulating mechanical features versus speech content.

Key Facts

Recent jury verdicts found Meta and YouTube liable for child safety harms through design features like infinite scroll and push notifications, creating a new path around Section 230 protections.
Legal scholars worry these rulings could force platforms to dramatically restrict speech and harm online communities, with Eric Goldman warning of existential legal liability for social media companies.
The author argues mechanical design features like autoplay video and midnight push notifications can be regulated without violating First Amendment protections for content.

Author Takes

Neutral · Casey Newton

Platform design regulation

Both content and design are necessary for people to be harmed at scale, and in a country where the Constitution prevents regulating content, you can only regulate design.

Skeptical · Casey Newton

Section 230 defenders

Rejects the argument that every design decision is a content decision protected under the First Amendment.

Bearish · Casey Newton

Kids Online Safety Act

Remains skeptical because it could too easily be expanded to include bans on material about dissidents, LGBT people, and other disfavored groups.

Contrarian Angle

WhatsApp replacing Instagram encryption

Meta discontinued encryption in Instagram, directing people to WhatsApp instead.
