Michael Long
1 min read · Sep 3, 2024


Section 230 is not just about "the algorithm". It's about being held liable for anything and everything some user posts on your website.

Including comments like this one.

If this comment had been CSAM, then all Section 230 says is that Medium isn't liable because someone else put it there. I'm liable. Medium is not.

Pretty simple.

Eliminating Section 230 fundamentally kills discourse on the internet because every comment and post now requires moderation... and moderation comes at a cost.

You danced around the point by claiming that AI can solve (yet another) problem. But AI-based services aren't perfect... and they aren't cheap.

Again, take Medium as an example. Will they test every article posted? Will they test every comment? Every comment on a comment?

If you have a free website where you post content and currently allow comments... are you going to personally pay hundreds or thousands of dollars a month to keep allowing them?

You don't like TikTok blackout challenges? Fine. What about dance challenges? What about the Ice Bucket Challenge? Is that one okay? What happens when TikTok is sued because someone tried it and went into cardiac arrest... or died when the bucket landed on their head?

Platforms shouldn't be allowed to promote content that's illegal or harmful, you say.

Who decides? There's a very, very fine line here.
