Section 230 has been up in everybody’s grill lately. Yes, Mark and Jack and all have been talking about it in front of Congress and everywhere else. But it affects big tech and small bootstrapped companies alike.
In fact, any company with a website that allows user generated content is impacted.
So here is a quick primer on what the law means. Section 230 gives online platforms two things:
- The power to decide how to moderate content on their own platforms; and
- Protection from liability for user-generated content, i.e., user posts.
The little-known backstory of Section 230
It’s 1995. An anonymous user on the then-popular platform Prodigy (an AOL competitor) described the chop shop Stratton Oakmont as a "cult of brokers who either lie for a living or get fired." Refresher: Stratton Oakmont was the brokerage firm of Jordan Belfort, the Wolf of Wall Street.
Stratton sued Prodigy for libel. Because Prodigy moderated its message boards, the court treated it as the publisher of the post it had failed to catch, exposing the company to hundreds of millions of dollars in liability.
Congress responded by passing Section 230 of the Communications Decency Act in 1996, which effectively overturned the result in the Wolf of Wall Street case.
What would happen if Section 230 were repealed?
Any company that wanted to avoid being sued over user-generated content on its site or app would have to review that content before publishing it, which in effect would make the entire business model unworkable.
No more tweets, YouTube videos or Facebook posts. Reviewing all of that content in advance isn’t feasible, so to avoid lawsuits, platforms would simply stop allowing it. Bye-bye free speech.
The takeaway. Of course, it’s an extremely nuanced situation. There are obvious pros and cons to Section 230 as it stands, and neither leaving it untouched nor repealing it entirely is workable. Some commentators have proposed regulation along the lines of what the EU is doing with the Digital Services Act, but let’s see what 2021 holds in store in this arena.