Traditionally, the holiday season is when movie theaters open Hollywood’s biggest films. Alas, not this year. But in light of recent events, we’re reminded of a blockbuster that opened nationally on Christmas Day, 2013: The Wolf of Wall Street.
You may recall that Leonardo DiCaprio starred as the real-life Jordan Belfort, whose New York stock brokerage, Stratton Oakmont, was eventually exposed as a massive fraud. When the firm was shut down by authorities, he went to prison.
The real-life fraud lasted for more than seven years before its perpetrators were caught. During that time, however, whistleblowers tried to raise the alarm. On a popular online platform called Prodigy, one person described Stratton Oakmont as a “cult of brokers who either lie for a living or get fired.”
The nightmare of liability for websites
That turned out to be true, in Technicolor. But that didn’t stop Stratton Oakmont and Jordan Belfort from suing Prodigy for libel. They sought hundreds of millions of dollars in damages, simply because Prodigy had hosted the comment on its platform.
Prodigy argued it should not be responsible for the content its users create. It had no way of knowing whether Stratton Oakmont was a fraud or not and had never expressed an opinion on the subject. But a New York court held in favor of the real-life Wolf of Wall Street, exposing Prodigy to enormous liability.
The court specifically cited Prodigy’s efforts at content moderation, aimed at prohibiting online harassment, as the reason for treating it differently than online platforms where “anything goes.” If Prodigy had not attempted to stifle swearing, bullying and “grossly repugnant” content, the court stated, it would not have been liable for damages.
The alarming message of this case was clear: in the future, online platforms shouldn’t attempt to moderate even the most awful content. Doing so would make them legally responsible for everything their users post.
This was a prescription for turning every online platform hosting user-created content into a vulgar and dangerous place.
When that court ruling was publicly reported in 1995, it drew the attention of a bipartisan group in Congress who were wrestling with the difficult questions surrounding content moderation, privacy, free speech, and the dark side of cyberspace.
The result of that collective, year-long effort was Section 230 of the Communications Decency Act. The law overturned the result in the Wolf of Wall Street case, by protecting “good Samaritans” who attempt to keep cyberspace safe for all.
Fast forward to 2020. President Trump has promised to veto the National Defense Authorization Act, even though it passed the House and Senate with veto-proof majorities, in order to draw attention to his concerns with social media platforms that have flagged his content. As a condition for signing the bill, he has called for the complete repeal of Section 230.
Repealing Section 230 would be chaos
But repealing the law entirely would return us to the legal no-man’s land that necessitated Section 230 in the first place. It can’t be that every one of the over 200 million websites available to Americans — all of them governed by Section 230 — will have to either stop publishing their users’ contributions, or let “anything go” — no matter how gross or illegal. The whistleblowers of today would be shut out from sites like Yelp, Glassdoor, TripAdvisor, or any investment message board, all of which depend on Section 230 to host user reviews and content.
For months, Congress has been sifting through proposals to fine-tune Section 230 for today’s internet and today’s unique challenges. This is by far the wiser course. But it is a difficult business, because for every problem solved there is a new one created.
Example: if platforms are made responsible for everything millions of users post on their sites, they will have to read it all first. This would mark the end of the internet as a forum for real-time communication.
It would also force every website hosting user content to create round-the-clock legal and editorial review teams staffed with hundreds or thousands of people to continually monitor every message, video, photo, and blog. Alternatively, websites would face exorbitant legal damages at every turn. That is not realistic.
More realistic is that the many online avenues that ordinary citizens currently use to express themselves would be closed. Hosting user-created content would be too costly and risky. It is difficult to imagine a scenario more chilling of individual speech and the public’s right to know.
By all means, Congress should examine whether it’s possible to amend Section 230 without doing more harm than good. But in the process, we must be careful what we wish for. Because sometimes, those complaining about online speech are doing worse than crying wolf.
Ron Wyden is a senator from Oregon. He is the ranking member of the Senate Finance Committee and a senior member of the Senate Intelligence Committee. Chris Cox is a former U.S. representative from California and former chairman of the Securities and Exchange Commission. He is a partner in the international law firm of Morgan, Lewis & Bockius and outside counsel to the internet trade association NetChoice.