By Brent Kendall and John D. McKinnon
WASHINGTON — The Justice Department is set to propose a rollback of legal protections that online platforms have enjoyed for more than two decades, in an effort to make tech companies more responsible in how they police their content, according to a Trump administration official.
The department’s proposed changes, to be unveiled as soon as Wednesday, are designed to spur online platforms to be more aggressive in addressing illicit and harmful conduct on their sites, and to be fairer and more consistent in their decisions to take down content they find objectionable, the official said.
The Justice Department proposal is a legislative plan that would have to be adopted by Congress.
The move represents an escalation in the continuing clash between the Trump administration and big tech firms such as Twitter Inc., Alphabet Inc.’s Google unit and Facebook Inc.
Last month, President Trump signed an executive order targeting the legal protections of social-media companies, responding to concerns among some conservatives about alleged online censorship by the platforms. The order sought to impose limits on legal immunity for social-media companies when they are deemed to unfairly curb users’ speech, for instance by deleting their posts or suspending their accounts. The administration, however, can’t impose many of these changes unilaterally.
The Justice Department’s proposed changes will address the type of speech concerns raised by Mr. Trump, but they also extend more broadly, seeking to strip civil immunity afforded to tech companies in a range of other circumstances if online platforms are complicit in unlawful behavior taking place on their networks, the administration official said.
The department’s proposal, for instance, would remove legal protections when platforms facilitate or solicit third-party content or activity that violates federal criminal law, such as online scams and trafficking in illicit or counterfeit drugs.
Internet companies would lose immunity if they have knowledge that unlawful conduct is taking place on their platforms or show reckless disregard for how users are behaving on their sites. Without those legal protections, tech companies could be exposed to claims for monetary damages from people allegedly harmed by online fraud and other illegal activity.
The department’s proposal also wouldn’t confer immunity on platforms in instances involving online child exploitation and sexual abuse, terrorism or cyberstalking. Those carve-outs are needed, the official said, to curtail internet companies’ immunity so that victims can seek redress.
Attorney General William Barr has repeatedly voiced concerns about online-platform immunity, citing, for example, a terrorism case in which courts ruled Facebook wasn’t civilly liable even though its algorithms allegedly matched the Hamas organization with people who supported its cause.
The Justice Department also will seek to make clear that tech platforms don’t have immunity in civil-enforcement actions brought by the federal government, and can’t use immunity as a defense against antitrust claims that they removed content for anticompetitive reasons.
Twitter and Facebook representatives on Wednesday reiterated their past statements in support of longstanding legal protections.
Twitter last month said removing the protections would “threaten the future of online speech and Internet freedoms.” Facebook has said that cutting platform immunity would restrict more speech online “by exposing companies to potential liability for everything that billions of people around the world say.”
The sweeping protections now enjoyed by tech companies were established by Congress in the internet’s early days, through a provision known as Section 230 of the Communications Decency Act of 1996. Under that law, tech platforms are generally not legally liable for actions of their users, except in relatively narrow circumstances. Internet platforms also are given broad ability to police their sites as they see fit under the current law.
Those protections would be scaled back in significant ways under the Justice Department’s proposal, which seeks, in essence, to prevent platforms from taking down content without offering reasonable rules and explanations — and following them consistently. It also would make platforms more responsible for third-party content in other areas such as online commerce.
The proposal’s restrictions on platforms’ content-moderation practices would be extensive.
For instance, the department will propose to strike from federal law a provision that allows platforms to delete content that they merely deem to be “objectionable.”
The proposal also would give some teeth to an existing “good faith” standard that platforms are supposed to use in their content-moderation decisions. The aim would be to require platforms to adhere to their terms of service as well as their public claims about their practices. Platforms also would have to provide reasonable explanations of their decisions.
Section 230’s broad protections have drawn increasing criticism from both the right and the left in recent years. Many critics contend the protections are outdated and no longer needed in an age of internet giants.
For their part, the tech companies contend that Section 230 is fundamental to the internet economy’s smooth functioning.
Some lawmakers — including House Speaker Nancy Pelosi of California — have begun weighing rollbacks of Section 230. A bipartisan group of senators currently is pushing legislation encouraging internet companies to take special steps to block online child sexual exploitation in order to qualify for full protection.
Mr. Trump’s executive order focused on encouraging more action to curb Section 230 by federal regulators, including the Federal Communications Commission and the Federal Trade Commission. It also seeks to convene a working group of state attorneys general to look into complaints by users.
As expected, the order was quickly challenged in federal court by an online-rights group; that challenge remains pending.