Generally speaking, there are two kinds of corporate players on the internet: companies that build infrastructure through which content flows, and companies that seek to curate content and create a community.
Internet service providers like Verizon and Comcast, domain name registrars, web hosts and security service providers like Cloudflare are all the former — the “pipe.” They typically don’t look at the content their clients and customers are putting up; they just give them the means to do it and let it flow. Social media platforms like Facebook are the latter. They encourage their users to create, share and engage with content — so they look at content all the time and decide whether they want to allow hateful material like that of neo-Nazis to stay up.
While there have long been worries about internet service providers favoring access to some content over others, there has been less concern about companies further along the pipeline holding an internet on/off switch. In large part, this is because at other points in the pipeline, users have choice. Private companies can make their own rules, and consumers can choose among them. If GoDaddy won’t register your domain, you can go to Bluehost or thousands of other companies.
But the fewer choices you have for the infrastructure you need to stay online, the more serious the consequences when companies refuse service. This is why Cloudflare’s decision to drop The Daily Stormer is so significant. Denying security service to one Nazi website seems fine now, but what if Cloudflare started suspending service for a political candidate that its chief executive didn’t like?
With this move, Cloudflare is wading into the business of evaluating the content of its clients — something sites like Facebook and Twitter have been wrestling with for years, leading them to develop complex rules and procedures that govern what users are and are not allowed to post. Most agree that it’s appropriate for social media companies to take down certain kinds of content — that’s how they ensure our newsfeeds aren’t full of pornography or violence. But that doesn’t mean we don’t want that type of content to be able to exist somewhere on the internet. Ensuring that sites like Cloudflare remain content-neutral might be necessary to guarantee that.
One of the additional difficulties with Cloudflare is that it is not so much a piece of pipe as it is a service. Specifically, it is a paid-for-protection service. Having to hire Cloudflare to protect your website is like having to hire security to protect you from attackers when you speak in the public square. If that security service is the only one in town, and you’ll be silenced if you try to speak without it, maybe that security service shouldn’t pick and choose whom it protects. While regulation should not be done lightly or broadly, there’s a case to be made that we should treat Cloudflare more like the police, who are supposed to equally protect all members of the public.
Last week, Matthew Prince, Cloudflare’s chief executive, acknowledged how much power his company has, and what’s at stake. “The internet is a really important resource for everyone,” he said in an interview with TechCrunch, “but there’s a very limited set of companies that control it and there’s such little accountability to us that it really is quite a dangerous thing.”
This is the most troubling realization surrounding Cloudflare’s removal of a site from the internet: There’s a lack of accountability at every part of the pipeline, and on the platforms as well. The people who run these companies are not elected officials, yet we still expect them to safeguard our basic liberties while also meeting our cultural expectations. For the most part they do, both because it’s good business to meet the expectations of their users and because most have praiseworthy goals of corporate responsibility. But beyond these minor checks on these companies’ power, we as users have no way to ensure they meet our needs — and we have no idea what site they’ll take down next.