Web-based hosts, platforms, and businesses need a plan for how they’ll handle issues around free speech and privacy rights for their users and customers.
Politics and the web are intersecting more and more. In recent news, at least three WordPress-related companies have received broad media attention.
In just a few days, both GoDaddy and Automattic have shut down sites for violating their terms and conditions, and DreamHost has received significant attention for refusing to release site visitor information to the US Department of Justice.
I think the most relevant angle for this website is to note that it's important for web-based services to be prepared for the unexpected news cycles that revolve around web-based properties.
How well does your PR team know your terms and conditions? What's your stance on free speech, and when can that cross a line into speech or content that your service is ready to limit? The definitions can be narrow; let's look at Automattic's decision to shut down a site called Blood and Soil.
It's a despicable site, and has been for a while. Automattic is aware of the sites that exist on WordPress.com, and this isn't the first time an objectionable site there has drawn heavy backlash from advocacy groups. For instance, Guccifer 2.0, the person or group that hacked the Democratic National Committee, was on WordPress.com and still is. There are countless others, some hacking-related, some simply vile or hate-filled.
So what makes a site cross the line for a particular service? GoDaddy's Ben Butler described to Fast Company that they draw the line between speech and violence:
“We strongly support the First Amendment and are very much against censorship on the internet,” writes Ben Butler, director of the Digital Crimes Unit for GoDaddy, in an email. He adds that, “if a site promotes, encourages, or engages in violence against people, we will take action.”
The GoDaddy decision (which Google soon matched) was especially interesting because GoDaddy acted as the domain registrar, not the content host; it wasn't actually providing the hosting service.
Automattic, by contrast, hosts the sites on WordPress.com, and its user guidelines draw a similar line:

Do not post direct and realistic threats of violence. That is, you cannot post a genuine call for violence—or death—against an individual person, or groups of persons. This doesn’t mean that we’ll remove all hyperbole or offensive language.
They also have a specific policy (not directly linked from their ToS) for terrorist activity, and a provision to allow them to remove content or users for any reason.
The terrorist in Charlottesville aligned himself with Blood and Soil, prompting Automattic to pull the plug: the line had been crossed.
DreamHost's pushback against the government was also rooted in First Amendment concerns, primarily on behalf of site visitors:
The request from the DOJ demands that DreamHost hand over 1.3 million visitor IP addresses — in addition to contact information, email content, and photos of thousands of people — in an effort to determine who simply visited the website. (Our customer has also been notified of the pending warrant on the account.)
That information could be used to identify any individuals who used this site to exercise and express political speech protected under the Constitution’s First Amendment. That should be enough to set alarm bells off in anyone’s mind.
Every host deals with requests that may not demand visitor information but definitely do demand account information. Automattic's Paul Sieminski provided a helpful post on the types of requests they receive and how they handle them.
The US has broad protections built into the First Amendment covering free speech. Platforms are not required to meet those protections; however, many are staunch supporters of the First Amendment. Those protections often cover some of the most unpopular types of content. The Supreme Court has ruled there's no hate speech exception in the First Amendment, and this ruling was cited recently in a trademark case.
I think the author of the above-cited op-ed makes a good point:
We can and should speak up against hate. As the Supreme Court makes clear, there’s no hate speech exception to the First Amendment. With that freedom comes a heavy burden for government officials like Baker and Walsh, who must try to keep protected speech from turning into acts of violence.
The burden is also heavy for platforms dedicated to providing a place for unpopular opinions. There are many times when the unpopular opinion, or anti-government opinion, is incredibly important to protect. But when speech spills over into violence, I believe platforms have not only a right but also a responsibility to take a stand.
It's important for organizations to be educated about, and consistent with, their own terms of service, company-wide. I'm afraid these hard questions about speech, rights, and responsibility will be common for a while to come. And given how fast information spreads (the calls for GoDaddy to shut down a hate site this week came in a fury, driven by a Twitter post that quickly went viral), acting quickly and consistently will be incredibly important.
I've talked about platforms and services with some control over their user base. The obvious other side of this is that there is a whole segment of our community with no control over their users. Your theme, plugin, and WordPress itself can be used without permission by absolutely anyone, and of course that's by design. WordPress or a WordPress-related product could be identified and criticized virally for enabling objectionable users and content.
As a community, are we prepared to respond to that?
PS: If you're a journalist writing about WordPress.com and issues like these, please understand the difference between WordPress.com, owned by Automattic, and WordPress the software. I wrote a handy guide for you.