- By Joshua Keating
Joshua Keating was an associate editor at Foreign Policy
Last week I wrote posts considering both what we can discern about the foreign-policy ideology of Silicon Valley firms and Facebook’s evolving stance on the censoring of hate speech. I’ve been thinking a bit more about these issues in light of the events of the last few days in Turkey and writing by some of the leading commentators on the Internet and democracy.
Over at Slate last week, Jillian York of the Electronic Frontier Foundation responded to Facebook’s promise to extend its ban on hate speech to gender-based hate, arguing that the company is "merely unequipped to deal with the sheer number of complaints it receives on a daily basis" and that attempts to do so would simply lead to overreach and the deletion of legitimate political content. Facebook, she argues, should not be in the position of policing speech:
Setting aside concerns about Facebook’s procedures, there is a bigger question: Should private companies be determining what constitutes “hate speech”? In the United States (where Facebook is based), most of the speech flagged by Women, Action & the Media as offensive is, while abhorrent, protected by law. And while Facebook may be private, many of its users treat it like the new town square, making it more of a quasi-public sphere. While the campaigners on this issue are to be commended for raising awareness of such awful speech on Facebook’s platform, their proposed solution is ultimately futile and sets a dangerous precedent for special interest groups looking to bring their pet issue to the attention of Facebook’s censors.
I was also interested to see the "town square" metaphor appear in a recent paper by MIT’s Ethan Zuckerman. In the paper, Zuckerman revisits his "cute cat theory," originally devised before the Arab Spring, which argued that consumer-targeted social networks are often more useful for activists than applications designed specifically for political uses because they are difficult for governments to censor without also censoring innocuous content. Zuckerman goes on to consider some of the dangers of relying on networks maintained by private corporations for political activism:
By relying on corporate-owned social media platforms, activists are vulnerable to the policies and politics of the platform owners. While platforms like YouTube, Twitter or Facebook may feel like public spaces, in practice, they’re private property, controlled by contracts of adhesion users must agree with to participate in the space.
Speech on these platforms is less like holding a rally in a public park; it’s more like giving a speech in a shopping mall, an idea that has been widely discussed in the U.S. Supreme Court, notably in Lloyd Corp. v. Tanner and Pruneyard Shopping Center v. Robins, with courts finding that private actors have a great deal of control over speech that takes place on their property.
Platforms like YouTube and Facebook have inadvertently censored activist speech due to their limited ability to evaluate content in non-English languages. YouTube removed videos posted by activist Wael Abbas that documented abuses by the Egyptian police, while Facebook repeatedly removed Wael Ghonim’s "We Are All Khaled Said" group, which was used to help organize the Tahrir Square protests.
The ongoing protests in Turkey are perhaps an even better example of social media’s potential political impact than the Arab Spring events, since this time the majority of the activity is actually being generated by people on the ground. As NYU Ph.D. candidates Pablo Barbera and Megan Metzger write in a guest post over at the Monkey Cage,
Unlike some other recent uprisings, around 90% of all geolocated tweets are coming from within Turkey, and 50% from within Istanbul (see map below). In comparison, Starbird (2012) estimated that only 30% of those tweeting during the Egyptian revolution were actually in the country. Additionally, approximately 88% of the tweets are in Turkish, which suggests the audience of the tweets is other Turkish citizens and not so much the international community.
In addition to the old standbys — Facebook, YouTube, and Twitter — protesters are also now finding innovative ways to use newer programs like Instagram and Vine to document what’s happening in Istanbul and other Turkish cities. Prime Minister Recep Tayyip Erdogan has lashed out in response, saying that "social media is the worst menace to society."
As exciting as all this activity is, Zuckerman’s shopping mall warning is still worth keeping in mind. The big social networking sites are designed to generate revenue for the companies that set them up, not to serve as platforms for social organizing. The companies’ and the protesters’ interests may sometimes align, but at other times they may not.
It would be easy to say, so what? Facebook is designed for the cute cat people, not the protesters in Egypt or Turkey, and the company has no special obligations just because people happen to use it for political purposes. But there’s abundant evidence that these companies don’t take this view. In a 2012 letter to shareholders, for instance, Mark Zuckerberg wrote:
By giving people the power to share, we are starting to see people make their voices heard on a different scale from what has historically been possible. These voices will increase in number and volume. They cannot be ignored. Over time, we expect governments will become more responsive to issues and concerns raised directly by all their people rather than through intermediaries controlled by a select few.
Through this process, we believe that leaders will emerge across all countries who are pro-internet and fight for the rights of their people, including the right to share what they want and the right to access all information that people want to share with them.
If Facebook is going to take credit for creating a new kind of global politics, it does have some obligations to the political actors who use it. I don’t think it’s necessarily impossible to enforce some basic content guidelines while still preserving Facebook as a space for political action — though mistakes and overreach will no doubt occur — but the company needs to do something it hasn’t been that great at in recent years: make its guidelines very clear and inform users in advance when they change.
Companies may have the right to set rules for how their products are used, but those rules need to be clear and consistent, and users need to know what they are.