Why did Facebook censor this photograph?
The Uprising of Women in the Arab World is not pleased with Facebook.
The group, which advocates for women’s rights in the Middle East, issued a press statement on Nov. 7 claiming that Facebook, once hailed as the catalyst of the Arab Spring, was purposefully targeting the organization through censorship. After a member posted a controversial photograph to the group’s Facebook page on Oct. 25, group leaders say, the social networking giant reacted by blocking the image and suspending the account of the administrator who posted it for 24 hours.
"The photograph was part of a campaign which asks the members of our Facebook page to post pictures of themselves holding banners that explain why they support The Uprising of Women in the Arab World," Diala Haider, one of the organization’s administrators, explained in an interview. "Women from all the Arab countries participated and expressed their demands and outrage at social discrimination and the ways in which women have been marginalized in the public sphere."
This particular photograph was posted by Dana Bakdounes, a young woman from Syria. In it, Bakdounes is pictured with her hair uncovered, holding her passport, which has a photo of her wearing a hijab. She also holds a sign which reads: "I am with the uprising of women in the Arab world because for 20 years I was not allowed to feel the wind in my hair and on my body."
Haider says that after the first time Bakdounes’ photo was removed by Facebook, supporters of The Uprising of Women in the Arab World responded by posting the image to their own Facebook pages and on Twitter. Convinced that the removal of the photograph had been an error on the part of Facebook, one of the administrators, Yalda Younes, reposted the image to the original page. Facebook then allegedly removed the photograph again and suspended her account for seven days. The group filled out a feedback request stating that Facebook’s actions were a violation of free speech, and on Oct. 31 the block on Bakdounes’ photo was removed. But just a week later, after the organization posted a status update on Facebook asking its supporters to follow the group on Twitter and use the hashtag #DanaWind for solidarity, Haider says Facebook suspended all five of the administrators’ accounts and sent them an official notice warning that their accounts could be deleted if they violated Facebook community rules again.
"We’ve had a lot of religious fanatics and extremists who use offensive and insulting language in reaction to our efforts," says Haider. "They call us infidels for supporting the freedom of women to choose things like whether to wear a veil or not. We’ve come under attack, but that was expected…. The real surprise was Facebook’s reaction to the page."
In a statement posted on Reddit on Nov. 13 and confirmed to Foreign Policy as official by a Facebook spokesman, Facebook explained that the incident was simply an error:
"We made a mistake," the statement reads. "In this case, we mistakenly blocked images from The Uprising of Women in the Arab World Page, and worked to rectify the mistake as soon as we were notified…. To be clear, the images of the woman were not in violation of our terms. Instead, a mistake was made in the process of responding to a report on controversial content…. What made this situation worse is that we made multiple mistakes over a number of days, and it took time to rectify each of these missteps."
Incidents such as the removal of Bakdounes’ photo raise questions about Facebook’s content moderation system, which has come under fire in recent months. In February, Amine Derkaoui, a Moroccan employee of oDesk, one of the outsourcing firms that Facebook used to moderate its content at the time, leaked internal documents to Gawker detailing the social media site’s content guidelines. According to the documents, while "camel toes" and breastfeeding mothers are off limits, "Crushed heads, limbs etc are OK as long as no insides are showing." Facebook terminated its partnership with oDesk in May.
An incident similar to the removal of Bakdounes’ photo occurred in April 2011, when a photograph of gay men kissing was removed (and subsequently reposted by Facebook with an apology for its "error"). The site has also been criticized for blocking the New Yorker's Facebook page after the magazine posted a cartoon that depicted female nipples. In October, a group of Navy SEALs claimed that Facebook was censoring an anti-Obama meme when it took down the image and provided no explanation for its removal until after the story was reported — at which point Facebook issued statements to news outlets apologizing for its mistake.
These episodes begin to make more sense when you factor in the system that, at least until May, Facebook used to moderate its content. Derkaoui told Gawker that he was part of a team of 50 people from across the globe — many from poor countries — who moderated Facebook’s content from home for as little as $1 an hour. He did not return requests for comment, and Facebook has been tight-lipped about which companies it now uses to moderate content, failing to respond to emailed questions sent by Foreign Policy.
Vaughn Hester, who works at Crowdflower, a San Francisco-based crowdsourcing firm that also tasks employees from around the world with moderating content, told The Daily Beast in September that "asking moderators to flag photos that are ‘offensive’ can result in very different attitudes in terms of what constitutes offensive content versus permissible content." Given what seems to be the inherent subjectivity of the content moderation industry, as well as the vast cultural and religious differences between employees from different countries, it seems possible that a photograph like Bakdounes’, which Americans might not find offensive in the least, could have upset a moderator from another country.
Panagiotis Ipeirotis, an associate professor in the Operations and Management Sciences department at New York University’s Stern School of Business, says that there are many ways to identify and eliminate biases in the content moderation industry.
"You might, for example, compare different moderators’ work against each other," says Ipeirotis. "So, if you’re worried about cultural biases, you can take five moderators from different regions and get blended input on an image."
Ipeirotis says he is unfamiliar with Facebook’s content moderation policy, but maintains that the content moderation systems of different companies are only as efficient as the standards they implement.
Haider says that while she understands that mistakes are made, it’s important that Facebook take incidents like this seriously because arbitrary acts of censorship aren’t compatible with the site’s role as a forum for free speech.
"It’s only normal that Facebook, which has penetrated the whole globe, hires employees from all over the world with various religious and cultural backgrounds," she says. "This becomes problematic only when those employees favor their cultural and religious biases over Facebook’s policy of respecting freedom of expression. This is why Facebook should take serious measures regarding such mistakes. We trusted that Facebook would be a supporter of freedom of expression and the uprising; we have faced the opposite by feeling that Facebook is assisting extremists and misogynists to put us in a corner…. It is disappointing, to say the least."
Facebook outlines some of its guidelines for acceptable content on its community standards page while maintaining that it attempts to balance the need for a safe online environment with its commitment to freedom of speech.
"Facebook gives people around the world the power to publish their own stories, see the world through the eyes of many other people, and connect and share wherever they go," the page reads. "The conversation that happens on Facebook — and the opinions expressed here — mirror the diversity of the people using Facebook. To balance the needs and interests of a global population, Facebook protects expression that meets the community standards."
Despite the removal of Bakdounes’ photograph, The Uprising of Women in the Arab World’s Facebook page has over 66,000 likes, and Haider acknowledges the important role that social media sites such as Facebook have played in mobilizing activist groups such as hers.
"We wanted a forum that can provide a free space for women and men from around the Arab world to meet and voice their concerns and propositions for a better reality for women within the transforming Arab societies," she says. "In this sense, Facebook helps break the borders and helps in sharing real experiences and awareness with the least possible costs."
Sulome Anderson is a journalist based between Beirut and New York City. Follow her on Twitter: @SulomeAnderson.