Facebook Keeps Failing in Myanmar
The Arakan Army’s online propaganda campaign shows the tech company hasn’t yet figured out how to tackle hate speech on its platform.
In February, Facebook banned accounts connected to four armed groups that had allegedly committed violence against civilians in Myanmar. The tech company sought to prevent the groups from using its platform “to further inflame tensions on the ground.” Within days, it was clear that Facebook had failed. In the months since the ban, dozens of newly created pages that appear to support the insurgents, and the Arakan Army in particular, continue to share unambiguous propaganda.
Internal conflicts have simmered for decades in Myanmar. More than a dozen ethnic armed groups are fighting for autonomy against Myanmar’s national army, known as the Tatmadaw. The groups banned by Facebook—the Arakan Army, the Myanmar National Democratic Alliance Army, the Kachin Independence Army, and the Ta’ang National Liberation Army—are united in a coalition, the Northern Alliance.
Fighting between the Arakan Army and the Tatmadaw in Rakhine state has intensified dramatically since January. The Arakan Army—not to be confused with the Arakan Rohingya Salvation Army, another insurgent group—identifies itself as the legitimate representative of the Buddhist Rakhine people, the largest ethnic group in Rakhine state. For much of the Arakan Army’s decadelong history, the Tatmadaw prevented the group from establishing a foothold in the state itself. In exile, the Arakan Army built up its forces, which now number around 7,000. The Arakan Army’s ultimate goal has always been to return to Rakhine, and it began an aggressive push back into the state in January.
Facebook has grown tremendously in Myanmar, with an estimated 20 million users out of 53 million citizens. But the company has faced significant international criticism for failing to police hate speech against the Rohingya Muslim minority being spread on its platform. In 2018, Facebook commissioned an independent assessment that connected specific posts to offline violence. The United Nations concluded that Facebook was “a useful instrument for those seeking to spread hate, in a context where, for most users, Facebook is the internet.”
While international attention has often focused on the Tatmadaw’s abuses against the Rohingya in Rakhine, Myanmar’s army has also committed violence against ethnic Rakhine communities as it seeks to solidify its control over the region. A U.N. report from its fact-finding mission last August called for Tatmadaw leaders to be investigated for crimes against humanity, including serious human rights violations and sexual violence perpetrated against ethnic Rakhine communities.
There is evidence of abuses against civilians by both sides in the conflict, according to a report by Amnesty International. In addition to putting civilians in danger, the Arakan Army has threatened, intimidated, and abducted villagers. “Civilians are bearing the brunt of the latest violence in Rakhine,” said Laura Haigh, a Myanmar researcher for Amnesty. “[T]he military is becoming increasingly intolerant of reporting on violations in Rakhine.”
Some have argued that by censoring online support for the Arakan Army but not for the Tatmadaw, Facebook is in effect boosting the Myanmar government in its push to influence the international narrative. Facebook acknowledges that it held conversations with the government before implementing the ban but asserts that the decision was not made as the result of government pressure. Facebook has previously banned some individual Myanmar government figures, including high-ranking leaders of the Tatmadaw.
The Arakan Army is also fighting for control of the narrative. Its leaders have granted interviews to international media. But most of its propaganda campaign runs across social media platforms. Dozens of Facebook accounts promote blatant Arakan Army propaganda. One page regularly shares footage from conflict zones and training camps. There are pages with English-language Arakan Army memes, poetry, and songs. Those behind the pages use Facebook-specific features to extend their reach: People can add an Arakan Army frame to their profile pictures, for example.
In an open letter to Facebook CEO Mark Zuckerberg in 2018, Myanmar civil society groups highlighted the social media company’s initial lack of Burmese-language speakers as a key factor in its ineffective response to hate speech. In the past year, Facebook says it has hired 100 moderators who speak Burmese, including some with knowledge of regional dialects. None are physically based in Myanmar, with Facebook citing safety concerns. The moderators take down fake accounts and those that violate the rules. The company plans to launch a digital literacy program in Myanmar later this year, part of a regional effort that began with a program in Singapore in March. “We have repeatedly taken action against violent actors and bad content on Facebook in Myanmar,” a Facebook spokesperson wrote in an email.
The changes are an improvement. But in light of Facebook’s 20 million users in Myanmar, 100 Burmese-language moderators seem insufficient. Digital literacy programs are unlikely to prevent the spread of hate speech in Myanmar. The groups responsible for creating and promoting online hate speech are highly adept with digital technology and use the platforms to exploit preexisting divisions. “Facebook has so far failed to take effective action to counter the spread of hate speech and misinformation on its platform,” said Amnesty International’s Haigh. “It is essential that they do so—in particular ahead of the [country’s] 2020 general elections.”
The proliferation of Arakan Army propaganda on Facebook’s platform demonstrates a deeper issue: a failure to identify and remove pages explicitly banned by Facebook. Finding many of the pages linked to the armed groups is as simple as searching for their names or related phrases in English. (Much of the content is in English, which is likely intended to catch the attention of the international community.) The failure to detect and remove months-old pages with English-language names such as “We Love Arakan Army” suggests that Facebook isn’t proactively enforcing its own ban.
Furthermore, the Arakan Army pages are often branded clearly with the group’s symbol or photos of its leader or of fighters in uniform. Individual profiles may represent real individuals or trolls, but in either case they are defying Facebook’s ban on “all related praise, support and representation” of the groups. Many of the pages are linked to one another on the platform, making them even easier to locate.
The difficulty of the challenge facing Facebook should not be underestimated. As the world’s predominant social network, it is at the center of conflicts around the globe. Its Silicon Valley leadership isn’t prepared to play peacekeeper in the fight for narrative control in the far reaches of Myanmar or beyond.
The immensity of the task is no excuse for the depth of Facebook’s failure. Facing an online insurgency, the social network holds the high ground. Facebook’s moderators have a level of access and capability on the platform that none of the forces on the ground possess. The resources at the disposal of Facebook, which posted revenue of $55.8 billion in 2018, dwarf those of the armed groups and even some governments. Combating hate speech and violence on its platforms is a contest that Facebook can win—but it needs new strategies.
There are a number of straightforward steps Facebook could take to prevent the spread of propaganda by the banned armed groups in Myanmar. In the third quarter of 2018, 63 percent of the hate speech content that it removed in Myanmar was automatically detected, according to the Facebook spokesperson. Facebook could assign moderators to proactively seek out content that violates its policies, rather than waiting for it to be reported or caught by filters. As the Arakan Army pages show, a vast amount of banned content should be relatively simple to detect.
It may be that a better combination of human and automated moderation is needed. Facebook could flag pages containing obvious phrases associated with the groups to be reviewed by a moderator. These phrases would need to be frequently checked and, if necessary, revised by a human moderator, as groups attempting to avoid automated moderation often switch to code words and phrases, something that may be readily apparent to a human but missed by an algorithm.
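To illustrate, here is a minimal sketch, in Python, of how such a hybrid pipeline might work. Everything in it is hypothetical: the watchlist contents and the `ReviewQueue` and `scan_page` names are assumptions made for illustration, not a description of Facebook’s actual systems.

```python
# Minimal sketch of a hybrid moderation pass: an automated keyword
# filter surfaces candidate pages, and every hit is routed to a human
# reviewer rather than removed automatically. All names here are
# hypothetical; this is not Facebook's actual tooling.

from dataclasses import dataclass, field

@dataclass
class ReviewQueue:
    """Holds flagged pages until a human moderator rules on them."""
    pending: list = field(default_factory=list)

    def flag(self, page_id: str, matched_phrase: str) -> None:
        self.pending.append((page_id, matched_phrase))

# Watchlist of phrases associated with the banned groups. Human
# moderators would revise this list as supporters shift to code words
# that an algorithm alone would miss.
WATCHLIST = {"arakan army", "we love arakan army"}

def scan_page(page_id: str, page_text: str, queue: ReviewQueue) -> bool:
    """Flag a page for human review if it contains a watchlist phrase."""
    text = page_text.lower()
    for phrase in WATCHLIST:
        if phrase in text:
            queue.flag(page_id, phrase)
            return True
    return False

queue = ReviewQueue()
scan_page("page_123", "We Love Arakan Army - daily updates", queue)
print(queue.pending)  # [('page_123', 'we love arakan army')]
```

The design choice that matters here is that the algorithm only surfaces candidates: a human moderator makes the final call and keeps the watchlist current as propagandists adapt their language.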
Facebook could also use its facial recognition technology to help flag potentially banned content. Many of the Arakan Army pages use photos of the group’s leader, Twan Mrat Naing, as a profile picture or cover image. Facebook’s technology recognizes individuals’ faces in photos even if they are not tagged. It should likewise be able to pick out when Twan Mrat Naing appears in a profile picture or cover image and flag it for review.
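Again purely as an illustration, a face-matching pass might compare an embedding of each new profile photo against one computed from a confirmed reference photo. The `embed_face` function below is a hypothetical stand-in for a real face-embedding model, and the 0.6 distance threshold is likewise an assumption.

```python
# Sketch of flagging profile photos that match a known face, using
# face embeddings and a distance threshold. `embed_face` is a stand-in
# for any real face-embedding model; it is assumed here, not a real API.

import numpy as np

def embed_face(image_path: str) -> np.ndarray:
    """Hypothetical: return a unit-length embedding vector for the
    most prominent face in the image."""
    raise NotImplementedError("plug in a real face-embedding model")

def matches_known_face(candidate: np.ndarray,
                       reference: np.ndarray,
                       threshold: float = 0.6) -> bool:
    """Flag the image if its embedding is close to the reference face.
    Smaller Euclidean distance means a more similar face."""
    return float(np.linalg.norm(candidate - reference)) < threshold

# Self-contained demo with synthetic embeddings in place of real faces:
rng = np.random.default_rng(0)
reference = rng.normal(size=128)
reference /= np.linalg.norm(reference)
near = reference + 0.01 * rng.normal(size=128)
near /= np.linalg.norm(near)
print(matches_known_face(near, reference))  # True

# Usage (illustrative): embed one confirmed photo of the banned
# group's leader, compare each new profile picture against it, and
# queue matches for human review rather than automatic removal.
```

As with keyword flagging, matches would be queued for human review, since embedding distance alone cannot distinguish propaganda from, say, news photography.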
Some activists criticized Facebook’s decision to ban the Arakan Army and its allies, noting that the company did not consult with Myanmar civil society and had failed to apply its standards consistently. There is genuine support for the Arakan Army among some parts of the ethnic Rakhine community, and the ban could violate the right to freedom of political expression.
Furthermore, the Arakan Army’s information campaign against the Tatmadaw involves documenting abuses against civilians. In May, accounts linked to the Arakan Army shared unfolding developments in Kyauktan village, where the Tatmadaw held dozens of men and boys in a school for days on suspicion of being insurgents. Photographs shared on the pages were later used in the media, raising international awareness of the situation.
Moderation by Facebook that implicitly supports governments over marginalized civilian groups is not unique to Myanmar. The company has been accused of deleting Palestinians’ accounts at the behest of Israel’s government and of aiding state censorship in Vietnam. Influence campaigns are not a black-and-white issue but a gray zone. Human review is crucial to prevent censorship, and the boundaries of what is considered legitimate content remain contested.
However, Facebook has banned the Arakan Army and its allies, believing the groups pose an unacceptable risk of violence. In that light, Facebook’s failure to meaningfully enforce its own decision is yet another abdication of responsibility for the ways in which its platform can be used in conflict zones.
David Kaye, the U.N. special rapporteur on freedom of expression, has highlighted Facebook’s inconsistent enforcement of its own rules as one reason why greater regulation of the social media giant is needed, and why using antitrust law to break it up should be an option on the table. When it comes to moderating propaganda, however, it’s not clear that Facebook’s size as a company is the problem. One thing is very clear: Despite some improvements, Facebook is still not taking this problem seriously enough.
Facebook aims to become the world’s social network, which means that some part of its platform will always be a war zone. The fact that the company is not doing enough to respond to the narrative battle being fought across its platform in Myanmar, where it has faced sustained criticism, raises serious questions about what may be happening in other, less high-profile cases—and what kind of crisis it will take to force Facebook to reckon with the power it wields.
Elise Thomas is a freelance journalist and a researcher with the International Cyber Policy Centre at the Australian Strategic Policy Institute. She has written for Wired, the Daily Beast, the Guardian, and the Australian Broadcasting Corp. on technology and human rights. Twitter: @elisethoma5