Myanmar Military Used Facebook to Incite Genocide, Ethnic Cleansing

700,000 Muslims were forced to flee to neighboring Bangladesh in 2017. (Reuters)

Joshua Smalley

Oct 16, 2018

On Monday, Facebook said it removed 13 pages and 10 accounts controlled by the Myanmar military in connection with the Rohingya refugee crisis.

The accounts were masquerading as independent entertainment, beauty, and information pages, such as Burmese popstars, wounded war heroes, and “Young Female Teachers.” Fake postings reached 1.35 million followers, spreading anti-Muslim messages to social media users across the Buddhist-majority country.

Facebook’s move comes a year after 700,000 Rohingya, a Muslim minority group in Myanmar, were forced to flee to neighboring Bangladesh amid widely documented acts of mob violence and rape perpetrated by Myanmar soldiers and Buddhist mobs. The United Nations Human Rights Council denounced the crisis as “a textbook case of ethnic cleansing and possibly even genocide.”

Last month, the social media giant announced a similar purge, removing Facebook and Instagram accounts followed by a whopping 12 million users. Senior General Min Aung Hlaing, commander-in-chief of the Myanmar armed forces, was banned from the platform, as was the military’s Myawady television network.

Over the last few years, Facebook has been in the hot seat for its role in spreading misinformation. During the 2016 U.S. presidential election, inauthentic Facebook accounts run by Russian operatives created 80,000 posts that reached 126 million Americans through likes, shares, and follows. The problem persisted into the 2018 midterm elections, ahead of which the company removed 559 pages that violated its policies against spam and coordinated influence operations. Recent campaigns originating in Iran and Russia have targeted not only the U.S. but also Latin America, the U.K., and the Middle East.

The situation in Myanmar is particularly troubling—it’s not an effort by foreign powers to stoke hate and prejudice in a rival, but rather an authoritarian government using social media to control its own people. According to the New York Times, the military Facebook operation began several years ago with as many as 700 people working on the project.

Screenshots from the account of Senior General Min Aung Hlaing, whose pages were removed in August. (Facebook)

Claiming to show evidence of conflict in Myanmar’s Rakhine State in the 1940s, the images are in fact from Bangladesh’s war for independence from Pakistan in 1971. (Facebook)

Fake pages of pop stars and national heroes were used to distribute shocking photos, false stories, and provocative posts aimed at the country’s Muslim population. They often posted photos of corpses presented as evidence of fabricated massacres committed by the Rohingya, or spread rumors intended to discredit perceived threats to the government, such as Nobel laureate Daw Aung San Suu Kyi. On the anniversary of the September 11, 2001 attacks, fake news sites and celebrity fan pages used Facebook Messenger to warn both Muslim and Buddhist groups that an attack from the other side was imminent.

Facebook admitted to being “too slow to prevent misinformation and hate” on its sites. To prevent future misuse, the company plans to invest heavily in artificial intelligence that proactively flags abusive posts, make its reporting tools easier and more intuitive for users, and continue education campaigns in Myanmar that teach users how to recognize false news.

The company described its ongoing effort to identify and remove the misleading network of accounts in the country as “some of the most important work being done [here].”

Joshua Smalley is a New York-based writer, editor, and playwright. Find Josh at his website and on Twitter: @smalleywrites.
