Will Facebook own up to Myanmar?

Social media drove the Arab Spring, the story goes. Without viral posts in Tunisia setting off a chain reaction across the region, the argument runs, change would never have arrived. For a brief period, the arrival of social media giant Facebook in countries with low connectivity or tight restrictions on the press and internet seemed to mean change was afoot.


Facebook launched its controversial Free Basics program in Myanmar in mid-2016, allowing low-income Burmese to access the internet far more cheaply than before, and millions jumped onto the platform. By 2017, 20 million accounts had been created in Myanmar. The platform's rapid growth coincided with the worst flare-up of violence in northern Rakhine State, with hundreds of thousands of Rohingya refugees fleeing across the border into Bangladesh amid reports of killings, sexual violence and the destruction of homes at the hands of the military. The violence, which the United Nations has called a genocide, has multiple causes, but one thing is clear. Facebook has been a contributor, a UN report declared earlier this year.

Facebook faces two monumental charges over its operations in Myanmar. First, it allowed hate speech and hoaxes about the Rohingya Muslim minority to proliferate. The Rohingya have long been demonised in Buddhist-majority Myanmar as an Islamic terror threat, or as 'Bengalis', that is, non-Burmese illegal immigrants.

A Reuters report in August explored the murky web of Facebook-based hate speech against the community and found thousands of examples. Some predated the current crisis, but others were posted at the peak of the violence in August and September last year. References to 'doing what Hitler did to the Jews' and calls to 'destroy their race' underscore how violent hatred of the minority has become.

When talk of the violent rhetoric online went mainstream, Facebook pledged to step up its efforts to weed out particularly vitriolic users and pages, and to introduce greater moderation of comments. This has so far been a spectacular failure. The social media giant has said its efforts to expand moderation have been stymied by difficulty finding candidates who speak both Burmese and English fluently, which is perhaps understandable given that moderation is run out of its Singapore office. It becomes much less understandable when you learn that Facebook officials met with Myanmar's Ministry of Information, which suggested the company open an office in-country, and Facebook declined.

The second problem is one not even the genius of Mark Zuckerberg can completely control. Startling reports in October revealed that a systematic hoax campaign had been run on the network by military officials themselves. This had long been suspected, but the extent of the campaign shocked observers when it came to light. The pages amassed 1.3 million followers by disguising themselves as fan pages for singers and other pop culture icons.

Working in shifts, the team spread anti-Rohingya posts, supported the military and 'gathered intelligence' on influencers. To Facebook’s credit, it did launch a thorough investigation within its security desk to ascertain the extent of the dark campaign. It found 'clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military.'

 


 

Still, there remains a lot Facebook can control. After the damning UN report, Facebook commissioned its own independent report from the not-for-profit Business for Social Responsibility which, importantly, made recommendations beyond 'more moderation'. That the long-awaited report was released just as the US reached midterm election fever pitch suggests an attempt at burying its findings.

Casey Newton, a columnist covering social media and democracy at the tech news site The Verge, isn't convinced. He has written off the report as superficial and as failing to ask a key question: why did Facebook not anticipate serious trouble when rolling out into a country well known for its unstable political situation?

He makes the case that any tech company, not just Facebook, entering a new country must 'conduct in-depth human rights impact assessments for their products, policies and operations, based on the national context and take mitigating measures to reduce risks as much as possible'. Crucially, he calls on Facebook to share its data on hate speech, such as examples and an estimate of how widely it is produced, with researchers who can then use that information to more accurately predict flare-ups and unrest.

It's not just Facebook, and it's not just Myanmar. In India, the spread of Islamophobic content via the messaging app WhatsApp, bought by Facebook in 2014, has directly led to a spate of lynchings. WhatsApp has since limited message forwarding there, though it is too early to tell what impact that has had. During Brazil's recent heady presidential election, Facebook denied fears that WhatsApp would introduce similar restrictions after eventual winner Jair Bolsonaro declared he would 'fight' for Brazilians against the giant.

What's clear is that Facebook is happy to claim credit for information-sharing when the result appears to serve democracy. It's equally clear that Facebook will continue to duck and dodge blame when its platform has demonstrably been used as a force for evil, and that its reluctance has only exacerbated the problem.

 

Erin Cook is a Jakarta-based journalist with a focus on South East Asia, and editor of the SEA news digest Dari Mulut ke Mulut.

 

Main image: Smartphone shows Facebook app (Pixabay)


 

 
