Written by Lisa Brooten.

Social media have clearly provided outlets for the voices of those frustrated by their government’s performance and marginalised by mainstream media in many countries around the world, such as Bangladesh and Pakistan. Myanmar is no exception. What is less obvious, however, are the state and commercial interests that leverage social media effectively for their own, often detrimental ends.

Facebook’s role in the horrific violence over the last several years between Myanmar’s extremist nationalist Buddhists and the Rohingya Muslims has been widely acknowledged. Hateful and discriminatory messages posted on Facebook have inflamed – and in some cases triggered – the violence, as well as the resulting exodus of 700,000 Rohingya Muslims. The violence has also generated a high level of engagement among Facebook users.

The current phase of anti-Rohingya violence began in 2012 but intensified after 25 August 2017 when a group calling itself the Arakan Rohingya Salvation Army (ARSA) attacked a Myanmar security outpost, killing twelve members of the security forces. This triggered a violent response from the military, leading to bloodshed and the exodus of hundreds of thousands of Rohingya to Bangladesh and neighbouring countries.

We now know that many of the Facebook messages inciting this violence originated as part of a systematic campaign organised by actors affiliated with the Myanmar military to leverage Facebook to create instability in the country.
The impact of this kind of manipulation should not be underestimated, given how Facebook has, within the space of a few years, become the primary source of information for many people in Myanmar, especially the 20 million (approximately one third of the country) who are online. Facebook itself is struggling to know what to do and is working to stave off externally imposed regulations.

In its first major public response to the problems in Myanmar, after multiple warnings over several years by civil society organisations (CSOs) about abuses of its platform, the company announced in August 2018 that it was removing 20 Facebook and Instagram accounts of people found to be inciting violence towards the Rohingya, including Senior General Min Aung Hlaing and five other military officials.

Criticism of Facebook’s role in the communal violence in Myanmar arose even before the violence in Rakhine State broke out in 2012, but it reached a crescendo after the violence of August 2017. In March 2018, UN Special Rapporteur to Myanmar Yanghee Lee stated that Facebook had been used to incite violence against Rohingya Muslims. Then in late August, the UN Human Rights Council released the Report of the Independent International Fact-Finding Mission on Myanmar, which also found that Facebook had been used to spread hate. Shortly after this, Facebook announced its first decision to remove the Facebook accounts of Senior General Min Aung Hlaing and the other military officials, including several named in the UN report.

A few months later, in October 2018, a New York Times investigation reported that the Myanmar military conducted a systematic, coordinated and wide-ranging campaign involving fake names and accounts to incite communal violence. Facebook confirmed that the campaign was systematic, covert and directly connected to the Myanmar military, and shortly after the NYT report was published, the company announced it was taking down another series of military accounts.

Despite high hopes that the National League for Democracy’s (NLD) victory in the 2015 elections would improve the situation facing journalists and activists, the NLD and the military have responded in a similar fashion. Both deny any coordinated effort to drive out the Rohingya, and both argue that the international media have reported irresponsibly, exaggerating the situation and the hardships faced by the Rohingya.

In addition, the government has continued its online surveillance efforts through its Social Media Monitoring Team, reportedly to track posts that ‘undermine youths’ moralities or peace and stability or security and rule of law on the open sources like Facebook, Messenger and Twitter’. Activists fear these surveillance powers because counter-messaging campaigns intended to curb hate speech are often targets of online threats from Buddhist nationalists and others.

Myanmar CSOs collectively responded to a Mark Zuckerberg interview with Vox in April 2018, in which he describes Facebook’s response to problematic posts in Myanmar and maintains that ‘our systems detect that that’s going on. We stop those messages from getting through’.

Countering Zuckerberg’s claims, the CSOs posted their own open letter in response a few days later, in which they argue that ‘far from being stopped, [the problematic messages Zuckerberg refers to] spread in an unprecedented way, reaching country-wide and causing widespread fear and at least three violent incidents in the process’. The letter also points out that Zuckerberg’s reference to ‘our systems’, which he claims detected the problem, was in fact a reference to alerts sent to Facebook by Myanmar CSOs, rather than any ‘system’ put in place by the company.

Facebook’s August 2018 removal of Myanmar military Facebook accounts was the first time the social networking site had ever removed the account of any country’s military or political leaders. It removed additional pages and groups linked to Myanmar’s military in October and December 2018 for ‘coordinated inauthentic behaviour’, including misrepresenting themselves as independent news, entertainment, beauty and lifestyle pages. In February 2019, Facebook banned four ethnic ‘insurgent’ groups it argues are ‘dangerous organisations’. The wisdom of this latest move is debated by CSOs, who were not consulted on the decision.

To address these issues, much of Facebook’s focus is on developing artificial intelligence (AI) to detect and prevent the spread of hate speech and misinformation. This is especially challenging given the highly contextual nature of speech and meaning. Activists also argue that AI is not particularly cost effective in many countries of the Global South, where low labour costs make it feasible to employ local people who possess the cultural capital to monitor content with the nuance necessary to prevent further incitement to violence. CSOs similarly challenge the company’s focus on the never-ending flood of content, when it is often individual actors and their networks that drive much of the problematic content.

In a context in which authoritarian powers can leverage Facebook to advance their own agenda, and in which sensationalism drives the bottom line of this commercial platform, it is clear that support for the civil society sector will remain vital as a counterbalancing force. Because CSOs understand first-hand the impact of Facebook’s unintended consequences, they can nurture the emergence of indigenous solutions and work to counterbalance the impact of hate-filled or distorted messages.

Local partnerships can help Facebook identify the actors and networks driving problematic content, and reduce the company’s focus on content, which can be a losing battle. Local communities would also benefit from developing local alternatives to platforms like Facebook.

As one Myanmar tech analyst told me: ‘We really need to see beyond Facebook. We need to empower ourselves to not let one company be responsible for everything’. This is a tall order in Myanmar, but perhaps an obvious next step in a country where Facebook’s use is so ubiquitous, and so clearly problematic.

Dr Lisa Brooten is Associate Professor in the College of Mass Communication and Media Arts at Southern Illinois University Carbondale. She is editor of Myanmar Media in Transition (forthcoming). Image credit: Wikimedia.