Myanmar, the Country Where Facebook Posts Whipped Up Hate

BBC News – TRANSCEND Media Service

Decades of ethnic and religious tensions, a sudden explosion of internet access, and a company that had trouble identifying and removing the most hateful posts.

More than 700,000 Rohingya Muslims have fled Myanmar. Getty Images

12 Sep 2018 – It all added up to a perfect storm in Myanmar, where the United Nations says Facebook had a “determining role” in whipping up anger against the Rohingya minority.

“I’m afraid that Facebook has now turned into a beast, and not what it originally intended,” Yanghee Lee, UN special rapporteur on human rights in Myanmar, said in March.

The company admits failures and has moved to address the problems. But how did Facebook’s dream of a more open and connected world go wrong in one south-east Asian country?

Enter Facebook

“Nowadays, everyone can use the internet,” says Thet Swei Win, director of Synergy, an organisation that works to promote social harmony between ethnic groups in Myanmar.

That wasn’t the case in Myanmar five years ago.

Outside influence had been kept to a minimum during the decades when the military dominated the country. But with the release of Aung San Suu Kyi and the country's gradual shift towards civilian rule, the government began to liberalise business – including, crucially, the telecoms sector.

The effect was dramatic, according to Elizabeth Mearns of BBC Media Action, the BBC’s international development charity.

“A SIM card was about $200 [before the changes],” she says. “In 2013, they opened up access to other telecom companies and the SIM cards dropped to $2. Suddenly it became incredibly accessible.”

For many in Myanmar, Facebook is synonymous with the internet. Getty Images

And once people had bought an inexpensive phone and a cheap SIM card, there was one app that everybody in Myanmar wanted: Facebook. The reason? Google and some of the other big online portals didn't support Burmese text, but Facebook did.

“People were immediately buying internet-accessible smartphones, and they wouldn’t leave the shop unless the Facebook app had been downloaded onto their phones,” Mearns says.

Thet Swei Win believes that because the bulk of the population had little prior internet experience, they were especially vulnerable to propaganda and misinformation.

“We have no internet literacy,” he told Trending. “We have no proper education on how to use the internet, how to filter the news, how to use the internet effectively. We did not have that kind of knowledge.”

Ethnic tensions

Out of Myanmar's population of about 50 million, around 18 million are regular Facebook users.

But Facebook and the telecoms companies which gave millions their first access to the internet do not appear to have been ready to grapple with the ethnic and religious tensions inside the country.

The enmity goes deep. Rohingyas are denied Burmese citizenship. Many in the Buddhist ruling class do not even consider them a distinct ethnic group – instead they refer to them as “Bengalis”, a term that deliberately emphasises their separateness from the rest of the country.

Last year's military operation in the north-western state of Rakhine was designed, the government says, to root out militants. It resulted in more than 700,000 people fleeing to neighbouring Bangladesh – something the United Nations calls the world's fastest-growing refugee crisis.

A UN report has said top military figures in Myanmar must be investigated for genocide in Rakhine state and crimes against humanity in other areas. But the government of Myanmar has rejected those allegations.

Facebook ‘weaponised’

The combination of ethnic tensions and a booming social media market was toxic. Since the beginning of mass internet use in Myanmar, inflammatory posts against the Rohingya have regularly appeared on Facebook.

Thet Swei Win said he was horrified by the anti-Rohingya material he had seen being shared. “Facebook is being weaponised,” he told BBC Trending.

In August, a Reuters investigation found more than 1,000 Burmese posts, comments and pornographic images attacking the Rohingya and other Muslims.

“To be honest, I thought we might find at best a couple of hundred examples – I thought that would make the point,” says Reuters investigative reporter Steve Stecklow, who worked with Burmese-speaking colleagues on the story.

Stecklow says some of the material was extremely violent and graphic.

“It was sickening to read and I had to keep saying to people ‘Are you OK? Do you want to take a break?'”

Some posts on Facebook expressed the hope that fleeing Rohingya refugees would drown at sea. Reuters

“When I sent it to Facebook, I put a warning on the email saying I just want you to know these are very disturbing things,” he says. “What was so remarkable was that [some of] this had been on Facebook for five years and it wasn’t until we notified them in August that it was removed.”

Several of the posts catalogued by Stecklow and his team described Rohingyas as dogs or pigs.

“This is a way of dehumanising a group,” Stecklow says. “Then when things like genocide happen, potentially there may not be a public uproar or outcry as people don’t even view these people as people.”

Lack of staff

The material that the Reuters team found clearly contravened Facebook’s community guidelines, the rules that dictate what is and is not allowed on the platform. All of the posts were removed after the investigation, although the BBC has since found similar material still live on the site.

So why did the social network fail to grasp how it was being used to spread propaganda?

One reason, according to Mearns, Stecklow and others, was that the company had difficulty interpreting certain words.

For example, one particular racial slur – “kalar” – can be a highly derogatory term used against Muslims, or have a much more innocent meaning: “chickpea”.

In 2017, Stecklow says, the company banned the term, but later revoked the ban because of the word’s dual meaning.

There were also software problems that meant many mobile phone users in Myanmar had difficulty reading Facebook's instructions on how to report worrying material.

But there was also a much more fundamental issue – the lack of Burmese-speaking content monitors. According to the Reuters report, the company had just one such employee in 2014, a number that had increased to four the following year.

The company now has 60 and hopes to have around 100 Burmese speakers by the end of this year.

Multiple warnings

Following the explosion in Facebook use in Myanmar, the company did receive multiple warnings from individuals about how the platform was being used to spread anti-Rohingya hate speech.

In 2013, Australian documentary maker Aela Callan raised concerns with a senior Facebook manager. The next year, a doctoral student named Matt Schissler had a series of interactions with employees, which resulted in some content being removed.

And in 2015, tech entrepreneur David Madden travelled to Facebook’s headquarters in California to give managers a presentation on how he had seen the platform used to stir up hate in Myanmar.

“They were warned so many times,” Madden told Reuters. “It couldn’t have been presented to them more clearly, and they didn’t take the necessary steps.”

Accounts removed

A Facebook spokeswoman told Trending via email that the company was committed to hiring more content moderators but was also taking a number of other steps to tackle the problems in Myanmar.

“In the last year, for example, we have established a team of product, policy and operations experts to roll out better reporting tools, a new policy to tackle misinformation that has the potential to contribute to offline harm, faster response times on reported content, and improved proactive detection of hate speech,” the spokeswoman said.

Since last year, the company has taken some publicly visible action. In August, Facebook removed 18 accounts and 52 pages linked to Burmese officials. One account on Instagram, which Facebook owns, was also closed. The company said it “found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country.”

The spokeswoman said deleting the accounts was “not a decision we took lightly.”

“Staying ahead of the bad means always looking for how people can misuse technology – and doing everything you can to prevent that misuse from happening in the first place. That’s our responsibility now and it’s something that weighs heavily on all of us.”

Radical Buddhist monk Wirathu's Facebook page was removed earlier this year. Facebook screengrab

Between them, the accounts and pages were followed by almost 12 million people.

In January this year, Facebook also removed the account of Ashin Wirathu, a radical monk famed for angry speeches stoking fear of Muslims.

‘Too slow’

In a statement, Facebook has admitted that in Myanmar it was “too slow to prevent misinformation and hate”, and acknowledged that countries new to the internet and social media are particularly susceptible to the spread of hate speech.

The subject of hate speech on the platform came up in early September, when Facebook’s chief operating officer, Sheryl Sandberg, testified in front of a US Senate committee.

Sheryl Sandberg says Facebook is committed to tackling hate speech. Drew Angerer

“Hate is against our policies and we take strong measures to take it down. We also publish publicly what our hate-speech standards are,” she said. “We care tremendously about civil rights.”

When Facebook chief executive Mark Zuckerberg appeared before Congress in April, he was asked specifically about events in Myanmar. He said that in addition to hiring more Burmese speakers, the company was working with local groups to identify “specific hate figures” and creating a team to help spot similar issues in Myanmar and other countries in the future.

Elizabeth Mearns from BBC Media Action believes that while it is Facebook's role in Myanmar that is currently under scrutiny, the situation is just one example of a far wider issue.

“We are definitely now in a situation where content on social media is directly affecting people’s real life. It’s affecting the way people vote. It’s affecting the way people behave towards each other, and it’s creating violence and conflict,” she says.

“The international community understands now, I think, that it needs to step up and understand technology. And understand what’s happening on social media in their countries or in other countries.”

Go to Original – bbc.co.uk
