A new report has found that Facebook failed to detect blatant hate speech and calls for violence against Myanmar’s Rohingya Muslim minority, years after such content was found to have played a determining role in the genocide against them.
In the report, shared exclusively with The Associated Press, the rights group Global Witness describes how it submitted eight paid advertisements to Facebook for approval, each containing different versions of hate speech against the Rohingya. Facebook approved all eight ads for publication.
The group pulled the ads before they were posted or paid for, but the results confirmed that despite its promises to do better, Facebook’s screening controls still fail to detect hate speech and calls for violence on its platform.
The army conducted what it called a clearance operation in Rakhine state in western Myanmar in 2017 after an attack by a Rohingya rebel group. More than 700,000 Rohingya fled to neighboring Bangladesh, and security forces were accused of mass rape, killings and the burning of thousands of homes.
On February 1 last year, Myanmar’s military forcibly took control of the country, imprisoning democratically elected government officials. Rohingya refugees have condemned the military takeover and said it makes them more afraid to return to Myanmar.
Experts say that despite its promises to do better and its assurances that it has taken its role in the genocide seriously, Facebook still fails even the simplest of tests: ensuring that paid ads on its site do not contain hate speech calling for the killing of Rohingya Muslims.
“The current killing of the Kalar is not enough, we need to kill more!” read one proposed paid post from Global Witness, using a slur often used in Myanmar to refer to people of East Indian or Muslim descent.
“They are very dirty. The Bengali/Rohingya women have a very low standard of living and poor hygiene. They are not attractive,” read another.
“The posts are shocking in what they encourage, and are a clear sign that Facebook has not changed or done what it told the public it would do: properly regulate itself,” said Ronan Lee, a research fellow at the Institute for Media and Creative Industries at Loughborough University, London.
The eight ads from Global Witness all used examples of hate speech taken directly from the report of the UN’s Independent International Fact-Finding Mission on Myanmar to the Human Rights Council. Many of the examples came from past Facebook posts.
That Facebook approved all eight ads is particularly worrying because the company claims to hold advertisements to “even stricter” standards than regular, unpaid posts, according to its help center page for paid advertising.
“I accept that eight is not a very big number. But I think the finding is really stark, that all eight of the ads were accepted for publication,” said Rosie Sharpe, a campaigner at Global Witness. “I think you can conclude from that that the overwhelming majority of hate speech is likely to get through.”
Facebook’s parent company, Meta, says it has invested in improving its safety and security controls in Myanmar, including banning the accounts of the military, known locally as the Tatmadaw, after it seized power and imprisoned elected leaders in the 2021 coup.
“We have built a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public debate, and taken action against harmful misinformation to help keep people safe. We have also invested in Burmese-language technology to reduce the prevalence of violating content,” Rafael Frankel, director of public policy for emerging markets at Meta Asia Pacific, wrote in an emailed statement to the AP on March 17. “This work is guided by experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent human rights impact assessment we commissioned and published in 2018.”
Facebook has a long history of being used to spread hate speech and military propaganda in Myanmar.
Shortly after Myanmar was connected to the internet in 2000, Facebook paired up with local telecom providers to allow customers to use the platform without paying for data, which was then still expensive. Use of the platform exploded. For many in Myanmar, Facebook became the internet itself.
Local internet policy advocates have repeatedly said that hate speech spread across the platform, often targeting the Rohingya Muslim minority in the Buddhist-majority country.
Tun Khin, president of the Burmese Rohingya Organisation UK, a London-based Rohingya advocacy group, says that for years Facebook failed to invest in content moderators who spoke the local languages or in fact-checkers who understood Myanmar’s political situation, and failed to close specific accounts or delete pages used to spread hatred against the Rohingya.
In March 2018, less than six months after thousands of Rohingya fled violence in western Myanmar, Marzuki Darusman, chairman of the UN’s Independent International Fact-Finding Mission on Myanmar, told reporters that social media had “substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public.”
“Hate speech is certainly a part of that. As far as the situation in Myanmar is concerned, social media is Facebook, and Facebook is social media,” Darusman said.
Asked about Myanmar at a U.S. Senate hearing a month later, Meta CEO Mark Zuckerberg said Facebook planned to hire “dozens” of Burmese speakers to moderate content, would work with civil society groups to identify hate speech, and would develop new technologies to combat it.
“Hate speech is very language-specific. It’s hard to do without people who speak the local language, and we need to ramp up our effort there dramatically,” Zuckerberg said.
Yet in internal files leaked last year by whistleblower Frances Haugen, the AP found that the violations persisted. The company had stepped up its efforts to combat hate speech but never fully developed the tools and strategies needed to do so.
Rohingya refugees have filed a $150 billion (roughly Rs. 11,46,180 crore) lawsuit against Facebook, alleging that it failed to stop hate speech from Myanmar’s military rulers and their supporters that incited violence against the Muslim ethnic group. Rohingya youth groups based in Bangladesh’s refugee camps have filed a separate complaint in Ireland with the 38-nation Organisation for Economic Co-operation and Development, urging Facebook to provide remediation programs in the camps.
Meta still declines to say how many of its content moderators read Burmese and could thus detect hate speech in Myanmar.
“Survivors of the Rohingya genocide are still living in camps and Facebook is failing them,” said Tun Khin. “Facebook needs to do more.”