Rohingya sue Facebook for £150bn over Myanmar genocide
- 07/12/2021
By Dan Milmo, The Guardian
Victims in US and UK legal action accuse social media firm of failing to prevent incitement of violence
Facebook’s negligence facilitated the genocide of Rohingya Muslims in Myanmar after the social media network’s algorithms amplified hate speech and the platform failed to take down inflammatory posts, according to legal action launched in the US and the UK.
The platform faces compensation claims worth more than £150bn under the coordinated move on both sides of the Atlantic.
A class action complaint lodged with the US district court for the northern district of California, in San Francisco, says Facebook was “willing to trade the lives of the Rohingya people for better market penetration in a small country in south-east Asia.”
It adds: “In the end, there was so little for Facebook to gain from its continued presence in Burma, and the consequences for the Rohingya people could not have been more dire. Yet, in the face of this knowledge, and possessing the tools to stop it, it simply kept marching forward.”
A letter submitted by lawyers to Facebook’s UK office on Monday says clients and their family members have been subjected to acts of “serious violence, murder and/or other grave human rights abuses” as part of a campaign of genocide conducted by the ruling regime and civilian extremists in Myanmar.
It adds that the social media platform, which launched in Myanmar in 2011 and quickly became ubiquitous, aided the process. Lawyers in Britain expect to lodge a claim in the high court, representing Rohingya in the UK and refugees in camps in Bangladesh, in the new year.
“As has been widely recognised and reported, this campaign was fomented by extensive material published on and amplified by the Facebook platform,” says the letter from the law firm McCue Jury & Partners.
Facebook admitted in 2018 that it had not done enough to prevent the incitement of violence and hate speech against the Rohingya, the Muslim minority in Myanmar. An independent report commissioned by the company found that “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence”.
The McCue letter says: “Despite Facebook’s recognition of its culpability and its pronouncements about its role in the world, there has not been a single penny of compensation, nor any other form of reparations or support, offered to any survivor.”
In the US and UK, the allegations against Facebook include that:
- its algorithms amplified hate speech against the Rohingya people;
- it failed to invest in local moderators and fact-checkers;
- it failed to take down specific posts inciting violence against Rohingya people; and
- it did not shut down specific accounts or delete groups and pages that were encouraging ethnic violence.
The US complaint cites Facebook posts that appeared in a Reuters report, with one in 2013 stating: “We must fight them the way Hitler did the Jews, damn Kalars [a derogatory term for Rohingya people].” Another post in 2018, showing a photograph of a boatload of Rohingya refugees, says: “Pour fuel and set fire so that they can meet Allah faster.”
The number of Rohingya killed in 2017, during the Myanmar military’s “clearance operations”, is likely to be more than 10,000, according to the medical charity Médecins Sans Frontières.
About 1 million Rohingya live in the refugee camps of Cox’s Bazar, in south-eastern Bangladesh, where McCue and Mishcon de Reya, which is also working on the UK case, expect to recruit more claimants.
The UK case has about 20 claimants so far, while in the US the class action suit hopes to act on behalf of an estimated 10,000 Rohingya in the country.
The Facebook whistleblower Frances Haugen has alleged that the platform is fanning ethnic violence in countries including Ethiopia and is not doing enough to stop it. She said 87% of Facebook’s spending on combating misinformation goes to English-language content, even though only 9% of its users are English speakers.
Responding to Haugen’s revelations, Facebook said it had a “comprehensive strategy” in place for countries at risk of conflict and violence, including the use of native speakers and third-party fact-checkers.
Facebook’s owner, Meta, has been approached for comment.