
‘Echo chamber of anti-Rohingya content’: Amnesty says Meta’s algorithms stoked Myanmar violence

New Delhi: Amnesty International, in a report published Thursday, has claimed that Meta, the parent company of Facebook, promoted violence against the Rohingya community in Myanmar through its “content-shaping algorithms”.

Several hateful Facebook posts targeting the Rohingya were shared widely and circulated without moderation checks, it said.

In the months and years leading up to and during the 2017 atrocities, Facebook in Myanmar, according to the 74-page document, became an “echo chamber of virulent anti-Rohingya content”.

“Actors linked to the Myanmar military and radical Buddhist nationalist groups systematically flooded the Facebook platform with incitement targeting the Rohingya, sowing disinformation regarding an impending Muslim takeover of the country and seeking to portray the Rohingya as sub-human invaders,” it further said.

The report recounted that by around 2017, when Meta enjoyed “near-total market dominance” in Myanmar, people in the country had come to rely on Facebook as their primary source of news and information.

Even prominent leaders of the Myanmar military junta posted inciting content. “The leader of Myanmar’s military, Senior General Min Aung Hlaing, posted on his Facebook page on 1 September 2017, saying: ‘We openly declare that absolutely, our country has no Rohingya race.’ Meta finally banned Min Aung Hlaing from Facebook in 2018,” the report pointed out.

Amnesty interviewed members of the Rohingya community between February and June 2022. All of them described how violence surged after a few hateful Facebook posts went viral, and how communities clashed over fake news, the report said.

The report also pointed out Meta’s “refusal to remediate” the Rohingya community, and accused the tech giant of ignoring all requests made by multiple civil society members to put a stop to hate speech.

ThePrint sought comment from Meta on the report via email. This article will be updated when a response is received.


Also Read: No plans to house Rohingyas, says Centre after its minister tweets ‘India gives refuge to all’


‘Inadequate staffing’, ‘near-total market dominance’

A major lapse highlighted by the report was “inadequate staffing” at Meta during the period when reports of violence were significantly high.

“Meta’s wholly inadequate staffing of its Myanmar operations prior to 2017 was a significant factor in the company’s staggering failures to remove harmful anti-Rohingya content from the Facebook platform,” the report said.

It added: “This is symptomatic of the company’s broader failure to adequately invest in content moderation across the Global South. In mid-2014, Meta staff admitted that they only had one single Burmese-speaking content moderator devoted to Myanmar at the time, based in their Dublin office.”

Meta claimed to have hired dozens of Burmese-speaking reviewers in a response to US legislators in June 2018. However, the Amnesty report states that according to one investigation, Meta had only five Burmese language speakers to monitor and moderate content in April 2018. At the time, Myanmar had 18 million Facebook users.

Meta’s “content-shaping algorithms”, according to Amnesty, are “optimized to ensure that users engage with content on Facebook as much as possible and spend as much time as possible on the platform”. This algorithmic structure determines what users see, and “the more engaged users are, the more advertising revenue Meta earns”.

The report elaborated on how posts inciting violence and hatred drew the highest engagement.

“Meta – through its dangerous algorithms and its relentless pursuit of profit – substantially contributed to the serious human rights violations perpetrated against the Rohingya,” it said.

It also stated that Meta was not merely a “passive and neutral platform” that responded inadequately, but one that actively contributed to the harm.

Refusal to remedy 

The report recounted that since 2019, members of the Rohingya community have filed a range of complaints and proposals seeking “justice and reparations” for the atrocities they suffered in 2017.

One such proposal, put forward by Rohingya youth groups in mid-2020, was for Meta to fund a USD 1 million education project.

On 10 February 2021, Meta rejected the community’s request, stating that Facebook does not “directly engage in philanthropic activities”.

The report said that Meta’s characterisation of the Rohingya community’s request as “philanthropic activities” conveys a deeply flawed understanding of the company’s human rights responsibilities.

“The Rohingya communities have not requested charity; they are pursuing Meta to demand that the company fulfils its responsibility to remediate the severe human rights harms they have suffered and which the company contributed to,” it added.

(Edited by Theres Sudeep)


Also Read: Despite pushback from social media intermediaries, govt’s grievance appellate committee stays


Source: The Print
