Rohingya activist calls for Meta probe into Myanmar violence


A Rohingya rights activist and survivor of atrocities, Maung Sawyeddollah, has submitted a complaint to the US Securities and Exchange Commission (SEC), urging the agency to investigate Meta for possible violations of securities laws related to the company’s failure to disclose its significant association with the 2017 genocide against the Rohingya in Myanmar. Amnesty International, the Open Society Justice Initiative, and Victim Advocates International jointly supported this submission.

The complaint sheds light on Meta’s alleged complicity in the violence against the Rohingya and highlights the company’s misrepresentations to shareholders and the SEC. Mandi Mudarikwa, Head of Strategic Litigation at Amnesty International, hopes that the SEC will consider the submission and investigate Meta for any breaches of federal securities laws.

The submission to the SEC outlines how activists and researchers warned Meta multiple times about the potential misuse of Facebook to incite violence against the Rohingya before the 2017 atrocities in Myanmar. Despite these warnings, Meta continued to omit crucial information about the risk of real-world violence from its communications with investors. A 2022 Amnesty International report concluded that Meta contributed to the Rohingya atrocities through algorithms that amplified harmful content and through inadequate moderation of content that violated the company’s own Community Standards.

The report found that Meta’s profit-driven business model, which involved invasive profiling and targeted advertising, facilitated the spread of harmful content, including incitements to violence. Meta’s algorithmic systems prioritize user engagement to boost advertising revenue, often resulting in the promotion of divisive and harmful content.

Maung Sawyeddollah shared his frustration at Meta’s inadequate response to reports of harmful content on Facebook, stating that even when content violated Community Standards, Meta did not take sufficient action. In Myanmar and other regions, this lack of enforcement allowed inflammatory and harmful content to proliferate.

Eva Buzo, Executive Director at Victim Advocates International, emphasized the devastating impact of Meta’s algorithms in Myanmar, promoting anti-Rohingya online campaigns that escalated to offline violence. The submitted complaint highlights Meta’s failure to address warnings from civil society regarding Facebook’s role in fueling violence, particularly in Myanmar.

Despite numerous alerts from civil society members between 2013 and 2017, Meta did not fully disclose the risks associated with its operations in Myanmar, denying investors crucial information. The company also declined to act on shareholder proposals calling for human rights impact assessments and for a committee to oversee international public issues, including human rights.

Public pressure in 2018 eventually compelled Meta to acknowledge its role in the Rohingya atrocities, yet the company later failed to curb hate speech and content inciting violence against the Tigrayans in Ethiopia, contributing to severe offline violence. Meta’s recent policy changes eliminating independent fact-checking pose a significant risk of exacerbating human rights violations and offline violence globally.

In conclusion, Meta’s disregard for warnings and failure to address the spread of harmful content on its platforms highlight the urgent need for increased accountability and transparency in the tech industry to prevent further complicity in human rights abuses.
