The social media giant said in a statement Wednesday that it had removed 425 Facebook pages, 17 groups and 135 accounts in Myanmar for engaging in "coordinated inauthentic behavior," meaning they misrepresented who was running the provocative accounts. It also removed 15 Instagram accounts.
Some 700,000 Rohingya have fled their homes in western Myanmar since last year in response to a brutal counterinsurgency campaign by the military, which has been accused of massive human rights violations. Rights activists and U.N. investigators have charged that the military in Buddhist-dominated Myanmar was carrying out a policy of ethnic cleansing, or even genocide.
"As part of our ongoing investigations into this type of behavior in Myanmar, we discovered that these seemingly independent news, entertainment, beauty and lifestyle Pages were linked to the Myanmar military, and to the Pages we removed for coordinated inauthentic behavior in Myanmar in August," said Facebook's statement. "This kind of behavior is not allowed on Facebook under our misrepresentation policy because we don't want people or organizations creating networks of accounts to mislead others about who they are, or what they're doing."
In its initial action in August to fight the problem, Facebook said it had banned Myanmar's powerful military chief and 19 other individuals and organizations in order to "prevent the spread of hate and misinformation."
Last month, Facebook admitted that it didn't do enough to prevent its services from being used to incite violence and spread hate in Myanmar.
That admission came in response to a report Facebook commissioned from the nonprofit group Business for Social Responsibility, which said, "A minority of users is seeking to use Facebook as a platform to undermine democracy and incite offline violence, including serious crimes under international law."