Facebook Parent Meta to ‘Assess the Feasibility’ of Human Rights Review of Its Practices in Ethiopia

Deepak Gupta January 14, 2022
Updated 2022/01/14 at 9:29 AM

Facebook owner Meta Platforms said on Thursday that it would “assess the feasibility” of commissioning an independent human rights assessment of its work in Ethiopia, after its oversight board recommended a review of how Facebook and Instagram have been used to spread content that increases the risk of violence there.

The board, created by the company to address criticism over the handling of problematic material, makes binding decisions on a small number of challenging content moderation cases and provides non-binding policy recommendations.

Meta is under scrutiny by lawmakers and regulators over user safety and how it handles abuses on its platforms around the world, particularly after whistleblower Frances Haugen leaked internal documents that showed the company’s struggles in policing content in countries where such speech was more likely to cause harm, including Ethiopia.

Thousands died and millions were displaced during a year-long conflict between the Ethiopian government and rebel forces in the northern Tigray region.

The social media giant said it had “invested significant resources in Ethiopia to identify and remove potentially harmful content” as part of its response to the board’s December recommendations on a case involving content posted in the country.

The oversight board last month upheld Meta’s original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia’s Amhara region. Because Meta had restored the post after the user appealed to the board, the company had to remove the content again.

On Thursday, Meta said that while it had taken down the post, it disagreed with the board’s reasoning that it should have been removed because it was an “unverified rumor” that significantly increased the risk of imminent violence. The company said such reasoning would impose “a standard of journalistic publishing on people”.

A spokesperson for the oversight board said in a statement: “Meta’s existing policies prohibit rumors that contribute to impending violence that cannot be debunked within a meaningful time frame, and the board has made recommendations to ensure these policies are effectively enforced in conflict situations.”

“Rumors alleging that an ethnic group is complicit in atrocities, as found in this case, have the potential to lead to serious harm to people,” they said.

The board had recommended that Meta commission a human rights due diligence assessment, to be completed in six months, which should include an analysis of Meta’s language capabilities in Ethiopia and an analysis of measures taken to prevent misuse of its services in the country.

However, the company said that not all elements of this recommendation “may be feasible in terms of time, data science or approach”. It said it would continue its existing human rights due diligence and should have an update on whether it could act on the board’s recommendation in the coming months.

Previous Reuters reporting on Myanmar and other countries has examined how Facebook has struggled to monitor content in different languages around the world. In 2018, UN human rights investigators said the use of Facebook had played a key role in spreading hate speech that fueled violence in Myanmar.

Meta, which has said it was too slow to prevent misinformation and hate in Myanmar, said the company now has native speakers around the world reviewing content in more than 70 languages, working to stop abuse on its platforms in places where there is an increased risk of conflict and violence.

The board also recommended that Meta rewrite its value statement on safety to reflect that online speech can pose a risk to people’s physical safety and their right to life. The company said it would make changes to this value, in a partial implementation of the recommendation.

© Thomson Reuters 2022



