On Tuesday, Facebook’s Oversight Board announced the first six cases it will consider to determine whether it should override the social media platform’s decisions on removed content.
Facebook CEO Mark Zuckerberg first proposed the independent board in November 2018 to create accountability around Facebook’s decision-making. When Facebook or Instagram users exhaust their content removal appeals within the platform, they can submit their case to the Oversight Board for final review. Since the board started accepting cases in October, users have submitted more than 20,000 pieces of removed content for consideration.
Three of the six chosen cases involve content removed for violating rules on hate speech, an issue the board has deemed complicated for Facebook’s algorithms.
“Hate speech is an especially difficult area,” Jamal Greene, one of the board’s co-chairs and a professor at Columbia Law School, told Reuters. “It’s not that easy … for an algorithm to get the context of” such speech. The other cases the board will consider involve violence and incitement, dangerous individuals and organizations, and adult nudity. Here are the basic details about the six appeals the board will hear.
1. A user posted two screenshots of tweets by former Malaysian Prime Minister Mahathir Mohamad, which said “Muslims have a right to be angry and kill millions of French people for the massacres of the past.” Facebook removed the post for hate speech violations, but the user’s appeal indicates they wanted to spread awareness of the prime minister’s “horrible words.”
2. Another hate speech case involved a user posting photos of a deceased child lying on a beach. The post’s text, in Burmese, criticizes the lack of response to China’s treatment of Uyghur Muslims compared “to the killings in France relating to cartoons.” The user said the post was trying to convey that human lives are more important than religious ideologies.
3. The final hate speech case involved a user posting about the Armenian-Azerbaijani conflict, saying that Armenians are restoring mosques on their land and that they oppose “Azerbaijani aggression” and “vandalism.” The content was removed, but according to the Oversight Board, “The user indicated in their appeal to the Oversight Board that their intention was to demonstrate the destruction of cultural and religious monuments.”
4. One case involved pictures of female breasts that a user said were intended to promote breast cancer awareness. The post was removed for violating policies on adult nudity.
5. Another case involved a user resharing a “Memory” post that featured an alleged quote from Nazi minister of propaganda Joseph Goebbels about needing to appeal to emotions and instincts instead of intellect. Facebook removed the post for violating its policies about dangerous individuals and organizations, but the user argued the quote was important and that they “consider the current US presidency to be following a fascist model.”
6. Facebook itself submitted a COVID-19 misinformation case, in which it removed a post that criticized the French agency responsible for health products for “purportedly refusing authorization for use of hydroxychloroquine and azithromycin against COVID-19, but authorizing promotional mail for remdesivir.” According to the Oversight Board, Facebook said it submitted the case because it “presents an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.”
The board consists of independent members from around the globe, including a Nobel Peace Prize winner, a former prime minister, and multiple university professors. They will not review every case submitted, focusing instead on cases that could set a major precedent for many users. “We will therefore prioritize cases that have the potential to impact many users around the world, are of critical importance to public discourse, or raise important questions about Facebook’s policies,” the board said in a statement in October, when it announced it was open for business. After the board announces it has taken on a case, there is a one-week window during which users can submit comments about it. The board will issue its decision within 90 days of that announcement.
Although Facebook has insisted the board was created with transparency and independence from the social media powerhouse, some are not convinced. Critics have noted the board is entirely funded, structured, and chosen by Facebook. “If it’s radical change you’re looking for, the [Facebook Oversight Board] is not it,” TechCrunch’s Natasha Lomas wrote.
“Its remit does not extend to being able to investigate how Facebook’s attention-seeking business model influences the types of content being amplified or depressed by its algorithms, either.”
In September, a group of civil rights activists started a separate initiative called the “Real Facebook Oversight Board.” The rogue board has accused the platform of letting its tools be used to “spread lies” and enable voter suppression targeting Black and Latinx voters. On Monday, it announced it would review Facebook’s decision to ban President Trump’s former adviser Steve Bannon, who had suggested that the country’s top infectious disease expert, Anthony Fauci, be beheaded. Facebook, however, dismissed the effort.
“We ran a year-long global consultation to set up the Oversight Board as a long-lasting institution that will provide binding, independent oversight over some of our hardest content decisions,” the company said in a statement. “This new effort is mostly longtime critics creating a new channel for existing criticisms.”
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.