Facebook users who recently watched a video from a British tabloid featuring Black men saw an automated prompt from the social network asking if they wanted to “continue watching videos about Primates,” prompting the company to investigate and disable the AI-powered feature that pushed the message.
Facebook apologized on Friday, calling it an “unacceptable mistake,” and said it was examining the recommendation feature “to prevent this from happening again.”
The video, dated June 27, 2020, was posted by The Daily Mail and featured clips of Black men in disputes with white civilians and police officers. It had no connection to apes or primates.
Darci Groves, a former content design manager at the social network, said a friend recently sent her a screenshot of the prompt. She then posted it to a product feedback forum for current and former Facebook employees. In response, a product manager for Facebook Watch, the company’s video service, called the prompt “unacceptable” and said the company was “looking into the root cause.”
Ms. Groves said the prompt was “terrible and dreadful”.
Facebook spokesperson Dani Lever said in a statement: “As we have said, while we have made improvements to our AI, we know it’s not perfect, and we have more progress to make. We apologize to anyone who may have seen these offensive recommendations.”
Google, Amazon and other tech companies have been under scrutiny for years for biases in their AI systems, particularly around issues of race. Studies have shown that facial recognition technology is biased against people of color and has more trouble identifying them, leading to incidents in which Black people have been discriminated against or arrested because of computer error.
In one example from 2015, Google Photos mistakenly tagged photos of Black people as “gorillas”; the search giant said it was “genuinely sorry” and would work to fix the problem immediately. More than two years later, Wired found that Google’s solution had been to censor the word “gorilla” from searches while also blocking “chimp,” “chimpanzee” and “monkey.”
Facebook has one of the world’s largest repositories of user-uploaded images on which to train its face- and object-recognition algorithms. The company, which tailors content to users’ past browsing and viewing habits, sometimes asks people if they would like to keep seeing posts in related categories. It was unclear whether messages like the “primates” one were widespread.
Facebook and Instagram, its photo-sharing app, have struggled with other race-related issues. After the European Football Championship in July, for example, three Black members of England’s national soccer team were racially abused on the social network for missing penalty kicks in the championship game.
Race issues have also caused internal strife at Facebook. In 2016, chief executive Mark Zuckerberg asked employees to stop crossing out the phrase “Black Lives Matter” and replacing it with “All Lives Matter” in a communal space at the company’s headquarters in Menlo Park, California. Hundreds of employees also staged a virtual walkout last year to protest the company’s handling of a post by President Donald J. Trump about the killing of George Floyd in Minneapolis.
The company later hired a vice president of civil rights and released a civil rights audit. In its annual diversity report in July, Facebook said 4.4 percent of its US-based employees were Black, up from 3.9 percent a year earlier.
Ms. Groves, who left Facebook over the summer after four years, said in an interview that a series of missteps at the company suggested its leaders were not prioritizing ways to deal with racial problems.
“Facebook can’t keep making these mistakes and then saying, ‘I’m sorry,’” she said.