Facebook has apologized after an "unacceptable" error by its artificial intelligence (AI) software, which asked users of the social media platform who viewed a clip featuring black men whether they wanted to keep watching "videos about primates."
The clip, uploaded in June last year, showed an encounter between a group of black men and a white man. In the video, the white man appears to call the authorities and report that he is "being harassed" by the black men.
Last week, Darci Groves, a former Facebook employee, tweeted about the error after a friend alerted her to the misidentification. Groves called the prompt "egregious" and noted that even though the clip was more than a year old, her friend had received the recommendation only recently. She also shared a screenshot of the clip alongside the social media giant's "primates" message.
After the tweet, Facebook reportedly disabled the topic-recommendation feature and said it was investigating the cause of the mistake. Facebook spokesperson Dani Lever told The New York Times that the automated prompt was an "unacceptable error" and apologized to anyone who came across the offensive message.
This is not the first time technology has been blamed for a racial blunder online. According to the New York Daily News, recent studies have found that facial recognition technology has more trouble accurately identifying people of color.
About six years ago, Google apologized after its Photos application mistakenly labeled black people as "gorillas." Jacky Alcine, a New York-based software developer and one of the people in the photo, brought the error to Google's attention. The company was also criticized on social media over the label's racist connotation.
A Google spokeswoman told the BBC in 2015 that there was still a lot of work to do on automatic image labeling, and that the company was looking at how to prevent such mistakes from happening in the future.
Alcine, however, questioned what kinds of people and photos had been used in the system's initial training to produce results like these.