Google Photos still can’t find gorillas. Nor can Apple.

Credit: Desiree Rios/The New York Times

Eight years after a controversy over Black people being mislabeled as gorillas by image analysis software, and despite major advances in computer vision, the tech giants still fear repeating the mistake.

When Google released its standalone Photos app in May 2015, people were blown away by what it could do: analyze images to label the people, places, and things in them, an astounding consumer offering at the time. But two months after the release, a software developer, Jacky Alciné, discovered that Google had labeled photos of him and a friend, who are both Black, as “gorillas,” a label that is particularly offensive because it echoes centuries of racist tropes.

Amid the ensuing controversy, Google prevented its software from labeling anything in photos as a gorilla and vowed to fix the problem. Eight years later, with major advances in artificial intelligence, we tested whether Google had solved the problem, and we looked at comparable tools from its competitors: Apple, Amazon, and Microsoft.

There was only one member of the primate family that Google and Apple were able to identify: lemurs, the permanently startled-looking, long-tailed animals that share opposable thumbs with humans but are more distantly related than apes are.

Google’s and Apple’s tools were clearly the most sophisticated when it came to image analysis.

Yet Google, whose Android software underpins most of the world’s smartphones, has made the decision to turn off the ability to visually search for primates for fear of making an offensive mistake and labeling a person as an animal. And Apple, whose technology performed similarly to Google’s in our testing, appears to have disabled the ability to search for monkeys and apes as well.

Consumers may not need to do such a search frequently — though in 2019, an iPhone user complained on an Apple customer support forum that the software “can’t find monkeys in the photos on my device.” But the case raises larger questions about other unfixed or irreparable flaws lurking in services that rely on computer vision — technology that interprets visual images — as well as other AI-powered products.

Mr. Alciné was dismayed to learn that Google has still not fully solved the problem and said society puts too much faith in technology.

“I will forever have no faith in this artificial intelligence,” he said.

Computer vision products are now used for tasks as mundane as sending an alert when a package is on the doorstep, and as weighty as navigating cars and finding perpetrators in law enforcement investigations.

The errors can reflect racist attitudes among those encoding the data. In the gorilla incident, two former Google employees who worked on the technology said the problem was that the company had not put enough photos of Black people in the image collection used to train its AI system. As a result, the technology was not familiar enough with darker-skinned people and confused them with gorillas.

As AI becomes more ingrained in our lives, it raises fears of unintended consequences. Although computer vision products and AI chatbots like ChatGPT are different, both depend on vast troves of underlying data that train the software, and both can misfire because of flaws in that data or biases built into their code.

Microsoft recently restricted users’ ability to interact with a chatbot built into its search engine, Bing, after it instigated inappropriate conversations.

Microsoft’s decision, like Google’s choice to completely prevent its algorithm from identifying gorillas, illustrates a common approach in the industry — to turn off features of technology that malfunction rather than fix them.

“Resolving these issues is important,” said Vicente Ordonez, a professor at Rice University who studies computer vision. “How can we trust this software for other scenarios?”

Google has prevented its Photos app from labeling anything as a monkey or ape because it determined the benefit “doesn’t outweigh the risk of harm,” said Michael Marconi, a Google spokesperson.

Apple declined to comment on users being unable to search for most primates on its app.

Representatives from Amazon and Microsoft said the companies were always striving to improve their products.

When Google was developing its Photos app, which was released eight years ago, it collected a large number of photos to train its AI system to recognize people, animals, and objects.

Two former Google employees said that oversight, a shortage of photos of Black people in the training data, subsequently caused the app to malfunction. The company failed to uncover the “gorilla” problem at the time, the former employees said, because it had not asked enough staff members to test the feature before its public debut.

Google apologized profusely for the gorilla incident, but it was one of a number of episodes across the tech industry that have led to accusations of bias.

Other products that have drawn criticism include HP’s face-tracking webcams, which could not detect some people with darker skin, and the Apple Watch, which, according to a lawsuit, failed to accurately read blood oxygen levels across skin tones. The flaws suggested that tech products were not being designed for people with darker skin. (Apple pointed to a 2022 paper detailing its efforts to test its blood oxygen feature on “a wide range of skin types and tones.”)

Years after the Google Photos error, the company encountered a similar problem with its Nest home security camera during internal testing, according to a person familiar with the incident who worked at Google at the time. The Nest camera, which used artificial intelligence to determine whether someone on a property was familiar or unfamiliar, mistook some Black people for animals. Google rushed to fix the problem before users had access to the product, the person said.

Nevertheless, Nest customers continue to complain on the company’s forums about other flaws. In 2021, a customer received alerts that his mother was ringing the doorbell, only to find his mother-in-law on the other side of the door. When users complained that the system was mixing up faces they had marked as “familiar,” a customer support representative on the forum advised them to delete all of their labels and start over.

“Our goal is to prevent these types of errors from ever occurring,” said Mr. Marconi, the Google spokesperson. He added that the company had improved its technology “by partnering with experts and diversifying our image datasets.”

In 2019, Google tried to improve a facial recognition feature for Android smartphones by increasing the number of people with dark skin in its data set. But the contractors Google hired to collect the face scans reportedly resorted to a troubling tactic to compensate for the dearth of diverse data: They targeted homeless people and students. Google executives called the incident “extremely disturbing” at the time.

While Google worked behind the scenes to improve the technology, it never let users judge those efforts.

Margaret Mitchell, a researcher and co-founder of Google’s Ethical AI group, joined the company after the gorilla incident and collaborated with the Photos team. She said in a recent interview that she supported Google’s decision to remove “the gorilla label, at least for a while.”

“You have to think about how often someone needs to label a gorilla versus perpetuating harmful stereotypes,” Dr. Mitchell said. “The benefits don’t outweigh the potential harms of getting it wrong.”

Dr. Ordonez, the professor, speculated that Google and Apple could now be capable of distinguishing primates from humans, but that they did not want to enable the feature given the potential reputational risk if it misfired again.

Google has since released a more powerful image analysis product, Google Lens, a tool for searching the web with images rather than text. Wired discovered in 2018 that the tool was also unable to identify a gorilla.

These systems are never foolproof, said Dr. Mitchell, who no longer works at Google. Because billions of people use Google’s services, even rare glitches that happen to only one in a billion users will surface.

“It only takes one mistake to have massive social repercussions,” she said, referring to it as “the poisoned needle in a haystack.”
