Google apologized after its app called two users something extremely racist.

Google's new auto-tagging algorithm for Google Photos turned out to be a racist a-hole.

Google has never been afraid to roll out experimental new technology and let users catch the bugs. It's actually the best way to develop software. But that philosophy fails when your software starts spewing racial slurs.

Jacky Alciné is a computer programmer from Brooklyn, NY. He uploaded a number of pictures to Google Photos this week, and got a nasty surprise. A new update to the app allows it to automatically tag pictures and sort them into categories based on similarity to sample images. Basically, the app knows what photos are of food, landscapes, buildings, people, and animals. Or it's supposed to.
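For the technically curious, here's a rough sketch of how that kind of similarity-based tagging works. To be clear, this is a toy illustration, not Google's actual pipeline: it assumes each photo has already been boiled down to a numeric feature vector (an "embedding") by some trained model, and it simply assigns each photo the label of the most similar sample image, or no label at all if nothing is close enough.

```python
# Minimal sketch of similarity-based auto-tagging (NOT Google's actual code).
# Assumes each photo is already reduced to a feature vector (embedding);
# a real system would get these from a trained neural network.
import numpy as np

# Hypothetical labeled "sample images," one embedding per category.
SAMPLES = {
    "food":      np.array([0.9, 0.1, 0.0]),
    "landscape": np.array([0.1, 0.9, 0.1]),
    "person":    np.array([0.0, 0.2, 0.9]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def auto_tag(photo_embedding: np.ndarray, threshold: float = 0.8) -> str:
    """Return the label of the most similar sample image,
    or 'untagged' if nothing is similar enough."""
    label, score = max(
        ((name, cosine_similarity(photo_embedding, emb))
         for name, emb in SAMPLES.items()),
        key=lambda pair: pair[1],
    )
    return label if score >= threshold else "untagged"

print(auto_tag(np.array([0.85, 0.15, 0.05])))  # -> "food"
```

The whole scheme is only as good as the embeddings and the sample images behind it, which is exactly where things went wrong here.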

The app sorted the photos of Alciné and a friend of his into an album labeled "gorillas." Considering that "gorilla" is a slur for black people, and that both Alciné and his friend are black, that's pretty bad. What's worse is that these photos were singled out from a larger collection, the rest of which were properly labeled. Alciné uploaded the proof to Twitter:

Less than an hour and a half later, Google's Chief Social Architect Yonatan Zunger wrote back to Alciné in full damage-control mode:

Google's devs immediately went to work fixing the error. As a stopgap, they removed the "gorilla" tag entirely, but a more permanent solution will take a lot of work. In an extended Twitter thread, Zunger explained that their facial recognition software had difficulty processing "dark-skinned faces." He added that even light-skinned faces had posed problems the team had only recently solved.
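Yanking a tag outright is essentially a blocklist slapped on the classifier's output. Here's a rough sketch of what that kind of stopgap looks like (the labels, scores, and threshold are made up for illustration):

```python
# Sketch of a label-blocklist stopgap (illustrative only; the labels,
# scores, and threshold are invented, not Google's actual values).
BLOCKED_LABELS = {"gorilla"}       # tags suppressed outright
MIN_CONFIDENCE = 0.6               # drop low-confidence guesses too

def filter_labels(predictions: dict[str, float]) -> list[str]:
    """Keep only labels that are confident enough and not blocklisted."""
    return [
        label
        for label, score in predictions.items()
        if score >= MIN_CONFIDENCE and label not in BLOCKED_LABELS
    ]

print(filter_labels({"person": 0.91, "gorilla": 0.72, "tree": 0.40}))
# -> ['person']
```

Note that a blocklist treats the symptom, not the cause: the underlying model still mislabels the photo, it just isn't allowed to say so. That's why the permanent fix takes so much more work.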

This is actually a pretty great example of customer relations. Zunger responded immediately with multiple apologies and a concerted effort to fix the problem, and Alciné thanked him for it. Google responded officially as well. In a statement to Ars Technica, a spokesperson said:

"We're appalled and genuinely sorry that this happened. We are taking immediate action to prevent this type of result from appearing. There is still clearly a lot of work to do with automatic image labeling, and we're looking at how we can prevent these types of mistakes from happening in the future."

Good work, Google. Here's one step you can take to prevent it from happening again: don't get racist great-grandfathers to write your software.
