@Verge — #Google #AI Tool Will No Longer Use Gendered Labels Like “Woman” Or “Man” In Photos Of People; #Privacy

New York (The Verge) — Google AI tool will no longer use gendered labels like “woman” or “man” in photos of people (theverge.com/2020/2/20/2114…).

Google’s AI tool for developers won’t add gender labels to images anymore; the company says a person’s gender can’t be determined by appearance alone. Google emailed developers about the change to its Cloud Vision API, which developers use to analyze images and identify faces, landmarks, explicit content, and other recognizable features.

Google’s Cloud Vision API will tag images of people as ‘person’ to thwart bias.
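For context, here is a minimal sketch of how a developer might run label detection with the official google-cloud-vision Python client (v2+). The file name photo.jpg is a placeholder, and the exact labels returned are illustrative; per the change described in the article, photos of people should now surface a generic “Person” label rather than “Woman” or “Man”.

```python
# Minimal sketch: label detection with the Cloud Vision API.
# Assumes the google-cloud-vision client (v2+) is installed and
# application default credentials are configured; "photo.jpg" is
# a placeholder path.
from google.cloud import vision


def label_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)
    for label in response.label_annotations:
        # After the change, photos of people are expected to return a
        # generic "Person" label instead of gendered labels.
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    label_photo("photo.jpg")
```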

AI image recognition has been a thorny issue for Google in the past. In 2015, a software engineer noted that Google Photos’ image recognition algorithms were categorizing his black friends as “gorillas.” Google promised to fix the issue, but a follow-up report by Wired in 2018 found Google had blocked its AI from recognizing gorillas and had not done much else to address the problem at its core.

Google released its AI principles in 2018, in response to backlash from Google employees, who protested the company’s work on a Pentagon drone project. The company pledged not to develop AI-powered weaponry, and it also outlined a number of principles, including a commitment to avoid creating or reinforcing unfair bias, to address bias, oversight, and other potential ethical issues in its future development of the technology. Kim Lyons/@verge

Source: The Verge, full story

