Google AI Won’t Use ‘Man’ or ‘Woman’ to ID People, Says Business Insider

February 21st, 2020 7:59 AM

Google's automated image-labeling service will no longer label people as male or female, according to a report from Business Insider.

Google's Cloud Vision API, a “computer vision” product that can “[a]ssign labels to images and quickly classify them into millions of predefined categories,” is reportedly changing two specific labels. Business Insider claimed to have seen a Feb. 20 email from Google to developers stating that the company would stop applying gendered labels to its image tags.
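To illustrate the change being described, here is a minimal sketch of what replacing gendered labels with a neutral one might look like. The label names, scores, and response shape below are simplified assumptions for illustration, not the actual Cloud Vision API schema or Google's implementation.

```python
# Illustrative sketch only: the label names and dictionary shape are
# hypothetical, loosely modeled on an image-labeling response.

def degender_labels(labels):
    """Replace gendered person labels with the neutral label 'person'."""
    gendered = {"man", "woman"}  # assumed set of labels being retired
    neutral = []
    for label in labels:
        name = label["description"]
        if name.lower() in gendered:
            name = "person"
        neutral.append({"description": name, "score": label["score"]})
    return neutral

# Hypothetical annotations for a single image.
raw = [
    {"description": "Woman", "score": 0.94},
    {"description": "Bicycle", "score": 0.88},
]
print(degender_labels(raw))
```

In this sketch the classifier's confidence scores are untouched; only the label text is swapped, which matches the report that the tags themselves, not the underlying detection, are being removed.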

Business Insider quoted the email directly: “Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”

In the Artificial Intelligence Principles published by Google AI, Principle #2 states: “We will seek to avoid unjust impacts on people, particularly those related to sensitive characteristics such as race, ethnicity, gender, nationality, income, sexual orientation, ability, and political or religious belief.”

Mozilla tech policy fellow Frederike Kaltheuner reportedly told Business Insider that “Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place -- and this comes with lots of assumptions. Classifying people as male or female assumes that gender is binary.”

In 2019, Google reportedly released “gender fluid” emojis for use on Android and in its messaging platforms. The company has also been quick to judge conservatives for what it deems anti-LGBTQ speech: Kay Cole James, head of the conservative Heritage Foundation, drew protests from Google employees when she was announced as a member of the Google Ethics Advisory Board on Artificial Intelligence.