Google’s AI Will No Longer Tag Photos with Gender, Will Use ‘Person’ Instead

Google has announced that its computer vision technology will no longer tag photos with a gender. According to an email sent to developers yesterday, the AI-based tool will no longer apply gendered labels such as “woman” or “man,” and will instead default to “person.”
The change applies to the Cloud Vision API, which developers can use to tag photos based on what the computer “sees” inside the frame. According to Business Insider, which obtained a copy of the email, the API has been modified to avoid a potential source of bias.
“Given that a person’s gender cannot be inferred by appearance,” the email reads, “we have decided to remove these labels in order to align with Google’s Artificial Intelligence Principles, specifically Principle #2: avoid creating or reinforcing unfair bias.”

Testing the API for yourself shows that the change has already taken effect:
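For anyone who wants to try this, below is a minimal sketch of a label-detection request using the official google-cloud-vision Python client. It assumes Google Cloud credentials are already configured (e.g. via the GOOGLE_APPLICATION_CREDENTIALS environment variable) and that a local file named portrait.jpg exists; both are assumptions, not part of Google’s announcement.

```python
# Minimal sketch: send a local photo to the Cloud Vision API and print
# the labels it detects. Requires the google-cloud-vision package and
# configured Google Cloud credentials.
from google.cloud import vision


def label_photo(path: str) -> None:
    client = vision.ImageAnnotatorClient()

    # Read the image file and wrap it in the API's Image message.
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    # Ask the API for label annotations on the image.
    response = client.label_detection(image=image)

    for label in response.label_annotations:
        # After the change described here, a portrait should come back
        # tagged "Person" rather than "Man" or "Woman" (hypothetical
        # file name used for illustration).
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    label_photo("portrait.jpg")
```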

The bias Google references is the result of flawed training data, which inevitably leads an algorithm to make certain assumptions. Anyone who doesn’t fit the algorithm’s trained binary notion of how a “man” or “woman” may look would thus automatically be mislabeled by the AI. This is what Google is trying to avoid.
The AI principle Google cites in the email specifically states that Google will seek to “avoid unjust impacts on people,” particularly those related to “sensitive characteristics” such as race, ethnicity, gender, political belief, and sexual orientation, among others.
According to Business Insider, the decision has drawn predictable reactions. A tech policy fellow at Mozilla they spoke with called the move “very positive” and agreed with Google that a person’s gender cannot be inferred from appearance, while at least one affected developer pushed back against the change, complaining that political correctness has no place in APIs.
