Google has announced that its computer vision service will no longer tag images by gender. According to an email sent to developers yesterday, the AI-based tool will no longer apply gender labels such as "woman" or "man," and will instead default to "person."
The change applies to the Cloud Vision API, which developers can use to label photos based on what the computer "sees" in the frame. According to Business Insider, which obtained a copy of the email, the API has been modified to avoid potential bias.
"Given that a person's gender cannot be inferred by appearance," the email reads, "we have decided to remove these labels in order to align with Google's Artificial Intelligence Principles, specifically Principle #2: avoid creating or reinforcing unfair bias."
Testing the API for yourself shows that the change has already taken
effect:
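Here is a minimal sketch of such a test using the official google-cloud-vision Python client; the image path is a placeholder, and it assumes the package is installed and GOOGLE_APPLICATION_CREDENTIALS points at a valid service-account key:

```python
# Minimal sketch: run label detection on a photo of a person and print
# the labels the Cloud Vision API returns.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# "person.jpg" is a placeholder path to any photo containing a person.
with open("person.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)

# After the change, gendered labels such as "Man" or "Woman" should no
# longer appear; a generic label like "Person" is returned instead.
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```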
The bias Google refers to is the result of flawed training data, which leads the algorithm to make faulty assumptions. Anyone who does not fit the algorithm's trained binary of how a "man" or "woman" may look would thus be automatically mislabeled by the AI. This is what Google is trying to avoid.
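The underlying problem is easy to reproduce. In this toy sketch (entirely synthetic data, invented for illustration), a classifier trained only on two clusters has no way to express "neither," so an input that resembles neither cluster is still forced, often confidently, into one of the two classes:

```python
# Toy illustration: a binary classifier trained on two synthetic clusters
# must assign every input to one of its two classes, even inputs that
# resemble neither cluster.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
class_a = rng.normal(loc=-2.0, scale=0.5, size=(100, 2))  # cluster A
class_b = rng.normal(loc=+2.0, scale=0.5, size=(100, 2))  # cluster B

X = np.vstack([class_a, class_b])
y = np.array([0] * 100 + [1] * 100)

model = LogisticRegression().fit(X, y)

# An input far from both training clusters still gets a binary answer,
# typically with high confidence -- the model cannot say "neither."
outlier = np.array([[0.0, 6.0]])
print(model.predict(outlier), model.predict_proba(outlier))
```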
The AI principle Google cites in the email specifically states that Google will seek to "avoid unjust impacts on people," especially with regard to "sensitive characteristics" such as race, ethnicity, gender, political belief, and sexual orientation, among others.
According to Business Insider, the decision drew predictably mixed reactions. A tech policy fellow at Mozilla they spoke with called the step "very positive" and agreed with Google that a person's gender cannot be inferred from their appearance, while at least one affected developer pushed back, complaining that the change amounted to "political correctness" in APIs.