
Study: Facial recognition AI's alright, if you're cisgender and white

I am a white cisgender man. I can unlock my phone, log in to my bank account, and breeze through border patrol checkpoints using my face, with about 98 percent accuracy.

Facial recognition software works well for people who look like me. Of course, I still face the same existential dangers as everyone else: in the United States, as in China, we live under near-total surveillance. Police can track us without a warrant and sort us into categories such as "dissident" or "anti-police" without ever opening an investigation. If I show my face in public, I am probably being tracked. But that doesn't make me special.

What makes me special is that I look like a white man. My beard, short hair and other features signal to facial recognition software that I am the "default" when it comes to how AI classifies people. If I were Black, brown, a woman, transgender or non-binary, the AI would struggle to identify me or fail to identify me at all. And, in this domain, that means cutting-edge technology from Microsoft, Amazon, IBM and others inherently discriminates against anyone who doesn't look like me.

Unfortunately, advocates of facial recognition often don't see this as a problem. Researchers at the University of Colorado Boulder recently conducted a study showing how poorly AI performs when trying to recognize the faces of transgender and non-binary people. It's a problem that has been called appalling by people who believe AI should work for everyone, and dismissed as "not a problem" by those who think only in rigid binary terms.
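To make the kind of disparity the study describes concrete, here is a minimal, purely illustrative Python sketch of how per-group accuracy is typically measured for a gender classifier. The group names, labels and sample data below are hypothetical placeholders and are not taken from the University of Colorado Boulder study or any vendor's API.

```python
# Illustrative sketch only: measuring classifier accuracy per demographic group.
# Data and group labels are hypothetical, not from the cited study.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples.
    Returns a dict mapping each group to its classification accuracy."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in records:
        total[group] += 1
        if predicted == actual:
            correct[group] += 1
    return {group: correct[group] / total[group] for group in total}

# Hypothetical example: classifier output vs. self-reported gender.
sample = [
    ("cisgender man", "man", "man"),
    ("cisgender woman", "woman", "woman"),
    ("trans man", "woman", "man"),          # misclassified
    ("non-binary", "man", "non-binary"),    # no correct binary label exists
]

print(accuracy_by_group(sample))
```

Even this toy example shows the structural issue the article points to: a system that only outputs binary labels can never be accurate for non-binary people, no matter how much training data it sees.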
