I am a white cisgender man. I can unlock my phone, log in to my bank account, and breeze through border patrol checkpoints with my face, all with roughly 98 percent accuracy.
Facial recognition software is ideal for people who look like me. Of course, I still face the same existential dangers as everyone else: in the United States and China alike, we live under near-total surveillance. The police can track us without a court order and sort us into categories such as "dissident" or "anti-police" without ever investigating us. If I show my face in public, it is probably being tracked. But that doesn't make me special.
What makes me special is that I look like a white man. My beard, short hair, and other features mark me as the "default" in how AI classifies people. If I were Black, brown, a woman, transgender, or non-binary, AI would struggle to identify me, or fail to identify me at all. In this domain, that means cutting-edge technology from Microsoft, Amazon, IBM, and others inherently discriminates against anyone who doesn't look like me.
Unfortunately, advocates of facial recognition often don't see this as a problem. Researchers at the University of Colorado Boulder recently conducted a study showing how badly AI performs when trying to recognize the faces of transgender and non-binary people. People who believe AI should work for everyone frame this as appalling; those who think only in rigid binary terms frame it as "not a problem."
It's easy for an enthusiast to dismiss the tribulations of those whose identities fall outside their worldview, but these people are missing the point entirely: we are teaching AI to ignore basic human physiology.
The researcher Morgan Klaus Scheuerman, who worked on the Boulder study, is by all appearances a cisgender man. But because he has long hair, IBM's facial recognition software labels him "feminine."
And then there are beards. Approximately 1 in 14 women has a condition called hirsutism that causes "excess" facial hair growth, and nearly all humans, men and women alike, grow some facial hair. Yet AI concludes, essentially 100 percent of the time, that facial hair is a male trait. Not because it is, but because it is socially unacceptable for a woman to have facial hair.
If, in 20 years, it suddenly became fashionable for women to grow beards and for men to keep smooth faces, AI trained on binary image datasets would label bearded people as women, whether they are or not.
It is important that people understand that AI is dumb; it no more understands gender or race than a toaster understands thermodynamics. It simply reflects how the people who develop it see race and gender: those who set its reward and success parameters determine its accuracy thresholds. If those people are all white, everything seems fine.
If you are Black? You could be a member of Congress, but Amazon's AI (the same system used by many US law enforcement agencies) is likely to wrongly match you with a criminal mugshot. Google might think you are a gorilla. Worse, if you are a Black woman, every major facial recognition system has a significant chance of labeling you a man.
And if you're non-binary or transgender, things get even worse. According to a researcher who worked on the Boulder study:
If you are a cisgender man or a cisgender woman, you do quite well in these systems. If you are a trans woman, not so well. And if you're a trans man … looking at Amazon Rekognition … you're at 61 percent. But if we go beyond people who have binary gender identities … one hundred percent of the time they will be classified incorrectly.
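The pattern the researcher describes, where a respectable aggregate accuracy conceals total failure for some groups, is easy to illustrate with a toy calculation. The sample data below is invented purely for demonstration; it does not reproduce the Boulder study's actual figures.

```python
from collections import defaultdict

# Invented example data: (group, predicted gender, actual gender)
samples = [
    ("cis man",    "man",   "man"),
    ("cis man",    "man",   "man"),
    ("cis man",    "man",   "man"),
    ("cis man",    "man",   "man"),
    ("cis woman",  "woman", "woman"),
    ("cis woman",  "woman", "woman"),
    ("cis woman",  "woman", "woman"),
    ("trans man",  "woman", "man"),         # misclassified
    ("non-binary", "man",   "non-binary"),  # misclassified
    ("non-binary", "woman", "non-binary"),  # misclassified
]

def per_group_accuracy(samples):
    """Accuracy broken down by group, rather than averaged over everyone."""
    totals, hits = defaultdict(int), defaultdict(int)
    for group, predicted, actual in samples:
        totals[group] += 1
        hits[group] += (predicted == actual)
    return {g: hits[g] / totals[g] for g in totals}

overall = sum(p == a for _, p, a in samples) / len(samples)
print(f"overall accuracy: {overall:.0%}")  # 70% sounds tolerable...
for group, acc in per_group_accuracy(samples).items():
    print(f"{group}: {acc:.0%}")           # ...until you break it down
```

A vendor reporting only the 70 percent headline number would hide the fact that, in this toy dataset, the classifier works perfectly for cisgender people and fails every single trans and non-binary person.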
Facial recognition software reinforces the flawed social constructs that men with long hair are female, that women with short hair are male, that intersex people don't matter, and that the bar for an AI product's viability is "if it works for white cisgender men, it's ready for release."
It is easy to ignore this problem if it doesn't affect you, because the "dangers" of facial recognition software are hard to see. Black people and women can use Apple's Face ID, so we assume this on-device example of machine learning represents the reality of database-connected general recognition. It does not.
Face ID compares the face it sees against a database consisting of only you. General recognition, such as finding a face in the wild, works through programmable confidence thresholds. Amazon Rekognition, for example, can be set to 99 percent confidence (it won't declare a match unless it is 99 percent sure), but at that setting it becomes nearly useless, working only on white cisgender men and women with perfect portrait photos and great lighting. Law enforcement agencies therefore lower the threshold well below Amazon's recommended minimum so the system will start making "guesses" on non-white faces.
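The mechanics of that trade-off can be sketched in a few lines. This is a minimal illustration of threshold-gated matching in general, not any vendor's actual API; the function name and candidate scores are hypothetical.

```python
# Minimal sketch: a one-to-many face search returns candidate matches with
# similarity scores, and a confidence threshold decides which ones count.

def filter_matches(candidates, threshold):
    """Keep only candidate matches at or above the confidence threshold."""
    return [name for name, confidence in candidates if confidence >= threshold]

# Imaginary similarity scores from a face search against a watchlist
candidates = [("suspect_a", 99.1), ("suspect_b", 87.4), ("suspect_c", 62.0)]

print(filter_matches(candidates, 99))  # strict setting: one high-confidence hit
print(filter_matches(candidates, 60))  # lowered setting: three "guesses"
```

Lowering the threshold doesn't make the underlying model any more accurate; it just lets low-confidence guesses through as if they were matches, which is exactly what happens when agencies operate below the recommended minimum.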
Police, politicians, banks, airports, border patrols, ICE, UK passport offices, and thousands of other organizations and government entities use facial recognition every day, despite the fact that it creates and automates white cisgender privilege.
If Microsoft released a version of Windows that demonstrably worked best for Black or Asian users, the outrage would likely be enough to shake billions of dollars out of the company's value. Yet somehow it doesn't matter that nearly all AI technology that labels or classifies people by inherent characteristics such as sex, gender, and race exacerbates systemic intolerance.
Facial recognition software designed to make things more efficient in general acts as a "cisgender whites only" sign barring everyone else from the future. It gives white cisgender men a decisive advantage over the rest of the world.
There is certainly hope that researchers will someday find a way to combat these biases. But right now, any government or company that deploys these technologies at scale risks knowingly using artificial intelligence to disseminate, codify, and reinforce existing intolerance, racism, and misogyny.