All of us are recorded on surveillance cameras dozens of times per day. And soon many of us will be unlocking new iPhones with our faces. As facial recognition technology continues to improve, there is a danger that it can be combined with the ample existing footage of us to invade our privacy in new ways.
To warn against this, Dr. Michal Kosinski, a professor at Stanford Graduate School of Business, set out to develop gaydar. By feeding 35,000 photos of gay and straight folks' faces through an off-the-shelf facial analysis program and into a deep neural network, Kosinski and co-researcher Yilun Wang reckon they've given AI the ability to visually distinguish between gay and straight.
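At a high level, the approach the paper describes is simple: extract numeric features from each face photo with a pretrained facial analysis tool, then train a basic classifier on those features. Here's a minimal sketch of that idea, assuming you already have face embeddings as numeric vectors; the random arrays below are stand-ins for real embeddings and labels, not the study's data, and the five-photo averaging is just one plausible way to combine per-photo scores into a per-person prediction.

```python
# Hedged sketch: binary classification on face embeddings.
# X and y are synthetic stand-ins, NOT the study's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, dim = 200, 128
X = rng.normal(size=(n, dim))        # stand-in for per-photo face embeddings
y = rng.integers(0, 2, size=n)       # stand-in binary labels

# A simple linear classifier over the embeddings.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Per-photo probability of the positive class.
probs = clf.predict_proba(X)[:, 1]

# Averaging scores across several photos of the same person is one way
# a "five photographs" prediction could be aggregated.
person_score = probs[:5].mean()
```

The point of the sketch is that the heavy lifting happens in the feature extractor; the classifier on top can be quite simple, which is part of why critics focus on what signal the features actually capture.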
The accuracy that they're claiming is astonishing: They say that if provided five photographs of a person, their system can correctly identify gay or straight with 91% accuracy for males and 83% accuracy for females. When humans were put to the test, their accuracy was far worse, getting it right only 61% of the time for males and 54% for females.
This has some terrifying implications. There are plenty of reasons a person might want to keep their sexual orientation private, and there are still countries on this planet where homosexuality is considered a crime punishable by death.
Speaking of death, Dr. Kosinski has had his own life threatened after publishing the research, according to the Times:
"I imagined I'd raise the alarm," Dr. Kosinski said in an interview. "Now I'm paying the price." He'd just had a meeting with campus police "because of the number of death threats."
Plenty of folks have been tearing into the researchers' claims, which are laid out in a publicly viewable paper called "Deep Neural Networks Can Detect Sexual Orientation from Faces," due to be published in the Journal of Personality and Social Psychology. Critics say the system could not possibly be reliable outside of the study, which relied solely on photographs of white Americans who were open about their sexual preferences.
Until Dr. Kosinski's gaydar is definitively proven accurate or inaccurate, we can choose to either believe that it works or that it doesn't. Although the pseudoscience of physiognomy (the idea that one could deduce a person's intelligence and criminal proclivities from their facial features) has been debunked, I don't have a hard time believing that algorithms crunching through thousands of photos can detect patterns that we humans cannot perceive. "Just because humans are unable to see the signs in faces," the Economist points out, "does not mean that machines cannot do so."
I also remember reading a 2003 University of London study in which researchers discovered that lesbians blink like straight men. To explain: people blink involuntarily when startled by, say, a loud noise, and the rate of this eye-blink differs between straight men and straight women. The researchers found, however, that the blink rates of lesbian women lined up with those of straight men. As it is humanly impossible to control this response to being startled, the study would seem to reinforce that sexual orientation is involuntary and not a choice.
If Dr. Kosinski's gaydar is accurate, it, too, could be used to support that case; we cannot easily change the micro-dimensions of our facial features.
Alternatively, the gaydar could be exploited for profit or used in the service of hatred or ideology.
As is always the case with technology, it would be less about the tech and more about what we choose to do with it.