Metro Weekly

LGBTQ groups decry study claiming facial recognition technology can tell if you’re gay

GLAAD and HRC say study's claims could lead to attempts to use artificial intelligence to persecute LGBTQ people

Photo: Intel Free Press, via Wikimedia.

The Human Rights Campaign and GLAAD are outraged after media outlets reported on a study by a professor affiliated with Stanford University that claims that artificial intelligence can be used to detect sexual orientation.

The two LGBTQ organizations are primarily concerned that the research study — which has a number of methodological flaws — will be misused to target LGBTQ people for discrimination, violence, or other nefarious motives.

The study in question, by Dr. Michal Kosinski and Yilun Wang, claims that facial recognition software can infer a person’s sexual orientation by analyzing their facial features. They claim the software has a high likelihood of successfully determining sexual orientation because the program is able to pick up on facial features that result from exposure to certain levels of hormones, particularly testosterone, while in the womb, reports The Economist.

The study, which is expected to be published in the Journal of Personality and Social Psychology, involved hundreds of thousands of images of gay and straight men and women from a popular American dating website. The researchers claim their software is able to determine correctly whether a man is gay 81% of the time after viewing one photo, and 91% of the time after viewing five photos of the same person. For women, they claim the program picks correctly 71% of the time after one photo, and 83% after five.

The LGBTQ groups wasted no time blasting Kosinski and Wang’s claims, while also calling on Stanford University to distance itself from the study. They are also urging media outlets to highlight and expose the flaws in the research and to debunk any factually inaccurate claims that artificial intelligence can tell if someone is gay just by looking at their face.

“Technology cannot identify someone’s sexual orientation,” Jim Halloran, GLAAD’s chief digital officer, said in a statement. “What their technology can recognize is a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar. Those two findings should not be conflated. This research isn’t science or news, but it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.

“At a time when minority groups are being targeted, these reckless findings could serve as a weapon to harm both heterosexuals who are inaccurately outed, as well as gay and lesbian people who are in situations where coming out is dangerous,” he added.

Even more concerning, the groups say, is that Stanford University and the lead researchers met with GLAAD and HRC several months ago. At that meeting, these same concerns were raised and the LGBTQ groups warned against over-inflating the results or the significance of them. There was no follow-up meeting between the researchers and the LGBTQ representatives, and the same flaws that were highlighted in that meeting have not been addressed.

Specifically, the groups take issue with the fact that the study did not look at any non-white individuals, and did not independently verify crucial information such as age and sexual orientation, instead taking information appearing online at face value. 

The study, which was not peer-reviewed, also makes no distinction between sexual orientation and sexual activity, and assumes there are only two sexual orientations, gay and straight. Lastly, even if the facial recognition software were accurate 81 percent of the time, that would still lead to instances where straight people were incorrectly identified as gay — something that could be concerning were the software used to target sexual minorities.

“This is dangerously bad information that will likely be taken out of context, is based on flawed assumptions, and threatens the safety and privacy of LGBTQ and non-LGBTQ people alike,” Ashland Johnson, HRC’s director of public education and research, said in a statement.

“Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime’s efforts to identify and/or persecute people they believed to be gay,” Johnson said. “Stanford should distance itself from such junk science rather than lending its name and credibility to research that is dangerously flawed and leaves the world — and in this case, millions of people’s lives — worse and less safe than before.”

John Riley is the local news reporter for Metro Weekly. He can be reached at jriley@metroweekly.com