Can a computer determine whether a person is lesbian or gay by appearance? Stanford University researchers claimed that it could, causing a firestorm in the media. Early in the coverage, the mainstream media accepted their research uncritically:

“Advanced computer analysis … found that gay men and women tended to have ‘gender-atypical’ features: fixed features such as the shape of one’s nose or jaw, and transient features such as hairstyle and facial hair…. ‘The results show that the faces of gay men were more feminine and the faces of lesbians were more masculine than those of their respective heterosexual counterparts. Among men, the data-driven measure of facial femininity positively correlated with the probability of being gay.’”

The study reported that gay men had narrower jaws and lesbians had larger ones. The assumption rests on the questionable theory that these characteristics result from levels of hormone exposure, in particular testosterone, in the uterus. The study used only white people because, according to the researchers, the source gave them insufficient numbers of people of color. The computer was programmed to focus on the nose, eyes, eyebrows, cheeks, hairline, and chin for men, while the nose, mouth corners, hair, and neckline were more important for women. The abstract also stated that gay men and lesbians have “gender-atypical … expression and grooming styles.”

“We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style).”
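The pipeline the abstract describes, fixed-length feature vectors extracted from facial images by a deep network and then fed to a logistic-regression classifier, can be sketched in a few lines. The code below is a minimal illustration only, not the authors' code: the feature vectors are synthetic stand-ins for network embeddings, and the dimensions, sample sizes, and learning rate are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for deep-network face embeddings: synthetic 10-dimensional
# feature vectors for two deliberately well-separated classes.
# (Entirely invented data; real embeddings would come from a trained network.)
n, d = 200, 10
X = np.vstack([rng.normal(-1.0, 0.5, (n, d)),   # class 0 samples
               rng.normal(+1.0, 0.5, (n, d))])  # class 1 samples
y = np.concatenate([np.zeros(n), np.ones(n)])

# Plain logistic regression, fit by gradient descent on the log-loss.
w = np.zeros(d)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted probabilities
    w -= 0.5 * (X.T @ (p - y)) / len(y)     # gradient step on weights
    b -= 0.5 * np.mean(p - y)               # gradient step on bias

# Training accuracy: fraction of samples on the correct side of the boundary.
accuracy = np.mean(((X @ w + b) > 0) == (y == 1))
```

On data this cleanly separated the classifier scores nearly perfectly, which is precisely the critics' point below: high accuracy on a curated, binary dataset says little about performance in the general population.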

Among their criteria are the claims that “heterosexual men and lesbians tended to wear baseball caps” and that “gay men were less likely to wear a beard.” Lesbians don’t wear eye makeup, but straight women do. And lesbians wear “less revealing clothes,” have darker hair, and smile less than straight women.

The selection of photographs of white people was based on the researchers’ preconceived assumptions about ethnicity.

Only two sexual orientations—straight and lesbian/gay—were assumed for the study. No bisexuals, asexuals, questioning, etc.

All the subjects were under the age of 40.

Everyone lived in the United States.

They were all open about their sexual orientations on a dating website, identifying the gender (“men” or “women”) of the partners they sought. No closeted people were included.

The study addressed a rigid sexual and gender binary, ignoring people with non-binary gender identity or sexual orientation.

A response to an article about the research’s accuracy made valid points:

“My face would look different if I were single. If I were trying to attract a woman, I would go to the gym more, which would slim down my face. I would get haircuts more often. I would trim my beard more often, and shave more often. It’s intuitive to me that this would be common. Some people might put more effort into styling their hair, doing their makeup, treating their acne, wearing contacts instead of glasses, etc. (I’m not saying people who wear glasses are less attractive; I’m saying people who wear glasses might feel they’re less attractive than they would be without glasses.)”

The researchers’ claim that the computer could select gay people with a 91-percent accuracy rate applies only when the computer program knew that one of two images belonged to a gay person. When selecting from 1,000 men at random with at least five photographs each, the rate of success dropped to seven in every 100. When given the 100 males that the researchers thought were most likely to be gay, only 47 of them self-identified as gay.
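The drop from 91 percent to 47 in 100 is the base-rate problem: when the trait being detected is rare, even an accurate test produces mostly false positives. The sketch below works this out with Bayes’ rule, using illustrative assumptions: the study’s reported 91-percent pairwise accuracy is treated as both the sensitivity and the specificity of the classifier, and prevalence is set at 7 percent, echoing the seven-in-100 figure above.

```python
def precision_at_prevalence(sensitivity, specificity, prevalence):
    """Positive predictive value (precision) via Bayes' rule:
    of those flagged positive, what fraction actually are positive?"""
    true_positives = sensitivity * prevalence
    false_positives = (1 - specificity) * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# Illustrative assumptions, not figures from the study itself:
# sensitivity = specificity = 0.91, prevalence = 7%.
ppv = precision_at_prevalence(0.91, 0.91, 0.07)
```

Under these assumptions the precision comes out near 43 percent, strikingly close to the 47-of-100 result the researchers actually observed, while the same classifier applied to a 50/50 population would score above 90 percent.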

One person evaluating the research said that “it’s a description of beauty standards on dating sites that ignores huge segments of the LGBTQ community, including people of color, transgender people, older individuals, and other LGBTQ people who don’t want to post photos on dating sites.”

After greater scrutiny of the study, the American Psychological Association’s Journal of Personality and Social Psychology, which had tentatively accepted it for publication, decided to take a “closer look.” An editor addressed the journal’s concerns:

“In the process of preparing the [manuscript] for publication, some concerns have been raised about the ethical status of your project. At issue is the use of facial images in both your research and the paper we accepted for publication. The acceptance was conditional on the ethical clearance of the work to be published. We typically do this by having authors state the [Institutional Review Board] status of their work. However, in this particular case, there turns out to be some idiosyncratic issues that must be cleared before your paper can be set ready for publication.”

The editor continued by stating that the images were copyrighted and that “the owners of the images posted them for different purposes. It may therefore be deemed unlikely that all of them would have granted consent for the use of their images in your research work.”

GLAAD stated that the technology can only “recognize … a pattern that found a small subset of out white gay and lesbian people on dating sites who look similar.” Sherry Turkle, a professor at the Massachusetts Institute of Technology and author of the book Reclaiming Conversation, said that the industry may want to “buy and sell this information with purposes of social control.” She pointed out that judging a person by appearance can increase institutional discrimination. Also frightening is extending the purported use of the technology to identify other characteristics, such as those assumed to belong to Jewish people.

Ashland Johnson, Director of Public Education and Research at the Human Rights Campaign, pointed out how the “dangerously bad” research was “likely to be taken out of context” and possibly “threaten the safety and privacy of LGBTQ and non-LGBTQ people alike.” He gave as an example “a brutal regime’s efforts to identify and/or persecute people they believed to be gay.”

In Wired, Sophia Chen drew attention to the researchers’ excuse for developing the program: “[Our] findings expose a threat to the privacy and safety of gay men and women.” Chen responded:

“They built the bomb so they could alert the public about its dangers.”

Ethicist Jake Metcalf of Data & Society said that social scientists lack clear ethical guidelines to keep them from harming people. He explained that outdated and irrelevant guidelines for social experiments allow researchers to make up the rules themselves. Government-funded scientists must get the approval of an institutional review board, which developed its rules 40 years ago for such human interactions as drawing blood or conducting interviews. In this study about identifying lesbians and gays, the researchers assumed they didn’t have to consult a review board because they didn’t interact with anyone; they just took the photos off an online dating site.

Last month, researchers released a free app to guess ethnicity and nationality from names with 80-percent accuracy. Steven Skiena, a scientist at the affiliated Stony Brook University, said the purpose was to prevent discrimination. Yet no one has any way to control the use of the app. The same problem applies to a possibly inaccurate method of identifying lesbians and gays.