Gay rights groups have come out against a controversial study that found that sexual orientation can be read from people’s faces.

The research, by Stanford's Yilun Wang and Michal Kosinski, has obvious implications for personal privacy. The study used an artificial intelligence facial recognition algorithm and more than 35,000 pictures of men and women on a dating site who'd identified themselves as gay or straight.

The AI model correctly distinguished between gay and straight men 81% of the time, and between gay and straight women with 71% accuracy. In short, the study lends scientific support to the notion of "gayface."

The study's co-authors said they were "really disturbed" by their findings: "Given that companies and governments are increasingly using computer vision algorithms to detect people's intimate traits, our findings expose a threat to the privacy and safety of gay men and women."
