AI can identify sexual orientation better than humans do

Computers can tell if you're gay or not just by looking at your face

The researchers argue that the findings lend strong support to the theory that sexual orientation is innate rather than a choice.

Artificial intelligence software developed at Stanford University can predict a person's sexuality with far more accuracy than humans can, suggesting a "gaydar" app may not be far away.

The study, which was conducted by researchers Michal Kosinski and Yilun Wang, analyzed more than 35,000 facial images posted on a U.S. dating website.

Kosinski and Wang suggest that the AI picks up on subtle differences in facial structure that help it make its determination.

The research found that gay men and women tended to have "gender-atypical" features, expressions and "grooming styles", essentially meaning that gay men appeared more feminine and vice versa.

"Typically, [heterosexual] men have larger jaws, shorter noses, and smaller foreheads". The AI had an accuracy of 81% (for men) when given one picture to look at but this rose to 91% when fed five photos of a person.

The study, first reported in the Economist, has sparked heated debate about the biological origins of sexual orientation and the ethics of facial-detection technology, which is becoming increasingly advanced and prevalent in society. According to the researchers, the software was not created as a practical tool; rather, it was created to demonstrate, and even to warn, that technological advances can be used for such means and could pose a threat to our privacy, since this information is so much more easily accessible digitally. The accuracy for women was lower, but still striking, since human judges guessed the sexual orientation of both men and women with far less accuracy.

The researchers trained the AI using pictures of 36,630 men and 38,593 women taken from online dating profiles of gay and straight people.
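Coverage of the paper describes the general approach as extracting numerical features from each face with a pretrained face-recognition network and then fitting a simple classifier on top of those features. Purely as an illustrative sketch of that kind of pipeline, and not the authors' actual code or data, it might look something like the snippet below; the embedding values, sizes, and labels here are random placeholders.

```python
# Illustrative sketch only: "embed faces, then fit a simple classifier".
# Random stand-ins replace real face embeddings and real labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stand-in for per-photo embeddings from a pretrained face-recognition network.
n_people, n_features = 1000, 128
X = rng.normal(size=(n_people, n_features))   # one embedding vector per photo
y = rng.integers(0, 2, size=n_people)         # self-reported label (0 or 1)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A plain logistic-regression classifier on top of the embeddings.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")  # ~0.5 on random data
```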

The model performed worse with women, telling gay and straight apart with 71% accuracy after looking at one photo and 83% accuracy after five. By contrast, human judges were able to identify sexual orientation only 61 percent of the time for men and 54 percent for women. Nick Rule, an associate professor of psychology at the University of Toronto, speculated about AI being used to actively discriminate against people based on a machine's interpretation of their faces: "We should all be collectively concerned".
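One way to see why accuracy could rise when the system moves from one photo to five is that averaging several noisy per-photo scores for the same person suppresses image-specific noise. The toy simulation below illustrates only that general effect; the signal strength, noise level, and counts are invented and have no connection to the study's data or results.

```python
# Toy simulation: a weak per-person signal plus per-photo noise.
# Averaging five photo scores flips fewer cases than trusting one photo.
import numpy as np

rng = np.random.default_rng(1)
n_people, n_photos = 10_000, 5

true_label = rng.integers(0, 2, size=n_people)        # 0 or 1 per person
signal = np.where(true_label == 1, 0.3, -0.3)         # weak per-person signal
scores = signal[:, None] + rng.normal(scale=1.0, size=(n_people, n_photos))

acc_one = ((scores[:, 0] > 0) == (true_label == 1)).mean()
acc_five = ((scores.mean(axis=1) > 0) == (true_label == 1)).mean()
print(f"one photo: {acc_one:.2f}, five photos: {acc_five:.2f}")
```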

The study, which was not peer-reviewed, also makes no distinction between sexual orientation and sexual activity, and assumes there are only two sexual orientations, gay and straight. Advocacy groups criticized the study for excluding people of color as well as bisexual and transgender people, and said the research made overly broad and inaccurate assumptions about gender and sexuality.

"Faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain", the authors concluded from the study.

Of course, as these are dating profiles, all the queer people are presumably open about their sexuality.

However, there are understandably major concerns about this type of technology even existing.

"Imagine for a moment the potential consequences if this flawed research were used to support a brutal regime's efforts to identify and/or persecute people they believed to be gay", he said. Facial images of billions of people are also stockpiled in digital and traditional archives, including dating platforms, photo-sharing websites, and government databases.

"If you can start profiling people based on their appearance, then identifying them and doing frightful things to them, that's really bad", Rule told the Guardian.
