
Why Using AI to Identify LGBTQ+ People Is Dangerous


Using AI to determine queer sexuality is misconceived and dangerous.

How do we know if someone is gay? A recent Stanford University study has claimed that artificial intelligence (AI) using a facial recognition algorithm can guess whether a person is gay or lesbian more accurately than human beings can.

The study has proved controversial not because of our apparent mediocrity in the face of computer algorithms, but because of its dubious methodology — among other things, its exclusive focus on white subjects and its exclusion of bisexual, transgender, and intersex participants. It also highlights the danger that AI could be used to “out” sexual minorities against their will, exposing them to possible discrimination.


We strongly object to the use of an algorithmic “gaydar” to predict a person’s sexual orientation, and we believe studies such as this are misconceived and pose very real and present dangers for LGBTQI human rights around the world.

The ambiguity of gayness

The claim that AI can determine whether a person is gay or lesbian by assessing a photograph of their face presupposes that sexuality exists as a binary: you’re either gay or straight. Yet some people are neither — they may, for example, be attracted to people who identify as a third gender.

Moreover, while we are accustomed to using the term gay, its definition is somewhat elusive. If the term refers to particular kinds of sexual activity between men, then self-identified but celibate homosexuals present a problem, as do prisoners who have sex with men but don’t identify as homosexual, and those who have sex with men for money but don’t identify as gay. So what does a “gay” face reveal to an algorithm? If it’s not sexual acts, then is it sexual preference, queer sensibility, identity, or something else?

However, this isn’t the most disturbing aspect of such studies. The greater concern lies with why we are so obsessed with knowing the causes of homosexuality in the first place.

The search for a cure

The study of homosexuality has an ignoble past. From 19th-century sexology and psychoanalysis to more contemporary studies within medical science, homosexuality has been pathologized. Attempts have been made to eradicate it through various methods, including talking therapies, electroconvulsive therapy, and lobotomy. It was only in 1973 and 1992 that homosexuality was removed from the main global diagnostic manuals of mental disorders — the American Psychiatric Association’s DSM and the World Health Organization’s ICD, respectively.

This history produces a chilling effect. As the American queer theorist Eve Sedgwick warned in 1990, the question of what makes someone gay is not easily separated from “the essentially gay-genocidal nexuses of thought through which [it] has developed.” LGBTQI people might be less concerned if the desires animating the research were not ultimately linked to ways of policing — or, worse, erasing — sex, gender, and sexual difference.

The Stanford study does not buck this trend. Rather, the authors note that:

Our results provide strong support for the [prenatal hormone theory], which argues that same-gender sexual orientation stems from the underexposure of male fetuses and overexposure of female fetuses to prenatal androgens responsible for the sexual differentiation of faces, preferences, and behavior.

Our concern is that such studies encourage us to reduce being gay to something that can be understood or, worse, “cured” through science. Recently, intersex advocates have warned that doctors have prescribed a steroid called dexamethasone to prevent homosexuality and “physical masculinization” in fetuses identified as female.

Dangers of prediction

It is not our intention to attribute bad motives to the authors of the Stanford study. They themselves warn against some potential homophobic uses of their research. But it is precisely those potential uses that demand scrutiny: which state institutions, companies, or groups are likely to use such a study, and to what ends? Ultimately, a concern with ethics requires us to ask whether such studies and technologies further or undermine LGBTQI human rights.


Predicting someone’s sexuality may sound innocuous, but in places that criminalize or police homosexuality and gender non-conformity, the consequences of prediction can be life-threatening. For example, the Malaysian government issued a “guide” for identifying homosexuals that relied on physical and social characteristics. This sort of information encourages institutions and individuals to “spot” so-called “deviants” in order to subject them to punishment, discrimination, harassment, and vilification.

Even in places where queer people seek protection from persecution, scientific tests of sexuality prove problematic. The Czech Republic, for example, has used phallometric testing to determine whether a person seeking asylum is really gay: electrodes are attached to the penis to measure physiological arousal while the person watches gay pornography. Not only do such tests violate privacy, they are also wholly inaccurate at determining whether someone identifies as gay or faces persecution because they are perceived to be gay.

While the European Court of Justice has condemned the use of such humiliating tests, using facial recognition software to predict sexuality could result in refugees being denied asylum because algorithms — like many human decision-makers — fail to recognize their sexuality.

We cannot ignore the consequences of algorithms that predict someone’s sexuality. In a world that polices and punishes people on the basis of their actual or perceived sexual identity, ignoring such things can have devastating results.

Alex Sharpe is a professor of law at Keele University, and Senthorun Raj is a lecturer at Keele Law School, Keele University.

This article was originally published on The Conversation. Read the original article.
