
Hot Take: Detecting Sexuality with AI is Fake Science

In this week’s Good Robot Hot Takes, Kerry and Eleanor talk about a group of scientists in Zurich who tried to measure a correlation between brain activity and sexuality using AI. This smacks not only of previous attempts to use AI to try and ‘read’ people’s sexuality, but also of dangerous 19th- and 20th-century race science. We talk about how the language of science is weaponised against queer people, why there are no real scientific foundations to using AI to detect sexuality, and why science needs to think about sexuality not as fixed or static but wild and infinite.


READING LIST:


The Good Robot - Lorraine Daston on the Exorcism of Emotion in Rational Science (and AI)


The Good Robot - Jack Halberstam on Tech, Resistance, and Invention


The Good Robot - Arjun Subramonian on Queer Approaches to AI and Computing

The Good Robot - Sophie Lewis on Techno-Feminisms and Why Nature is Far Stranger Than We Think


The Good Robot - Blaise Agüera y Arcas on Debunking Myths in Technology: Intelligence, Survival, Sexuality


The Good Robot - Alex Hanna on Vague AI Ethics Principles and why Automatic Gender Recognition is Nonsense


The Good Robot - Os Keyes on Avoiding Universalism and 'Silver Bullets' in Tech Design


Tinsley, O. N. (2018). Ezili's Mirrors: Black Atlantic Genders and the Work of the Imagination.


Woolf, V. (2014). Orlando (Collins Classics).


Daston, L., & Galison, P. (2007). Objectivity. New York: Zone Books.


Haley, S. (2016). No Mercy Here: Gender, Punishment, and the Making of Jim Crow Modernity.


Snorton, C. R. (2017). Black on Both Sides: A Racial History of Trans Identity.


Cooper Owens, D. (2018). Medical Bondage: Race, Gender, and the Origins of American Gynecology.


Dillon, S. (2018). Fugitive Life: The Queer Politics of the Prison State.


Puar, J. (2017). Terrorist Assemblages: Homonationalism in Queer Times.



TRANSCRIPT:


KERRY MCINERNEY:

Hi! I’m Dr Kerry McInerney. Dr Eleanor Drage and I are the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list by every guest. We love hearing from listeners, so feel free to tweet or email us, and we’d also so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode!


ELEANOR DRAGE:

I have a question for you: can you detect my sexuality from my face? Do you think sexuality is something that sits in our brains? In this week’s Good Robot Hot Takes, Kerry and I talk about a group of ‘scientists’ in Zurich who tried to measure a correlation between brain activity and sexuality using AI. This smacks not only of previous attempts to use AI to try and ‘read’ people’s sexuality, but also of dangerous 19th- and 20th-century race science. We talk about how the language of science is weaponised against queer people, why there are no real scientific foundations to using AI to detect sexuality, and why science needs to think about sexuality not as fixed or static but wild and infinite. I hope you enjoy the show.


KERRY MCINERNEY:

Welcome to today's episode of The Good Robot Hot Takes. I'm here with my lovely co-host and work wife, Dr. Eleanor Drage, who has, luckily for her, been on holiday in Italy for the last week, where I'm sure she had a wonderful time. And I'm now about to wreck that joyful holiday bubble on Monday with the return to the work week.


And of course, our latest kind of spicy hot take on an issue in tech: this week we are talking about the use of AI to try and deduce people's sexuality. So Eleanor, did you see at all, while you were in holiday bliss, a news article or any Twitter coverage or anything like that about a controversial new research study that was released this month?


ELEANOR DRAGE:

No. This is a nightmare. I spent the last week eating cheese. So this is just horrific. I couldn't believe it when you sent this paper to me, because there was one other that we'll talk about, one other instance of AI being used to identify people's sexuality, and it was horrific.


So, I mean, this is just really shocking. I'm excited to talk to you about it, but also kind of afraid to get back into it.


KERRY MCINERNEY:

So for our listeners, we're gonna be focusing on a few different pieces of research that have attempted to use AI and machine learning techniques to deduce sexuality.


And that's going to include a major study done in 2018 by Yilun Wang and Michal Kosinski, which used machine learning to try and deduce people's sexual orientation from photographs of their faces. And then this recent paper, which just came out this month in 2023, and I'm gonna read the title out loud because I can't say it off the top of my head, which is Deep Learning in the Identification of Electroencephalogram Sources Associated with Sexual Orientation.


And for all those of you who are wondering what the heck is an electroencephalogram, I also thought that, and so I had to Google it. Google says that this is a recording of brain activity, and so during this painless test, small sensors are attached to the scalp to pick up electrical signals produced by the brain.


So this can be used to diagnose things like epilepsy, sleep disorders, and brain tumors. And so you might be thinking, what do any of those things have to do with your sexuality? The answer is not a lot, but the eight authors of this paper split people up into three groups: homosexual men, heterosexual men, and a mixed-sex sample cohort. And they found that people's electrical brain signals apparently corresponded with their sexuality to a total accuracy of 83%, which is not even a very good success rate. Just to be blunt, I'm really shocked that this paper passed peer review, but I'm shocked on a huge number of levels that, A, people were allowed to conduct this study with human participants.


I think this is a particularly horrifying example because clearly it was a clinical trial, but also B, because it reinvents and gives new legitimacy to a wide range of racial and sexual pseudosciences that have long been discredited.


ELEANOR DRAGE:

Yeah, this is a classic example of long words being used to propagate nonsense.


And I think, you know, what's really disappointing about this example is again, you know, I don't think necessarily that these researchers were saying, let's try and create a tool that could really violently be used against queer people.


Let's try and create a tool that could be used to detect someone's sexuality and then to repress or punish them. Even though we could definitely see this as an outcome of this tool, instead, they probably just thought, oh, we are contributing to some kind of idea of what it means to be human by understanding our sexuality and where it comes from and finding the biological origins of sexuality.


But those questions are not neutral scientific questions. They're deeply embedded in social and political power structures. And so I think we need to be really, really critical, not only about the false science involved here. But also about this idea that you can just have these neutral scientific motivations.


Actually, let's talk about science for a minute, because you've done a lot of research into pseudoscience and racial science. So why is it important for us to critique science itself or activity done in the name of science?


KERRY MCINERNEY:

I think this is a crucial question and I think it's a delicate balance we have to strike here.


'Cause on the one hand, our Western understanding of what science is and of the scientific method has long excluded a lot of other kinds of knowledge about the world, about our bodies, about ourselves, that are actually really important and valid. And so, you know, I think it's important that we do challenge a hegemonic idea of what science should be. But on the other hand, we also have a long history of the way that science and pseudoscience and other kinds of anthropological and biological knowledges have been weaponized and used to create hierarchies and divisions between people and to justify oppression. And I think one of the most overt examples of this is to do with race science and racial pseudoscience: the creation of an understanding of people as somehow biologically and innately different, and the use of that to justify white supremacy and other kinds of racial domination. But this is also something we see a lot in relation to gender and sexuality.


ELEANOR DRAGE:

Absolutely. And there's a really good book by Lorraine Daston, who we interviewed, and Peter Galison, called Objectivity. It's a history of objectivity in science, and it shows you how ideas about what constitutes neutral and objective science have changed over time.


So even these ideas of what constitutes scientific neutrality or objectivity have always been up for debate, and go through fashions as it were. I think it would be interesting now to talk about the accuracy rate, right? Because in that paper they say that the system is X percent accurate. Is that right?


KERRY MCINERNEY:

They say 83% accuracy, which, again, I think is like a B minus. So it's not even a great level of accuracy by the standards of what we might expect of a finding. But I think it's also interesting because this measure of accuracy can be very misleading.
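A minimal sketch of why a headline accuracy figure like this can mislead: when the groups in a sample are imbalanced, a classifier that ignores its input and always guesses the majority label can still score 'well'. The numbers below are invented for illustration and are not taken from either study.

```python
# Illustrative sketch only: invented labels, not data from either study.
# The point: "X% accurate" means little without knowing the base rates of the classes.

labels = ["heterosexual"] * 90 + ["homosexual"] * 10  # imbalanced 90/10 sample

# A "classifier" that ignores its input entirely and always predicts the majority class.
predictions = ["heterosexual"] * len(labels)

accuracy = sum(p == y for p, y in zip(predictions, labels)) / len(labels)
print(f"Accuracy: {accuracy:.0%}")  # prints 90%, yet the model has learned nothing about anyone
```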


ELEANOR DRAGE:

Yeah, absolutely. So the previous attempt that we were referring to, to discern sexuality from someone's face using AI, happened a couple of years back. They also said that their system was very accurate, but accuracy, of course, depends on what you are measuring, and if you are measuring something that's bullshit, then the accuracy rate is bullshit too. So what they were looking at was whether AI could tell someone's sexuality from still pictures. And these pictures were scraped off a dating website, OkCupid. Correlations were made between people's faces and their sexuality according to the sexuality they stated on the app, which is its own thing. The sexuality we state on an app can change for whatever reason, and changes depending on who we're looking for, what kind of relationship we want, which relationship we've just come out of, all those kinds of things.


But that aside, what it was doing was not actually using AI to read your mind or read your sexuality, to make fleshy something that is in some way mysterious; it was actually creating correlations between the makeup you're wearing, the way that you angle your head, and your stated sexuality.


And that's interesting because the way that we present ourselves as sexual beings, of course, relates to all these different things, but that's not reading our sexuality, that's reading these other kinds of cultural cues. So they can be really misleading in terms of how they advertise themselves.


And obviously we know, and we'll get into this, that sexuality is not something that can be read from your face. I don't think I necessarily appear to be bisexual from my eyelids or whatever. So it's interesting to see people reviving debunked ideas from 19th-century pseudoscience just because we have AI, and maybe AI can do it in a way that humans can't.


KERRY MCINERNEY:

Mm-hmm. Exactly. And this is something we also talk about in our hot take on AI hiring, which we definitely encourage you to check out. When we talk about that, there's so much hype around AI's observational skills, this idea that AI can see so much more than humans can, and that means it can make deductions that humans can't make, just based on the sheer quantity of data.


And again, this might make sense in situations where the connections are not entirely spurious. Whereas in this situation and in the case of AI hiring where you are making connections between a candidate's face and their personality, there simply isn't a meaningful causal relationship between those things.


Or even a meaningful, in many cases, correlative relationship as we showed in that last episode. But I think it really matters to draw forward these questions of, you know, is it, as Os Keyes would say, is this biased or is this completely just bullshit? Because as we mentioned, this 2023 paper was extremely disturbing, but it also had this nasty sense of deja vu because we already had a huge public outcry about the 2018 paper that tried to do a similar thing.


But Eleanor, could you talk us through a little bit of the initial press response to Wang and Kosinski's 2018 study on whether machine learning could deduce someone's sexuality from their face?


ELEANOR DRAGE:

Yeah, absolutely. So the way that Wang and Kosinski framed what they were trying to do was that they were concerned that these kinds of tools, which could deduce someone's sexuality by looking at their face, could be used to out LGBTQ+ people.


And they were saying that they were really worried for these populations, these groups, and that these tools could be used against them for this purpose of outing. Now this isn't necessarily coming from a place of love or a place of concern because all you are doing is creating fear for people from a tool that didn't need to be built.


It shouldn't have been built, and it wasn't actually able to do what it said on the tin. So why are people doing these things that are not politically useful under the guise of care and concern for queer people? Now, the problem is that this was still misinformation, because it encouraged people to think that these tools could actually out people. And the politics of outing is its own thing anyway: coming out is a very Western ideal, and outing people is not something that happens across the world in equal ways.


But coming out against your will, this kind of forced outing, is a really horrific thing. It can be a devastating thing for many people. And so you are really creating a horrible fear and concern for no particular reason. What they should have been saying is that these tools cannot out you, because sexuality is something that cannot be revealed through artificial intelligence, so you don't need to worry. Queer people worry enough anyway. I mean, it's Pride Month now, it's a joyful time, but it's also a time to meditate on fear and suffering and what people go through. And people really don't need this extra angst and extra fear. So I would really urge people who are trying to do things under the guise of improving the wellbeing of LGBTQ+ people to think: does this actually make things better, or does it create more fear?


KERRY MCINERNEY:

Exactly. And I just think that there's so much trauma associated with this concept of being forced out. And I think increasing queer people's sense of being surveilled, of extending various kinds of state and oppressive power over them through machine learning tools like this, is just a terrible political project.


I think it also reveals the limits of thinking through AI ethics in terms of the individual, in terms of notions of privacy. Of course those are really important things. Of course, people's privacy and data were probably non-consensually stripped in order to create this tool; there are a lot of very classic AI ethics issues here. But there's this much bigger societal question, I think, of what happens when you have old ideologies, old forms of gender and racial pseudoscience, rearing their head again and being given new legitimacy in machine learning platforms and forms. And actually, I wanna come back to this idea of history, but specifically what I mentioned at the very beginning of this episode, which is the idea of origins and the kind of origin-seeking of things like sexuality.


And how that links to this kind of very modern idea of objectivity and of AI and machine learning's supposedly amazing observational skills, because this is in the abstract. Oh yes, sorry, I should have said this at the beginning of the episode: the 2023 paper has already been taken down, so I have not been able to read the paper.


All I have to go off is the abstract, from someone's screenshots which are still available online. But the opening of this abstract begins with the idea that we don't know where sexuality comes from. Then the second line is that machine learning allows us to see things that we otherwise can't see, right? So we see these two things being glued together. Why do you think that's significant? Why do you think they've framed their study in this way?


ELEANOR DRAGE:

Yeah, I think it's fascinating that it's already been taken down, and I would urge people not to think of it as a free speech issue, like, oh, we're censoring science or whatever, because if it was a paper trying to justify why the earth is flat and it had been taken down from a journal, I think everyone would just chuckle and move on. Right? And it's exactly the same thing, really.


It's not even science, I don't think we can call it science. The term bad science can be useful. It's things done under the name of science. But at the same time, this isn't science at all, it's speculation, long words and a lab, and those things don't equate to scientific research.


I have now forgotten your question.


KERRY MCINERNEY:

My question was around the idea of origins: why are they so interested in the origins of sexuality, and why on earth are they trying to read this from the electrical pulses of your brain?


ELEANOR DRAGE:

Yeah, so throughout the ages people have tried to literally take queer people apart and look inside them and see why they are the way they are.


It's the same with race. What's the book about the really terrible experiments on...


KERRY MCINERNEY:

Medical Bondage, the work on gynecology? Or Black on Both Sides, C. Riley Snorton's book?


ELEANOR DRAGE:

Yeah. This work that you've mentioned shows how putting the othered body on a dissection table and looking at them has been this kind of fetish of white medical culture. It's horrific. It's obviously torture and it's murder. And this is another way of doing this kind of experimentation.


However, trying to find something innately queer about people, something biologically, innately queer, has also been used as a useful political tool to justify the existence of queer people. So you can take as the epitome of this Lady Gaga's 'born this way' narrative, right?


You have the right to exist because you were born this way. I have the right to occupy this space because I was born this way. And this harks back to Christian narratives about nature being what God wanted: God created the world perfectly, exactly as it was, and because something is natural, or because there's something natural about any phenomenon, marriage or trees, that means it is perfect and has the right to exist. It's understandable that queer politics has taken hold of this very powerful idea, very much using the master's tools for its own work, to say we have the right to exist because we were born this way. It's what Judith Butler calls a strategic provisionality, which means it is only strategic provisionally, sometimes. But this politics of origins, of where we begin from and why we are the way we are, is what we call essentialist: it assumes that there is an essential component of our being that is queer, or that is human, or whatever it is that validates our existence.

There's an amazing book that I'd love to point people towards called Ezili's Mirrors by Omise'eke Natasha Tinsley. It's part historical fiction, part historical non-fiction, and part personal narrative, and it thinks through the Ezili family of spirit forces in the Haitian religion of Vodoun. Ezili has many different personas and manifests in many different spirit forms: Ezili Freda, who's a luxurious mulatta who loves perfume and music and flowers and sweets; the fierce protectress Danto; and then Lasirenn, a mermaid who swims in lakes and rivers.


And Ezili is also the spirit of transmasculine and transfeminine Haitians, because often they will embody her, they will become possessed by her. And Ezili shifts through these different personas over time, changing in sexuality, changing in form, and changing in the kinds of persons that she embodies.


And it reminds me a lot of Orlando by Virginia Woolf: this figure, Orlando, who shifts through the eras, becomes male and becomes female, changes sexuality. Sexuality is contingent, then, on history, on these different bodies, different characters. That's what sexuality is.


And if you read those kinds of things and you experience sexuality in that way, that it changes across time, my sexuality changes week to week, day to day, month to month. Then when you look at these AI systems that are trying to discern a fixed, stable, sexual identity by looking at your face, it just seems so absurd.


KERRY MCINERNEY:

Absolutely. And I think, again, like these debates around what sexuality is, where it comes from, whether it's innate, whether it's a practice, these are really, really rich debates happening within queer theory, within queer activism groups, within our societies. And you know, they're really complex.


Take, for example, kink. Some people understand kink very much as an identity; other people understand it as a set of sexual practices that they choose to engage in from time to time. And again, I don't think there's a right answer here. I think, though, that the kinds of studies we are looking at are not at all trying to engage with these kinds of complex ideas around what sexuality is.


Instead, they're engaging in the crudest kind of non-strategic biological essentialism. It's important to note here that the essentialisms they're engaging in are not only to do with sexuality; they're also very much to do with gender and with race. So, for example, take this 2023 study, and the 2018 study was the same, right?


Notice the groups that they use: homosexual men and heterosexual men, in the case of the 2023 study, and then a mixed-sex group. And by mixed sex, I'm pretty sure what they mean is men and women. Who was excluded from this picture? Most queer people are not even present. Neither the 2018 study nor the 2023 study included bisexual people as part of their sample, because that would completely overthrow their binary logic of what sexuality is.

And to me, I think what it really ties to is, I guess, what Judith Butler calls the heterosexual matrix, so the need to compress everything to fit these normative ideas of straight sexuality, but also what Jasbir K. Puar calls homonationalism. This is more to do with the national politics around sexuality and queerness, but something Puar shows us is how there's a particular normative idea of what good homosexuality looks like, and that tends to be the white gay man in a consensual, loving, exclusive partnership with another white man.

So for example, when GLAAD, the LGBTQ media advocacy organization, critiqued Wang and Kosinski's study, they critiqued it on a number of grounds, and they did call it junk science. But one of the things they actually critiqued it for was the fact that it didn't include, I don't know if it was any or many, people of color; that people of color weren't really represented in this study.


And on the one hand, this points to a broader problem of the erasure of queer people of color. But on the other hand, it also points to the limits of a representational politics, because the last thing I would want is for people of color to be included in these kinds of studies.


ELEANOR DRAGE:

Absolutely. So can you tell us then also about the kinds of theory that you are using when you look at these kinds of tools and why you think they don't work?


KERRY MCINERNEY:

Yeah, yeah, absolutely. And I definitely wanna hear your thoughts on this as well, cause I know you draw a lot of insights from queer theorists like Judith Butler when you're thinking about AI and machine learning.


I'm really interested in queer of color critique, and particularly how a lot of this work is very good at questioning foundational categories, and also questioning our idea of what we consider to be normal. So I'm working at the moment with Os Keyes, who we had on the podcast, so you can listen to their episode.


I'm gonna link it in the show notes, along with a number of episodes that really excellently explore a lot of these ideas around gender, sexuality, queerness, race, and machine learning. With Os, I'm thinking about emotion sciences and the racial and gendered pseudosciences of emotion and how those play out in emotion AI.

And queer theory has been really important to me there for showing how these ideas of what it means to feel right, to feel normatively correctly, have played a really central role in defining what it means to be human. And again, for me, I see that being very much linked to race, but most of my work on Asian diasporic studies and thinking through the racialization of Asian people has always focused on the way that this racialization is profoundly gendered, and specifically to do with not sitting within quote-unquote normal white Western gender norms and being pushed out of that through modes of racial violence and exclusion. So what happens, for example, if we think about the Tiger Mother, that stereotype, as being this kind of queer figure of parenting? I'm interested in those kinds of provocative questions, I guess.


Um, I didn't really answer your question.


ELEANOR DRAGE:

You did. You totally did.


KERRY MCINERNEY:

I'm really interested in a particular trajectory of thinking as well, which deals with the slightly controversial concept of forced queering.


So this is people like Sarah Haley and Stephen Dillon, who are interested in how racial violence specifically pushes people of color outside of the structures of normal sexuality. And so, to give an example of this, [Malini Johar] Schueller talks about how, in Gayle Rubin's 'Thinking Sex', the kind of ideal sexual relationship is the heterosexual marriage.


And then Schueller immediately pushes back on this by saying, well, no, that's fundamentally premised on whiteness, because a heterosexual marriage between an interracial couple was not considered normatively okay. That's actually still considered, in many places around the world, really abnormal.


Um, so yeah, these are the kinds of ideas around sexuality that I find really interesting to play with when I'm also thinking about technology.


ELEANOR DRAGE:

And you brought up practices, and that is so important, because De Witt Douglas Kilgore, this scholar, was talking about queer defined as non-normative practices.


So it's not something innate, it's anything that is not socially normative. So if you change society, practices that you were talking about, normative practices like marriage, if they are queered, if the world changes, then also what it means to be queer changes, right?


Because queer is contingent on the normal, on the status quo. That's why opening somebody up cannot tell you about queerness, because it's, you could say, a response to what is going on in the world, normatively.


KERRY MCINERNEY:

Yeah, exactly. And I don't know enough about the history of these kinds of things, but take divorce, for example: that would have been not normatively accepted, and again, there truly is still a lot of stigma around it, but it is certainly much more legally and socially accepted now.


It's just like a fact of life, you know? To what extent did these kinds of social practices play into our expectations when it comes to sexual expression?


ELEANOR DRAGE:

Yeah. So how can we queer the way that we see the world using AI? These are the kinds of questions I'd be interested in exploring rather than how are we using AI to impose a normative vision on the world.


KERRY MCINERNEY:

I actually wanted to also touch on a different kind of trajectory of queer theory that might be really illuminating when thinking about both the problems with these papers, these 2018 and 2023 papers, trying to deduce sexuality from someone's face or someone's brain.


And this question of origins, which is Michel Foucault and his idea of genealogy. And I was wondering maybe if you'd be open to talking a bit to this, Eleanor, because it's been a really influential idea in the social sciences and humanities, but also, I think, a really important idea when we are thinking about the history of technology and how we end up where we do.


ELEANOR DRAGE:

Go on, you tell me what you think.


KERRY MCINERNEY:

Um, yeah, I guess, like, for me, again, I'm not a specialist in this area at all, but I think that in his thinking around genealogy he's very much questioning this idea of a natural unfolding narrative, the idea that the way things are right now is how they should be.


I think there's like an anti-teleological impulse to it.


ELEANOR DRAGE:

You are gonna have to explain anti-teleological.


KERRY MCINERNEY:

By anti-teleological, I mean, you know, teleology is this idea that history is driving down one path towards a kind of end. So this end, for example, might be the Second Coming of Christ in the Christian religion.


There's all sorts of different ways of thinking about this end point. But, you know, I think what Foucault does is deliberately destabilize this idea that history is moving towards an end point, that there's going to be an end of history, and that all the events that have happened so far are leading us to that end goal.


Instead, he frames these histories as entirely contingent: they were the product of actions and events and chance, all mixed together into this ugly and mysterious kind of winding path. But things didn't have to be that way. And so there's a process of deconstructing and denaturalizing a lot of things that we might consider normal.


And that's where I think it's ultimately really relevant to the kinds of ideas that we're discussing today, right? This idea of destabilizing the normal. And I think it's also important when we think about the fact that we seem to be stuck in the same grooves of using AI and machine learning to pursue these same kinds of pseudoscientific questions.


When it doesn't have to be that way. And so I think there's something that's really disappointing about where we are, and I think queer theory is really important for pulling us out of this particular hole that we're in.


ELEANOR DRAGE:

I like disappointment as a political tool. It's a bit like when your mom isn't angry with you, she's just disappointed, and somehow it's worse.


KERRY MCINERNEY:

Exactly.


ELEANOR DRAGE:

I mean, the narrative of natural progression is something that queer studies has critiqued for a long time. I was thinking about it last week because in Christianity there are these different sacraments and they begin with baptism, and then I think Confirmation and then marriage. And Judaism has similar sacraments, and if you don't get married, you miss a sacrament.


So you fail in that progression through life, through the sacraments. And I remember we had a teacher at school who was really lovely, called Ms. Williams. And we were awful, people asked her why she wasn't married, you know, and why she hadn't gone through that sacrament. She was like, it was my vocation to be single.


And it's just a terrible thing to have to justify why you haven't jumped through these hoops, through this natural progression. And so many people who don't wanna get married or don't wanna have children have to almost come out against a society that says they should do these things.


So even, queerness aside, there's lots of ways that people might not be normative or follow that natural progression.


KERRY MCINERNEY:

Mm. That's really interesting. I think the sort of coming out as not wanting to have children, for a lot of women, that's a really challenging thing, and yet I think it's increasingly important to talk about the fact that there are so many kinds of gendered pressures that really have to be negotiated.


Um, and tools like the ones we are discussing that claim to be able to deduce your sexuality from your face or make other kinds of judgments about your life stages or your sexual practices are really deeply unhelpful in that goal.


ELEANOR DRAGE:

And it's not anyone's goddamn business.


KERRY MCINERNEY:

Exactly. I mean, this is the big takeaway, I think, which is: why is this your business?


You know, and again, even though we think the model of privacy is not enough to engage with the false science and the dangerous logics behind these tools, I think it's also just worth saying that people's sexuality is their own business. Why are you trying to behave in this invasive way?


What gives you a right to these attributes of someone, you know? You don't have a right to know these things about someone. And if you wouldn't do it without machine learning, if you wouldn't do it without AI, if it would be considered strange and invasive and unethical to do it without AI, applying machine learning techniques does not make it any better. It is still unethical. It is still unwise. It is still illogical, and it's still bad science.


ELEANOR DRAGE:

Read Kevin Guyan's Queer Data on how privacy has been given to some people and not to others. Privacy is something experienced more by white people; it's something experienced more by people who are heterosexual.


It's so predictable that this invasion of privacy is justified by the fact of being queer.


KERRY MCINERNEY:

Exactly. And so Eleanor, if anyone has not managed to make it through this entire episode, or if anyone has just come in now, what do you want their final takeaway to be? Like, what's your one-line hot take on AI that claims to be able to deduce your sexuality?


ELEANOR DRAGE:

Correlation is not causation. There is a strong correlation between margarine consumption in the 60s and divorce rates. That does not mean that buying margarine makes you more likely to get divorced. Same thing here. We need to be really, really careful that we understand what we're looking at when someone tells us that an AI system is 90% accurate.


We also need to think about methods. So if you're working with a system that is scraping people's photos without consent from a dating site, that is unethical, and it's likely that your system is gonna be messed up and unethical too, that it's doing something that's not right, because you don't have stakeholder engagement, you're not engaging with queer communities from the very beginning.


I highly doubt that happened in either of those two cases. Stakeholder engagement means more than just talking to someone and taking notes. It means full integration from the beginning. So bad methods equal a bad system.


KERRY MCINERNEY:

Mm, I fully agree with you. Yeah, and I think my main takeaway, or main point, I guess, when it comes to AI systems that claim to deduce your sexuality is: firstly, they can't; secondly, don't be politically naive. There is nothing good that comes out of making these systems. And I think that's really important, because there is a small subsection of scientists, not all scientists, who are very keen to act and experiment first and then leave the ethical and the social and political mess to legislators and ethicists, and that is absolutely the wrong order of things.


Scientists are ethical stakeholders. It's incredibly important that you are working to create ethical research cultures that do not enforce relations of oppression, discrimination, and hierarchy. But then third, again, you know, look, the eight researchers who were involved in this paper, they should not have done this study.


But at the same time, I think it's really important to note that this is a systemic failure. This is not about single-handedly calling out these eight individuals who happened to make a paper that I think is very homophobic. The issue is that not only did they create this study, it probably got approved by an ethics board.


This went through their university. It then got, I would assume, peer reviewed, although I don't know that much about the journal this thing got published in, right? So that's a failure on the part of all those people involved at every step of the process. And so I think what this journal and this university and medical school need to do is go back and say, actually, where have we failed as institutions to prevent this kind of harmful research from happening?


ELEANOR DRAGE:

Yeah, and as you said, it's not about individuals, but I will be messaging them all individually.



KERRY MCINERNEY:

Individuals make up systems, so I'm not against an email. But you know, again, I think this is one of the challenging things about talking about science and sexuality: it can very quickly be co-opted into this cancel culture narrative of, like, oh, you are just canceling people with your queer rage.


And it's like, well, no, as you can tell, we're disappointed. This is bad science. And actually, as people working in AI and as queer people, we have a right to demand reasonable accountability for these kinds of really, really bad faith research efforts.


ELEANOR DRAGE:

As you can tell, Kerry is the nicest person in the whole world.


KERRY MCINERNEY:

I'll leave Eleanor to send some angry emails, but thank you so much for listening to our latest hot take. If you have any thoughts about the episode, feel free to email us or tweet us, and let us know if there's anything you want us to discuss. And as always, there'll be a full transcript of our episode, lovingly done by Eleanor, and there's going to be a reading list attached, which will include all the episodes by the phenomenal thinkers and scholars and writers we've had on the podcast who have been thinking about queerness. That includes Sophie Lewis of Family Abolition fame; Arjun Subramonian, who is part of Queer in AI and does phenomenal research on queerness and AI and machine learning; Kevin Guyan of Queer Data, which Eleanor mentioned; Os Keyes on all sorts of wonderful things, including ethical teaching and practice; and so many more. So we'll be attaching all of those on the Good Robot website, which is www.thegoodrobot.co.uk, and you can find this episode under transcripts.


But until then, thanks so much for listening and we'll see you next week.





