In this episode, we talk to Yasmine Boudiaf, a researcher, artist and creative technologist who uses technology in beautiful and interesting ways to challenge and redefine what we think of as 'good'. We discuss her wide-ranging art projects, from using AI to create a library of Mediterranean hand gestures through to her project Ways of Machine Seeing, which explored how machine vision systems are being taught to 'see'. Throughout the episode, we explore how Yasmine creatively uses technology to challenge the colonial gaze and the predominance of Western European ideas and concepts in ethics.
Yasmine Boudiaf is a researcher and creative technologist focusing on data, epistemology and the absurd. She was named one of 100 Brilliant Women in AI Ethics™ 2022. She produces art and research projects and consults on project design, strategy and public engagement. She is a fellow at the Ada Lovelace Institute and the Royal Society of Arts and a researcher at the University of the Arts London (UAL) Creative Computing Institute. She has contributed to The Institute for Technology in the Public Interest's 'Infrastructural Interactions', investigating public data infrastructures using traditional and non-traditional research methods. She contributed the chapter 'AI Justice Matrix: The Futility of Policy Craft' to the Transmediale 2022 workshop 'Rendering Research'.
Image: Yasmine Boudiaf & LOTI / Better Images of AI / Data Processing / CC-BY 4.0
Reading/Watching List:
Check out all of Yasmine's art projects here! https://yasmine-boudiaf.com/
John Berger's Ways of Seeing: https://www.ways-of-seeing.com/
Octavia Butler, Xenogenesis: https://www.octaviabutler.com/xenogenesis-series
Transcript:
KERRY:
Hi, I'm Dr. Kerry McInerney. Dr. Eleanor Drage and I are the hosts of The Good Robot Podcast. Join us as we ask the experts what is good technology, is it even possible, and how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we have a full transcript of the episode, and a specially curated reading list from every guest.
We love hearing from listeners, so feel free to tweet or email us. And we'd also so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode.
ELEANOR:
In this episode, we talk to Yasmine Boudiaf, who is a creative technologist and researcher. I love her answer to what the good life is, or good technology is. And she reminds us that it's having your needs met plus a little extra fun. She tells us about how art teachers are now helping children understand AI-generated art, and she's using AI to create a library of Mediterranean hand gestures. And I love this because my mom, who has Italian origins, uses 80% gesture to get her meaning across in about 20% language.
I hope you enjoy the show.
KERRY:
Thank you so much for being here today. Just to kick us off, could you tell us a bit about who you are, what you do, and what's brought you to thinking about feminism, gender, and technology?
YASMINE:
Yeah, sure. I'm Yasmine Boudiaf. I'm a researcher, artist and technologist based in London. I am originally from Algeria, and I've always been curious about the way that technologies affect people. And I produce work that kind of plays around with the relationship between technology and the public.
KERRY:
Fantastic, and we can't wait to dive into those projects with you.
But now I'm gonna ask our three good robot questions because luckily for her, sadly for me, Eleanor is on holiday in Italy this week. So I will be asking you instead, what is good technology? Is it even possible? And if so, how can feminism help us work towards it?
YASMINE:
Yeah, that's a tricky thing to get your head around. So in my view, I think that there's always been technology, as long as there have been humans, and I would include in technology things like tools and medicines, even social habits and rituals that we've invented. But then if we're thinking specifically about technologies that use electricity and silicon chips, can we have good technology? I'd say yes, it's possible. I start with myself: I think that I deserve nice things because I like me very much, and then I also think I want my neighbor to have nice things. And I like the idea of a community that has their needs met, plus a little extra fun.
So that forms the basis of what I would then, like, try to conceive of as good technology. And then I guess in my work, I arrive at what that might look like in two ways. So first is just pure art, which is just a kind of unrestrained exploration playing around with concepts that I find interesting, and then thinking about what I might produce from that.
But then the second way, a more direct way, is to first interrogate examples of bad technologies, which I would define as harming the public, be it humans or otherwise. And then, having a clear idea of what bad technology looks like, I use that as a starting point to interrogate and then try to conceptualize alternatives.
Yeah, at the most basic, I'd say good technology means that it does no harm in order to fulfill its function. But ideally it would solve some sort of problem. I'd also add to that technology that could entertain or just do something cute or just look cute. So that's determining goodness by function.
We can also determine goodness by how that piece of technology came to be, and I think that we should generally start thinking more about this. So thinking about the material resources it took to produce it: did it require destabilizing a country in Africa to extract cobalt, or did it require lobbying governments to ignore human rights or safety standards?
So I think that should also be a determining factor. So yeah, both in terms of function and the circumstances of how it was developed. And I think the third question was about how can we work towards good technology? I struggle with that cuz goodness or being ethical is something I think about a lot and I haven't reached a conclusion, but I would say it's not easily arrived at because we don't have an agreed-upon definition of goodness or ethical behavior in the first place.
Like when Greco-Roman or capitalist ideologies dominate, then that sets the tone and all other epistemologies or systems of knowing are suppressed or even erased. And so we can't really access them. And thinking about who determines what good behavior is, how it materially benefits people and who it materially benefits, is a good starting point.
And then we can design around that. We can't recapture what's been erased, but we could at least first acknowledge the limitations of what we consider as ethics and apply that and work from there.
KERRY:
I think why that answer was particularly interesting to me is because it really felt like it captured a lot of the undercurrents of your different artistic works. Because I see all your different art projects as fascinating, but I didn't necessarily see the connections between them until you shared this idea of what kinds of knowledge get excluded, who gets to decide what good is?
And then, how do we grapple with the innate badness of many kinds of technologies that we currently have? Cause as a little sidebar, I don't really agree with this idea of, oh, technology isn't good or bad, it just depends what you do with it, it's just a tool. I'm like, I don't know. I think there are some technologies that are just bad, and I think it's okay to say that.
But I want to talk about some of your different art projects in detail, because I think your projects show how incredible these speculative projects can be for actively working towards or thinking about different kinds of goodness and how these tools which have often been used to control or to oppress, might also be used in these kinds of countercultural projects.
So I really enjoyed one project, which I saw called Ways of Machine Seeing, which focused specifically on machine vision, and I was wondering if you could tell us a bit more about it.
YASMINE:
Oh yeah. That's a collaboration between myself, the Centre for the Study of the Networked Image (CSNI) at London South Bank University, Annie Davey at UCL, and Nicolas Malevé, plus a few other kind of orbiting people and institutions. It's multifaceted. And the project asks how seeing is being taught to machines. So we interrogate machine vision, and the kind of reference point we're using is from the visual arts: visual artists, in their education, are generally taught lots of different ways of looking at visual art, assessing it and defining it, and expressing their work using this language of visual art.
But we don't really have anything when the viewer is a machine, or if the visual artifact has been generated by a machine. And so we need to come together and acknowledge this next iteration of visual culture and start to construct a vocabulary around it. Or at least just interrogate the kind of underlying mechanisms a little bit.
John Berger was an art critic who really opened up the art world to lots of people and made it accessible through his project, Ways of Seeing, which was a book and also a TV series. And through our process and workshops we are hoping to do that, but for computer vision. So one thing we do is that we have workshops where we give people the opportunity to see what goes into constructing these massive image databases, and what it takes to tag images that make up these databases that then go on to generate AI art, for example.
And what people see is the amount of like human labor that goes into it, like they have a fraction of a second to look at an image and then describe what it is knowing that this will then be fed into a massive image dataset like ImageNet or something. And then that forms a basis of what an algorithm would select to then produce images through a piece of machine learning software.
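The crowd-labelling workflow described here, where many fraction-of-a-second human judgements are aggregated into a dataset, can be sketched roughly as follows. This is a minimal, hypothetical majority-vote step, not the actual ImageNet pipeline (which used more elaborate crowd verification); the image IDs and tags are invented for illustration.

```python
from collections import Counter

def consensus_labels(annotations, min_agreement=0.5):
    """Reduce raw crowd tags to one label per image.

    annotations: list of (image_id, tag) pairs, one per annotator judgement.
    Keeps an image only when its top tag wins a strict majority
    (share of votes > min_agreement).
    """
    by_image = {}
    for image_id, tag in annotations:
        by_image.setdefault(image_id, []).append(tag)

    dataset = {}
    for image_id, tags in by_image.items():
        top_tag, count = Counter(tags).most_common(1)[0]
        if count / len(tags) > min_agreement:  # strict majority only
            dataset[image_id] = top_tag
    return dataset

# Three annotators agree on img_001; img_002 is a 50/50 split, so dropped.
raw = [
    ("img_001", "cat"), ("img_001", "cat"), ("img_001", "dog"),
    ("img_002", "bridge"), ("img_002", "river"),
]
print(consensus_labels(raw))  # → {'img_001': 'cat'}
```

The point of the sketch is how much individual human judgement disappears into the final dataset: the disagreement over img_002 simply vanishes from what the model will later learn from.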
At the moment we're focusing on art teachers teaching kids in schools, and I think it's our job as people who work with these technologies to make it conceptually accessible to as many people as possible. And so we're constructing a toolkit, if I can use that word. I know it's a buzzword at the minute, and every time I say it, something inside me dies. But it's really useful to think about, and I also think that it's good to use words that people are familiar with. But it's a toolkit for how we can talk about images generated by computers, as opposed to generated by people using paint, and try to give people more confidence to be able to do that, and also more understanding of some of the underlying mechanics and politics of how those images came to be.
KERRY:
That's so fascinating, and I think it's also such a testament to the importance of the arts and humanities and social sciences in computer science and in machine learning, because this to me is such a distinct disciplinary area of expertise tied to the visual arts that I just don't think you could get from any other discipline. And this is not the only image-based project that I know you've been working on. I had the great delight of hearing you present at our mini Worlds of AI conference, which was in April in Cambridge, and you talked about a project which was creating a kind of visual dictionary of hand gestures.
Could you tell me a bit more about it?
YASMINE:
Sure. Yeah. So I'm working to produce a gestural library of Mediterranean hand gestures, so that using a camera you could identify hand gestures and their corresponding meanings. I had two aims for this. The first was to think about what could be possible using computer vision and AI and gesture detection.
How could preexisting training models be used with a new dataset, and how could you structure it in a way that's not extractive, violent, harmful? And the second was a bit more conceptual: it's to redefine what community or shared culture is or could be. Cuz that's often defined by borders and hard labels. But the nature of the Mediterranean as a site of artistic interrogation is that it's ever changing and constantly being redefined. There's always historically been so much movement, and even now it's in flux for various reasons, and yet there is still a shared culture. It has persisted despite the political chaos, and there are signifiers of that culture, and one of those signifiers is these hand gestures. And I think there's something really special in reminding people of this kind of shared cosmic tapestry that unites everybody who's touching the ocean or being near it.
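To make the gesture-recognition idea concrete: one simple way a camera pipeline could map an observed hand to a meaning is nearest-neighbour lookup over landmark vectors emitted by an off-the-shelf hand-pose detector. Everything below is an illustrative sketch, not Yasmine's actual system: the gesture names, reference vectors and meanings are invented, and a real detector such as MediaPipe Hands emits 21 3-D landmarks rather than the tiny 4-number vectors used here.

```python
import math

# Hypothetical gesture library. Each entry pairs a reference landmark
# vector (a stand-in for what a hand-pose detector would emit) with
# an invented meaning.
GESTURE_LIBRARY = {
    "ma_che_vuoi": ([0.9, 0.1, 0.9, 0.1], "what do you want?"),
    "wait_a_moment": ([0.2, 0.8, 0.2, 0.8], "be patient, wait"),
}

def classify(landmarks, max_distance=0.5):
    """Return (gesture_name, meaning) for the nearest library entry,
    or None if nothing in the library is close enough."""
    best_name, best_dist = None, float("inf")
    for name, (ref, _) in GESTURE_LIBRARY.items():
        dist = math.dist(landmarks, ref)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_dist > max_distance:
        return None  # unknown gesture: refuse rather than force a label
    return best_name, GESTURE_LIBRARY[best_name][1]

# A slightly noisy observation near the first reference vector:
print(classify([0.88, 0.12, 0.90, 0.10]))  # → ('ma_che_vuoi', 'what do you want?')
```

Note the `max_distance` threshold: returning None for unfamiliar hands, instead of forcing every input into a label, is one small way a design could be made less extractive, in the spirit of the aims described above.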
Cause it's almost like the exact counterpoint to, I'm thinking of the idea of nationalism as the imagined community, but this is almost like pushing back on that, and it's like, no, what does this shared bodily-communication kind of community offer us for thinking differently?
KERRY:
Yeah, that's definitely something I understand sideways as someone who's Asian diasporic, thinking about how many different Asian communities find these points of convergence when they exist in diaspora or exist in other places. Yeah, of course there's these meaningful differences, but there's also, I think, shared moments of convergence where you're like, okay, this is someone who understands this particular framework.
And it's also just nice to see these tools, which unfortunately are often used as tools of loss, kind of being used as ones of gain. And that actually reminds me of another project you are working on, which was called An Algerian Techno-Ritual, which aims to document the practice of Algerian women's facial tattooing.
And that was something I found really interesting because in the context I'm from, New Zealand, the camera served to erase because it couldn't capture Māori people's facial tattoos. And so unfortunately the camera has historically, I think, served as a technology of erasure there.
And so I was again really interested in this idea of turning something of loss into gain. Could you tell me a bit more about this particular project, and also how you grapple with that process of transformation, of using tools to bring things back without, at the same time, trying to codify a single idea of what a cultural practice is?
YASMINE:
Yeah, I'll certainly try to. So this project looks at the practice of facial tattooing among women in Algeria specifically, but more broadly in North Africa and the Middle East. And it's a practice that isn't done anymore; the only women that survive with these facial tattoos are in their eighties and nineties.
And I am just completely fascinated, on pure aesthetic value as well. But my grandma had a tattoo on her hand, and I remember asking her what it meant. And she couldn't really tell me because somebody had done it to her when she was very young. And also by then it was taboo, and a lot of people were hiding their markings or just trying to get rid of them.
And it's a really complex history that has meant that this practice isn't done anymore, partly because of displacement, maybe colonialism, maybe just changing times and values in the way that religion is practiced. But because I found it fascinating, I tried to think of ways of participating in the ritual, knowing that I can't time travel or read minds. So how can I use the place where I'm at, which is in the UK? How could I, as a member of the diaspora, engage with this as an example of a cultural ritual at a distance? And then the question that came up for me, and it's a dilemma: how could I do this without reproducing the Western colonial gaze?
So the artifacts that we have of these face tattoos exist mostly because French tourists under French colonialism just took loads of pictures of young women and turned them into postcards, and they were sold in Paris and everywhere else. As well as some French anthropologists and some tourists also from Western Europe.
And so I thought, in that process, how can I not reproduce a colonial gaze? And I think that it's not entirely possible to do autoethnographic work with moral purity, and I don't think that's what I'm aiming for. My aim is to try to think really hard about how I can participate using the tools that I have.
And just set a set of standards for myself that kind of honors what I would consider an anti-colonial process of ethics. And at the same time as not reproducing colonial habits, how do I resist surveillance culture and biometric data capture?
Like, I don't just want to go and photograph people and kid myself that these photographs will never be used in a police database somewhere, because that's not true. So the idea I had is to use face masks, like the skin masks that you might have, and then take prints of faces and then only photograph those, and then have those as the images that go into this image dataset.
So I did a pilot project just using my own face and facial markings. And it worked: I produced a mock set of images that I then ran through an AI algorithm whose training dataset was open-source images of botanical plants, because a lot of these markings were based on plants.
And then it generated some really interesting markings, and in that way I felt that I had tapped into this ritual. And I'm looking to build this project further and do on-the-ground research to capture oral histories and produce a dedicated open-source dataset of markings and their corresponding meanings, just for posterity and to better understand my own culture.
Having that kind of reflexivity is a real privilege. As an artist I can say, I'm gonna pause and really think about it. But as an academic, perhaps in your work it's not so easy because you might have funding with deliverables and everything else.
And so where is that space to just pause and think and replan? Yeah, I'm very happy with my process as it is, and I'm also okay with that process changing should new knowledge come about.
KERRY:
I think that's really wonderful, and I do think there's something in the slowness and the friction and the reflexivity that to me really aligns with a lot of insights from feminist artists and thinkers and scholars around how we should be thinking about tech design, and that, in the kind of 'move fast and break things' ethos of Silicon Valley, there's this emphasis on slickness and on smoothness and on a lack of friction.
But what happens when friction is actually generative, and it helps us think differently about something and also to stop when that's required? And I also definitely was very touched by what you were saying around the inability to act with moral purity when doing autoethnographic work, especially I think when trying to do decolonial work. And I think some of my favorite writing on colonialism is by Octavia Butler, the sci-fi writer; she has the Xenogenesis trilogy. And I think just the queasiness of that whole set of books, and the kind of complicated intimacies and relationships coexisting with coloniality, to me just feels somehow emotionally so true compared to a lot of other binaristic accounts of what it means to exist after and with colonialism.
I wanna come back to the toolkit idea. Yes, there's definitely a plethora of really interesting toolkit projects out in the AI ethics space, but yours in particular, and this is actually a different toolkit of yours, has really captured my imagination. So I saw that you've been working on a project called the AI Justice Matrix, and I'd love to know a bit more about it. So why did you start this kind of alternative ethical toolkit, and what does it tell us about what you were talking about at the beginning of this interview, which was the kinds of ethical knowledge that get prioritized in the field that we call AI ethics?
YASMINE:
Sure. So this was born out of a fellowship I did at the Ada Lovelace Institute, and my idea was to open up the process for how we construct AI ethics and it's called the AI Justice Matrix, but the subheading is the futility of policy craft. And I think that's really where my thinking sits, which is through observing how knowledge is produced and then how it ends up defining how we should behave in a kind of tech ethics context.
And I think we touched on this before. Fundamentally, some systems of knowledge have been prioritized and others suppressed. So my thinking is that therefore ethics and then regulation that goes to influence tech is never as good as it could be. And in fact, could just reproduce all the harms of how that kind of prioritizing came about.
So I thought of making this matrix, which is a series of nodes that exist online. And each node has a different theme that is either directly or tangentially related to AI technologies. So it might be surveillance or, I dunno, feminism or whatever. And then I invite practitioners, who are self-defined, to contribute to one or more of these nodes, and then I add their contribution to those nodes.
But it's a kind of Chatham House system: there is a list of contributors, but you don't see who contributed what. And I felt that would give some freedom to talk about topics in a way that wasn't self-limiting. And then if somebody looked at a contribution and wanted to respond to that, then they can.
And I curate the whole thing. So far I've only taken a handful of contributions; it hasn't fully launched yet, so watch this space. But it's a kind of acknowledgment that the way that knowledge is produced, and then the way that it's instilled or crystallized within knowledge institutions, particularly in Western Europe, carries over.
It's like a hangover of the colonial mindset, and I think that limits imagination and potential for what we could do. And then these institutions produce policymakers, they produce politicians and management consultants. And if they're all from the same crop and the same way of thinking, there's this kind of performative way of trying to consult with quote-unquote normal people, where a think tank might do a workshop or might have a focus group that just ends up reproducing the same stuff. Like, they know from the beginning what the outcome should be.
And particularly if you're a public institution, you are beholden to the government of the day, and whatever strategy and political intentions they have determine where the funding goes. And so as a member of an institution, there's informal self-censorship that goes on. Oh, and the tech lobby is super powerful and really good at what they do. And so there's a relationship between governments, the tech lobby, the tech industry, these large management consultancy firms, and then academic institutions and think tanks, that just reproduces the same stale ideas, and even high-level people within those institutions just circulate round and round.
And I just grew frustrated and I was like, it's boring. And so I wanted to make a space where people can just start being a bit more playful. And then perhaps from that, in the future, forms of regulation will appear, or ways of behaving with and treating these technologies can emerge organically.
But at the moment, I just wanted a space where, you know, what I've just described is acknowledged like in the background. And let's just talk about it in another way.
KERRY:
That's really interesting. And I really empathize with the way that a particular way of thinking gets so codified into even how you speak. Like, I remember just hitting a point in academia, in academic scholarship, often when you're talking about a group of people who are affected by a technology, you say they, right?
But then I would find it weird because I'd say, oh Chinese diasporic people, they're affected like this and I'd say, why aren't I saying we, I am one of these people. But there's just something about the whole language of it that tends to be quite separating and tends to assume a particular power position or a particular sort of viewpoint in society.
And that's why I think things like your justice matrix are really important. Cause yes, it's challenging the same set of concepts, but also the same set of language, the same set of actors, and trying to disrupt that. But yeah, thank you so much for coming on the podcast. It was a real joy and privilege to get to chat, and I'll be following all your different projects with lots of interest.
And for our lovely listeners, I'll be linking these all on the website thegoodrobot.co.uk, where we also always have a transcript of every episode. But most importantly, thank you so much again for coming on.
YASMINE:
Thanks, Kerry. This was great, cheers.
ELEANOR:
This episode was made possible thanks to the generosity of Christina Gaw and the Mercator Foundation. It was produced by Eleanor Drage and Kerry McInerney and edited by Eleanor Drage.