Kerry Mackereth

Hot Take: Twitter is to X as Barbie is to Ken

Welcome to this week's Hot Take, where your hosts Kerry and Eleanor give their candid opinion on the latest in tech news. This week they discuss the rebranding of Twitter as X and how people like Elon Musk have an outsized impact on the daily technologies that we use, on the kinds of technologies that get made and created, and on the kinds of needs that get prioritized in the tech industry when it comes to user preferences and desires. From X to the Barbie movie, they explore why diversity matters in the tech industry, as well as why trying to understand what 'diversity' is and what it means in context is much trickier than it sounds.


Reading List:


The Alan Turing Institute. Report: Where are the women? Mapping the gender job gap in AI. https://www.turing.ac.uk/news/publications/report-where-are-women-mapping-gender-job-gap-ai


Costanza-Chock, S. (2020). Design Justice: Community-Led Practices to Build the Worlds We Need.


D'Ignazio, C. and Klein, L. (2020). Data Feminism.


Ahmed, S. (2012). On Being Included: Racism and Diversity in Institutional Life. Durham; London: Duke University Press.


Criado-Perez, C. (2019). Invisible Women: Exposing Data Bias in a World Designed for Men.


Newkirk, P. (2019). Diversity, Inc.: The Failed Promise of a Billion-Dollar Business.


Transcript:


ELEANOR DRAGE:

Hi, and welcome to another episode of The Good Robot Hot Takes. Every couple of weeks, Kerry and I give our hot take on the biggest issues in tech. Our conversation today about what diversity means in tech takes us from the new Barbie movie to the Twitter rebrand, Ursula Andress, and all-women panels. We hope you enjoy the show.


KERRY MCINERNEY:

Hello and welcome to our latest Good Robot Hot Take, where we bring you our spicy opinions on whatever is happening in tech. And as you know, if you follow anything to do with AI in the news, there's always a lot happening.


So this morning, for example, I was on Twitter and found out about Elon Musk's plan to rebrand Twitter, dropping the little bird logo and the name Twitter in favor of just being called X, like the letter X. I don't know why. Honestly, the whole last year of Twitter has just felt like a sort of crazy fever dream.


For those of you who are thankfully not on Twitter anymore and have escaped, well done. But this really brought to light one of the things that Eleanor and I think about quite a lot, which is diversity in the tech industry, or thinking about these big tech figures, people like Elon Musk, who have such an outsized impact on the daily technologies that we use, on the kinds of technologies that get made and created, and on the kinds of needs that get prioritized in the tech industry when it comes to user preferences and desires. Eleanor and I work on gender, race, and technology.


So this is something that we think about quite a lot. Eleanor, just to kick us off, could you tell us a little bit about the tech industry or specifically our area, the AI industry? What does this industry look like at the moment? Who's there, who's in the room, and maybe who's not in the room?


ELEANOR DRAGE:

I just want to go back to your Twitter point, because the X thing is interesting. It implies that X stands for everything; X is used in math, so it could be this universal...


KERRY MCINERNEY:

And there's something about it that's amazingly universalizing and yet also vague. And I love that, because I feel like there's been this profound directionlessness when it comes to Twitter over the last few months or year, and I feel like the X captures that. But also the branding of it: it's just a black background, and then a black and gray X, compared to the kind of cheery blue of Twitter and the white bird. It looks very much like when they brand things for men, like deodorant for men, like Lynx or whatever.


It's like if Lynx had rebranded Twitter, that's what it looks like.


ELEANOR DRAGE:

Yeah, like a bachelor pad, or like a Clinique men's line: you know, only use concealer if it comes out of a black applicator tube.


KERRY MCINERNEY:

Exactly. It's like, men hate color. Men can't stand any semblance of color or joy. There's also the very on-trend Oppenheimer film discourse going on at the moment, and the rebranding of Twitter is very much in line with the Oppenheimer aesthetic.


ELEANOR DRAGE:

So it's about appealing to men under the guise of appealing to everybody. And that's a real problem. We look at diversity in a lot of different ways. I think that Kerry and I, as gender studies people, never really expected to be working in diversity and inclusion.


So we look at the particular ways that the world becomes masculinized or feminized, those kinds of processes where gender can be seen, because gender is about understanding how people live in a world that is so structured by power.


And so coming to diversity and inclusion in tech was something that, yeah, we didn't really intend to do. And we look at it in lots of different, maybe surprising ways. We are interested in those questions of who's in the room, but there are people who do that better than us, like Judy Wajcman, or the people who have been looking for a long time at how tech companies, and the workplace generally, deal with gender. Instead, I think the contribution I see in our work is that we look at things you may not normally look at, like how the idea of maintaining AI systems is being thought about, and whether we could inject some feminist ideas around maintenance and an ethics of care.


And then, by looking at those things, we think, okay, we also need different voices that can introduce these kinds of ideas. Obviously it's still really important when we think about which technologies can be harmful. For example, Internet of Things devices can be used to track and monitor people in the household.


And this is often used by domestic abusers. Now, if you're someone who has already experienced domestic abuse, you're more likely to flag to your tech team that that's something that could happen. And in the company that we were working with, that's what happened. There was someone on the team who had experienced that.


And so they flagged it to the rest of the organization. And that's why you need diversity, also a diversity of experiences. It's not just women, right? Not all women have experienced domestic abuse, but the victims are usually women, which means that having someone on your team who can flag that is really important.


KERRY MCINERNEY:

And I think this is a tension in our work that I certainly experience quite a lot, which is that on the one hand, we see gender and race as these socially constructed systems of power. And so we don't really want our work to just be reduced to this kind of head counting, where we're saying who's in, who's out. But on the other hand, we also really believe in this idea of lived experience and the way that our lived and bodily and personal experiences are deeply political and really matter. And so in that sense, it actually does really matter who is in the room.


And so I definitely feel myself oscillating between these projects, which are very much about how we create more meaningful, engaged, participatory tech design processes where everyone is meaningfully involved, while also recognizing that just bringing people into a space, especially one that, as Sara Ahmed argues, might not have been designed for you or for you to flourish, isn't the only answer and certainly can't be the whole answer.


ELEANOR DRAGE:

Yeah, this is such an important point, that we bring all of ourselves to work. We all know this: you turn up, and it's not just you and your expertise on the day, it's what you've been through that morning, and you can get ideas from all aspects of your work. So that's why diversity is important: you're not just coming to the work with your qualifications, you're coming with all your embodied experience, everything you've experienced day to day, and all those things influence the kinds of products that we develop or don't develop and the choices that we make.


KERRY MCINERNEY:

And I also think, though, that something the pandemic really showed me, with the switch to a lot of remote working, was how uneven these experiences are and the extent to which people can bring themselves to work. And this is something I particularly notice when dealing with, say, male colleagues or men in the tech industries, who often say things like, oh, I care about diversity because of my children, or because I'm a father, or this is my main interest in this area, or this is why I need to shape my career in a particular way. And on the one hand, that's really lovely. And again, there's important feminist work to be done in challenging very rigid and very heteropatriarchal ideas of parenthood. On the other hand, I think, oh, I'd love to just go into my workplace and say, I'm a mother, if and when I have kids, and not have people immediately think, okay, she can't do her job then, which I think is still a very pervasive sexist belief about women in the workplace who are also parents. Or thinking about how race has really mediated people's ways of being able to bring themselves into a workplace, and how part of creating better and more inclusive and more diverse workplaces is recognizing that there are a lot of people who very deliberately construct a lot of walls and say, I'm not willing to be open and to bring myself into this space, because I don't feel safe here.


And again, sorry, looping back to the pandemic, this is where I was originally going with that point: the switch to remote working, and how many people, particularly women of color, I talked to who were very much like, nope, I prefer working at home. I don't want to go into the office. It's just safer for me to work from my home environment.


And that obviously wasn't the experience of all marginalized people. But it was interesting, anecdotally, to hear how unsafe a lot of people felt in the institutions they worked in, and that they felt it was safer to have a separation.


ELEANOR DRAGE:

Exactly. So some people are allowed to bring their full selves to work, and some people are not. And that's often why diversity and inclusion fails in organizations: you can hire a diverse group of people to work for you, but if they're not allowed to bring their full selves to work, or they're having to change who they are in the workplace, then you're not creating a diverse workforce. You're creating it in name, but not in reality.


KERRY MCINERNEY:

Yeah. And I think also recognizing that there are so many differences between people who get coded within corporate structures as diverse. I'm really interested in how we see a lot of these inequalities playing out in relation to race, but there are so many kinds of inequalities and harms which are not as easily made visible as something like race, which we understand as being very epidermal, as being written on the skin and on the face.


And yet these kinds of inequalities are profoundly important, but they aren't as easily commodified and tokenized in the way that I think race is in these corporate structures. So for example, I'm incredibly passionate about access to institutions like Cambridge for students of color.


But the most underrepresented group, when I was originally helping out on Cambridge's access side at the undergraduate level, was people who have been through the foster care system or various kinds of state or social care. And this is a group, I think, that doesn't necessarily get the same kind of attention or get talked about, because it's not as visible a kind of exclusion, but it's a really profoundly important one.


And I'm sure there are so many other examples of this: what kinds of diversity, what kinds of difference from this very white, middle-to-upper-class, heterosexual norm get counted, and what kinds of difference don't.


ELEANOR DRAGE:

That's really interesting. I didn't know that.


KERRY MCINERNEY:

In my access training at Cambridge, someone said, oh, this is actually the group that we're working hardest to try to create more pathways for. And yeah, that was just something I was not aware of.


ELEANOR DRAGE:

Yeah, you wouldn't be able to create that kind of diversity without knowing that in the first place, and that's why this kind of research, however they discovered it, is a really important aspect of making institutions more diverse. Because without that informational knowledge, that group effectively doesn't exist: we don't know that it's something that needs to change.


KERRY MCINERNEY:

Yeah. And I think this is another one of those tricky tensions, like the one I discussed earlier around head counting versus who actually has to be in the room, which also comes up in collecting data around diversity.


Again, I keep saying on the one hand, on the other hand, which is a very lazy kind of essay structuring, but it's early on Monday morning, so that's fine for now, I think. Diversity data can be incredibly powerful for generating change. We had Monica and Ella on a previous episode of the podcast, which I'll link in the show notes and on our webpage, where we have a full transcript for each episode. They talked about their anti-racist data collection project, End Everyday Racism at Cambridge, where they collect a lot of affective and emotional data about racism and how it affects how students and staff move through these different institutional spaces. But on the other hand, and Kevin Guyan talks about this in his episode on queer data...


It's something I think we've probably both personally experienced: how we collect data can cause a lot of harm, whether that's boxing people into preexisting categories that don't make sense for them, or creating and using categories that are offensive. But collecting data can also sometimes stand in the way of doing more important work.


So for example, at Cambridge, it just came out this morning, I got an email, that in 2018 Cambridge University very quietly stepped away from its Stonewall commitment. And again, it didn't advertise this. We didn't really know this. So it doesn't matter how much data the university tries to collect, say, about queer experiences; there's now been this fundamental breach of trust between the university and the queer advocacy networks and organizations among staff at Cambridge, because of the way the university has shown that it doesn't necessarily really care about creating this kind of positive, queer-affirming environment.


So yeah, I think that this question of data collection itself is also just such an interesting and contested one.


ELEANOR DRAGE:

So you're saying that there's this double bind because on the one hand we need to collect information so that we know, for example, that people who've been raised in foster homes are less likely to get into Cambridge.


But on the other hand, the process of counting people is not as easy as we think, because it's difficult to know how people are being counted, the way that they're being quantified, whether that's helpful or not, and whether it's going to be used against them as well as for them.


KERRY MCINERNEY:

Yeah, exactly. And I say this as someone who very clearly is a social scientist from a more qualitative, arts and humanities-based background methodologically. But I think there's always something necessarily reductive, and quite ugly in many ways, about trying to quantify life experiences that don't fit very well into discrete categories.


And yet many of these institutions only understand data that has been quantified in a particular way. They want to track trends and map people very neatly against certain data points. And that in and of itself poses a lot of problems, because again, most people don't want to be reduced down to a data point.


And this is something we talk about in our hot take on hiring tech, which again you can find on our webpage, and on YouTube, Apple, Spotify, many platforms. There's something in that process of reduction itself which almost mimics, I think, some of the forms of stereotyping or reduction that a lot of marginalized people experience throughout our lives.


And then that itself can be quite painful to have to relive that in the name of creating positive change.


ELEANOR DRAGE:

Yeah, people hate being reduced to data points, and this is exactly what happened when, for example, white males were told, you're just a white male, and they said, no, I'm me, I'm a person.


And they got to feel what it was like to be reduced to just a demographic: oh, I'm not just a white male, why does everyone hate white males? And you're like, now you understand what it's like to be discussed as a group of people rather than as an individual. So that's a really interesting thing that I think has happened recently.


KERRY MCINERNEY:

The whole Barbie and Ken thing, and the male outrage about the portrayal of Ken, has been very on point with this. For those of you who haven't seen the Barbie movie, it is hilarious. But Ken is just portrayed the way every woman is portrayed in pretty much every single film ever written by a man. And now people, particularly men, are very outraged at how Ken is objectified, how Ken is kept to the sidelines of the plot. It's also not even accurate, but to some extent, I think it does bring me a little smidgen of petty joy to say, actually, this is how women have seen themselves on screen for decades.


ELEANOR DRAGE:

Yeah. And the Barbies get defined through their work, right? They do this diverse range of jobs: I'm a rocket scientist, or I'm in Congress. Whereas Ken is just defined as "beach," and beach is not even a job description. So there's this very limited way that Ken can live, because of what kinds of boxes he finds himself in, literally.


KERRY MCINERNEY:

I love "beach" as his job as well, because it reminds me of every Bond girl coming out of the water in a bikini, and it's like, was she just waiting for someone to come by? Her job is beach; what other job does she do? I don't know. And it felt like an inversion of these tropes, which are otherwise very normalized in film: of course there's going to be some woman in a bikini just waiting underwater with a straw to breathe through for some guy to walk by, so she can come out of the water. But when it's Ken whose job is beach, suddenly it becomes this big societal flashpoint.


ELEANOR DRAGE:

Yeah, that's excellent. And it's an ontological statement as well, which means it's a question of being, because beach is not a thing that you do, it's a thing that exists. And so if you are beach, that just defines the whole of you. It's not an activity that you can engage in; it's literally who you are. He is just beach. And he can't do anything else because... he can't even really do beach, because it turns out the waves were made of plastic, which is interesting. You just hurt yourself on the waves.


KERRY MCINERNEY:

Yeah, no, exactly. I was actually giving a talk yesterday at this fantastic event organized by Green Lions, which is an East and Southeast Asian community group around climate justice. So if you are London-based and ESEA and interested in climate justice, definitely check out Green Lions. But the talk was on this idea of objectification, thinking through different kinds of gendered and racial histories of objectification and how that relates to environmental degradation. And one thing I was talking about, having just seen the Barbie movie, was what it means to be an object. I really liked that line in the film where she says, I want to be the one doing the imagining, not the idea. And I think that's this beach thing. Beach is an idea; beach isn't even one located place. It's just this plasticky concept of sunshine and beach. And I feel like the film captured really well what it means to only be an image or only be an object.


ELEANOR DRAGE:

Exactly. So back to Musk, right? He has so much power to imagine the world. He is literally creating tomorrow. And that means we need to be really careful about who we put in those positions that are so crucial in deciding what tomorrow looks like. And there is a lack of diversity: you can see it with Musk's friends, the other sort of innovators, who all have similar disciplinary backgrounds, look pretty similar, and whose life experiences aren't really far from one another. This means that tomorrow is extremely limited, but also very dangerous, because it's a place that will be genuinely unsafe and unfruitful for lots of different kinds of people.


KERRY MCINERNEY:

Yeah, and I think, again, the point here is not to say, oh, we don't think men should be making technology, or that we just need to get more people of a certain demographic in the room. It's almost more broadly zooming out and saying, not only should the communities who are affected by technologies be the ones making them, because there's such a disconnect between people like Musk, who have control over the tech that we use, and the people who actually experience its ill effects, but there's also this bigger question of what we take to be normal. Why is it normalized for us that a very narrow group of people from a very narrow demographic, which tends to be white men, are just expected to be occupying these spaces of power?


So I went to the Sustainable AI conference in Bonn. It was fantastic. But there was a gender and AI panel, and two separate male audience members asked, why is this panel all women? And this kind of garbage is unfortunately quite normal at conferences, these slightly aggressive questions, though in this case I think they were actually very earnest: why aren't men represented on this panel on gender? The panelists handled it very well. They said, look, the point is not that we don't think men have important things to say about gender, because gender and patriarchy profoundly shape and affect men's lives too.


But it's just so normal to go to a panel at a conference on AI and for there to be no women, or one token woman, that we put together this panel of all women to show conference organizers and show people how many phenomenal women are out there working on AI. And so next time they do a panel, there's no excuse.


They need to have a lot more women on those panels. But the fact that the very sight of an all-female panel at a conference is so abnormal to some people, such an affront to their understanding of what it means to be an expert and where knowledge comes from, that they have to actively question it and push for a different vision...


That's something that just really worries me.


ELEANOR DRAGE:

It's a real misunderstanding about what balance means. You're looking at balance only within the context of four people, rather than within the context of our entire history over the past however many thousands of years. You're trying to extract those four women from this wider context, where balance means slowly undoing the masculinization of most of these important domains, which is really closed-minded and also a very petty, angry thing to do. How annoying.


I think only Ken in the Barbie movie is entitled to ask why Congress is only women, but we don't live in a Barbie world, unfortunately. And so you cannot imply that reverse discrimination is a complete undoing of the world in which we live, because it's unfortunately not; it doesn't even scratch the surface. Equally, an all-female panel is asymmetrical to reality rather than its opposite.


KERRY MCINERNEY:

Yeah. And I think it's one of those things where it is genuinely tricky in some ways to negotiate these questions around who speaks and how you try to create spaces. And again, this comes back to the question of how you thrive in institutions that weren't made for you. I'd like to create spaces where women's voices and marginalized people's voices are meaningfully taken seriously, meaningfully taken as expert voices, and are treated equally and prioritized, in a way that is sensitive to the centuries of educational exclusion and systemic injustice that have led to people of color and women being seen as non-authoritative voices.


But anyway, thank you so much, Eleanor, for joining me for a very long discussion of the Barbie movie. What a nice way to start off a Monday. If any of you are interested in the questions we raised, we have a lot of different Good Robot episodes that touch on these themes of diversity, of inclusion, and also of systemic change in the tech industry.


They'll be linked in the transcript for this particular episode, which you can find on our website, www.thegoodrobot.co.uk. We also love to hear from listeners, so if there's anything in particular that you'd like us to discuss on our next hot take, feel free to write in to us; our email addresses are available, or you can find us on Twitter, Instagram, and TikTok.


But until then, we will see you in a couple of weeks' time for our next episode.


 




