
Leonie Tanczer on Gender, Security and Technology

In this episode, we chat to Dr Leonie Tanczer, a Lecturer in International Security and Emerging Technologies at UCL and Principal Investigator on the Gender and IoT project. Leonie discusses why online safety and security are not the same when it comes to protection online; how to identify bad actors while protecting people’s privacy; how we can use ‘threat modelling’ to account for and envision harmful unintended uses of technologies; and how to tackle bad behaviour online that is not yet illegal.



Leonie Maria Tanczer is Lecturer in International Security and Emerging Technologies at University College London’s (UCL) Department of Science, Technology, Engineering and Public Policy (STEaPP). She is a member of the Advisory Council of the Open Rights Group (ORG), affiliated with UCL's Academic Centre of Excellence in Cyber Security Research (ACE-CSR), and a former Fellow at the Alexander von Humboldt Institute for Internet and Society (HIIG) in Berlin. Prior to her lectureship appointment, Tanczer was Postdoctoral Research Associate for the EPSRC-funded PETRAS Internet of Things (IoT) Research Hub, where she was part of its "Standards, Governance and Policy" research team. Tanczer holds a PhD from the School of History, Anthropology, Philosophy and Politics (HAPP) at Queen's University Belfast (QUB). Her interdisciplinary PhD project included supervision from both social sciences and engineering (ECIT) and focused on the (in)securitisation of hacking and hacktivism.


Content Warning: This episode contains discussion of tech abuse, domestic violence, gender-based violence, and coercive control.


Reading List


Parkin, S., Patel, T., Lopez-Neira, I., & Tanczer, L. M. (2019). Usability analysis of shared device ecosystem security: Informing support for survivors of IoT-facilitated tech-abuse. Proceedings of the New Security Paradigms Workshop, 1–15. San Carlos, Costa Rica: Association for Computing Machinery. https://doi.org/10.1145/3368860.3368861

Tanczer, L. M., Lopez-Neira, I., Parkin, S., Patel, T., & Danezis, G. (2018). Gender and IoT (G-IoT) Research Report: The rise of the Internet of Things and implications for technology-facilitated abuse (pp. 1–9). London: University College London. Retrieved from University College London website: https://www.ucl.ac.uk/steapp/sites/steapp/files/giot-report.pdf

Slupska, J., & Tanczer, L. M. (forthcoming). Intimate Partner Violence (IPV) Threat Modeling: Tech abuse as cybersecurity challenge in the Internet of Things (IoT). In J. Bailey, A. Flynn, & N. Henry (Eds.), Handbook on Technology-Facilitated Violence and Abuse: International Perspectives and Experiences. Emerald Publishing.

McGlynn, C., Johnson, K., Rackley, E., Henry, N., Gavey, N., Powell, A., & Flynn, A. (forthcoming). ‘It’s Torture for the Soul’: The Harms of Image-Based Sexual Abuse. Social & Legal Studies. https://doi.org/10.1177/0964663920947791

Harris, B. A., & Woodlock, D. (2018). Digital Coercive Control: Insights from Two Landmark Domestic Violence Studies. The British Journal of Criminology. https://doi.org/10.1093/bjc/azy052

Dragiewicz, M., Burgess, J., Matamoros-Fernández, A., Salter, M., Suzor, N. P., Woodlock, D., & Harris, B. (2018). Technology facilitated coercive control: Domestic violence and the competing roles of digital media platforms. Feminist Media Studies, 18(4), 609–625. https://doi.org/10.1080/14680777.2018.1447341

Citron, D., & Franks, M. A. (2014). Criminalizing revenge porn. Wake Forest Law Review, 49, 345–391.

Freed, D., Palmer, J., Minchala, D., Levy, K., Ristenpart, T., & Dell, N. (2018). “A Stalker’s Paradise”: How Intimate Partner Abusers Exploit Technology. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 667:1-667:13. New York, NY, USA: ACM. https://doi.org/10.1145/3173574.3174241

Havron, S., Freed, D., Chatterjee, R., McCoy, D., Dell, N., & Ristenpart, T. (2019). Clinical computer security for victims of intimate partner violence. 28th USENIX Security Symposium, 105–122. Santa Clara, CA.

Freed, D., Palmer, J., Minchala, D. E., Levy, K., Ristenpart, T., & Dell, N. (2017). Digital Technologies and Intimate Partner Violence: A Qualitative Analysis with Multiple Stakeholders. Proceedings of the ACM on Human-Computer Interaction, 1(CSCW), 1–22. https://doi.org/10.1145/3134681

Yardley, E. (2020). Technology-Facilitated Domestic Abuse in Political Economy: A New Theoretical Framework. Violence Against Women, 1077801220947172. https://doi.org/10.1177/1077801220947172

Henry, N., & Powell, A. (2018). Technology-Facilitated Sexual Violence: A Literature Review of Empirical Research. Trauma, Violence, & Abuse, 19(2), 195–208. https://doi.org/10.1177/1524838016650189

Tseng, E., Bellini, R., McDonald, N., Danos, M., Greenstadt, R., McCoy, D., … Ristenpart, T. (2020). The Tools and Tactics Used in Intimate Partner Surveillance: An Analysis of Online Infidelity Forums. 29th USENIX Security Symposium (USENIX Security 20), 1893–1909. Retrieved from https://www.usenix.org/conference/usenixsecurity20/presentation/tseng


Messing, J., Bagwell-Gray, M., Brown, M. L., Kappas, A., & Durfee, A. (2020). Intersections of Stalking and Technology-Based Abuse: Emerging Definitions, Conceptualization, and Measurement. Journal of Family Violence. https://doi.org/10.1007/s10896-019-00114-7

Stevens, F., Nurse, J. R. C., & Arief, B. (2020). Cyber Stalking, Cyber Harassment, and Adult Mental Health: A Systematic Review. Cyberpsychology, Behavior, and Social Networking. https://doi.org/10.1089/cyber.2020.0253


Transcript


KERRY MACKERETH:

Hi! We're Eleanor and Kerry. We're the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And what does feminism have to bring to this conversation? If you wanna learn more about today's topic, head over to our website, where we've got a full transcript of the episode and a specially curated reading list with work by, or picked by, our experts. But until then, sit back, relax, and enjoy the episode.


ELEANOR DRAGE:

Today, we’re talking to Dr Leonie Tanczer, a Lecturer in International Security and Emerging Technologies at University College London. We discuss why bad behaviour online can still be violent, even if it’s legal; how to balance online safety and security so that we can identify bad actors while also protecting people’s privacy; and how to improve threat modelling to account for and envision scenarios where technologies are put to unintended, harmful uses.


KERRY MACKERETH:

Great, thank you so much for joining us. Could you please introduce yourself and tell us a bit about what you do and what brings you specifically to the topic of feminism, gender and technology?


LEONIE TANCZER:

Yeah, my name is Dr Leonie Tanczer. I'm a lecturer in international security and emerging technologies at University College London. And I believe the topic of feminism, gender, and technology relates to my work in this space, and I’m particularly interested in the intersection of gender, security, and technology. And I've been keen on this topic probably since I was 16 or 17. That's when I started to become very interested in gender equality issues, because I realised, well, boys were treated quite differently to girls like myself. And I think after my interest in gender inequality issues and feminist issues, I started to become more interested in technology issues after I took a summer school with Professor Nikolaus Forgó a couple of years into my university degree and was opened up to issues around privacy and safety and security that really blew my mind. And I saw immediately the connections between the importance of making sure that technology is safe and secure, and gender and feminist issues.


ELEANOR DRAGE:

Our podcast is called The Good Robot, and with this in mind, we'd like to ask you what good technology means to you? Can technology ever be good?


LEONIE TANCZER:

You know, that's a very interesting question, because the term ‘good’ is a normative word, and it depends on good for whom. I certainly believe that there is something like good, but what I understand as good and useful and helpful and beneficial might be different to what some other people believe to be good. But I just generally think that we could try our best to make more secure, safe and privacy-aware technology that has the human rather than, for example, economic values in mind. And I think that's perhaps the ground rule for good in my regard. I think good has to do with making technology human-centred, making it focused on the needs of the individual or the society. And even society is a very broad concept here, but, like, for the specific community that you're designing a system or device for. And I think that's really what good technology is: having a specific user group in mind, but with this not making it specific to white heterosexual men, but potentially applicable to a very unique group that may have other needs beyond what an able-bodied person with a lot of privilege might have in mind. And I think that's what good technology is: designing for a specific user group, with their interests in mind.


KERRY MACKERETH:

Fantastic, I'd love to hear a little bit more about this. What kinds of harms or problems can emerge from not taking into account the specific needs of a particular user group?


LEONIE TANCZER:

Gosh, there's dozens of harms that I can think of. I mean, you mentioned racism and sexism. I think these are the ones that we hear a lot about in the news today and they're extremely harmful to so many groups and communities on this planet. But I think there's also other social categories that we need to discuss more carefully. I think one term that is, you know, heavily contested, and potentially even in the UK context is perceived negatively, is class. You know, social class is something I think is really important when it comes to the accessibility and usability of technologies. Who has the opportunity to purchase certain devices, but also the knowledge and time to engage with these devices, to make sure that they're safe and secure? That is something I think is really important, and, unfortunately, is not discussed enough at this moment in time. The other thing I think about when it comes to harm is also able-bodiedness, you know, thinking about people with speech recognition problems, people with an accent. I mean, you can hear it from my own way of speaking, I don’t have perfect English as of now, I'm working very hard on it. But nevertheless, my Amazon Echo continuously does not understand me asking for certain requests. So I really think it's important to consider the diversity of people, whether it be their gender, their ethnicity, their language, their sexuality, but also their class. And so I think that is something where harm can really come into play when these categories are ignored, or we simply become oblivious to them, because we treat all people the same. But I think one particular one I want to hone in on is actually economic harm. I think, you know, one thing we are often focused on is economic harm, because we consider it a crime.
You know, there's prosecutions, but I think there is some value in discussing both illegal forms of harm, which we already have legislation against, such as cybercrime, but also, what is now incredibly important is to talk about things that are perhaps not yet illegal, but are harmful nevertheless. So in the UK, we have discussions around this with regards to the online harms white paper. And I know that the communities that work in this space, that study this space, are absolutely torn apart on the question of, you know, what's illegal, what's legal, and whether we can regulate things that are harmful but legal. And I think, you know, that goes to showcase the necessity to broaden our horizon of what's ok and what's not ok. And ok, again, that's a normative question. But potentially, you know, what we have defined as illegal has transformed over years and decades and centuries as well. And perhaps we need to broaden our understanding in this regard as well, in the future. That's something I have been thinking of for a while. But I think it's important in the conversation around harm that harm does not necessarily mean something is automatically illegal, because it can be legal, but harmful nevertheless. I can give a concrete example if that's useful. In this regard, for example, I'm working a lot on intimate partner violence and technology. And unfortunately, with the support organisations that we're working with, with the victims and survivors that we're working with, many of them are challenged by the fact that their experience is not necessarily illegal. So for example, a partner, you know, posting something on Facebook that is absolutely legitimate, like a picture of a dog or simply an mp3 of a certain kind of music recording. You know, you can share that content, but it has a very harmful or potentially harmful meaning to the person that is on the receiving end. So how are we going to deal with that, you know?
A technologist would probably say, just block that person, ignore it. And, of course, that is one avenue to deal with this. But in the context of a prosecution, I think these elements also need to be taken into account by a judge, to say this person deliberately tries to create an environment which makes it difficult for the victim and survivor to, for example, use this platform or act freely online. And I think, while it's not illegal to post a picture of a dog or an mp3 file, it is important to contextualise these things more than we have done in the past.


KERRY MACKERETH:

Absolutely, thank you so much for all those really insightful points and examples. I think it's particularly illuminating to hear about the work that you do on intimate partner violence [IPV] and new and emerging technologies, which is something that, you know, personally really, really concerns me. And you've mentioned that you come from a background in international security, and something you mention a lot is this idea of safe and secure technology. So could you explain a little bit what you mean by the terms safe and secure? And how do those concepts play out in your work?


LEONIE TANCZER:

That's a really, really good question. So I have a German-speaking accent. Now, the fascinating thing is, in German ‘safety and security’ does not exist, it's just one term, and probably the best way to describe it is security. Now, since I have worked more in the English language, I really love the fact that we have ways to differentiate between safety and security. And for those unfamiliar with the difference, I always give the example of a door. You know, for security purposes, we should lock those doors all the time, we don't want an intruder to come in, right? But for safety purposes, all these doors should be open all the time. Because if there's a fire, or if there's something happening inside, all people, all individuals should be able to leave, for example, the house immediately. And you can see that both these concepts are extremely, you know, exclusionary, they don't necessarily work together. And that's what I like about the English language here, having words to describe these differences, which does not really work like that in German. So what I encounter in my work day to day is that these two concepts don't work together. Because we may think that a certain functionality, a certain feature, is beneficial for a security purpose, such as, for example, locking things, or making things harder to access, or making things harder to trace. But for safety purposes, it is really important sometimes to have these traces and those bits of information that, for example, victims of intimate partner abuse can use for prosecution purposes. So I think I encounter the day-to-day dilemma of these two concepts, and then bringing even privacy into the mix makes it even harder. Because, you know, it gives this additional level of making sure that individual information from all the people that are involved in a system is protected.
But sometimes we want to perhaps uncover someone who is harmful in a situation and reveal their identity, which is not possible because we have ensured those privacy settings are in place. And to this very day, I think we haven't found a very good solution of how to make a good measure of all these ingredients - to make it nearly like a shake and think about it as a cocktail. We haven't found the perfect mix yet. But I do think it is essential that we have a discussion around how much safety, how much security, how much privacy is needed for, let's say, certain groups of communities, and also allow for different levels of safety, security and privacy for high-risk individuals as well. And I think the fact - coming back to the original question - that we just have this standard template of a user in mind means that we often forget that we need to ensure that different settings and features should be applicable to different individuals differently.


ELEANOR DRAGE:

You've applied feminist methods and ideas to your work in lots of ways. So what does feminism mean to you? And has your relationship with feminism evolved over time?


LEONIE TANCZER:

Absolutely, I would say I - I teach a course at UCL called Gender and Tech. And I always laugh in the first term about my course, because I feel like it's really easy to give this idea of feminism as this kind of all-encompassing concept, but it's not. And I often ask my students at the start of the term, do you consider yourself a feminist? And, you know, I would say this year 50% of the students were like, yes, I am, and all the rest of the class was not, and that changed by the end of the term. I hope not to the negative, because of my course, but nevertheless, I think it's important to discuss feminism as not one single thing. I initially started off, I would say, as very much a Marxist feminist, because I think it's really important to account for aspects around, you know, class and the interrelationship with gender and how they play out together. But I would say I've moved on and become more interested in other elements as well. Where, you know, I nearly hate to say it, perhaps we can call it neoliberal feminism, where you talk about things like equality in the workplace, where it's no longer just about smashing the system, but actually kind of working in it when you have to. So I would say, where I'm situated now, I'm not yet clearly sure. I certainly see the roots that I have, and where I've come to. But for me, feminism is a simple understanding of ensuring that all genders - and it’s really important here, not just women and men, but all other forms of gender that are there - have a form of equal representation and equal opportunities in the workplace, in public, anywhere that is important. And I think that's kind of what feminism boils down to for me. But then, you know, it’s important to contextualise that it's different for different people. So if you talk - I'm a white privileged woman from Central Europe.
So you know, the realities that I'm exposed to, that I feel on a day-to-day basis, are completely different to those of, for example, some of my students that may have come from the Global South, maybe having, you know, a huge discrepancy with regards to access, to abilities, etc. And what they've probably experienced feminism to be will be very different to me and my history and my experiences. Nevertheless, what I think we all can agree on is that it's important that we all have justice, that we all have equality, and that we're all treated fairly. And I think that's really what it boils down to.


ELEANOR DRAGE:

I wanted to ask you about how feminism plays out in very different ways in all of your different work, all of your publications, can you give us some examples of how you explore feminist theory or methods in different ways in your work?


LEONIE TANCZER:

I think one of the most important texts I ever read, and one of the most important experiences I had during my university time, was, you know, thinking about situatedness. And thinking about, like, your position when you write something, when you research something. And I think that is the core understanding I took away from feminist theory: that when I write something I make clear who is writing it, and from what standpoint I'm writing it. So standpoint theory is a really important theoretical approach for me as well. I probably don't make it very explicit in my publications, but I do, I’m very considerate about having limitations sections, accounting for my positionality and being transparent about what standpoint I'm writing from. And I think that's something that, you know, we all need to take into account, because when I read a paper, let's be honest, most of us immediately envision what that person is like, and probably what their experiences are. And I think instead of having someone interpret this, just make it explicit and say, ok, I'm writing this, you know, from that perspective, and I'm coming to this as an outsider. And I think that's also something important to make clear in a text, because when you are not, you know, a Black woman and you're writing about, let's say, Black feminism, you know, your words and your analysis may come to different conclusions than if you were, and it's important to reflect upon that. So I think that's something I try and aspire to do, I certainly could do better. But I think there's also this clash with the expectation of the scholarly discourse of how much you write about ‘I’ and how much you bring yourself as an individual into your research. I think there's more openness in certain pockets of academia where it's accepted, but I say that with the caveat that I'm also working a lot in areas in engineering, where this is probably still rather old-fashioned.
And I think, nevertheless, it's important to make this transparent, at least in your public communication of your work. So I think that is something where I bring in feminist perspectives. In addition to that, I think I'm just so concerned always to put a gender lens on anything I do, to the detriment of not always being very liked, even with some of my colleagues, probably. But I think, you know, of course, I could bring in other categories such as class, you know, ethnicity, anything like that. But again, because of where I am and who I am, gender is something that just comes immediately to my attention. And I can see immediately, like, things playing out differently than if I were not the person who I am. So gender is something that, when I look at even a technology, immediately pops out, and I think immediately, ok, why could it be that this is probably more user-friendly to, you know, a certain type of user than another? And that's probably a reason why I'm drawn to this category, because there's no way I can not see it, if that makes sense.


ELEANOR DRAGE:

Timnit Gebru has said that there's still a strong feeling in computer science that if you're an activist, you can't be a computer scientist, because computer science requires neutrality. And we were speaking last week with Sneha Ravenur, who is this fantastic young activist, and she said that Gen Z will produce a new generation of ethical developers, which is quite an exciting idea. And I was wondering, as someone who teaches and also interacts with younger generations, do you think that activism will relate to computer science differently for younger computer scientists? How might the relationship between activism and computer science evolve in the future?


LEONIE TANCZER:

That's a really, really good question. I mean, I have pondered about my role as an activist in the scholarly discourse for a very long time. Because I would say, as I said, earlier, I started off in a socialist Marxist environment, and then probably started my career in academia. So my roots are in activism, but I have ended up in a scholarly discourse. So here, I say it and make it public. But with computer science, I wouldn't even say it's, you know, limited to computer science: this idea of neutrality of objectivity is enshrined in all academic disciplines with a few exceptions. And I think I have no clear answer to it. I think we need to navigate a very, very thin line. And I'm saying this because I also am very interested in questions around scientific advice. So how is the scholarly community also helping to make decisions and evidence-based decisions about policy interventions? I mean, what better time to talk about this than during a global pandemic? And let's be honest, if the public would have any doubt about the, you know, information that, let's say scientific advisors would give, there would be immediately a critique and a doubt in their analysis and in their conclusions. So, of course, I understand why it's important to give that impression of neutrality, even though I accept that if you have read some of the science and technology studies literature it’s impossible to achieve. But with that being said, and that brings me back to the standpoint theory idea that I said, it's important to be transparent of where you're coming from, because that's the only way I can interpret the evidence and the conclusions that you draw. And so I think it's ... I'm sitting in this dilemma that like, of course, every part of my research has the intention and the aspiration to make society better, and well better, there is that, but also to change policy. Does that make me an activist? Probably it does, insofar as I'm trying my best to get my research into policy. 
And, you know, try my best to communicate it to a public audience beyond the scholarly community. But I'm also treading this fine line not to overstep and come across as, like, a full-time activist, if that makes sense. Because I do want to still have this notion of having based my estimations or my assumptions or my conclusions on evidence. I understand that this is so difficult to achieve and, you know, I would understand if someone would still accuse me of some form of, like, higher intention beyond my scholarly work, of course, because I'm interested in having a fairer and more gender-equal world. But I still try my best to produce research where it's traceable how I came to this conclusion, make it clear in my methodology sections how I went about doing certain things, and, you know, make it clear what standpoint I'm writing this from, and then you can reject that based on all the limitations that are there, full disclosure here.



So it's difficult, I really don't know where the fine line is. I certainly - because you've mentioned my students, I was joking with them as well that, like, you know, they might accuse me of, you know, having an agenda in my classroom, and I certainly do, I want them to be critical, I want them to question what they are taught and what they see in the world. But does that mean it's unacceptable? Or it's too close to activism? I don't know. I think it depends on the discipline. I mean, it's easier in a social science discipline to make this argument than, for example, in maths, potentially, not to say that maths is neutral, either, but ...


KERRY MACKERETH:

Thank you so much. It's really interesting to hear you think through these kinds of thoughts and questions, because I do feel like as feminist scholars, we're often asked to bring forward such strong justifications of why this very outward-focused work, this work which does have an agenda and a rationale, you know, is scholarly work as well. And I feel that destabilising that kind of binary between scholarship and activism has always been such an important project for feminist scholarship and work, and is one reason why, you know, I think Eleanor and I both really love it and are so invested in it. Yeah, and also, all the things you were saying about the perception of neutrality and the way in which feminist perspectives and scholarship have so fundamentally been challenging that. And it also reminded me of something you were saying, though, in your previous answers, around the interesting distinction you're drawing between security and safety and privacy. And I'm interested in how certain technologies are perceived to be neutral, and how things like, for example, technologies that are used to lock a door, things that are seen as improving security, can get repurposed in really dangerous ways, often quite gendered ways. I was wondering, would you mind talking a bit about the secondary uses of some of these new technologies and the kinds of harms or problems that they can pose in relation to things like IPV or other social harms and issues?


LEONIE TANCZER:

Yeah, absolutely. I think I should start by saying, you know, technology is not neutral. Tada. And what I mean with that is, every technology is dual use. Now, what I mean with dual use is it has a certain type of purpose that was envisioned by the designers and the company that sells it, but it may be, you know, misused or repurposed or used in a different context by another person. Even simple examples: you may want to use your Xbox to mine Bitcoin. That was not what the Xbox developers had in mind, but you ended up doing it nevertheless, and it works. So dual use here doesn't necessarily mean dual in terms of positive and negative, it just means it has multiple purposes. And I think the same applies to this context of, like, you know, the smart lock or the smart doorbell, whatever you want to call it. If it is installed for a certain type of purpose in a certain type of environment, you know, it exactly fulfils the goals and intentions that the designers had in mind: that if I'm not at home and my Amazon delivery comes, I can let them in, or I can at least accept the delivery order. That's, you know, the perfect environment - again, perfect in regards to what the developers had in mind. But - and that makes it so difficult to account for all the different circumstances under which technologies will be and are used - there will be situations where it is not going to be, you know, this perfect environment that you had in mind as a developer. And I think it's important, for purposes of what we normally refer to as threat modelling, to account for as many of these as possible and envision scenarios where some things are going wrong. And I wonder then, how would you design things differently? Or what functionalities would you perhaps amend, change or perhaps even omit as a consequence of these different threat models that you play through?
And a core critique from our research project at UCL, the Gender and IoT project, is that most threat models consider external actors, with external being, you know, a person that is not in this perfect environment and tries to invade this perfect environment. But as any intimate partner violence scholar, any gender studies scholar, any criminologist will tell you, you know, the perfect household does not exist. And the domestic environment is not always happy and loving. There's often, you know, fights, there's harm, there's harassment, and there's violence. And so it's important to also account for the internal threat actors that sit within your own system. And technologists do that. They think about internal threat actors, but then they are employees. And employees are different, because you have different mechanisms to act against them, because they're not of the same status as, for example, the employer. Whereas in the domestic abuse environment, we do not want to have these discrepancies and authority levels. And that's something then a designer has to account for: that not everybody has the same abilities and the same access to the systems as they envisioned them in the first place to be and to have. So basically, I think it's important that we model different scenarios and account for them and make a decision, nearly like you do when it comes to an ethics application. For example, in the research environment, I'm currently reading a lot of them. And I'm always wondering if our students, or anyone who applies for ethics, are thinking about all the scenarios that could go wrong. And that's what an ethics application should be about. And I think that's what threat modelling in industry should be about: thinking about all the different scenarios.
And if you're struggling to envision those, because, you know, you're white heterosexual men, then bring on board other people that have different experiences, because that's the only way we can envision all these different avenues.


ELEANOR DRAGE:

We're interested not just in what feminism can do for technology, but what technology might do for feminism. So what's your take on that?


LEONIE TANCZER:

Well, I definitely am of the opinion that technology can do things for feminism. I mean, in my course on Gender and Tech, we talk a lot about cyberfeminism. It's something that was very, very popular in the 1990s, but has since, I would say, cooled down a bit. And the idea back then was that technology, and specifically the internet, would revolutionise the way that genders interact, because, you know, on the internet, nobody knows you're a dog, and so nobody knows that you're perhaps a woman or a man, or God knows what. But these hopes were unfortunately not fulfilled, because we do know that there's research showing that the type of writing, the words people use, indicate the gender they may be affiliated with. And because we increasingly have not just text-based communication, but, you know, we see each other on Zoom calls, on Teams calls, we can infer associations with gender because of that. So this anticipation was unfortunately not fulfilled, and the hype around cyberfeminism did not last. With that being said, I think most recently the work on xenofeminism is really exciting. For those of you unfamiliar with xenofeminism, it basically tries to take the ideas of cyberfeminism and bring them into the 21st century, accounting not just for the ideals that cyberfeminism probably got a bit wrong, but also for the possibilities, not just with regard to gender, but also race, intersectionality, you know, transgender identities, being more inclusive. It's not just talking about men and women, but talking about other types of identities, and also other races, and thinking about how technology transforms these social categories too, and how it can both benefit these categories and harm them. And to think about the positives.
You know, I certainly think the movements and developments we have seen in recent years and decades in the feminist movement would not have been possible without the internet and technologies. I mean, it's an old-fashioned reference point, but the MeToo movement would not have been as big as it is if there weren't, you know, Twitter and this whole hashtag movement. Equally, I think there are avenues to use technologies for raising awareness and community building, because there are opportunities for people to connect, to share their experiences, and to know that they're not alone. So I definitely think, you know, technology has tonnes of things to give to feminism. But I think, as your question poses, feminism gives things to technology and technology gives things to feminism, and they both, you know, profit from and also potentially are harmed by each other. And when I say harmed in regards to feminism, I don't mean it in a negative way, like, you know, because of technology, feminism is less important or anything like that, but rather, I think, the negative aspects we've seen as a consequence of technology use: maybe intimate partner violence through technology, or maybe just online harassment. That's something I think that has a negative effect on a lot of vulnerable groups and communities, including, you know, gendered minority groups, at a scale and to an effect that we wouldn't have seen if there weren't the internet, for example. Equally, in my research on smart technologies and the intimate partner violence situation, you know, the necessity to be physically close to someone to harm them has now been completely transformed. I no longer need to be in the same environment, let alone in the same country, to change settings, to scare someone, to check in on someone.
And that's something, I think, that has meant that where previously it was potentially possible for me to move country, to move places, and I still ran the risk of being found, now I leave digital traces that make me far easier to find and far easier to connect with. And I think that's certainly transformed the ways intimate partner violence has been playing out in recent years, and will unfortunately continue to do so as well.



Well, thank you so much for such a rich and interesting discussion of the various kinds of issues and problems, but also some quite exciting new directions that your own work, and other feminist work, is heading in as well.


KERRY MACKERETH:

A huge thank you really from Eleanor and I, for appearing on the podcast. I've certainly learned a whole lot and I'm sure that our listeners will as well.


Perfect. Thank you so, so much for having me. It's been a great pleasure, and I hope many, many people will follow into this technology, feminism and gender studies area and be interested in learning more about this space. Fantastic.




