In this episode, we talked to Azerbaijani journalist Arzu Geybulla, a specialist on digital authoritarianism and its implications for human rights and press freedom in Azerbaijan. She now lives in self-imposed exile in Istanbul. Aside from writing for big publications like Al Jazeera, Eurasianet, and Foreign Policy Democracy Lab, she also founded Azerbaijan Internet Watch and is writing a political memoir about a lost generation of civil society artists in Azerbaijan. We chat to Arzu about Azerbaijan's use of technology to go after diasporic community members and people who've been exiled from the country, how women are more often targeted than men, subliminal propaganda, misinformation and censorship in the recent Turkish elections, and the importance of tracking and mapping internet censorship and surveillance in authoritarian states.
Arzu Geybulla is the founder of Azerbaijan Internet Watch and an award-winning journalist and writer, with a special focus on digital authoritarianism and its implications for human rights and press freedom in Azerbaijan. In the past, Arzu has written for Al Jazeera, Eurasianet, Foreign Policy Democracy Lab, CODA, and Radio Free Europe/Radio Liberty. She is a regular contributor at Open Democracy, IWPR, and Osservatorio Balcani e Caucaso. She is the recipient of several fellowships and was featured in the BBC 100 Women Changemakers in 2014. Currently, she is based in Istanbul, where she continues her journalism work and is working on her book about the story of Azerbaijani political dissidents.
Jamillah Knowles & Reset.Tech Australia / Better Images of AI / Social media content / CC-BY 4.0
Reading List:
Azerbaijan Internet Watch: https://www.az-netwatch.org/
Art, Propaganda, and the Cult of Personality: https://chaikhana.media/en/stories/1055/art-propaganda-and-the-cult-of-personality
KERRY MCINERNEY:
Hi, I'm Dr. Kerry McInerney. Dr. Eleanor Drage and I are the hosts of The Good Robot Podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list from every guest. We love hearing from listeners, so feel free to tweet or email us. We'd also really appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode.
ELEANOR DRAGE:
In this episode, we talked to Azerbaijani journalist Arzu Geybulla, a specialist on digital authoritarianism and its implications for human rights and press freedom in Azerbaijan. She now lives in self-imposed exile in Istanbul. Aside from writing for big publications like Al Jazeera, Eurasianet, and Foreign Policy Democracy Lab, she also founded Azerbaijan Internet Watch and is writing a political memoir about a lost generation of civil society artists in Azerbaijan. We chat to Arzu about Azerbaijan's use of technology to go after diasporic community members and people who've been exiled from the country, how women are more often targeted than men, subliminal propaganda, misinformation and censorship in the recent Turkish elections, and the importance of tracking and mapping internet censorship and surveillance in authoritarian states. We hope you enjoy the show.
KERRY MCINERNEY:
 Amazing. Thank you so much for being with us today. So just to kick us off, could you tell us who you are and what you do and also what's brought you to thinking about feminism and the internet?
ARZU GEYBULLA:
My name is Arzu Geybulla. I am a journalist, originally from Azerbaijan, currently based in Istanbul, Turkey. I focus on the South Caucasus and Turkey, and I cover most of the political and social developments in the region. But my passion is in researching authoritarian technology and surveillance.
And this is an additional hat that I wear. I founded a project called Azerbaijan Internet Watch a couple of years ago that tracks and documents the use of authoritarian technology in Azerbaijan, and I'm involved in a number of different initiatives that work on internet freedoms and the safety of journalists in that framework.
And I guess that also brought me into this vast world of technology, feminism, and equality, because the more I got involved in internet freedom work and the more research I did, the more it overlapped with my experience as a journalist, as a woman journalist: how often we are faced with online harassment, and how that online harassment can take a very ugly turn and many ugly forms. Being part of this experience, being the target of attacks myself, I really wanted to see change in this field. And so in that process, in that timeframe, I've also joined a group of amazing women journalists and various international organizations that are focusing on the safety of women journalists online and pushing platforms to listen and to take measures when these things happen on their platform. So I don't know if that answers your question, but lots of different angles of research, my own interests, and my own experiences brought me to where I am today.
ELEANOR DRAGE:
You do phenomenal work, and it's so unfair that you have to endure that in order to just do your job. I have such respect for journalists who write about very difficult, very challenging issues and face so much pushback because of the themes they're writing about.
But we'll get into that a little bit later. First of all, can you tell us: what is good technology, perhaps according to you as a journalist or in your context? Is it even possible, and how can feminism and other pro-justice ways of thinking help us work towards it?
ARZU GEYBULLA:
I think it is possible. And if anything, we need good examples of good technology: technology that answers to rights and equality, that answers to accountability, and that is open to further development, where mistakes or flaws identified by users are taken into account and addressed.
I think good technology, for me, is when I as a user have a say, not only in the usability of the technology, but also in making that technology even better for others. I think of good technology when its development is fair from the very start: when the people involved in writing the code or coming up with the tools are treated equally and fairly, and their rights are respected. And beyond all that, good technology to me is something that's open to improvement, not based on some white male technologist's perspective, but taking into account a variety of users with a variety of backgrounds and ethnicities in order to understand the nuances. Because when we think of technology these days, I often feel like it's always developed by one part of the world to be used in the other part of the world, without necessarily understanding all the intricacies and nuances of the world it's being developed for. Whether it's the local context or the culture or the language or specific expressions, all of these things can make a technological tool much better and more user-friendly. It's just a matter of being open to this, actually putting effort into wanting this, and being constantly in the process of developing and making it better.
In the context of Azerbaijan, can you give us an example of what that might look like?
ARZU GEYBULLA:
Well, I'll give you one example, and this is something that has happened recently, although we've seen these kinds of things happen over time. It concerns a political activist who has been in the crosshairs of the state for a number of years, has been in and out of jail, has run for parliament and, I think, in municipal elections as well. But long story short, he was detained back in December on bogus charges, and while he remains in detention he kept getting targeted by what we think are state-affiliated institutions or actors. One of the ways he was targeted was a campaign that used Telegram channels to expose his private messages with various women who either live in Azerbaijan or live abroad.
Among those women were activists, and I believe one was an artist who lives abroad. The idea was to damage his reputation, but the actors behind this campaign didn't really think about how the women who were made part of this targeting would feel, and how much danger the campaign was putting them in.
And we couldn't get these groups or channels removed from Telegram, because Telegram was completely unresponsive. We tried raising it through other channels, through personal contacts at other organizations that often get involved in these kinds of situations. But the platform itself simply does not respond to any of the calls for help.
To me, this is a perfect example of what bad technology is. Had we been able to get a faster response from the platform, or been able to remove those channels, a lot of the women who were targeted in this campaign would have felt safer and less afraid, because some of them had to leave their homes out of fear of violence from their family members.
Because Azerbaijan is a traditionally very patriarchal, very conservative society, even though it boasts about women's rights and whatnot; the cultural norms are very strong in that country. And that particular example shows how a technology that was, and still is, advertised as a means of communication in conflict zones and under authoritarian or autocratic regimes can actually do just as much damage to the users who rely on it.
So that, in the context of Azerbaijan, would be a really good example. And then of course there are platforms that do respond, because these kinds of things don't only happen on Telegram; we've had them happen on Instagram, on Facebook, on Twitter, and on YouTube.
Those platforms are a little bit more responsive, but again, that doesn't make them good technologies either, because at the end of the day the way Facebook, for instance, uses its algorithms is flawed, and so is the way they often respond to reports that accounts have been taken down unjustly or without any due course. It speaks to the fact that there are so many gaps in all these technologies that it's really hard to take one that exists, pit it against all the other ones, and say: okay, this is what you need to be like, this is what you need to take into account.
It can sometimes be really overwhelming, with all the different platforms and technologies and apps that we use, because at the end of the day you rely on them: they are your only option for safe communication, getting your message out, sharing your work. And yet you as a user are not offered the same amount of solutions, or ways to intervene, as you probably should be.
KERRY MCINERNEY:
Yeah, absolutely. And I think one of the biggest challenges we face over here in the UK or the US is getting locked into a very narrow but powerful range of big tech infrastructures: both in the sense that, as a user, you get locked in and it's very hard to move away from those structures, and environmentally, in terms of the massive costs of these kinds of infrastructures.
We're increasingly building, or having built for us, a world that we are not necessarily actively choosing, one that is immensely costly but also quite inescapable. But I actually want to pivot back to something that you said at the beginning of this conversation, which is that you are the founder of Azerbaijan Internet Watch.
For all our listeners, I really encourage you to go and check out this organization, but we'd love to hear a little bit more about it. So can you tell us about the backstory, why you started the Internet Watch, and then what are some of the interesting or surprising things that have come out of this project?
ARZU GEYBULLA:
Sure. So the way Azerbaijan Internet Watch started goes back to the Internet Freedom Festival that took place every year in Valencia. Unfortunately it had to stop at some point, right before the pandemic. I had been there for several consecutive years, and it's this really amazing community of developers, technologists, civil society, activists, and donors who come together for a few days to talk about matters pertinent to internet freedom. There I met a lot of really amazing researchers and members of the community who were doing really amazing stuff. Some of them were doing similar work: documenting censorship in their country, or documenting the use of various tools to curb internet freedoms there.
It was also around the time when there were more and more stories of various autocratic regimes purchasing surveillance technology and other technological tools from various companies to spy on their citizens, mostly to use against civil society, but also to keep track of what's happening within the country and keep tabs on what's going on.
So I spent about six or seven months researching information controls in Azerbaijan. I went back all the way to the early independence of Azerbaijan in the 90s and looked through all the key turning points in this regard: when Azerbaijan started installing black boxes on its mobile operators, when it started purchasing technology from various countries, how it was using it, how much it cost, and what the impact was.
And this research paper eventually led me to start Azerbaijan Internet Watch, because I didn't want it to be a one-time research project. I wanted to document these things on a regular basis, because when we talk about the internet and what's happening there, we're not just talking about all this different technology and all these different platforms; we're also talking about how autocratic regimes censor users in their countries. So I wanted a live platform where all of these different cases could be documented. And that's how it started, in 2018 or 2019. It started off mostly as a place to track all of these developments, and then it grew. We started providing digital security assistance on the ground to people who were targeted with various forms of spyware, and we started working with users on the ground to help us test blocked online outlets in the country. We partnered with OONI to have users in the country run the app to track the type of censorship that was going on, so that we could then analyze that data and put it into reports. And we also started looking into the legal framework of the country, because this is how restrictive changes happen in a lot of these countries: laws are changed or new laws are introduced that make it extremely hard for people to make sure they are protected online, or to take a surveillance case to court, or to object to how mobile operators collect their data and then sell it off to third parties and whatnot.
This is what Azerbaijan Internet Watch is now. It's a standalone project; I have collaborators in Azerbaijan who have been extremely helpful in getting it off the ground and keeping it alive. So yeah, for now I continue to document what's happening in the country. I took a short break because of other projects and work, but I'm hoping to go back to it and keep the project going.
ELEANOR DRAGE:
Well, it's an amazing project, and I really like how, in making these systems explainable to the public, as in "this is what they're doing", it also allows people to act on the information they're being given. So it's not just an explanation; it's tools to resist. And I wondered if, very quickly, you could tell us an example of the way this has been received. How have people responded, and what were they able to do with the information that you were producing?
ARZU GEYBULLA:
I think until a couple of years ago, people knew that they were being watched, or that their phones were tapped, or that there was some kind of technology used by the state. But I think with projects like Azerbaijan Internet Watch there is a growing awareness of just how deep down in the system it goes: how all of the state institutions are connected, how ISPs (internet service providers) and mobile operators are connected, and how much information is being collected and shared. Has it made people more cautious?
I don't think so, because there's also this very common modus operandi of: it's already out there, so what's the point of protecting myself or feeling stressed about it? But in terms of raising awareness, it has certainly changed people's perceptions. And those who are targeted, or who come face-to-face with this technology, come to Azerbaijan Internet Watch seeking help, which is great, because then I can put them in touch with relevant organizations that can provide more assistance, or help them get in touch with the platforms where they're facing some of these issues. So in that way it has helped. I can't speak to the global impact, but I can certainly speak to the community of people who do similar work, the ones I've met and continue to meet in this space. Awareness doesn't just grow among the public, but also among the actors involved in documentation and monitoring, because you see far more cases than you think there are. That makes it slightly depressing, because if it becomes a global trend then it's not good news, but it also gives you a lot of data to see where and how quickly technology travels, and how quickly it becomes a norm in bad states used by bad actors.
It also exposes you to developers and researchers who are actually trying to mitigate all of this, who are trying to come up with solutions, so I think there's certainly that angle to this work as well. And finally, I think the platforms themselves, and the companies that sell, for instance, surveillance technology, all these actors, realize that they can't just continue doing what they're doing; that there is a much bigger awareness of the impact of the technology and the platforms on users.
And that there need to be some checks and balances. Whether or not we're actually going to succeed at this is yet to be seen. There are a lot of protocols that have been signed, there has been a lot of advocacy around this, a lot of statements are being made, hearings are being held in the US, but I think we still have a long way to go.
Totally. And awareness is really powerful, and we shouldn't just throw our hands up in the air, say "what's the point, the data's out there anyway", and despair. We can resist, and this just shows that it's possible.
Absolutely.
Does this work intersect a lot with the project that you do on transnational digital repression with the University of Toronto's Citizen Lab?
ARZU GEYBULLA:
It does. Actually, that's how I found myself involved in this project as well, because it very much overlaps with the work I did in the past and was interested in doing. What's really fascinating about the technology that is often used to target civil society and civic groups is that the more we're involved in this work, the more we're seeing that it doesn't only happen within the countries where there are autocratic regimes, crackdowns, and censorship. We're also seeing this technology being used to go after exiled or diaspora community members who live in different countries, in order to silence them, because now that these regimes have achieved control within their borders, they want to expand and go after those who still have a voice, who are still able to do or say something about what's happening in the country.
And so within this project, basically there are a couple of countries and a couple of researchers focusing on these countries, and we're trying to see the extent of transnational digital repression and how it's being used, whether the government deployed trolls to go after journalists and their exiled media platforms, or whether they openly harass the users.
And we're focusing specifically on women, because the exiled and diaspora communities are quite diverse, but for this particular research we wanted to really focus on the gender aspect: how much of this technology, this targeting, is gendered, and how much more damaging it can be for the person being targeted, because the language is very different.
The impact is very different, and the costs for the individual being targeted can actually be really high. So yeah, it's very much related, and it's a fascinating project. I'm really excited to be part of it.
Thank you so much for sharing that. I also want to ask you a little bit about something that's happening right now, at the time that we're recording, which is the Turkish election, which is something you've been covering. What have you found in your coverage at this time? Have you found, for example, that cyber attacks or other kinds of online misinformation have been shaping the election? And if so, how?
Yes, absolutely, and thank you for that. So Turkey had a general election on May 14th. Turks were voting both for new Parliament members and for the new president. The Parliament vote is almost over, as in the voting ended on May 14th, but the count continued because there were reports of fraud, violations, and irregularities.
The Supreme Election Commission was still counting, and that is now done; at least we're not going to vote for the Parliament members again. But we are going into a second round of voting for the president, because according to the legislation here, in order to be elected president in the first round, a candidate must receive over 50% of the vote, and neither of the two leading candidates was able to get over that threshold. Now, in the run-up, during the vote, and until the second round, we've seen censorship and we've seen misinformation. We've seen how the state is resorting to all sorts of mechanisms to really control its narrative: that they're better than the opposition candidate, that they're doing everything for the good of the people.
Now, it's really hard to explain how damaging this is without going deep into the history of censorship in Turkey, and I'm not going to do that right now because it's a topic for another podcast or another time, but just to explain the background.
Turkey is a country where there has always been, and especially in the last 20 years, censorship and a crackdown on independent journalism. A lot of journalists are behind bars, the situation with internet freedoms is really dire, and the state resorts to all kinds of ways and tricks to censor independent outlets, whether by blocking specific stories they publish about state actors, or by intimidating the people who write them, or by freezing their accounts. All kinds of things. So it wasn't surprising, of course, that ahead of the election, and now between the rounds, when you open pro-government media or turn on television channels, a lot of it is about the current president, Tayyip Erdoğan, and his successful government.
They would not, of course, say anything about what's actually happening in the country. They don't talk about the inflation; they don't talk about the devastating earthquake and how much impact it had on the people who lived in those provinces. The extent of media manipulation is just so vast that it's a really interesting country to be in, to observe elections, to continue conversations with journalists, because the censorship is real.
You can publish something on your website about graft, some scandal involving a government official, and the next day you can wake up to find that your story has been censored by a court ruling, because local courts here, the criminal judgeships of peace, have that authority.
And it's the same with platforms. There's a really great website called Ekşi Sözlük. It's an open dictionary, so to speak, where people put in information and it's open to the public: you can have an account, you can add stuff, and it's often where people make a lot of jokes about certain political contexts, but also social issues.
It isn't like Wikipedia; it's really hard to explain what it really is, but in English its name translates to "sour dictionary", so there are a lot of those kinds of sour words used on the platform. But anyway, Ekşi Sözlük has been blocked in the country. It was blocked ahead of the election, and despite attempts to get the court ruling overturned, the platform manager has not been able to do so. At some point they were able to open the website again, except on election day it was censored once more, so the website remains blocked. And this is just one example; there are plenty of other websites that are getting censored, or that get fined, which is another very popular mechanism used by the state. One final example I'll give about the extent and the scale of censorship, or rather of misinformation: there's a pro-government channel. Actually, it's not even right to say pro-government, because most of the media is owned by the state or by companies affiliated with the state.
So this one television channel was running a news segment on the new ballots. In the previous election there were four presidential candidates on the ballot; now there are just two. They had the ballot on the screen, and the presenter was pointing with his stick at how to vote. You know that there are two presidential candidates, Tayyip Erdoğan and Kemal Kılıçdaroğlu, except they only showed Erdoğan's picture and name; the other slot was blank and just said "the other candidate". And when you vote for a presidential candidate here, you use a stamp with "yes" on it, so you stamp "yes" for either one. In the graphic, the yes stamp was under current President Tayyip Erdoğan.
So this is the type of misinformation that is circulating both online and on television channels. But one other thing I wanted to say about all of this: I do notice that I live in a bubble of people who think like me. A lot of the stuff I get on social media is about the importance of this election and about the narrative the ruling government is putting out there, and I know that all of it is wrong, that this government needs changing. But the electorate who votes for the government doesn't see this.
The extent of information that we're exposed to versus the extent of information that they're exposed to is very different. And so misinformation in that environment becomes really easy, and it's really easy to manipulate the narrative.
ELEANOR DRAGE:
Oh my God. Just the meeting of subliminal propaganda, misinformation, and censorship is terrifying. I have a friend I was with yesterday who is working for a Turkish company, and she said that she was really upset and confused that she hadn't seen it coming, that no one told her the elections were rigged. And I was just thinking about what it really means for elections to be rigged, because there are certain kinds of behaviors that we, certainly in the UK, think about when we hear the term "rigged election", and knowing that you have to look out for other kinds of behaviors and other kinds of practices makes things a little bit trickier.
ARZU GEYBULLA:
Absolutely.
ELEANOR DRAGE:
I really want to ask you this next question. On the 8th of May there was this Twitter storm around an open letter that you co-wrote, titled "Open letter to news media and policy makers re: tech experts from the global majority", signed by an extraordinary group of people, including Sasha Costanza-Chock, Timnit Gebru, you, Jessica González, Joanna Varon, just an absolutely astounding range of people. It's a letter written by women and non-binary people from the global majority working at the front lines of AI.
But what was the point of the letter? What did you want to achieve and how did you get in contact with this wonderful community of people?
ARZU GEYBULLA:
Sure. So I met a lot of these incredible women through an initiative launched by the Ohana Foundation, based out of Vancouver and also founded by women who worked in the media industry but then decided to do something else. I was invited to become part of this initiative, which is trying to write what is basically a global declaration on digital human rights.
We all came together last year, when this started, and had a wonderful few days of discussions, trying to really understand how we were going to do this, whether we could actually do anything, and what was going to come out of it.
And of course we remain in touch, because this is an ongoing project, and the letter was suggested by Jessica González and other women in the group as we were discussing these new developments. We keep discussing, of course, everything that's happening at the intersection of technology and human rights and feminism, and exclusion and non-exclusion, and everything.
It all started with what was going on at Twitter, of course, with the takeover by a certain gentleman. Then we continued discussing what it means for all the users who rely on platforms like Twitter to get their message across and to share what's happening within their communities, and what it meant when the blue check marks were taken away from us.
It took a lot of work to get those check marks, because we applied for them, because we all had a certain following and were representing certain affiliations and whatnot. And then, of course, there's the dominance of white dudes on panels talking about the importance of platforms and usability across wider platforms, without any of the wide variety of women and non-binary experts who are out there; they're just never contacted for these meetings and these platforms. There was a lot of frustration. And so this is how the idea came about: to encourage not just the international organizations or initiatives that host these kinds of events, but also journalists, when they interview the "fathers of the internet" or the "grandfathers of technology", to remember that it's not just a male-dominated field. There are plenty of incredible experts of the other genders that are so often forgotten within this environment, within this work. So that's how this letter came about. And I have to say, all the credit goes to Jessica and all the other incredible women who actually sat down and put together the text that was then sent out.
KERRY MCINERNEY:
And I think it's just so important and so amazing to hear about this project, because one of the reasons why Eleanor and I started the podcast is that people would often come to us and say, "oh, what's some good stuff happening to do with gender and tech? We're tired of hearing about all the bad stuff. We want something positive and uplifting." To which we'd say: we can't always be telling positive stories, because ultimately we are critical researchers who are investigating how power shapes AI. But what we can do is show you that there's an incredible amount of feminist activism in this area, and that if you say, "oh, I can't find any women, I can't find any queer folks, I can't find any people of color in this field to talk to", then maybe you're not looking in the right places, because there's such a depth of expertise. And to that end, it's just been a real privilege to get a chance to chat to you on the podcast today and hear about your expertise and your work, and we hope to stay in touch with you.
ARZU GEYBULLA:
Absolutely. Thank you so much for having me. This was really great. And I actually want to echo Timnit's words, because there's this really great interview with her in The Guardian where she says: "I just feel like we have to do what we can. Maybe if enough people do small things and get organized, things will change. That's my hope." End quote. So I really think it's important that we keep on doing this, even if it's a small thing.
ELEANOR DRAGE:
Yeah, I totally agree, and I always bang on about this, but there was a great phrase I heard once from Nikita Dhawan about the daily act of brushing your teeth, this very incremental kind of resistance that she calls political tartar control, which I'm obsessed with, and I think that
ARZU GEYBULLA:
I love it.
ELEANOR DRAGE:
It's supposed to be unsexy; it's supposed to be hygiene, and it's kind of boring, and sometimes you forget. So thank you so much for all your hard work.
ELEANOR DRAGE:
 This episode was made possible thanks to the generosity of Christina Gaw and the Mercator Foundation. It was produced by Eleanor Drage and Kerry McInerney, and edited by Eleanor Drage.