
Hot Take: Can you own your data?

In this episode we welcome Eleanor back from Slovenia, where she was speaking at a conference on digital sovereignty. But what is digital sovereignty, and what does it mean for you and your data?


READING LIST:


The future of data trusts and the global race to dominate AI: https://www.bennettinstitute.cam.ac.uk/blog/data-trusts1/


Reclaiming the Digital Commons: A Public Data Trust for Training Data https://arxiv.org/abs/2303.09001


TRANSCRIPT:


INTRO:

Hot takes with the good robot.


ELEANOR DRAGE:

Hi, and welcome to this episode of the good robot where Kerry and I will be talking about digital sovereignty. I went to this amazing conference in Slovenia, and this was the title of the conference. So we'll talk about what sovereignty means today, what it means in relation to technology. And we'll have a little bit of back and forth, a gentle debate about whether it's a good term to describe digital rights issues.


We hope you enjoy the show.


KERRY MCINERNEY:

Welcome back to our latest good robot hot take episode. And today we're going to be talking about something super interesting, which is the concept of digital sovereignty. And if you're thinking, what on earth is that? Don't worry. We're going to be walking through how interesting and how rich and how contested this idea and this term is.


But the inspiration for this episode came from a conference in Slovenia, from which Eleanor has just returned. So welcome back, Eleanor, tell us a little bit about your trip. What is Slovenia like?


ELEANOR DRAGE:

I love it. I'm such a huge fan. My favorite thing about it is these really eerie kind of spooky atmospheric mountains.


And then there's a bridge called Butcher's Bridge, which has the most exciting version of Prometheus I've ever seen. So the guy that stole fire from the gods. And it's like really gruesome and he has a tail and there's also butchered humans on the bridge. And it's part of this really interesting, rich kind of cultural mythology that Slovenia has, and also incredible food. Maybe not so much for non-meat eaters, but it was fantastic. And the people were great.


KERRY MCINERNEY:

That sounds incredible. For context, I don't eat meat, which is why Eleanor has made that caveat for me, but it sounds like it was a wonderful trip. So you had a conference wonderfully named DISCO.


Could you tell us a bit about the organization and what you were discussing at the conference?


ELEANOR DRAGE:

Yeah, so the conference was on digital sovereignty, but that was a term that was created by the people who started the conference. And the idea was just to talk a little bit about what it means to have authority and control over your data.


So sovereignty in that sense. But Kerry and I are going to discuss what sovereignty might mean and why it may or may not be the right term to use. Basically, it was a conference that brought together civil society actors in tech, and this is super important because they often fly under the radar.


So NGOs that sue Facebook, engineers that aren't working for big tech that beam Wi-Fi networks throughout big council buildings... there are so many really cool, really important people who are making sure that GDPR stands up, who don't work for state actors. And it always shocks me a little bit how much democracy relies on non-state actors in order for society to thrive.


So it was a real joy to meet these incredible people. There were some people that really stood out to me. So Amnesty Technology is quite a smallish group, but it's evolved from a team of 5 a few years ago to a team of 25-30, and they are suing Facebook over Myanmar. And this is an incredible thing, because standing up to Facebook involves an immense amount of resources.


So they have lawyers, they have tech experts who are basically holding big tech accountable. You might think, how does that relate to human rights? But they look at the business model of big tech as a human rights issue. And I think this is really important, to bring human rights from a legal context into what Facebook is doing, because it's really difficult to find a law that you can use to combat the way that Facebook is acting.


So if you position it as a human rights issue, then you can bring in human rights legislation into how they're behaving.


KERRY MCINERNEY:

That's fascinating and a really important conversation to be having right now, given that we're recording this on Monday, the 30th of October, and the AI Safety Summit takes place in the UK this Wednesday.


And very disappointingly, I think some of the really crucial issues that you're raising around human rights, around business structures, around the impact of AI on our societies are not necessarily going to get foregrounded at the summit. And like you say, we're now relying on non-state actors and civil society actors to be doing that work for us.


ELEANOR DRAGE:

There you go. It's literally that. So in some ways it was encouraging to see that there are incredible people who are working , in politics, but also with different networks and communities to try and make a difference. But also, there's a lot of issues that were raised that I had never heard about before.


So for example, there was a group of these pro bono lawyers who hold big tech accountable. One thing that they're doing at the moment is trying to make Facebook reinstate the Facebook and Instagram pages of a Polish NGO called the Civil Society Drug Policy Initiative. They were conducting educational activities on the harmful consequences of drug abuse, and in 2018, Facebook just removed all of their pages and fan pages without explanation. Of course, they don't feel like they have to say more than that the group was violating their community standards. And so this group of lawyers, the Panoptykon Foundation, helped the NGO file a lawsuit.


And the really interesting thing about what they were doing is that it involves a lot of creative lawyering, right? Because there's no law saying that you can sue for being taken off a platform. And Meta, interestingly, said that the case shouldn't go to the Polish courts, which is where this whole thing was taking place because it was a Polish group, because they really didn't want to be held accountable under Polish law. What they wanted to do was use the courts in the U.S., where Facebook often operates, so they know what they're doing in that jurisdiction. And they justified it by saying that Meta doesn't speak Polish.


KERRY MCINERNEY:

Interesting.


ELEANOR DRAGE:

The Poles were like up in arms. People were like, how can you say that? You have literally millions of Polish users and people who work for the platform.


KERRY MCINERNEY:

Absolutely. I think it also raises interesting questions and really troubling questions about what happens when you have these huge corporate multinationals like Meta slash Facebook and like many other big tech companies, which do operate in a hugely transnational way.


And yet, how are they supposed to not be beholden to the laws of the countries in which they operate? That's extremely bizarre to me.


ELEANOR DRAGE:

I know. It's shocking. It's also so rude. I'm just shocked by the whole thing. The other really cool group that I met was a group called the European Centre for Digital Rights, which is a non-profit based in Vienna. And it's an organization that tries to get different actors to uphold GDPR. Now GDPR is this bit of legislation that the EU came up with, the General Data Protection Regulation.


And it's an amazing bit of legislation, but as with all EU regulation, it's really hard to make people uphold it, because you have to literally go onto every website to make sure that they're upholding GDPR. And GDPR is really important for us as citizens because it means that our data is being protected, our privacy, all those things.


So we should really take it seriously too. And they said that only 42 percent of cookie banners comply with GDPR. Now, I tried to remember what a cookie banner is, but it's that thing that like pops up?


KERRY MCINERNEY:

Pop up where it's oh, this website uses cookies. Am I fine with that? And if you click yes, then it disappears and it takes your data.


And if you click no, it's okay, sure, we'll give you the website. And it's grrr, and it's like full of these clashing boxes. And they give you like the worst possible operation of the website.


ELEANOR DRAGE:

Great, there you go. What a summary.


KERRY MCINERNEY:

A very technical definition.


ELEANOR DRAGE:

Our specialty. So what they have to do is scan websites and monitor how data is being used, and then they save a file and attach it to a complaint that can then be used by a lawyer if it comes down to that. But what I found really interesting is that they said that 90 percent of people responded better to nice emails saying you don't comply with GDPR than mean ones.


And I think that's just a really useful life lesson that actually sometimes if you remind people gently that they're making a mistake, it's more effective than shouting, which is my specialty.


KERRY MCINERNEY:

It's just shouting or correcting people nicely when they've made a mistake.


ELEANOR DRAGE:

Yeah, you've seen all the like mean emails I've sent .


KERRY MCINERNEY:

But no, I really am so impressed with the work that they do, because that is arduous work. And I think that's the kind of unhappy but important side of civil society, where it is often tiring and boring and it just involves a lot of the grunt work that is absolutely necessary.


So a huge round of Zoom applause for civil society. UK government, stop crushing civil society. But yeah, civil society is hugely important. But I actually want to come back to the term that you mentioned at the beginning, which was digital sovereignty, framed in this conference as control over your own data.


And that was really interesting, 'cause when you first said the phrase digital sovereignty to me, I thought you meant something very different. I thought you meant how countries or nations approach issues of internet governance and cybersecurity.


ELEANOR DRAGE:

That's the difficulty with that term. I think they went with it because it was a conference in Slovenia, right?


In the context of the breakup of Yugoslavia, Slovenia was the first to declare sovereignty in 1990. Then Croatia followed in May, and in August, Bosnia-Herzegovina also declared itself sovereign, right? So sovereignty for them is something that is really important in recent history. But of course, this conference took place last week, when all these conversations around Jewish sovereignty and the right to self-determination and the violent effects of this have really come to the fore in Gaza.


And because you're the international relations expert, maybe you can tell us a little bit more about how sovereignty is used in different contexts and what it gets us to think about, at least.


KERRY MCINERNEY:

Yeah, that's really important framing, I think, because I think this term sovereignty I think does mean a lot of different things in different contexts, right?


And I know, Eleanor, you've done a lot of work on the concept of autonomy and how that's often used in HCI, human-computer interaction, and how it's used in AI and computing to think about the rights and the movements of the individual. And I know that there are many feminist and anti-racist critiques of this term, because it often denies our interdependency, or can entrench us in an idea of what it means to be human which is really individualistic and is not very responsive to our physical needs or our cultural and social needs.


I think sovereignty has some parallels in this sense, because in the same way that autonomy has been critiqued, it is also very important to certain communities as a way of expressing their claims to self-determination. So for example, Indigenous groups in Chile often do rely on this idea of autonomy, autonomía, as a way of laying claim to their data and their land. I think sovereignty can have a very similar connotation.


On the one hand, sovereignty can very much be a word or a term that is used to lay claim and to dominate over certain lands, to remove dissent, to form homogenous kinds of nation states with very violent effects. On the other hand, sovereignty can also be a way that people say I want to lay claim to this space.


I want to be self-realizing and self-actualizing in a way that is oriented towards justice and towards freedom. Personally, because I'm both an IR theorist and also come from a country, Aotearoa New Zealand, where this idea of sovereignty has been extremely contested and extremely controversial, I find the term sovereignty a really interesting choice when it comes to thinking about digital rights, because there's just so much historical baggage that comes with that term.


ELEANOR DRAGE:

Are there any particular movements or organizations or political statements that are being made around sovereignty in New Zealand at the moment?


KERRY MCINERNEY:

I'm going to say this with a huge pinch of salt, with apologies to all the lawyers and all the New Zealand historians.


I'm just a gal from New Zealand who studied this in high school and in my first semester of university, so my understanding is probably extremely rudimentary. But there's one of the kind of foundational documents, if not the foundational document, in colonial New Zealand and the founding of what we might call the contemporary New Zealand state.


So Aotearoa is the Māori word for New Zealand. And so you'll hear me swap between the two as I'm talking. So this foundational document's called the Treaty of Waitangi, and it was signed between some Māori iwi or tribes and representatives of the British government in 1840. It is now still the foundational legal document, but it is hugely contested.


Partly because not all iwi signed on to this document, and partly because there were huge issues with translation, because there are two documents: one in English and one in Māori. These were translated so badly and are such different documents that people now say it makes more sense to talk about them as two different documents.


So we often say the Treaty, which is the English version, and Te Tiriti, which is the Māori version. And some of these issues are to do with just huge issues of linguistic translation. Eleanor, you're multilingual, you know how hard it is to translate well. Also, I'm pretty sure the missionary who translated this did it in an all-nighter, which is not the right approach for a foundational legal document.


But also, this treaty has three articles, and within these articles there are some linguistic differences that, to me, speak to really fundamentally different cultural framings of particular ideas. A key one is sovereignty, and I think this is the first article. So article one says, in the English version, basically, Māori people hand over sovereignty and all of its attendant rights to the British government.


And the British government understood that as fully empowered, autonomous rule over New Zealand. Whereas in the Māori version, I believe it says that Māori iwi give kawanatanga, which is more like chieftainship or governorship, to the British government. And so I think it's been broadly argued that what Māori iwi thought they were giving was more like chieftainship or iwi governorship in the Māori context, which is more like stewardship, not this kind of notion of totalizing sovereignty.


And then we see similar issues with the second article, where the English version says Māori hand over land, fisheries, forestries, and lists a whole lot of natural resources. The Māori version says Māori hand over taonga, so that's treasures, everything that's precious to you, right?


There are really different ideas of property at play. And so that was a very long way of saying that the concept of sovereignty is still really contested in New Zealand law, because of the way it was the foundational principle of this colonial document, this sort of central colonial tool.


ELEANOR DRAGE:

Yeah, which is cheating, cheating people out of what belongs to them.


And this language of sovereignty is now being reused in the Māori context with AI, right? Because there's this big conversation. I'm reading from a really cool resource about ChatGPT and Māori data sovereignty that we can include in our reading list. And Māori populations are trying to work out whether ChatGPT is going to threaten their sovereignty over their data and their information, or whether it could be a useful tool.


And these are really interesting ongoing debates.


KERRY MCINERNEY:

Yeah. And I think it's also an interesting point as to the way that the term sovereignty itself is not a static term. Yes, it's a very contested one, but also one, like autonomy, that gets claimed back in certain ways. The huge one being data sovereignty, which is a very active question in many regions around the world, but one which I think in Aotearoa has been particularly prominent, and a space where I think New Zealand is really leading, particularly through Indigenous data sovereignty groups.


ELEANOR DRAGE:

Awesome. So there you go. History lesson for free from Kerry. And now we're going to listen to some responses, which is really exciting from the participants at the conference to the question of what is good technology? Is it even possible? And what is data sovereignty? So we're going to do live listening and commenting, which is going to be a little bit experimental.


ELEANOR DRAGE:

Kerry, are you ready?


KERRY MCINERNEY:

I am. I've gone slightly blue. I'm using a new camera, for the people watching us on our YouTube. If you'd like to see how blue I look, you can go to our YouTube channel, because we have video as well as audio.


But apart from that, I'm very ready to listen and I'm excited.


ELEANOR DRAGE:

So the first person that we interviewed is called Maria, and they work for this organization called Mnemonic, and they do really cool work around social media and human rights violations.


MARIA MINGO:

My name is Maria Mingo.


I work for an organization called Mnemonic. I work on the preservation of social media content that concerns serious human rights violations, crimes against humanity, and war crimes. Essentially, it's making sure that the preserved content can be used for investigations and accountability.


So good technology, it's a difficult one. Overall, in our group discussions, we thought that it was technology that responds to a specific problem, a specific need, without creating additional problems in the process, in light of the current circumstances. So it's something that needs to be future-thinking and sustainable, but therefore might not really exist, or there's always a trade-off. And that was really the question that we were thinking about. So it's about talking about technology that perhaps does the least harm, or finding a way to always re-evaluate it as we go along. So having some kind of oversight mechanism would be key to making sure that technology can indeed be good.


ELEANOR DRAGE:

What I think is quite interesting about Maria's response is that she, like quite a lot of people, said good technology is not necessarily something that will exist or can exist. And keeping it in the domain of the speculative I think is really important, because it shows that good technology isn't a thing; it's a process.


KERRY MCINERNEY:

No, I thought that was a really wonderful and thoughtful answer. Something I did find quite interesting, that I would be really interested to debate more with her, is this idea that good technology responds to a problem, because I completely agree: we see a lot of wonderfully useless technology and horrifically useless technology.


So wonderfully useless to me is things like Jellycats. I have a lot of these little stuffed toys that serve no purpose except that they bring me joy. So that's the function: they bring joy, right? I'm not saying everything has to be utilitarian and max efficiency. But yeah, I do think we also see a lot of really bad AI applications which are clearly just being made because someone thought, oh, I can make money from this, not because it's a good idea.


Actually, a friend of mine, Jiré, who is a podcast listener (hi!), was telling me about this awful one on German Shark Tank, that Dragons' Den-type show where you debut business ideas. And it was this glove, designed by men, for women to wear to put in a tampon, so that you didn't have to touch your own vagina when you're putting in the tampon.


I know, every woman was just like, we're not disgusted by our own bodies like that, why are you making this? So that sits in the horribly useless category for me. But anyway, where I was going with that is, what I would be interested to ask is this: I do think we have a lot of examples of bad technologies, or technologies that I would deem to not be good, which have quite meaningfully been made in response to a very real problem.


And that's the particular category of tech that I look at a lot, which is things I would broadly call carceral technologies. So technologies, for example, that increase surveillance, or things like underwear that claims to have anti-rape properties, or various kinds of tracking technologies that are meant to protect women and gender-minoritized people when they're out at night. And a lot of these technologies come from female founders. They come from people who are very meaningfully concerned about gender-based violence. But I would argue that actually, even though they're responding to a real need, this very pervasive and, unfortunately, quite justified fear of gender-based violence, the technologies they're making are actually not good.


ELEANOR DRAGE:

Yes, I totally agree. I think that problem of utility, of solving a problem, who defines the problem, it's really difficult. I think, at least, the question should always be raised: what is this for?


I think that's what she's getting at, because there was actually someone who said, yeah, I think to me a good technology would be a pen that turns things orange and pink. And a lot of the stuff that we love is just the things that bring us a bit of joy but don't have some really meaningful side to them.


Yeah, for sure, we have to pay more attention. Okay. So now we're going to listen to Rok, who's a data scientist at the Open Data and Tech Institute in Ljubljana, Slovenia. All right.


ROK:

So hi, my name is Rok.


I'm a research assistant at the Open Data and Technology Institute here in Ljubljana. When thinking about good technology and digital sovereignty, I've got this Heideggerian perspective on it. I guess this is the author that comes closest to thinking about these questions about technology.


So I would just say that, for me, good technology is technology that helps us to connect with and understand the world, rather than just helping us use it in a way that's more instrumental.


ELEANOR DRAGE:

Oh, I love that answer. These are really lovely.


KERRY MCINERNEY:

Yeah, I think that kind of impulse to move beyond instrumentality is important, because I do think there's also such an interesting moment in there to think about connecting with and understanding the world as entwined things. Because I think sometimes understanding can also have a very instrumental approach.


It can be very much about, how do I, from the outside, objectively observe this phenomenon? We have a wonderful episode with Lorraine Daston, one of the co-authors of the book Objectivity, which unpicks the way that objectivity became seen as this principal scientific epistemic virtue.


And if you haven't listened to that, I would highly recommend that.


ELEANOR DRAGE:

Okay. Next we have Zorica from North Macedonia.


ZORICA:

Hi, my name is Zorica. I'm co-founder of a center for social innovations, Blink 42-21, in Skopje, North Macedonia. We are a very small social tech organization, and we believe in, or we try to, develop technology for good, because we are not sure if we have the right to say what's good and what's bad technology. But at least, let's focus on using technology for good.


I think that technology really should improve the well-being of the people, of the society, should support the progress of the people, should make a positive impact in a faster and better way, and for sure should not create additional problems, conflicts, or harm for anyone in the society that is using it.


I think we are living in a very hard time, and we are in a position where technology is already here, and now we need to take over control of the technology. Usually, in the past, we would first establish control and rules, and then apply them.


Now, we are in a position where technology is here in our lives, and we mostly use it without complete access, control, and understanding. It will be very hard for us to get control back, or to establish control globally, because we don't control who develops technology, who collects data, who owns data, or where the data is, on which servers, in which countries. And it will be very hard to decide who should control it and how it will be controlled, and how we are going to get on track to use technology really for good.


But I'm completely sure that technology might and can be used for good and for real impact in the societies.


ELEANOR DRAGE:

I like how at the beginning she has that real humility of saying, who am I to say what is good and what is bad technology? But instead, what we can do is say, how can we apply technology for good?


KERRY MCINERNEY:

Absolutely. Such a thoughtful answer. I think what was really interesting is that she looks at all these different ways that technology could be used well, or things that we should be building technology towards, and the profit motive never came in once. I think it speaks to what happens when you bring civil society organizations together, really meaningfully generating and changing our ideas of what good tech should be and how we should be building technology.


And that's why I think these voices are so critical to holding big companies to account.


ELEANOR DRAGE:

These are really incredible, selfless, really guided people. That's why it was such a kind of honor for me to be around them. All right, let's go to Eva.


EVA:

I'm Eva. I work for a local institute called Danes je nov dan, which translates to today is a new day. We are the organizers of DISCO. In my free time, I'm also an assistant professor of social and cultural anthropology at University of Ljubljana.


Thinking about digital sovereignty on a personal level, I think it basically means owning the means of digital production. So having access to code, access to hardware, having a right and an ability to disconnect, and also, of course, being able to exercise your right to privacy. As for good technology, I think it's not really a question of good or bad technology, but rather good or bad policy, regulation, and also political economy and ideology: chauvinism, racism, exploitation, marginalization of particular groups, etc. All these cultural constructs are already built into technology. So what I think is that we need a new social contract, and technology will follow.


Before that happens, we can only try to find novel ways and novel and subversive ways to use, to misuse, to even abuse existing technologies.


ELEANOR DRAGE:

You're gonna love that.


KERRY MCINERNEY:

I do. I do. And I do find that also really interesting the framing of digital sovereignty as this kind of individual ownership, not just of data, which I think is often where I see the conversation stop.


There's been a lot of really interesting political work around things like public data trusts, for example: what would it mean for us to collectively own our data? I think that's really important work, and I'll link some papers on that in the show notes if that's something you're interested in. On our website, we always have a full transcript of each episode, transcribed by the lovely Eleanor hand in hand with AI.


Because we are nothing if not hypocrites. No, we use the technologies for good when we can. We also always have a reading list with each episode, so if you go to www.thegoodrobot.co.uk, I'll include a number of links and papers on this idea of data trusts. But I really like that she went a step further and said it's not just about owning the data.


It's about owning the whole means of production: the compute, the hardware, the software, the environmental resources that are required to make these technologies. And linking back to the last answer as well, I think what comes across in both of those is that too often technologies are thrust upon us and we have to use them.


I was talking about this on Cambridge radio the other day with Eleanor, how we all received Zoom in 2020 and we just had to start using it. None of us really wanted to do it. It was just brought to us by the circumstances, the companies, and our employers.


What I love about that kind of framing is that it's very much about taking it back and saying, how can we make and use technology on our own terms? That doesn't mean technology itself is inherently bad or that we should never use it. But how do we more meaningfully center it around us?


ELEANOR DRAGE:

Yeah, absolutely. And I can only add to that Eva is just the nicest person and did the most incredible job of organizing the conference.


And I think there will be another one in two years' time, in 2025. I can't recommend it enough. I will be there if I can. If you have any interest in the themes that we talk about, these are really inspirational people.


So next we have a lawyer, which mixes things up a little bit, also doing awesome work in Croatia.


ANELLA:

Hi, I'm Anella. I come from Croatia. I work in a law firm in the IP, IT, and privacy department, and the regulation of technology and the relationship between law and technology are areas of my interest. I think the ideal definition of a good technology is that it is developed ethically, it's beneficial to people, accessible to everyone, and it does not harm anyone.


It does not create inequality, and its intended purpose is not secured by its use or misuse. And I think this is a really utopian definition; there is no inherently good technology, or even bad technology, and it all depends on other factors, as previous guests have said. And with regards to digital sovereignty, I think it's having power over your digital identity, over your own personal data and what we create in the digital space, as well as control of the hardware and software that makes it.


ELEANOR DRAGE:

I think it's really interesting how she says that the intended purpose of good technology isn't secured by its use or its misuse.


KERRY MCINERNEY:

Yeah. I think there was a lot to unpack in that as well.


I think, again, coming back to the idea of data sovereignty as being about this individual ownership of data, I thought control over your own digital identity was a really interesting way of framing that. On the one hand, I think it's hugely important, because the sense of having a right to your own person, a sense of legal personhood, is so central to a legal understanding of rights and human rights. On the other hand, I do think this is one of the complexities of digital spaces, because of the inference economy: even if you're not sharing data, someone else can share it about you, or your interactions online can generate forms of connective data, like connective tissue, that people can exploit and use, even if you have never shared it yourself.


So my classic example of this: my mom, I love her, always posts photos of me on Facebook. It doesn't matter that I haven't consented to that; they're out there online. I think there's something much harder about the digital space when it comes to that kind of regulation of identity. So again, I'm not saying that I don't think that's the right approach.


I just think it's a distinct problem that comes with online space.


ELEANOR DRAGE:

Yeah, we're always complicit in all the other things that determine how much agency we have over the spaces we inhabit. And it's the same with the offline world as well. And in some way, asking these questions of the digital realm gets us to reconsider how much of ourselves we give away, or are forced to give away, in the real world.


KERRY MCINERNEY:

Yeah. And I think an interesting example of that is going to work in person versus working online, because people chose to mediate that relationship very differently during and after COVID, in terms of what they had to give to a space and what felt more damaging. When everything is online, that can be very risky. And look, Eleanor and I both know this: we've both experienced forms of harassment, or forms of solicitation. Again, we love listeners reaching out to us respectfully, just to be very clear, we are not against being contacted, but there is a set of vulnerabilities that comes with being online. It's something I think about very actively.


But at the same time, I know a lot of people chose not to go back into the office after COVID because they felt less vulnerable at home. They felt actually, I give less of myself when I only meet people online.


ELEANOR DRAGE:

We both felt that. Last one.


MILA JOSIFOVSKA DANILOVSKA:

Hi, my name is Mila Josifovska Danilovska from Metamorphosis Foundation for Internet and Society, a foundation based in Skopje, North Macedonia, where we work on leveraging technology to improve educational outcomes, but also on supporting different stakeholders


in reaping the benefits of digital transformation, and at the same time helping them build resilience towards the risks that of course arise from it. When it comes to good technology or tech for good, I would always say it's civic-centric technology that's based on the needs and priorities of the citizens, with a primary goal of improving their lives.


Now, of course, it needs to be accessible, as it currently isn't; transparent in the way it was manufactured and in how it functions; user-friendly and understandable to the users; but also inclusive in all of its processes, from the mapping out of the need, to the design process, to how it will function. And at the end of the day, it's not supposed to be discriminatory to any groups, but should be very much open to a diverse plethora of stakeholders who could benefit from it. When it comes to digital sovereignty, it's a really fluid concept, according to me, and my understanding of it fluctuates day by day. But it basically goes in line with controlling my own digital destiny, or my digital body. The thing is that each day I try to practice good cyber hygiene and all of that, but I feel each day I'm losing the battle, because technology develops at such a fast speed that I just can't keep up.


I'm also trying not to invade or break other people's digital rights at the same time. So it's a whole learning process that I'm trying to keep up with, but I'm afraid that sometimes it's not that easy.


KERRY MCINERNEY:

What do you think?


ELEANOR DRAGE:

Super interesting, a lot to unpack, and it actually reminds me of the in-depth EU AI Act toolkit, which looks specifically at things like how we bring a lot of diverse stakeholders into the tech production process.


KERRY MCINERNEY:

Yeah, absolutely. Absolutely. Again, like a super rich answer from people who are really engaged with the stuff on the ground.


ELEANOR DRAGE:

There was something I wanted to flag from Rok, who also gave an answer on digital sovereignty and said something similar about having control over your digital destiny, something we've just heard.


And I like this idea of destiny, which I haven't really heard in this context before, this kind of language. But he said, and I will just play it:


ROK:

And regarding digital sovereignty, for me that doesn't just correlate to controlling data and using my personal... okay, yeah, sorry, let's just collect my thoughts again.


Okay, okay. So for me, digital sovereignty would not just mean controlling data and infrastructure. It would also mean to deeply question and understand how technology shapes my relations with the world and whether it's aligned with my authentic self.


ELEANOR DRAGE:

This is a really interesting point, right?


Because what is the authentic self then when it comes to interactions with technology?


KERRY MCINERNEY:

Absolutely. Because I do think that on the one hand, there are such important critiques of this idea of the fabricated self, the constructed self on social media. And yet on the other hand, there are also very important critiques of the selling point of authenticity: how it can essentialize different kinds of gendered or racial characteristics, and how it often relies on an image of authenticity that is used to sell products. I think these conversations around authenticity and falsity are really complex. So it almost reminds me a little bit of the political philosopher Hannah Arendt, who is an incredible philosopher. Many of her opinions are now super controversial, but I think she's undeniably one of the great thinkers of the 20th century.


And Hannah Arendt writes in one of her books on politics, I think it's On Revolution, it might be a different one, so don't come after me, Arendt scholars, but she writes a lot about hypocrisy as this necessary function for human society. She says politics requires you to put on a second face; you can't bring your full self into a political space, because otherwise society stops functioning correctly. And so she says hypocrisy is not a bad thing, it's an essential social function. And so I often think about social media as being yet another way that we practice these kinds of socially essential hypocrisies in our lives.


Whether that's the face you put on before you go to work or the face you put on for your Twitter, I think there's actually something quite important in that. But at the same time, I also know that being forcibly pushed away from your authentic self can be a really harmful and damaging thing.


And I love this framing of destiny, or digital destiny; I think it's such a captivating and much more expansive way of thinking about these ideas of digital ownership.


ELEANOR DRAGE:

Thank you so much for listening to this with me, Kerry. The people I met were so amazing and I hope you can come with me next time.


KERRY MCINERNEY:

Thanks so much for including me on this like super interesting conversation about digital sovereignty.


And we will see you all again soon for our next hot take. And just to remind you once again: if you want to see a full transcript of the episode, find any of our previous episodes, or look at the reading list we've curated for this episode, go to our website, www.thegoodrobot.co.uk. And if you're listening to us on Apple, Spotify, or anywhere else you get your audio podcasts, remember you can also watch us now on YouTube.


So if you're interested, go check out the Good Robot over on YouTube.


OUTRO:

Hot takes with the good robot.



