
Michael Kwet on Big Tech's Colonial Takeover

Updated: Feb 23, 2023

We all know about Microsoft Excel and Outlook, but did you know about the kinds of tech Microsoft develops for and sells to the Global South? These include escape management systems for jails, police cars fitted with sensors, and software that supports facial recognition systems. To tell us more about this, we talk to Dr Michael Kwet, a visiting fellow of the Information Society Project at Yale Law School and a postdoctoral researcher at the Centre for Social Change at the University of Johannesburg. His extensive investigation of how Global South economies are increasingly dependent on big tech companies like Microsoft shows that they get bad deals when they hand over valuable raw materials and labour. They end up seeing little of the vast wealth that big tech amasses through this unfair exchange. His work focuses on South Africa, where the economy, schools and prisons rely on Microsoft software and services. We hope you enjoy the show.


Michael Kwet is a visiting fellow at Yale Law School’s Information Society Project and hosts the “Tech Empire” podcast. His work has been published in the New York Times, Vice News, Al Jazeera, Wired, BBC World News Radio, the Mail & Guardian, CounterPunch, and other outlets. He received his Ph.D. in sociology from Rhodes University in South Africa.

Reading List


Kwet, Michael, ed. The Cambridge Handbook of Race and Surveillance. Cambridge University Press (2023). https://www.cambridge.org/core/books/cambridge-handbook-of-race-and-surveillance/D806F08C054EEEC15C08FF88BEF5FD3E.


Kwet, Michael. "Digital Ecosocialism: Breaking the power of Big Tech." Transnational Institute (2023). https://longreads.tni.org/digital-ecosocialism.


Kwet, Michael. "Digital Colonialism: US Empire and the New Imperialism in the Global South." Race & Class 60, no. 4 (2019): 3-26.


Kwet, Michael. "The Digital Tech Deal: A Socialist Framework for the Twenty-first Century." Race & Class 63, no. 3 (2022): 63-84.


Kwet, Michael. "Operation Phakisa Education: Why a Secret? Mass Surveillance, Inequality, and Race in South Africa's Emerging National E-education System." First Monday 22, no. 12 (2017).


Kwet, Michael. "Fixing Social Media: Toward a Democratic Digital Commons." Markets, Globalization & Development Review 5, no. 1 (2020).


Transcript


KERRY MACKERETH:

Hi! We're Eleanor and Kerry. We're the hosts of The Good Robot podcast, and join us as we ask the experts: what is good technology? Is it even possible? And what does feminism have to bring to this conversation? If you wanna learn more about today's topic, head over to our website, where we've got a full transcript of the episode and a specially curated reading list with work by, or picked by, our experts. But until then, sit back, relax, and enjoy the episode.


ELEANOR DRAGE:

We all know about Microsoft Excel and Outlook, but did you know about the kinds of tech Microsoft develops for and sells to the Global South? These include escape management systems for jails, police cars fitted with sensors, and software that supports facial recognition systems. To tell us more about this, we talk to Dr Michael Kwet, a visiting fellow of the Information Society Project at Yale Law School and a postdoctoral researcher at the Centre for Social Change at the University of Johannesburg. His extensive investigation of how Global South economies are increasingly dependent on big tech companies like Microsoft shows that they get bad deals when they hand over valuable raw materials and labour. They end up seeing little of the vast wealth that big tech amasses through this unfair exchange. His work focuses on South Africa, where the economy, schools and prisons* rely on Microsoft software and services. We hope you enjoy the show.


*Apologies, the inclusion of prisons here was an error - Microsoft collaborates with militaries and creates carceral technologies, but not in South Africa.


KERRY MACKERETH:

Fantastic. So thank you so much for joining us today. Just to kick us off, can you tell us a little bit about who you are, what you do? And what's brought you to thinking about race, colonialism and technology?

MICHAEL KWET:

My name is Michael Kwet, and I'm a visiting fellow of the Information Society Project at Yale Law School and a postdoctoral researcher at the Centre for Social Change at the University of Johannesburg. I started getting into the topic of education technology, and when I started looking into that subject, it became apparent that the tech giants down there all had skin in the game and a presence in the country. First and foremost, definitely Microsoft: Microsoft has a huge presence in the country. They also do things like maintain a small army of teacher trainers who are basically learning Microsoft software so they can transmit it to the students in the classroom. Pearson, the textbook giant, which now needs to go digital, also has a big presence down there. Google not as much, because internet connectivity is not as widespread as it would need to be for it to be as lucrative as they would want. But they still have a presence down there. And so in the process, you start looking at the question not just of the education technology sector, but of why these tech giants would want to be down there [in South Africa]. That's when I started looking at the broader digital economy, contextualising it within the broader economy. And that's when you start looking at the question of these tech giants as a kind of force of Western colonisation. So that's where it all started.

ELEANOR DRAGE:

Amazing. What a great journey. Well, you'll tell us more about that in a minute. But first, our billion-dollar questions - or, as we've been told recently, questions we probably shouldn't ascribe commercial value to. So, what is good technology? Is it even possible? And how can feminism help us get there?

MICHAEL KWET:

So I mean, obviously, there are some things that might be inherent in a technology. So we have this interesting question of: can tech be neutral? And I think that it can, but only in the most abstract sense, right? If somebody asks, is something like artificial intelligence neutral, you might have different opinions on it; or is software neutral, or is a computer itself neutral - now you're getting more abstract. If you get down to particular technologies, like facial recognition, you're getting more specific. And then you can say, well, maybe this technology is less neutral, and you can go all the way down to a nuclear bomb, which you can say is pretty much not neutral at all and is actually built for destruction. So where can feminism come in here? Obviously, if we're looking at good technology, we're looking at a mix of human behaviour and its interaction with the design and structure of what a technology is and can do. As for what feminism can do: I did my master's in women's studies, so I'm well schooled in the third wave especially, from a global women's perspective. And I think it can bring intersectional analysis to the picture, from an academic standpoint, or from just a practical standpoint, bringing in a lens that will look at the various harms.

KERRY MACKERETH:

Amazing, thank you. And I really love that answer relating whether or not a technology can be neutral to the degree of abstraction that's possible, because I think we spend so much time talking about whether or not these technologies can or cannot be neutral, and to some extent it is kind of just a distraction from being able to call out and critique the ways technologies are made and used that are really harmful. So on that note, you've been a really influential voice in thinking about digital colonialism. I know so many scholars who really look up to you. So we'd love to hear you talk a little bit about this work. To kick us off for our listeners, could you tell us a bit about what digital colonialism is, and what it looks like in the world today?

MICHAEL KWET:

Yeah, so it obviously covers a lot - it's a very big topic, and no one person can really cover it all. But if I were to define it, I would say, in a nutshell, it's the use of digital technologies for political, economic, and social control of another group or territory or nation. And it's achieved principally through the ownership and control of the stuff of the digital economy. So that's your software, your hardware, your network connectivity; it's the knowledge, like intellectual property; data, and digital intelligence derived from data; platforms; and so on. And if you look at the global digital ecosystem, there's this conversation about the United States and China, which I think has been, from an empirical standpoint, relatively misleading. It's typically portrayed as a battle between the United States and China, and there is obviously competition on the technological frontier there. But look empirically at the situation outside of US and Chinese borders - so excluding both the US and China. You can break it down by country: obviously, things are changing in Russia, and Russia already had Yandex and has its own kind of Facebook equivalent and so on. But in most countries outside of mainland China and the United States, the US is extremely dominant. And so the issue there is that you have these US tech giants that own and operate the core stuff that needs to be used if you're using a computer every day. That's your operating systems; that's your cloud computing; your social media platforms - yes, there is TikTok, but the other ones are still Western; it's business networking, your LinkedIn; your office productivity software. You can keep going down the line: the United States is king in this regard. And so what you wind up with is an unequal exchange and division of labour.
The unequal exchange is that the countries that do the “higher level thinking” and own the means of computation and knowledge own the most lucrative stuff in the global economy. And so at that point, they're making a lot of money for what it is they do. And then in the exchange that you get - when you start looking at, let's say, the Global South - there are two forms of labour that often come up: one is the extraction of raw materials like minerals, and the other is menial labour, like labelling images and things like that to train AI. Well, they're doing the menial labour, and so they're getting much less in return. So you have this unequal exchange in terms of value, and an unequal division of labour. And that tracks very closely with the global divide that was opened up over the last several centuries of colonialism. It's an evolution of the colonial order.

ELEANOR DRAGE:

Yeah, absolutely. And you've talked about this infrastructure as a kind of debt. Given the example that you gave just now, and also at the beginning of the episode about South Africa, can you tell us why it's helpful to think of this kind of relationship as one of indebtedness to these massive multinationals?

MICHAEL KWET:

Yeah, so this idea kind of became apparent to me when I was looking, again, at South Africa and some of the projects that were going on down there. One of the things that you notice is - I mean, obviously, there's fierce competition for resources everywhere you look in the world. If we're looking in academic spaces, the elite institutions, like your Yales and your Harvards, have enormous amounts of resources, even by comparison to other US institutions, right? Same thing, obviously, in the UK: you've got Oxford and Cambridge and so on. But once you get down to the Global South, the inequality and the lack of access to resources become even sharper. So I used to play basketball when I was at Rhodes University, where I studied in South Africa, and we travelled to play a game at Walter Sisulu University, which is one of the formerly all-black schools. And they're poor there; they don't have that much. And all the guys on the team are like, you know, ‘bring some toilet paper’. I'm like, why? And they're like, once you get there, their facilities don't even have toilet paper: bring your own toilet paper. So you have Yale Law School, which has a billion-dollar endowment fund, and you have Sisulu, which doesn't even have toilet paper. Right. Then you have a place like Wits, which is an elite school in South Africa. And even they don't have the kind of money that the Western institutions have - especially the big ones. So they decided to embark on building this precinct - it's called the Tshimologong Precinct. And it was built with money from Cisco, money from Microsoft, a few tech giants; IBM is involved, they have their own building on the corner of that street. If you go to Google Earth, you can actually walk down the street and take a look. And at that point, you start looking into it. It's hosted by Wits University, and it's a centre for research.
It's a centre for cultivating startups and doing training for those kinds of things. They have courses that people can take, and they train them on how to use various technologies. But they also have a Microsoft App Factory. They don't exclusively train people on Microsoft software and things like that. But the problem is twofold here. One, South Africa had a free and open source software policy preference - and I review this in my dissertation in great detail. It was designed to prevent Microsoft in particular - because we're talking the early 2000s now, when the tech scene was different - from colonising their tech ecosystem. And if you have a public institution, you're supposed to be using free and open source software, but they're not doing it. Right. So my point here is that there are alternatives; you don't actually have to be using this stuff. And they're not doing it because, obviously, you're not going to antagonise a company that's pumping millions of dollars into you and giving you a state-of-the-art facility. So there's a quid pro quo here, and that's why I say it's like infrastructure debt. You have a university that wants a facility - and these facilities cost money - and they want to attract students, and they want them to have a nice place where they can go and learn technology and bring together minds who are working on tech. But you have these companies throwing down what for them are pennies. And now you're going to pump their software into your ecosystem and into your society. And they're also going to prevent proper discussions about ethics - about a company like Microsoft, which funds police, funds militaries*, has business relationships with prisons, and does all sorts of other things, too. So you're also going to defang any sort of possible criticism. So in that sense, they're basically providing infrastructure. And it's a form of debt.
Because you immediately want that infrastructure for your own university and for your students and so on, you're now beholden to these corporations.


*A correction from Dr. Kwet post-interview: Microsoft doesn't fund militaries, but rather, it collaborates with them

KERRY MACKERETH:

Yes, absolutely. And I think something that your work does, and that that answer just did there, is show how the binaries that we often draw between policing and militarization really don't work out - often the same actors are at play in this complex flux between external militarization and internal policing. Um, so could you talk a bit more about that? How do big tech firms like Microsoft engage in both forms of carceral surveillance and forms of colonial surveillance?

MICHAEL KWET:

Yeah, so actually I did a lot of work on carceral tech, both in South Africa and in the United States. And Microsoft, in particular, has been one of the companies I looked into a lot. So I published an article at The Intercept in, I think, June of 2020, just after the murders of George Floyd and Breonna Taylor. I was doing a lot of research on carceral tech; I had known about Microsoft's involvement in this for quite some time, and it wasn't talked about in the US - maybe I'll comment on that in a minute. They have a Public Safety and Justice division, which was not really ever talked about in critical circles or in the press. Through that they have basically their own products, and then they use the Microsoft Azure cloud. And they partner with tons of different surveillance vendors - carceral tech third-party vendors - that will run their software on Microsoft's cloud, and Microsoft takes a cut that way. As far as their own products go, they have a product called Microsoft Aware, which is also called the Domain Awareness System, which was formed in partnership with the NYPD and unveiled in 2012. It was then exported to other cities in the United States, and also to Brazil, Singapore - Bulgaria was another place they went to recently. And basically, it's just a platform to centralise surveillance: CCTV cameras, any other kind of on-the-ground surveillance sensors that you can deploy, and database records - it could be census data, criminal records, histories, whatever they can get their hands on. So you have these kinds of centralised, almost mini-NSA centres that are built for police. And then they have a cop car solution, which is almost like a microcosm inside of a car: it brings in sensor data and all this stuff into the cloud.
That was being piloted in South Africa, so I just kept looking as far and wide as I could on the internet to find out about this. And they adapted the Domain Awareness System for prisons, and they have it listed, still to this day, on the UK government website for licensing out. They have a Moroccan partner firm called Netopia Solutions, which offers a prison management system - they were Africa Partner of the Year for Microsoft - and it has features like escape management. I mean, the company is in Morocco; we don't know if it runs in Morocco - we can't say for sure, they didn't respond to requests. But Morocco has a horrible human rights record of torturing prisoners, locking up dissidents and things like that. So you can see Microsoft is doing all this stuff, and when you're looking at a lot of what's going on, you can only gain so much information from what's publicly available, including in terms of deployments. Like in South Africa: if nobody had posted a YouTube video of this cop car on display in Durban, we wouldn't even know what's going on down there. So they're doing all these things, and there's no transparency about it. And then they're getting up there and saying, oh, you know, we care about anti-racism - they have all this PR, their president Brad Smith, their CEO, and so on. But they also have a network of researchers who have basically kept their mouths shut about all this kind of stuff. And it is a huge threat. Because not only are they doing things that harm the very communities whose interests they say they're looking out for and trying to help, but the people who are most vulnerable are also recipients of this technology. And nobody says anything about it.
I mean, Brazilian police in São Paulo are more violent than US police, but we don't really know what's going on with the use of those cameras. Singapore is an authoritarian state, right? So we don't know: how is the Microsoft system still being used? What happened with that? How much money did Microsoft make, etc.? So yeah, if we're looking at this issue, it's really a disturbing thing. And what I did find in my research is that a lot of the prominent Microsoft researchers themselves, and also universities that are taking money from Microsoft, are not talking about carceral tech. All of this was publicly available - look at the articles I published - but they're silent about it. So I think these corporations also do things like give money to academics and researchers, and then they wind up capturing them as well.

ELEANOR DRAGE:

Yeah. It's fascinating. And you know, it's so understandable why people go to those companies and get paid to do the kinds of research that they really want to do about their communities, as we were told last week. But again, we're so indebted to people like you - and you're so underselling yourself in calling this just research; this is a huge, extensive amount of investigatory work, and the articles are so fabulous, so please read them. They're just amazing. They'll blow your mind. It's just incredible what we don't know. To end on something - because everybody listens to these episodes and thinks, oh, what can I do? Or what's the way forward? - you've outlined some sustainable ways to build tech. You've talked about degrowth, and about a number of other strategies. So tell us, what should the world look like?

MICHAEL KWET:

Yes, so in the interest of time I won't go into too much detail on this question. But degrowth is something that's really important, so I think it's worth mentioning. If you want to see some positive proposals for things that we can do differently, that attack the roots of the issues - like phasing out intellectual property, things like that, and a little criticism of antitrust - I have a few articles out about a digital tech deal, and a nice short version called 'Digital Ecosocialism'. So listeners can look that up. But I want to mention degrowth really quickly, because I think it's really important to establish a framework, and that's what I try to do in my research: to look at tech from the broadest possible view. Big picture, right? So if we're looking at the question of any sort of policy we need to make, any sort of designing of society at this point, we obviously have to account for the environment and the danger of exceeding ecological boundaries, because if we do that, we're all screwed - not equally, but at the same time, nobody wants to see the kind of nightmare scenarios that are widely discussed in the scientific community. And so there are different versions of the environmental crisis. One of them has what you might call carbon tunnel vision - I've seen that term recently, right? This is the idea that the only thing we have to think about is carbon emissions, and therefore all we need is solar panels and wind farms, and maybe some other things, like curbing the excesses of industrial agriculture, being nice to the trees and things like that, and then we can just save society. But there's another branch of scientific inquiry, which mixes in with policy research, called degrowth.
And some people don't like the term, because it implies that everybody is going to be poor, or we're not going to have that much, we're not going to improve our lives. But really, it's just saying post-growth: it means that we can't just keep extracting resources from Mother Earth. And if you look into that literature - Jason Hickel is my favourite entry point, because he's very clear and he has a good command of the scholarship - basically, it's saying that we can't prevent overheating the earth, and the ecological destruction of biodiversity and things like that, if we keep growing economies in aggregate. Which means that if we were to cap global resource use, actually reduce it a little bit, and make it fixed, then we still have a huge maldistribution of resources: people in the West are rich, especially the Western elites, and people in the South are poor, with the exception of Southern elites. So we need a mass redistribution of wealth and a reorientation of our economy. So when I was writing my most recent pieces - what do we do? - my argument is that we have to fit any sort of tech policy that we have within that box. And since inequality is not sustainable anymore - you can't just keep growing the global economy to try to fix poverty, the 'rising tide lifts all boats' kind of idea, because if we keep raising that tide, we're going to drown the world - then you have to look at policy that doesn't produce inequality. And these tech giants, and digital capitalism in general, are inequality machines. It's a system that reinforces that unequal exchange and division of labour, and so we need policies, and we need movements, to try to break this down as quickly as possible.
So yeah, there are a number of things in the proposal: phasing out intellectual property; kind of socialising or collectivising physical infrastructure, like server farms; basically taking the digital economy out of the hands of profit-seeking entities and making it serve the public good.

KERRY MACKERETH:

Yeah, I think that's such a powerful note to end on, right? These ideas of big tech companies as machines of inequality. And thank you so much for coming on to talk to us, because I feel like the things that you've said today are really going to resonate with our listeners, and hopefully also be the starting point of a much longer conversation and research period. So thank you so much.

MICHAEL KWET:

Thank you, Kerry, and Eleanor. And I also want to note that, um, myself and some colleagues are going to be launching a website in the next few months dedicated to these issues. People's Tech.

KERRY MACKERETH:

Definitely check it out. We're going to link it in our website and then you'll be able to find it really easily and we'll also have a selected reading list available on our website as well as the full transcript.

ELEANOR DRAGE:

This episode was made possible thanks to our previous funder, Christina Gaw, and our current funder Mercator Stiftung, a private and independent foundation promoting science, education and international understanding. It was written and produced by Dr Eleanor Drage and Dr Kerry Mackereth, and edited by Laura Samulionyte.





