
Darren Byler on how China Surveils and Controls Uyghur Muslims

In this episode, we talked to Darren Byler, author of Terror Capitalism and In the Camps: Life in China's High-Tech Penal Colony. We discussed his in-depth research on Uyghur Muslims in China and the role played by technology in their persecution.


Darren Byler is a sociocultural anthropologist whose teaching and research examine the dispossession of stateless populations through forms of contemporary capitalism and colonialism in China, Central Asia, and Southeast Asia. His monograph, Terror Capitalism: Uyghur Dispossession and Masculinity in a Chinese City (Duke University Press, 2021), examines emerging forms of media, infrastructure, economics and politics in the Uyghur homeland in Chinese Central Asia (Ch: Xinjiang). The book, which is based on two years of ethnographic fieldwork among Uyghur and Han internal male migrants, argues that Chinese authorities and technologists have made Uyghurs the object of what it names "terror capitalism." It shows that this emergent form of internal colonialism and capitalist frontier-making utilizes a post-9/11 discourse of terrorism, which he shows produces a novel sequence of racialization, to justify state investment in a wide array of policing and social engineering systems. These techno-political systems have "disappeared" hundreds of thousands of Uyghurs and other Turkic Muslims into "reeducation" camps and other forms of productive detention while empowering millions of state workers and private contractors who build the system. The book considers how the ubiquity of pass-book systems, webs of technological surveillance, urban banishment and mass internment camps has reshaped human experience among native Uyghurs and Han settler-colonizers in the regional capital, Ürümchi. Ultimately, the book presents an analysis of resistance to terror capitalism, what he terms a "minor politics of refusal," that emerges from the way Uyghurs and Han allies use new media and embodied practices of caretaking to oppose this colonial formation.


In public-facing work on the crisis confronting the Uyghurs and others in Northwest China, he has worked in an advisory capacity with faculty and researchers at the University of British Columbia and Simon Fraser University to build the Xinjiang Documentation Project, featuring personal testimonies and archives, internal police reports, translations and other documents concerning the ongoing detention of Turkic Muslims in China and the erasure of their native knowledge. He also writes a regular column on these issues for the publication SupChina, as well as essays for other public outlets such as the Guardian, Noema Magazine, and ChinaFile. He has been asked to contribute expert witness testimony on Canadian and Australian foreign policy issues before the Canadian House of Commons and the Australian Parliament, and he has written policy papers on technology and policing for the Wilson International Center for Scholars and the University of Pennsylvania's Center for the Study of Contemporary China. As part of his work amplifying the voices of Uyghur cultural leaders, he has co-translated a Uyghur-language novel titled The Backstreets (Columbia University Press, 2021). The novel was written by the leading modernist author Perhat Tursun, who disappeared into the internment camp system in Northwest China in 2018.


READING LIST:


Terror Capitalism: Uyghur Dispossession and Masculinity in a Chinese City (Duke University Press, November 2021).


In the Camps: China's High-Tech Penal Colony (Columbia University Global Reports, October 2021). 


TRANSCRIPT:


KERRY MCINERNEY:

Hi! I’m Dr Kerry McInerney. Dr Eleanor Drage and I are the hosts of The Good Robot podcast. Join us as we ask the experts: what is good technology? Is it even possible? And how can feminism help us work towards it? If you want to learn more about today's topic, head over to our website, www.thegoodrobot.co.uk, where we've got a full transcript of the episode and a specially curated reading list by every guest. We love hearing from listeners, so feel free to tweet or email us, and we’d also so appreciate you leaving us a review on the podcast app. But until then, sit back, relax, and enjoy the episode! 


ELEANOR DRAGE:

In this episode, we talked to Darren Byler, author of Terror Capitalism and In the Camps: Life in China's High-Tech Penal Colony. We discussed his in-depth research on Uyghur Muslims in China and the role played by technology in their persecution. If you're just listening to this on Spotify or wherever you get your podcasts, you can now watch us on YouTube at The Good Robot Podcast.


We hope you enjoy the show.


KERRY MCINERNEY:

Brilliant. Well, thank you so much for being with us today. It's really a pleasure to get to talk to you.


So just to start us off, could you tell us a little bit about who you are, what you do, and what brings you to thinking about gender, feminism, race, and technology?


DARREN BYLER:

Sure. Thanks so much. It's a real honor to be here. I'm an anthropologist at Simon Fraser University, which is in Vancouver. I do research in China on technology, surveillance, and migration, looking at the effects of those things on each other, but mostly on the effects of surveillance on migrant workers, especially Muslim men.


And how did I start thinking about feminism and gender in relation to technology? I suppose I was pushed in this direction by my field site and the people I was hanging out with, but also by my advisor, who was Sasha Su-Ling Welland, and so she connected me to a scholar named Lila Abu-Lughod, who's a feminist anthropologist who's done a lot of work on Muslim societies. And one of the ways that Lila has talked about scholarship is that what we're trying to do is not produce knowledge that dominates the people that we're studying, the situations we're studying.


We don't want to conquer them and reproduce the colonial framework or power structure, but instead to speak beside them and amplify voices and really treat the people that we're engaging as carriers of knowledge. That for me is what feminism is. It's a methodology. It's a way of thinking about the world. It's also a way of attending to gender dynamics that are everywhere in the world. And that is what has pushed me to think about communities and being accountable to communities as I'm producing knowledge about them. Technology is a really important part of that story, because it's everywhere in the world, as gender is, and it's how power gets reproduced in communities now. And so I was interested to see how these young Muslim men that I was hanging out with in Northwest China were using social media, or using technology, to produce knowledge about themselves and about their future, and what that did in relation to the state, in relation to the police, in relation to their own community.


ELEANOR DRAGE:

I love your framing of technology as part of these other stories, rather than what we're seeing in the media today, which is technology as the story, as the whole story. So maybe we'll come back to that later. But to begin with, can you answer our three good robot questions? So what is good technology? Is it even possible? And how can feminism and thinking about race help us work towards it?


DARREN BYLER:

Sure. I can try. So a lot of the ways I've thought about feminism and about decolonial practice is in relation to an older body of scholarship called subaltern studies, which is something that, you know, Gayatri Spivak and others have talked about: can the subaltern speak, which is, can the voiceless speak? And really what she's asking in that essay, which is a really important essay in postcolonial, decolonial literature and feminist literature, is: is the microphone turned on? Of course subaltern people can speak, but are their voices being carried?


Are they being recognized by others? What does it mean to change the frame, expand the frame, to allow those voices to actually be heard? And I think that's what good technology can do. It can allow people's voices to be heard. And a lot of social media does that kind of work.


Initially, when I was doing research in Northwest China, I was seeing how social media was allowing people to narrate their own stories, to broadcast and connect with each other. But there are also problems in a lot of technology, in that there aren't necessarily protections for vulnerable people.


There's less attention paid to safety. I'm using the word safety instead of security because I'm thinking with feminist abolitionist scholars, Black feminists, who are trying to make that distinction that community safety is different from security. It's not about maintaining the police order, which is the status quo, which is just reproducing hierarchies of power.


Instead, it's about helping communities to do what they want, to push them towards enablement, and to promote forms of abolition and autonomy within those communities. And so I think we need to both turn on the microphone, that's what good technology does, but then we also need to create forms of safety and security for people.


And then there's perhaps a third position or third kind of technology, which is reversing the gaze, which is holding power accountable by broadcasting what they're doing, by exposing it as a way of naming that thing as what it is. That's also really important in seeding forms of change and actually building the world that we want.


So those are the kinds of technology that I look for, that I think are possible. A lot of technology development is really tied to power and to money and getting power and money; finding ways to eat into power and money, or getting money to work against power, is the trick really. But I think there are forms of technology that are really important that can be built.


KERRY MCINERNEY:

That's really fascinating, and actually reminds me a little bit of how one of our previous guests, Maya Indira Ganesh, who teaches on the MSt in AI Ethics here at our center in Cambridge, answered this question. She talks about good technology being hackable, being open to these forms of change and these kinds of joyful forms of play, but also being appropriated and used often against power rather than for it, and maybe often against its sort of original aims and ends.


I actually want to move to some of your work. So you're the author of two books: Terror Capitalism, and also the book In the Camps: Life in China's High-Tech Penal Colony. And both of your books are based on in-depth ethnographic research with Uyghur Muslims in China. And so could you share some of the firsthand accounts and some of the stories that you heard and recorded with us about the Chinese state's genocidal violence against Turkic Muslims and the central role played by technology?


DARREN BYLER:

I started this project back in 2010, which is right when 3G networks arrived in this part of China, and so people were really using the internet for the first time, and they were using the Chinese internet because that was all that was available to them: an app called WeChat, which has since become a super app that allows you to do everything you want when it comes to consumer activity.


But it also allows the state to really track people's movement. The company is doing this too, but the company has opened itself to the state and works really directly with the state. And so initially I was doing a techno-optimism project about how technology was going to allow Uyghurs to find forms of autonomy despite really extreme forms of control.


And it was happening. People were publishing poetry. There were these film forums that I was really attracted to, where people were making their own films and putting them online, and then there'd be 500,000 people that would watch those films in one week. So these young men were coming to the city and trying to find a future for themselves and for their community; the Uyghur community is 12 million people, plus another 3 million other Muslims in the region. It's a significant group of people, and it was an interesting thing to think about, how technology can enable futures. But one of these young filmmakers, he told me that his mom now refers to this period as the phone disaster, because it's the time when people were using this public-private space to talk about the things that they actually wanted in their lives, which had to do with being part of the global Muslim community. So they were interested in the Arab Spring, which was happening at that time. It also had to do with connecting to the West. So there were these different kinds of cosmopolitanism that they were producing online. But all of those things eventually, by 2017, became signs or predictors of potential extremism, if you were connecting to the Muslim community, or of separatism, which meant that you were thinking too much about other countries in general and other communities in general.


And so what I saw was the building of a system of control that uses checkpoints at jurisdictional boundaries, that uses digital forensics tools to scan people's phones, looking through their digital history, and then has a stoplight feature, which means that you get a red, yellow, or green sign, and it results in your detention if you get a red or untrustworthy reading on your phone.


I was really looking at how technology was being utilized to predict future criminality. Really, it's like thought crimes, basically, and then how that was resulting in a mass internment system. And of course, in the internment camps there's no escape from the technology either, because there are face recognition systems in those camps, at the perimeters, in the cells, and there's distance learning.


People are watching flat-screen TVs on the wall of their cell while they're being watched by the cameras as they watch the TV. So it's a completely immersive digital space within the camps as well. It's a smart camp, an experimental space. So I was seeing Uyghurs being used as test subjects for digital technology, for new tools around predictive policing, but, beyond that, smart systems in general. So this is a whole layered system of smart city systems, and it was really starting to shape people's futures in ways that were unprecedented. It feels as though there's no outside to this system.


And one of the ways I've gained access to this space was being there as it was being built, but also working with a publication called The Intercept and examining the internal police files that were produced by this company called Landasoft, a company that bills itself as China's Palantir.


And it's hundreds of thousands of internal police documents showing how people are being stopped; what effects are caused by all of the disappearances of these people; how the bakers disappear, and people complain that they can't find places to buy bread, because all of the bakers were Uyghur and they're all migrants, so they were some of the first people taken; how the children are being bullied in school; how people can't find places to rent because their parents, their husbands, are in the camps.


So it's a system that is affecting the entire society of this Muslim group. And it's really a limit case in terms of how technology, when it's built out at intensity and scale, can really begin to shape the future of an entire community. It's not as though this technology is unique to China.


It is mirroring and sort of reverse engineering things like Palantir, U.S.-based policing and military equipment, and also Israeli technology. But it's being used in a novel way in this context. And so I talk about it as terror capitalism, because I see it as a particular frontier of a security-industrial complex and of the global economy. And there's more I could say about it, but that's the basic premise of the book and what I've done in my work.


ELEANOR DRAGE:

You've just talked about that immediate kind of surveillance, of being watched. But what is the afterlife of surveillance, the digital afterlives that we have once we've already been watched? Can you tell us a bit about that?


DARREN BYLER:

There are events that happen within this system, but it's primarily a structure, which means that it's an institutionalized, legalized system of automated racialization, basically.

So in 2017, hundreds of thousands of people were detained. They had their phones scanned; usually the first scan is when they were deciding who should be detained or not, but it's an iterative process, so people are scanned over and over again. Those that weren't detained initially might have a yellow signal associated with their scan. It means that they maybe have a family member in detention, that there's some reason why they're on a watch list as suspicious. And those people are watched in a continuous process. And the digital file begins to follow that person as they move through space and time.


People that were detained, if they are eventually released, which does happen at times, though many are not, are also marked with this sort of negative credential of having been detained before. So it is similar to having a criminal record, really. It follows you around. And so the digital file begins to move in front of the person.


And it doesn't even have to be associated with the phone scan. It can also just be your face being scanned, which happens all the time now in China: as you move through space, especially in high-traffic areas, your face is being scanned. And it's an individualized form of recognition that really has nothing to do with your activity, and it's not something you can get rid of. It's not associated with your behavior. It's associated with what the system has determined to be the truth of who you are. And so it's a new kind of digitized racialization.


And in this case, it's an anti-Muslim racialization. There are other afterlives too. So I'm thinking about the reversing-the-gaze aspect of this, which is that this system was born digital, basically, which means that as researchers we can see it happening almost in real time. And the initial capture of people's identifying materials is now something that researchers can see has happened, and we can expose that it's happened.


But there's a more nefarious part of this, which is that the data is being used to train algorithms, is being used in large-scale studies around DNA and so on. And so it's being used as a material to develop new technologies. And those technologies show up in all kinds of things, like COVID tracking equipment and so on, which then gets sold around the world.


And the tech itself doesn't stay in Xinjiang. It's now across the country, and it's now being sold as part of smart city systems in most of the developing world. That's how it becomes a frontier of a global system.


ELEANOR DRAGE:

It's interesting that you talk about gender when you talk about surveillance, because often when we're in discussions about surveillance, gender and race are not far behind. So can you tell us what the significance is of being a young male as a primary target of Chinese state surveillance?


DARREN BYLER:

Sure. One of the reasons why I was pushed to really think about gender was because of the people I was spending time with, which was mostly Muslim men, and that has to do with my access as a foreigner.


I'm a white American settler studying a group of people from a world that's not my own, and that means that I have to do a lot of work to understand what gives people meaning and purpose in their lives, in their languages, which are Uyghur and Chinese. But it also means that the Chinese state was concerned with where I could do that research.

So I could only live in the city. And I had to frame my work as focused on development in a general sense; I was interested in economic development, but I was, of course, interested in who's left out of that development. And so I was in the city, and Uyghurs are a Muslim community.


There are a lot of traditional gender roles, a patriarchal system. And that meant that the young men were the ones that were sent to find work in the city. And that's how I found them. But I was interested to understand why and how they lived their lives, why they were sent to the city, how they lived their lives when they got there, and how they found ways to survive despite having very limited resources in terms of money, in terms of legal power, and so on.


And one of the things that I found was the way that they protected each other by building forms of safety: by listening to each other's stories, by getting each other out of bed in the morning, and by pushing each other to find work and sharing a form of community. And a lot of that work did not focus on technology, because they understood that the technology was going to be used to track them. And so it was like a refusal to use the technology at times, to leave your phone behind, to delete your app when you think it might be checked. There were so many moments in my research where I had those kinds of experiences, where we thought we were being followed, and so we would delete everything on our phones, and then we'd realize, oh, we were just being paranoid. So it's basically sharing in fear, sharing in pain.


That's a way that young men found ways to survive. But of course the story I'm telling is a partial one, and so I really want to emphasize that in my work: I'm representing what it means to be a young Muslim person who is the most targeted by these systems, whose future is the most precarious, because in the government documents, the police documents, they say that being between a certain age, 18 to 55, those are the targeted groups.


And young men are far more likely to be detained than anyone else. And so that's why I decided to focus my work there.


KERRY MCINERNEY:

That is really fascinating. And it's really interesting also just to hear about the process of research and the complexities built into this journey that you've gone on as a researcher, in terms of being able to build these relationships and tell these stories.


And I think you're a great example of someone who does turn on the microphone, because this is something that I think so many people are really deeply concerned about, but they don't necessarily know what's true or how to access information. And I guess I wanted to ask you in closing about what your work can tell us about the globality of surveillance, but also about how we talk about complex racial interactions, forms of racialization and prejudice, that are happening in the global south or the majority world. I work on anti-Asian racism and AI, and something I definitely really struggle with in my own research is how to make these stringent critiques of the Chinese state and the genocide of Uyghur Muslims while trying to reconcile the way that this narrative is then often co-opted by particular political players and industry players, particularly in the U.S. but in the West more broadly, to push forward a wider anti-China narrative, which has quite a detrimental effect on Chinese diasporic people like myself. So I'm trying to think about how we actively combat this horrific injustice happening within China while also recognizing that the way that these critiques are sometimes made is itself quite ethically compromised.


DARREN BYLER:

So for me it makes the most sense to really expand the object of critique beyond national boundaries, to think that this isn't a discretely Chinese problem or American problem.

It's both, and it's beyond both of those two national actors as well. And it's not just me wanting to do that; it's actually what the people in the field told me. I'm thinking about this moment in a taxi with a Han taxi driver. He was pointing out the window at a Uyghur person and said, what do you think of our towel heads? Which is an anti-Muslim slur. He was saying this in Chinese, in a term which could be translated as turban head, but basically he was doing the same thing as the U.S. military in talking about Muslims in Iraq in a derogatory way that flattens out who they are, that uses stereotypes, and he was placing me in the same position as him, as a kind of colonizer, occupier of Muslim space. And it's quite clear that Chinese authorities see themselves as mimicking what the U.S. has done, what Israel has done, what Russia has done in relation to Muslims. They say that very directly in the policing literature. And so, you know, counterterrorism, or the global war on terror, is truly global, and that really should be an object of critique that goes beyond the nation-state form, and it should be opposed because of the way it racializes Muslims.


So you can start with that sort of framing. You can also think about how policing itself is violence work. There are different lineages or genealogies for police, but in general, that's what it does: it reproduces power. And so you can critique police and police technology in any context, I think.


And so, you know, that enables us to think beyond the nation form as well. But then there are these complexities of anti-Asian racism in the West and how that feeds into anti-China sentiment and this new Cold War rhetoric. I think expanding the object of critique and thinking about technology in general can push people, at least on the left, to think about a critique of technology in all of these contexts.


But what does that do? How do you protect Asian diaspora folks while at the same time opposing the use of this kind of technology in China? It's a difficult question, and it's one that I think is quite context-specific as to how you do that work. But I think foregrounding the travel of this technology back and forth, how it's learned, helps in some of this, and also thinking about the specific histories of colonialism and imperialism: how China and India are sub-colonies or sub-imperial spaces where they're now colonizing others, but how they've learned that from the original colonizers.


And how that helps those in power in those places to see themselves as equal to the colonizer. Holding all of those things in your mind at the same time, and, as you talk about them, presenting them in your work, I think is important, because it helps people to recognize that there is difference, but there are also these similarities in terms of how technology is being used in these places, how racialization is moving and finding these different forms. There's not a good answer, but that's the start of one.


KERRY MCINERNEY:

Can I jump in on that really quickly?


Yeah, no, I feel that's a really wise answer. And I also think it speaks more broadly to how our understanding of race and racialization requires so much contextuality, because I think it's really tragic that you have this kind of extraordinarily violent Han nationalism within China, and yet you have Chinese diasporic people who have been forced to throw away a lot of their cultural heritage as they're trying to reclaim it.


But I think we have this utmost responsibility to make sure that when we're having that conversation, when we're taking pride in our cultural traditions, it's not in a way that bolsters this kind of violent political force of hard nationalism, but instead is about saying, okay, on the one hand it's okay for us to acknowledge the forms of racism and discrimination we've realistically experienced in the West, but that can't be used in any way to then erase or minimize or justify horrific Islamophobic violence within the context of the Chinese state.


ELEANOR DRAGE:

Darren, you were amazing. You've given us so much to think about. Thank you for joining us today.


DARREN BYLER:

Thanks so much for having me. It was a real pleasure.


ELEANOR DRAGE:

This episode was made possible thanks to our previous funder, Christina Gaw, and our current funder Mercator Stiftung, a private and independent foundation promoting science, education and international understanding. It was written and produced by Dr Eleanor Drage and Dr Kerry McInerney, and edited by Eleanor Drage.
