
AI Needs Fat Liberation! with Aisha Sobey


In this episode, Aisha Sobey, a research fellow at the Leverhulme Centre for the Future of Intelligence, explores how anti-fat bias shapes our digital lives. She discusses its effects on health technologies, social media, and generative AI, and explains why anti-fatness must be seen as a systemic issue. The conversation also highlights how ideas from fat liberation can help create more inclusive and fair technological design.


Aisha Sobey is a postdoctoral researcher at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge. Her work examines how technologies construct ideas of normality and how these assumptions affect people, particularly in relation to bodies, health, and lived experience. Drawing on fat liberation, feminist theory, and disability studies, Aisha investigates how anti-fat bias is embedded in digital systems and advocates for design approaches that prioritise plurality, care, and justice. She is also co-developing a diabetes-focused food-tracking tool that supports health management without reinforcing diet culture. Her research brings together social awareness and technical insight to reimagine how technologies can better serve diverse bodies and communities.


Transcript


Kerry McInerney (00:57) In this episode, we talk to Aisha Sobey, a research fellow at the Leverhulme Centre for the Future of Intelligence at the University of Cambridge. Aisha shows us how anti-fat bias, or structural discrimination against fat people, manifests across our digital lives, from health and medical technologies through to the representation of fat people in generative AI tools. She explains why anti-fatness needs to be considered intersectionally with and alongside other forms of discrimination, why fat liberation offers us a crucial alternative to this harmful status quo, and how fat liberation can be baked into AI design and development. We hope that you enjoy the show.


Kerry McInerney (01:36) Brilliant, thank you so much for joining us. So just to kick us off, could you tell us a little bit about who you are, what you do, and what brings you to the topics of feminism, gender, and technology?


Aisha Sobey (01:37) Absolutely. So I'm Aisha Sobey. I'm a postdoctoral researcher at the Leverhulme Centre for the Future of Intelligence. My work throughout my master's and PhD has focused on how technologies shape what we consider normal and how this impacts people. I'm focusing now on how this relates to the body and the way we exist as humans, embedded within a fat liberation perspective. This goes hand in hand with feminism, gender, and technology, because feminism and the body have been sites of contestation for a long time. It would be a disservice not to think about all of these things together.


Eleanor Drage (02:30) Thank you so much. We're excited to have your take on our three good robot questions: what is good technology, is it possible, and how can feminism help us get there? I know you use technologies in your day-to-day life to monitor your health. I'd be interested in what a good technology is on the micro level for you.


Aisha Sobey (02:35) Absolutely. To add context, I'm a type one diabetic. I've recently moved onto a pump, so my life is more technologically mediated by algorithmic systems. From both a fat liberation perspective and a diabetic perspective, good technology for me starts from a plural perspective rather than a singular idea of what is desired or good. In terms of this being possible, the jury is still out. With the financial interest in having something singular that everyone should spend money to attain, challenging that is hard. But I'd like to think it's possible. Moving towards values that are societal and culturally based is where feminism is a great starting point: community, care, ethics of care, and how these can feed into technological systems from the start. That makes it possible, but someone has to fund it, so the jury is out.


Eleanor Drage (04:14) How is your pump going? Tell us.


Aisha Sobey (04:16) Absolutely. It's a weird one. Even as a researcher who thinks about AI hype all the time, I still hoped this technology and the change from injections to this AI system would revolutionize my life and take away the burden and thought. That was naive. This is where the idea of building for “normal” comes in from my academic research. The algorithm that runs the loop technology, where the sensor talks to the insulin pump and theoretically performs as a pancreas, assumes a default human. In our training, we were told that when women have hormonal times of the month and run higher, we need to tell the algorithm that. So the default is someone without that going on, or someone with a regular cycle. Studies show the 28-day cycle happens for about 30% of people with uteruses, if not less. There are so many assumptions built into what is normal. As someone with other hormonal things going on, the algorithm won’t help me with that aspect of care.


Kerry McInerney (06:19) What you're saying reminds me of an earlier episode with Helen Hester, who talks about how new medical technologies promise easier care but often move medical labor into the home, individualizing it, and forcing people to become their own med-tech experts. It's often framed as autonomy but creates massive burdens on carers, usually women doing unpaid work. What you're saying about the pump resonates with this kind of at-home professionalization.


Aisha Sobey (07:33) There’s a real tension. As someone with a chronic health condition, disability studies emphasizes that we know our own bodies. Recognition of that through technology means there's a dual burden of being the one responsible for producing a nice-looking graph for the doctor. For people who can't or don't want to do that, it creates barriers to care that can be harmful.


Kerry McInerney (08:11) I want to ask about anti-fat bias. What is anti-fat bias? Why do you prefer that term over fatphobia? How does it manifest across digital systems?


Aisha Sobey (08:51) Anti-fat bias is negative attitudes toward weight and body size. I'll keep using the word "fat," aligned with activist spaces reclaiming harmful language. People have moved away from "fatphobia" because phobias are real conditions, and the term minimizes that. Anti-fat bias at a systemic level is a more powerful framing. It's not just harmful to fat people (though the bigger you are, the more discrimination you face), it harms everyone. We're losing generations of young people who invest energy into controlling their body size rather than things that matter. Women experience more pressure about bodies. There's also "weight bias" or "weight stigma" from a medicalized perspective, which often implies we should be nice to fat people so they lose weight, rather than being nice full stop. Anti-fat bias argues we should be kind to everyone.


Kerry McInerney (11:21) How does this operate across digital technologies?


Aisha Sobey (11:50) This is where I started my research. Looking at HCI and design justice, body size wasn't addressed. Searching repositories, most HCI research involving body weight aimed to make bodies smaller (over 600 studies, with only a handful not focused on reduction). That's embedded in design practice. Recommender systems also reflect societal pressure to be thin. Ads, especially for women, increasingly push weight loss drugs like Ozempic, causing eating disorders and body dissatisfaction. Digital systems reinforce the drive to make bodies smaller.


Eleanor Drage (13:57) It seems you're saying anti-fat bias is bad for everyone because it's about an idea of weight that has become a phantasm people fear.


Aisha Sobey (14:01) Yes.


Eleanor Drage (14:12) I want to ask about the other side: fat liberation. What is fat liberation?


Aisha Sobey (14:19) Fat liberation started in the 60s and 70s with the Fat Dykes movement. In 1973, a manifesto outlined opposition to reducing industries, exploitation of fat people as a market, and biased science. Over 50 years, no country's attempts to reduce population weight have succeeded. Longitudinal studies show dieting predicts weight gain. Our bodies resist long-term weight loss. The movement recognizes that fat people can live meaningful, joyful lives, especially without constantly trying to be smaller. There's also a movement within HCI to bring fat liberation into design paradigms.


Eleanor Drage (16:07) How can fat liberation be brought to life in combating how AI treats people?


Aisha Sobey (16:09) There are many approaches. It includes information-seeking about health without weight being central, rethinking avatar design so thinness isn't encoded as good, and challenging design paradigms like Wii Fit, where avatars became thinner with weight loss. We must move away from seeing bodies as profit sources. Designing health and tracking technologies for profit encodes the idea that something needs to be fixed. Companies want retention, data, and dependence. Instead, technology should support people and set them free, and collect as little information as possible. If you love someone, set them free; if you design good technology, don't make people dependent on it. Especially regarding weight.


Kerry McInerney (18:06) Fat liberation also argues that anti-fatness is a structure of oppression in its own right. How do you think about its relationship to other forms of discrimination?


Aisha Sobey (19:27) Fatness and anti-fat bias are rooted in other marginalizations. The BMI's history is tied to hierarchies of bodies, used to claim thin white people were superior. Though originally meant for population-level use, it was popularized by insurance companies to charge more. At the same time, eugenic projects used similar metrics. Fatness intersects with ableism, gender norms, and racism. Yet implicit bias tests show most biases decreasing, while weight bias is increasing. In medicine, access is increasingly restricted: people are refused surgeries or treatments because of weight, regardless of eating disorders or practicality. While interconnected, fatness also functions as its own axis of oppression. There's a need to zero in on fatness to design it into systems, or out of harmful systems.


Kerry McInerney (22:17) Representation is another area you’ve studied, especially in generative AI. Tell us about what you found.


Aisha Sobey (22:56) My study looked at fatness in generative AI image models. These models are widely used and shape social imaginaries. I tested nine free-to-use platforms with 20 prompts: nine generic (a person eating, a person exercising) and eleven fat-specific (a fat person eating, etc.), plus prompts using "obese" and "morbidly obese."

Findings:

  • Prompts without body size produced slim bodies nearly universally, erasing body size diversity.
  • Some models refused prompts using "fat," assuming the word is harmful, yet allowed "obese," which many find more harmful.
  • When fatness was depicted, many outputs were cartoonish despite requests for photorealism.
  • Common patterns included topless white male bodies, sad facial expressions, and body anomalies: extra chins, bodies split down the middle, worm-like rolls.

If someone wants to see a fat person represented and the model outputs something negative, it is harmful. Yet part of me was relieved not to see myself represented: the absence means our images may not have been scraped into these extractive systems. There's a tension between wanting representation and resisting exploitation.


Eleanor Drage (27:22) You're creating your own diabetes-tracking app with the University of Sheffield. Tell us how you imagine its design.


Aisha Sobey (27:46) Pointing at problems is one thing; we also want to build alternatives. My colleague and I met through fat liberation and both live with diabetes. Food tracking has been critiqued as insidious and normative, but diabetes requires knowing carbohydrate intake. There's little research on how normative values in food tracking affect people who cannot opt out. Existing advice is "just don't track," but diabetic people often must. No apps are designed specifically for diabetics. We're working on creating food tracking that isn't based on diet culture, supporting carbohydrate awareness without calorie counts or BMI tracking. We're conducting focus groups with diabetic people to understand what would reduce the burden of chronic illness. The challenge is participatory design: people may express desires shaped by societal expectations of weight loss. We're grappling with how to include voices without reproducing what we aim to challenge.


Eleanor Drage (31:34) Aisha, thank you so much. This was incredible. Looking forward to seeing you tomorrow.


Aisha Sobey (31:37) Thanks for having me.
