Research Areas


We use the AI supply chain framework proposed in “Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model” by Sasha Luccioni, Sylvain Viguier, and Anne-Laure Ligozat as a starting reference point to map the planetary justice impacts of AI and to organize our research projects and campaigns. Each stage of the supply chain corresponds to a research area.

Building on this framework, we will map specific substages and stakeholders for each main stage of the AI supply chain. This includes identifying key actors, infrastructures, labor conditions, environmental costs, and socio-political dynamics within each phase—ranging from resource extraction and hardware manufacturing to model training, deployment, and waste management. By doing so, we aim to trace the asymmetries of power, labor, and environmental responsibility across the AI lifecycle, ensuring that issues of justice, accountability, and sustainability are foregrounded in AI governance and advocacy efforts.


Active Projects

Beyond Binaries: Epistemic Justice in AI Datasets

This project, situated within the Model Training research area, investigates how AI training datasets in agriculture embed epistemic hierarchies that privilege Global North ways of knowing. Focusing on India, it critically examines how datasets used to train agricultural models often reflect standardized, techno-scientific approaches—while sidelining local, traditional, and Indigenous knowledge systems.

Rooted Clouds: Mapping the Planetary Justice Impacts of AI Data Centers

This project, situated across the Model Training and Model Development research areas, investigates the hidden infrastructure powering artificial intelligence. Far from being weightless or abstract, AI runs on sprawling data centers that extract water, energy, and land, often at great cost to local communities and ecosystems. This project maps where these centers are being built, who is driving their expansion, and what impacts they leave behind, making visible the infrastructure, and the injustices, at the heart of AI.