The hidden costs of artificial intelligence, from natural resources and labor to privacy and freedom

What happens when artificial intelligence saturates political life and depletes the planet? How is AI shaping our understanding of ourselves and our societies? In this book Kate Crawford reveals how this planetary network is fueling a shift toward undemocratic governance and increased inequality. Drawing on more than a decade of research, the award-winning science and technology scholar shows how AI is a technology of extraction: from the energy and minerals needed to build and sustain its infrastructure, to the exploited workers behind "automated" services, to the data AI collects from us. Rather than taking a narrow focus on code and algorithms, Crawford offers a political and material perspective on what it takes to make artificial intelligence and where it goes wrong. While technical systems present a veneer of objectivity, they are always systems of power. This is an urgent account of what is at stake as technology companies use artificial intelligence to reshape the world.
Kate Crawford's Atlas of AI is a sweeping materialist critique that refuses to let artificial intelligence remain an abstraction. Structured as a series of journeys — to lithium mines in Nevada, Amazon fulfillment centers in New Jersey, skull archives in Philadelphia, the highlands of Papua New Guinea, and the Snowden archive in New York — Crawford maps what she calls the "planetary costs" of AI with the rigor of an investigative journalist and the conceptual ambition of a political theorist. Her central argument is deceptively simple: AI is neither artificial nor intelligent. It is a material infrastructure of power, extraction, and classification that serves existing dominant interests while presenting itself as neutral computation.
The book's structure mirrors the supply chain it interrogates. It begins underground, with the lithium brine of Clayton Valley and the toxic black lake of rare earth processing waste in Inner Mongolia, establishing that the "cloud" is made of rocks and crude oil. From there Crawford ascends through layers of exploitation: the labor chapter traces a lineage from Samuel Bentham's inspection houses and Babbage's factory-as-computer to Amazon's algorithmic "rate" that governs warehouse workers' bodies down to the micromovement. The data chapter documents the shift from consent-based collection (the FERET facial recognition program of the 1990s) to the rapacious scraping of ImageNet, where millions of personal photographs were harvested from the internet and labeled by crowdworkers at fifty images per minute. The classification chapter visits Morton's skull collection at the Penn Museum to draw a line from nineteenth-century craniometry to IBM's "Diversity in Faces" dataset, which measured craniofacial distances with the same reductive impulse. The affect chapter dismantles the Ekman paradigm — the contested claim that a small set of universal emotions can be read from facial expressions — showing how ARPA funding, military priorities, and commercial incentives sustained a seventeen-billion-dollar emotion detection industry despite profound scientific doubt. The state chapter traces how intelligence agency logics of total information awareness have migrated from the NSA to Palantir to Amazon Ring doorbells, creating what Crawford calls a "self-perpetuating surveillance network."
What makes the book distinctive is Crawford's insistence on connecting these domains. The same extractive logic that strips lithium from the earth also strips data from human subjects; the same classificatory impulse that measured skulls now sorts faces into gender binaries and racial categories in training datasets; the same temporal control that synchronized factory clocks now governs Google's proprietary TrueTime protocol across planetary data centers. Crawford draws on an impressive range of intellectual traditions — from Lewis Mumford's concept of the "megamachine" to Geoffrey Bowker and Susan Leigh Star's sociology of classification, from Achille Mbembe's "infrastructural warfare" to Donna Haraway's "informatics of domination" — to build what she calls a "humble geography" of computation: partial, situated, and openly political.
The book is at its strongest when Crawford's firsthand investigations intersect with structural analysis. Her visit to the ghost town of Blair, Nevada — a mining settlement that boomed and collapsed within twelve years — becomes an elegant metaphor for the unsustainable temporality of extractive industries. Her account of Amazon warehouse workers in Minnesota organizing against "the rate" connects intimate bodily experience to planetary logistics with devastating precision. Her close reading of ImageNet's Person categories — where a woman on a beach towel is classified as a "kleptomaniac" and Sigourney Weaver appears labeled a "hermaphrodite" — reveals how absurdity and violence coexist in the classificatory infrastructure of machine learning.
Where the book occasionally strains is in its breadth. The desire to connect everything — from Victorian gutta-percha extraction to contemporary drone strikes, from the Fordlandia time-clock riots to Chinese data center coal consumption — can make individual chapters feel like they are surveying terrain rather than excavating it. Some readers may also find that the concluding call for "connected movements for justice" remains more gestural than programmatic. But this is partly by design: Crawford's project is diagnostic rather than prescriptive, and her aim is to make the hidden costs of AI visible rather than to propose a comprehensive policy framework.
Published in 2021 but drawing on a decade of research, Atlas of AI has only grown more relevant as the industry has scaled further. The book's analysis of training data extraction, labor exploitation, environmental costs, and the military-commercial nexus anticipates the controversies that would erupt around large language models and generative AI. Crawford's fundamental insight — that AI is "politics by other means," a registry of power that abstracts away its material conditions while extracting ever more from those least able to resist — remains the essential starting point for any serious engagement with the politics of computation.
Reviewed 2026-04-06
In contrast, in this book I argue that AI is neither artificial nor intelligent. Rather, artificial intelligence is both embodied and material, made from natural resources, fuel, human labor, infrastructures, logistics, histories, and classifications. AI systems are not autonomous, rational, or able to discern anything without extensive, computationally intensive training with large datasets or predefined rules and rewards.
Introduction, Crawford's thesis statement redefining AI against the industry's self-image. — AI definition, materialism, extraction, infrastructure
Since antiquity, the business of mining has only been profitable because it does not have to account for its true costs: including environmental damage, the illness and death of miners, and the loss to the communities it displaces.
Chapter 1 (Earth), on the hidden costs of extraction that have subsidized computation from lithium to rare earths. — extraction, environmental damage, externalized costs, mining
From the perspective of deep time, we are extracting Earth's geological history to serve a split second of contemporary technological time, building devices like the Amazon Echo and the iPhone that are often designed to last for only a few years.
Chapter 1 (Earth), on the temporal asymmetry between billions of years of mineral formation and the 4.7-year average smartphone lifespan. — deep time, extraction, planned obsolescence, environmental cost
The cloud is a resource-intensive, extractive technology that converts water and electricity into computational power, leaving a sizable amount of environmental damage that it then displaces from sight.
Chapter 1 (Earth), quoting Tung-Hui Hu on the materiality hidden behind cloud computing's ethereal metaphor. — cloud computing, environmental damage, extraction, invisibility
Enough is enough. Amazon, we want you to treat us like humans, and not like robots.
Chapter 2 (Labor), quoting Abdi Muse, executive director of the Awood Center in Minneapolis, advocating for Amazon warehouse workers. — labor rights, Amazon, dehumanization, worker organizing
They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those.
Chapter 4 (Classification), an Amazon engineer describing the company's automated hiring system that was later found to discriminate against women. — algorithmic bias, hiring, gender discrimination, automation
Workers, having been alienated from the results of their work as well as disconnected from other workers doing the same job, are liable to be more easily exploited by their employers.
Chapter 2 (Labor), on how fauxtomation disperses labor in space and time, increasing disconnection and vulnerability. — alienation, labor exploitation, fauxtomation, crowdwork
The price of innovation does not need to be the erosion of fundamental privacy rights.
Chapter 3 (Data), quoting the UK Information Commissioner on Google DeepMind's unauthorized use of 1.6 million NHS patient records. — privacy, data extraction, healthcare data, consent
A computer vision system can detect a face or a building but not why a person was inside a police station or any of the social and historical context surrounding that moment.
Chapter 3 (Data), on how training AI with mug shots strips away context, power relations, and humanity. — decontextualization, computer vision, mug shots, power
In the metaphysics of ImageNet, there are separate image categories for 'assistant professor' and 'associate professor' — as though once someone gets a promotion, her or his biometric profile would reflect the change in rank.
Chapter 4 (Classification), exposing the absurdity of ImageNet's assumption that professional status can be detected from facial appearance. — classification, ImageNet, absurdity, physiognomy
Machine learning systems are, in a very real way, constructing race and gender: they are defining the world within the terms they have set, and this has long-lasting ramifications for the people who are classified.
Chapter 4 (Classification), on how AI systems don't merely detect identity categories but actively produce and enforce them. — race, gender, classification, construction of identity
Companies can say whatever they want, but the data are clear. They can detect a scowl, but that's not the same thing as detecting anger.
Chapter 5 (Affect), quoting psychologist Lisa Feldman Barrett on the fundamental gap between facial movements and emotional states. — affect recognition, emotion detection, scientific critique, facial expression
It is not possible to confidently infer happiness from a smile, anger from a scowl, or sadness from a frown, as much of current technology tries to do when applying what are mistakenly believed to be the scientific facts.
Chapter 5 (Affect), from Barrett's comprehensive 2019 review concluding that facial expressions are not reliable indicators of emotion. — emotion detection, scientific evidence, facial expression, affect recognition
Map the entire internet — any device, anywhere, all the time.
Chapter 6 (State), from an NSA PowerPoint slide describing TREASUREMAP, a program for real-time mapping of all internet-connected devices. — surveillance, NSA, total information awareness, state power
If we can get the target to visit us in some sort of browser, we can probably own them.
Chapter 6 (State), from NSA slides describing FOXACID, a program for compromising individual computers through browser exploitation. — surveillance, hacking, NSA, cyber warfare
We kill people based on metadata.
Chapter 6 (State), quoting General Michael Hayden, former director of both the NSA and the CIA, on drone targeting. — drone warfare, metadata, state violence, surveillance
Avoid at ALL COSTS any mention or implication of AI. Weaponized AI is probably one of the most sensitized topics of AI — if not the most. This is red meat to the media to find all ways to damage Google.
Chapter 6 (State), from a leaked email by Fei-Fei Li about keeping Google's Project Maven military AI contract secret. — Project Maven, military AI, Google, corporate secrecy
The true labor costs of AI are being consistently downplayed and glossed over, but the forces driving this performance run deeper than merely marketing trickery. It is part of a tradition of exploitation and deskilling.
Chapter 2 (Labor), on how Potemkin AI and fauxtomation obscure human labor to maintain the illusion of autonomous machine intelligence. — labor exploitation, fauxtomation, Potemkin AI, deskilling
The focus on the hateful categories is not wrong, but it avoids addressing questions about the workings of the larger system. The entire taxonomy of ImageNet reveals the complexities and dangers of human classification.
Chapter 4 (Classification), arguing that removing offensive labels from ImageNet fails to address the deeper problem of classifying people like objects. — classification, ImageNet, bias, systemic critique
As Audre Lorde reminds us, the master's tools will never dismantle the master's house.
Conclusion (Power), Crawford invoking Lorde to argue against the idea that AI can simply be 'democratized' to serve justice. — power, justice, reform vs. transformation, structural critique
When someone says, 'AI ethics,' we should assess the labor conditions for miners, contractors, and crowdworkers. When we hear 'optimization,' we should ask if these are tools for the inhumane treatment of immigrants. When there is applause for 'large-scale automation,' we should remember the resulting carbon footprint at a time when the planet is already under extreme stress.
Conclusion (Power), Crawford's call to translate abstract AI discourse into concrete questions about material harm. — AI ethics, labor, immigration, climate, accountability
AI systems are built to see and intervene in the world in ways that primarily benefit the states, institutions, and corporations that they serve. In this sense, AI systems are expressions of power that emerge from wider economic and political forces, created to increase profits and centralize control for those who wield them.
Conclusion (Power), Crawford's summary statement on AI as fundamentally a tool of existing power structures. — power, state, capital, centralization
This is the most important work I'm doing. It's a simple argument, this is the best planet. And so we face a choice. As we move forward, we're gonna have to decide whether we want a civilization of stasis — we will have to cap population, we will have to cap energy usage per capita — or we can fix that problem, by moving out into space.
Coda (Space), quoting Jeff Bezos on his Blue Origin space colonization project as an escape from planetary limits. — space colonization, extraction, tech billionaires, limitless growth
The underlying visions of the AI field do not come into being autonomously but instead have been constructed from a particular set of beliefs and perspectives. The chief designers of the contemporary atlas of AI are a small and homogenous group of people, based in a handful of cities, working in an industry that is currently the wealthiest in the world.
Introduction, on how AI's worldview reflects the narrow demographics and interests of its creators. — power, homogeneity, AI industry, worldview
These ways of seeing depend on the twin moves of abstraction and extraction: abstracting away the material conditions of their making while extracting more information and resources from those least able to resist.
Introduction, Crawford's formulation of AI's fundamental operational logic. — abstraction, extraction, power asymmetry, invisibility