The Ecological Hypocrisy of the ChatGPT Generation
The stainless-steel bottle is on the desk. The tote bag is by the door. The flight to Barcelona was replaced by a night train months ago. Lunch is plant-based. The person sitting there can explain scope 3 emissions, fast fashion, and why the avocado panic was always a little unserious.
Then the laptop opens, and a chatbot gets hit twenty times before noon.
That scene matters because it captures a new blind spot in climate culture. A generation raised under the sign of ecological emergency has learned to audit food, transport, packaging, and even dating habits through carbon. Yet much of that same generation treats generative AI as weightless, almost innocent, as if text appearing on a screen belongs to a cleaner moral category than meat, flights, or petrol.
Éric Sadin has been pressing on that nerve. He is right to do it. There is a contradiction here, and it is not a minor one. But the strongest version of the argument is not the lazy version. If you want this critique to survive contact with facts, you have to count carefully. The numbers are messier than the slogans. The hypocrisy is still real.
Climate ethics stops at the screen
Modern ecological behavior is built around what people can picture. You can imagine a plane burning fuel. You can see a steak, a landfill, a plastic bottle, a diesel SUV idling outside a school. These things have texture. They have smell. They live in the visible world, where cause and effect feel almost physical.
Digital services slip past that instinct. A chatbot feels like thought without machinery. You type a sentence, wait two seconds, and words arrive. No chimney, no noise, no warehouse full of servers enters the mind at the moment of use. Convenience works like camouflage.
This is why so many environmentally conscious people developed strong norms around eating less meat or flying less often while building almost none around streaming, cloud storage, crypto before the crash, or now generative AI. Their ethics were trained on the physical world. The datacenter stayed offstage.
That does not mean digital emissions are always larger than the visible ones. Often they are not. A long-haul flight remains a massive hit. Beef remains a major climate burden. Pretending that one week of chatbot use equals intercontinental aviation may feel satisfying, but it turns an argument about responsibility into a contest of exaggerated metaphors.
The real problem is subtler. Climate-conscious people have created a category of consumption they barely interrogate at all, even when that consumption scales through infrastructure that is voracious for electricity, chips, water, land, and backup power. They demand carbon literacy everywhere except where their own convenience, productivity, and social status are involved.
The numbers need honesty before they can have force
Let’s do the part most people skip.
A round-trip Paris–New York flight for one passenger is roughly in the range of one to two tonnes of CO2e, depending on methodology, airline, routing, and whether you include non-CO2 effects. That is a huge number. It should stay huge in the comparison, not get smuggled downward so an argument can sound sharper on social media.
Now take a heavy text-only chatbot user. Say twenty meaningful interactions a day, every day, for a year. Estimates for one interaction vary wildly because “a prompt” can mean anything from “rewrite this subject line” to a long exchange with giant context windows and a reasoning model thinking for a while. Depending on model size, token count, datacenter efficiency, and electricity mix, you can find plausible end-to-end estimates ranging from a few watt-hours to something materially higher for expensive requests.
If we use a fairly generous band of 2 to 20 Wh per substantial interaction, twenty interactions a day adds up to roughly 15 to 146 kWh per year. On a relatively low-carbon grid, that might mean around 4 to 40 kg CO2e. On a dirtier grid, maybe 10 to 90 kg. Push the assumptions harder and you can raise that. Tighten them and you can lower it.
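That arithmetic is easy to check. Here is a minimal sketch of it, under the stated assumptions: twenty interactions a day, 2 to 20 Wh per interaction, and illustrative grid intensities of roughly 250 and 600 g CO2e per kWh (those two intensity figures are assumptions chosen to match the bands above, not measured values for any specific grid):

```python
# Back-of-envelope annual footprint for a heavy text-only chatbot user.
# Every number here is an illustrative assumption, not a measurement.

INTERACTIONS_PER_DAY = 20
DAYS_PER_YEAR = 365
WH_PER_INTERACTION = (2.0, 20.0)  # assumed low/high Wh per substantial request

# Assumed grid carbon intensities in kg CO2e per kWh.
GRID_INTENSITY = {
    "low-carbon grid": 0.25,  # ~250 g CO2e/kWh
    "dirtier grid": 0.60,     # ~600 g CO2e/kWh
}

def annual_kwh(wh_per_interaction: float) -> float:
    """Yearly electricity use in kWh for one user at the assumed rate."""
    return wh_per_interaction * INTERACTIONS_PER_DAY * DAYS_PER_YEAR / 1000.0

low_kwh, high_kwh = (annual_kwh(w) for w in WH_PER_INTERACTION)
print(f"Energy: {low_kwh:.0f} to {high_kwh:.0f} kWh per year")
for label, intensity in GRID_INTENSITY.items():
    print(f"{label}: {low_kwh * intensity:.0f} to "
          f"{high_kwh * intensity:.0f} kg CO2e per year")
```

Running it reproduces the rough bands in the text, with small differences from rounding. The point of the exercise is not precision but scale: even the pessimistic end of this band sits well below one long-haul flight.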
That is not “several Paris–New York flights.” For routine text prompting, that claim usually collapses. Anyone making it as a universal fact is doing morality by meme.
But notice what happens when you stop treating AI use as a neat box labeled “ChatGPT text chats.” Many people do not just ask a chatbot twenty questions a day. They keep an assistant open all day. They use code completion continuously. They upload PDFs. They generate images. They ask for rewrites they do not need, summaries of pages they could skim, study guides from notes they already made, synthetic slides for meetings that should never have existed. In many workplaces, AI is becoming ambient rather than occasional. The prompt count becomes the wrong unit.
Now add the system around the interaction. Frontier models require constant serving capacity. Companies are building new datacenters at astonishing speed. Utilities are extending fossil generation because demand is arriving faster than clean supply. Water gets consumed for cooling in places that already feel climate stress. The chip supply chain is not made of moonlight.
So the honest statement is this: a single text exchange is usually less climatically dramatic than activists or critics imply, but the aggregate infrastructure required to make generative AI habitual at population scale is much more serious than most users let themselves see. The personal footprint may be modest compared with aviation. The collective footprint is not.
Invisibility changes behavior faster than data does
People rarely respond to energy use as a spreadsheet problem alone. They respond to stories, symbols, and social rewards. The green norms that spread fastest are the ones that produce immediate legibility. Taking the train instead of flying is legible. Carrying a reusable bottle is legible. Buying second-hand clothes is legible. Posting that you asked a language model to draft your reading notes is not an environmental confession. It reads as competence.
That social layer matters more than it seems. AI use is tied to productivity, fluency, and belonging. A student who refuses generative AI risks looking inefficient. A junior worker who declines it may fear seeming old-fashioned. A freelancer who refuses automated help may lose time against competitors using it without restraint. The technology does not just offer convenience. It offers status and protection.
Climate behavior works differently. Many green choices ask you to give something up publicly. Less speed, less comfort, less spontaneity, sometimes less fun. AI use often feels like the opposite. More speed. More output. More polish. More relief from blank pages and boring tasks. That asymmetry explains why people who are disciplined in one domain become strangely permissive in the other.
It also explains why the criticism lands hardest when it is framed as coherence rather than purity. Most people know, at some level, that their ecological commitments are selective. They just prefer not to inspect the selection criteria too closely. Once you ask them to apply the same moral lens to their favorite digital tools that they already apply to beef or low-cost flights, the exemptions become visible.
And yes, a lot of those exemptions are self-serving. If climate ethics turns into a menu of symbolic sacrifices plus unlimited appetite for hidden infrastructure, then it starts looking less like ethics and more like lifestyle branding with better packaging.
The contradiction gets sharper when AI creates demand instead of replacing it
There is another reason the “my prompts are tiny” defense misses the point. Generative AI does not merely substitute for older activities. It also creates new ones.
Search used to return links. Now many services generate summaries on top of those links, adding compute to tasks that already had a cheaper path. Software teams that once wrote directly now bounce ideas through assistants, generate code they later review, produce tests from the generated code, then ask another model to explain the first model’s choices. Marketing teams create ten versions where one used to do. Students generate practice questions from summaries of notes that were already summaries of lectures. People ask models to make shopping lists from recipes they also asked models to invent.
This is classic rebound behavior. When a task becomes easier, people do more of it, and they do new adjacent tasks because the friction fell away. The ecological impact of a tool is not only the cost per operation. It is the cost per operation multiplied by the explosion in operations that suddenly feel normal.
That is why the individual “my use is small” argument can be technically true and politically empty. Your own twenty prompts may not threaten the climate. But a society that inserts generative inference into search, office software, education, customer service, coding, design, and entertainment is building a new baseline for energy demand. Once that baseline hardens, it becomes infrastructure, and infrastructure is hard to reverse.
The same generation that learned to challenge cheap flights because they normalized high-emission mobility now tends to normalize AI assistance because it feels intellectually lightweight. Yet normalization is the whole game. Habits become defaults. Defaults become procurement. Procurement becomes power plants.
The useful standard is necessity, not innocence
This is where the conversation usually gets derailed by absolutism. Either AI is framed as ecologically unforgivable, or it is defended as trivial compared with transport and industry. Neither frame helps much.
A better question is the one climate politics should have asked from the start: what uses are worth the cost? If you need a model to analyze a dense legal document, translate a complex text, debug a stubborn error, or help someone with a disability navigate information, the case is stronger. If you are asking for six alternative captions, an instant meeting summary no one will read, or a synthetic image because searching a stock library felt annoying, the case weakens fast.
That sounds obvious, but people rarely apply it because the interface is designed to erase friction. The machine never rolls its eyes. It never says, “This could have been a paragraph you wrote yourself.” It never reminds you that every convenience multiplied by millions becomes load.
The ecological version of maturity is not renouncing every use. It is recovering a sense of proportion. Search for facts when search will do. Read the short email instead of asking for a summary. Draft your own first paragraph when the task is thinking, not formatting. Treat image and video generation as expensive, because they are. Be especially suspicious of AI features that arrive uninvited inside products you were already using. They are often a demand-generation machine disguised as helpfulness.
None of this requires sainthood. It requires the same discipline climate-conscious people already claim in other domains. If you are willing to consider whether a weekend flight is necessary, you can consider whether a dozen model calls to avoid ten minutes of concentration are necessary too.
The credibility of green politics now runs through digital life
The wider stakes are not about catching young people in hypocrisy for sport. Everyone lives with contradictions. Climate politics has never been a purity contest that real human beings could win. The problem is credibility.
If the generation that most loudly speaks the language of limits refuses to examine its own appetite for hidden compute, then its environmental politics starts to look selective in a very modern way: harsh on visible consumption, indulgent toward invisible convenience. That selectivity will be noticed, and not only by ideological opponents. Ordinary people can smell a moral framework that stops exactly where personal advantage begins.
There is also an opportunity here. Ecological critique of AI does not need to wait for abstract debates about consciousness, truth, or the future of work. Those debates matter, but they can feel remote. Energy is immediate. Water is immediate. Grid capacity is immediate. The datacenter buildout is not a metaphor. It is concrete, transformers, transmission lines, backup generators, procurement contracts, and communities being told that this new demand is simply the price of progress.
Once that material layer becomes visible, the politics changes. The question is no longer whether AI feels magical or useful. The question is what kind of energy system we are willing to expand so that people can outsource more routine cognition, more content production, and more low-value friction. That is a public question, not merely a consumer choice.
The green generation does not need to become anti-technology to answer it seriously. It does need to stop pretending that digital habits float above the physical world. The server rack exists whether the interface feels airy or not. Climate commitments begin to mean something sturdier when they reach the tools we are eager to excuse.
Published April 2026