The explosive growth of generative artificial intelligence, from ChatGPT to advanced image generators, has brought sweeping changes to the economy, culture, and technology. But researchers now warn that behind the convenience of AI lies a growing and poorly understood environmental cost.
At a recent academic conference, researchers presented a paper titled "Power Hungry Processing: Watts Driving the Cost of AI Deployment?". Its findings were alarming: general-purpose generative AI models consume orders of magnitude more energy than task-specific models performing the same work.
Why AI Usage Is So Energy-Intensive
The study shows that the reason lies in the very nature of generative models. Large language models in particular perform enormous matrix multiplications for every token (a small unit of text, roughly a word fragment) that they generate.
Unlike classical programs, which execute pre-defined instructions, AI models recompute a vast number of probabilities at every step. This requires heavy mathematical operations: repeated matrix multiplications involving billions of parameters.
The authors note that in generative models the next word (token) is chosen not from a handful of options but from the model's entire vocabulary, which can contain tens of thousands of entries.
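To make this concrete, here is a minimal sketch in Python (using NumPy) of the computation a language model repeats for every single token it generates. The dimensions are scaled-down illustrative assumptions, not figures from the paper; real models are several times wider and stack dozens of such layers:

```python
import numpy as np

# Illustrative, scaled-down sizes (assumed for this sketch; real models
# are wider and repeat this kind of work across many layers).
d_model = 1024       # width of the model's hidden state
vocab_size = 32_000  # number of entries in the model's vocabulary

rng = np.random.default_rng(0)
hidden_state = rng.standard_normal(d_model, dtype=np.float32)
unembedding = rng.standard_normal((d_model, vocab_size), dtype=np.float32)

# One matrix multiplication produces a score (logit) for EVERY entry
# in the vocabulary: this is the choice "from the entire vocabulary".
logits = hidden_state @ unembedding  # shape: (32_000,)

# Softmax turns the scores into a probability distribution.
probs = np.exp(logits.astype(np.float64) - logits.max())
probs /= probs.sum()

# The next token is sampled from all 32,000 candidates at once.
next_token = rng.choice(vocab_size, p=probs)
print(f"sampled token id: {next_token}")

# Even this single final projection costs d_model * vocab_size
# multiply-adds, and it is repeated for every generated token.
print(f"~{d_model * vocab_size:,} multiply-adds per token, in one layer alone")
```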
Moreover, image tasks require even more energy than text tasks, because the model must "imagine" not just the next word but every pixel of the picture. Text is a sequence of tokens, of which there are relatively few, while an image consists of millions of tiny colored dots, and the AI must compute a color value for each of them. As the authors of the study put it, "image-related tasks generate raw pixels, making such tasks more energy-intensive than text ones."
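A rough comparison of output sizes shows why. The figures below are illustrative assumptions, not measurements from the paper:

```python
# Illustrative sizes: a paragraph of text vs. one generated image.
tokens_in_paragraph = 200                 # roughly 150 words of text
width, height, channels = 1024, 1024, 3   # a common image resolution, RGB

pixel_values = width * height * channels  # every red/green/blue value must be produced
print(f"text outputs:  {tokens_in_paragraph:,} tokens")
print(f"image outputs: {pixel_values:,} pixel values")
print(f"ratio: ~{pixel_values // tokens_in_paragraph:,}x more values per image")
```

Under these assumptions, a single image requires generating thousands of times more output values than a paragraph of text.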
The length of the response also matters: each new word is generated in the context of everything produced so far, so the cost per token grows with the length of the sequence and the total cost grows faster than linearly (roughly quadratically for the attention step in a naive implementation). Long dialogues or extended writing tasks therefore create a far heavier load than short queries.
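A simplified cost model makes that growth visible. The assumption here, attention work proportional to the number of tokens already in context, is a standard simplification; the constants are arbitrary:

```python
# Simplified cost model: producing token t requires attending to the
# prompt plus the t tokens already generated, so per-token cost grows
# with position and total cost grows roughly quadratically.
def attention_ops(response_length: int, prompt_length: int = 50) -> int:
    """Total attention comparisons to generate a response of a given length."""
    return sum(prompt_length + t for t in range(response_length))

for length in (10, 100, 1000):
    print(f"{length:>5}-token response -> {attention_ops(length):>9,} attention comparisons")
```

In this model a response 100 times longer costs roughly 1,000 times more, which is why long conversations weigh so much more heavily than short queries.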
Environmental Consequences: Water, CO₂, and Strain on Power Grids
Although training large models has long been known to be energy-intensive, the researchers emphasize that in practice most of the load comes later: during inference, when the model is used by millions of people.
Popular systems like chatbots or image generators handle billions of requests per day. Even if a single inference costs little, at such volumes small numbers quickly add up to a significant environmental footprint.
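The arithmetic of scale is simple. The per-query figure below is a hypothetical assumption chosen for illustration, and the household comparison uses a commonly cited average of about 30 kWh per day:

```python
# Hypothetical per-query cost, assumed for illustration only.
energy_per_query_wh = 0.3          # watt-hours for one inference (assumed)
queries_per_day = 1_000_000_000    # a service handling a billion requests daily
household_kwh_per_day = 30         # commonly cited average daily household use

daily_kwh = energy_per_query_wh * queries_per_day / 1_000
print(f"{daily_kwh:,.0f} kWh per day")
print(f"equivalent to ~{daily_kwh / household_kwh_per_day:,.0f} households' daily electricity use")
```

Under these assumptions, a fraction of a watt-hour per query becomes hundreds of thousands of kilowatt-hours every day.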
The increase in energy consumption drives up carbon emissions. In addition, modern data centers require vast amounts of water for cooling, and in regions suffering from water scarcity or extreme heat this can put infrastructure in direct competition with the local population.
The authors of the study conclude that AI makes our lives more convenient but demands colossal amounts of energy, and the smarter and more versatile the model, the higher its environmental cost. The study makes a clear case that before deploying powerful general-purpose generative systems for simple tasks, the industry should ask whether every convenience is worth the thousands of extra kilowatt-hours.