The energy cost of AI is not just a technical problem

Artificial intelligence (AI) has become one of the most transformative technologies of our time, but its rapid advancement brings with it an ever-increasing energy cost that is generating environmental and economic concerns. By 2026, the electricity consumed by AI and the data centers that support it will represent one of the biggest challenges to global sustainability.

According to estimates from the International Energy Agency (IEA), data center electricity consumption reached around 415 TWh in 2024 (approximately 1.5% of the global total). By 2030, this figure is projected to more than double to 945 TWh, an annual growth rate of roughly 15%, driven primarily by AI. The share corresponding to AI-accelerated servers is growing at 30% annually. In the United States, data centers already consume about 4-4.4% of the nation's electricity, and this proportion is expected to increase significantly in the coming years.
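As a sanity check, the growth rate implied by the two IEA endpoints cited above can be derived directly; the 415 TWh and 945 TWh figures come from the text, while the computed rate is only an arithmetic consequence of them, not an additional IEA quote:

```python
# Sanity-check of the growth figures cited above: 415 TWh in 2024,
# 945 TWh projected for 2030. Only the endpoints are from the source;
# the annual rate is derived here.

base_twh = 415.0    # estimated data-center consumption in 2024 (TWh)
target_twh = 945.0  # projected consumption in 2030 (TWh)
years = 6           # 2024 -> 2030

# Compound annual growth rate implied by the two endpoints
cagr = (target_twh / base_twh) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # close to the cited ~15%
```

The result lands just under 15% per year, consistent with the rate the IEA projection quotes.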

The greatest impact comes from two phases: model training and inference (everyday use). Training a large model like GPT-4 can consume tens of gigawatt-hours (GWh); some reports estimate around 50 GWh for advanced models, equivalent to a medium-sized city's consumption over several days. Inference, however (each ChatGPT query or image generation), now accounts for 80-90% of AI's total compute energy. A typical query consumes around 0.3-0.36 Wh (roughly 10 times a Google search), which, multiplied by billions of daily interactions, adds up to enormous amounts: projections indicate that generative AI could consume 15 TWh in 2025 alone, and considerably more in 2026.
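The per-query-to-annual scaling above is easy to reproduce. In the sketch below, the 0.3-0.36 Wh per-query range is from the text, but the daily query count is an illustrative assumption, not a measured statistic:

```python
# Back-of-the-envelope scaling of per-query inference energy to an
# annual total. Per-query energy is from the text; the query volume
# below is a hypothetical assumption for illustration only.

wh_per_query = 0.34      # mid-range of the 0.3-0.36 Wh estimate
queries_per_day = 2.5e9  # assumed: "billions of daily interactions"

daily_wh = wh_per_query * queries_per_day
annual_twh = daily_wh * 365 / 1e12  # 1 TWh = 1e12 Wh

print(f"Annual inference energy at this volume: {annual_twh:.2f} TWh")
```

Even at this assumed volume, chat-style queries alone reach a few tenths of a TWh per year; the 15 TWh projection in the text covers generative AI workloads much more broadly (image, video, enterprise inference), so the two figures are not directly comparable.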

This voracious energy appetite puts pressure on electrical grids, drives up prices in regions with a high concentration of data centers, and contributes to CO₂ emissions when electricity comes from fossil fuels. Furthermore, it requires large volumes of water for cooling, exacerbating problems in water-stressed areas.

Despite the criticism, there are promising advances. Techniques such as model distillation, quantization, the use of smaller models for specific tasks, and hardware optimizations can reduce consumption by up to 90% without sacrificing performance. Leading companies are investing in renewable energy and more efficient designs to mitigate the impact.
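To make one of these techniques concrete, here is a minimal, toy sketch of post-training int8 quantization. Real frameworks (PyTorch, TensorFlow Lite) quantize per-tensor or per-channel with calibration data; this simplified version maps a single list of float weights to 8-bit integers symmetrically, which is where the memory and bandwidth savings come from:

```python
# Toy sketch of symmetric post-training int8 quantization, one of the
# efficiency techniques mentioned above. Not a production method: real
# toolchains add calibration, per-channel scales, and fused kernels.

def quantize_int8(weights):
    """Map float weights to int8 values plus a single scale factor."""
    scale = max(abs(w) for w in weights) / 127.0  # symmetric range [-127, 127]
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]

weights = [0.82, -1.27, 0.05, 0.6]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)

# Storage drops from 32 bits to 8 bits per weight; reconstruction error
# per weight is bounded by half the quantization step (scale / 2).
print(q, [round(a, 3) for a in approx])
```

The 4x reduction in weight size (and the cheaper integer arithmetic it enables on supporting hardware) is the mechanism behind the large inference-energy savings the paragraph describes; the exact percentage saved depends on the model and hardware.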

Disclaimer:

The information provided through this channel does not constitute financial advice and should not be construed as such. This content is for purely informational and educational purposes. Financial decisions should be based on a careful evaluation of your own circumstances and consultation with qualified financial professionals. The accuracy, completeness or timeliness of the information provided is not guaranteed, and any reliance on it is at your own risk. Additionally, financial markets are inherently volatile and can change rapidly. It is recommended that you conduct thorough research and seek professional advice before making significant financial decisions. We are not responsible for any loss, damage or consequences that may arise directly or indirectly from the use of this information.
