The AI industry has pursued a development path with high economic and ecological costs. However, recent market innovations show that a more sustainable formula is possible.

Manel Domingo

Urgent, imminent, and inevitable. These three adjectives do not clarify what artificial intelligence is but do capture the feeling that has accompanied it in recent years. More than just a revolutionary technology—which is no small thing—AI has burst into our lives as an unavoidable destiny, overshadowing some fundamental questions. Is AI an end in itself or a means to an end? And if it is a means, what end does it serve? Taking it further, does the end justify the means? 

One of the most publicized purposes of AI is its potential to expand the frontiers of science, particularly in addressing the climate emergency. As a tool, AI enables us to track melting glaciers, organize emissions reduction efforts more effectively in certain sectors, and predict catastrophic climate phenomena with greater accuracy. Yet despite the promise of these applications, AI is far from being an ecological technology.

Rematerializing artificial intelligence

According to Irene Unceta, academic director of the Bachelor in Artificial Intelligence for Business at Esade, it is essential to understand that “AI is, above all, an industry.” However, while other industries conjure images of chimneys, smoke, and pollution, our perception of AI is far from these representations. A quick internet search returns abstract images of cybernetic brains, digital patterns on a blue background, and, admittedly, a few microchips. 

Terms like ‘the cloud’ contribute to dematerializing an industry with a significant ecological impact

“The term ‘industry’ helps us conceptualize a product that didn’t arise from nowhere. It highlights the entire history behind its creation,” agrees Paula Subías, academic collaborator at Esade and professor of the degree course Machine Learning and Sustainability. This term contrasts with others surrounding this technology, such as ‘the cloud,’ which hosts the data feeding AI. In her view, these terms “contribute to the dematerialization of AI, making it seem like something ethereal with no real or tangible impact.” 

Adding to this misconception, from an end-user perspective, AI looks like any other software. The interfaces of ChatGPT or Copilot resemble those of typical computer programs. “This creates an opacity for the public, making everything behind the scenes invisible,” Unceta points out. Hidden in plain sight lies a vast material infrastructure composed of enormous data centers and countless hardware components (microchips and GPUs). These depend on long and complex supply chains whose environmental impact is challenging to audit.

The environmental impact of AI

In the United States, the energy consumption of data centers, which store data, train models, and execute the necessary computations for AI operations, has tripled over the past decade. According to estimates from the US Department of Energy, it is projected to triple again by 2028. Forecasts indicate that annual electricity consumption will reach 325 terawatt-hours (TWh), exceeding the total consumption of countries like Spain, Italy, or the United Kingdom.

Data centers powering AI already consume more energy than entire countries

The commercial race to dominate the AI market has also sharply increased the carbon footprint of major tech companies, which until recently led corporate sustainability efforts. In 2023, Google’s emissions were 48 % higher than in 2019, Microsoft’s rose by 29 % compared to 2020, and Meta’s emissions surged by 66 % from 2021. 

To top it all off, the infrastructure supporting AI is particularly vulnerable to temperature increases caused by climate change. Data centers generate massive amounts of residual heat, requiring enormous amounts of energy and water for cooling. As ambient temperatures rise, their resource consumption becomes even more demanding. It is estimated that by 2027, AI will consume between 4.2 and 6.6 billion cubic meters of water annually—four to six times Denmark’s consumption or half of the United Kingdom’s. 

Bigger is not always better

Much of AI’s excessive energy consumption stems from the type of development pursued so far, driven by the mantra bigger is better. As Unceta explains, the AI “triad” consists of training data, computing power, and algorithms: these three levers create more powerful models. In recent years, the industry has focused heavily on data and, especially, computing power, which has the highest environmental impact.

The returns from scaling up models are diminishing, while resource consumption continues to rise

However, accelerating computational power essentially means relying on brute force to achieve better results rather than on greater efficiency. The evolution of large language models (LLMs) in recent years has revealed a trend of diminishing returns, where the gains from increasing model size become increasingly marginal, while economic and environmental costs continue to rise. 

According to Subías, it is crucial to rethink the excessively short-term perspective used to evaluate LLM performance. “We focus on solving the immediate problem but overlook the long-term economic, social, and environmental sustainability,” she says. “Like any other industry, AI should also be subject to quality controls, impact criteria, and efficiency evaluations,” adds Unceta. 

This short-term mindset also applies to users who benefit from a technology that instantly satisfies our needs with a simple prompt. “AI burns through resources accumulated over many years at an astonishing speed,” Unceta reminds us. “A trivial question consumes a significant amount of water and energy. It’s not about shifting responsibility to individuals, but it’s important to be more aware of the impact.” 

Does DeepSeek change everything?

Despite these concerns, the recent emergence of DeepSeek heralds “a possible paradigm shift,” according to Professor Unceta. This model, developed in China, has managed to rival its American counterparts while using significantly fewer resources—a necessity driven by restrictions on the Chinese industry. Considering the AI triad mentioned earlier, “DeepSeek returns importance to the algorithm and reduces the emphasis on computing power.” 

DeepSeek’s development model could enable Europe to prioritize talent over infrastructure

“In part, the focus on computing power undermined the value of research. DeepSeek has helped restore the importance of the science behind AI,” adds Subías. This strategic shift is not only good news for the planet but also for Europe. “It opens the door for Europe to foster competitive alternatives, overcoming its previous excuse of lagging behind due to a lack of computing power,” she notes. 

“If we seek an effective, efficient, and sustainable way to solve problems, we restore power to academia and public institutions. We no longer need massive investment volumes and can prioritize talent over infrastructure,” Unceta explains. 

Moreover, this shift in focus also opens pathways to address other challenges, such as algorithmic bias and fairness, and model interpretability—issues that the industry has thus far sidelined. “Bringing more minds to solve algorithmic challenges allows us to advance beyond just environmental sustainability,” concludes the professor.

All written content is licensed under a Creative Commons Attribution 4.0 International license.