
The rapid expansion of artificial intelligence is accelerating energy demand significantly, says Richard Collins
Every online interaction depends on processing information stored in remote servers housed in data centres worldwide. This infrastructure consumes vast amounts of energy, currently accounting for approximately 1% to 1.5% of global electricity use, according to the International Energy Agency (IEA). In at least five U.S. states, data centres already exceed 10% of total electricity consumption, while in Ireland they account for over 20% of all metered electricity use. The rapid expansion of artificial intelligence (AI) is accelerating this energy demand significantly.
AI is incredibly energy-hungry because it demands massive computing power to train and run models. Training feeds a model vast quantities of data so it can learn how to behave, while the operational phase (inference) lets users interact with it by submitting prompts and receiving responses; both are highly energy intensive.
According to Scientific American, if current AI trends continue, NVIDIA is expected to ship 1.5 million AI server units per year by 2027. Running at full capacity, these servers would consume at least 85.4 terawatt-hours of electricity annually, more than the total energy consumption of many small countries.
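That headline figure can be sanity-checked with simple arithmetic. The sketch below assumes each AI server draws roughly 6.5 kW continuously at full capacity; the per-server power figure is an assumption used for illustration, not a number stated in this article.

```python
# Back-of-the-envelope check of the projected AI server fleet's energy use.
SERVERS = 1_500_000          # units shipped per year by 2027 (cited projection)
POWER_PER_SERVER_KW = 6.5    # assumed continuous draw per server (illustrative)
HOURS_PER_YEAR = 24 * 365    # 8,760 hours

annual_kwh = SERVERS * POWER_PER_SERVER_KW * HOURS_PER_YEAR
annual_twh = annual_kwh / 1e9  # 1 TWh = 1 billion kWh

print(f"~{annual_twh:.1f} TWh per year")  # ~85.4 TWh, matching the cited figure
```

Under that assumption the fleet lands almost exactly on the cited 85.4 terawatt-hours, which is why the comparison to small countries holds up.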
The issue is that we already take AI for granted without considering what powers its ability to handle millions of requests daily. The number of AI queries is skyrocketing, with large language models (LLMs) such as GPT-4, the model behind ChatGPT, processing them, and even higher volumes are expected in the future. Yet this intensive computation remains largely out of sight and out of mind.
Cooling Requirements
It’s not just the soaring energy consumption; AI’s significant power use also generates immense heat, requiring efficient cooling systems to prevent server overheating. Most cooling methods rely on large amounts of clean, fresh water. Yet only 3% of the Earth’s water is fresh, and 2.5% of the total is locked away in glaciers and polar ice caps, making it inaccessible. This leaves a mere 0.5% of the planet’s water supply available to support a growing global population while also meeting the increasing demands of agriculture and industry.
A report from the University of Oxford highlights that even more energy is required to cool this water, which is either sprayed into the air flowing past the servers or evaporated to dissipate heat. Unless designed as a closed-loop system, this process results in both energy consumption and water loss.
A relatively small 1-megawatt data centre, using enough electricity to power around 1,000 homes, can consume 26 million litres of water per year through traditional cooling methods.
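Those two figures imply a rough water intensity for evaporative cooling. The sketch below derives it from the numbers in the text; it assumes the 1 MW facility runs continuously all year, which is an assumption for illustration.

```python
# Derive an approximate water-per-energy ratio from the article's figures.
FACILITY_POWER_KW = 1_000           # a 1-megawatt data centre
HOURS_PER_YEAR = 24 * 365           # assumes continuous, year-round operation
WATER_LITRES_PER_YEAR = 26_000_000  # cited annual water consumption

annual_kwh = FACILITY_POWER_KW * HOURS_PER_YEAR
litres_per_kwh = WATER_LITRES_PER_YEAR / annual_kwh

print(f"~{litres_per_kwh:.1f} litres of water per kWh")  # ~3.0 litres per kWh
```

In other words, every kilowatt-hour of computing in such a facility costs on the order of three litres of fresh water.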
As LLMs continue to grow in popularity and more companies begin to develop their own models, these environmental concerns will only be compounded. Yet there is reason to be optimistic: efforts are being made to limit the environmental impact of LLMs, and of AI in general.
Awareness is increasing, and efforts to mitigate AI’s environmental footprint are already in motion. These include improving data centre sustainability, enhancing hardware and software efficiency, and promoting responsible AI development. By prioritising sustainability alongside technological progress, we can work toward a future where innovation and environmental responsibility go hand in hand.
Small Language Models (SLMs)
Increasingly, the answer may lean towards the precision and efficiency of small language models (SLMs). Tailored for specific business domains ranging from IT to customer support, SLMs offer targeted, actionable insights, representing a more practical approach for enterprises focused on real-world value over computational prowess.
Large language models like GPT-4 are revolutionising enterprises by automating complex tasks such as customer service, providing rapid, human-like responses that enhance user experiences. However, their broad training on diverse internet datasets can limit customisation for specific business needs.
As discussed, LLMs demand vast computational resources for both training and inference, leading to high energy consumption and substantial CO₂ emissions. Additionally, they contribute to the growing water crisis, as data centres require significant amounts of water for cooling, further straining an already limited resource.
In contrast, SLMs are trained on more focused datasets tailored to the specific needs of individual enterprises. This targeted approach reduces inaccuracies and minimises the risk of generating irrelevant or incorrect information. When fine-tuned for specific domains, SLMs can achieve language understanding comparable to that of LLMs, making them highly effective for natural language processing tasks that require deep contextual comprehension.
Carbon Emissions
The latest advancements in SLMs mark a breakthrough in energy-efficient, cost-effective AI technology, reshaping how businesses and developers approach artificial intelligence. These smaller models require significantly less energy to train and operate, leading to lower computational demands and reduced greenhouse gas emissions. In many cases, SLMs can run locally on smartphones, laptops, or personal computers, often eliminating the need for water-intensive cooling systems used in large-scale data centres.
The CO₂ difference between small language models and large language models is substantial, with SLMs generally providing a more sustainable alternative due to their lower energy consumption and reduced carbon emissions. While LLMs showcase remarkable capabilities, their high environmental impact raises concerns about long-term sustainability. As AI continues to evolve, prioritising energy-efficient models and adopting sustainable practices across the AI lifecycle is essential. This includes optimising data centre energy use, promoting renewable energy adoption, and minimising electronic waste.
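To make that difference concrete, a rough estimate multiplies energy use by a grid emissions factor. Every number in the sketch below is an assumption chosen for illustration: the emissions factor (~0.4 kg CO₂ per kWh, near the global grid average), the workload size, and the per-query energy figures are not taken from this article.

```python
# Illustrative CO2 comparison between an LLM and an SLM serving the same workload.
# All per-query energy figures are rough assumptions, not measured values.
GRID_KG_CO2_PER_KWH = 0.4     # assumed grid emissions factor (~global average)
QUERIES_PER_DAY = 1_000_000   # hypothetical workload
LLM_WH_PER_QUERY = 3.0        # assumed energy per LLM query (order of magnitude)
SLM_WH_PER_QUERY = 0.1        # assumed energy per on-device SLM query

def daily_co2_kg(wh_per_query: float) -> float:
    """Daily CO2 in kg for the workload at a given per-query energy cost."""
    return QUERIES_PER_DAY * wh_per_query / 1000 * GRID_KG_CO2_PER_KWH

print(f"LLM: {daily_co2_kg(LLM_WH_PER_QUERY):,.0f} kg CO2/day")
print(f"SLM: {daily_co2_kg(SLM_WH_PER_QUERY):,.0f} kg CO2/day")
```

Even with generous assumptions, the per-query energy gap compounds across millions of daily queries, which is where the sustainability case for SLMs comes from.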
The world has seen a significant shift in recent years, particularly in attitudes toward corporate social responsibility (CSR) and the environment. Increasing awareness of AI’s environmental impact is driving efforts to make this transformative technology more sustainable. When weighing the choice between SLMs and LLMs, it’s crucial to consider their environmental implications and advocate for responsible AI development. By making informed decisions and supporting sustainable innovation, we can harness the power of AI while minimising its ecological footprint.
Sustainability is a shared responsibility – employees, businesses, stakeholders, and consumers all play a role in shaping the AI revolution. With greater awareness and understanding, we can choose how to engage with AI in a way that aligns with our environmental values.
Whatever the future holds, both SLMs and LLMs will likely be a part of it: each is suited to distinct use cases, though their roles will sometimes overlap.
Take our survey: Where are you on your sustainability journey? https://csra-roadmap.herokuapp.com/
We also have an interactive sustainability toolkit. The FREE download contains 4 vital tools to identify, record, measure, and report your environmental and sustainability initiatives. https://csr-accreditation.co.uk/wp-content/uploads/2023/05/CSR-A_2d-Get-Started-Tool-Kit-05-23-INTERACTIVE-1.pdf
Sources – March 2025
https://www.scientificamerican.com/article/the-ai-boom-could-use-a-shocking-amount-of-electricity
https://www.iea.org/topics/artificial-intelligence
https://eng.ox.ac.uk/case-studies/the-true-cost-of-water-guzzling-data-centres
https://sustainabilitymag.com/articles/how-microsoft-ibms-slms-are-making-ai-more-sustainable
https://cee.illinois.edu/news/AIs-Challenging-Waters
(with thanks to Andrew Kirkley of The Academy World for contributions and advice)