AI and Energy Consumption: Can Data Centres Sustain the AI Boom?

AI training and inference consume enormous amounts of electricity. We examine the energy challenge, sustainability concerns, and the technologies that could solve the crisis.

The energy demands of artificial intelligence are growing at a pace that has caught even the technology industry by surprise. Training a single frontier language model can consume as much electricity as a small town uses in a year. Globally, data centres already account for approximately 1-2% of total electricity consumption, and AI is driving that figure upward rapidly. This creates both a sustainability challenge and a practical constraint on AI development.

The Scale of AI Energy Demand

The International Energy Agency estimates that data centre electricity consumption could double by 2028, driven primarily by AI workloads. A single NVIDIA H100 GPU consumes 700 watts under full load. An AI training cluster with 10,000 such GPUs, a typical scale for frontier model training, draws 7 megawatts from the GPUs alone, enough to power approximately 5,000 homes. Once cooling, networking, and storage overheads are added, and a facility hosts multiple such clusters, total site power can exceed 100 megawatts. Microsoft, Google, and Meta have each disclosed that their carbon emissions have increased significantly as AI operations have scaled.
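The cluster figures above reduce to simple arithmetic. The sketch below reproduces them; the 1.4 kW average household draw is our illustrative assumption (the article states only the 700 W and 10,000-GPU figures).

```python
# Back-of-envelope check of the cluster power figures above.
# Assumptions: 700 W per H100 under full load (from the article),
# 10,000 GPUs (from the article), 1.4 kW average household draw
# (our illustrative assumption).

GPU_POWER_W = 700
NUM_GPUS = 10_000
HOME_DRAW_KW = 1.4  # assumed average household draw

# Total GPU-only load for the cluster, in megawatts
cluster_mw = GPU_POWER_W * NUM_GPUS / 1_000_000

# Equivalent number of homes at the assumed average draw
homes = (GPU_POWER_W * NUM_GPUS) / (HOME_DRAW_KW * 1000)

print(f"GPU-only cluster load: {cluster_mw:.1f} MW")  # 7.0 MW
print(f"Equivalent homes: {homes:,.0f}")              # 5,000
```

Note that this counts only the GPUs; real facility draw is higher once cooling and supporting infrastructure are included.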

The Sustainability Dilemma

Major technology companies have made ambitious carbon neutrality commitments, but AI is making those pledges increasingly difficult to honour. Google reported a 48% increase in greenhouse gas emissions over five years, driven largely by AI data centre expansion. Microsoft acknowledged that its emissions rose 30% as AI workloads grew. The tension between AI ambitions and sustainability goals is creating pressure to find energy solutions that can scale as fast as AI demand.

Potential Solutions

Several approaches are being pursued in parallel. Renewable energy procurement is the most immediate lever, with companies signing long-term power purchase agreements for solar, wind, and geothermal energy. Nuclear power, particularly small modular reactors, is attracting significant interest from technology companies seeking reliable, carbon-free baseload power. Microsoft has signed an agreement to restart a nuclear reactor at Three Mile Island specifically for data centre power. On the efficiency side, more efficient chip architectures, better cooling technologies including liquid cooling, and algorithmic improvements that reduce compute requirements per task all help moderate energy growth.
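One common way to quantify the cooling and infrastructure gains mentioned above is power usage effectiveness (PUE), the ratio of total facility power to IT power. A minimal sketch, using illustrative PUE values we assume here (roughly 1.6 for conventional air cooling, 1.2 for liquid cooling; the article does not cite specific figures):

```python
# Illustrative impact of cooling efficiency on total facility power,
# using PUE (power usage effectiveness) = facility power / IT power.
# The PUE values below are our assumptions for illustration.

def facility_power_mw(it_power_mw: float, pue: float) -> float:
    """Total facility draw (MW) for a given IT load and PUE."""
    return it_power_mw * pue

it_load = 7.0  # MW, the 10,000-GPU cluster discussed earlier

air_cooled = facility_power_mw(it_load, 1.6)     # assumed air-cooled PUE
liquid_cooled = facility_power_mw(it_load, 1.2)  # assumed liquid-cooled PUE

print(f"Air-cooled facility:    {air_cooled:.1f} MW")
print(f"Liquid-cooled facility: {liquid_cooled:.1f} MW")
print(f"Power saved:            {air_cooled - liquid_cooled:.1f} MW")
```

Even at these rough numbers, the gap between cooling regimes is measured in megawatts per cluster, which is why liquid cooling features so prominently in the efficiency discussion.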

What This Means for the Industry

Energy availability is becoming a binding constraint on AI scaling. Companies that secure reliable, affordable power will have a structural advantage in AI development. Regions with abundant clean energy, including Scandinavia, Quebec, and parts of India with strong solar resources, are becoming attractive data centre locations. For companies like QverLabs that build AI applications rather than infrastructure, energy-efficient model design is both an environmental responsibility and a business necessity. Our focus on optimised inference pipelines reflects the reality that compute efficiency and energy efficiency are increasingly the same thing.