1 October 2024

AI Monthly: Hungry for power

The rise of generative AI has led to a massive increase in demand for computing power and data centre capacity, potentially doubling data centres' power use by 2026. This growth pushes up carbon emissions, and AI models vary widely in energy efficiency. A targeted tax on data centres' electricity use could encourage a shift to low-carbon energy and help curb emissions.

Generative AI: hungry for data and energy

Generative AI models are everywhere, from text to image and video generation. But as AI becomes more embedded in everyday applications, the energy footprint of these models is a critical factor to consider. And not every AI model is equally efficient. A recent study compared the energy efficiency of different Machine Learning (ML) models during their use. The least efficient image generation models can use as much energy as 522 smartphone charges per image, while the most efficient text generation models use as little energy as 9% of a full smartphone charge for 1,000 uses.

Data centres: the powerhouses behind our data-hungry AI systems

For AI systems to work at all, you need the right infrastructure. This is where data centres come into play. They provide the essential infrastructure to store, distribute and process the data that AI systems rely on, and all of that consumes energy. The International Energy Agency (IEA) estimates that data centres and data transmission networks are each responsible for 1-1.5% of global electricity use.

By 2026, power demand from data centres could reach 1,000 terawatt-hours (TWh), double the 2022 level, which was itself roughly on a par with Germany’s annual power demand (512 TWh). No wonder companies like OpenAI are lobbying the US government for large data centres to support advanced AI systems. Multiple 5-gigawatt data centres in the US are on the wish list. One 5-gigawatt data centre alone, operating continuously over a year, would consume approximately 43.8 TWh of electricity, about as much as New Zealand uses in a year and equivalent to the annual output of roughly five nuclear reactors.
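As a quick plausibility check on that figure, the arithmetic is straightforward. The assumptions below (continuous full-load operation, 8,760 hours per year, and roughly 8.76 TWh of annual output for a large reactor) are illustrative, not taken from the sources cited above:

```python
# Back-of-envelope check: a 5 GW data centre running continuously for a year.
# Assumptions (illustrative only): constant 5 GW load, 8,760 hours per year,
# ~8.76 TWh annual output for a large (~1 GW) reactor at full utilisation.
power_gw = 5
hours_per_year = 24 * 365                        # 8,760 hours

energy_twh = power_gw * hours_per_year / 1_000   # 1 TWh = 1,000 GWh
print(f"Annual consumption: {energy_twh:.1f} TWh")             # -> 43.8 TWh
print(f"Equivalent large reactors: {energy_twh / 8.76:.1f}")   # -> 5.0
```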

Why massive data centres?

The more powerful the data centre, the more advanced AI systems it can support. As the demand for data processing and storage continues to rise, the need for massive data centre capacity becomes increasingly critical. Larger data centres can handle the computing requirements of faster and more complex data processing, while also providing better redundancy and reliability.

Economically, large data centres benefit from economies of scale, reducing the cost per unit of computing power and spreading operating costs over a larger output.

Higher output means more energy consumption and thus more carbon emissions

However, increased output means higher energy consumption and thus more carbon emissions. The IEA estimates that data centres and data transmission networks currently account for around 1% of energy-related greenhouse gas (GHG) emissions. While this figure may seem small, without smart energy management, AI’s energy consumption and emissions are likely to rise in the coming years. Additionally, the trend towards general-purpose models, which are less energy-efficient than fine-tuned, task-specific models, exacerbates this issue. Tasks that generate longer outputs result in higher emissions. As a result, the growing use of large language models in user-facing applications is contributing to an increase in emissions, as shown by Luccioni et al.

A recent IMF post suggests that a targeted tax on the electricity use of data centres, set at $0.032 per kilowatt-hour (or $0.052 when air pollution costs are included), could raise $18bn annually while helping to limit emissions. In theory, such a tax would prompt a switch to low-carbon or sustainable energy sources by raising the cost of CO2 emissions. The IMF's approach is to translate a carbon tax consistent with the 2-degree Celsius target into an equivalent levy on electricity use, which is how it arrives at the $0.032 per kWh needed to internalise the climate damage of data centres' electricity consumption. Currently, however, many data centres enjoy tax exemptions or benefit from incentives, which weakens the immediate pressure to curb emissions.
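As a rough sense-check of the scale involved, the implied consumption figure below is my own inference from the numbers quoted above, not a figure from the IMF post:

```python
# How much taxed electricity use would a $0.032/kWh levy need to cover
# to raise $18bn a year? (Own back-of-envelope inference, for illustration.)
tax_usd_per_kwh = 0.032
annual_revenue_usd = 18e9

implied_twh = annual_revenue_usd / tax_usd_per_kwh / 1e9   # 1 TWh = 1e9 kWh
print(f"Implied taxed consumption: {implied_twh:.1f} TWh per year")  # -> 562.5 TWh
```

That implied base of roughly 560 TWh a year sits between the 2022 data centre consumption and the 2026 projection cited above, so the orders of magnitude are at least consistent.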

Beware of food ‘data’ envy

I cannot conclude this AI monthly without addressing the regulatory landscape. Because when you're hungry for data, you may reach into the fridge for data that you didn't put there. This is precisely what the Irish Data Protection Commission (DPC) is investigating with Google Ireland Limited. On 12 September, the DPC launched a cross-border investigation to determine if Google’s training of its AI model PaLM2 complied with data protection regulations, particularly regarding the processing of personal data. The verdict – of course – is still pending.

On a different note, more than 100 companies, including Google, have signed the EU AI Pact. This means they have made voluntary commitments to start implementing the requirements of the AI Act ahead of the legal deadlines, most of which apply two years after the Act entered into force.

In any case, this won’t be the last time we see a clash between the insatiable appetite of AI and the necessity for privacy regulations.

Content Disclaimer
This publication has been prepared by ING solely for information purposes irrespective of a particular user's means, financial situation or investment objectives. The information does not constitute an investment recommendation, nor is it investment, legal or tax advice or an offer or solicitation to purchase or sell any financial instrument.