Artificial intelligence (AI) is transforming the energy landscape and accelerating the transition towards new energy. The future of energy demands companies capable of offering multiple, flexible sources of energy and of making them accessible at speed to meet the needs of customers across both traditional and new sources of energy.
This is why we believe the energy transition will likely create highly attractive opportunities, spanning both incumbent infrastructure and, selectively, new infrastructure deployments. In this regard, we intend to leverage AI to draw valuable insights and make data-driven decisions from extremely large data sets that can only be analysed computationally to reveal patterns, trends and associations. AI is clearly reshaping the way we work, think and live, but it doesn't stop there. AI is not just about efficiency and productivity; it is about empowering human potential, freeing up our time and energy so we can focus on what truly matters: our human qualities. Together, humans and machines are forging a new era of collaboration.
Creating a sustainable energy future
Globally and locally, the energy sector is transforming, driven by fundamental shifts in policy, technology, and economic and environmental demands. The industry is evolving from a predictable, vertically integrated model based on centralised generation flowing in a single direction towards a decentralised, modular model based on bidirectional flows of power enabled by smart metering. This introduces new players to the industry and an unfolding series of demand-centric, value-adding applications. From GenAI to machine learning, AI tools can help us develop game-changing, market-ready solutions: coordinating shifts between the available energy sources (driven by weather, demand patterns, plant availability and so on) to balance supply and demand seamlessly, second by second, and in the process reducing waste and carbon emissions by optimising the cumulative energy resources available.
Challenges of the energy transition
One of the greatest challenges of the energy transition is keeping the power grid stable and balancing the production and consumption of electricity. This is because wind and solar energy are subject to significant fluctuations, and they are increasingly fed into the transmission grid on a decentralised basis via millions of installations.
Intelligent power grids, also known as smart grids, offer a solution here. While power grids previously had the main task of transmitting the electricity produced in large central power plants and distributing it to consumers, a smart grid also fulfils the function of a data network that connects centralised and decentralised production facilities with flexible electricity consumers.
Smart grids are based on intelligent measuring systems, known as smart meters. These combine a digital electricity meter with a communication module, the smart meter gateway, so that power consumption and production are measured in real time at the level of individual installations. The data on network statuses is transferred continuously into the data network, where it is managed, bundled and forwarded by the gateway administrator, for example to the energy supplier. The meter operator takes on the role of the gateway administrator; this is generally the transmission system operator (TSO) or the local network operator.
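As a rough illustration of the data a smart meter gateway forwards, the sketch below models a single interval reading and bundles it into a payload. The field names and JSON structure are hypothetical, chosen for readability rather than taken from any metering standard.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json

@dataclass
class MeterReading:
    """One interval reading reported by a smart meter via its gateway."""
    meter_id: str            # identifier of the physical meter
    interval_end_utc: str    # end of the measurement interval (ISO 8601)
    consumption_kwh: float   # energy drawn from the grid during the interval
    production_kwh: float    # energy fed back in (e.g. from rooftop solar)

def to_gateway_payload(readings):
    """Bundle readings the way a gateway administrator might forward them."""
    return json.dumps({"readings": [asdict(r) for r in readings]}, indent=2)

reading = MeterReading(
    meter_id="METER-0001",
    interval_end_utc=datetime(2024, 6, 1, 12, 15, tzinfo=timezone.utc).isoformat(),
    consumption_kwh=0.42,
    production_kwh=0.10,
)
print(to_gateway_payload([reading]))
```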
Why AI and energy are the new power couple
Power systems are becoming vastly more complex as demand for electricity grows and decarbonization efforts ramp up. In the past, grids directed energy from centralised power stations. Now, power systems increasingly need to support multi-directional flows of electricity between distributed generators, the grid and users. The rising number of grid-connected devices, from electric vehicle (EV) charging stations to residential solar installations, makes flows less predictable. Meanwhile, links are deepening between the power system and the transportation, building and industrial sectors. The result is a vastly greater need for information exchange, and more powerful tools to plan and operate power systems as they keep evolving.
This need arrives just as the capabilities of AI applications are rapidly progressing. As machine learning models have become more advanced, the computational power required to develop them has doubled every five to six months since 2010. AI models can now reliably provide language or image recognition, transform audio sounds into analysable data, power chatbots and automate simple tasks. AI mimics aspects of human intelligence by analysing data and inputs, generating outputs more quickly and at greater volume than a human operator could. Some AI algorithms are even able to self-programme and modify their own code.
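Taken at face value, a doubling period of five to six months compounds dramatically. The short calculation below simply works out the implied multiplier over a decade; it is arithmetic on the figure quoted above, not an independent estimate.

```python
import math

# Implied growth in training compute if it doubles every 5-6 months.
for doubling_months in (5, 6):
    doublings_per_decade = 120 / doubling_months
    growth = 2 ** doublings_per_decade
    print(f"Doubling every {doubling_months} months -> "
          f"x{growth:.2e} over 10 years (~10^{math.log10(growth):.0f})")
```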
It is therefore unsurprising that the energy sector is taking early steps to harness the power of AI to boost efficiency and accelerate innovation. The technology is uniquely placed to support the simultaneous growth of smart grids and the massive quantities of data they generate. Smart meters produce and send several thousand times more data points to utilities than their analogue predecessors. New devices for monitoring grid power flows funnel more than an order of magnitude more data to operators than the technologies they are replacing. And the global fleet of wind turbines is estimated to produce more than 400 billion data points per year.
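To put the 400 billion data points per year in perspective, a back-of-envelope calculation is shown below. The fleet size is an assumption made purely for illustration, not a sourced figure.

```python
# What 400 billion wind turbine data points per year implies, roughly.
data_points_per_year = 400e9
assumed_turbines = 300_000          # illustrative assumption, not a sourced count
seconds_per_year = 365 * 24 * 3600

per_turbine_per_year = data_points_per_year / assumed_turbines
fleet_rate_per_second = data_points_per_year / seconds_per_year

print(f"~{per_turbine_per_year:,.0f} data points per turbine per year")
print(f"~{fleet_rate_per_second:,.0f} data points per second across the fleet")
```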
This volume is a key reason energy firms see AI as an increasingly critical resource. A recent estimate suggests that AI already serves more than 50 different uses in the energy system, and that the market for the technology in the sector could be worth up to USD 13 billion.
AI and machine learning can unlock flexibility by forecasting supply and demand
One of the most common uses for AI by the energy sector has been to improve predictions of supply and demand. Developing a greater understanding of both when renewable power is available and when it’s needed is crucial for next-generation power systems. Yet this can be complicated for renewable technologies, since the sun doesn’t always shine, and the wind doesn’t always blow.
That’s where machine learning can play a role. It can help match variable supply with rising and falling demand, maximising the financial value of renewable energy and allowing it to be integrated more easily into the grid.
Wind power output, for example, can be forecast using weather models and information on the location of turbines. However, deviations in wind flow can lead to output levels that are higher or lower than expected, pushing up operational costs. To address this, Google and its AI subsidiary DeepMind developed a neural network in 2019 to increase the accuracy of forecasts for its 700 MW renewable fleet. Based on historical data, the network developed a model to predict future output up to 36 hours in advance with much greater accuracy than was previously possible.

This greater visibility allows Google to sell its power in advance, rather than in real time. The company has stated that this, along with other AI-facilitated efficiencies, has increased the financial value of its wind power by 20%. Higher prices also improve the business case for wind power and can drive further investment in renewables. Notably, Google's proprietary software is now being piloted by a major energy company.

In addition, with a more accurate picture of peaks in output, companies like Google can shift flexible consumption, such as heavy computing loads, to coincide with those peaks. Doing so avoids the need to buy additional power from the market. This capacity, if expanded more widely, could have a significant impact on load shifting and peak shaving, especially if combined with better demand forecasts. For example, Swiss manufacturer ABB has developed an AI-enabled energy demand forecasting application that allows commercial building managers to avoid peak charges and benefit from time-of-use tariffs.
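The forecasting approach described above can be sketched in a few lines of code. The example below is a generic illustration on synthetic data using scikit-learn, not Google's or DeepMind's actual model: a regressor is trained on weather features that would be available 36 hours in advance and used to predict wind farm output.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic "historical" data: forecast weather features for t+36h and the
# output actually observed at that time. A real system would use archived
# numerical weather prediction runs and metered generation.
n = 5_000
wind_speed = rng.uniform(0, 25, n)          # m/s, forecast for t+36h
wind_dir = rng.uniform(0, 360, n)           # degrees
air_density = rng.normal(1.225, 0.03, n)    # kg/m^3
X = np.column_stack([wind_speed, wind_dir, air_density])

# An idealised, capped power curve plus noise stands in for observed output (MW).
y = np.clip(0.05 * wind_speed**3, 0, 700) * (air_density / 1.225)
y = y + rng.normal(0, 20, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

mae = mean_absolute_error(y_test, model.predict(X_test))
print(f"Mean absolute error of the 36-hour-ahead forecast: {mae:.1f} MW")
```

With a forecast of this kind in hand, an operator can commit power in advance rather than selling in real time, which is the mechanism behind the value uplift described above.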
AI can also prevent grid failures, increasing reliability and security
Another key AI application is predictive maintenance, where the performance of energy assets is continuously monitored and analysed to identify potential faults ahead of time. Maintenance typically happens on a regular schedule; poles on a transmission line, for example, might be examined once within a pre-defined period and repairs carried out as needed. This one-size-fits-all approach can lead to inefficiencies if maintenance happens too early or, more problematically, too late.
To address this, a range of utilities are developing AI-enabled schemes to help monitor physical assets and use past data on performance and outages to predict when intervention is required. Utility company E.ON, for instance, has developed a machine learning algorithm to predict when medium-voltage cables in the grid need to be replaced, using data from a range of sources to identify patterns in electricity generation and flag any inconsistencies. E.ON's research suggests that predictive maintenance could reduce outages in the grid by up to 30% compared with a conventional approach.

Similarly, in 2019 Italy-based utility Enel began installing sensors on power lines to monitor vibration levels. Machine learning algorithms allowed Enel to identify potential problems from the resulting data and discern what caused them. As a result, Enel has been able to reduce the number of power outages on these cables by 15%. Meanwhile, Estonian technology startup Hepta Airborne uses a machine learning platform with drone footage of transmission lines to identify defects, and State Grid Corporation of China uses AI extensively to carry out actions such as analysing data from smart meters to identify problems with customers' equipment.
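A minimal sketch of the predictive-maintenance idea is shown below: an unsupervised anomaly detector is fitted to historical line-sensor readings and used to flag unusual vibration patterns for inspection. This is an illustrative toy on synthetic data, not E.ON's or Enel's production system.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)

# Simulated sensor history for a line section under normal conditions:
# vibration amplitude (arbitrary units) and ambient temperature (deg C).
normal_history = np.column_stack([
    rng.normal(1.0, 0.1, 2_000),
    rng.normal(15.0, 5.0, 2_000),
])

detector = IsolationForest(contamination=0.01, random_state=1).fit(normal_history)

# New readings: the last two show abnormally high vibration, the kind of
# pattern that would prompt an inspection before a fault develops.
new_readings = np.array([
    [1.02, 14.0],
    [0.95, 18.0],
    [2.80, 16.0],
    [3.10, 15.0],
])
for reading, flag in zip(new_readings, detector.predict(new_readings)):
    status = "inspect" if flag == -1 else "ok"   # -1 = anomaly, 1 = normal
    print(reading, status)
```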
Potential uses for AI across power systems are likely to soar in the years to come. In addition to better forecasting of energy supply and demand and predictive maintenance of physical assets, applications could include:
- Managing and controlling grids, using an array of data from sensors, smart meters and other internet-of-things devices to observe and control the flow of power in the network, particularly at the distribution level.
- Facilitating demand response, using a range of processes such as forecasting electricity prices, scheduling and controlling responsive loads, and setting dynamic pricing (see the sketch below).
- Providing improved or expanded consumer services, using AI or machine learning in apps and online chatbots to improve customers' billing experience, for instance. Firms such as Octopus Energy and Oracle Utilities are already exploring this.
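As an illustration of the demand-response item above, the sketch below schedules a flexible load into the cheapest hours of a forecast price curve. The prices are synthetic and the load is hypothetical; real demand-response platforms add constraints such as deadlines, ramp rates and network limits.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic day-ahead price forecast for 24 hours (EUR/MWh).
prices = 60 + 30 * np.sin(np.linspace(0, 2 * np.pi, 24)) + rng.normal(0, 5, 24)

# A flexible load (say, an EV fleet) needs 6 hours of charging at 2 MW.
hours_needed, load_mw = 6, 2.0

cheapest_hours = sorted(np.argsort(prices)[:hours_needed].tolist())
scheduled_cost = prices[cheapest_hours].sum() * load_mw
naive_cost = prices[:hours_needed].sum() * load_mw   # charging as soon as plugged in

print(f"Charge during hours {cheapest_hours}")
print(f"Cost when scheduled on price: EUR {scheduled_cost:,.0f}")
print(f"Cost when charged immediately: EUR {naive_cost:,.0f}")
```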
The technology will enable digitalization, but addressing risk is also essential
Without AI, system operators and utilities will only be able to make effective use of a fraction of the new data sources and processes offered by emerging digital technologies, and they will miss out on a significant proportion of the benefits on offer. However, risks associated with AI must also be considered and addressed before the technology is scaled across the sector. These include, but are not limited to, threats to cybersecurity and privacy, the influence of biases or errors in data, and miscorrelations due to insufficient training data or coding mistakes.
The availability of workers with the right skills is a significant challenge for any sector looking to tap AI’s potential. Across the global workforce, AI and machine learning specialists are the profession experiencing the fastest growth in demand, creating a recruitment bottleneck. In June 2022, there were only 22 000 AI specialists globally across all industries, and 61% of large firms surveyed in the United Kingdom and United States reported lacking staff with sufficient AI experience. The energy industry will need to compete to recruit the best data scientists and programmers, while firms looking to retain staff that understand the sector should consider uptraining and reskilling parts of their existing workforce. Digital training courses, supported by governments with input from the private sector, will be vital to these efforts. However, the availability and quality of such courses is not yet consistent across the largest global economies.
AI also uses more energy than other forms of computing, a crucial consideration as the world seeks to build a more efficient energy system. Training a single model uses more electricity than 100 US homes consume in an entire year. In 2022, Google reported that machine learning accounted for about 15% of its total energy use over the prior three years. However, data on AI's energy use and wider environmental impacts is not systematically collected, and there is a need for greater transparency and tracking, especially as models grow. The most efficient computing infrastructure and AI algorithms should be prioritised to prevent AI's own consumption from offsetting the efficiency gains it enables.
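The household comparison above can be sanity-checked with simple arithmetic. The average-household figure below is an approximation used only for illustration.

```python
# Rough arithmetic behind "more electricity than 100 US homes use in a year".
avg_us_home_kwh_per_year = 10_500      # approximate figure, for illustration only
homes = 100
threshold_gwh = homes * avg_us_home_kwh_per_year / 1e6
print(f"100 homes for a year ~ {threshold_gwh:.1f} GWh, so the claim implies "
      "a single training run consuming more than roughly that amount")
```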
Furthermore, increased use of automated and self-learning software raises questions about who is responsible for the outputs or outcomes of these systems. Operators frequently purchase AI technology or a related service from IT companies and startups. This can result in decision making on electricity balancing or investments, for example, based on models they do not understand or control, leading to questions about accountability for public spending, energy prices or outages.
In an effort to address some of these issues, the OECD AI Principles, adopted in 2019 by OECD member governments and many non-member governments, provide guidance on pursuing a human-centric approach to trustworthy AI. Clearer national, regional and international frameworks may also be necessary, given that the energy sector underpins the global economy and is crucial to meeting climate goals. The European Union’s AI Act, first proposed in 2021 and currently under negotiation by EU institutions and member states, aims to develop better conditions for the technology’s development and use while guaranteeing robust protections for the environment, among other goals.
For AI to be an effective ally towards efficient, decarbonised and resilient power systems, governments will also need to develop mechanisms for data sharing and governance. A coordinated global approach can enable internationally applicable and replicable solutions, transfer learnings globally, and expedite the energy transition while reducing its costs.
Energy demand from AI
AI models are trained and deployed mainly in data centres. Understanding the role of data centres as actors in the energy system first requires an understanding of their component parts. Data centres are facilities used to house servers, storage systems, networking equipment and associated components, installed in racks and organised into rows. This IT equipment, together with the auxiliary equipment required to keep it in working order, comprises the following:
- Servers are computers that process and store data. They can be equipped with central processing units (CPUs) and specialised accelerators such as graphics processing units (GPUs). On average they account for around 60% of electricity demand in modern data centres, although this varies greatly between data centre types.
- Storage systems are devices used for centralised data storage and backup, and account for around 5% of electricity consumption.
- Networking equipment includes switches to connect devices within the data centre, routers to direct traffic and load balancers to optimise performance. It accounts for up to 5% of electricity demand.
- Cooling and environmental control refers to equipment that regulates temperature and humidity to keep IT equipment operating at optimal conditions. The share of cooling systems in total data centre consumption varies from about 7% for efficient hyperscale data centres to over 30% for less-efficient enterprise data centres.
- Uninterruptible power supply (UPS) batteries and backup power generators keep the data centre powered during outages. Both are rarely used, but they are necessary to ensure the extremely high levels of reliability that data centres must meet.
- Other infrastructure, such as lighting and office equipment for on-site staff.
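Using the component shares listed above, the sketch below breaks down the electricity demand of a hypothetical 10 MW facility. The cooling and "other" shares are illustrative picks within the ranges quoted, chosen so that the shares sum to 100%.

```python
# Illustrative breakdown of a hypothetical 10 MW data centre, using the
# approximate component shares quoted above (real shares vary widely).
facility_mw = 10.0
shares = {
    "servers": 0.60,
    "storage": 0.05,
    "networking": 0.05,
    "cooling": 0.20,               # between ~7% (hyperscale) and >30% (enterprise)
    "other infrastructure": 0.10,  # remainder so the illustrative shares sum to 1
}
assert abs(sum(shares.values()) - 1.0) < 1e-9

for component, share in shares.items():
    print(f"{component:>20}: {share * facility_mw:4.1f} MW ({share:.0%})")
```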
The share of these different components in data centre electricity consumption varies greatly by type of data centre, depending on the nature and efficiency of the equipment they have installed. Data centres, at least at the scale seen today, are relatively new actors in the energy system at the global level. Today, electricity consumption from data centres is estimated to amount to around 415 terawatt hours (TWh), or about 1.5% of global electricity consumption in 2024. It has grown at 12% per year over the last five years. The rise of AI is accelerating the deployment of high-performance accelerated servers, leading to greater power density in data centres. Understanding the pace and scale of accelerator adoption is critical, as it will be a key determinant of future electricity demand. The key input to our modelling is therefore near-term industry projections for server shipments, considering the outlook for demand and supply constraints.
Data centre power consumption
There is substantial uncertainty about data centre consumption both today and in the future. The uncertainty surrounding future electricity demand calls for a scenario-based approach to explore alternative pathways and provide perspectives on timelines relevant for energy sector decision-making. While the technology sector moves quickly and a data centre can be operational within two to three years, the broader energy system works on longer lead times: the infrastructure that serves data centres often requires extensive planning, long build times and high upfront investment.
Three sensitivity cases (Lift-Off, High Efficiency and Headwinds) capture uncertainties in efficiency improvements in hardware and software, AI uptake and energy sector bottlenecks.
Our Base Case finds that global electricity consumption for data centres is projected to double, reaching around 945 TWh by 2030, or just under 3% of total global electricity consumption in that year. From 2024 to 2030, data centre electricity consumption grows by around 15% per year, more than four times faster than total electricity consumption from all other sectors. In the wider context, however, a 3% share in 2030 means that the data centre share of global electricity demand remains limited.
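The headline figures above imply an annual growth rate that can be checked directly; the snippet below is just that arithmetic.

```python
# Implied compound annual growth from ~415 TWh in 2024 to ~945 TWh in 2030.
start_twh, end_twh, years = 415, 945, 2030 - 2024
cagr = (end_twh / start_twh) ** (1 / years) - 1
print(f"Implied growth: {cagr:.1%} per year")   # roughly 15%, as stated
```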
Electricity consumption in accelerated servers, which is mainly driven by AI adoption, is projected to grow by 30% annually in the Base Case, while conventional server electricity consumption grows more slowly, at 9% per year. Accelerated servers account for almost half of the net increase in global data centre electricity consumption, while conventional servers account for only around 20%. Other IT equipment and infrastructure (cooling and other supporting systems) account for around 10% and 20% of the net increase, respectively. All three types of data centres (enterprise, colocation and service provider, and hyperscale) contribute to the growth in electricity consumption.
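To see what those per-year rates mean cumulatively, the sketch below compounds them over 2024-2030. Only the growth rates quoted above are used; no absolute baselines are assumed.

```python
# Cumulative multipliers implied by the annual growth rates quoted above.
years = 2030 - 2024
for label, rate in [("accelerated servers", 0.30), ("conventional servers", 0.09)]:
    multiplier = (1 + rate) ** years
    print(f"{label}: x{multiplier:.1f} over {years} years at {rate:.0%}/year")
```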
The United States, China and Europe are projected to remain the largest regions for data centre electricity demand over the coming years. However, other regions are experiencing strong growth in data centre development, positioning them to play increasingly important roles in the global data centre landscape. A notable example is Southeast Asia, where electricity demand from data centres is expected to more than double by 2030, partially due to the presence of a regional hub in Singapore and southern Malaysia.
China and the United States are the most significant regions for data centre electricity consumption growth, together accounting for nearly 80% of global growth to 2030. Compared to 2024 levels, consumption increases by around 240 TWh (up 130%) in the United States and by around 175 TWh (up 170%) in China. In Europe it grows by more than 45 TWh (up 70%), and in Japan by around 15 TWh (up 80%).
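The absolute increases and percentage changes quoted above also pin down approximate 2024 baselines, which can be backed out as a consistency check. The inputs below are the rounded figures from the text, so the results are indicative only.

```python
# Back out approximate 2024 consumption from "increase" and "% change" figures.
regions = {  # region: (increase to 2030 in TWh, relative increase)
    "United States": (240, 1.30),
    "China": (175, 1.70),
    "Europe": (45, 0.70),
    "Japan": (15, 0.80),
}
for region, (increase_twh, rel_increase) in regions.items():
    baseline_2024 = increase_twh / rel_increase
    print(f"{region}: ~{baseline_2024:.0f} TWh in 2024, "
          f"~{baseline_2024 + increase_twh:.0f} TWh in 2030")
```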
Comparing data centre electricity consumption normalised per capita gives a sense of the importance of this sector in different economies. Africa has the lowest consumption, at less than 1 kWh of data centre electricity consumption per capita in 2024, rising to slightly less than 2 kWh per capita by the end of the decade. However, there are strong differences within the region: South Africa shows strong growth, with per-capita consumption more than 15 times the continental average in 2030, at an intensity above 25 kWh per capita. By contrast, the United States has the highest per-capita data centre consumption, at around 540 kWh in 2024. This is projected to grow to over 1 200 kWh per capita by the end of the decade, roughly 10% of the annual electricity consumption of an average American household. This intensity is also an order of magnitude higher than in any other region of the world.