Not aluminum smelters or steelworks, but data centers for developing artificial intelligence will soon be the world's biggest energy guzzlers. The new digital heavy industry consumes as much energy as entire countries, and in some places blackouts already loom.
At first glance, Loudoun County in northern Virginia seems better suited to bicycle tourists and nature lovers than to tech titans. Yet here, just half an hour's drive west of Washington, hidden among old plantation mansions, forested hills, vineyards and horse pastures, lies Data Center Alley, the largest concentration of data centers in the world.
Almost 350 server farms are in operation or under construction here, scattered across the small towns around Dulles Airport and covering a total of around three square kilometers, more than Berlin's Wannsee lake. Much of the world's Internet traffic flows through this hub, which grew thanks to the Internet's origins as a military communications network near the Pentagon, as well as tax advantages, a mild climate, ample cooling water from the Potomac River, and low electricity prices.
But the formula behind the tech idyll's success could soon falter. In the next few years, analysts at the investment firm TD Cowen warn, power supply capacity could run short in northern Virginia and in parts of Ohio, another hotspot of the cloud industry.
The reason is the energy hunger of the AI boom. Since Google, Amazon and Microsoft entered their arms race to develop generative artificial intelligence, global demand for data centers has exploded like never before, and with it global electricity consumption: a single large data center can consume as much energy as tens of thousands of households. In the USA, more new power plant capacity went online in the first half of the year than at any time in more than 20 years.
The bottleneck: power supply
The data centers behind AI development are at once the backbone of the digital transformation and its Achilles heel. "One of the limits to the use of chips in the AI economy will be where we build the data centers and where we get the electricity from," the British Financial Times quoted Daniel Golding, a former Google data center manager, as saying. "At some point, the reality of the power grids will get in the way of artificial intelligence."
So far, it has mainly been the chip shortage that has limited the AI boom: the tech giants are scrambling for high-performance GPUs or waiting months for delivery. Just this week, Nvidia boss Jensen Huang tried to calm investors and developers, assuring them there were "plenty of supplies" of the latest Blackwell-generation AI chips. But increasingly, the real bottleneck is the power supply.
While a simple Google search uses around 0.3 watt-hours, a typical ChatGPT query uses around 2.9 watt-hours, nearly ten times as much. "The chip shortage may be behind us," Tesla boss Elon Musk warned back in the spring. "The next bottleneck will be electricity. I think next year you'll see that there just isn't enough power to power all the chips." Blackstone boss Steve Schwarzman is alarmed too: there is a veritable land rush for sites on which to build AI data centers, and the sums being invested are "breathtaking": "I've never seen anything like it."
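The scale of that per-query gap can be made concrete with a rough back-of-the-envelope calculation. The sketch below uses only the watt-hour figures cited above; the query volume is a hypothetical assumption for illustration, not a reported number:

```python
# Back-of-the-envelope scale check using the per-query figures
# cited in the text. The query volume is a hypothetical
# assumption chosen for illustration, not a reported number.
GOOGLE_SEARCH_WH = 0.3   # watt-hours per conventional search
CHATGPT_QUERY_WH = 2.9   # watt-hours per ChatGPT query

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"An AI query uses roughly {ratio:.1f}x the energy of a search")

QUERIES_PER_DAY = 1_000_000_000  # hypothetical: one billion queries/day
extra_wh_per_day = QUERIES_PER_DAY * (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH)
extra_gwh_per_year = extra_wh_per_day * 365 / 1e9
print(f"Shifting that volume to AI adds ~{extra_gwh_per_year:.0f} GWh/year")
```

At the assumed volume, the shift adds just under a terawatt-hour per year for chat queries alone, a small fraction of the industry-wide growth projections, which also reflect model training and many other workloads.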
According to the International Energy Agency (IEA), there are currently more than 8,000 data centers worldwide, a third of them in the USA, around a sixth in Europe and a tenth in China. The IEA estimates that their global electricity consumption will more than double by 2026, to more than 1,000 terawatt-hours, roughly the annual electricity consumption of Japan. The additional demand, the agency says, will land somewhere between 160 and 590 terawatt-hours. In just two years, then, the world must generate extra electricity equivalent, at best, to what Sweden consumes in a year or, at worst, to what Germany does.
Data centers already account for around 1.3 percent of global electricity consumption, according to the IEA, and by 2026 the figure could rise to 3 percent. For comparison: aluminum production currently consumes around 4 percent of global electricity generation. The development of artificial intelligence would thus become one of the most energy-intensive industries in the world, comparable to classic heavy industry.
The lights are already flickering in Ireland
Some places are already teetering on the edge of a blackout. In Ireland, for example, data centers already account for a fifth of national electricity consumption, and in two years it could be around a third. The local grid operator therefore does not want to connect any new data centers in the greater Dublin area until further notice. In the Netherlands, too, a de facto nationwide ban on new hyperscale data centers has been in force since the beginning of the year, with few exceptions.
As shortages loom, the physical, real-world consequences of the industrialization of artificial intelligence are likely to become an ever bigger political issue in the coming years. Not just because more and more residents from Ashburn to Amsterdam will rebel when giant cooled halls for AI servers suddenly sprout next to their houses, but because the new technology's hunger for energy endangers the climate goals of tech companies and governments alike.
It is unclear not only whether the AI centers' hunger for electricity can be satisfied in time, but also with what technology. Amazon Web Services (AWS), for example, has signed long-term power purchase agreements with a wind farm for its data centers in Ireland. In Pennsylvania, by contrast, the server halls' enormous appetite for electricity has the company rolling back the energy transition: a data center there is being expanded right next to a nuclear power plant. In Sweden there are even plans for a data center with its own mini-reactor. Whether the AI boom turns out to be a risk or an opportunity for the energy transition also depends on how efficient the AI chips themselves become: according to the company, Nvidia's latest Blackwell generation consumes up to 25 times less power.