
The AI Industrial Revolution: The Harsh Truth About AI Data Centers

Welcome to this week’s Deep-fried Dive with Fry Guy! In these long-form articles, Fry Guy conducts in-depth analyses of cutting-edge artificial intelligence (AI) developments and developers. Today, Fry Guy dives into the environmental impact of AI data centers and explores possible solutions. We hope you enjoy!

*Notice: We do not receive any monetary compensation from the people and projects we feature in the Sunday Deep-fried Dives with Fry Guy. We explore these projects and developers solely to reveal interesting and cutting-edge AI developments and uses.*



Did you use ChatGPT today? If so, you’re killing the planet! … Kind of.

In this article, we will explore the massive push for data centers across the world, which power AI systems. We are going to peek behind the curtain and look at how much energy and water is being used by these data centers. In the end, we will explore what potential solutions might be available to preserve vital resources in the AI age.

 

INTRODUCING CARBON FACTORIES

When we think about posing questions to chatbots like ChatGPT or Claude, we don’t often think about harming the environment. But like most things, AI needs power. Without a power source, ChatGPT wouldn’t be able to write your essays and Gemini wouldn’t be able to put together your work presentations. But where does this power come from, how does it work, and what impact is it having on the planet? 

In the late 18th century, the Industrial Revolution gave rise to factories across the world, which burned fossil fuels at extremely high rates, all in the name of innovative technology. Since the onset of industrial times, human activities have raised atmospheric CO2 by 50%, meaning the amount of CO2 is now 150% of its value in 1750. Many believe the Industrial Revolution was the worst thing to happen to our planet. Without it, however, we wouldn't have the steam engine, automobiles, telephones, and much more. As they say, history often repeats itself. Welcome stage left, AI.

Tech giants like Google, Meta, Microsoft, IBM, Amazon, and OpenAI have been pouring billions of dollars into data centers throughout the U.S. and abroad. Dubbed "AI factories," these GPU farms are vital to AI system training and operation: they house the massive computing hardware, and consume the enormous amounts of electricity, that power advanced AI models like ChatGPT. Like factories, they power AI innovation. So we shouldn't be surprised that Nvidia CEO Jensen Huang called data centers an "essential part of the next Industrial Revolution."

Before we discuss the problems with these data centers, it's important to note that these AI factories are bringing numerous jobs to rural locations. For example, Microsoft's network of data facilities in Iowa employs over 300 West Des Moines residents, making it among the 10 largest employers in the city of over 70,000. As a result, these data centers are offering opportunities to spark local economies and bring large numbers of jobs to people … many of whom are losing their jobs due to AI.

So what’s the big problem? Data centers are popping up across the globe, powering innovation and sparking economies. This is good stuff, right? Well, not exactly.

Mirroring the Industrial Revolution, the problem with these AI factories is that they emit massive amounts of CO2. Because of the massive amount of electricity needed to power these AI server farms, even simple queries can stack up and emit tons of carbon. In fact, a single LLM query is estimated to consume approximately 0.3 kWh of energy. This is roughly 1,000 times more energy than a standard Google search, which uses around 0.0003 kWh. As AI adoption increases and more data centers are built, these emissions will only get worse. As a result, some estimates project that AI computation warehouses will use 8% of US power by 2030. Prominent investment firm Goldman Sachs predicts that carbon dioxide emissions of data centers may more than double between 2022 and 2030.
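The scale of that gap is easy to check with the article's own figures. The sketch below is a back-of-the-envelope estimate in Python: the per-query numbers are the rough published estimates quoted above, and the daily query volume is a purely hypothetical assumption for illustration.

```python
# Back-of-the-envelope energy comparison using the per-query estimates above.
# These are rough published estimates, not measurements.
LLM_QUERY_KWH = 0.3         # estimated energy per LLM query
GOOGLE_SEARCH_KWH = 0.0003  # estimated energy per standard Google search

ratio = LLM_QUERY_KWH / GOOGLE_SEARCH_KWH
print(f"One LLM query ~ {ratio:,.0f}x a Google search")

# At a hypothetical 1 billion LLM queries per day:
daily_kwh = 1e9 * LLM_QUERY_KWH
print(f"~{daily_kwh / 1e6:,.0f} GWh of electricity per day")
```

At those assumed rates, a billion queries a day would draw on the order of 300 GWh daily, which is why even "simple" chatbot usage stacks up at data-center scale.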

The dreadful effects of these data centers were recently revealed in Google's 2024 Environmental Report. The report showed that Google's greenhouse gas emissions surged 48% over the past five years, primarily due to increased electricity consumption and supply chain emissions associated with AI development. In 2023 alone, Google's carbon emissions rose 13% to 14.3 million metric tons. These numbers show no signs of slowing down. In fact, the International Energy Agency predicts that data center electricity consumption could double by 2026, with AI-related energy usage potentially reaching 4.5% of global energy generation by 2030.

As a result of this growing environmental impact, Google’s goal to “achieve net-zero emissions across all operations by 2030” is in major jeopardy as numbers continue to spike in the wrong direction. The Guardian nicely outlined this dilemma: “Pledges to reduce CO2 emissions are now coming up against pledges to invest heavily in AI products that require considerable amounts of energy for training and deployment in data centres.” Sacrificing the planet for profits … does this remind anyone of the Industrial Revolution?

 

A WATER GUZZLER

If harmful carbon emissions were the only problem with data centers, we would already have our hands full. But unfortunately, there’s much more. AI is thirsty. In fact, there is not enough water on the planet to quench the thirst of AI innovation.

Anyone who has tried to use a computer outside on a hot day knows that processing hardware overheats quickly, and when it does, it stops working. When 32,000 GPUs are humming at full capacity, data centers get hot very quickly. To keep the hardware cool, most data centers use water-based cooling, circulating water through cooling loops and evaporative cooling towers; much of that water is lost to the air as vapor. To further the problem, these cooling systems rely on freshwater, not saltwater. Saltwater corrodes the millions (sometimes billions) of dollars' worth of hardware and piping, while freshwater preserves it. This means data centers are left to draw fresh water from local streams, rivers, and lakes. So not only is AI polluting the air with harmful gasses, but it is also guzzling our drinking water and dehydrating our local resources.

The numbers are simply staggering. The average data center uses 300,000 gallons of water per day to keep this hardware cool, roughly the daily water usage of 1,000 homes. One study estimates that AI could account for up to 6.6 billion cubic meters of water use by 2027. To put this in perspective, ChatGPT "drinks" a bottle of fresh water for every 20 to 50 questions we ask … and with over 100 million weekly users, that's a lot of water.
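Those per-query figures can be turned into a rough aggregate estimate. The Python sketch below uses the numbers quoted above; the bottle size, the midpoint of the 20-to-50-query range, and the per-user query rate are all loose assumptions for illustration, not reported data.

```python
# Rough water-footprint estimate from the figures quoted above.
# Bottle size, queries-per-bottle, and usage rate are loose assumptions.
BOTTLE_LITERS = 0.5           # assume a standard 500 ml bottle
QUERIES_PER_BOTTLE = 35       # midpoint of the quoted 20-50 range
WEEKLY_USERS = 100_000_000    # "over 100 million weekly users"
QUERIES_PER_USER_WEEK = 10    # hypothetical usage rate

weekly_queries = WEEKLY_USERS * QUERIES_PER_USER_WEEK
weekly_liters = weekly_queries / QUERIES_PER_BOTTLE * BOTTLE_LITERS
print(f"~{weekly_liters / 1e6:.1f} million liters of water per week")
```

Even under these conservative assumptions, the estimate lands in the tens of millions of liters per week for a single chatbot.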

IS THERE A SOLUTION?

AI data centers are a huge problem for the environment, and this is no secret. Big and small tech companies alike are not oblivious to the fact that data centers emit massive amounts of CO2 and guzzle water faster than a camel.

Regarding the water problem, many believe AI can solve its own problem, as it is already showing promising solutions for desalination and resource management. Whether these AI-driven approaches will be successful in coming up with freshwater preservation solutions remains to be seen, but the research is encouraging.

As for energy, many major tech players have been trying to move towards more sustainable solutions, particularly in the form of nuclear power. Several big nuclear power deals have been struck in the past few years by players like Microsoft, Google, and Amazon. Microsoft signed a deal last summer with Constellation, a top nuclear power plant operator, to add nuclear-generated electricity to its Virginia data centers. The year prior, Google took part in a $250 million fundraising round for the fusion startup TAE Technologies. And in late 2021, Amazon founder Jeff Bezos and other investors raised over $130 million for Canadian nuclear company General Fusion. Ayan Paul, a research scientist at Northeastern University who studies AI, stated, "People have started to believe that these kinds of energies are going to fuel our population."

“We are really playing a crazy game with the atmosphere and the oceans. We’re taking huge amounts of carbon from deep underground and […] putting it into the atmosphere, it’s crazy. We should not do this. This is very dangerous. We should accelerate the transition to sustainable energy.”

-Elon Musk 

Nuclear power might be the solution we are looking for, and more people are starting to believe that. In fact, 53% of Americans support nuclear energy expansion, up from 43% in 2020. "We need nuclear power to get to a low-carbon future," said Ahmed Abdulla, assistant mechanical and aerospace engineering professor at Carleton University. However, for nuclear power to be an effective solution to the data center issue, it will need to be more widely available. As of now, nuclear power accounts for only 19% of U.S. electricity generation. This means sustainable energy consumption will require a patient approach to building out AI, which might slow down development. As Abdulla aptly stated, "There is a chance to make serious mistakes if we sprint to the goal." Unfortunately, I have a feeling the tech giants, who have seen unprecedented growth since the release of ChatGPT, will not heed this warning.
