AI data centers may cause urban water shortages by 2030

Artificial intelligence could leave cities without water – scientists’ alarming prediction

By 2030, data processing centers in the United States may consume as much water as the largest American metropolis, New York City, requires every day. This is the conclusion reached by researchers who studied the rapidly growing demand for resources from the artificial intelligence industry.
By Natasha Kim · Reading time: 2 minutes

The work was carried out by a group of scientists led by Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, Gizmodo reports. The results were posted publicly on the arXiv preprint server and have not yet been peer-reviewed. Nevertheless, they have already attracted the attention of specialists, as they raise the question of a potential shortage of water resources driven by the expansion of AI infrastructure.

The main driver of high water consumption is server cooling. Data centers operate around the clock and house thousands of powerful computing systems that generate a significant amount of heat. To keep the equipment from overheating, operators rely on liquid cooling, one of the most effective ways to maintain a stable temperature.

Technology companies often claim to use closed-loop systems that reuse water. However, even these solutions are resource-intensive, as heat is often removed from buildings through evaporative cooling towers. As a result, some water is inevitably lost in the evaporation process.

Researchers estimate that during hot periods, a large modern data center can consume more than one million gallons of water per day. In some planned projects, this figure could be as high as eight million gallons daily.

An analysis of public sources – government data and reports from water utilities – provided an estimate of future infrastructure loads. It estimated that by 2030, the peak daily demand for data centers in the U.S. could range from 697 million to 1.45 billion gallons of water. This is roughly comparable to the amount of water that enters New York City’s water system each day.
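To put these figures side by side, here is a back-of-the-envelope sketch in Python using only the numbers cited above; the idea of expressing peak demand as an equivalent count of large data centers (at the one-million-gallon hot-day figure) is an illustrative assumption, not part of the study.

```python
# All inputs are the article's own estimates (gallons per day).
GALLONS_PER_LARGE_CENTER = 1_000_000   # hot-day use of one large data center
PEAK_DEMAND_LOW = 697_000_000          # low end of projected 2030 peak demand
PEAK_DEMAND_HIGH = 1_450_000_000       # high end of projected 2030 peak demand

# Express the projected range as an equivalent number of large data centers.
low_equiv = PEAK_DEMAND_LOW / GALLONS_PER_LARGE_CENTER
high_equiv = PEAK_DEMAND_HIGH / GALLONS_PER_LARGE_CENTER

print(f"Peak demand is equivalent to roughly {low_equiv:.0f} to "
      f"{high_equiv:.0f} large data centers on a hot day")
```

The point of the exercise is scale: even the low end of the range corresponds to hundreds of the largest facilities described in the study drawing water simultaneously.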

Building and upgrading infrastructure to meet this demand could cost between $10 billion and $58 billion. A significant portion of the cost would potentially fall on the municipalities where the data centers are located.

Experts warn that water shortages could be a serious constraint on the industry’s future growth. In times of shortage, data centers have to switch to air cooling, which is less efficient and requires more electricity.

The authors of the study believe that companies need to be more transparent in disclosing data on peak water use, as well as more active in financing the modernization of utility infrastructure. Without such measures, the AI industry may face an unexpected and very tangible resource barrier.


