We hear a lot about artificial intelligence these days. ChatGPT has found its way into education, technology, and many other aspects of life. It and its brethren are a source of fascination, enthusiasm, and even fear. Many of us have posed queries to the bot to see what kind of results we can obtain. But a recent study has revealed something about AI systems that we probably didn't know: they consume lots of fresh water.
According to researchers at the University of California, Riverside, running a few dozen queries on ChatGPT uses up about half a quart of fresh water from already overtaxed reservoirs.
Running artificial intelligence systems like ChatGPT relies on cloud computations done in racks of servers in warehouse-sized data processing centers. Google’s data centers in the U.S. alone consumed nearly 3.5 billion gallons of fresh water in 2021 in order to keep their servers cool.
Data processing centers consume water in two ways. They often draw much of their electricity from power plants whose large cooling towers convert water into steam that escapes into the atmosphere. In addition, the servers themselves must be cooled to keep running, and they are typically connected to cooling towers as well.
It isn't going to be easy for AI systems to reduce their water use. The study's authors noted that people make use of AI at all hours of the day and night. But a significant amount of AI activity is actually the training of the systems, and training could be scheduled for the cooler hours, when less water is lost to evaporation.
In an era of scarce fresh water and droughts, it is important to make AI less thirsty.
Photo, posted May 22, 2023, courtesy of Jernej Furman via Flickr.