MIT recently hosted a symposium on how artificial intelligence is both a problem and a solution for the clean energy transition.
AI-powered computing centers are expanding rapidly, creating an unprecedented surge in electricity demand. Electricity demand in the US had been relatively flat for decades, but these computing centers now consume about 4% of the nation’s electricity, and some projections say that share could rise to 12-15% within the next five years.
The power required to sustain some AI large language models is doubling every three months. A single ChatGPT conversation uses about as much electricity as it takes to charge a phone and consumes the equivalent of a bottle of water for cooling.
The MIT symposium focused not only on the challenges of meeting these growing energy needs but also on the potential for AI to dramatically improve power systems and reduce carbon emissions.
Research shows regional variations in the cost of powering computing centers with clean electricity. The central United States offers lower costs because of its complementary solar and wind resources, but it would require massive battery deployments to provide uninterrupted power.
Data center demand has also renewed interest in nuclear power, often in the form of small modular reactors, as well as in long-duration storage technologies, geothermal power, and hybrid approaches.
Artificial intelligence offers both great promise and great peril. It will take real intelligence to steer it in the right direction.
**********
Web Links
Confronting the AI/energy conundrum
Photo, posted August 31, 2024, courtesy of Jefferson Lab via Flickr.
Earth Wise is a production of WAMC Northeast Public Radio