A data center is a building, dedicated space within a building, or a group of buildings used to house computer systems and associated components, such as telecommunications and data storage systems. Data centers are the backbone of internet services and cloud computing, which together are increasingly dominant elements of modern life.
Energy use is a central issue for data centers. Power consumption ranges from a few kilowatts for a rack of servers in a closet at a local business to several tens of megawatts for large facilities. Some data centers have power densities more than 100 times that of a typical office building and use as much electricity as several thousand homes. For such facilities, electricity costs are a dominant operating expense and can account for over 10% of the total cost of ownership. Collectively, these centers, with their numerous racks of computer servers, consume about 90 billion kilowatt-hours of electricity each year in the United States, roughly as much as all American residences use for lighting.
A research group at Princeton University is developing a family of devices that could dramatically reduce power consumption at data centers. The team's technology focuses on the process by which AC power from the grid is converted to the low-voltage direct current used by computer equipment. With existing technology, this conversion takes place inside each individual computer, wasting about 40% of the original energy. The new device consolidates power conversion into a single unit, which then distributes low-voltage power to the individual computers and storage units.
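As a rough illustration of why consolidating the conversion stage matters, the arithmetic can be sketched as follows. The ~40% loss figure for per-server conversion comes from the article; the centralized efficiency and the 1-megawatt load below are hypothetical assumptions chosen only to make the comparison concrete, not figures from the Princeton project.

```python
# Back-of-the-envelope comparison of per-server vs. centralized AC-DC
# conversion losses. The 40% per-server loss is from the article; the
# centralized efficiency and IT load are illustrative assumptions.

PER_SERVER_EFFICIENCY = 0.60    # ~40% of energy wasted (per the article)
CENTRALIZED_EFFICIENCY = 0.90   # assumed, for illustration only

def grid_power_needed(it_load_kw: float, efficiency: float) -> float:
    """Grid power (kW) required to deliver it_load_kw of useful DC power."""
    return it_load_kw / efficiency

it_load_kw = 1000.0  # a hypothetical 1 MW of useful IT load

per_server = grid_power_needed(it_load_kw, PER_SERVER_EFFICIENCY)
centralized = grid_power_needed(it_load_kw, CENTRALIZED_EFFICIENCY)
savings_kw = per_server - centralized

print(f"Per-server conversion draws:  {per_server:.0f} kW from the grid")
print(f"Centralized conversion draws: {centralized:.0f} kW from the grid")
print(f"Savings: {savings_kw:.0f} kW "
      f"({savings_kw / per_server:.0%} of the original draw)")
```

Under these assumed numbers, the same 1 MW of computing would draw about 1,667 kW with per-server conversion but only about 1,111 kW with centralized conversion, a saving of roughly a third of the grid draw.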
As data centers get bigger and more numerous, the opportunity to save a lot of energy becomes increasingly important.
Photo, posted June 8, 2007, courtesy of Sean Ellis via Flickr.