Citing data from the Boston Consulting Group, Barron's reports that data centers will consume 7.5% of all available electricity in the US by 2030. Data centers built for AI systems can draw hundreds of megawatts of electricity each, and the power supply system will soon be unable to keep up with the rapid growth in the number of such facilities.
Huge energy demands from AI servers
According to the analysis, the energy consumption of US data centers will grow from 126 terawatt-hours in 2022 to 390 terawatt-hours in 2030, enough to supply 40 million US households.
650 Group estimates that shipments of server systems for AI workloads will grow sixfold from last year through 2028, to 6 million units. Gartner forecasts that the average power draw of an accelerator server will rise from 650 W to 1,000 W.
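For scale, here is a rough back-of-envelope check of these projections. It is a sketch, not a calculation from the article: the 10.5 MWh/year average household consumption is an assumed ballpark figure, and the fleet estimate assumes every projected AI server runs continuously at the forecast average draw.

```python
# Back-of-envelope check of the projections above (assumptions, not article data):
# - an average US household consumes ~10.5 MWh per year (assumed ballpark)
# - the projected AI server fleet runs 24/7 at the forecast average power draw

WH_PER_TWH = 1e12  # watt-hours per terawatt-hour

# BCG projection: 390 TWh of US data center consumption in 2030
projected_dc_twh = 390
household_wh_per_year = 10.5e6  # 10.5 MWh, assumed average
households = projected_dc_twh * WH_PER_TWH / household_wh_per_year
print(f"Households supplied: {households / 1e6:.0f} million")  # ~37 million, close to the quoted 40 million

# 650 Group fleet size x Gartner per-server draw, assuming continuous operation
ai_servers = 6_000_000
watts_per_server = 1_000
hours_per_year = 8_760
fleet_twh = ai_servers * watts_per_server * hours_per_year / WH_PER_TWH
print(f"AI server fleet alone: {fleet_twh:.0f} TWh/year")  # ~53 TWh
```

Under these assumptions the household figure roughly matches the article's claim, and the AI server fleet alone would account for a meaningful share of the projected growth in data center consumption, before counting cooling and other facility overhead.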
Energy consumption will rise not only because there will be more servers, but also because each server will draw more power. Applying AI itself to power-consumption management will help limit this trend, but will not solve the problem entirely. In most cases the transition to liquid cooling of server systems will be inevitable: according to Super Micro, switching from traditional air cooling to liquid cooling can cut data center operating costs by more than 40%.
The problem is exacerbated by the uneven development of regional power grids. In addition, not every location can efficiently deliver the electricity it generates to the sites of large energy-consuming systems. Experts say the US generates enough electricity to support the development of AI systems; the real problems lie in the distribution grid.