The rapid growth of artificial intelligence (AI) has raised concerns about surging electricity demand in the US. However, a recent study suggests that data centers could curtail their power use during peak periods, potentially unlocking a significant amount of grid capacity.
By limiting their power draw to 90% of maximum for a few hours each year, data centers could free up 76 gigawatts (GW) of capacity, more than the total power currently consumed by data centers worldwide. Such reductions could help close the anticipated US power shortfall, which is estimated to reach 10% of peak demand.
Data centers have traditionally prioritized uninterrupted operation, but they have untapped flexibility that makes them well suited to demand response programs.
One method of reducing power consumption is temporal flexibility: shifting computing tasks to off-peak hours. AI model training and other non-critical tasks can tolerate brief interruptions and be rescheduled accordingly.
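As a rough illustration of temporal flexibility, the sketch below defers interruptible batch jobs out of an assumed evening grid-peak window. The peak hours, job names, and the `Job`/`schedule` helpers are all hypothetical, not from the study; real schedulers would react to utility price or demand response signals rather than a fixed window.

```python
from dataclasses import dataclass

# Assumed 5-9 pm grid peak window; real systems would take this
# from a utility price or demand response signal.
PEAK_HOURS = set(range(17, 21))

@dataclass
class Job:
    name: str
    deferrable: bool      # e.g. checkpointable AI training batch work
    requested_hour: int   # hour of day (0-23) the job asks to start

def schedule(jobs):
    """Assign each job a start hour, pushing deferrable work off-peak."""
    plan = {}
    for job in jobs:
        hour = job.requested_hour
        if job.deferrable and hour in PEAK_HOURS:
            # Defer to the first hour after the peak window ends.
            hour = (max(PEAK_HOURS) + 1) % 24
        plan[job.name] = hour
    return plan

jobs = [Job("model-training", True, 18), Job("web-serving", False, 18)]
print(schedule(jobs))  # training moves to 21:00; serving stays at 18:00
```

The key design point is the `deferrable` flag: latency-sensitive serving keeps its slot, while batch work absorbs the shift.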
Spatial flexibility allows data centers to shift computational load to regions where grid demand is lower. Operators can also consolidate workloads onto fewer machines and shut down less efficient servers.
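Spatial flexibility can be sketched as a simple routing decision: send movable work to the least-stressed region. The region names and load figures below are illustrative assumptions, not data from the study.

```python
# Hypothetical regional grid-stress levels (fraction of peak capacity);
# illustrative values only, not real measurements.
REGION_LOAD = {"us-east": 0.95, "us-central": 0.70, "us-west": 0.85}

def pick_region(loads):
    """Route a movable batch workload to the least-stressed region."""
    return min(loads, key=loads.get)

print(pick_region(REGION_LOAD))  # -> us-central
```

In practice the routing signal might be wholesale electricity price or grid carbon intensity rather than raw load, but the selection logic is the same.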
For mission-critical tasks, data centers can use alternative power sources, such as batteries, to avoid curtailment.
Several companies are already implementing these strategies. Google and Enel X have used carbon-aware computing platforms to optimize electricity usage, and PG&E offers incentives to data centers that participate in demand response programs.
While these adjustments alone cannot eliminate the need for new power sources, they can significantly reduce the projected shortfall. By implementing these measures, data centers can contribute to a more sustainable and reliable energy grid while meeting the growing demand for AI services.
Original source: Read the full article on TechCrunch