Data centers, particularly those used to train LLMs, consume vast amounts of electricity. Over the long term, electricity can account for more of a data center's operating costs than the servers themselves. That opens the door to some peculiar economics. AI training data centers can vary their load over the course of a day, drawing more power when supply is in excess and scaling back during peak demand. Because electricity is such a large share of the cost of running a data center, this strategy can be financially advantageous if utilities charge less for off-peak power. In this arrangement, the data center pays a low price for energy that might otherwise go unused, and the utility gains extra revenue to fund building out renewables.
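For a rough sense of the arithmetic, here is a minimal Python sketch with entirely hypothetical numbers: the tariff rates, the peak window, and the daily energy budget are assumptions for illustration, not figures from the post. It compares a flat 24-hour load profile against one that shifts the same total energy out of a peak-price window.

```python
# Sketch of the cost arithmetic under a hypothetical two-tier time-of-use tariff.
# All numbers below are assumptions, chosen only to illustrate the mechanism.

PEAK_HOURS = range(16, 21)         # assumed 4pm-9pm peak-price window
PEAK_PRICE = 0.20                  # $/kWh during peak hours (assumed)
OFF_PEAK_PRICE = 0.08              # $/kWh off-peak (assumed)

TOTAL_DAILY_ENERGY_MWH = 1200      # energy the training workload needs per day (assumed)

def daily_cost(load_mw_by_hour):
    """Cost in dollars of a 24-entry hourly load profile (MW) under the tariff."""
    cost = 0.0
    for hour, load_mw in enumerate(load_mw_by_hour):
        price = PEAK_PRICE if hour in PEAK_HOURS else OFF_PEAK_PRICE
        cost += load_mw * 1000 * price   # MW over one hour -> kWh, times $/kWh
    return cost

# Flat profile: run at constant power all day.
flat = [TOTAL_DAILY_ENERGY_MWH / 24] * 24

# Flexible profile: idle during peak hours, make up the same energy off-peak.
off_peak_hours = 24 - len(PEAK_HOURS)
flexible = [0 if h in PEAK_HOURS else TOTAL_DAILY_ENERGY_MWH / off_peak_hours
            for h in range(24)]

print(f"Flat schedule:     ${daily_cost(flat):,.0f}/day")
print(f"Flexible schedule: ${daily_cost(flexible):,.0f}/day")
```

With these assumed prices, moving the same total energy out of the five peak hours cuts the daily bill by roughly a quarter, which is the kind of advantage the post describes.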
Read the full post at The Energy Mix.