
Developing large AI models requires substantial computing capacity; training GPT-4, for instance, was reported to consume more electricity than 5,000 average U.S. homes use in a year.

This increasing demand for energy is putting pressure on the transmission capacity of the electrical grid and the availability of data centers equipped to handle the power needs, causing voltage inconsistencies in regions with high concentrations of AI computing operations.
