Developing large AI models necessitates substantial computing capacity; for instance, training GPT-4 was reported to consume more electricity than what 5,000 average U.S. homes use in a year.

This increasing demand for energy is putting pressure on the transmission capacity of the electrical grid and the availability of data centers equipped to handle the power needs, causing voltage inconsistencies in regions with high concentrations of AI computing operations.

To address this issue, U.S. AI companies are advocating for new energy infrastructure projects, which include establishing dedicated "AI economic zones" that facilitate quicker permits for data centers, creating a national electrical transmission network to distribute power efficiently, and boosting overall power generation capabilities.

In response, OpenAI has expanded its policy team in Washington, growing from 4 to 12 members, with a shift in focus from AI safety issues to collaborating with utility companies, energy organizations, and legislators to ensure a steady electricity supply for their needs.