AI Data Center Power Consumption Nears Scale of New York City
Stanford University's '2026 AI Index' report finds that AI technology is advancing at a record pace. AI model performance continues to improve, and the public is adopting AI far faster than it adopted personal computers or the internet.
AI companies are generating revenue at the fastest pace of any tech boom in history, while also consuming vast resources: they are investing hundreds of billions of dollars in building data centers and purchasing chips. The report highlights that the power consumption of AI-driven data centers could rival that of New York City. This advance is creating significant real-world challenges, including massive electricity demands from global AI data centers (roughly equivalent to New York State's peak demand), growing reliance on a handful of companies in the chip supply chain, and a sharp rise in annual water consumption.
In the global AI technology race, the U.S. and China are nearly on par in model performance, with OpenAI's early lead narrowing as Google, Anthropic, and others advance.