- The AI industry is shifting toward greater energy efficiency and lower data center costs, as highlighted by former Facebook (FB) executive Chris Kelly, who noted that the human brain operates on just 20 watts compared to the gigawatt-scale demands of current AI infrastructure.
- Massive investments continue, with over $61 billion in data center dealmaking in 2025 and OpenAI committing over $1.4 trillion in AI obligations, including projects requiring at least 10 gigawatts – equivalent to the power needs of 8 million U.S. households.
- Cost disruptions from low-budget models, such as DeepSeek's open-source LLM reportedly built for under $6 million, along with rising Chinese competition bolstered by approved Nvidia (NVDA) H200 chip sales, are expected to democratize access to advanced AI capabilities.

The artificial intelligence sector faces mounting challenges from rising energy consumption and infrastructure costs as massive buildouts continue to support advanced workloads. In a CNBC interview, former Facebook Chief Privacy Officer Chris Kelly noted that human cognition functions efficiently on just 20 watts of power, highlighting the potential for AI systems that could operate without dependence on gigawatt‑scale facilities. He stressed that breakthroughs in lowering data center expenses will be critical in determining which companies emerge as industry leaders.
This push for efficiency comes amid substantial investments in supporting infrastructure. The data center market has seen over $61 billion in dealmaking in 2025, driven by hyperscalers pursuing extensive global expansions, according to S&P Global. OpenAI has committed over $1.4 trillion in AI-related obligations over the coming years, involving key partnerships with GPU provider Nvidia (NVDA) and infrastructure firms Oracle (ORCL) and CoreWeave (CRWV).
One notable initiative between Nvidia and OpenAI involves plans for at least 10 gigawatts of data center capacity, equivalent to the annual electricity use of 8 million U.S. households or comparable to New York City’s peak summer demand in 2024, as reported by the New York Independent System Operator.
These developments have intensified scrutiny over power availability, given pressures on existing electrical grids. At the same time, innovations in cost reduction are emerging, exemplified by DeepSeek’s release of a free, open-source large language model in December 2024, reportedly developed for under $6 million – far below expenditures by leading U.S. developers.
Kelly anticipates increased prominence for Chinese AI entities, particularly following President Donald Trump’s approval of Nvidia’s H200 chip sales to the country. Open-source approaches, especially from China, are expected to democratize access to foundational computing resources for generative and agentic AI applications. As the industry progresses, prioritizing operational efficiencies and sustainable power management will be critical to maintaining momentum in AI advancement.
WallStreetPit does not provide investment advice. All rights reserved.