
Nvidia CEO Reveals: AI Computing Needs Skyrocket 100x Post-ChatGPT

  • In a CNBC interview, Nvidia CEO Jensen Huang pointed to near-term AI demand from purchase orders and from emerging startups in agentic and physical AI. He cited Amazon CEO Andy Jassy’s remark that AWS could sell more AI capacity than it has available, a sign of immediate compute needs beyond existing forecasts.
  • Mid-term, his confidence rests on this year’s sharp rise in data center investment over an already strong prior year, with Blackwell and new AI factories set to drive growth. Long-term, he argued that reasoning AI’s compute-intensive thinking process promises a further major leap.
  • Huang also reframed DeepSeek’s impact, describing AI’s phases of pre-training, post-training with reinforcement learning, and reasoning inference as escalating compute demands, countering the assumption that efficiency gains mean less compute will be needed.


Nvidia (NVDA) CEO Jensen Huang, speaking with Jon Fortt in a CNBC special report following the company’s quarterly earnings, laid out three demand signals fueling his confidence in AI’s trajectory: short-term purchase orders and startup breakthroughs, mid-term data center expansion, and a long-term surge tied to reasoning AI’s computational leap. On immediate demand, he cited Amazon CEO Andy Jassy’s remark that AWS could sell more AI resources if it had them available, driven by a wave of unforecasted needs from notable startups in agentic and physical AI, fields producing companies whose compute appetite goes beyond existing orders. Mid-term, he pointed to a significant uptick in data center capital investment this year over last, with Blackwell’s rollout and new AI factories poised to sustain momentum on top of an already robust prior year.

Huang’s long-term vision centered on the dawn of reasoning AI, in which models think before answering, breaking problems into steps and searching for smarter responses. That shift, he said, demands vastly more compute than past generations, dwarfing even last year’s hefty loads. Addressing DeepSeek, he pushed back on the notion that it signals less need for compute, describing AI’s three-phase evolution: pre-training’s foundational grind, post-training’s innovation-heavy reinforcement learning with human or AI feedback, and reasoning-intensive inference, in which models like DeepSeek reflect on and refine their answers, driving compute demand up across the board. Nvidia’s stock held steady during the interview, a quiet nod to Huang’s narrative of skyrocketing compute needs spanning startups, data centers, and reasoning AI, a narrative that positions the company at AI’s cutting edge.

The discussion flipped DeepSeek’s “more with less” perception. Huang explained that post-training innovations such as reinforcement learning and synthetic data generation dramatically increase compute demands, and that inference is now just as intensive as AI reasons step by step, a shift he views as a powerful tailwind for Nvidia’s platform: “the amount of computation that we have to do even at inference time now is a hundred times more than what we used to do when ChatGPT first came out.” In the short term, startups in agentic and physical AI, fueled by breakthroughs in reasoning and general AI, add urgent compute needs on top of existing forecasts, while mid-term data center scale-outs reflect a structural shift tied to Blackwell’s rollout. Long-term, the compute explosion of reasoning AI, with models thinking, reflecting, and composing answers, positions Nvidia as a leader in an era Huang sees as just dawning, with the latest quarter serving as a launchpad into an AI-driven future.

WallStreetPit does not provide investment advice. All rights reserved.

By Ari Haruni
