Jensen Huang says the AI capex boom is being driven by monetizable demand for compute, not speculation, arguing that rising cash flows will ultimately validate today’s spending surge.
Nvidia CEO Jensen Huang has offered one of the clearest and most forceful defenses yet of the technology industry’s ballooning investment in artificial intelligence infrastructure, arguing that the spending wave unsettling parts of Wall Street is grounded in revenue growth rather than excess.
Speaking on CNBC’s Halftime Report on Friday, Huang said the unprecedented scale of capital expenditure being undertaken by the world’s largest technology companies is a rational response to demand that is already translating into cash flow.
“The reason for that is because all of these companies’ cash flows are going to start rising,” Huang said, pushing back against concerns that AI investment has begun to outpace its commercial payoff.
Markets appeared reassured. Nvidia shares closed nearly 8% higher on Friday, adding to gains that have made the company the clearest financial beneficiary of the global rush to build AI infrastructure.
Hyperscalers commit, investors hesitate
Huang’s comments follow earnings reports from Nvidia’s biggest customers: Meta, Amazon, Google, and Microsoft, which over the past two weeks have collectively signaled a sharp acceleration in spending on data centers, networking equipment, and AI chips.
Based on company guidance and analyst estimates, those four firms alone could spend as much as $660 billion on capital expenditure this year, with a substantial portion earmarked for AI compute capacity. Much of that spend ultimately flows to Nvidia, whose graphics processing units have become the backbone of large-scale AI training and inference.
Investor reaction has been uneven. Meta and Alphabet saw their shares rise after reaffirming their AI strategies, while Amazon and Microsoft were sold off as investors focused on near-term margin pressure and the lag between spending and visible profit expansion. The split highlights a broader tension in markets: confidence in AI’s long-term potential set against anxiety over how long it will take for returns to materialize.
Huang framed that debate as backward-looking. In his view, the spending is already justified by how deeply AI is being woven into core products and revenue engines.
He also pointed to concrete shifts underway inside Nvidia’s largest customers. At Meta, he said, AI is replacing traditional recommendation systems that once relied heavily on CPUs. The company is now using generative AI models and autonomous agents to power content discovery and advertising, sharply increasing demand for accelerated computing.
At Amazon, Huang linked Nvidia-powered AI not only to Amazon Web Services’ cloud customers but also to Amazon’s retail operations, where AI increasingly shapes product recommendations, logistics optimization, and customer engagement. Microsoft, he said, is embedding AI across its enterprise software portfolio, turning AI from an add-on into a core productivity layer for corporate customers.
Taken together, Huang argued, these use cases show that AI infrastructure is no longer speculative. It is becoming foundational, comparable to earlier phases of cloud and mobile computing that initially raised similar concerns over cost before reshaping profit pools across the industry.
AI labs and revenue generation
Huang also addressed a key point of skepticism: whether the companies building frontier AI models can generate sustainable revenue. He singled out OpenAI and Anthropic as evidence that monetization is already happening.
“Anthropic is making great money. OpenAI is making great money,” Huang said, adding that compute availability, rather than customer demand, is now the main constraint on growth.
Nvidia has direct exposure to both companies. It invested $10 billion in Anthropic last year, and Huang said earlier this week that Nvidia plans to participate heavily in OpenAI’s next fundraising round. Those moves underline Nvidia’s strategy of aligning itself not just with infrastructure buyers but also with the most influential developers of AI applications.
Huang argued that the economics of AI scale non-linearly. More computing does not simply generate incremental revenue; it can unlock entirely new products and services.
“If they could have twice as much compute, the revenues would go up four times as much,” he said.
No slack in demand
Another point Huang emphasized was the durability of demand across Nvidia’s product generations. He said every GPU Nvidia has sold in recent years, including older models such as the A100 introduced more than half a decade ago, is currently being rented out.
That detail speaks to a market where supply remains tight, and fears of rapid obsolescence have not materialized. Instead of being sidelined by newer chips, older hardware continues to find use in inference, fine-tuning, and less compute-intensive AI workloads.
“To the extent that people continue to pay for the AI and the AI companies are able to generate a profit from that, they’re going to keep on doubling, doubling, doubling, doubling,” Huang said, describing a feedback loop where revenue growth fuels further infrastructure investment.
Huang’s defense comes as comparisons to past technology bubbles grow louder, particularly given the scale of spending and Nvidia’s rising valuation. His argument rests on a key distinction: unlike earlier cycles, today’s AI buildout is being driven by customers who are already generating revenue from the technology and are reinvesting cash flows to expand capacity.
The narrative is central for Nvidia. Its dominance in AI chips, its deep ties to hyperscalers and AI labs, and its exposure to virtually every major AI deployment mean that confidence in the sustainability of AI spending directly underpins its market position.
Friday’s rally suggests investors are, for now, willing to accept Huang’s thesis: that the AI spending boom is less about exuberance and more about a structural shift in how computing power is consumed, priced, and monetized across the global economy.