
What Nvidia's Earnings Tell Us About the AI Market

  • Writer: Peak Frameworks Team
  • 4 min read

If you're interested in breaking into hedge funds, check out our Hedge Fund Course. Our courses have helped thousands of candidates land top jobs every year.



Overview


Nvidia announced its 3Q results for fiscal 2026 on November 19, 2025. Its earnings release can be interpreted as a quarterly status update on the AI infrastructure boom.





Nvidia’s top line continues to be dominated by AI data center spend:


  • Revenue for the quarter was $57 billion, up 22% quarter over quarter and 62% year over year, a new all-time record.

  • Data Center revenue was $51.2 billion, up 25% sequentially and 66% from a year ago. Data Center is now roughly 90% of total revenue.

  • GAAP gross margin was 73.4%; GAAP net income reached $31.9 billion.


For the next quarter, Nvidia guided to $65 billion in revenue, plus or minus 2%, with even higher gross margins expected.
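The headline figures hang together arithmetically. A quick back-of-the-envelope sketch (reported figures from the release; the prior-period revenues are implied, not disclosed here):

```python
# Sanity-checking Nvidia's FQ3 2026 headline numbers (all figures in $B).

revenue_q3 = 57.0        # reported quarterly revenue
qoq_growth = 0.22        # +22% quarter over quarter
yoy_growth = 0.62        # +62% year over year

# Implied prior periods, back-calculated from the growth rates
prior_quarter = revenue_q3 / (1 + qoq_growth)   # roughly $46.7B
year_ago = revenue_q3 / (1 + yoy_growth)        # roughly $35.2B

# Data Center share of total revenue
data_center = 51.2
dc_share = data_center / revenue_q3             # roughly 90%

# Next-quarter guidance: $65B, plus or minus 2%
guide_mid = 65.0
guide_low = guide_mid * 0.98                    # $63.7B
guide_high = guide_mid * 1.02                   # $66.3B

print(f"Implied prior quarter:   ${prior_quarter:.1f}B")
print(f"Implied year-ago quarter: ${year_ago:.1f}B")
print(f"Data Center share:        {dc_share:.0%}")
print(f"Guidance range:           ${guide_low:.1f}B to ${guide_high:.1f}B")
```

Even at the low end of the guidance band, the next quarter would represent another double-digit sequential step up from the $57 billion base.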


If you'd like to learn how to break into a hedge fund, we'll teach you how to build comprehensive models and stock pitches to nail your interview in our Hedge Fund Course.


How Nvidia Characterizes This Phase of AI


Jensen Huang’s quote in the release is a good window into how Nvidia views the cycle.

He notes that "Blackwell sales are off the charts, and cloud GPUs are sold out," and that "we have entered the virtuous cycle of AI."



Jensen Huang, Nvidia's CEO


Whether that language proves accurate over the long term is an open question, but it clearly signals how the company is positioning itself and what it sees as the opportunity set ahead: a core supplier to a secular build-out.


Two key points are implied in Huang's lines:


  1. Demand is still supply constrained at the high end. If cloud GPUs are "sold out," that suggests hyperscalers and large customers are still racing to secure capacity for training and inference rather than pulling back.

  2. Nvidia sees AI as self-reinforcing. The idea of a "virtuous cycle" implies that more compute leads to more powerful models, which leads to more useful applications, which then justify even more infrastructure spending.


Where Is the Growth Coming From?


Nvidia's Data Center segment, where revenue hit $51.2 billion for the quarter, is by far its largest segment. Within the segment, the company highlights that its Blackwell platform achieved top performance and efficiency in industry benchmarks and that it is already in volume production, including wafers produced in the United States at TSMC’s Arizona facility.


The company calls out partnerships with OpenAI, Google Cloud, Microsoft, Oracle, xAI and others to build large AI data centers and supercomputers using "hundreds of thousands" of Nvidia GPUs.



For example, Nvidia and OpenAI are partnering to deploy at least 10 gigawatts of Nvidia systems for OpenAI’s next generation AI infrastructure.



Nvidia also emphasizes that it is working with governments and national champions in places like the U.K., Germany and South Korea to expand AI infrastructure with very large GPU footprints.


Growth is therefore driven not primarily by consumer apps, but by national-scale "AI factories" made up of GPUs, networking, and software stacks that look more like utility infrastructure than typical tech projects.


What This Says About The AI Market


1. AI infrastructure is still in the expansion phase, not the digestion phase. Double-digit sequential growth on a $57 billion revenue base, with guidance for another step up to ~$65 billion, suggests that hyperscalers and large customers are still scaling capacity rather than pausing to digest.


2. The profit pool is concentrated at the hardware and systems layer. With net income of $31.9 billion and gross margins in the ~70s, a large share of the near term economics in AI is accruing to the company building the accelerators and systems that everyone else relies on.
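The scale of that concentration can be made concrete from the reported GAAP figures (a rough sketch; operating expenses between gross profit and net income are not broken out here):

```python
# Margin math from Nvidia's reported GAAP figures (all dollar figures in $B).

revenue = 57.0           # quarterly revenue
gross_margin = 0.734     # 73.4% GAAP gross margin
net_income = 31.9        # GAAP net income

gross_profit = revenue * gross_margin   # roughly $41.8B of gross profit
net_margin = net_income / revenue       # roughly 56% of revenue falls to the bottom line

print(f"Gross profit: ${gross_profit:.1f}B")
print(f"Net margin:   {net_margin:.0%}")
```

A net margin in the mid-50s on a $57 billion revenue base is the quantitative version of the claim: the hardware and systems layer is currently where the AI profit pool sits.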


3. The ecosystem is widening, but still anchored on Nvidia. The highlights list a long set of partners: cloud providers, sovereign initiatives, telecommunications companies, industrial and automotive players. The common denominator is Nvidia hardware and its software stack. That reinforces the idea that AI right now is less a single product category and more an input into many different industries, all drawing on the same compute backbone.


4. AI is increasingly treated like critical infrastructure at both corporate and national levels. References to "gigawatt-scale AI factories" and national infrastructure projects indicate that AI compute is being planned and financed more like power generation or telecom networks than like typical IT projects.



What Are The Risks?


Nvidia’s results also underline how much risk is embedded in the current AI build-out. The company is heavily exposed to a small group of hyperscale and cloud customers whose capex plans can change quickly if AI projects fail to meet return thresholds or if macro conditions tighten.


For Nvidia, some of those same customers are developing their own custom chips, which could gradually reduce their reliance on Nvidia hardware, especially if they decide to prioritize cost over peak performance.


At the market level, Nvidia has become a key driver of major equity indices, so any disappointment in earnings, guidance, or AI demand can spill over into broader risk sentiment and trigger sharp index-level drawdowns.


A prolonged slowdown in AI infrastructure spending would not just affect semiconductors, but also data centers, equipment vendors, and parts of the software ecosystem that have been built around high growth assumptions.


Taken together, the numbers point to a powerful growth story, but one that is tightly linked to a single technology stack, a concentrated customer base, and a broader equity market that has become increasingly sensitive to every Nvidia quarter.







