Nvidia reported another blowout quarter, extending a run of results that has helped make the chipmaker a key barometer for investor confidence in the artificial intelligence boom. For the fiscal fourth quarter ended Jan. 25, 2026, the Santa Clara, California-based company said revenue rose to $68.1 billion, up 20% from the prior quarter and up 73% from a year earlier. 

Profit growth was even sharper. Net income climbed to $42.96 billion, and GAAP earnings per diluted share were $1.76, nearly doubling from the year-ago period, according to the company’s earnings release.

The latest report was closely watched after a period of heightened debate about whether AI-related spending can keep climbing at a pace that justifies the sector’s valuation gains. Nvidia’s stock has become a heavyweight in major indexes, and even small shifts in sentiment around its outlook can ripple across the broader market.

Data Center Business Remains the Growth Engine

The company’s surge continues to be fueled primarily by its data center products, where demand has been driven by the buildout of “AI factories” by cloud providers and by enterprise customers training and running large language models. Data Center revenue reached a quarterly record of $62.3 billion, up 22% from the prior quarter and up 75% year over year, underscoring how central Nvidia’s chips and networking gear have become to the current AI infrastructure cycle.

Jensen Huang, Nvidia’s founder and chief executive, framed the period as part of a larger platform shift, pointing to growing demand for computing tied to “agentic AI” and describing customer investment as a race to secure capacity. In its release, the company highlighted its Grace Blackwell platform and next-generation roadmap, along with partnerships and deployments with major cloud providers and large enterprise customers.

Nvidia also reported a GAAP gross margin of 75.0% for the quarter, reflecting the pricing power and favorable product mix that have accompanied the AI-driven expansion, even as operating expenses rose alongside hiring and investment.

Guidance Targets $78B as Investors Debate Durability

Looking ahead, Nvidia issued revenue guidance that again came in above many analysts’ expectations. The company said it expects revenue of about $78.0 billion for the first quarter of fiscal 2027, plus or minus 2%, which at the midpoint would imply a year-over-year increase of roughly 77%.

The outlook also carried a geopolitical qualifier: Nvidia said its forecast assumes no Data Center compute revenue from China. That detail is likely to remain a focal point for investors given ongoing export controls and uncertainty around the long-term shape of global AI supply chains.

Even with the strong numbers, the report arrives amid caution in parts of the market. Investors have been weighing whether the scale of AI investment will translate into durable productivity gains and profitable products quickly enough to sustain current spending trajectories. In recent years, Nvidia has frequently exceeded forecasts, yet the stock has not always moved in step with earnings beats as expectations have risen. 

Big Tech Spending Plans Keep Nvidia in Focus

A major support for Nvidia’s near-term demand remains capital spending by the largest technology companies building AI capacity. Amazon, Microsoft, Alphabet, and Meta Platforms have collectively signaled plans to spend about $650 billion this year to expand AI computing power, according to figures cited in recent reporting, with a meaningful portion expected to flow into Nvidia hardware and related infrastructure.

Nvidia’s own scale has expanded rapidly alongside that spending cycle. The company reported full-year fiscal 2026 revenue of $215.9 billion, up 65% from the prior year. Nvidia also said it returned $41.1 billion to shareholders during fiscal 2026 through share repurchases and dividends, and it ended the quarter with $58.5 billion remaining under its repurchase authorization.

The company’s results reinforce its position at the center of the AI buildout, while leaving investors to keep parsing the same questions: how long hyperscalers can sustain current investment levels, how quickly AI workloads evolve from training to deployment at scale, and how geopolitics shapes where high-end compute can be sold and installed.