
Nvidia’s Datacenter Revenues for Q4 Appear Promising: Bernstein
Analysts at Bernstein are optimistic about NVIDIA’s datacenter revenues heading into the fourth quarter, anticipating robust growth.
The firm says Q4 datacenter revenue looks strong. Following a recent discussion with NVIDIA CFO Colette Kress and Investor Relations lead Stewart Stecker, Bernstein reports that the company clarified its earlier guidance on revenue from the new Blackwell chip: the contribution could reach “several billion” dollars, surpassing the roughly $3 billion previously expected.
Consensus currently forecasts a $3.8 billion increase in datacenter revenues for Q4, and Bernstein sees potential for additional upside beyond that projection.
Bernstein also notes that NVIDIA’s gross margins are expected to remain solidly in the 70% range. The slight margin compression in the latter half of 2024 is attributed to product mix, particularly the rollout of the Blackwell chip and the H200 platform, rather than any yield issues. NVIDIA’s management has reassured investors, with Kress stating, “Guys, relax, margins are fine.”
The firm highlights that NVIDIA’s Blackwell offerings are highly customizable, giving customers the flexibility to tailor configurations to their specific needs. That adaptability could result in varied pricing, which is still being determined. Bernstein asserts that NVIDIA is confident in its position in the inferencing market, emphasizing that its GPU technology outperforms custom ASICs, particularly on total cost of ownership.
Furthermore, the company believes that AI training infrastructure will continue to expand, driving significant growth in inferencing revenues, especially in applications such as recommender engines and search.
Bernstein also points to promising opportunities in the sovereign AI sector, as countries and regions build out their own AI infrastructure. Concerns about power consumption were downplayed, with NVIDIA noting that major customers have been planning their energy requirements for several years, easing fears of potential bottlenecks.