HawkInsight


Nvidia's earnings report is about to be announced. Can "computing anxiety" be cured?

After hours on February 26, the global AI chip overlord will hand in a key quarterly report card.

On February 24, ahead of Nvidia's earnings announcement, Morgan Stanley analysts Joseph Moore and Mason Wayne noted in their latest report that demand for Nvidia's Hopper-architecture chips has rebounded significantly over the past two months, and that the technical bottleneck of the next-generation Blackwell GB200 chip has been resolved. Pending export-control policies, however, still cast a shadow over the upcoming financial report.

The report predicts that Nvidia's revenue this quarter will roughly match the market consensus of US$42 billion, and maintains an "overweight" rating with a target price of US$152, about 13% above the current share price. After the market closes the next day (February 26), the global AI chip leader will hand in a key quarterly report card. Market expectations and doubts are intertwined over whether it can sustain the legend: on one hand, generative AI is driving an explosion in demand for computing power; on the other, geopolitics continues to disrupt the supply chain.

From the perspective of business fundamentals, Nvidia's moat is still solid.

In the third quarter of fiscal 2025 (ended October 2024), revenue rose 94% year-on-year to US$35.1 billion, and net profit rose 109% year-on-year to US$19.3 billion. The data center business contributed 87.6% of revenue, up 112% year-on-year. Supporting this growth are the continued strength of the Hopper architecture (H100/H200) and the ramp-up of Blackwell production capacity. Although the market had worried that Hopper demand would slow as Blackwell replaced it, Morgan Stanley's survey showed that cloud service providers' willingness to purchase Hopper has rebounded, partly because customers placed orders in advance out of concern about tightening export controls, creating "transitional demand".

Jensen Huang emphasized on an earnings conference call that Hopper demand will continue into fiscal year 2026, while Blackwell deliveries are expected to reach "billions of dollars" in the fourth quarter of fiscal 2025 and keep climbing. This "symbiosis" between old and new architectures reflects the layered nature of computing-power demand: Hopper remains the workhorse in current data centers for accelerating traditional data processing and small- to medium-scale AI training, while Blackwell targets the training and inference of large models with hundreds of billions of parameters.

However, the complexity of technology iteration and supply-chain management is testing Nvidia's gross-margin resilience. The "unprecedented complexity" of the Blackwell system (spanning air and liquid cooling, multiple NVLink configurations, compatibility with both Grace and x86 architectures, and more) has driven up initial production costs: the non-GAAP gross margin is expected to drop to around 70% early in the ramp, then recover to 75% over several quarters as yields improve and economies of scale emerge. That volatility contrasts with Hopper's margins, which, thanks to mature production lines and improved supply, are now a pillar of profit.

Still, Jensen Huang's long-term logic is clear: Blackwell breaks through single-chip performance bottlenecks via architectural innovation (such as multi-chip integration), and its performance-per-watt advantage helps customers maximize data center revenue under power constraints, thereby consolidating Nvidia's pricing power.

The real uncertainty comes from geopolitics.

U.S. export controls on AI chips bound for China hang over Nvidia like a sword of Damocles, forcing it to build the performance-downgraded H20 chip for the Chinese market. Even so, revenue from mainland China and Hong Kong grew only 34% year-on-year in the third quarter of fiscal 2025, far below the United States (135%) and Singapore (185%). Morgan Stanley believes customers may work around the restrictions by buying lower-performance products or training overseas, but if regulations tighten further (for example, by capping chip interconnect bandwidth or compute density), Nvidia's global market share could be eroded by local manufacturers.

On the other hand, the rise of sovereign AI (AI infrastructure built by individual nations) has opened new markets for Nvidia, though the company acknowledged in its financial report that it has not yet obtained licenses to export restricted products to China, nor have its partners. This pattern of losses in the east offset by gains in the west can buffer short-term risks, but it cannot eliminate long-term growth risks.

A deeper contradiction is that the generative AI arms race has escalated from the enterprise level to national strategy. Nvidia repeatedly invoked the concept of "AI factories" on conference calls: data centers are transforming from general-purpose computing platforms into facilities dedicated to producing intelligence, with Blackwell as the "core engine" of such factories. This paradigm shift has spawned cross-industry demand: beyond cloud giants such as Google and Microsoft, vertical players such as Tesla and xAI are building supercomputing clusters (xAI's Colossus cluster, for example, runs 100,000 Hopper GPUs), and even Japan's SoftBank is building the country's most powerful AI supercomputer on Blackwell.

Jensen Huang frames the current moment as "two fundamental shifts": the accelerated migration of computing from CPU to GPU, and a software-development revolution from hand-written code to machine learning. The resonance of these two trends makes Nvidia's ecosystem moat far larger than the hardware itself: the deep integration of the CUDA software stack, the AI Enterprise platform, and global partners constitutes a "full-stack advantage" that is difficult to replicate.

For investors, Nvidia's short-term fluctuations and long-term value must be weighed together. Export controls may suppress performance guidance, but infrastructure construction for generative AI is still in its early stages. According to Jensen Huang, the training compute requirements of next-generation large models may be 10-40 times those of current models, and Blackwell is designed for exactly this kind of exponential growth.

In addition, the explosion of the inference market (which accounts for more than 60% of Nvidia's installed base) and the rise of agentic AI will provide a channel for continuously monetizing existing hardware. If Nvidia can advance its technology roadmap on schedule (such as the Rubin platform in 2026), its dominant position may strengthen further. However, regulatory risks (such as the rumored U.S. Department of Justice antitrust investigation) and supply-chain resilience (CoWoS packaging and HBM memory capacity) still warrant close attention.

As of pre-market trading on February 24, Nvidia's share price was up slightly, about 1%. The market seems to have partially priced in policy risks and turned to betting that AI computing power is an unavoidable necessity. The $3.6 trillion giant stands at a crossroads: technological innovation gives it the power to define the future, but geopolitical waves can instantly change its course. The only certainty is that Nvidia remains an irreplaceable "ferryman" in the torrent of general-purpose computing migrating to accelerated computing.


Disclaimer: The views in this article are from the original Creator and do not represent the views or position of Hawk Insight. The content of the article is for reference, communication and learning only, and does not constitute investment advice. If it involves copyright issues, please contact us for deletion.