CITIC Securities: Domestic Large Models Exceed Expectations, Emphasizing Investment Opportunities in Domestic Computing Power

A CITIC Securities research report states that domestic large models saw a surge in token usage during the 2026 Spring Festival. As of February 22, the three most-used large models globally by weekly token volume were all domestic models. We believe the explosive token growth fundamentally reflects an exponential rise in AI inference demand, and that domestic computing power, with its cost advantages and steadily improving ecosystem, is expected to gradually dominate the infrastructure layer. We recommend focusing on valuation re-rating opportunities driven by rising super-node interconnection density, including the optical communication, high-speed cable module, switching chip and switch, and IDC segments.

Full Text

Communication | Domestic Large Models Surpass Expectations, Focus on Domestic Computing Power Investment Opportunities

During the 2026 Spring Festival, domestic large model token usage exploded. As of February 22, the three most-used large models globally by weekly token volume were all domestic models. We believe the core drivers of this growth are: 1) AI shifting from an optional feature to a default interface for both consumer and enterprise users, driving full penetration; 2) application scenarios evolving from simple dialogue to multimodal (text/image/audio/video) workloads and AI agents, with per-task token consumption rising exponentially. In addition, according to OpenRouter data, over 70% of token consumption comes from internet giants, large and medium-sized enterprises, and professional developers running production workloads, whose per-call token volumes far exceed those of individual users or small test projects. The explosive growth in tokens essentially reflects an exponential expansion of AI inference demand.

Hardware Layer: Domestic Computing Power Expected to Gradually Dominate Infrastructure Due to Cost Advantages and Ecosystem Improvements

During the Spring Festival, Alibaba and Tencent announced major progress in NPO (near-packaged optics). Alibaba Cloud's all-optical scale-up network architecture, UPN512, connects xPUs and switches via optical interconnects, using a single-layer CLOS topology to achieve full interconnection of 512 xPUs. This design eliminates high-speed copper cables inside the cabinet, significantly reducing wiring complexity, heat dissipation and power-delivery requirements, and maintenance costs, with power consumption halved and costs reduced by 30%. We believe domestic computing infrastructure, leveraging these cost and ecosystem advantages, can evolve from serving domestic giants to supporting global users through domestic AI models, raising the demand ceiling for domestic computing power.
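To illustrate why a single-layer CLOS keeps the fabric simple, here is a minimal sketch of a plane-per-port design: every xPU dedicates one port to each switch plane, so any two xPUs are exactly one switch hop apart. The port and radix values below are assumptions chosen for illustration, not Alibaba's published UPN512 specification.

```python
def single_layer_fabric(n_xpus: int, switch_radix: int, ports_per_xpu: int):
    """Return (planes, switch_count, hops) for a flat one-tier fabric.

    Toy model with assumed parameters, not the UPN512 design: each xPU
    uplink port attaches to its own radix-wide switch plane, so the
    fabric needs no second switching tier at all.
    """
    if switch_radix < n_xpus:
        raise ValueError("one-tier full interconnect needs radix >= endpoint count")
    planes = ports_per_xpu   # one switch plane per xPU uplink port
    switch_count = planes    # one radix-wide switch chip per plane
    hops = 1                 # xPU -> switch -> xPU, for any pair
    return planes, switch_count, hops

planes, switches, hops = single_layer_fabric(n_xpus=512, switch_radix=512, ports_per_xpu=8)
print(planes, switches, hops)  # 8 8 1
```

The single-hop property is what removes the intra-cabinet copper tier: there is no leaf-to-spine stage whose short-reach links would otherwise be cabled in copper.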

The US-China AI Arms Race Is Heating Up; Focus on Domestic Computing Infrastructure

Currently, major cloud providers in both China and the US are sharply increasing AI-related capital expenditure. According to CNBC on February 20, OpenAI has told investors it aims to invest roughly $600 billion in computing power by 2030. According to the Financial Times on December 23, 2025, ByteDance has preliminarily planned capital expenditure of 160 billion RMB for 2026, up from about 150 billion RMB in 2025. Alibaba also announced at the 2025 Cloud Computing Conference that it will invest beyond the 380 billion RMB already planned for the next three years. Within domestic computing power, the super-node architecture is a necessary path to catching up and overtaking, and cloud providers and equipment makers are accelerating adoption of open protocols. We recommend focusing on valuation re-rating opportunities driven by rising interconnection density, including the optical communication, high-speed cable module, switching chip and switch, and IDC segments.

  1. Optical Communication: Against a backdrop of sustained strength in the AI industry, optical interconnects are gradually replacing copper connections and becoming key to high-performance, high-bandwidth, low-latency AI networks. At the scale-out level, pluggable optical modules remain the undisputed first choice; at the scale-up level, a newly forming market, optical technology is evolving rapidly, with NPO, CPO, and other solutions advancing in parallel.

  2. High-Speed Cable Modules: These provide short-distance, high-bandwidth, low-loss board-to-board connections, such as in-cabinet scale-up interconnects. As super-node sizes expand and interconnect speeds rise, their per-system value is expected to grow significantly.

  3. Switching Chips and Switches: According to NADDOD and SemiAnalysis estimates, as NVIDIA's super-node scale grows from NVL72 to NVL576 or beyond, the ratio of GPUs to switching chips approaches 1:1, or even tips toward more switching chips than GPUs. AMD's and Huawei's super-node solutions confirm the same trend. As the number of switching chips per GPU rises, the value of the switching segment increases substantially.

  4. IDC: An IDC (Internet Data Center) is the foundation of computing power, essentially a heavy-asset leasing business with strong technological attributes. Initial capex is large: construction typically takes 1-3 years, and ramp-up to occupancy another 6-18 months. A utilization (rack-occupancy) rate of around 50% is typically enough to break even, while top-tier IDC providers with utilization above 90% can achieve gross margins over 50% and net margins over 30%. With industry demand rising rapidly, IDC providers are expected to benefit significantly, especially resource-rich leaders, which should see outsized growth.
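The scaling argument in point 3 can be sketched with a toy non-blocking-fabric model. The port counts, switch radix, and topology below are assumptions chosen for illustration, not NVIDIA, AMD, or Huawei datasheet figures: when one chip's radix no longer covers every GPU, the fabric is forced from a single switching tier into a two-tier fat-tree, and chip count per GPU jumps.

```python
import math

def fabric_chips(n_gpus: int, ports_per_gpu: int, radix: int) -> int:
    """Switch chips needed for a non-blocking GPU fabric (toy model).

    Assumption: if one chip's radix covers every GPU, build one
    single-tier switch plane per GPU fabric port; otherwise build a
    two-tier fat-tree where half of each leaf chip's ports face GPUs
    and the other half feed spine chips.  All parameter values are
    illustrative, not vendor specifications.
    """
    if n_gpus <= radix:
        return ports_per_gpu                        # one chip per plane
    total_ports = n_gpus * ports_per_gpu
    leaves = math.ceil(total_ports / (radix // 2))  # GPU-facing half
    spines = math.ceil(leaves * (radix // 2) / radix)
    return leaves + spines

for n in (72, 576):
    chips = fabric_chips(n, ports_per_gpu=18, radix=144)
    print(f"{n} GPUs -> {chips} chips ({n / chips:.2f} GPUs per chip)")
```

Under these assumed parameters, chips per GPU roughly triples once the second tier is required, which is the mechanism behind the GPU-to-switch-chip ratio tightening toward 1:1 as super-nodes scale.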
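The utilization economics described in point 4 can be sketched with a toy fixed-versus-variable cost model. The revenue, fixed-cost, and variable-cost figures below are assumptions calibrated to the 50% breakeven claim, not numbers from the report; the point is only that in a heavy-asset leasing business, margins are extremely sensitive to utilization because costs are dominated by the fixed component.

```python
def idc_margin(utilization: float,
               full_rack_revenue: float = 100.0,
               fixed_cost: float = 45.0,      # depreciation, rent, staff (assumed)
               variable_rate: float = 0.10):  # power etc., per unit of revenue (assumed)
    """Operating margin of a data center at a given rack-utilization rate."""
    revenue = full_rack_revenue * utilization
    if revenue == 0:
        return float("-inf")
    cost = fixed_cost + variable_rate * revenue
    return (revenue - cost) / revenue

for u in (0.5, 0.7, 0.9):
    print(f"utilization {u:.0%}: operating margin {idc_margin(u):+.0%}")
```

With these assumed parameters, margin is exactly zero at 50% utilization and climbs steeply thereafter, which is why occupancy rate, not rack count alone, separates top-tier IDC operators from the rest.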

Risk Factors:

  • AI technology development and application falling short of expectations
  • Domestic new infrastructure development underperforming
  • AI-related policies not being implemented as expected
  • Cloud providers and operators’ capital expenditures falling short
  • Geopolitical risks

Investment Strategy:

During the 2026 Spring Festival, domestic large model token usage surged. As of February 22, the three most-used large models globally by weekly token volume were all domestic models. We believe the explosive token growth fundamentally reflects an exponential rise in AI inference demand, and that domestic computing power, with its cost advantages and improving ecosystem, is expected to gradually dominate the infrastructure layer. Focus on valuation re-rating opportunities driven by rising super-node interconnection density, including the optical communication, high-speed cable module, switching chip and switch, and IDC segments.

(Source: People’s Financial News)
