Who can accurately calculate the gross profit margin these AI large-model companies make on their coding packages? Anthropic's gross profit margin is reportedly around 40%.


Domestic companies like Zhipu and MiniMax open-source their models, so anyone with enough compute can deploy them directly. That caps how high coding-package pricing can go, and the gross margin may be very low or even negative. Moreover, the business model of selling APIs differs from SaaS and other internet applications: every token carries a real cost, with no near-zero marginal cost to fall back on.
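The contrast with near-zero-marginal-cost SaaS can be made concrete with a toy calculation. All prices and usage numbers below are made up for illustration; none come from any actual vendor:

```python
# Hypothetical unit economics for a flat-rate coding subscription.
# Every number here is an illustrative assumption, not a real figure.

def gross_margin(monthly_price, tokens_used, cost_per_million_tokens):
    """Gross margin for one subscriber: (revenue - compute cost) / revenue."""
    compute_cost = tokens_used / 1_000_000 * cost_per_million_tokens
    return (monthly_price - compute_cost) / monthly_price

# A light user leaves plenty of margin on a flat $20 plan...
light = gross_margin(monthly_price=20, tokens_used=50_000_000,
                     cost_per_million_tokens=0.10)
# ...while a heavy user on the same plan pushes the margin negative,
# because each extra token still costs real compute.
heavy = gross_margin(monthly_price=20, tokens_used=300_000_000,
                     cost_per_million_tokens=0.10)

print(f"light user margin: {light:.0%}")  # 75%
print(f"heavy user margin: {heavy:.0%}")  # -50%
```

This is why flat-rate coding packages are so sensitive to the usage mix: unlike a SaaS seat, a subscriber's cost scales with every token they consume.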
Zhipu has also run into compute shortages: they sold too many cheap tokens early on, and now their coding packages can't be sold freely and are capped with daily limits. Anthropic runs the same business model without such limits, but it's unclear whether they are burning money to keep serving demand, or whether their compute is genuinely insufficient in a way that more money can't solve.
One consequence of going public is that the next earnings report will show whether they are really struggling. Chinese models are cost-effective, but it's hard to say how much profit the current coding packages actually generate, or whether the business model is sustainable. For now it's more of a totem of faith, a key investment target for the AI narrative.
There's also an interesting chain of logic I only recently came to understand:
1. After large models took off, chip companies benefited first, since training demands compute. Nvidia's stock skyrocketed; every large-model company needs chips, they bid against one another, and the highest bidders pay the biggest premiums, producing high profits.
2. Next comes memory. Training and inference for large models demand enormous amounts of memory, and as AI applications multiply and inference demand grows, buyers race to lock in memory supply, again at high prices and premium margins.
3. Optical interconnects for data transmission. The explosive growth of AI compute has accelerated the upgrade from copper to optical links, making optical transmission core infrastructure for AI computing.
What might the next field be? Companies that provide computing infrastructure and services are likely candidates. As inference demand surges and compute shortages push prices up, whoever can buy up more compute can earn a bigger premium. Amazon and Alibaba Cloud may have the chance to capture that part of the premium.