Who can accurately calculate the gross margin on these AI large-model companies' coding plans? Anthropic's gross margin is reportedly around 40%.
Domestic companies like Zhipu and MiniMax release open-source models, so anyone with enough compute can deploy them independently. That caps how high coding-plan pricing can go, and margins may be very thin or even negative. Moreover, the business model of selling API access differs from SaaS and other internet applications: every token served incurs a real cost, so there is no near-zero marginal cost.
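The per-token cost point can be made concrete with a small sketch. All numbers below (the plan price, tokens consumed, serving cost per million tokens) are hypothetical illustrations, not figures from this post:

```python
# Hypothetical sketch: gross margin of a flat-fee coding plan when every
# token served has a real cost. All numbers are made-up illustrations.

def gross_margin(monthly_price, tokens_used_millions, cost_per_million_tokens):
    """Return (gross_profit, margin) for one subscriber in one month."""
    serving_cost = tokens_used_millions * cost_per_million_tokens
    profit = monthly_price - serving_cost
    return profit, profit / monthly_price

# A light user vs. a heavy user on the same hypothetical $20/month plan,
# assuming serving costs $0.50 per million tokens.
light = gross_margin(20.0, 10, 0.50)   # 10M tokens: $15 profit, 75% margin
heavy = gross_margin(20.0, 60, 0.50)   # 60M tokens: -$10, negative margin
print(light, heavy)
```

Unlike SaaS, where an extra user costs almost nothing to serve, here a single heavy user can push the same flat-fee plan into a loss, which is why daily caps appear.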
Zhipu has also run into compute shortages: it sold too many cheap tokens earlier, and its coding plans can no longer be sold without restriction, now carrying daily usage caps. With the same business model, Anthropic imposes no such caps, but it's unclear whether it is burning money to keep supply unconstrained, or whether its compute is genuinely insufficient and no amount of extra money would fix it.
The downside of being publicly listed is that the next earnings report will reveal whether the business is truly struggling. Chinese models are cost-effective, but it's hard to say how much profit today's coding plans actually generate, or whether the business model is sustainable. For now it's more of a faith-based totem, an important investment target for the AI narrative.
There's also an interesting logic I just recently understood:
1. After large models took off, the first beneficiaries were chip companies, since training requires massive compute. Nvidia's stock skyrocketed because every large-model company needed its chips; they bid against one another, the highest bidder gets supply, and the premium flows to the chipmaker as high profit.
2. Next came memory. Training and inference for large models demand enormous amounts of memory, and as applications and inference workloads grow, companies race to secure memory capacity, again at high prices and premium margins.
3. Then optical communication for data transmission. The explosive growth of AI compute has driven the upgrade from copper to optical interconnects, making optical transmission core infrastructure for AI compute.
What might be the next field? Companies that provide computing infrastructure and services are likely candidates: as inference demand surges and compute shortages push prices up, whoever can stockpile computing resources stands to capture a larger premium. Amazon and Alibaba Cloud may have the opportunity to take that slice.