I was re-reading some discussions about Nvidia's evolution and found it very interesting how Jensen Huang explains the company's strategic decisions. The guy literally invented the GPU in 1999 and managed to transform Nvidia from a gaming-focused company into what it is today—practically the world's AI factory.
What stands out most is how he talks about extreme co-design. Basically, modern computational problems simply no longer fit on a single computer. You need to distribute the algorithm, break the problem into pieces, and that's where everything gets complex. It's not just about linear scaling like we used to do.
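The "break the problem into pieces" idea can be sketched in a few lines. This is just a toy illustration of partitioning work across workers (here with Python threads, names like `distributed_sum_of_squares` are my own), not anything resembling Nvidia's actual stack:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Each worker handles one piece of the problem independently.
    return sum(x * x for x in chunk)

def distributed_sum_of_squares(data, workers=4):
    # Split the input into roughly equal chunks, one per worker,
    # then combine the partial results. In a real system each chunk
    # would run on a separate GPU or node, not a local thread.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(distributed_sum_of_squares(list(range(1000))))
```

The hard part in practice is exactly what he points at: the split and the recombine step have communication costs, and that's where co-design of hardware and software comes in.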
And here comes something many people don't think about: Moore's Law has slowed down significantly. You know that pattern where transistor density doubled every two years? Well, Dennard scaling (which let supply voltage shrink along with the transistors, keeping power density roughly constant as frequencies climbed) started to fail. This completely changed expectations for technological advancement. You can no longer rely solely on Moore's Law for performance improvements.
Jensen mentions that when designing a computer, you can't just look at hardware. You need an operating system, you need to think about the entire stack—software, compilers, everything. That requires intense discussions among specialists from different fields.
What I find brilliant is how Nvidia navigated this transition. It started as an acceleration company, then moved into general-purpose computing, and now is practically synonymous with AI. But he makes it clear that there's a trade-off between specialization and generalization. You can't be everything to everyone.
Another relevant point: the size of the market literally determines your R&D capacity, and your R&D determines the impact you can have. That’s why big tech companies can do research that startups can't.
There are some technical details that were game-changers too. The introduction of fp32 (32-bit floating-point) in shaders was crucial for programmability. And then came the decision to put CUDA on GeForce, which Jensen describes as an existential risk at the time, but it turned out to be one of the best decisions the company ever made.
He also says that the installed base is the most important thing for an architecture. It’s not the technology itself that defines success—it’s how many developers are using it. That’s why x86, despite always facing criticism, remains the dominant architecture.
All of this shows how well-made strategic decisions—even with risk—can shape an entire industry. Nvidia didn't get stuck in one business model; it adapted as the market demanded. Meanwhile, computing continues to evolve beyond what Moore's Law promised, and solutions are in co-design, distributed processing, and intelligent specialization.