AI Booming, Memory Stocks Going Crazy? A Single RAM Module Tops 40,000 Yuan, and the Whole Industry Chain Is Celebrating
Brothers, have you seen it lately? A single memory module has skyrocketed past 40,000 yuan; a box of them could buy a house in Shanghai.
And don't think it's just domestic hype. Fundamentally, this is AI eating up memory: a genuine "memory shortage"!
Today, let's break down this chart and see how AI is devouring memory, and which industries are riding the wave.
1. Why does memory surge when AI takes off?
Merchants in Shenzhen are all saying "no more stocking": single memory modules have soared past 40,000 yuan, and a full box is worth 4 million, more than many Shanghai apartments.
Many people's first reaction is that it's speculation. Wrong! It isn't hype; AI really has gobbled up the memory supply.
At its core, this is a global memory shortage triggered by AI large models: high-end memory capacity has been locked up by the AI giants, ordinary memory is being snatched up too, and supply and demand have spiraled out of control, sending prices through the roof.
2. Understand in three steps: How exactly does AI "consume" memory?
Don’t think AI only eats computing power; it’s also a "memory-consuming beast," devouring memory in three stages:
Step 1: The model itself is huge
Modern large models can have hundreds of billions, even trillions, of parameters; just to start up, they must load those parameters (and, during training, the training data) into memory.
Not enough memory? The model simply can't run, just like your phone crashing when you open a big game. The same applies to AI.
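A rough back-of-the-envelope sketch of the point above: just holding a model's weights in memory scales linearly with parameter count. The model sizes and the 2-bytes-per-parameter (FP16) figure below are illustrative assumptions, not specs of any particular model:

```python
def weight_memory_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Memory (GB) needed just to hold the model weights.

    Assumes FP16 weights (2 bytes/parameter); quantized models need less,
    and training needs several times more (gradients, optimizer state).
    """
    return params_billion * 1e9 * bytes_per_param / 1e9

for name, params_b in [("7B model", 7), ("70B model", 70), ("1T model", 1000)]:
    print(f"{name}: ~{weight_memory_gb(params_b):,.0f} GB for weights alone")
```

So under these assumptions, a trillion-parameter model wants on the order of 2 TB of memory for the weights alone, before it has done any actual work.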
Step 2: When actually working, memory usage gets even fiercer
Chatting with AI for a long time, having it read lengthy documents, or perform complex tasks: all of this requires temporary "working notes" (the conversation context) to be held in memory.
The longer the conversation and the more complex the task, the more memory is needed. It's a case of "the busier it gets, the more it eats," and ordinary memory can't keep up.
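Those "working notes" are what engineers call the KV cache, and it grows linearly with conversation length. The layer and head counts below are illustrative assumptions (roughly the shape of a large open-weight model with grouped-query attention), not any vendor's published spec:

```python
def kv_cache_gb(context_tokens: int, n_layers: int = 80, n_kv_heads: int = 8,
                head_dim: int = 128, bytes_per_elem: int = 2, batch: int = 1) -> float:
    """Approximate KV-cache size in GB for a decoder-only transformer.

    Two tensors (K and V) per layer, each of shape
    [batch, tokens, kv_heads, head_dim], stored in FP16 (2 bytes/element).
    """
    elems = 2 * n_layers * context_tokens * n_kv_heads * head_dim * batch
    return elems * bytes_per_elem / 1e9

for ctx in (4_096, 32_768, 131_072):
    print(f"{ctx:>7} tokens -> ~{kv_cache_gb(ctx):.1f} GB of KV cache")
```

And that is per conversation: serve many long chats at once and the cache, not the weights, becomes the dominant memory consumer, which is exactly why fast memory is so contested.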
Step 3: Good memory is snatched up immediately
To feed AI, manufacturers developed ultra-fast high-end memory called HBM (High Bandwidth Memory), but nearly all of its capacity has been booked by the AI giants, and even ordinary memory is being snatched up, so shortages keep recurring.
In short, AI not only needs fast computation but also requires a lot of memory—memory is now the critical bottleneck for AI computing power.
3. One chart to understand: AI computing industry chain and opportunities
From upstream to downstream, this wave of AI computing is reshaping the entire industry chain, with opportunities everywhere:
🔹 Upstream: Raw materials and energy (AI’s "grain and fuel")
AI chips rely on rare metals like gallium, germanium, indium; fiber optics need germanium dioxide; data centers are huge power consumers, directly boosting electricity demand.
Rare metals: the "hard currency" of AI chips; related stocks benefit directly.
Carbon neutrality / photovoltaics / electricity: Green energy is the energy foundation of the AI era; energy storage, solar, and power sectors are the "power stations" for AI.
🔹 Midstream: Computing hardware and infrastructure (AI’s "skeleton")
This is the core base of AI computing and the most directly benefiting segment:
Core hardware: GPUs/ASICs are the heart of AI computing; memory chips (HBM) are AI's short-term memory; fiber optics / CPO are its neural pathways; semiconductor-equipment makers are the "shovel sellers," a steadier business than chip manufacturing itself.
Infrastructure: Data centers / intelligent computing centers are the containers for AI; cooling, liquid cooling, cloud computing are essential for operation.
Key point: Storage chips are the direct beneficiaries of the "memory shortage" in AI; semiconductor equipment and fiber optics sectors are also highly profitable.
🔹 Downstream: Application deployment (AI’s "hands and feet")
AI large models are the brain, finally applied in various scenarios:
AI large models: foundational technology support, the source of all applications.
Smart manufacturing / autonomous driving / consumer electronics: robots are the best physical carriers of AI; autonomous driving / intelligent transportation are AI's "mobility scenarios."
Media: AI empowers content creation, boosting media industry efficiency; related stocks benefit directly.
This wave of AI market activity is essentially a "dual-drive" industry explosion—"computing power + memory."
Memory price hikes are just the most obvious signal. From upstream energy and raw materials, through midstream hardware infrastructure, to downstream application scenarios—each link is being pushed forward by AI.
Don’t just focus on the sky-high memory prices; behind it is the entire AI industry chain opportunity. Brothers, have you got it?