Web3 storage has always faced two stubborn problems: NFTs load slowly, and storage costs are ridiculously high. I recently came across a storage protocol that uses dynamic erasure coding to tackle both.

The core idea is actually simple: hot data (the frequently accessed parts) lives on nodes close to users, ensuring fast access, while cold data (the rarely touched parts) is split into redundant fragments spread across multiple locations for durability. That way it neither wastes bandwidth nor keeps anyone up at night worrying about data loss.
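
To make that concrete, here is a minimal sketch of what such a placement policy could look like. The access threshold, the 3-replica hot tier, and the 10-of-14 erasure parameters are illustrative assumptions on my part, not numbers from the protocol.

```python
from dataclasses import dataclass

HOT_THRESHOLD = 100  # accesses/day above which an object counts as "hot" (assumed)

@dataclass
class Placement:
    strategy: str   # "replicate" for hot data, "erasure" for cold data
    total: int      # full replicas (hot) or total shards n (cold)
    required: int   # pieces needed to read back: 1 replica, or k shards

def plan_placement(accesses_per_day: int) -> Placement:
    """Decide how an object should be stored based on access frequency."""
    if accesses_per_day >= HOT_THRESHOLD:
        # Hot data: a few full replicas on nodes near the readers,
        # trading extra storage for low-latency access.
        return Placement(strategy="replicate", total=3, required=1)
    # Cold data: Reed-Solomon-style erasure coding into k data shards
    # plus m parity shards scattered across regions; any k of the
    # n = k + m shards can rebuild the object, so it survives m lost
    # nodes while storing only (k + m) / k times the data.
    k, m = 10, 4
    return Placement(strategy="erasure", total=k + m, required=k)

if __name__ == "__main__":
    print(plan_placement(500))  # hot: 3 replicas close to readers
    print(plan_placement(2))    # cold: 14 shards, any 10 reconstruct
```

The appeal of the erasure path is the overhead: with 10-of-14 coding the network tolerates four lost shards while storing only 1.4x the data, versus 3x for triple replication, which is presumably where the claimed cost savings would come from.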

From a cost perspective, this approach can cut storage costs by more than 40%, a real saving for developers and users alike. The project also uses token rewards to incentivize nodes that provide quality service, and its testnet has already attracted more than 20,000 nodes, all competing on efficiency.

If the Web3 ecosystem truly wants to support applications that require massive storage, such storage solutions will inevitably become widespread. Worth paying attention to.
NFTArchaeologis
· 7h ago
The logic of hot and cold stratification is essentially a reinterpretation of the information lifecycle, similar to ancient book restoration—pages that are frequently turned should be placed within easy reach, while rare, sealed copies have their own dedicated preservation. A 40% reduction in costs is not a small figure, but the key is whether this can truly attract creators who produce in-depth content to get involved.
BlockchainWorker
· 7h ago
Reducing costs by 40%? Is this number real? Why are so many nodes still running on the testnet? Wait, are these two data points correct... Over 20,000 nodes but can the fees really be this cheap? Dynamic erasure coding sounds good, but who guarantees these nodes are truly reliable? We still need to see mainnet data before trusting it. What does the popularity of the testnet really indicate? This kind of solution should have existed earlier; storing an NFT used to be ridiculously expensive. Node incentives are interesting, but will the tokens keep depreciating? A 40% reduction seems a bit exaggerated; maybe it needs to be discounted further. At least it's a direction. Web3 storage has to iterate step by step like this.
MEVHunter
· 7h ago
nah the erasure coding angle is interesting but where's the real arbitrage play here? storage cost reduction sounds nice on paper till you realize node operators are gonna race to the bottom on pricing anyway lol
ruggedNotShrugged
· 7h ago
40% cost reduction sounds good, but can it really be implemented, or is it just another testnet dream? The figure of 20,000 nodes seems a bit inflated; how many are actually working? Separating hot and cold data is an old trick; the key still depends on how long token incentives can sustain it. Has the fundamental problem of NFT loading lag been solved, or is it just cheaper storage? If this thing can truly reduce costs, why aren't existing major projects using it yet? Let's wait and see. Token incentives for node participation carry inflation expectations; how will this economic model be maintained? Dynamic erasure coding is indeed clever, but does it have any advantages over IPFS? Not every problem can be solved by stacking technology; the pitfalls of Web3 storage run deep.
ponzi_poet
· 7h ago
It's the same old story, but this time it seems to be something real. A 40% reduction in costs sounds tempting, but I wonder if it will just turn out to be another PPT project. 20,000 nodes sound decent, but why should we believe they can actually maintain efficiency? The hot-cold data separation approach has been around for a while; the key is who can use incentives to keep the nodes engaged.
SpeakWithHatOn
· 7h ago
A 40% cost reduction sounds good, but will it actually be implemented smoothly, or will it be another story? These testnet projects are popular, but once they go live on mainnet it's hard to say whether the nodes can be retained. Fast speeds and low costs are useless if the project runs away with the funds. Dynamic erasure coding sounds impressive, but in reality it's just data dispersal, something that's been done before. With 20,000 nodes participating now, whether the incentive mechanism holds up is anyone's guess. It feels like another well-packaged storage solution; let's wait and see. The layered logic of hot and cold data is an interesting idea, but how do you ensure consistency in a decentralized system?
View OriginalReply0