Recently, a question has sparked discussion in the community: should data storage be built on centralized infrastructure or should we jump on the decentralized train early?



This is truly a dilemma. On one hand, centralized solutions like AWS are remarkably stable, fast, and cheap, and they even compensate customers when outages happen. On the other hand, decentralized storage sounds ideal, but it still makes people nervous: there just aren't many mature, battle-tested deployments yet.

However, a recent project has some interesting technical details. It uses RedStuff erasure coding combined with on-chain proof mechanisms, which in theory can keep data intact and recoverable even if up to two-thirds of the nodes go offline. It also employs encryption schemes like Seal to guard against privacy leaks. From a technical perspective, this addresses a lot of pain points: centralized solutions are fast, but they always carry the risk of data being deleted or stolen.
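To make the fault-tolerance claim concrete, here is a minimal, purely illustrative sketch of how erasure coding achieves it. This is not RedStuff itself (whose actual construction is more involved); it is a toy Reed-Solomon-style encoder in Python, with n = 9 shards of which any k = 3 are enough to rebuild the data, so roughly 66% of the shards can vanish without losing anything. All names and parameters here are made up for illustration.

```python
# Toy erasure-coding sketch over the prime field GF(257).
# NOT the real RedStuff scheme -- just the general Reed-Solomon idea:
# encode k data bytes into n shards so that ANY k shards rebuild the data.
# With n = 9 and k = 3, up to 6 of 9 shards (~66%) can be lost.

P = 257  # prime modulus; every byte value 0..255 fits in the field

def _lagrange_at(points, x):
    """Interpolate the polynomial through (xi, yi) points and evaluate at x, mod P."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P  # den^-1 via Fermat
    return total

def encode(data_bytes, n, k):
    """Turn k data bytes into n shards (x, value); any k shards suffice to decode."""
    assert len(data_bytes) == k
    points = list(zip(range(1, k + 1), data_bytes))  # data = polynomial values at x=1..k
    return [(x, _lagrange_at(points, x)) for x in range(1, n + 1)]

def decode(shards, k):
    """Rebuild the original k data bytes from any k surviving shards."""
    assert len(shards) >= k
    points = shards[:k]
    return bytes(_lagrange_at(points, x) for x in range(1, k + 1))

# Demo: 9 shards, any 3 suffice -> tolerates 6 of 9 (~66%) shards being lost.
shards = encode(b"hey", n=9, k=3)
survivors = [shards[2], shards[5], shards[8]]  # pretend the other 6 nodes went offline
assert decode(survivors, k=3) == b"hey"
```

The arithmetic generalizes: with n shards where any k are enough for recovery, the system survives n - k shard losses, which is where a roughly two-thirds figure comes from when k is about a third of n. Real systems apply this per fixed-size symbol across large blobs rather than to a few bytes at a time.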

Right now, everyone is talking about how AI data will explode by 2026, which means storage demand will skyrocket. How much of an edge will the teams already deploying decentralized storage have by then? It's genuinely hard to say.

Many people appear to be sitting on the fence, watching and waiting, while quietly testing the waters already. Will you keep betting on the "security" of centralized solutions, or do you want to see how far these new schemes can go?
GasGoblin
· 47m ago
A 66% fault tolerance sounds impressive, but can it really operate stably? I'm still a bit skeptical.
RebaseVictim
· 14h ago
Alright, to be honest, AWS is stable, but I'm tired of feeling locked in.
ChainSauceMaster
· 14h ago
Hey, this RedStuff erasure code sounds interesting, but to really bet your data on distributed nodes like that, you'd need nerves of steel.
Layer2Arbitrageur
· 14h ago
ngl the 66% node downtime tolerance is just theoretical cope lmao. show me actual production data & i'm listening. until then it's all calldata compression vibes.
GateUser-addcaaf7
· 14h ago
To be honest, RedStuff's erasure coding system is indeed impressive, but a 66% fault tolerance rate is still a bit risky in actual node maintenance.
0xDreamChaser
· 14h ago
Taking a hard look at the RedStuff erasure code, honestly, this set of claims might be a bit overly optimistic.
CryptoTarotReader
· 15h ago
This RedStuff erasure code sounds promising, but actually running it is another story. The AWS crowd already dominates the market; decentralization sounds nice but isn't without pitfalls. Surviving 66% of nodes going down is a bit of a stretch to boast about... Let's wait until 2026; it's too early to act now. Still, the ones quietly testing the waters have definitely come out ahead, so I need to think about it too.
BetterLuckyThanSmart
· 15h ago
I think RedStuff's erasure coding system is quite interesting, but is a 66% fault tolerance really enough? It seems like we should wait for more case studies to be published.