Decentralization doesn’t need to be sacrificed to make web3 usable, but the way infrastructure is delivered needs a serious rethink.

Web3 is open, transparent, and miserable to build on | Opinion

2025/10/08 17:14

Disclosure: The views and opinions expressed here belong solely to the author and do not represent the views and opinions of crypto.news’ editorial.

Web3 is fond of declaring that everything is “on-chain.” In theory, that should make development easier, faster, and more accessible than ever. In practice, it’s a logistical nightmare.

Summary
  • Blockchain data is public but largely unusable — developers must build custom backends and patch unreliable tools instead of focusing on products.
  • Unlike web2, where stable infrastructure (AWS, Stripe, Firebase) “just works,” web3 forces teams to constantly rebuild basics, deterring serious companies.
  • Enterprises avoid web3 because it lacks reliability, oversight, and plug-and-play tools — whitepapers don’t replace service guarantees and monitoring.
  • For web3 to scale, it must deliver boring-but-essential infrastructure: cross-chain standards, predictable services, and usability without sacrificing decentralization.

Yes, blockchain data is technically public. But that doesn’t make it usable. Most of it is stored in ways that are hard to search or interpret unless you already know exactly what you’re looking for. As a result, developers often have to collect and organize that data themselves, working off-chain and relying on external services just to build basic features. Even with some tools available, many teams still end up building their own backend systems from scratch. That means spending time and money on infrastructure instead of improving the product.
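A minimal sketch makes the point concrete. The raw log below mimics what an EVM JSON-RPC `eth_getLogs` call returns; the field names follow that convention, but all values are illustrative. Nothing in the log itself says it represents a token transfer — the developer must already know the event's ABI to interpret the 32-byte data word, which is exactly the "you must know what you're looking for" problem.

```python
# Hypothetical raw EVM event log, shaped like an eth_getLogs result.
# Values are illustrative placeholders, not real on-chain data.
raw_log = {
    "address": "0x...token-contract...",
    "topics": [
        "0xddf252ad...",   # event signature hash: opaque without the ABI
        "0x...sender...",
        "0x...recipient...",
    ],
    # 32-byte big-endian word; meaningless unless you know it is a uint256 amount
    "data": "0x0000000000000000000000000000000000000000000000000de0b6b3a7640000",
}

def decode_transfer_amount(log: dict) -> int:
    """Decode the uint256 amount from a transfer log's data field.

    The log doesn't describe itself: only out-of-band knowledge of the
    event ABI tells us this hex blob is a token amount in base units.
    """
    return int(log["data"], 16)

amount = decode_transfer_amount(raw_log)
print(amount)  # base units; converting to a human figure also needs the
               # token's decimals, which live in yet another contract call
```

Multiply that small decoding step across every event type, contract, and chain an app touches, and the custom-backend problem described above follows directly.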

This isn’t just inconvenient. It’s a structural failure. And unless we fix it, web3 won’t scale beyond tinkerers and ideologues.

In web2, infrastructure doesn’t get in the way

In web2, the tools developers rely on (like AWS, Stripe, or Firebase) are stable and dependable. Developers don’t have to worry about whether these services will work; they usually just do. If something goes wrong, it’s rare enough to make headlines. The default expectation is simple: this will work as expected.

Web3 doesn’t offer that same reliability. The tools developers rely on often break or give different results depending on where the data comes from. So instead of focusing on building their apps, developers end up fixing problems themselves — running their own servers, writing extra code to piece things together, and spending more time managing systems than building products.

That’s not innovation. It’s wasted effort. In web2, developers can rely on solid building blocks and focus on their actual product. In web3, they often have to rebuild those basic tools from the ground up. That might be fine for hobbyists, but for serious teams with customers, deadlines, and investors, it’s a nonstarter.

Building on web3 still means starting from scratch

And the problem goes deeper. Even though blockchain data is supposed to be transparent, there’s no standard way to access or understand it. Simple ideas like “transaction” or “account” can mean very different things depending on the blockchain. There’s no common interface, no plug-and-play tools. Developers are left dealing with all the messy differences themselves: writing custom code, patching together unreliable services, and starting from scratch every time they launch on a new chain.
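In practice, teams paper over these differences with hand-written per-chain adapter layers. The sketch below shows the shape of that work under assumed names — `Transfer`, `ChainAdapter`, and the stubbed adapters are hypothetical, not an existing standard — and why each new chain means writing another mapping from scratch.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

# Hypothetical unified record; the field names are assumptions for illustration.
@dataclass
class Transfer:
    chain: str
    sender: str
    recipient: str
    amount: int  # smallest unit (wei, lamports, ...)

class ChainAdapter(ABC):
    """The per-chain translation layer teams typically end up writing by hand."""
    @abstractmethod
    def fetch_transfers(self, block: int) -> list[Transfer]: ...

class EvmAdapter(ChainAdapter):
    def fetch_transfers(self, block: int) -> list[Transfer]:
        # On an EVM chain, transfers are decoded from event logs
        # (stubbed here; a real adapter would call eth_getLogs and decode ABIs).
        return [Transfer("ethereum", "0xabc", "0xdef", 10**18)]

class SolanaAdapter(ChainAdapter):
    def fetch_transfers(self, block: int) -> list[Transfer]:
        # ...while on Solana they are parsed out of transaction instructions,
        # a completely different data model (stubbed likewise).
        return [Transfer("solana", "Alice111", "Bob222", 1_000_000_000)]

# Only after this bespoke layer exists can downstream code stay chain-agnostic.
adapters: list[ChainAdapter] = [EvmAdapter(), SolanaAdapter()]
transfers = [t for a in adapters for t in a.fetch_transfers(block=0)]
```

Every chain added means another adapter, another data model to learn, and another source of subtle inconsistency — which is the complexity tax the next paragraph describes.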

The result? Most of the time and effort that should go toward building great products instead goes toward managing complexity. That’s not just inefficient, it’s self-sabotage. It’s also a major reason why big companies stay away.

Why enterprises don’t bite

Enterprises aren’t against decentralization; it’s that they weigh every new technology against risk, cost, and return. And right now, web3 just doesn’t add up.

These teams expect reliable systems, clear oversight, and tools they can plug into and trust. What they find instead is an ecosystem full of interesting ideas that’s missing the basics they need to operate at scale. A whitepaper isn’t enough. They want service guarantees, real-time monitoring, and infrastructure that quietly does its job, not one that constantly needs hand-holding.

So until web3 can match web2’s day-to-day reliability and offer something new, most companies will simply opt out.

Web3 must be easier to build on without giving up its values

This doesn’t mean web3 has to give up its values, but it does have to grow up. Decentralization doesn’t need to be sacrificed to make web3 usable, but the way infrastructure is delivered needs a serious rethink.

We need interfaces that work across chains without custom hacks. Services that are modular, flexible, and don’t trap teams in vendor-specific tools. Developers shouldn’t need a PhD in blockchain mechanics just to pull useful data. They should be able to focus on their product, not babysitting the underlying systems.

Most teams can’t afford to become infrastructure experts, and they shouldn’t have to. Web3 needs to offer a development experience that’s boring in the best way: predictable, stable, and fast. And if it doesn’t get there soon, it risks missing its moment.

Miss this window, miss the market

Web2 cloud platforms didn’t win just because they were powerful. They won because they were easy. Developers could launch a service with a credit card and scale it by editing a few lines of configuration, not rewriting the entire backend.

That simplicity came with trade-offs: vendor lock-in, murky pricing, and centralized control. Web3 was supposed to solve those problems. It promised decentralized infrastructure with built-in resilience, shared ownership, and transparent rules. But instead of building something fundamentally new, much of the current ecosystem is just rebranding web2 patterns with a token slapped on top.

The opportunity is sitting right in front of us: decentralized infrastructure that’s not just open, but also usable. Systems that are reliable because they’re coordinated through aligned incentives, not corporate trust. Infrastructure that developers don’t have to fight with.

The window is open. But it won’t be forever.

Max Legg

Max Legg is the founder of Pangea, the first permissionless orchestration layer for AI and blockchain: an anti-fragile, sovereign, and stream-first approach to blockchain resources across chains and ecosystems.
