The hype around “web3” has led to a revisionist history of what “web2” was, and in that rewriting of history, web3 tells a convenient myth about decentralization. Here’s why it matters.
Blockchain has gotten a bad name. A promising technology found widespread adoption with the development of Bitcoin and other cryptocurrencies, and lately the NFT trend, seemingly manufactured overnight, has provoked as much ridicule as it has debate about next-generation digital marketplaces. Proponents of these technologies have adopted the moniker “web3” as an umbrella term for the trend of creating alternative marketplaces based on the blockchain. Evangelists for web3 claim it is a movement about decentralization, a sea change from what they claim is the defining trend of “web2,” also known as the web we experience today.
The early days of the World Wide Web, which now bears the retronym Web 1.0, gave us a promise of a decentralized and democratized way of sharing information. Web 1.0 was truly revolutionary; it stood in stark contrast to any other information sharing mechanism in all of human history. The idea was simple: people who held information could make that information available at will, with few mediating influences. Information was to be free, with as few barriers to access as possible. Fundamentally, the idea was that the web should become a decentralized information mesh that spanned the globe.
This approach was not without its challenges, of course. Decentralized networks create a discovery problem, where you can’t find information if you don’t know where to look. The maturation process of Web 1.0 involved the development of tools and practices for information discovery: search engines, knowledge repositories, purpose-built forums, and so on. Web 1.0 was never supposed to be a competition, but if it were, its winners were those who provided services for information discovery. The decentralized web was a challenge to navigate, and they made it easy to do so at scale.
But in Web 1.0 another trend emerged: the development (and migration) of communities into Web 1.0 spaces. Communities became a crucial source of knowledge discovery, and forums and chat rooms and the like proliferated in the Web 1.0 era. By the end of the cycle, we had developed the technology to make dynamic serving of content pretty easy, but even though the content was dynamic, the process still followed the same patterns: request information, serve information, consume information. In other words, nearly everything was still a web page, even if those web pages were generated on the fly.
As this era came to an end, we began to see another problem in this model: even as more people adopted the internet, it still occupied a separate part of our identities. While it was easy to find a forum for almost any interest, from car enthusiasts to horticulturists, people began to feel like their identities in these spaces reflected only a part of who they were. Moreover, Web 1.0 failed to create a shared experience with the offline world. Web 1.0 was perfectly suitable for early adopters, but as more people went online, we found it harder and harder to connect and share our online lives with our offline friends and vice versa. Enter Web 2.0.
If Web 1.0 was about information discovery, Web 2.0, now retroactively named web2, was all about people discovery, aka social networking. Web 2.0 became about integrating the internet with our daily lives. The gap between your online and offline selves began to close. More and more of our life would be mediated through the internet, from ordering pizza delivery to scheduling an oil change. But more than that: we’d begin to share our lives online in ways that were not restricted to cloistered communities, each with their own shibboleths. The social networking era began with Web 2.0, and those who have emerged as social networking giants often didn’t start out that way. Twitter’s core functionality as a microblogging platform has changed very little since its launch, but few could have seen it emerge as a major culture moderator when it first went live. Today, it’s commonplace to see cable news talking about who tweeted what.
Social networking depends on content development. Web 2.0 sought to address the biggest challenge that plagued Web 1.0: content creation is difficult. Coding is really only fun for coders; for everyone else it is a huge and expensive pain in the ass. Web 1.0 was about finding content; Web 2.0 was about generating content. Web 1.0 waited for the content to come to it; Web 2.0 went to where the content was. This required a massive lowering of the barrier to content creation. It also meant that we needed to upend the request, serve, consume cycle of Web 1.0. Web pages needed to become information streams; the maturation of frameworks like jQuery helped developers produce dynamic content served on pages that didn’t refresh, a small but significant change in user experience that transformed the mental model of users of the Web. The web became about serving applications and experiences, not just content.
By lowering the bar to content creation, what Web 2.0 actually achieved was the simplification of sharing. It became easier to distribute knowledge, rather than just discover it. The ability to push content to our friends and followers enabled upheavals in media and culture; blogs became reliable sources of news, celebrities found new ways to engineer their public image, and we could interact with the rich, powerful, and famous in ways we never could have before. Web 2.0 flattened the world. Institutions that were historically out of reach had their gates torn down. Web 2.0 was as much about decentralization as Web 1.0, except it was now culture, not information, that was being distributed.
This interpretation stands in contrast to the claims that web3 advocates make. They claim that web2 is a centralized mess dominated by technology megacorporations and that web3 is about content ownership. In order to achieve these goals, they must create an entirely new operating model for the web, one built on the blockchain. Blockchain is in effect a grand distributed public ledger to which anyone can read or write. Of course, distributed systems are very hard, so blockchain achieves this by applying advanced mathematics to ensure consensus across systems. The effect is that entries in the blockchain become effectively permanent over time. If we look at the blockchain as a massive ledger encoding the state of a system over time, then we avoid the risks inherent to centralization, because no entity can control the content of the ledger; anyone can only read from it and append to it.
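The core idea of an append-only ledger can be sketched in a few lines. This is a toy illustration, not any real blockchain implementation: each entry commits to the hash of the entry before it, so rewriting history invalidates every later link. (Real systems add the hard part, distributed consensus, on top of this structure.)

```python
import hashlib
import json


def block_hash(block):
    """Deterministically hash a block's contents."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


class Ledger:
    """A toy append-only ledger: each entry records the hash of the
    previous entry, so tampering with any block breaks every link
    that follows it."""

    def __init__(self):
        # A fixed "genesis" entry anchors the chain.
        self.chain = [{"index": 0, "prev": None, "data": "genesis"}]

    def append(self, data):
        prev = self.chain[-1]
        self.chain.append({
            "index": prev["index"] + 1,
            "prev": block_hash(prev),  # commit to all prior history
            "data": data,
        })

    def verify(self):
        # Re-derive every link; any edited earlier block breaks the chain.
        return all(
            self.chain[i]["prev"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )
```

Appending is cheap, but editing a past entry silently breaks every hash after it, which is what makes the ledger “effectively permanent over time” once others hold copies of it.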
This is, admittedly, a very powerful idea. The challenges of Web 1.0 and Web 2.0 around ownership, censorship, and digital rights are highly nontrivial ones. For web3 enthusiasts, these issues are existential threats to the web, threats that web3 will eradicate simply by eliminating the question. There can be no debate on the power or responsibility to censor if there is no ability to censor. Web3 evangelists believe Web 2.0 is a highly centralized mess, and they’re not wrong: companies like Google, Facebook, and Twitter have become culture drivers at an unfathomable scale, in some cases immune even to the world’s most powerful governments. Web 2.0 has, in practice, become extremely centralized.
But web3 gets one thing wrong: Web 2.0 was never about centralization. A brief glance at the edit history of the Wikipedia page for Web 2.0 shows that Web 2.0 was itself all about decentralization. In a 2005 version of the page¹, we can see this vision of decentralization spelled out plainly:
The first and most important evolution towards Web 2.0 involves the syndication of website content, using standardized protocols which permit end-users to make use of a site’s data in another context, ranging from another website, to a browser plug-in, or a separate desktop application. Protocols which permit syndication include RSS, RDF (as in RSS 1.1), and Atom, all of which are flavors of XML. Specialized protocols such as FOAF and XFN (both for social networking) extend functionality of sites or permit end-users to interact without centralized websites.
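The syndication the quote describes was deliberately simple machine-readable XML that any client could consume. As a sketch, here is a minimal (hypothetical) RSS 2.0 feed and a few lines that extract its entries, the way any reader, browser plug-in, or desktop app of the era could, without going through a central site:

```python
import xml.etree.ElementTree as ET

# A minimal, made-up RSS 2.0 feed; any site could publish something
# like this, and any client could consume it.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>https://example.com/1</link></item>
    <item><title>Second post</title><link>https://example.com/2</link></item>
  </channel>
</rss>"""


def read_feed(xml_text):
    """Pull (title, link) pairs out of an RSS feed."""
    root = ET.fromstring(xml_text)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

The point is the decoupling: the publisher controls the data, and the consumer chooses the context in which to use it, exactly the decentralized posture the 2005 Wikipedia text describes.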
Web 2.0 was created with the same vision of a decentralized mesh of information as Web 1.0; fundamentally, it was about shifting the web from serving information to serving experiences. At the center of these experiences were community building, collaboration, and social networking. If Web 1.0 was a pull model, Web 2.0 would become a push model. The intention of Web 2.0 was never to consolidate the power of the web into the hands of a few companies that would be beyond reproach. But this is not how web3 advocates present it.
This matters, because while Web 2.0 failed at preventing centralization, we need to explore how and why those failures occurred. Without a proper retrospective analysis of what Web 2.0 was trying to do and why it failed, we risk repeating these failure modes with web3. Indeed, as Moxie points out in his blog, the actual tech stack that web3 is being built on is woefully inadequate at preventing these same failures from recurring. While the blockchain offers promise in addressing some of the challenges present in Web 2.0, the blockchain alone does not comprise the entirety of the web3 experience. In fact, the complexity of working on the blockchain raises the barrier to entering this space, and as a result only a few companies are building the required interfaces and window dressing needed to make the blockchain viable in the day-to-day experience of ordinary people. The effect is that while the blockchain may be decentralized, we are building an entire scaffold of highly centralized services needed to interact with it, and these tools are being built with far less rigor and control than is expected of Web 2.0 players.
None of these things are dealbreakers for web3; the approach is still immature and these problems are not insoluble. The problem is that by telling a lie about the intentions of Web 2.0 and promoting the myth of decentralization, web3 advocates are attempting a razzle-dazzle maneuver to distract from these very important matters. By recasting web2 as being about centralization (even going so far as to rename it) and web3 as being about decentralization, web3 advocates claim web3 is virtuous because it is starting out with better intentions.
This is a fallacy. One cannot compare the promises of web3 to the problems of web2. It would be more appropriate to compare the starting point of web3 to the starting point of web2. There are no guarantees that web3 will not repeat the same mistakes that web2 made, or that it doesn’t have the same vulnerabilities to exploitation prevalent in web2. Indeed, early examples such as those pointed out in Moxie’s blog seem to indicate that web3 advocates would rather ignore these questions. Web 2.0 is giving web3 an early head start: we know exactly what problems will arise in online spaces. By refusing to address these challenges, or even acknowledge that they are not inherently solved by the blockchain, web3 leaves us with little faith that it will succeed at achieving its vision of decentralization any better than Web 2.0 did. In fact, it’s worse, because web3 advocates have priors that Web 2.0 did not have. How can I be sure your utopia addresses the problems I am currently having if you will not even address those problems at face value?
This lack of trust is exacerbated by the gold rush mentality of web3. Some claim web3 is about ownership, but it is more appropriate to say that web3 is about wealth. Very little in the web3 advocacy space addresses how it will match Web 1.0 and 2.0’s successes in terms of discoverability and sharing. It offers little by way of guarantee that it will protect its users’ digital rights, in some cases making them much worse because of the blockchain’s immutable nature. Centralization was an emergent property of Web 2.0, not an intended behavior. By reframing Web 2.0 as centralized by design, web3 exposes itself to the exact same threats that caused Web 2.0 to be centralized in effect. This is enough to merit caution when embracing the web3 mentality. This is not to say that blockchain and web3 tech doesn’t have some promise, or that the centralization inherent in Web 2.0 is a positive. Rather, we should be exploring ways to decentralize safely, while honestly examining how we failed to prevent centralization. Simply restating intentions isn’t enough.
¹ h/t to Rich Felker
Posted: 07.01.2022
Updated: 24.04.2023