The Hitchhiker's Guide to Web 3.0

by Sonr (@rishi), June 6th, 2021

In this article, I will be examining Web 3.0 as it relates to decentralized peer-to-peer (p2p) file sharing. Decentralized architecture is relatively new and still evolving. The same words can mean different things to different groups, so I’ll start by outlining some general concepts and definitions so we’re on the same page semantically.

Web 2.0 is the current iteration of the internet: it’s the framework that the vast majority of web-based applications and services use today, and it’s what most users think of as the “internet” or the “web”.

The term Web 2.0 was popularized by Tim O’Reilly in the wake of the dot-com era and loosely defined an evolution from Web 1.0 in terms of innovations in mobile, social, and cloud computing. The idea is far more nuanced but can be encapsulated by two main concepts: ‘services, not packaged software’ and ‘software above the level of a single device’.

While Web 2.0 is still highly value-creative, especially in the enterprise, successful projects are emerging from what may be the next internet paradigm, referred to today as Web 3.0. It is still nascent, but many believe it is the next evolution of the internet and will be the framework that defines the next era of software.

Just like Web 2.0, the definition of Web 3.0 is nuanced (still evolving, and often defined differently by different people). For the purposes of this article, I will simply define Web 3.0 as a shift to decentralized networks that rely on a network of peers rather than a client-server model built on centralized infrastructure. We’ll get into the details of p2p networks shortly, but let’s first discuss why decentralization is valuable.

Decentralized networks are often more performant. Let’s look at a file-sharing use-case, where user A wants to send a video file to user B. In a centralized client-server network, user A uploads the video file to a server (which downloads it), and then user B downloads the file from that server (which uploads it).

Performance is limited by the upload speed of user A, the download speed of user B, the upload and download speeds of the server, and the distances between the parties. In a pure peer-to-peer system, the file travels directly from one peer to the other, limited only by the upload speed of user A, the download speed of user B, and the distance between them.

Accordingly, if the users and the server are all close together and the server’s upload/download capacity is higher than that of users A and B, then client-server performance may equal that of a p2p transfer.

However, as the distance between the users and the server increases and/or the performance of the server decreases (e.g., from high demand), the client-server network will perform poorly compared to the p2p network.
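
To make this concrete, here’s a back-of-the-envelope sketch in Go. All numbers are made up, and the server is assumed to stream the file (forward while receiving), so throughput is set by the slowest link on the path:

```go
package main

import "fmt"

// bottleneck models throughput as the slowest link on the path.
func bottleneck(rates ...float64) float64 {
	slowest := rates[0]
	for _, r := range rates[1:] {
		if r < slowest {
			slowest = r
		}
	}
	return slowest
}

func main() {
	const fileMB = 1000.0    // 1 GB file
	aUp, bDown := 10.0, 25.0 // A uploads at 10 MB/s, B downloads at 25 MB/s

	// Idle server (100 MB/s each way): matches the direct transfer.
	fmt.Printf("client-server, idle server: %.0fs\n",
		fileMB/bottleneck(aUp, 100, 100, bDown)) // 100s

	// Congested server (5 MB/s each way): now the bottleneck.
	fmt.Printf("client-server, busy server: %.0fs\n",
		fileMB/bottleneck(aUp, 5, 5, bDown)) // 200s

	// Pure p2p: limited only by A's upload and B's download.
	fmt.Printf("p2p, direct:                %.0fs\n",
		fileMB/bottleneck(aUp, bDown)) // 100s
}
```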

Decentralized networks are antifragile - they get stronger as additional ‘stress’ is added. As more users (peers) join the network, its performance and availability actually increase. In Web 2.0 social applications, this is seen in ‘network effects’: the value of the network/product increases as the network grows.

Unfortunately, this is not the case for the underlying infrastructure. As an application based on a client-server model increases in popularity, the application provider must increase server capacity/central compute to maintain performance and availability.

Decentralized networks are trustless - the participants do not need to know or trust each other, or any third party, for the system to function. Due to the client-server relationship, Web 2.0 applications rely on centralized servers/services, which means users must inherently trust a central authority (the person or entity that owns and operates the servers).

This central authority has the unilateral ability to define the rules. In practice, this often manifests as a conflict around data ownership. For example, the popular storage and file-sharing service Dropbox uses a client-server model: users upload data to centralized servers owned and operated by Dropbox, and then download data from those same servers.

Accordingly, all data passes through Dropbox, and Dropbox has the technical ability to read and copy it. Thus, the only thing protecting a Dropbox user’s data is the user agreement that they trust Dropbox to abide by (and not change). In a decentralized p2p network, there is no central authority. Users are the owners and operators of their data, and trust resides with the software itself (rather than the operators).

Decentralized networks are more secure. It’s in the name: peer-to-peer. Data is uploaded directly from one peer and downloaded by the other without a middleman (a central server).

This means there is no central authority or custodian of your data (unless you choose one). Moreover, a central authority is a more attractive target for attack, since data/value is consolidated (i.e., an attacker derives a lot of value from succeeding just once).

Conversely, when data/value is highly distributed, an attacker needs many times more successful attacks to extract value of the same magnitude: compromising one server that holds a million users’ files yields a million users’ data, while compromising one peer yields just one user’s. Decentralization reduces the payout for attackers by orders of magnitude.

Now that we have some shared context, let’s discuss decentralized p2p file sharing specifically. At this point, you might be thinking, “Ok, decentralization sounds great, but what’s new about this? Haven’t services, like p2p file sharing, existed for over a decade?”

To some extent, you would be correct, but there are some essential differences between Web 2.0 p2p services and the ones we can build today on Web 3.0. Let’s take BitTorrent, which was initially released in 2001, as an example. BitTorrent has three main issues (which are solvable today): it is not fully decentralized, it lacks an incentive model, and the code is not usable or extensible.

BitTorrent (and other similar services) is not decentralized in two important ways: it uses centralized servers to track peers and to store content metadata. Although connections between peers are made directly (which improves performance as the number of ‘seeders’ grows), establishing these connections requires a special server called a tracker, which assists in the communication between peers.

The tracker server also keeps track of where copies of a file reside on peer machines and which are available at the time of a client’s request, and it helps coordinate efficient transmission and reassembly of the copied file. Trackers increase the service provider’s cost of goods sold (COGS) and limit users’ privacy/security.
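
For a feel of this dependency, here is roughly what a classic tracker announce looks like, per the original BitTorrent spec (BEP 3); the tracker URL and values below are placeholders:

```go
package main

import (
	"fmt"
	"net/url"
)

func main() {
	// A client announces itself to the tracker over plain HTTP,
	// identifying the torrent (info_hash) and how peers can reach it
	// (peer_id, port). The tracker replies with a bencoded peer list.
	// Every client must hit this central endpoint to find peers.
	params := url.Values{}
	params.Set("info_hash", "<20-byte SHA-1 of the torrent's info dict>")
	params.Set("peer_id", "<20-byte client identifier>")
	params.Set("port", "6881")
	params.Set("uploaded", "0")
	params.Set("downloaded", "0")
	params.Set("left", "1048576") // bytes still needed

	fmt.Println("http://tracker.example.com/announce?" + params.Encode())
}
```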

BitTorrent doesn’t have a proper incentive model. When a user begins downloading a file, they are called a ‘leecher’; once they hold pieces of the file (or other files that peers want), they can become a ‘seeder’ that also uploads to other users.

BitTorrent prioritizes downloads for users who seed (and further prioritizes those who seed at faster upload rates). Beyond this, however, there is no real incentive to seed or store files. Since BitTorrent depends on seeding for file availability, it is vulnerable without an incentive model.
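
That prioritization is BitTorrent’s ‘tit-for-tat’ choking algorithm: each client uploads to the few peers currently giving it the best download rates (real clients also unchoke one random peer so newcomers can bootstrap). A simplified sketch, not actual client code:

```go
package main

import (
	"fmt"
	"sort"
)

// Peer tracks the download rate we are receiving from a remote peer.
type Peer struct {
	ID   string
	Rate float64 // bytes/sec received from this peer
}

// unchoke returns the n peers we will upload to: the ones currently
// giving us the best download rates. Real clients also add a random
// "optimistic unchoke" so new peers get a chance to reciprocate.
func unchoke(peers []Peer, n int) []Peer {
	sort.Slice(peers, func(i, j int) bool {
		return peers[i].Rate > peers[j].Rate
	})
	if len(peers) < n {
		n = len(peers)
	}
	return peers[:n]
}

func main() {
	peers := []Peer{{"a", 120e3}, {"b", 900e3}, {"c", 40e3}, {"d", 500e3}}
	fmt.Println(unchoke(peers, 2)) // [{b 900000} {d 500000}]
}
```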

BitTorrent (and similar services) has classic software usability issues that make it very difficult to build on this previous work. Examples include: poor documentation (or none at all); restrictive licensing (or no license); no easy-to-reach point of contact; closed source (or source that no longer exists); no friendly API; and tight coupling to a single use-case.

Ultimately, this means the service’s ability to adapt to changing user needs and requirements is severely limited, and in many cases addressing new use-cases is impossible. BitTorrent has made some improvements (it is less reliant on trackers than in the past), but it is essentially the same service today as it was 20 years ago.

Let’s now examine Web 3.0 capabilities, continuing with file sharing as the example. Today, you can build fully decentralized p2p applications. Protocol Labs, which developed Libp2p (for decentralized transport) and IPFS (for decentralized storage), is a leader in this space.

Specifically, Libp2p’s transport layer (via circuit relay and NAT traversal) solves the problem of connecting two peers without a special tracker server. And IPFS, a decentralized storage service that uses content addressing, allows for fully decentralized storage without relying on content servers or on an extremely large number of live/available peers for file availability.
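
To give a flavor of the developer experience, here is a minimal go-libp2p host with relaying and NAT port mapping enabled. Treat this as a sketch: option names and the libp2p.New signature have changed across go-libp2p versions (older releases take a context.Context first):

```go
package main

import (
	"fmt"

	"github.com/libp2p/go-libp2p"
)

func main() {
	// Create a libp2p host. EnableRelay allows traffic to be routed
	// through circuit relays when a direct connection is impossible;
	// NATPortMap requests a port mapping from the local NAT via
	// UPnP/NAT-PMP to help peers dial in directly.
	h, err := libp2p.New(
		libp2p.EnableRelay(),
		libp2p.NATPortMap(),
	)
	if err != nil {
		panic(err)
	}
	defer h.Close()

	fmt.Println("peer ID:   ", h.ID())
	fmt.Println("listening: ", h.Addrs())
}
```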

The Web 3.0 concept of crypto/tokens solves the incentive issue. Specifically, Filecoin is a decentralized storage network that turns cloud storage into an algorithmic market.

The market runs on a blockchain with a native protocol token (FIL), which miners earn by providing storage to clients. Conversely, clients spend FIL (hiring miners) to store or distribute data. Applied to a service like BitTorrent, this means that seeders would be compensated in Filecoin for storing files and making them available for download, and leechers would spend Filecoin to access them.
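
As a toy illustration of the market mechanics (the price below is hypothetical; real deal prices are set by the open market, quoted per GiB per epoch):

```go
package main

import "fmt"

func main() {
	// Toy storage-deal arithmetic. Filecoin epochs are ~30 seconds,
	// so a year is about 1,051,200 epochs; the price is made up.
	const (
		sizeGiB          = 32.0
		pricePerGiBEpoch = 0.0000002 // FIL per GiB per epoch (hypothetical)
		epochsPerYear    = 1051200.0
	)
	cost := sizeGiB * pricePerGiBEpoch * epochsPerYear
	fmt.Printf("storing %.0f GiB for one year ≈ %.2f FIL paid to the miner\n",
		sizeGiB, cost)
}
```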

This structure would strengthen the incentive to seed and store files, which in turn increases availability and performance for users. Monetary incentives also create a pathway to legitimate business models: if file storers could make money, the rightful owners of files (media companies) might be willing to participate in the network. As Spotify proved, users don’t actually care much about the ‘free’ aspect of torrents; they care about ease of discovery and distribution.

Most importantly, entities like Protocol Labs are solving software usability issues. They are the creators of Libp2p, IPFS, and Filecoin. All three of these projects are open-source, have up-to-date documentation, and are supported (by a reachable team).

Additionally, protocols like Libp2p are highly modular and built for general-purpose use. This means that products built with Libp2p are highly extensible and can change rapidly to address evolving user needs and new use-cases.
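
For instance, transports and security handshakes are pluggable options when constructing a host. A hedged sketch (import paths and constructor names vary across go-libp2p versions):

```go
package main

import (
	"fmt"

	"github.com/libp2p/go-libp2p"
	"github.com/libp2p/go-libp2p/p2p/security/noise"
	"github.com/libp2p/go-libp2p/p2p/transport/tcp"
)

func main() {
	// Compose a host from pluggable parts: TCP as the transport and
	// Noise as the security handshake. Swapping either one is an
	// option change rather than a rewrite; that's the modularity
	// in practice.
	h, err := libp2p.New(
		libp2p.Transport(tcp.NewTCPTransport),
		libp2p.Security(noise.ID, noise.New),
	)
	if err != nil {
		panic(err)
	}
	defer h.Close()
	fmt.Println("host up, listening on:", h.Addrs())
}
```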

In summary, decentralized p2p applications have many native benefits (especially in terms of performance and security), and now tools and infrastructure are in place to build robust p2p applications.

This brings me to the reason I’m writing this article: the beta release of Sonr - a pure peer-to-peer (decentralized) file sharing application.

Think of it as a cross-platform AirDrop that works with any file size, under any network conditions, at any distance. It is built with Libp2p and may incorporate IPFS/Filecoin or other decentralized storage networks in the future. Our vision for Sonr is to create a universal, seamless data delivery protocol. Stay tuned for more news and updates about Sonr soon!