<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<title>The web never died - Archive - MayVaneDay Studios</title>
<link href="../../../style.css" rel="stylesheet" type="text/css" media="all">
<meta name="author" content="Vane Vander">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
</head>
<body class="mayvaneday">
<article>
<div class="box">
<h1>The web never died</h1>
<p>published: 2024-03-01</p>
</div>
<hr>
<div class="box">
<p>And <em>your</em> gaze doesn't bring things into reality.</p>
<p>If I had to take a shot of alcohol for every time I found a personal website and there was a screed about how the Internet was "dying" and "we" needed to "reclaim" it, I'd have died of cirrhosis a long time ago. You, reading this, have probably noticed the same thing. In case you haven't, let's reiterate the same five points all these copy-paste hackjobs make in their manifestos:</p>
<ol>
<li>Once upon a time, the Internet was a magical decentralized place where people made small websites about subjects they enjoyed, free of advertising and corporate surveillance.</li>
<li>Then, around the advent of the first iPhone, the Big Bad Corporations moved in and <em>forced</em> everyone onboarding onto the Internet onto a small handful of apps where you couldn't write custom HTML for your profile, only set a profile picture, a short bio, and <em>maybe</em> a handful of other text boxes.</li>
<li>To make life harder for the stragglers, the Big Bad Corporations enshittified their search engines and inserted analytics and other trackers into websites so they'd know every step you took online.</li>
<li>Then the person who wrote the particular manifesto you're reading discovered Baby's First HTML tutorial and a free website host.</li>
<li>Now we need to "re-decentralize" the Internet for reasons not well explained.</li>
</ol>
<p>Right away there's a problem with this story: <a href="https://archive.md/https://medium.com/design-warp/the-web-was-never-decentralized-bb066138c88">the Internet was <em>never</em> decentralized.</a> Although the progenitor of the Internet, ARPANET, was designed to function without a singular central authority, in order to access the network you pretty much had to be a university student or part of a corporation. Access to the Internet for personal computing purposes didn't really take off until the time of AOL, widely derided for its "walled garden" software that subscribers were required to run in order to connect. Nowadays you don't have to run any particular software or operating system to dial up and surf the web, but as the ongoing saga of Kiwi Farms shows, it takes only a few rogue <a href="https://web.archive.org/web/20240228151539/https://en.wikipedia.org/wiki/Tier_1_network">"Tier 1 and 2"</a> ISPs colluding to severely restrict one's ability to connect to the Internet.</p>
<p>Now that we have recognized that the central story behind these manifestos is false, we can take a closer look at why it seems to appeal so much. The structure of this narrative (everyone living in a Garden of Eden until the Big Bad Guy arrives and enshittifies their society, leaving the original residents yearning to return to the Past Times) eerily echoes the fascist narrative of the "mythic past". As Jason Stanley puts it in his book <em>How Fascism Works</em> (emphasis mine):</p>
<blockquote>The strategic aim of these hierarchical constructions of history is to displace truth, and the invention of a glorious past includes the erasure of inconvenient realities. While fascist politics fetishizes the past, <strong>it is never the <em>actual</em> past that is fetishized.</strong></blockquote>
<p>Since we have established that the Internet was never decentralized, we can now interpret the manifestos that push such an idea as being concerned not with the <em>actual</em> truth and how to deal with it but with invoking a particular emotional response.</p>
<p>But if fascists are preoccupied with the supposed "invasion" of minorities into their countries, then who serves the purpose of the "invaders" in the "smol web" mythic past? Most manifestos pin the blame on, as stated in point two, the corporations that captured the people who came online around the time that personal phones became useful for Internet browsing. (Many an "Eternal September" comparison has been misused.) Unshackled from the previous requirements of being at home and sitting down at one's computer, or even <em>having</em> a computer, multitudes of people were suddenly granted the ability to surf the information highway whenever they wanted (or could get signal, anyway). But since the small form factor of phones, even the 2012-era BlackBerries with physical keyboards, isn't conducive to writing HTML, and since phones lack the battery life and stable IP address needed to host a website (even nowadays, and certainly not back then), the new Internet users were drawn to the newfangled social media sites that allowed them to publish words without having to learn a single line of code or pay a single cent.</p>
<p>Some would go on to learn HTML and take pride in making a website they could show off to their friends and family (I should know, as I was very proud in my pre-adolescent days to have done so), but <a href="../../2023/august/interview.html">most had higher priorities, more pressing issues, in their life</a>. And so for their purposes Twitter and Facebook were enough. And regardless of whether they stayed on Twitter or Facebook or tried a different social media platform, every new user increased the value of said platform, drawing in advertisers and making it harder for others to leave because of the social graph they would lose by doing so. Every fly caught makes the spider's web stickier.</p>
<p><strong>In the eyes of these people for whom the Internet was new and shiny, if not for social media, they would not have been able to publish on the Internet <em>at all</em>.</strong> And so it still is today. You have only to look at the fanart community for your favorite video game, particularly its more terminally online users, and watch them every time Elon Musk contemplates ruining Twitter further: they <em>know</em> they can make a personal website to showcase their work, but they also know that doing so means forgoing the algorithms and channels that led to people discovering their work in the first place. And posting to one's own site, using Twitter only to announce new content, seems to them like extra work with no reward.</p>
<p>And in the eyes of most of the people who write these manifestos, they would prefer that these social media sites cease to exist <em>at all</em>. They would prefer that all these millions, soon to be billions, of voices be silenced to sate their concept of technological purity. That is not to say that social media is a <em>good</em> thing, merely that, if the two options <em>in the minds of the non-technologically-minded</em> are to be</p>
<ol>
<li>corralled into a silo</li>
<li>or to be disembodied online</li>
</ol>
<p><strong>they have demonstrated they would prefer the silo.</strong></p>
<p>The masses can leave the silos at any time. No guns are being pointed at them to force them to keep cancelling each other on Twitter. There is no death penalty for deactivating one's Facebook account. Reddit is not a plantation that the cops will drag you back to if you decide to flee. HTML tutorials and free webhosts are abundant. For those insistent on not learning a single lick of code, it's piss-easy to purchase a VPS with WordPress preinstalled (or find a free hosting service with WordPress support) and get writing in minutes. The resources are there, but the will en masse isn't. After all, making a website outside of a CMS (like WordPress) or a silo (like Neocities) doesn't come with a notification feed that pings with a satisfying red dot. The masses want dopamine and arguments. <strong>The masses want to be where they feel everyone else is.</strong> And if one social media platform dies, they won't suddenly start making websites; they will go find another platform that provides them with the aforementioned dopamine and arguments.</p>
<p>If you just want to blow up Facebook, then be honest and say so. Hell, I'd probably join in. But destroying a single site will not get rid of the demand for social media altogether. <a href="https://web.archive.org/web/20240126033644/https://en.wikipedia.org/wiki/List_of_defunct_social_networking_services">It didn't happen with Vine and it didn't happen with Friendster and it didn't happen with Xanga.</a></p>
<p>In short, to pin the blame for the Internet not having as many personal websites as one would like on Mark Zuckerberg or Jack Dorsey or whoever is to deny the agency of the users of Facebook or Twitter and to keep oneself blinded to the benefits that these networks <em>have</em> provided to their users. And demanding that participation on the Internet, arguably one of the most impactful inventions in existence, <em>go back to</em> requiring learning a markup language and adequate technical knowledge of networking is pure cruel elitism. This is far from the compassionate stance that the manifesto writers want you to believe they're taking. <strong>If we <em>can</em> make using technology easier for humans, then we <em>should</em>.</strong></p>
<p>As for the issue of corporate surveillance on the Internet, if anything, things are <em>better</em> than they were at the dawn of the Internet. HTTPS wasn't formally specified until <a href="https://web.archive.org/web/20231218174645/https://datatracker.ietf.org/doc/html/rfc2818">RFC 2818 in May of 2000</a>, and before that your ISP and <a href="https://web.archive.org/web/20101027170701/https://lifehacker.com/5672313/sniff-out-user-credentials-at-wi+fi-hotspots-with-firesheep">anyone else on your local network</a> could sniff your <em>unencrypted</em> traffic and do almost whatever they wanted with it. Users of all levels of technological ability have many options for blocking ads and trackers, from browser extensions like uBlock Origin to <a href="https://web.archive.org/web/20231218175146/https://rethinkdns.com/app">device-wide firewall</a>s you can enable and forget about. Even those stuck in the paradigm of using social media (instead of making a website) can forgo the algorithmic meddling of the mainstream platforms and the incessant ads shoved into users' feeds and join a fediverse instance run by someone they trust. (Of course, that implies they know the fediverse exists and have the will to seek it out...)</p>
<p>Now that we have demonstrated that the "smol web"'s mythic past is a bunch of bunk and that things today <em>aren't</em> as bad as the manifestos would have you believe, where do we go from here? Decentralizing the Internet is still, despite the cringe factor of its advocates, a worthy pursuit. But what does decentralizing entail?</p>
<p>Most authors of "smol web" manifestos will immediately point to Tor, to its ability to let its users run "hidden services" that are anonymized and accessible exclusively through the Tor network. In some sense, a hidden service on Tor is decentralized: one can use Tor and its ability to punch through NAT and firewalls to host a website at home instead of paying a VPS provider or webhost. <strong>But the Tor network itself is not decentralized</strong>: the backbone of the network is composed of <a href="https://metrics.torproject.org/rs.html#search/flag:authority">nine "directory authorities"</a>, which are run by people who have either been deeply involved with the Tor Project for a long time or are otherwise considered by the Project to be <em>extremely</em> trustworthy. In other words, the Tor network is dependent on a small number of servers that one must trust not to abuse the authority vested in them.</p>
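<p>To make concrete how low that barrier is: hosting an onion service at home is two lines in <code>torrc</code>. The paths and port below are illustrative, not prescriptive:</p>

<pre><code># torrc: serve a website from home as a Tor onion service.
# No public IP, no port forwarding, no VPS bill.
HiddenServiceDir /var/lib/tor/my_site/
HiddenServicePort 80 127.0.0.1:8080
# After restarting Tor, the "hostname" file inside HiddenServiceDir
# contains the .onion address pointing at the web server on port 8080.
</code></pre>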
<p>I2P, Tor's younger cousin, <em>might</em> be a better candidate for building a decentralized Internet upon. In I2P, every peer is also a router for traffic, meaning that everyone in the network shares the same level of authority and responsibility. Routes for "tunnels" to I2P's hidden services are <a href="https://web.archive.org/web/20231219003147/https://geti2p.net/en/about/intro">distributed in a, well, <em>Distributed</em> Hash Table</a>. However, even the I2P developers admit the network is not <em>completely</em> decentralized:</p>
<blockquote>The I2P network is almost completely decentralized, with exception to what are called Reseed Servers. This is to deal with the DHT ( Distributed Hash Table ) bootstrap problem. Basically, there is not a good and reliable way to get out of running at least one permanent bootstrap node that non-network participants can find to get started. Once connected to the network, a router only discovers peers by building "exploratory" tunnels, but to make the initial connection, a reseed host is required to create connections and onboard a new router to the network. Reseed servers can observe when a new router has downloaded a reseed from them, but nothing else about traffic on the I2P network.</blockquote>
<p>To be fair, if one does not like the default reseed server provided by the developers, <code>i2pd</code> allows you to change the server or file it retrieves new peers from.</p>
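<p>For example, something along these lines in <code>i2pd.conf</code> swaps out the bootstrap source. (Option names are from my memory of i2pd's documented <code>[reseed]</code> section, and the server is a made-up placeholder; check the docs for your version before copying this.)</p>

<pre><code># i2pd.conf: the [reseed] section controls how a fresh router
# finds its first peers.
[reseed]
verify = true
# Fetch the initial peer bundle from a server you trust
# (reseed.example.org is a placeholder)...
urls = https://reseed.example.org/
# ...or bootstrap from a file someone handed you over sneakernet.
#file = /path/to/i2pseeds.su3
</code></pre>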
<p>But maybe we are going about this the wrong way. Tor and I2P are both overlay networks, which means they run on top of the existing Internet infrastructure. While I2P's reseed can come from a server on a LAN or a file shared through a sneakernet, Tor requires one to have access to the wider Internet in order to connect. If full <em>decentralization</em> is the goal, and not merely running a personal website, we have to go further than just putting more layers on top of the cake that is the OSI model. We have to think about a completely new cake.</p>
<p>For the past few months, I have been working on <a href="https://codeberg.org/lethe/ZimBuildScripts">a collection of Bash scripts that allows one to scrape (or pull from a Git repo) a specific website and then bundle it into a file that the Kiwix reader software understands</a>. Once the site in question is bundled into a <code>.zim</code> file, the entire thing can be shared, copied, and archived through any medium of sharing files that one can think of: torrent, upload to a file-sharing site, put on a USB flash drive and lend to a friend... The only disadvantage of this method is that, if the creator of the site makes any changes or updates to the content, you have to scrape (or <code>git pull</code>) the content again and rebuild the <code>.zim</code> file. In this manner I am able to have many personal sites whose content I value stored on my local hard drive without having to worry about a rogue ISP or CloudFlare or any other "steward" of the Internet blocking my access on a whim and without warning.</p>
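<p>The scripts handle a bunch of housekeeping, but the general shape is just a mirror step and a bundle step. Here's a rough standalone sketch (not the scripts themselves) using <code>wget</code> and <code>zimwriterfs</code>; the site name is a placeholder, and the <code>zimwriterfs</code> flags vary between zim-tools versions:</p>

<pre><code>#!/usr/bin/env bash
# Rough sketch: mirror a static site, then bundle it into a .zim file.
set -euo pipefail

SITE="https://example.com"        # placeholder for the site to preserve
MIRROR_DIR="mirror/example.com"   # where wget puts the local copy

# Step 1: pull down a self-contained local copy of the site.
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent \
     --directory-prefix="mirror" "$SITE"

# Step 2: bundle the directory into a .zim file that Kiwix understands.
# (Older zim-tools versions call --illustration something else,
# e.g. --favicon; check your version.)
zimwriterfs --welcome=index.html \
            --illustration=favicon.png \
            --language=eng \
            --title="Example Site" \
            --description="Offline copy of example.com" \
            --creator="Example Author" \
            --publisher="me" \
            "$MIRROR_DIR" example.zim

# Read it with any Kiwix client, e.g.:
#   kiwix-serve --port=8080 example.zim
</code></pre>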
<p>But this approach has no consideration for social sites or any kind of communication, only for static self-contained sites. Which may be healthier for us humans in the long run, if we are forced to interact face-to-face (or go to a hypothetical open hotspot where one can drop files for others to download later) in order to exchange information instead of being able to hide behind our keyboards to harangue people on the other side of the world. If we want to look at communication networks in a world where the Internet has become peer-to-peer and node-to-node instead of routing through large ISPs, or in a world where one sometimes has access to the Internet at large and sometimes not, maybe we would do better to look at Reticulum instead.</p>
<p>Reticulum, as explained on <a href="https://web.archive.org/web/20231221011736/https://reticulum.network/">its website</a>, is a networking stack engineered to hold up in environments with high latency and low bandwidth, even to the extremes of both. Unlike the existing TCP/IP structure that powers the Internet, Reticulum mandates encryption between nodes (nodes that don't encrypt their packets will have those packets dropped by all other nodes), uses ephemeral encryption keys to make replay attacks harder, and omits source addresses from packets so that nodes can't easily see which nodes are talking to which other nodes. (Think of it like sending a letter through the mail without putting a return address on it.) In fact, no part of Reticulum depends on a TCP/IP link at all: in addition to "normal" networks, Reticulum packets can be routed through Ethernet devices, LoRa, packet radio, serial, and <code>stdio</code> programs/hardware, allowing for even more link types than the developers have thought of at this moment.</p>
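<p>For the curious, standing up a node takes very little ceremony. The sketch below follows the shape of the minimal example in the Reticulum documentation: attach to the network, mint an identity, and announce a destination. The app name and aspect strings here are arbitrary placeholders:</p>

<pre><code>import RNS

# Start (or attach to) a Reticulum instance using the default config.
# Which physical links it uses (TCP, LoRa, serial...) lives in the
# Reticulum config file, not in application code.
reticulum = RNS.Reticulum()

# Mint a fresh cryptographic identity for this endpoint.
identity = RNS.Identity()

# Register a destination that other peers can address packets to.
# "example_app" and "node" are arbitrary namespace strings.
destination = RNS.Destination(
    identity,
    RNS.Destination.IN,
    RNS.Destination.SINGLE,
    "example_app",
    "node",
)

# Announce the destination so the rest of the network learns a path to it.
destination.announce()
</code></pre>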
<p>The only disadvantage Reticulum might have in the pursuit of advertising it to "the web is dying"-type folks is that, due to the design of the packet system, Reticulum cannot (currently, at least) route HTTP traffic. That means the vast majority of self-hosted programs designed to be served over HTTP <em>will not work</em>. Sites <em>can</em> be served over Reticulum... if you're willing to learn the Micron markup language and are okay with text-only sites like one might find on Gopher or Gemini. (To Micron's credit, the range of formatting one can do exceeds that of either of the two "G" networks: changing the color of the text <em>and</em> the background space behind the text, bold and italics and <em>underline</em> (which even Markdown doesn't have)...) NomadNet, currently the Reticulum client with the most features, allows one to make sites using CGI if they don't like static Micron, as well as to leverage headers in Micron files to restrict access to certain files to only certain peers. (No passwords required: either you're on the list or you're not. And good luck breaking the network's encryption in order to spoof an address.) A command-line program that facilitates <a href="https://web.archive.org/web/20231221014750/https://github.com/acehoss/rnsh">SSH connections over Reticulum</a> is also in active development to potentially allow for even more interactive experiences than Micron and CGI can provide.</p>
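<p>For a taste of what writing micron is like, here are a few lines reconstructed from memory of NomadNet's built-in micron guide; trust that guide over me for the exact syntax:</p>

<pre><code>>A First-Level Heading
This line has `!bold`!, `*italic`* and `_underlined`_ text.
`Ff00This text is red,`f and this is back to the default color.
</code></pre>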
<p>Reticulum doesn't give a shit about ICANN or IANA or DNS or Tier 1/2 ISPs. Whatever hardware you have, whatever network you have available, Reticulum will find a way to others. <a href="https://web.archive.org/web/20231221020920/https://reticulum.network/manual/understanding.html">And hey, it's written in Python, so you don't have to fry your brain trying to learn some overcomplicated low-level programming language just to have sovereignty over your own damn communications.</a> (I'm looking at you, Varvara. Who the hell has time to learn Assembly in the apocalypse? We have food acquisition to do!)</p>
<p>But you'd be hard-pressed to find a "smol web" advocate who put that much thought into it, who considered the implications of decentralization. And despite privacy being baked into Reticulum from the very beginning, despite the lack of JavaScript and tracking endemic to mainstream websites, I've never seen a single manifesto-writer advocate for even so much as mirroring one's site to Reticulum. Occasionally Gemini, but Gemini is every bit as centralized as HTTP is.</p>
<p>Maybe that's because, for the vast majority of Internet users, the web <em>hasn't</em> died. Nobody, with the exception of <em>maybe</em> your parents or the size of your phone's keyboard, is stopping you from learning HTML and finding a shitty backwater free webhost. I was doing it in 2009 as a "minor" with no credit card and a fake persona, and options for hosting have only gotten <em>better</em> since then. To imply that the whole web is dying just because <em>you</em> aren't personally having fun on it anymore is pure narcissism.</p>
<p>Touch grass.</p>
</div>
<hr>
<div class="box">
<p align=right>CC BY-NC-SA 4.0 © Vane Vander</p>
</div>
</article>
</body>
</html>