World Wide Web
Want to make your own small website? See our how to.
World Wide Web (www or just the web) is (or was -- by 2023 mainstream web is dead) a network of interconnected documents on the Internet, which we call websites or webpages. Webpages are normally written in the HTML language and can refer to each other by hyperlinks ("clickable" links right in the text). The web itself works on top of the HTTP protocol which says how clients and servers communicate. Some people confuse the web with the Internet, but of course those people are retarded: web is just one of many so called services existing on the Internet (other ones being e.g. email or torrents). In order to browse the web you need an Internet connection and a web browser.
{ How to browse the web in the age of shit? Currently my "workflow" is the following: I use the badwolf browser (a super suckless, very fast from-scratch browser that allows turning JavaScript on/off, i.e. I mostly browse small web without JS but can still do banking etc.) with a CUSTOM START PAGE that I completely own and which only changes when I want it to -- this start page is just my own tiny HTML file on my disk that has links to my favorite sites (which serves as my suckless "bookmark" system) AND a number of search bars for different search engines (Google, Duckduckgo, Yandex, wiby, Searx, marginalia, Right Dao, ...). This is important as nowadays you mustn't rely on Google or any other single search engine -- I just use whichever engine I deem best for my request at any given time. ~drummyfish }
An important part of the web is also searching its vast amounts of information with search engines such as the infamous Google engine. It also relies on systems such as DNS.
Mainstream web is now EXTREMELY bloated and practically unusable, for more suckless alternatives see gopher and gemini. See also smol web.
The web used to be perhaps the greatest part of the Internet, the thing that made the Internet widespread, however it quickly deteriorated under capitalist mainstreamization and commercialization and by now, in the 2020s, it is one of the most illustrative, depressing and most hilarious examples of capitalist bloat. A nice article about the issue, called The Website Obesity Crisis, can be found at https://idlewords.com/talks/website_obesity.htm. There is a tool for measuring website bloat at https://www.webbloatscore.com/: it computes the ratio of the page size to the size of its screenshot (e.g. YouTube currently scores 35.7).
How It Went To Shit
________________________________________________________________________
| | | | | |
| ENLARGE PENIS WITH SNAKE OIL | | CSS | | Video AD |
|__________________________________| | | | CONSOOOOOOOOOOOOO |
| U.S. PRESIDENT ASSASINATED | BUG | | OOOOOOOOOOOM BICH |
| | | |______________________|
| Article unavailable in your country. | LOL | |
| ___________________________________| |___ Prove you're a |
| | |_______| | human, click all |
| | We deeply care about your privacy <3 | images of type 2 |
| | | quasars. |
| | Will you allow us to use cookies for spying? | [*] [*] [*] [*] |
| | _____ ______ | [*] [*] [*] [*] |
| | | YES | | OK | | _________________ |
| | """"" """""" |_ | FUCK MATURE MOMS||
| |_______________________________________________| || IN 127.0.0.1 ||
| | || CHAT NOW !!! ||
| | Your browser is 2 days old, please update || ||
| | to newest version to view this site. ||*30 NEW MESSAGES*||
|_____|______________________________________________||_________________||
A typical website under capitalism, 2023. For potential far-future readers: this is NOT exaggeration, all websites LITERALLY look like this.
Back in the day (90s and early 2000s) the web used to be a place of freedom working more or less in a decentralized manner, on anarchist and often even communist principles -- people used to have their own unique websites where they shared freely and openly, censorship was difficult to implement and mostly non-existent, and websites used to have a much better design, were KISS, safer, "open" (no paywalls, registration walls, country blocks, DRM, ...), MUCH faster and more robust as they were pure HTML documents. It was also the case that most websites were truly nice, useful and each one had a "soul", as they were usually made by passionate nerds who had creative freedom and a true desire to create a nice website (yes, even if they were making a commercial website for some company).
As time marched on the web became more and more shit, as is the case with everything touched by the capitalist hand -- the advent of so called web 2.0 brought about a lot of complexity, websites started to incorporate client-side scripts (JavaScript, Flash, Java applets, ...) which led to many negative things such as incompatibility with browsers (kickstarting browser consumerism and update culture), performance loss and security vulnerabilities (web pages now became Turing complete programs rather than mere documents) and more complexity in web browsers, which leads to immense bloat and browser monopolies (greater effort is needed to develop a browser, making it a privilege of those who can afford it, and those can subsequently dictate de-facto standards that further strengthen their monopolies). Another disaster came with social networks in the mid 2000s, most notably Facebook but also YouTube, Twitter and others, which centralized the web and rid people of control. Out of comfort people stopped creating and hosting their own websites and rather created a page on Facebook. This gave the power to corporations and allowed mass-surveillance, mass-censorship and propaganda brainwashing. As the web became more and more popular, corporations and governments started to take more control over it, creating technologies and laws to make it less free. By 2020, the good old web is but a memory and a hobby of a few boomers, everything is controlled by corporations, infected with billions of unbearable ads, DRM, malware (trackers, crypto miners, ...), there exist no good web browsers, web pages now REQUIRE JavaScript even where it's not needed in principle, due to which they are painfully slow and buggy, and there are restrictive laws, censorship and de-facto laws (site policies) put in place by corporations controlling the web.
Mainstream web is quite literally unusable nowadays. What people used to search for on the web they now search for on a handful of platforms like Facebook and YouTube (often not even using a web browser but rather a mobile "app"); if you try to "google" something, what you get is just a list of unusable sites written by AIs that load for several minutes (unless you have the latest 1024 TB RAM beast) and won't let you read beyond the first paragraph without registration. These sites are uplifted by SEO for pure commercial reasons, they contain no useful information, just ads. Useful sites are buried under several millions of unusable results or downright censored for political reasons (e.g. for using some forbidden word). Thankfully you can still try to browse the smol web with search engines such as wiby, but even that only gives a glimpse of what the good old web used to be.
History
World Wide Web was invented by the English computer scientist Tim Berners-Lee. In 1980 he employed hyperlinks in a notebook program called ENQUIRE and saw the idea was good. On March 12 1989, while working at CERN, he proposed a system called "web" that would use hypertext to link documents (the term hypertext was already around). He also considered the name Mesh but settled on World Wide Web eventually. He started to implement the system with a few other people. By the end of 1990 they had already implemented the HTTP protocol for client-server communication, HTML, the language for writing websites, the first web server and the first web browser called WorldWideWeb. They set up the first website http://info.cern.ch that contained information about the project.
In 1993 CERN made the web public domain, free for anyone without any licensing requirements. The main reason was to gain advantage over competing systems such as Gopher that were proprietary. By 1994 there were over 500 web servers around the world. The World Wide Web Consortium (W3C) was established to maintain standards for the web. A number of new browsers were written, such as the text-only Lynx, but the proprietary Netscape Navigator would go on to become the most popular one until Micro$oft's Internet Explorer (see browser wars). In 1997 the Google search engine appeared, as well as CSS. There was an economic bubble connected to the explosion of the web, called the dot-com boom.
Between 2000 and 2010 there used to be a mobile alternative to the web called WAP. Back then mobile phones were significantly weaker than PCs so the whole protocol was simplified, e.g. it had a special markup language called WML instead of HTML. But as the phones got more powerful they simply started to support normal web and WAP disappeared.
Around 2005, the time when YouTube, Twitter, Facebook and other shit sites started to appear and become popular, so called Web 2.0 started to form. This was a shift in the web's paradigm towards more shittiness such as more JavaScript, bloat, interactivity, websites as programs, Flash, social networks etc. This would be the beginning of the web's downfall.
How It Works
It's all pretty well known, but in case you're a nub...
Users browse the Internet using web browsers, programs made specifically for this purpose. Pages on the Internet are addressed by their URL, a kind of textual address such as http://www.mysite.org/somefile.html. This address is entered into the web browser, which then retrieves the page and displays it.
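The address itself has a simple structure; here is a rough breakdown of the example address above (a simplified sketch, real URLs may additionally contain e.g. a port number or a query string):

  http://www.mysite.org/somefile.html
  |      |              |
  scheme domain (host)  path (the requested file on the server)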
A webpage can contain text, pictures, graphics and nowadays even other media like video, audio and even programs that run in the browser. Most importantly webpages are hypertext, i.e. they may contain clickable references to other pages -- clicking a link immediately opens the linked page.
The page itself is written in the HTML language (not really a programming language, more like a file format), a relatively simple language that allows specifying the structure of the text (headings, paragraphs, lists, ...), inserting links, images etc. In newer browsers there are additionally two more important languages that are used with websites (they can be embedded into the HTML file or come in separate files): CSS which allows specifying the look of the page (e.g. text and font color, background images, position of individual elements etc.) and JavaScript which can be used to embed scripts (small programs) into webpages which will run on the user's computer (in the browser). These languages combined make it possible to make websites do almost anything, even display advanced 3D graphics, play movies etc. However, it's all huge bloat, it's pretty slow and also dangerous, it was better when webpages used to be HTML only.
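To get a feel for it, here is a minimal sketch of a complete webpage combining all three languages (the title, colors, file name and text are of course made up just for illustration; a good old HTML-only site would simply leave out the style and script parts):

<!DOCTYPE html>
<html>
  <head>
    <title>My Tiny Website</title>
    <style>
      /* CSS: says how the page should look */
      body { background-color: white; color: black; }
      h1   { color: darkgreen; }
    </style>
  </head>
  <body>
    <h1>Hello!</h1>
    <p>This is a paragraph with a <a href="other.html">link to another page</a>.</p>
    <button onclick="greet()">click me</button>
    <script>
      /* JavaScript: a small program the browser runs (NOT needed for a simple site) */
      function greet() { alert("hello from JavaScript"); }
    </script>
  </body>
</html>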
The webpages are stored on web servers, i.e. computers specialized in listening for requests and sending back requested webpages. If someone wants to create a website, he needs a server to host it on, so called hosting. This can be done by setting up one's own server -- so called self hosting -- but nowadays it's more comfortable to buy a hosting service from some company, e.g. a VPS. For running a website you'll also want to buy a web domain (like mydomain.com), i.e. the base part of the textual address of your site (there exist free hosting sites that even come with free domains if you're not picky, just search...).
When a user enters a URL of a page into the browser, the following happens (it's kind of simplified, there are caches etc.):
- The domain name (e.g. www.mysite.org) is converted into an IP address of the server the site is hosted on. This is done by asking a DNS server -- these are special servers that hold the database mapping domain names to IP addresses (when you buy a domain, you can edit its record in this database to make it point to whatever address you want).
- The browser sends a request for the given page to the IP address of the server. This is done via the HTTP (or HTTPS in the encrypted case) protocol (that's the http:// or https:// in front of the domain name) -- this protocol is a language via which web servers and clients talk (besides websites it can communicate additional data like passwords entered on the site, cookies etc.); a sample exchange is shown below after this list. (If the encrypted HTTPS protocol is used, encryption is performed with asymmetric cryptography using the server's public key whose digital signature additionally needs to be checked with some certificate authority.) This request is delivered to the server by the mechanisms and lower network layers of the Internet, typically TCP/IP.
- The server receives the request and sends back the webpage embedded again in an HTTP response, along with other data such as the error/success code.
- The client browser receives the page and displays it. If the page contains additional resources that are needed for displaying it, such as images, they are automatically retrieved the same way (of course things like caching may be employed so that the same image doesn't have to be redownloaded literally every time).
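To illustrate, here is a simplified sketch of what such an HTTP exchange may look like in plain text (real requests and responses carry many more headers, the exact values here are made up):

request sent by the browser:

  GET /somefile.html HTTP/1.1
  Host: www.mysite.org

response sent back by the server:

  HTTP/1.1 200 OK
  Content-Type: text/html

  <html><body><h1>Hi!</h1><p>This is the requested page.</p></body></html>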
Cookies, small files that sites can store in the user's browser, are used on the web to implement stateful behavior (e.g. remembering if the user is signed in on a forum). However cookies can also be abused for tracking users, so they can be turned off.
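For example (a made-up sketch), the server sets a cookie by adding a header to its HTTP response, and the browser then automatically sends it back with every further request to that site, which is how the server can recognize a returning user:

in the server's response:

  Set-Cookie: session=abc123

in every subsequent request from that browser:

  Cookie: session=abc123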
Other programming languages such as PHP can also be used on the web, but they are used for server-side programming, i.e. they don't run in the web browser but on the server and somehow generate and modify the sites for each request specifically. This makes it possible to create dynamic pages such as search engines or social networks.
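For a rough idea, here is a trivial sketch of such a server-side script in PHP (the file name and variable are made up; the browser never sees this code, it only receives the HTML the script prints out):

<?php
  /* greet.php: run on the server, e.g. for the URL greet.php?name=Alice */
  $name = htmlspecialchars($_GET["name"]); /* value taken from the URL */
  echo "<html><body><p>Hello, " . $name . "!</p></body></html>";
?>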