Miloslav Ciz 2023-04-02 20:57:34 +02:00
parent 138510fee5
commit 018a44958c
5 changed files with 112 additions and 4 deletions

@@ -1,6 +1,6 @@
# Doom
Doom is a legendary video [game](game.md) released in 1993, perhaps the most famous video game of all time, the game that popularized the [first person shooter](first_person_shooter.md) genre and shocked players with its at the time extremely advanced [3Dish](pseudo_3D.md) graphics. It was made by [Id Software](id_software.md), most notably by [John Carmack](john_carmack.md) (graphics + engine programmer) and [John Romero](john_romero.md) (tool programmer + level designer). Doom is sadly [proprietary](proprietary.md); it was originally distributed as [shareware](shareware.md) (a free "demo" was available for playing and sharing with the option to buy a full version). However the game engine was later (1999) released as [free (as in freedom) software](free_software.md) under the [GPL](gpl.md), which gave rise to many source [ports](port.md). The assets remain non-free but a completely free alternative is offered by the [Freedoom](freedoom.md) project that has created [free as in freedom](free_culture.md) asset replacements for the game. [Anarch](anarch.md) is an official [LRS](lrs.md) game inspired by Doom, completely in the [public domain](public_domain.md).
{ Great books about Doom I can recommend: *Masters of Doom* (about the development) and *Game Engine Black Book: Doom* (details about the engine internals). ~drummyfish }

@@ -1,6 +1,6 @@
# Drummyfish
Drummyfish (also known as *tastyfish*, *drummy*, *drumy*, *smellyfish* and *i forcefeed my diarrhea to capitalism*) is a programmer, [anarchopacifist](anpac.md) and proponent of [free software/culture](free_software.md), who started [this wiki](lrs_wiki.md) and invented the kind of software it focuses on: [less retarded software](lrs.md) (LRS). Besides others he has written [Anarch](anarch.md), [small3dlib](small3dlib.md), [raycastlib](raycastlib.md), [smallchesslib](smallchesslib.md), [tinyphysicsengine](tinyphysicsengine.md) and [SAF](saf.md). He has also been creating free culture art and otherwise contributing to free projects such as [OpenMW](openm.md); he's been contributing with [public domain](pd.md) art of all kinds (2D, 3D, music, ...) and writings to [Wikipedia](wikipedia.md), [Wikimedia Commons](wm_commons.md), [opengameart](oga.md), [libregamewiki](lgw.md), freesound and others. Drummyfish is crazy, suffering from anxiety/depression/etcetc. (diagnosed [avoidant personality disorder](avpd.md)), and has no [real life](irl.md); he is pretty retarded when it comes to leading projects or otherwise dealing with people or practical life. He is a [wizard](wizard.md).
He loves all living beings, even those whose attributes he hates or who hate him. He is a [vegetarian](vegetarianism.md) and here and there supports good causes, for example he donates hair and gives money to homeless people who ask for it.

@@ -8,7 +8,7 @@ Information is knowledge that can be used for making decisions. Information is i
In [computer science](compsci.md) the basic unit of information amount is 1 **[bit](bit.md)** (for *binary digit*), also known as [shannon](shannon.md). It represents a choice of two possible options, for example an answer to a *yes/no* question (with each answer being equally likely), or one of two [binary](binary.md) digits: 0 or 1. From this we derive higher units such as [bytes](byte.md) (8 bits), [kilobytes](memory_units.md) (1000 bytes) etc. Other units of information include [nat](nat.md) or [hart](hart.md). With enough bits we can encode any information including text, sounds and images. For this we invent various [formats](file_format.md) and encodings with different properties: some encodings may for example contain [redundancy](redundancy.md) to ensure the encoded information is preserved even if the data is partially lost. Some encodings may try to hide the contained information (see [encryption](encryption.md), [obfuscation](obfuscation.md), [steganography](steganography.md)). For processing information we create [algorithms](algorithm.md) which we usually execute with [computers](computer.md). We may store information (contained in data) in physical media such as [books](book.md), computer [memory](memory.md) or computer storage media such as [CDs](cd.md), or even with traditional potentially [analog](analog.md) media such as photographs.
Keep in mind that the **amount of physically present bits doesn't have to equal the amount of information** because, as mentioned above, data that takes *N* bits may e.g. utilize redundancy and so store less information than would theoretically be possible with *N* bits. It may happen that the stored bits are [correlated](correlation.md) for any reason or that different binary values convey the same information (e.g. in some number encodings there are two values for number zero: positive and negative). All this means that the amount of information we receive in *N* bit data may be lower (but never higher) than *N* bits, i.e. if we e.g. store a file on a 1 GB flash drive, the actual theoretical information contained may be lower -- the exact size of this theoretical information depends on the probabilities of what can really appear in the file and MAY CHANGE with the knowledge we possess, i.e. the amount of information stored on the flash drive may change simply by us coming to know that the file stored on the drive is a movie about cats, which rules out many combinations of bits that could be stored there. Imagine a simplified case in which a file says whether there exist infinitely many [prime numbers](prime.md) -- to a mathematician who already knows the answer the file gives zero information, while to someone who doesn't know the answer the file provides 1 bit of information. However in practice we often make the simplification of equating the amount of physically present bits to the contained "information".
Information is related to **information [entropy](entropy.md)** (also Shannon entropy, similar to but distinct from the concept of thermodynamic entropy in physics); they're both measured in the same units (usually [bits](bit.md)) but entropy measures a kind of "uncertainty" or the average information received from a certain event when we know its probability distribution -- in a sense information and entropy can be seen as opposites: before we receive information we lack the information but there exists entropy, once we receive the information there is information but no entropy.
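For example, the entropy (in bits) of a simple discrete probability distribution can be computed with a few lines of C; the following is only a rough sketch (the coin distributions in it are made-up illustrative values):

```
#include <stdio.h>
#include <math.h>

/* Shannon entropy (in bits) of a discrete probability distribution */
double entropy(const double *probabilities, int count)
{
  double result = 0;

  for (int i = 0; i < count; ++i)
    if (probabilities[i] > 0)           /* 0 * log(0) is taken as 0 */
      result -= probabilities[i] * log2(probabilities[i]);

  return result;
}

int main(void)
{
  double fairCoin[]   = {0.5, 0.5};     /* illustrative distributions */
  double unfairCoin[] = {0.9, 0.1};

  printf("%f\n", entropy(fairCoin, 2));   /* 1.000000, i.e. 1 bit        */
  printf("%f\n", entropy(unfairCoin, 2)); /* ~0.47, less "uncertainty"   */

  return 0;
}
```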

@@ -21,7 +21,7 @@ We may kind of see vectors as matrices that have either only one column, so call
|5 7.3 -2|
```
is really a 1x3 matrix that as a column vector (3x1 matrix) would look as
```
|5 |

randomness.md (new file, 108 lines)

@@ -0,0 +1,108 @@
# Randomness
*Not to be confused with [pseudorandomness](pseudorandomness.md).*
TODO
## Truly Random Sequence Example
WORK IN PROGRESS { Also I'm not too good at statistics lol. ~drummyfish }
Here is a sequence of bits which we most definitely could consider truly random as it was generated by physical coin tosses:
{ The method I used to generate this: I took a plastic bowl and 10 coins, then for each round I threw the coins into the bowl, shook them (without looking, just in case), then rapidly turned it around and smashed it against the ground. I took the bowl up and wrote the ten generated bits by reading the coins kind of from "top left to bottom right" (heads being 1, tails 0). ~drummyfish }
```
00001110011101000000100001011101111101010011100011
01001101110100010011000101101001000010111111101110
10110110100010011011010001000111011010100100010011
11111000111011110111100001000000001101001101010000
11111111001000111100100011010110001011000001001000
10001010111110100111110010010101001101010000101101
10110000001101001010111100100100000110000000011000
11000001001111000011011101111110101101111011110111
11010001100100100110001111000111111001101111010010
10001001001010111000010101000100000111010110011000
00001010011100000110011010110101011100101110110010
01010010101111101000000110100011011101100100101001
00101101100100100101101100111101001101001110111100
11001001100110001110000000110000010101000101000100
00110111000100001100111000111100011010111100011011
11101111100010111000111001010110011001000011101000
01001111100101001100011100001111100011111101110101
01000101101100010000010110110000001101001100100110
11101000010101101111100111011011010100110011110000
10111100010100000101111001111011010110111000010101
```
Let's now take a look at how random the sequence looks, i.e. basically how likely it is that tossing fair coins gives us a sequence whose statistical properties (such as the ratio of 1s and 0s) are like those of the sequence we obtained.
There are **494 1s and 506 0s**, i.e. the ratio is approximately 0.976, deviating from 1.0 (the value the ratio should converge to with infinitely many coin tosses) by only 0.024. We can use the [binomial distribution](binomial_distribution.md) to calculate the "rarity" of getting this deviation or a higher one; here we get about 0.728, i.e. a pretty high probability, meaning that if we perform 1000 coin tosses like we did here, we may expect to get this deviation or a higher one in more than 70% of cases (if on the other hand we only got e.g. 460 1s, this probability would be only 0.005, suggesting the coins we used weren't fair). If we take a look at how the ratio (rounded to two fractional digits) evolves after each additional round of 10 coin tosses, we see it gets pretty close to 1 after only about 60 tosses and stabilizes quite nicely after about 100 tosses: 0.67, 0.54, 0.67, 0.90, 0.92, 1.00, 0.94, 0.90, 0.88, 1.00, 1.04, 1.03, 0.97, 1.00, 0.97, 1.03, 1.10, 1.02, 0.98, 0.96, 1.02, 1.02, 1.02, 1.00, 0.95, 0.95, 0.99, 0.99, 0.99, 0.97, 0.95, 0.95, 0.96, 0.93, 0.90, 0.88, 0.90, 0.93, 0.95, 0.98, 0.98, 0.97, 0.97, 0.99, 1.00, 0.98, 0.98, 0.98, 0.97, 0.96, 0.95, 0.94, 0.95, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.95, 0.96, 0.96, 0.97, 0.97, 0.97, 0.95, 0.94, 0.93, 0.93, 0.93, 0.94, 0.94, 0.94, 0.96, 0.95, 0.96, 0.96, 0.95, 0.96, 0.95, 0.95, 0.96, 0.97, 0.97, 0.96, 0.96, 0.95, 0.95, 0.95, 0.96, 0.97, 0.97, 0.97, 0.97, 0.96, 0.97, 0.98, 0.98.
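The 0.728 value can be checked e.g. with a rough C sketch like the one below: it simply sums the two tails of the binomial distribution for our count of 1s (the helper function and the hardcoded numbers are only illustrative).

```
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* probability of getting exactly k heads in n fair coin tosses,
   computed via lgamma to avoid overflowing factorials */
double binomialPMF(int n, int k)
{
  return exp(lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1) + n * log(0.5));
}

int main(void)
{
  int n = 1000, ones = 494;          /* our result: 494 1s out of 1000 tosses */
  int deviation = abs(ones - n / 2); /* 6 */
  double p = 0;                      /* probability of deviating by >= 6      */

  for (int k = 0; k <= n; ++k)
    if (abs(k - n / 2) >= deviation)
      p += binomialPMF(n, k);

  printf("%f\n", p); /* prints approximately 0.728 */

  return 0;
}
```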
Let's try the [chi-squared test](chi_squared_test.md) (the kind of basic "randomness" test): *D = (494 - 500)^2 / 500 + (506 - 500)^2 / 500 = 0.144*; now in the table of the chi-squared distribution for 1 degree of freedom (i.e. two categories, 0 and 1, minus one) we see this value of *D* falls somewhere around 30%, which is not super low but not very high either, so we can see the test doesn't invalidate the hypothesis that we got the numbers from a uniform random number generator. { I did this according to Knuth's *Art of Computer Programming* where he performed a test with dice and arrived at a number between 25% and 50% which he interpreted in the same way. For a scientific paper such confidence would of course be unacceptable because there we try to "prove" the validity of our hypothesis. Here we set a much lower confidence level as we're only trying not to fail the test. To get better confidence we'd probably have to perform many more than 1000 tosses. ~drummyfish }
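The same tiny computation written out as a C sketch (with the counts from above hardcoded):

```
#include <stdio.h>

int main(void)
{
  double observed[2] = {506, 494}; /* observed counts of 0s and 1s   */
  double expected = 500;           /* 1000 tosses over 2 categories  */
  double D = 0;

  for (int i = 0; i < 2; ++i)
  {
    double d = observed[i] - expected;
    D += d * d / expected;
  }

  printf("D = %f\n", D); /* 0.144, to be looked up in the chi-squared
                            table for 1 degree of freedom */
  return 0;
}
```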
We can try to convert this to sequences of integers of different binary sizes and just "intuitively" see if those sequences still look random, i.e. if there are no patterns such as e.g. the numbers only being odd or the histograms of the sequences being too unbalanced; we could also possibly repeat the chi-squared test etc.
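The regrouping of bits into integers can be done e.g. with a rough C sketch like this one (only the first 20 bits of the sequence are hardcoded here for brevity, *printAsIntegers* is just an illustrative helper):

```
#include <stdio.h>
#include <string.h>

/* print the bit string regrouped into groupSize bit unsigned integers
   (most significant bit first), leftover bits are ignored */
void printAsIntegers(const char *bits, int groupSize)
{
  int length = (int) strlen(bits);

  for (int i = 0; i + groupSize <= length; i += groupSize)
  {
    unsigned int value = 0;

    for (int j = 0; j < groupSize; ++j)
      value = (value << 1) | (bits[i + j] == '1');

    printf("%u ",value);
  }

  putchar('\n');
}

int main(void)
{
  const char *bits = "00001110011101000000"; /* first 20 bits of the sequence */

  printAsIntegers(bits,10); /* 57 832     */
  printAsIntegers(bits,5);  /* 1 25 26 0  */
  printAsIntegers(bits,4);  /* 0 14 7 4 0 */

  return 0;
}
```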
The sequence as 100 10 bit integers (numbers from 0 to 1023) is:
```
57 832 535 501 227 311 275 90 267 1006
730 155 273 874 275 995 759 528 52 848
1020 572 565 556 72 555 935 805 309 45
704 842 969 24 24 772 963 479 695 759
838 294 241 998 978 548 696 337 29 408
41 774 429 370 946 330 1000 104 886 297
182 293 719 308 956 806 398 12 84 324
220 268 911 107 795 958 184 917 612 232
318 332 451 911 885 278 784 364 52 806
929 367 630 851 240 753 261 926 859 533
```
As 200 5 bit integers (numbers from 0 to 31):
```
1 25 26 0 16 23 15 21 7 3 9 23 8 19 2 26 8 11 31 14
22 26 4 27 8 17 27 10 8 19 31 3 23 23 16 16 1 20 26 16
31 28 17 28 17 21 17 12 2 8 17 11 29 7 25 5 9 21 1 13
22 0 26 10 30 9 0 24 0 24 24 4 30 3 14 31 21 23 23 23
26 6 9 6 7 17 31 6 30 18 17 4 21 24 10 17 0 29 12 24
1 9 24 6 13 13 11 18 29 18 10 10 31 8 3 8 27 22 9 9
5 22 9 5 22 15 9 20 29 28 25 6 12 14 0 12 2 20 10 4
6 28 8 12 28 15 3 11 24 27 29 30 5 24 28 21 19 4 7 8
9 30 10 12 14 3 28 15 27 21 8 22 24 16 11 12 1 20 25 6
29 1 11 15 19 22 26 19 7 16 23 17 8 5 28 30 26 27 16 21
```
Which has the following histogram:
```
number: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
count: 6 6 3 6 5 5 7 5 11 10 7 6 7 3 4 5
number: 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31
count: 7 9 3 5 4 8 7 8 9 4 8 6 8 6 6 6
```
And as 250 4 bit integers (numbers from 0 to 15):
```
0 14 7 4 0 8 5 13 15 5 3 8 13 3 7 4 4 12 5 10 4 2 15 14 14
11 6 8 9 11 4 4 7 6 10 4 4 15 14 3 11 13 14 1 0 0 13 3 5 0
15 15 2 3 12 8 13 6 2 12 1 2 2 2 11 14 9 15 2 5 4 13 4 2 13
11 0 3 4 10 15 2 4 1 8 0 6 3 0 4 15 0 13 13 15 10 13 14 15 7
13 1 9 2 6 3 12 7 14 6 15 4 10 2 4 10 14 1 5 1 0 7 5 9 8
0 10 7 0 6 6 11 5 7 2 14 12 9 4 10 15 10 0 6 8 13 13 9 2 9
2 13 9 2 5 11 3 13 3 4 14 15 3 2 6 6 3 8 0 12 1 5 1 4 4
3 7 1 0 12 14 3 12 6 11 12 6 15 11 14 2 14 3 9 5 9 9 0 14 8
4 15 9 4 12 7 0 15 8 15 13 13 5 1 6 12 4 1 6 12 0 13 3 2 6
14 8 5 6 15 9 13 11 5 3 3 12 2 15 1 4 1 7 9 14 13 6 14 1 5
```
This has the following histogram:
```
number: 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15
count: 18 14 19 18 23 15 18 11 11 14 9 10 13 20 18 19
```
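Computing such histograms is trivial, e.g. as in the following rough C sketch (only the first few values of the 4 bit sequence are hardcoded here for brevity):

```
#include <stdio.h>

#define MAX_VALUE 15 /* histogram of 4 bit values, i.e. 0 to 15 */

int main(void)
{
  int values[] = {0, 14, 7, 4, 0, 8, 5, 13, 15, 5}; /* first few 4 bit values */
  int count = sizeof(values) / sizeof(values[0]);
  int histogram[MAX_VALUE + 1] = {0};

  for (int i = 0; i < count; ++i)
    histogram[values[i]]++;        /* count occurrences of each value */

  for (int i = 0; i <= MAX_VALUE; ++i)
    printf("%d: %d\n", i, histogram[i]);

  return 0;
}
```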
TODO: see how much some compression program can compress it? Visualize it somehow to reveal correlations?