Miloslav Ciz 2024-12-06 03:33:05 +01:00
parent 9da425a4c1
commit a82ad31a6f
20 changed files with 1979 additions and 1862 deletions

@@ -10,7 +10,7 @@ Let's not [confuse](often_confused.md) numbers with digits or figures (numerals)
Humans first started to use positive natural numbers (it seems as early as 30000 BC), i.e. 1, 2, 3 ..., so as to be able to trade, count enemies, days and so on -- since then they kept expanding the concept of a number with more [abstraction](abstraction.md) as they encountered more complex problems. The first extension was to fractions, initially reciprocals of integers (like one half, one third, ...) and then general ones. Around the 6th century BC Pythagoras showed that there even exist numbers that cannot be expressed as fractions ([irrational numbers](irrational_number.md), which in the beginning was a controversial discovery), expanding the set of known numbers further. A bit later (around 100 BC) negative numbers started to be used. Adoption of the number [zero](zero.md) also took some time (the first use of a true zero seems to date to the 4th century BC), with it at first having only limited use as a mere placeholder digit. Since the 16th century the highly abstract concept of [complex numbers](complex_number.md) started to appear, which was later (19th century) expanded further to [quaternions](quaternion.md). With more advancement in mathematics -- e.g. with the development of set theory -- more and more concepts of new kinds of numbers appeared and still appear to this day. Nowadays we have highly abstract numbers, ones existing in many dimensions, capable of counting and measuring infinitely large and infinitely small entities, and it seems we still haven't nearly discovered everything there is to know about numbers.
-Basically **anything can be encoded as a number** which makes numbers a universal abstract "medium" -- we can exploit this in both mathematics and programming (which are actually the same thing). Ways of encoding [information](information.md) in numbers may vary, for a mathematician it is natural to see any number as a multiset of its [prime](prime.md) factors (e.g. 12 = 2 * 2 * 3, the three numbers are inherently embedded within number 12) that may carry a message, a programmer will probably rather encode the message in [binary](binary.md) and then interpret the 1s and 0s as a number in direct representation, i.e. he will embed the information in the digits. You can probably come up with many more ways.
+Basically **anything can be encoded as a number** which makes numbers a universal abstract "medium" -- we can exploit this in both mathematics and [programming](programming.md) (which are actually the same thing). Ways of encoding [information](information.md) in numbers may vary, for a mathematician it is natural to see any number as a multiset of its [prime](prime.md) factors (e.g. 12 = 2 * 2 * 3, the three numbers are inherently embedded within number 12) that may carry a message, a programmer will probably rather encode the message in [binary](binary.md) and then interpret the 1s and 0s as a number in direct representation, i.e. he will embed the information in the digits. You can probably come up with many more ways.
But what really is a number? What makes a number a number? Where is the border between numbers and other abstract objects? Essentially a number is an abstract mathematical object made to model something about [reality](irl.md) (most fundamentally the concept of counting, of expressing amount) which only becomes meaningful and useful through its relationships with other similar objects -- other numbers -- that are parts of the same, usually (but not necessarily) infinitely large set. We create systems to give these numbers names because, due to there being infinitely many of them, we can't name every single one individually, and so we have e.g. the [decimal](decimal.md) system in which the name 12345 exactly identifies a specific number, but we must realize these names are ultimately of no mathematical importance -- we may call a number 1, I, 2/2, "one", "uno" or "jedna", it doesn't matter -- what's important are the relationships between numbers that create a STRUCTURE. I.e. a set of infinitely many objects is just that and nothing more; it is the relationships that allow us to operate with numbers and that create the difference between integers, real numbers or the set of colors. These relationships are expressed by operations (functions, maps, ...) defined on the numbers: for example the comparison operation *is less than* (<), which takes two numbers, *x* and *y*, and always says either *yes* (*x* is smaller than *y*) or *no*, gives numbers their order: it creates the number line and allows us to count and measure. Number sets usually have similar operations, typically for example addition and multiplication, and this is how we intuitively judge what numbers are: they are sets of objects that have defined operations similar to those of natural numbers (the original "cavemen numbers"). However some more "advanced" kinds of numbers may have lost some of the simple operations -- for example [complex numbers](complex_number.md) are not so straightforward to compare -- and so they may get more and more distant from the original natural numbers. And this is why sometimes the border between what is and what isn't a number may be blurry -- for example it can't objectively be said if infinity is a number or not, simply because number sets that include infinity lose many of the nicely defined operations; the structure of the set changes a lot. So arguing about what is a number ultimately becomes subjective; it's similar to arguing about what is and isn't a planet.
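
The encoding paragraph above mentions two natural views of the same idea; here is a minimal C sketch of both (the message {2, 2, 3}, i.e. the number 12, and the 8 bit width are arbitrary choices for illustration): first the embedded prime factor multiset is recovered by trial division, then the same number is printed as its binary digits.

```c
#include <stdio.h>

int main(void)
{
  unsigned int n = 2 * 2 * 3; /* mathematician's view: 12 carries the multiset {2, 2, 3} */
  unsigned int d;
  int i;

  /* recover the embedded factors by trial division */
  printf("%u =", n);

  for (d = 2; n > 1; )
    if (n % d == 0)
    {
      printf(" %u", d);
      n /= d;
    }
    else
      d++;

  putchar('\n');

  /* programmer's view: the same number as its binary digits */
  n = 12;

  for (i = 7; i >= 0; i--)
    putchar((n >> i) & 1 ? '1' : '0'); /* prints 00001100 */

  putchar('\n');

  return 0;
}
```

Both halves print the information back out, showing that the number 12 alone, given an agreed upon decoding, suffices to carry the message.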
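
And to make the point about operations creating structure concrete, here is a minimal C sketch in the same spirit (the type and function names are made up just for this example): complex numbers keep the addition and multiplication of the reals, but notice there is deliberately no *is less than* function, as no single natural ordering of complex numbers exists.

```c
#include <stdio.h>

typedef struct { double re, im; } Complex;

Complex complexAdd(Complex a, Complex b)
{
  return (Complex) { a.re + b.re, a.im + b.im };
}

Complex complexMul(Complex a, Complex b)
{
  /* (a + bi)(c + di) = (ac - bd) + (ad + bc)i */
  return (Complex) { a.re * b.re - a.im * b.im,
                     a.re * b.im + a.im * b.re };
}

int main(void)
{
  Complex a = { 1, 2 }, b = { 3, -1 };
  Complex s = complexAdd(a, b), p = complexMul(a, b);

  printf("sum: %f + %f i\n", s.re, s.im);     /* 4 + 1i */
  printf("product: %f + %f i\n", p.re, p.im); /* 5 + 5i */

  /* but is 1 + 2i "less than" 3 - 1i? there is no canonical
     answer, unlike with natural numbers */

  return 0;
}
```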