Miloslav Ciz 2024-05-14 13:35:32 +02:00
parent fc95f9c631
commit bf2ed8760e
12 changed files with 1919 additions and 1755 deletions

@@ -45,7 +45,7 @@ Computers are theoretically studied by [computer science](compsci.md). The kind
**The power of computers is mathematically limited**: [Alan Turing](turing.md) mathematically proved that there exist problems that can never be completely solved by any [algorithm](algorithm.md), i.e. there are problems a computer (including our [brain](brain.md)) will never be able to solve (even if a solution exists). This is related to the fact that the power of mathematics itself is limited in a similar way (see [Godel's theorems](incompleteness_theorems.md)). Turing also invented the theoretical model of a computer called the [Turing machine](turing_machine.md) (a minimal simulator of one is sketched below). Besides the mentioned theoretical limitation, many solvable problems may take too long to compute, at least with the computers we currently know (see [computational complexity](computational_complexity.md) and [P vs NP](p_vs_np.md)).
-And let's also mention some [curious](interesting.md) **statistics** and facts about computers as of the year 2024. The first computer in the modern sense of the word is frequently considered to have been the Analytical Engine, designed in 1837 by the Englishman Charles Babbage, a general purpose [mechanical computer](mechanical_computer.md) which he, however, never constructed. After this, computers such as the Z1 (1938) and Z3 (1941) of the German inventor Konrad Zuse are considered to be the truly first "modern" computers. Shortly after the year 2000 the number of US households that had a computer surpassed 50%. The fastest [supercomputer](supercomputer.md) of today is Frontier (Tennessee, [USA](usa.md)), which achieved a computation speed of 1.102 exaFLOPS (that is over 10^18 [floating point](float.md) operations per second) with a power consumption of 22.7 MW, using the [Linux](linux.md) kernel (like all top 500 supercomputers). Over time transistors have been getting much smaller -- there is the famous **[Moore's law](moores_law.md)** which states that the number of transistors in a chip doubles about every two years. Currently we are able to manufacture [transistors](transistor.md) as small as a few nanometers and chips have billions of them. { There's some blurriness about exact size, apparently the new "X nanometers" labels are just [marketing](marketing.md) lies. ~drummyfish }
+And let's also mention some [curious](interesting.md) **statistics** and facts about computers as of the year 2024. The first computer in the modern sense of the word is frequently considered to have been the Analytical Engine, designed in 1837 by the Englishman Charles Babbage, a general purpose [mechanical computer](mechanical_computer.md) which he, however, never constructed. After this, computers such as the Z1 (1938) and Z3 (1941) of the German inventor Konrad Zuse are considered to be the truly first "modern" computers. Shortly after the year 2000 the number of US households that had a computer surpassed 50%. The fastest [supercomputer](supercomputer.md) of today is Frontier (Tennessee, [USA](usa.md)), which achieved a computation speed of 1.102 exaFLOPS (that is over 10^18 [floating point](float.md) operations per second) with a power consumption of 22.7 MW, using [Linux](linux.md) as its kernel (like all top 500 supercomputers). Over time transistors have been getting much smaller -- there is the famous **[Moore's law](moores_law.md)** which states that the number of transistors in a chip doubles about every two years. Currently we are able to manufacture [transistors](transistor.md) as small as a few nanometers and chips have billions of them. { There's some blurriness about exact size, apparently the new "X nanometers" labels are just [marketing](marketing.md) lies. ~drummyfish }
## Typical Computer
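
The paragraph in the diff above mentions the [Turing machine](turing_machine.md) as the theoretical model of a computer. The following is only a minimal illustrative sketch of such a machine in C, under simplifying assumptions: a binary alphabet and a small fixed-size tape standing in for the theoretically infinite one. The hardcoded transition table is the classic 2-state "busy beaver" machine, which halts after 6 steps leaving four 1s on the tape.

```
#include <stdio.h>

#define TAPE_SIZE 32  /* small finite tape stands in for the infinite one */
#define STATE_HALT 2  /* pseudo-state meaning "the machine has halted" */

/* one transition rule: symbol to write, head move (-1 left, +1 right),
   next state */
typedef struct { int write, move, next; } Rule;

/* rules[state][read symbol]: the classic 2-state "busy beaver" machine,
   which halts after 6 steps with four 1s written on the tape */
Rule rules[2][2] = {
  /* state A: on 0 */ { {1,  1, 1},  /* on 1 */ {1, -1, 1} },
  /* state B: on 0 */ { {1, -1, 0},  /* on 1 */ {1,  1, STATE_HALT} }
};

int main(void)
{
  int tape[TAPE_SIZE] = {0};
  int head = TAPE_SIZE / 2, state = 0, steps = 0, i;

  while (state != STATE_HALT && head >= 0 && head < TAPE_SIZE)
  {
    Rule r = rules[state][tape[head]];

    tape[head] = r.write;  /* write the new symbol */
    head += r.move;        /* move the head */
    state = r.next;        /* switch to the next state */
    steps++;
  }

  printf("halted after %d steps, tape: ", steps);

  for (i = 0; i < TAPE_SIZE; ++i)
    putchar('0' + tape[i]);

  putchar('\n');

  return 0;
}
```

A real Turing machine has an unbounded tape; the fixed array here only keeps the sketch self-contained, and the loop also stops if the head runs off the array.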