# Determinism
*"God doesn't play dice."* --[some German dude](einstein.md)
A deterministic system (such as a computer program or an equation) is one that evolves over time without any involvement of [randomness](randomness.md); i.e. its current state, along with the rules according to which it behaves, unambiguously and precisely determines its following states. This means that a deterministic [algorithm](algorithm.md) will always give the same result if run multiple times with the same input values. Determinism is an extremely important concept in [computer science](compsci.md) and [programming](programming.md) (and in many other fields of science and philosophy). For example [game of life](game_of_life.md) is a deterministic system while a [Markov chain](markov_chain.md) is not.
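To make the distinction concrete, here is a minimal [C](c.md) sketch (the functions are made up just for illustration): the first function is deterministic, the second one isn't from the caller's point of view because its result also depends on a random number.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* deterministic: same input always gives the same output */
int addChecksum(int x)
{
  return x + x % 7;
}

/* non-deterministic (from the caller's point of view):
   the result also depends on a random value */
int addNoise(int x)
{
  return x + rand() % 7;
}

int main(void)
{
  srand((unsigned int) time(NULL));

  printf("%d %d\n",addChecksum(10),addChecksum(10)); // always 13 13
  printf("%d %d\n",addNoise(10),addNoise(10));       // generally differs between runs

  return 0;
}
```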
Determinism is also a **philosophical theory** and an aspect of physics theories -- here it signifies that our Universe is deterministic, i.e. that everything is already predetermined by the state of the universe and the laws of physics, i.e. that we don't have "[free will](free_will.md)" (whatever that means) because our brains are just machines following the laws of physics like any other matter. Many normies believe [quantum physics](quantum.md) disproves determinism, which is however not the case: there may e.g. exist hidden variables that still make quantum physics deterministic. Some believe the Bell test disproved hidden variables, but again this is NOT the case, as it relies on the statistical independence of the experimenters; determinism remains possible if we consider that the choices of the experimenters are also predetermined (this is called [superdeterminism](superdeterminism.md)). [Einstein](einstein.md) and many others still believed determinism was the way the Universe works even after quantum physics emerged. { This also seems correct to me. Sabine Hossenfelder is another popular physicist promoting determinism. ~drummyfish } Anyway, this already goes beyond the scope of determinism in technology.
[Computers](computer.md) are mostly deterministic by nature and design; they operate by strict rules and engineers normally try to eliminate any random behavior, as that is mostly undesirable (with certain exceptions mentioned below) -- randomness leads to hard-to-detect and hard-to-fix [bugs](bug.md), unpredictability etc. Determinism furthermore has many advantages, for example if we want to record the behavior of a deterministic system, it is enough to record only the inputs to the system, without the need to record its state, which saves a great amount of space -- if we later want to replay the system's behavior, we simply rerun the system with the recorded inputs and its behavior will be the same as before (this is exploited e.g. in recording gameplay demos in video [games](game.md) such as [Doom](doom.md)).
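A tiny sketch of the input recording idea (hypothetical names and rules, not code from any actual game): because the state update is deterministic, storing just the inputs is enough to replay the whole run exactly.

```c
#include <stdio.h>

#define STEPS 8

/* some deterministic "game" state update: the new state depends
   only on the old state and the player input */
unsigned int step(unsigned int state, unsigned char input)
{
  return state * 31 + input;
}

int main(void)
{
  unsigned char recordedInputs[STEPS] = {1,0,0,2,2,1,0,3}; // the whole "demo"

  unsigned int state = 0;

  for (int i = 0; i < STEPS; ++i)
    state = step(state,recordedInputs[i]);

  printf("final state: %u\n",state); // same on every replay

  return 0;
}
```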
Determinism can however also pose a problem, notably e.g. in cryptography, where we DO want true randomness, e.g. when generating [seeds](seed.md). Determinism in this case implies that an attacker who knows the conditions under which we generated the seed can exactly replicate the process and arrive at the seed value that's supposed to be random and secret. For this reason some [CPUs](cpu.md) come with special hardware for generating truly random numbers.
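The following hedged sketch shows the general danger: if a "random" secret is derived deterministically from a guessable seed, such as the current time, anyone who guesses the seed can recompute the secret.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
  unsigned int seed = (unsigned int) time(NULL); // guessable!

  srand(seed);
  int secretKey = rand(); // "random" secret derived from the seed

  // an attacker who guesses the seed (e.g. knows roughly when the
  // key was made) just reruns the same deterministic process:
  srand(seed);
  int attackerKey = rand();

  printf("%d %d\n",secretKey,attackerKey); // identical

  return 0;
}
```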
Despite the natural determinism of computers as such, **computer programs nowadays aren't always automatically deterministic** -- if you're writing a typical interactive computer program under some operating system, you have to make a bit of extra effort to make it deterministic. This is because there are things such as possible differences in timings or the not perfectly specified behavior of [floating point](float.md) types in your language; for example a game running on a slower computer will render fewer [frames per second](fps.md), and if it has FPS-dependent physics, the time step of the physics engine will be longer on this computer, possibly resulting in slightly different physics behavior due to rounding errors. This means that such a program, run with the same input data, will produce different results on different computers or under slightly different circumstances, i.e. it is non-deterministic.
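As a rough illustration (the numbers and the simplistic Euler integration are invented for this example), here is how an FPS-dependent time step alone already makes the simulated result differ between a slow and a fast machine:

```c
#include <stdio.h>

/* simulate falling for 1 second with a given number of frames:
   the time step, and therefore the result, depends on the FPS */
float fall(int fps)
{
  float position = 0, velocity = 0, dt = 1.0f / fps;

  for (int i = 0; i < fps; ++i)
  {
    velocity += 9.81f * dt;
    position += velocity * dt;
  }

  return position;
}

int main(void)
{
  printf("30 FPS machine: %f\n",fall(30)); // the two results
  printf("60 FPS machine: %f\n",fall(60)); // don't match
  return 0;
}
```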
Nevertheless **we almost always want our programs to be deterministic** (or at least deterministic under some conditions, e.g. on the specific hardware platform we are using), so always try to make your programs deterministic unless you have a VERY good reason not to! **It doesn't take a huge effort to achieve determinism**, it's more about taking the right design decisions (e.g. separating rendering and physics simulation, as sketched below), i.e. good programming leads to determinism and vice versa: determinism in your program indicates good programming. The reason why we want determinism is that such programs have great properties, e.g. easier debugging (bugs are reproducible just by knowing the exact inputs), easy and efficient recording of activity (e.g. demos in games), sometimes even time reversibility (like undos, but watch out -- this doesn't hold in general!). Determinism also itself serves as a kind of [test](test.md) of whether the program is working right -- if your program can take recorded inputs and reproduce the same behavior at every run, it shows that it's written well, without things like [undefined behavior](undefined_behavior.md) affecting it.
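A minimal sketch of that design decision follows (the usual fixed time step pattern, not code from any specific engine): the physics always advances by the same constant dt regardless of rendering speed, so the result no longer depends on the FPS.

```c
#include <stdio.h>

#define PHYSICS_DT (1.0f / 64.0f) // constant physics time step
#define PHYSICS_STEPS 64          // steps needed to simulate 1 second

/* simulate falling for 1 second; the rendering FPS only decides how
   the physics steps are grouped into frames, not how big they are */
float fall(int fps)
{
  float position = 0, velocity = 0;
  int stepsDone = 0;

  for (int frame = 0; frame < fps; ++frame)
  {
    // in a real game we'd render here; physics runs in fixed steps:
    while (stepsDone < ((frame + 1) * PHYSICS_STEPS) / fps)
    {
      velocity += 9.81f * PHYSICS_DT;
      position += velocity * PHYSICS_DT;
      stepsDone++;
    }
  }

  return position;
}

int main(void)
{
  printf("30 FPS machine: %f\n",fall(30)); // now both machines
  printf("60 FPS machine: %f\n",fall(60)); // give the same result
  return 0;
}
```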
{ The previous paragraph is here because I've talked to people who thought that determinism was some UBER feature that requires a lot of work and so on ("OMG Trackmania is deterministic, what a feat!") -- this is NOT the case. It may intuitively seem so to non-programmers or beginners, but it really isn't. Non-determinism in software usually appears due to a fuck up, ignorance or a bad design choice made by someone with low competence. Trust me, determinism is practically always the correct way of making programs and it is NOT hard to do. ~drummyfish }
**Even if we're creating a program that somehow works with probability, we usually want to make it deterministic!** This means we don't use actual random numbers but rather [pseudorandom](pseudorandomness.md) number generators that output [chaotic](chaos.md) values which simulate randomness, but which will nevertheless be exactly the same when the program is run multiple times with the same initial seed. This is again important e.g. for [debugging](debugging.md) the system, in which replicating the bug is key to fixing it. If under normal circumstances you want the program to really behave differently on each run, you make it so only by altering its initial random [seed](seed.md).
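For example (a simplistic linear congruential generator shown purely for illustration, don't rely on it for anything serious), the same seed reproduces exactly the same "random" sequence on every run:

```c
#include <stdio.h>

unsigned int prngState;

void prngSeed(unsigned int seed)
{
  prngState = seed;
}

/* simple linear congruential generator: completely deterministic,
   the whole sequence follows from the seed */
unsigned int prngNext(void)
{
  prngState = prngState * 1103515245 + 12345;
  return (prngState / 65536) % 32768;
}

int main(void)
{
  for (int run = 0; run < 2; ++run)
  {
    prngSeed(123); // same seed => same "random" numbers every time

    for (int i = 0; i < 5; ++i)
      printf("%u ",prngNext());

    putchar('\n'); // both lines printed are identical
  }

  return 0;
}
```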
In theoretical computer science non-determinism means that a model of computation, such as a [Turing machine](turing_machine.md), may at certain points choose one of several possible actions, whichever is somehow most convenient, e.g. the one that will lead to finding a solution in the shortest time. In other words the model makes many computations, each along a different path, and at the end we conveniently pick the "best" one, e.g. the fastest. We may then talk e.g. about how the computational strength or speed of computation differs between a deterministic and a non-deterministic Turing machine etc.
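A deterministic computer can only simulate such a machine by exploring the branches itself; the following hedged sketch does this for the subset sum problem, where a non-deterministic machine would simply "guess" for each number whether to include it:

```c
#include <stdio.h>

/* a non-deterministic machine would just "guess" for each number
   whether to take it; deterministically we must try both branches */
int subsetSumExists(int *nums, int count, int target)
{
  if (target == 0)
    return 1;

  if (count == 0)
    return 0;

  return
    subsetSumExists(nums + 1,count - 1,target - nums[0]) || // branch: take it
    subsetSumExists(nums + 1,count - 1,target);             // branch: skip it
}

int main(void)
{
  int nums[] = {3, 34, 4, 12, 5, 2};

  printf("%d\n",subsetSumExists(nums,6,9));  // 1 (e.g. 4 + 5 = 9)
  printf("%d\n",subsetSumExists(nums,6,30)); // 0 (no subset sums to 30)

  return 0;
}
```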
**Determinism does NOT guarantee [reversibility](reversibility.md)**, i.e. if we know a state of a deterministic system, it may not always be possible to say from which state it evolved; in other words a system that's deterministic may or may not be deterministic in the reverse time direction. Reversibility is only possible if the rules of the system are such that no state can evolve from two or more different states. If this holds, then it is always possible to time-reverse the system and step it backwards to its initial state, which may be useful for things such as [undos](undo.md) in programs. Also note that even if a system is reversible, it may be computationally very time consuming and sometimes practically impossible to reverse it (imagine e.g. reversing a cryptographic [hash](hash.md) -- mathematical reversibility of such a hash may be arbitrarily ensured by e.g. pairing each hash with the lowest value that produces it).
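For illustration (the update rules here are toy examples invented for this paragraph), the first rule below is deterministic and reversible, while the second is deterministic but maps two different states to the same one, so it cannot be stepped backwards unambiguously:

```c
#include <stdio.h>

/* reversible rule: no two states map to the same result */
unsigned char stepReversible(unsigned char state)
{
  return (state + 13) % 256; // undo with: (state + 256 - 13) % 256
}

/* irreversible rule: e.g. states 6 and 7 both map to 3,
   so from 3 we can't tell which one we came from */
unsigned char stepIrreversible(unsigned char state)
{
  return state / 2;
}

int main(void)
{
  unsigned char s = 200;

  s = stepReversible(s);    // forward
  s = (s + 256 - 13) % 256; // and back again
  printf("%d\n",s);         // 200, the original state is recovered

  printf("%d %d\n",stepIrreversible(6),stepIrreversible(7)); // 3 3

  return 0;
}
```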
**Is [floating point](float.md) deterministic?** In theory even floating point arithmetic can of course be completely deterministic, but there is the question of whether this holds for concrete specifications and implementations of floating point (e.g. in different programming languages) -- here non-determinism may in theory arise e.g. from some unspecified behavior such as rounding rules. In practice you can't rely on float being deterministic. The common float standard, IEEE 754, is basically deterministic, including rounding etc. (except for the possible payload of [NaNs](nan.md), which shouldn't matter in most cases), but this e.g. doesn't hold for floating point types in [C](c.md)!
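For example the following snippet (plain standard C) shows one reason why letting a compiler or language reorder floating point operations already breaks exact reproducibility: floating point addition is not associative, so the evaluation order changes the result.

```c
#include <stdio.h>

int main(void)
{
  float big = 1000000000000.0f; // so large that adding 1 to it is lost in rounding

  // mathematically both expressions equal 1, but float addition
  // is not associative, so the evaluation order changes the result:
  float a = (big + (-big)) + 1.0f; // = 0 + 1 = 1
  float b = big + ((-big) + 1.0f); // = big + (-big) = 0, the 1 is lost

  printf("%f %f\n",a,b); // prints 1.000000 0.000000

  return 0;
}
```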