Determinism

"God doesn't play dice." --some German dude

A deterministic system (such as a computer program or an equation) is one which evolves over time without any involvement of randomness or probability; i.e. its current state, along with the rules according to which it behaves, unambiguously and precisely determines its following state. This means that a deterministic algorithm will always give the same result if run multiple times with the same input values. Determinism is an extremely important concept in computer science and programming (and in many other fields of science and philosophy).
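To illustrate with a trivial sketch (the function names here are made up just for demonstration): a deterministic function computes its result purely from its inputs, while a function that also secretly reads e.g. the system clock will give different results on different runs with the same inputs:

```c
#include <stdio.h>
#include <time.h>

/* deterministic: the output depends only on the inputs */
int addDeterministic(int a, int b)
{
  return a + b;
}

/* practically non-deterministic: the output also depends on the
   current time, which is not among the function's inputs */
int addNondeterministic(int a, int b)
{
  return a + b + time(NULL) % 2;
}

int main(void)
{
  printf("%d\n",addDeterministic(1,2));    /* always prints 3 */
  printf("%d\n",addNondeterministic(1,2)); /* prints 3 or 4 */
  return 0;
}
```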

Computers are mostly deterministic by nature and design: they operate by strict rules and engineers normally try to eliminate any random behavior, as that is mostly undesirable (with certain exceptions mentioned below) -- randomness leads to hard to detect and hard to fix bugs, unpredictability etc. Determinism furthermore has many advantages, for example if we want to record the behavior of a deterministic system, it is enough to record only the inputs to the system, without the need to record its state, which saves a great amount of space -- if we later want to replay the system's behavior, we simply rerun the system with the recorded inputs and its behavior will be the same as before (this is exploited e.g. in recording gameplay demos in video games such as Doom).
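A minimal sketch of this demo recording trick (with a hypothetical updateGame function, not Doom's actual code): because the state transition is a pure function of the previous state and the player input, storing just the inputs is enough to reproduce the entire run:

```c
#include <stdio.h>

#define FRAMES 4

/* deterministic state transition: the next state depends ONLY on the
   previous state and the input of that frame, nothing else */
int updateGame(int state, int input)
{
  return state * 31 + input;
}

int main(void)
{
  int recordedInputs[FRAMES] = {1, 0, 2, 1}; /* inputs saved during play */
  int state = 0;

  /* "replaying the demo": rerunning with the recorded inputs yields
     exactly the same sequence of states as the original run did */
  for (int i = 0; i < FRAMES; ++i)
  {
    state = updateGame(state,recordedInputs[i]);
    printf("frame %d: state = %d\n",i,state);
  }

  return 0;
}
```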

Determinism can however also pose a problem, notably e.g. in cryptography where we DO want true randomness, e.g. when generating seeds. Determinism in this case implies that an attacker who knows the conditions under which we generated the seed can exactly replicate the process and arrive at the seed value that's supposed to be random and secret. For this reason some CPUs come with special hardware for generating truly random numbers.
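For example on POSIX systems a program can take its seed from the operating system's entropy pool (which may in turn be fed by such hardware) instead of from something an attacker can guess, like the current time. A sketch, with error handling kept minimal:

```c
#include <stdio.h>

int main(void)
{
  unsigned int seed = 0;

  /* /dev/urandom is fed by the OS entropy pool; unlike a seed derived
     from the current time, an attacker can't simply replicate it */
  FILE *f = fopen("/dev/urandom","rb");

  if (f)
  {
    fread(&seed,sizeof(seed),1,f);
    fclose(f);
  }

  printf("seed: %u\n",seed);
  return 0;
}
```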

Despite the natural determinism of computers as such, computer programs aren't automatically deterministic -- if you're writing a computer program, you have to make some effort to make it deterministic. This is because of things such as undefined behavior. For example the behavior of your program may depend on timing (critical sections, ...), on the performance of the computer (a game running on a slower computer will render fewer frames per second, ...), on byte sex (big vs little endian), on accessing uninitialized memory (which many times contains undefined values) and on many more things. All this means that your program, run with the same input data, may produce different results on different computers or under slightly different circumstances, i.e. it would be non-deterministic.
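Byte sex alone can already make identical code behave differently on different machines; a small demonstration:

```c
#include <stdio.h>

int main(void)
{
  unsigned int x = 1;

  /* the first byte of a multibyte integer differs between little and
     big endian machines, so code relying on raw byte order will give
     different results on different computers */
  if (*((unsigned char *) &x) == 1)
    puts("little endian");
  else
    puts("big endian");

  return 0;
}
```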

Even if we're creating a program that somehow works with probability, we usually want to make it deterministic. This means we don't use actual random numbers but rather pseudorandom number generators that output chaotic values which simulate randomness, but which will nevertheless be exactly the same when the generator is run multiple times with the same initial seed. This is again important e.g. for debugging, in which replicating the bug is key to fixing it.
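A tiny example of such a generator (a linear congruential generator; the constants are the well known ones from the C standard's sample rand implementation):

```c
#include <stdio.h>

unsigned int prngState;

void prngSeed(unsigned int seed)
{
  prngState = seed;
}

/* linear congruential generator: the outputs look chaotic, but the
   same seed always produces the exact same sequence */
unsigned int prngNext(void)
{
  prngState = prngState * 1103515245 + 12345;
  return (prngState / 65536) % 32768;
}

int main(void)
{
  prngSeed(123); /* record this seed and the run can be replicated */

  for (int i = 0; i < 5; ++i)
    printf("%u\n",prngNext());

  return 0;
}
```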

In theoretical computer science non-determinism means that a model of computation, such as a Turing machine, may at certain points choose to make one of several possible actions, whichever is somehow most convenient, e.g. the one which will lead to finding a solution in the shortest time. In other words the model makes many computations, each taking a different path, and at the end we conveniently pick the "best" one, e.g. the fastest one. Then we may talk e.g. about how computational strength or speed of computation differ between a deterministic and a non-deterministic Turing machine etc.
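On a real (deterministic) computer such non-deterministic choices are typically simulated by simply trying all the branches. Here is a sketch in which a non-deterministic "guess" (whether to include a number in a subset) is simulated by exploring both possibilities and accepting if ANY branch succeeds:

```c
#include <stdio.h>

/* is there a subset of the count numbers in nums summing to target?
   each recursion level simulates one non-deterministic choice by
   deterministically trying both options */
int subsetSum(int *nums, int count, int target)
{
  if (target == 0)
    return 1; /* the empty subset works */

  if (count == 0)
    return 0; /* numbers ran out, this branch failed */

  return
    subsetSum(nums + 1,count - 1,target - nums[0]) || /* take nums[0] */
    subsetSum(nums + 1,count - 1,target);             /* don't take it */
}

int main(void)
{
  int nums[] = {3, 7, 1, 8};

  printf("%d\n",subsetSum(nums,4,11)); /* 1, e.g. 3 + 8 */
  printf("%d\n",subsetSum(nums,4,6));  /* 0, no such subset */
  return 0;
}
```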

Determinism is also a philosophical theory which says our Universe is deterministic, i.e. that everything is already predetermined by the state of the Universe and the laws of physics, i.e. that we don't have "free will" (whatever that means) etc. Many believe quantum physics disproves determinism, which is however not the case; there may e.g. exist hidden variables that still make quantum physics deterministic. Anyway, this already goes beyond the scope of technology.

Determinism does NOT guarantee reversibility, i.e. if we know the state of a deterministic system, it may not always be possible to say from which state it evolved. Such reversibility is only possible if the rules of the system are such that no state can evolve from two or more different states. If this holds, then it is always possible to time-reverse the system and step it backwards to its initial state. This may be useful for things such as undo in programs. Also note that even if a system is reversible in principle, it may be computationally very time consuming, and sometimes practically impossible, to actually reverse it (imagine e.g. reversing a cryptographic hash -- mathematical reversibility of such a hash may be artificially ensured e.g. by pairing each hash with the lowest value that produces it).
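A trivial illustration with made up rules: the first rule below is reversible (no two states map to the same next state), the second is not:

```c
#include <stdio.h>

/* reversible rule: every state has exactly one predecessor, so we can
   always step back, which is exactly what an undo needs */
int stepForward(int state)  { return state + 3; }
int stepBackward(int state) { return state - 3; }

/* irreversible rule: e.g. both 5 and -5 evolve to 25, so from the
   state 25 alone we can't tell where we came from */
int stepLossy(int state) { return state * state; }

int main(void)
{
  int s = 7;

  s = stepForward(s);  /* 10 */
  s = stepBackward(s); /* back to 7 */
  printf("%d\n",s);

  printf("%d %d\n",stepLossy(5),stepLossy(-5)); /* both print 25 */
  return 0;
}
```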