
# Recursion
*See [recursion](recursion.md).*
Recursion (from Latin recursio, "running back") in general is a situation in which a [definition](definition.md) refers to itself; for example the definition of a human's ancestor as "the human's parents and the ancestors of his parents" ([fractals](fractal.md) are also a very nice example of what a simple recursive definition can achieve). In [programming](programming.md) recursion denotes a **[function](function.md) that calls itself**; this is the meaning we'll assume in this article unless noted otherwise.
{ Perhaps an analogy to this kind of recursion may be an "Inception"-style multi level dreams: imagine having a dream in a dream in a dream ... and so on -- and then at one point you start waking up, always getting back to where you were in each of the dreams, and so on until you completely wake up. --drummyfish }
We divide recursion into **direct** and **indirect**. In direct recursion the function calls itself directly; in indirect recursion a function *A* calls a function *B* which ends up (possibly through calling some more functions) calling *A* again. Indirect recursion is tricky because it may appear by mistake and cause a [bug](bug.md) (which is nevertheless easily noticed, as the program will usually run out of memory and crash).
When a function calls itself, it starts "diving" deeper and deeper and in most situations we want this to stop at some point, so in most cases **a recursion has to contain a terminating condition**. Without this condition the recursion will keep recurring and end up in an equivalent of an infinite loop (which in case of recursion will however crash the program with a [stack overflow](stack_overflow.md) exception). Let's see this on perhaps the most typical example of using recursion, a [factorial](factorial.md) function:
```
unsigned int factorial(unsigned int x)
{
  if (x > 1)
    return x * factorial(x - 1); // recursive call
  else
    return 1; // terminating condition
}
```
See that as long as x > 1, recursive calls are being made; with each call x is decremented, so that inevitably x will at some point come to equal 1. Then the *else* branch of the condition is taken -- the terminating condition has been met -- and in this branch no further recursive call is made, i.e. the recursion stops here and the code starts to emerge back from the recursion. The following diagram shows graphically the computation of `factorial(4)`:
```
factorial(4) = 4 * 6 = 24
| ^
| | return
| '------------.
| call |
'-----> factorial(3) = 3 * 2 = 6
| ^
| | return
| '------------.
| call |
'-----> factorial(2) = 2 * 1 = 2
| ^
| | return
| '-----.
| call |
'------> factorial(1) = 1 <-- end condition met
```
Note that even in computing we can sometimes make use of infinite recursion. For example in [Haskell](haskell.md) it is possible to define infinite [data structures](data_structure.md) with a recursive definition; however, this kind of recursion is intentionally allowed: it is treated as a mathematical definition and with correct use it won't crash the program.
**Every recursion can be replaced by iteration and vice versa** (iteration meaning a loop such as `while`) -- to get rid of recursion it is always possible to use a custom [stack](stack.md) to which we'll be saving our state and thanks to which we'll be able to make the recursive "dives". Purely [functional](functional.md) languages for example do not have loops at all and handle repetition solely by recursion. This means that you, a programmer, always have a choice between recursion and iteration, and here you should know that **recursion is typically slower than iteration**. This is because recursion has a lot of overhead: remember that every level of recursion is a function call that involves things such as pushing and popping values on stack, handling return addresses etc. The usual advice is therefore to **prefer iteration**, even though recursion can sometimes be more elegant/simple and if you don't mind the overhead, it's not necessarily wrong to go for it (see also [optimization](optimization.md) and the evil of trying to optimize your program in wrong places). Getting rid of a simple recursion (single direct recursive call) is usually easy, it only gets more complicated e.g. in case of multiple recursion when the function makes more than one call to itself (for example in case of [quicksort](quicksort.md)), and here we usually stick with recursive implementation for its superior elegance. In the above example of factorial we only have one recurring branch, so it's much better to implement the function with iteration:
```
unsigned int factorial(unsigned int x)
{
  unsigned int result = 1;

  while (x > 1)
  {
    result *= x;
    x--;
  }

  return result;
}
```
Here is another example of elegant recursion: printing a string backwards:
```
#include <stdio.h>

void readAndPrintBackwards(void)
{
  int c = getchar();

  if (c != EOF)
  {
    readAndPrintBackwards();
    putchar(c);
  }
}

int main(void)
{
  readAndPrintBackwards();
  putchar('\n');
  return 0;
}
```
The program reads one character in each call of the function and holds off printing it until we're emerging back from the recursive dive. The terminating condition here is the check for end of input (`EOF`). You can test this program by compiling it and passing it a string, e.g. `echo "catdog" | ./program`, which should result in printing `godtac`. In [comun](comun.md) the same program would be written as:
```
readAndPrintBackwards:
<-
<? ?
readAndPrintBackwards
.
->
.
readAndPrintBackwards
```
Some problems, for example [Ackermann function](ackermann_function.md), [quick sort](quick_sort.md), [tree](tree.md) traversals or the mentioned factorial are said to be "recursive" because they are just most elegantly defined and/or solved with recursion, but as we've seen, there is no problem that would inherently require recursive function calls. There may exist Turing complete languages without recursion that can still solve all problems that any other language can.
How do the computers practically make recursion happen? Basically they use a [stack](stack.md) to remember states (such as values of local variables and return addresses) on each level of the recursive dive -- each such state is captured by so called **stack frame**. In programming languages that support recursive function calls this is hidden behind the scenes in the form of so called **[call stack](call_stack.md)**. This is why an infinite recursion causes stack overflow.
Another important type of recursion is **tail recursion** which happens when the recursive call in a function is the very last command. It is utilized in functional languages that use recursion instead of loops. This kind of recursion can be optimized by the compiler into basically the same code a loop would produce, so that e.g. stack won't grow tremendously.
Mathematical recursive functions find use in [computability](computability.md) theory where they help us (similarly to e.g. [Turing machines](turing_machine.md)) define [classes](class.md) of functions (such as primitive recursive and partial recursive) by how "computationally strong" of a computer we need to compute them.