
Procedural Generation

Procedural generation (procgen) refers to the creation of data, such as art in games or test data for software, by using algorithms and mathematical formulas rather than creating this data manually or measuring it in the real world. This can be used for example for automatic generation of textures, texts, music, game levels or 3D models, but also for practically anything else, e.g. test databases or even computer programs. Procedural art still cannot reach the quality and creativity of a skilled human artist, but it can be a good filler, substitute, an addition to or a basis for manually created art. Procedural generation has many advantages such as saving space (instead of large data we only store the small code of the algorithm that generates it), saving time (once we have an algorithm we can generate a lot of data extremely quickly), increasing resolution practically to infinity or extending data to more dimensions (e.g. 3D textures). Procedural generation can also be used as a helper and guidance, e.g. an artist may use a procedurally generated game level as a starting point and fine tune it manually, or vice versa, a procedural algorithm may create a level from manually created building blocks.

As neural AI approaches human levels of creativity, we may see it actually replacing many artists in the near future, however it is debatable whether AI generated content should be called procedural generation, as AI models are quite different from traditional hand-made algorithms. From now on we'll only be considering the traditional approach.

Minecraft (or Minetest) is a popular example of a game in which the world is generated procedurally, which allows it to have near-infinite worlds. Roguelikes also heavily utilize procgen. However this is nothing new, the old game Daggerfall was known for its extremely vast procedurally generated world.

Thanks to its extreme saving of space, procedural generation is extremely popular in the demoscene, where programmers try to create as small programs as possible. German programmers made a full fledged 3D shooter called .kkrieger that fits into just 96 kB, thanks to heavy use of procedural generation for the whole game content. Bytebeat is a simple method of generating procedural "8bit" music by computing each audio sample as a short formula of the sample number; it is used e.g. in Anarch. Procedural generation is generally popular in indie game dev thanks to offering a way of generating huge amounts of content quickly, without having to pay artists.
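A minimal sketch of the bytebeat idea in C could look like this (the formula is only an arbitrary illustrative one, not the one used in Anarch or any other specific program); the raw bytes written to standard output can be played for example with: aplay -r 8000 -f U8

  #include <stdio.h>

  /* bytebeat: every audio sample is just a short integer formula of the
     sample counter t; the formula below is only an arbitrary example */
  int main(void)
  {
    for (unsigned int t = 0; t < 8000 * 30; ++t) /* about 30 s at 8 kHz */
      putchar(t * ((t >> 12 | t >> 8) & 63 & t >> 4));

    return 0;
  }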

We may see procgen as being similar to compression algorithms: we have large data and are looking for an algorithm that's much smaller while being able to reproduce the data (but here we normally go the other way around, we start with the algorithm and see what data it produces rather than searching for an algorithm that produces given data). John Carmack himself called procgen "basically a shitty compression".

Using fractals is a popular technique in procgen because they basically perfectly fit its definition: a fractal is defined by a simple equation or a set of a few rules that yield an infinitely complex shape. Nature is also full of fractals such as clouds, mountains or trees, so fractals look organic.
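As a tiny illustration, here is a sketch of one classic fractal procgen technique, midpoint displacement, which draws a random mountain-like ridge as ASCII art (all constants are arbitrary choices):

  #include <stdio.h>
  #include <stdlib.h>

  #define WIDTH 65  /* 2^6 + 1 points */
  #define HEIGHT 20

  /* Midpoint displacement: recursively set the midpoint of a segment to the
     average of its endpoints plus a random offset whose size halves with each
     level of recursion, giving a fractal mountain-like ridge. */
  void displace(int *h, int left, int right, int amplitude)
  {
    int mid = (left + right) / 2;

    if (mid == left) /* segment can't be subdivided further */
      return;

    h[mid] = (h[left] + h[right]) / 2 +
      (amplitude > 0 ? rand() % (2 * amplitude + 1) - amplitude : 0);

    displace(h, left, mid, amplitude / 2);
    displace(h, mid, right, amplitude / 2);
  }

  int main(void)
  {
    int heights[WIDTH] = {0};

    srand(123); /* change the seed to get a different mountain */
    heights[0] = HEIGHT / 2;
    heights[WIDTH - 1] = HEIGHT / 2;
    displace(heights, 0, WIDTH - 1, HEIGHT / 2);

    for (int y = HEIGHT - 1; y >= 0; --y) /* draw as ASCII art */
    {
      for (int x = 0; x < WIDTH; ++x)
        putchar(heights[x] > y ? '#' : ' ');

      putchar('\n');
    }

    return 0;
  }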

A good example to think of is generating procedural textures. This is generally done by first generating a base image or multiple images, e.g. with noise functions such as Perlin noise (it gives us a grayscale image that looks a bit like clouds). We then further process this base image (or images) and combine the results in various ways, for example we may use different transformations, modulations, shaders, blending, adding color using color ramps etc. The whole texture is therefore described by a graph in which nodes represent the operations we apply; this can literally be done visually in software like Blender (see its shader editor). The nice thing is that we can now for example generalize the texture to 3 dimensions, i.e. not only have a flat image, but a whole volume of a texture that can extremely easily be mapped to 3D objects simply by intersecting it with their surfaces, which will yield completely smooth texturing without any seams (this is quite often used along with raytracing). Or we can animate a 2D texture by taking a moving cross section of the 3D texture. We can also write the algorithm so that the generated texture has no seams if repeated side-by-side (by using modular "wrap-around" coordinates). We can also generate the texture at any arbitrary resolution as we have a continuous mathematical description of it; we may perform an infinite zoom into it if we want. As if that's not enough, we can also generate almost infinitely many slightly different versions of this texture by simply changing the seed of the pseudorandom generators.
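The following is a minimal sketch of the first step of such a pipeline: a seeded value noise (a simpler cousin of Perlin noise) sampled with wrap-around coordinates so that the result tiles seamlessly, written out as a grayscale PGM image; further processing and combining of such base images would then follow as described above. All names and constants here are arbitrary illustrative choices.

  #include <stdio.h>

  #define SIZE 256 /* output texture resolution */
  #define GRID 8   /* noise lattice cells per side (wraps around) */

  unsigned int seed = 1; /* change this to get a different texture */

  /* Tiny deterministic hash turning lattice coordinates plus the seed into a
     pseudorandom value in <0,255>; the constants are arbitrary. */
  unsigned int hash2(unsigned int x, unsigned int y)
  {
    unsigned int h = x * 374761393u + y * 668265263u + seed * 69069u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (h ^ (h >> 16)) & 0xff;
  }

  /* Value noise sampled with wrap-around ("modular") coordinates so that the
     resulting texture tiles seamlessly; x and y are in pixels. */
  unsigned int noise(unsigned int x, unsigned int y)
  {
    unsigned int cell = SIZE / GRID,
      gx = x / cell, gy = y / cell,  /* lattice cell we're in */
      fx = x % cell, fy = y % cell;  /* position inside the cell */

    unsigned int /* pseudorandom values at the four cell corners */
      a = hash2(gx, gy),
      b = hash2((gx + 1) % GRID, gy),
      c = hash2(gx, (gy + 1) % GRID),
      d = hash2((gx + 1) % GRID, (gy + 1) % GRID);

    /* bilinear interpolation of the corner values */
    unsigned int top = (a * (cell - fx) + b * fx) / cell,
      bottom = (c * (cell - fx) + d * fx) / cell;

    return (top * (cell - fy) + bottom * fy) / cell;
  }

  int main(void)
  {
    printf("P2\n%d %d\n255\n", SIZE, SIZE); /* ASCII PGM header */

    for (int y = 0; y < SIZE; ++y)
      for (int x = 0; x < SIZE; ++x)
        printf("%u\n", noise(x, y));

    return 0;
  }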

We use procedural generation mainly in two ways:

  • offline/explicit: We pre-generate the data before we run the program, i.e. we let the algorithm create our art, save it to a file and then use it as we would use traditionally created art.
  • realtime/implicit: We generate the data on the fly and only the parts of it that we currently need. For example with a procedural texture mapped onto a 3D model, we would compute the texture pixels (texels) only when we're actually drawing them: this has the advantage of giving an infinite resolution of the texture because no matter how close-up we view the model, we can always compute exactly the pixels we need. This would typically be implemented inside a fragment/pixel shader program. This is also used in voxel games that generate the world only in the area the player currently occupies. (A small sketch of both approaches is shown below.)
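To make the difference concrete, here is a trivial sketch in C contrasting the two approaches (the texel formula is just a made-up placeholder):

  #include <stdio.h>

  #define TEX_SIZE 8

  /* Hypothetical procedural rule: one texel as a pure function of its
     coordinates (a trivial XOR pattern, just a placeholder). */
  unsigned char texel(int x, int y)
  {
    return (x ^ y) * 16;
  }

  unsigned char textureCache[TEX_SIZE * TEX_SIZE]; /* for the offline way */

  int main(void)
  {
    /* offline/explicit: pre-generate the whole texture up front, then use
       the stored data just like a hand-made asset */
    for (int y = 0; y < TEX_SIZE; ++y)
      for (int x = 0; x < TEX_SIZE; ++x)
        textureCache[y * TEX_SIZE + x] = texel(x, y);

    printf("stored texel at [3][5]:   %d\n", textureCache[5 * TEX_SIZE + 3]);

    /* realtime/implicit: compute only the texel we need, at the moment we
       need it (e.g. inside a fragment shader), at any resolution we like */
    printf("computed texel at [3][5]: %d\n", texel(3, 5));

    return 0;
  }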

Indeed we may also do something "in between", e.g. generate procedural assets into temporary files or RAM caches at run time, depending on the situation, for example when purely realtime generation of such assets would be too slow.

Example

TODO