commit 7748e4383d
parent cf33ea2de9
Author: Miloslav Ciz
Date:   2023-09-11 21:01:26 +02:00

9 changed files with 12 additions and 6 deletions


@@ -10,7 +10,7 @@ There are many methods and [algorithms](algorithm.md) for doing so differing in
As most existing 3D "frameworks" are harmful, a [LRS](lrs.md) programmer is likely to write his own 3D rendering system that suits his program best; therefore we should list some common methods of achieving 3D. Besides that, it's just pretty interesting to see what's in store.
-**Rendering spectrum**: The book *Real-Time Rendering* mentions that methods for 3D rendering can be seen as lying on a spectrum, one extreme of which is *appearance reproduction* and the other *physics simulation*. Methods closer to trying to imitate the appearance try to simply create the same look of an object on the monitor that the actual 3D object would have -- these may e.g. use image data such as photographs; these methods may rely on lightfields, [textures](texture.md) etc. The physics simulation methods try to replicate the behavior of light in real life and so come to the same results: these methods rely on creating 3D geometry (e.g. that made of triangles or voxels), computing light reflections and [global illumination](global_illumination.md). Most methods lie somewhere in between these two extremes: for example [billboards](billboard.md) and [particle systems](particle_system.md) may use a texture to represent an object while at the same time using 3D quads (very simple 3D models) to correctly deform the textures by perspective and solve their visibility.
+**Rendering spectrum**: The book *Real-Time Rendering* mentions that methods for 3D rendering can be seen as lying on a spectrum, one extreme of which is *appearance reproduction* and the other *physics simulation*. Methods closer to the appearance end simply focus on reproducing, on the monitor, the look the actual 3D object would have in real life, without being concerned with *how* that look arises in real life -- these may e.g. use image data such as photographs; such methods may rely on lightfields, photo [textures](texture.md) etc. The physics simulation methods try to replicate the behavior of light in real life -- their main goal is to solve the **[rendering equation](rendering_equation.md)**, usually only [approximately](approximation.md) -- and so, by internally imitating the same processes, arrive at visual results similar to those that arise in the real world: these methods rely on creating 3D geometry (e.g. one made of triangles or voxels), computing light reflections and [global illumination](global_illumination.md). Most methods lie somewhere in between these two extremes: for example [billboards](billboard.md) and [particle systems](particle_system.md) may use a texture to represent an object while at the same time using 3D quads (very simple 3D models) to correctly deform the textures by perspective and solve their visibility. The classic polygonal 3D models are also usually somewhere in between: the 3D geometry and [shading](shading.md) are trying to simulate the physics, but e.g. a photo texture mapped on such a 3D model is the opposite, appearance-based approach ([PBR](pbr.md) further tries to shift the use of textures more towards the *physics simulation* end).
A table of some common 3D rendering methods follows, including the most simple, most advanced and some unconventional ones. Note that here we talk about methods and techniques rather than algorithms, i.e. about general approaches that are often modified and combined into a specific rendering algorithm. For example, traditional triangle rasterization is sometimes combined with raytracing to add e.g. realistic reflections. The methods may also be further enriched with features such as [texturing](texture.md), [antialiasing](antialiasing.md) and so on. The table below should help you choose the base 3D rendering method for your specific program.
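For reference, the **rendering equation** that the new version of the rendering spectrum paragraph mentions is, in its usual form for opaque surfaces:

```
L_o(x, \omega_o) = L_e(x, \omega_o) +
  \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i
```

Here *L_o* is the outgoing radiance at point *x* in direction *omega_o*, *L_e* the emitted radiance, *f_r* the BRDF of the surface, *L_i* the incoming radiance over the hemisphere *Omega* and *n* the surface normal; physics simulation methods approximate this integral while appearance-based methods sidestep it.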
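As a concrete taste of the geometric core that the "traditional" triangle rasterization mentioned above is built around, here is a minimal, self-contained sketch in C of the perspective projection step; the names and constants (`SCREEN_W`, `FOCAL_LEN` etc.) are made up for the example and not taken from any renderer discussed in the article.

```c
/* Minimal illustrative sketch: perspective projection of a camera-space
   point to pixel coordinates, the basic step of triangle rasterization.
   Constants and names are made up for this example only. */
#include <stdio.h>

#define SCREEN_W 640
#define SCREEN_H 480
#define FOCAL_LEN 1.0 /* distance of the projection plane, sets the FOV */

/* Projects point (x,y,z) in camera space (z > 0 pointing into the screen)
   to integer screen coordinates. Aspect ratio is ignored for simplicity. */
void project(double x, double y, double z, int *sx, int *sy)
{
  double px = (x * FOCAL_LEN) / z; /* the perspective divide */
  double py = (y * FOCAL_LEN) / z;

  /* map from [-1,1] to pixel coordinates, y flipped to go top-down */
  *sx = (int) ((px + 1.0) * 0.5 * SCREEN_W);
  *sy = (int) ((1.0 - (py + 1.0) * 0.5) * SCREEN_H);
}

int main(void)
{
  int sx, sy;

  project(0.5, 0.25, 2.0, &sx, &sy); /* a point half a unit right, 2 units away */
  printf("projected to pixel %d %d\n", sx, sy);

  return 0;
}
```

A full rasterizer projects all triangle vertices this way and then fills the resulting 2D triangles, solving visibility e.g. with a z-buffer; raytracing-style methods instead shoot rays from the camera into the scene and so need no projection step at all.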