# Aliasing
Aliasing is a certain typically undesirable phenomenon that distorts signals (such as sounds or images) when they are [sampled](sampling.md) [discretely](discrete.md) (captured at single points, usually at periodic intervals) -- this can happen e.g. when capturing sound with digital recorders or when [rendering](rendering.md) computer graphics. There exist [antialiasing](antialiasing.md) methods for suppressing or even eliminating aliasing. Aliasing can often be seen on small checkerboard patterns as a moiré pattern (spatial aliasing), or maybe more famously on rotating wheels or helicopter rotor blades that in a video look like they're standing still or rotating the other way (temporal aliasing, caused by capturing images at intervals given by the camera's [FPS](fps.md)).
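
To get a feel for the temporal case (just a made-up example, the numbers aren't from anywhere): a wheel spinning at 23 revolutions per second filmed by a 24 FPS camera turns by 23/24 of a revolution between two frames, which looks like a small step backwards, so on video the wheel appears to slowly rotate in reverse. A tiny C sketch of this arithmetic:

```
#include <math.h>
#include <stdio.h>

int main(void)
{
  double wheelHz = 23;   /* true rotation speed: 23 revolutions per second (made up) */
  double cameraFPS = 24; /* the camera samples the scene 24 times per second */

  double turnsPerFrame = wheelHz / cameraFPS; /* 23/24 of a turn per frame */

  /* whole turns between two frames are invisible, we only perceive the
     remainder, interpreted as the smallest possible movement */
  double apparentTurnsPerFrame = turnsPerFrame - round(turnsPerFrame);

  printf("looks like %f revolutions per second\n",
    apparentTurnsPerFrame * cameraFPS); /* prints -1.000000 */

  return 0;
}
```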
A simple example showing how sampling at discrete points can quite dramatically alter the recorded result:
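
For instance (a small C sketch with made-up numbers): a 9 Hz sine wave sampled only 8 times per second yields the same values (up to rounding) as a 1 Hz sine wave would, so the recording ends up representing a completely different signal:

```
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979

int main(void)
{
  double sampleRate = 8; /* samples per second -- way too low for a 9 Hz wave */

  for (int n = 0; n < 8; ++n)
  {
    double t = n / sampleRate; /* time at which we take the n-th sample */

    double fast = sin(2 * PI * 9 * t); /* the real signal: a 9 Hz sine */
    double slow = sin(2 * PI * 1 * t); /* a completely different 1 Hz sine */

    /* the two columns come out (practically) the same: once sampled, the
       9 Hz wave is indistinguishable from (is an alias of) the 1 Hz wave */
    printf("%f %f\n", fast, slow);
  }

  return 0;
}
```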
The same thing may happen in [ray tracing](ray_tracing.md) if we shoot a single sampling ray per pixel.
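
The principle can be seen even without a whole ray tracer (a hypothetical C sketch; the "scene" is just a procedural checkerboard with made-up dimensions): taking one sample per pixel of a checkerboard slightly finer than the pixel grid produces big bogus patterns (moiré) that aren't in the scene at all:

```
#include <stdio.h>

/* the "scene": a black/white checkerboard whose squares are 0.9 pixels
   wide, i.e. a bit finer than our pixel grid can capture */
int checker(double x, double y)
{
  return ((int) (x / 0.9) + (int) (y / 0.9)) % 2;
}

int main(void)
{
  for (int y = 0; y < 16; ++y)
  {
    for (int x = 0; x < 48; ++x)
      putchar(checker(x + 0.5, y + 0.5) ? '#' : '.'); /* 1 sample per pixel */

    putchar('\n');
  }

  return 0;
}
```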
**Why doesn't aliasing happen in our eyes and ears?** Because our senses don't sample the world discretely, i.e. in single points -- our senses [integrate](integration.md). E.g. a rod or a cone in our eyes doesn't just see exactly one point in the world but rather light averaged over a small area (which is ideally right next to another small area seen by another cell, so there is no information to "hide" in between them), and it also doesn't sample the world at specific moments like cameras do; its excitation by light falls off gradually, which averages the light over time and prevents temporal aliasing (instead of aliasing we get [motion blur](motion_blur.md)).
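
This can be mimicked in code (a made-up C sketch, not a model of any real eye): point sampling a pattern much finer than a pixel gives misleading values, while averaging (integrating) many samples over each pixel gives the sensible roughly 50% gray that an eye would perceive:

```
#include <stdio.h>

/* a pattern with stripes only 0.09 pixels wide -- way finer than a pixel */
double scene(double x)
{
  return (int) (x / 0.09) % 2; /* 0 = black, 1 = white */
}

int main(void)
{
  for (int pixel = 0; pixel < 8; ++pixel)
  {
    /* camera-like sampling: one sample, taken at the pixel center */
    double sampled = scene(pixel + 0.5);

    /* eye-like integration: average many samples spread over the whole pixel */
    double averaged = 0;

    for (int i = 0; i < 1000; ++i)
      averaged += scene(pixel + (i + 0.5) / 1000.0);

    averaged /= 1000;

    /* the single samples jump between 0 and 1 in a pattern that has little
       to do with the real image, the averages all come out close to 0.5 */
    printf("pixel %d: sampled %.2f, averaged %.2f\n", pixel, sampled, averaged);
  }

  return 0;
}
```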
So all in all, **how to prevent aliasing?** As said above, we always try to satisfy the sampling theorem, i.e. make our sampling frequency at least twice as high as the highest frequency in the signal we're sampling, or at least get close to this situation and lower the probability of aliasing. This can be done either by increasing the sampling frequency (which can be done in a smart way, some methods try to detect where sampling should be denser), or by preprocessing the input signal with a low pass filter or otherwise ensuring there won't be too high frequencies in it (e.g. by using lower resolution textures).
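
A trivial C sketch of the filtering approach (made-up data; a crude box filter stands in for a proper low pass filter): downscaling a signal by simply dropping samples aliases, downscaling by first averaging groups of samples doesn't:

```
#include <stdio.h>

#define N 16

int main(void)
{
  double highRes[N];

  /* a high resolution 1D "image": the finest detail possible, 0 1 0 1 ... */
  for (int i = 0; i < N; ++i)
    highRes[i] = i % 2;

  /* naive downscale to 1/4 the resolution: just keep every 4th sample;
     the high frequency aliases into a completely false all-black result */
  for (int i = 0; i < N; i += 4)
    printf("%.2f ", highRes[i]);

  putchar('\n');

  /* downscale with a crude low pass (box) filter: average each group of 4
     samples first; the too-high frequency is removed and we get a sensible
     uniform 0.5 gray */
  for (int i = 0; i < N; i += 4)
    printf("%.2f ", (highRes[i] + highRes[i + 1] +
      highRes[i + 2] + highRes[i + 3]) / 4);

  putchar('\n');

  return 0;
}
```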