Bilinear interpolation (also bilinear filtering) is a simple way of creating a smooth transition ([interpolation](interpolation.md)) between [discrete](discrete.md) samples (values) in 2D; it is a [generalization](generalization.md) of [linear interpolation](lerp.md) to 2 dimensions. It is used in many places, popularly e.g. in 3D [computer graphics](graphics.md) for **[texture](texture.md) filtering**; bilinear interpolation allows us to upscale textures to higher resolutions (i.e. compute new pixels between existing pixels) while keeping their look smooth and "non-blocky" (even though blurry). On the scale of quality vs simplicity it is kind of a middle way between the simpler [nearest neighbour](nearest_neighbour.md) interpolation (which creates the "blocky" look) and the more complex [bicubic interpolation](bicubic.md) (which uses yet smoother curves but also requires more samples). Bilinear interpolation can further be generalized to [trilinear interpolation](trilinear.md) (in computer graphics trilinear interpolation is used to additionally interpolate between different levels of a texture's [mipmap](mipmap.md)) and perhaps even to bilinear [extrapolation](extrapolation.md). Many frameworks/libraries/engines have bilinear filtering built in (e.g. `GL_LINEAR` in [OpenGL](ogl.md)). Of course this method may be used to smooth not just textures but anything, for example terrain [heightmaps](heightmap.md) or just any discrete mathematical function that we simply want to have defined everywhere -- it's not just a graphics thing -- but here we will focus on its application in [graphics](graphics.md).
The principle is simple: first linearly interpolate in one direction (e.g. horizontal), then in the other (vertical). Mathematically the order in which we take the dimensions doesn't matter (but it may matter practically due to rounding errors etc.).
Example: let's say we want to compute the value *x* between the four following given corner values:
```
1 . . . . . . 5
. . . . . . . .
. . . . . . . .
. . . . . . . .
. . . . . . . .
. . . . x . . .
. . . . . . . .
8 . . . . . . 3
```
Let's say we first interpolate horizontally: we'll compute one value, *a*, on the top (between 1 and 5) and one value, *b*, at the bottom (between 8 and 3). When computing *a* we interpolate between 1 and 5 by the horizontal position of *x* (4/7), so we get *a = 1 + 4/7 * (5 - 1) = 23/7*. Similarly *b = 8 + 4/7 * (3 - 8) = 36/7*. Now we interpolate between *a* and *b* vertically (by the vertical position of *x*, 5/7) to get the final value *x = 23/7 + 5/7 * (36/7 - 23/7) = 226/49 ~= 4.6*. If we first interpolate vertically and then horizontally, we'd get the same result (the value between 1 and 8 would be 6, the value between 5 and 3 would be 25/7 and the final value 226/49 again).
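Just to double check the arithmetic, here is a tiny [float](float.md) sketch (only an illustration, the function and variable names are made up here) that evaluates the same corners at the same position:

```
#include <stdio.h>

// plain bilinear interpolation with floats, x and y are in range <0,1>
float bilinear(float topLeft, float topRight, float bottomLeft,
  float bottomRight, float x, float y)
{
  float top = topLeft + x * (topRight - topLeft),             // horizontal pass (top edge)
        bottom = bottomLeft + x * (bottomRight - bottomLeft); // horizontal pass (bottom edge)

  return top + y * (bottom - top);                            // vertical pass
}

int main(void)
{
  // the example above: corners 1, 5, 8, 3; position 4/7, 5/7
  printf("%f\n",bilinear(1,5,8,3,4.0 / 7.0,5.0 / 7.0)); // prints ~4.612245, i.e. 226/49

  return 0;
}
```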
Here is [C](c.md) code to compute all the inbetween values in the above, using [fixed point](fixed_point.md) (no [float](float.md)):
```
#include <stdio.h>

#define GRID_RESOLUTION 8

int interpolateLinear(int a, int b, int t)
{
  return a + (t * (b - a)) / (GRID_RESOLUTION - 1);
}

int interpolateBilinear(int topLeft, int topRight, int bottomLeft, int bottomRight,
  int x, int y)
{
#define FPP 16 // we'll use fixed point to prevent rounding errors

  int top = interpolateLinear(topLeft * FPP,topRight * FPP,x),        // horizontal pass
      bottom = interpolateLinear(bottomLeft * FPP,bottomRight * FPP,x);

  return interpolateLinear(top,bottom,y) / FPP;                       // vertical pass
}

int main(void) // print the whole interpolated grid
{
  for (int y = 0; y < GRID_RESOLUTION; ++y, putchar('\n'))
    for (int x = 0; x < GRID_RESOLUTION; ++x)
      printf("%d ",interpolateBilinear(1,5,8,3,x,y));

  return 0;
}
```
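Note that since the function returns an int and integer division rounds towards zero, the printed values get truncated -- e.g. at the position of *x* from the example the program prints 4 rather than the exact ~4.6; the multiplication by FPP only keeps the intermediate horizontal results from losing precision before the vertical pass.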
**Cool [hack](hacking.md) to improve bilinear interpolation** (from https://iquilezles.org/articles/texture): bilinear interpolation doesn't look as good as bicubic, but bicubic is a lot more demanding on hardware and bandwidth as it requires fetching more texels -- there is one trick [shader](shader.md) programmers use to improve the look of bilinear filtering while not requiring any extra texel fetches. They apply the `smoothstep` function to the interpolation parameter, which eliminates the instant "jumps" at edges between texels; it replaces straight lines with a smoother curve and so makes the [derivative](derivative.md) of the result continuous -- basically it looks a lot better. Still not as good as bicubic but close enough.
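For illustration, a rough sketch of the trick in plain C rather than shader code (the function names here are made up for the example): the interpolation parameters simply get remapped by the smoothstep polynomial `t * t * (3 - 2 * t)` before the usual blend.

```
#include <stdio.h>

// smoothstep: remaps a parameter in range <0,1> so that its derivative is 0
// at both ends, removing the sudden change of slope at texel boundaries
float smoothParam(float t)
{
  return t * t * (3.0f - 2.0f * t);
}

// bilinear blend of the 4 texel values with the smoothstep-improved parameters
float bilinearSmooth(float topLeft, float topRight, float bottomLeft,
  float bottomRight, float x, float y)
{
  x = smoothParam(x);
  y = smoothParam(y);

  float top = topLeft + x * (topRight - topLeft),
        bottom = bottomLeft + x * (bottomRight - bottomLeft);

  return top + y * (bottom - top);
}

int main(void)
{
  // sample the corners from the example above along the diagonal:
  for (int i = 0; i <= 8; ++i)
    printf("%f\n",bilinearSmooth(1,5,8,3,i / 8.0f,i / 8.0f));

  return 0;
}
```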
For [suckless](suckless.md) programs that do their own [software rendering](software_rendering.md) an issue of bilinear interpolation, compared to nearest neighbour, might be that it **creates new colors** by averaging the colors of the filtered image, i.e. an image filtered this way may end up containing colors that weren't present in the original, which may become a problem e.g. if we are using [palettes](palette.md) (indexed mode) with a limited number of colors and possible operations on them. This may also complicate e.g. the use of precomputed scaling tables (used in old games like [wolf 3D](wolf3d.md)) that simply store the mapping of original image pixels to pixels in an upscaled image. A possible attempt at a "fix" -- or rather more of a poor man's bilinear interpolation -- may be in [dithering](dithering.md) the colors rather than averaging them: perhaps once we sample in between pixels we assign probabilities to the 4 nearest pixels, based on their distance to the sample position, and then take one of the four pixels at random with those probabilities using some [pseudorandom](pseudorandom.md) generator.
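A possible sketch of this idea (just an illustration -- the names, as well as the choice of the probabilities, here simply the usual bilinear weights, are made up for the example):

```
#include <stdio.h>
#include <stdlib.h>

// randomly picks one of the 4 neighbouring pixel values; the probabilities
// are the bilinear weights, x and y are fractional positions in range 0..255,
// rand() is used just for illustration, any pseudorandom generator will do
int ditherSample(int topLeft, int topRight, int bottomLeft, int bottomRight,
  int x, int y)
{
  int pickRight = (rand() % 256) < x,
      pickBottom = (rand() % 256) < y;

  return pickBottom ?
    (pickRight ? bottomRight : bottomLeft) :
    (pickRight ? topRight : topLeft);
}

int main(void)
{
  // averaging many samples at the position from the example above should give
  // a value close to the exact ~4.6, yet every single sample is 1, 5, 8 or 3:
  long sum = 0;

  for (int i = 0; i < 100000; ++i)
    sum += ditherSample(1,5,8,3,(4 * 256) / 7,(5 * 256) / 7);

  printf("%f\n",sum / 100000.0);

  return 0;
}
```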