
Band-Limiting Procedural Textures



Runtime procedural textures (not baked into bitmaps) have a tiny memory footprint and can also be customized at runtime, which are huge advantages. However, controlling aliasing during rendering can be difficult, unless there's some intermediate procedural-to-bitmap cache that enables prefiltering (as in Renderman's RtxPlugin, for example). In the absence of such a tool, one can always selectively supersample the texture as a numerical approximation to filtering, which reduces the aliasing to some degree. However, some procedural patterns can also be analytically filtered thanks to the simple nature of their construction. That means that some specific procedural functions can be rendered without aliasing, at least when used in isolation. Now, the integral of a composition of functions is generally not the composition of the integrals, meaning that a filtered complex shading network is not the same as an unfiltered network built from the filtered elements it's made of. However, in practice we can analytically filter these patterns individually and still get fairly band-limited results out of the network (possibly slightly over-blurred).

Basically, while band-limiting does not replace proper filtering, it is a valid alternative approach to the problem of aliasing. Anybody doing procedural content has done band-limiting at some point, either in the form of geometrical or shading/texturing static LODs, or through direct manipulation of the construction of the procedural detail. The technique is as old as computer graphics, and in this article we'll take a quick trip towards rediscovering it so we can use it more often in our procedural patterns.

Left: raw fBM. Right: band-limited fBM. Source+demo:

From filtering to band-limiting cos(x)

cos(x) (or sin) is one of the most popular functions for generating procedural content since it's periodic, smooth and really fast to evaluate (on modern hardware). A cosine wave also has the advantage that it can be analytically integrated, so we can use the same filtering technique we used for square waves during pattern sampling/evaluation. Basically, given a cosine, we first need to know how much domain it actually covers within a given pixel's footprint. Let's call that amount w, and use it to filter/convolve/integrate the cosine with a filter width (or kernel size) equal to w in order to compute the average value of the cosine inside that pixel:

    (1/w) * integral of cos(t) dt over [x-w/2, x+w/2]  =  cos(x) * sin(w/2)/(w/2)

As we can see, it turns out that filtering a cosine wave with a box filter is the same as multiplying it with a sinc() function, which comes as no surprise to anybody familiar with digital signal processing. A triangular kernel also produces an analytical solution, but for the purposes of today's exercise we'll stick with the box filter.

Now, let's apply this directly to a procedural pattern, for example a cosine based color texture that we deformed through some domain distortion, like this one:

Naive vs filtered cosines. Source+Demo:

On the left side of the image you see a naive implementation of the procedural texture (based on a series of cosine based color layers), where cos() is called directly, as you'd normally do. On the right is the filtered version where each cosine function has been replaced with the filtered-cosine function we just derived through our fancy maths. Let me show you the code for the naive texture implementation:

vec3 getColor( in float t )
{
    vec3 col = a0;
    col += a1*cos(k2PI*t*w1 + k1);
    col += a2*cos(k2PI*t*w2 + k2);
    col += a3*cos(k2PI*t*w3 + k3);
    col += a4*cos(k2PI*t*w4 + k4);
    col += a5*cos(k2PI*t*w5 + k5);
    col += a6*cos(k2PI*t*w6 + k6);
    col += a7*cos(k2PI*t*w7 + k7);
    col += a8*cos(k2PI*t*w8 + k8);
    return col;
}

a0 = vec3(0.4,0.4,0.4);
a1 = vec3(0.0,0.8,1.1), w1 = 1.1;
a2 = vec3(0.3,0.4,0.1), w2 = 3.1;
a3 = vec3(0.1,0.7,1.1), w3 = 5.1;
a4 = vec3(0.2,0.8,1.4), w4 = 9.1;
a5 = vec3(0.2,0.6,0.7), w5 = 17.1;
a6 = vec3(0.1,0.6,0.7), w6 = 31.1;
a7 = vec3(0.0,0.5,0.8), w7 = 65.1;
a8 = vec3(0.1,0.4,0.7), w8 = 115.1;

Even though I listed the coefficients of the actual color palette above, the details of the palette itself are irrelevant to our filtering discussion. The only thing we need to know is that the left side of the image aliases because some of the frequencies of the color cosine waves are too high for the pixel sampling rate we're using. That is, some of the cosine waves oscillate too many times per pixel, and at a single sample per pixel (AA 1x) there is no way to capture that information correctly. However, only certain parts of the image alias; others are perfectly fine because there even the fastest cosine waves oscillate less than once per pixel. That happens in the areas of the image where the texture is stretched the most and the domain of the cos() waves has been sufficiently "slowed down".

So, if supersampling the texture is not an option, we'd have to consider removing some of the higher frequency cosines (say w7 and w8) in order to fix the aliasing. And indeed that would fix the aliasing, but we'd also remove visual detail from the areas of the image that looked fine to begin with, which is not good. So, this is where the analytic filtering we just developed comes in handy, since it removes detail selectively, eliminating high frequency content only where it hurts the image quality. As a result it produces a nice (smooth yet crisp) render, as you can see on the right side of the image. The nicest thing of all is that the code is trivial, just a direct translation of the maths we derived, and it is a drop-in replacement for the regular cos():

float fcos( in float x )
{
    float w = fwidth(x);
    return cos(x)*sin(0.5*w)/(0.5*w);
}

In terms of implementation, you might want to make a second version of this function where the filter width is passed manually to fcos(), which can come in handy in some scenarios (more on that later). Also on the implementation front, if you chose to optimize it, you might end up with something like the following:

float fcos( in float x )
{
    float w = fwidth(x);
    return cos(x)*smoothstep( k2PI, 0.0, w );
}

which is an okay approximation. For notational clarity I'm assuming that smoothstep(a,b,x) = 1-smoothstep(b,a,x), which is true in most implementations of smoothstep(). In the graph above you can see the actual sinc() function in orange and the smoothstep approximation in red. Visually, in real rendering applications, the difference is negligible (I'd even argue the smoothstep version looks better).

Now, besides some potential speed improvement, the smoothstep version will probably start giving you ideas on how to extend this technique further. After all, what the smoothstep is effectively doing is smoothly deactivating the cosine wave (zeroing it out) when the length of a 2π cycle is smaller than a pixel. Basically, we replaced filtering with band-limiting (cancelling frequencies too high to be sampled without aliasing). So, in this light, it seems natural to apply the same deactivation treatment to other procedural oscillating functions that are not necessarily easy to integrate/filter, say, noise functions.

Band limiting noise

Following this line of reasoning, the idea of approximating filtered noise by blending it towards zero based on the wavelength-to-filter-width ratio is inevitable, almost forced on us. You can actually find it documented in computer graphics manuals from a few decades ago. But despite being old and simple, it produces pretty acceptable results. Let's take for example a traditional fBM construction where we add up noise waves, and improve it by using a filtered noise primitive instead:

float fbm( in float x )
{
    float f = 1.0;
    float a = 0.5;
    float t = 0.0;
    for( int i=0; i<9; i++ )   // 9 octaves here; pick to taste
    {
        t += a*noise(f*x);
        f *= 2.0;
        a *= 0.5;
    }
    return t;
}

float ffbm( in float x )
{
    float w = fwidth(x);
    float f = 1.0;
    float a = 0.5;
    float t = 0.0;
    for( int i=0; i<9; i++ )   // 9 octaves here; pick to taste
    {
        t += a*fnoise(f*x, f*w);
        f *= 2.0;
        a *= 0.5;
    }
    return t;
}

float fnoise( in float x, in float w )
{
    return noise(x)*smoothstep(1.0,0.5,w);
}

You can see the visual results at the top of this article, in the first animated picture: the left shows a naive fBM implementation, and the middle and right show the band-limited version. In the code above, we basically just replaced the noise() function with a self-limiting one. But there are a few things worth mentioning:

First, the code above can be trivially generalized to 2D and 3D, at least for isotropic band-limiting, by taking the length or the maximum component of fwidth(). For anisotropic filtering, fnoise() would have to be modified into something smarter.

Secondly, the choice of 0.5 and 1.0 in the smoothstep is a bit arbitrary; all that matters is that we attenuate things that oscillate more than once per pixel (in theory, more than once per half-pixel, as per Nyquist).

Thirdly, the propagation of the filter width w needs to be in sync with the doubling of the frequency, naturally. If the domain of noise() was deformed in other (possibly non-linear) ways, we'd have to make sure the filter width was transformed accordingly as well (we'd need to apply the partial derivatives of the deformation). Related:

Lastly, I could have decided not to pass w to fnoise() and instead compute it inside the noise implementation, like we did with fcos(). However, in this case I wanted to show how to compute the filter width only once, in the caller, and then propagate it down the cascade of noise octaves, which will probably be faster.

Now, as a post-implementation remark, one could argue that all we're doing is removing detail from our procedural content based on distance. And that would be right in a way, but it mischaracterizes the deeper relationship between pixel footprint and the signal we're sampling. Still, it is a valid simplified interpretation, although only in some scenarios. For example, we do often use that simple heuristic when building big procedural landscapes. In such applications, the lowest hanging fruit in the pursuit of performance and antialiasing is the deactivation of octaves of procedural noise based on distance. For an fBM terrain in particular that happens in a logarithmic fashion, since detail shrinks linearly with distance (due to perspective). So oftentimes one attenuates and discards noise calls that way.

However, this approach becomes less effective when there is domain warping or other deformations dilating or compressing space in the mix, which are often employed to enrich procedural terrains with interesting geographical features. The reason the distance heuristic is less effective then is that the relation between distance and filter width is no longer inverse-linear: different amounts of texture/noise domain will fit in a single pixel depending on how much the domain was stretched or dilated. So, in those cases we need to compute the actual filter width to prevent over- and under-filtering. Now, you can compute that filter width analytically by propagating domain expansion/contraction through the chain rule of the Jacobians of each of the deformations. Or you can simply rely on the filter width computed automatically for you by your hardware or software rendering system (say fwidth() in GLSL or filterregion in Renderman), which is easy and works really well in most scenarios.


While band-limiting is not a perfect substitute for proper filtering (whether analytical, supersampled or mipmap based), it's definitely better than not doing anything about aliasing, and in some cases it really is a good solution for noise based functions. For cosine functions one can even do it exactly, as we saw. Below is a live example of just that; click and scroll the mouse to see the difference between filtering and not filtering, and click on the title to see the source code.
