countTrailingZeros
Returns the number of consecutive 0 bits starting from the least significant bit (LSB) of a 32-bit integer, i.e. for a nonzero input, the index of the least significant 1-bit. It works on uint/int and their vector variants, and is useful for bitmask inspection, flag priority, and other low-level bit operations.
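For instance, a quick scalar illustration (the commented values follow directly from the definition):

import { uint, countTrailingZeros } from 'three/tsl';

const a = countTrailingZeros( uint( 12 ) ); // 12 = 0b1100, so a = 2
const b = countTrailingZeros( uint( 1 ) ); // 0b0001, so b = 0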
Core Advantages
Provides direct access to the GPU’s native (or emulated) countTrailingZeros functionality inside shaders, allowing you to efficiently locate the least significant set bit in a bitmask without writing manual loops or per-bit checks. It also supports component-wise evaluation for integer vector types, which is handy when encoding multiple states in different channels.
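As a minimal component-wise sketch (the per-channel mask layout here is a hypothetical illustration):

import { uvec3, countTrailingZeros } from 'three/tsl';

// One bitmask per channel, e.g. separate state bits for r / g / b
const masks = uvec3( 0x01, 0x04, 0x10 );

// Evaluated per component: yields uvec3( 0, 2, 4 )
const tzPerChannel = countTrailingZeros( masks );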
Common Uses
Finding the lowest-priority active flag in a bitmask, for example selecting which LOD, light channel, or texture channel to enable based on the index of the least significant 1-bit (see the sketch after this list).
Measuring the power-of-two factor of an integer (for pure powers of two, countTrailingZeros(x) equals log2(x)), then using that as a hierarchical index or mipmap level selector; this is also shown in the sketch after this list.
Bucketizing hash results or random integers at the bit level by mapping different counts of trailing zeros to different color bands, roughness ranges, or effect layers in procedural shading.
Debugging or visualizing bitmasks by normalizing the number of trailing zeros to 0–1 and using it as brightness or mask to see which low-bit regions are occupied.
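A minimal sketch of the first two uses above (the flag layout and the mip-level mapping are hypothetical examples, not an API contract):

import { uint, countTrailingZeros } from 'three/tsl';

// Hypothetical flag layout: bit 0 = shadow, bit 1 = emissive, bit 2 = outline
const flags = uint( 0b0110 );

// Index of the lowest active flag: here 1 (emissive)
const lowestFlag = countTrailingZeros( flags );

// For a pure power of two the count doubles as log2:
// countTrailingZeros( 64 ) == 6 == log2( 64 ), usable as a mip level index
const mipLevel = countTrailingZeros( uint( 64 ) );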
How to adjust
The output of countTrailingZeros is a discrete integer in the range 0–32, so it does not create smooth transitions by itself. In practice:
(1) Ensure the input is an integer type (uint/int or their vector forms); if you start from floats, convert them to an integer or a bitmask first.
(2) For an input of 0 the result depends on the backend's builtin (WGSL's countTrailingZeros returns 32, while GLSL's findLSB returns -1), so handle zero explicitly with a separate branch or select; see the sketch below.
(3) A common pattern is to cast the result to float and divide by 32 to obtain a 0–1 factor for LOD, weights, or indexing logic.
(4) Combined with countLeadingZeros you can inspect both the least and most significant set bits to analyze the bit distribution of an integer.
According to the documentation, this node is mainly intended for use with WebGPURenderer and a WebGPU backend, so be mindful of compatibility and performance on other backends.
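A minimal sketch of points (2)–(4), assuming select, equal-style comparisons, and countLeadingZeros are available from 'three/tsl' alongside countTrailingZeros:

import { uint, float, select, countTrailingZeros, countLeadingZeros } from 'three/tsl';

const mask = uint( 0x0008 ); // 0b1000, 3 trailing zeros

// (2) Guard the zero case explicitly before relying on the count
const tz = select( mask.equal( uint( 0 ) ), uint( 32 ), countTrailingZeros( mask ) );

// (3) Normalize to a 0–1 factor for LOD / weight / indexing logic
const factor = float( tz ).div( 32.0 ); // 3 / 32 here

// (4) Pair with countLeadingZeros to bracket the occupied bit range:
// for 0b1000, lz = 28, so set bits live between index tz = 3 and index 31 - lz = 3
const lz = countLeadingZeros( mask );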
Code Examples
// Imports assume a recent three.js with TSL ('three/tsl') and node materials ('three/webgpu')
import { uint, float, vec3, mix, countTrailingZeros } from 'three/tsl';
import { MeshBasicNodeMaterial } from 'three/webgpu';

const material = new MeshBasicNodeMaterial();

// A 32-bit unsigned integer bitmask encoding some state
// 0x0010 = 0b10000, so the least significant 1-bit sits at index 4
const mask = uint( 0x0010 );

// Count the number of consecutive 0 bits from the least significant bit
// For vector types (uvec2 / uvec3 / ...) this is computed per component
const tz = countTrailingZeros( mask ); // 4

// Convert to float and normalize to a 0–1 factor
const factor = float( tz ).div( 32.0 ); // 0.125

// Example: blend between two colors based on the trailing-zeros count
const baseColor = vec3( 1.0, 0.3, 0.2 );
const highColor = vec3( 0.2, 0.6, 1.0 );

material.colorNode = mix( baseColor, highColor, factor );