Why does my stencil buffer allow pixels through?

I'm rendering a scene using OpenGL. The scene consists of a grassy environment and a small lake (visible as two surfaces). To achieve water surface transparency, I'm using a stencil buffer to render water surfaces separately from other geometry. For context, here's a screenshot of the scene without water surfaces:

Scene without water surfaces rendered.

Here's the same scene, but with water surfaces stenciled out (the black portions of the screen):

Water surfaces stenciled out, but not rendered.
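For reference, the pass that marks the water surfaces in the stencil buffer looks roughly like this. This is a simplified sketch of my setup, not the exact code; waterVao and waterIndexCount are placeholders for my actual water geometry, and the reference value 1 is just the value I use to tag water pixels.

// Mark water-covered pixels in the stencil buffer (simplified sketch).
glEnable(GL_STENCIL_TEST);
glStencilMask(0xFF);                       // allow writes to all stencil bits
glClear(GL_STENCIL_BUFFER_BIT);            // start the frame with stencil = 0 everywhere
glStencilFunc(GL_ALWAYS, 1, 0xFF);         // every water fragment passes the test...
glStencilOp(GL_KEEP, GL_KEEP, GL_REPLACE); // ...and writes 1 where it lands

glBindVertexArray(waterVao);
glDrawElements(GL_TRIANGLES, waterIndexCount, GL_UNSIGNED_INT, nullptr);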

Here's the fragment shader I'm using (where fSource is fullscreen UV coordinates). As you can see, it simply samples from a previously-used frame buffer and outputs the pixel directly to the screen. With stenciling enabled, only pixels not covered by a water surface are rendered.

#version 440 core

in vec2 fSource;

out vec4 outColor;

uniform sampler2D image;

void main()
{
    outColor = texture(image, fSource);
}
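The stencil state during this fullscreen pass is along these lines (again a simplified sketch with placeholder names; the intent is that only pixels whose stencil value was not tagged as water pass the test):

// Composite pass: draw the fullscreen quad only where the stencil value
// is not the water tag (simplified sketch; fullscreenQuadVao is a placeholder).
glEnable(GL_STENCIL_TEST);
glStencilMask(0x00);                    // don't modify the stencil buffer in this pass
glStencilFunc(GL_NOTEQUAL, 1, 0xFF);    // reject pixels previously marked as water
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);

glBindVertexArray(fullscreenQuadVao);
glDrawArrays(GL_TRIANGLES, 0, 6);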

Here's the problem. When I change this fragment shader to output a solid color, those underwater pixels are suddenly visible (the stencil buffer and stencil settings have not changed). What I'd expect here is non-submerged pixels to be purple, but the water surface pixels (previously black) to remain black (since they should still be rejected through stenciling). Instead, everything is purple, including the water surface.

#version 440 core

out vec4 outColor;

void main()
{
    outColor = vec4(1, 0, 1, 1);
}

Direct color output (purple) causing pixels to not be rejected through stenciling.

As an additional test, I tried rendering only pixels under the water surfaces with a solid color. These pixels should be rejected via stenciling (appearing black, just like before), but for some reason, they're visible.

#version 440 core

in vec2 fSource;

out vec4 outColor;

uniform sampler2D image;
uniform sampler2D positions;

void main()
{
    vec3 position = texture(positions, fSource).xyz;

    // The water surfaces happen to sit at Y value 7.85.
    if (position.y < 7.85)
    {
        outColor = vec4(1, 1, 0, 1);
    }
    else
    {
        outColor = texture(image, fSource);
    }
}

Submerged pixels showing up, seemingly ignoring the stencil buffer.

I confirmed the problem by instead sampling with shifted UV coordinates (rather than outputting solid yellow). Again, pixels are showing up on the water surfaces, seemingly ignoring the stencil buffer.

#version 440 core

in vec2 fSource;

out vec4 outColor;

uniform sampler2D image;
uniform sampler2D positions;

void main()
{
    vec3 position = texture(positions, fSource).xyz;

    // The water surfaces happen to sit at Y value 7.85.
    if (position.y < 7.85)
    {
        outColor = texture(image, fSource + vec2(0.1, 0));
    }
    else
    {
        outColor = texture(image, fSource);
    }
}

Shifted UV pixels still showing up through the stencil buffer.

My understanding of stenciling is that it prevents the fragment shader from running at all on certain portions of the screen (i.e. fragments pass or fail based on stencil settings). If that were true, then no matter what the fragment shader outputs, that pixel should remain black (in this context). Clearly, the fragment shader is still being run for pixels that should be rejected due to stenciling, which means I must be misunderstanding how the stencil buffer works.
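To be explicit about the model in my head, the per-pixel decision I expect the stencil test to make is essentially the following. This is a toy version written out for clarity, not GL code; stored is the value already in the stencil buffer for that pixel, and ref and mask are whatever was passed to glStencilFunc (assuming a GL_NOTEQUAL comparison like the sketch above).

#include <cstdint>

// Toy model of the comparison I expect the stencil test to perform per pixel.
// If this returns false, I expect the fragment to be rejected and the pixel
// to keep its previous (clear) color, regardless of what the shader outputs.
static bool stencilTestPasses(std::uint8_t stored, std::uint8_t ref, std::uint8_t mask)
{
    return (ref & mask) != (stored & mask);
}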

Why is the fragment shader still running on pixels that should be rejected through stenciling?