glutin: [Windows] Unable to create non-SRGB framebuffer

Hello there!

I have found a few other issues related to sRGB, but they simply do not help me and I am still incredibly confused. My problem is that ContextBuilder::with_srgb does not seem to have any effect. Check out this minimal example (using glium):

#[macro_use]
extern crate glium;

fn main() {
    #[allow(unused_imports)]
    use glium::{glutin, Surface};

    let events_loop = glutin::EventsLoop::new();
    let wb = glutin::WindowBuilder::new();
    let cb = glutin::ContextBuilder::new().with_srgb(false); // MARK A -------
    let display = glium::Display::new(wb, cb, &events_loop).unwrap();

    #[derive(Copy, Clone)]
    struct Vertex {
        position: [f32; 2],
    }

    implement_vertex!(Vertex, position);

    let shape = vec![
        Vertex { position: [-1.0, -1.0] },
        Vertex { position: [-1.0,  1.0] },
        Vertex { position: [ 1.0, -1.0] },
        Vertex { position: [ 1.0,  1.0] },
    ];

    let vertex_buffer = glium::VertexBuffer::new(&display, &shape).unwrap();
    let indices = glium::index::NoIndices(glium::index::PrimitiveType::TriangleStrip);

    let vertex_shader_src = r#"
        #version 140
        in vec2 position;
        void main() {
            gl_Position = vec4(position, 0.0, 1.0);
        }
    "#;

    let fragment_shader_src = r#"
        #version 140
        out vec4 color;
        void main() {
            color = vec4(vec3(0.5), 1.0);
        }
    "#;

    let program = glium::Program::new(
        &display,
        glium::program::ProgramCreationInput::SourceCode {
            vertex_shader: vertex_shader_src,
            tessellation_control_shader: None,
            tessellation_evaluation_shader: None,
            geometry_shader: None,
            fragment_shader: fragment_shader_src,
            transform_feedback_varyings: None,
            outputs_srgb: false,  // MARK B ----------------------------------
            uses_point_size: false,
        }
    ).unwrap();

    loop {
        let mut target = display.draw();

        target.draw(&vertex_buffer, &indices, &program, &uniform!{}, &Default::default()).unwrap();
        target.finish().unwrap();
    }
}

There are two places of interest which I will explain later.

  • .with_srgb(_) // MARK A
  • outputs_srgb: _, // MARK B

My test setup is the following: I run this program and take a screenshot and then inspect what color value the pixels have. For the four possible configurations, I get the following values:

                    | .with_srgb(false) | .with_srgb(true)
outputs_srgb: false | #bbbbbb           | #bbbbbb
outputs_srgb: true  | #808080           | #808080
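
As a side note on where #bbbbbb comes from: it is almost exactly what you get when you push linear 0.5 through the standard sRGB transfer function, so the #bbbbbb cases are the ones where a linear-to-sRGB conversion happened on write. A quick sanity check (plain Rust, no OpenGL involved):

    fn linear_to_srgb(c: f32) -> f32 {
        // Standard sRGB encoding function.
        if c <= 0.0031308 {
            12.92 * c
        } else {
            1.055 * c.powf(1.0 / 2.4) - 0.055
        }
    }

    fn main() {
        let v = (linear_to_srgb(0.5) * 255.0).round() as u8;
        println!("{} (0x{:x})", v, v); // prints 188 (0xbc), i.e. roughly #bbbbbb
    }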

So as far as I understand it: there is the GL_FRAMEBUFFER_SRGB flag. If that flag is disabled, OpenGL does not perform any conversion between the fragment shader output and the framebuffer. If it is enabled, however, OpenGL assumes that the shader output is linear RGB and will thus – if the framebuffer has an sRGB format – convert the shader output to sRGB on write. In glium, GL_FRAMEBUFFER_SRGB is controlled by the program's outputs_srgb parameter: if outputs_srgb is false, glium enables GL_FRAMEBUFFER_SRGB, and vice versa.
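
In raw OpenGL terms, that mapping looks roughly like this (a sketch with a made-up helper, using the gl crate's bindings, not glium's actual source):

    fn apply_program_srgb_setting(program_outputs_srgb: bool) {
        unsafe {
            if program_outputs_srgb {
                // The shader already writes sRGB-encoded values:
                // no conversion on write.
                gl::Disable(gl::FRAMEBUFFER_SRGB);
            } else {
                // The shader writes linear values: let GL encode them to sRGB,
                // which only has an effect if the framebuffer has an sRGB format.
                gl::Enable(gl::FRAMEBUFFER_SRGB);
            }
        }
    }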

Additionally, I would expect glutin to create a framebuffer with the format specified by .with_srgb(_). As such, I have the following expectations:

  • with_srgb(false) and outputs_srgb: false: the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080, but I got #bbbbbb.
  • with_srgb(true) and outputs_srgb: false: the framebuffer should be sRGB and since the shader does not output sRGB, I expect OpenGL to convert. As such, I expect #bbbbbb, which I actually got.
  • with_srgb(false) and outputs_srgb: true: as above, the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of GL_FRAMEBUFFER_SRGB. As such, I would expect #808080 which I luckily also got.
  • with_srgb(true) and outputs_srgb: true: the framebuffer is sRGB, but the shader also outputs in that color space, so no conversion should happen. As such I expect #808080 which I got.

This issue is about the first situation. As far as I can tell, this is just wrong.

I tested the above on several platforms: Ubuntu 18.04, macOS and Windows. I always got the same results (well, on macOS and Windows the #bbbbbb was slightly off, but still much brighter than #808080).

(I also did a test with GLFW in case that’s interesting. I used the simple example and changed line 63 to " gl_FragColor = vec4(vec3(0.5), 1.0);\n". Running that, I get a #808080 triangle.)

Am I doing something wrong? Am I completely misunderstanding color spaces? Is this a bug in glutin/glium/…? Would be super great if someone could help me!

About this issue

  • State: closed
  • Created 5 years ago
  • Comments: 35 (24 by maintainers)

Most upvoted comments

@ZeGentzy Well that’s the thing: I tested this with Nvidia, Intel and AMD and got the same results on all three when testing with Windows 10 (with the latest drivers). OP’s Intel behaves the same way.

All this makes me think that it’s at least partially due to WGL, but I could be wrong. So yeah, I don’t think this has anything to do with the driver version.

@ZeGentzy Not that I can find. glview doesn’t list out alpha-bits, and visualinfo (from glew) doesn’t list out SRGB_EXT. I was able to combine information from the two for my experiments though.

Anyway, from what I can observe, this behavior is visible outside of glutin too - a simple C++ program using wgl that uses wglChoosePixelFormat also displays the same behavior.

I created this table to better document this weird behavior. Note that while this experiment was done with the C++ version, glutin agrees with it as well. In my test program, I try to render a window cleared with glClearColor(0.5f, 0.5f, 0.5f, 1.0f). Sometimes the result is #808080, which is expected, but often it shows up as #bbbbbb. I have also included certain properties of my pixel formats for easier reference.
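
For illustration, the core of such a test looks roughly like this in Rust (a sketch assuming the glutin-0.21-era API plus the gl crate, rather than the raw WGL calls of the actual C++ program):

    fn main() {
        let events_loop = glutin::EventsLoop::new();
        let wb = glutin::WindowBuilder::new();
        // Explicitly request a non-sRGB framebuffer.
        let ctx = glutin::ContextBuilder::new()
            .with_srgb(false)
            .build_windowed(wb, &events_loop)
            .unwrap();
        let ctx = unsafe { ctx.make_current().unwrap() };
        gl::load_with(|s| ctx.context().get_proc_address(s) as *const _);

        unsafe {
            // Toggling this call is one of the variables in the table below.
            // gl::Enable(gl::FRAMEBUFFER_SRGB);
            gl::ClearColor(0.5, 0.5, 0.5, 1.0);
        }

        loop {
            unsafe {
                // Linear 0.5 grey; a screenshot should show #808080 unless
                // something converts it to sRGB on write (#bbbbbb).
                gl::Clear(gl::COLOR_BUFFER_BIT);
            }
            ctx.swap_buffers().unwrap();
        }
    }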

Without further ado:

Default attributes:

WGL_DRAW_TO_WINDOW_ARB,     TRUE,
WGL_ACCELERATION_ARB,       WGL_FULL_ACCELERATION_ARB,
WGL_SUPPORT_OPENGL_ARB,     TRUE,
WGL_DOUBLE_BUFFER_ARB,      TRUE,
WGL_PIXEL_TYPE_ARB,         WGL_TYPE_RGBA_ARB,
WGL_COLOR_BITS_ARB,         24,
WGL_DEPTH_BITS_ARB,         24,
WGL_STENCIL_BITS_ARB,       0,

Pixel formats:

Pixel Format | Alpha Bits | WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT
7            | -          | 1
8            | 8          | 1
103          | -          | 0

Observations:

WGL_ALPHA_BITS_ARB listed? | WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT listed? | glEnable(GL_FRAMEBUFFER_SRGB)? | Pixel format selected | Output color
No      | No      | No  | 7   | #808080
No      | No      | Yes | 7   | #bbbbbb
No      | Yes (1) | No  | 7   | #808080
No      | Yes (0) | No  | 103 | #808080
No      | Yes (1) | Yes | 7   | #bbbbbb
No      | Yes (0) | Yes | 103 | #808080
Yes (8) | No      | No  | 8   | #808080
Yes (8) | No      | Yes | 8   | #bbbbbb
Yes (8) | Yes (1) | No  | 8   | #808080
Yes (8) | Yes (0) | No  | 8   | #808080
Yes (8) | Yes (1) | Yes | 8   | #bbbbbb
Yes (8) | Yes (0) | Yes | 8   | #bbbbbb

Summary: So this pretty much confirms what we see with glutin. Granted, all of this may be because my PC has some stupid driver issue, but it’s still odd.

  1. If WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is not listed, then it’s assumed to be true. This is why if you then use glEnable(GL_FRAMEBUFFER_SRGB), you will see sRGB output (which is only possible if WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is true and glEnable is used).
  2. If WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is listed, then the behavior is the same as what you’d expect - the glEnable call will then combine with the value of WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT to determine if sRGB needs to be shown.
  3. If WGL_ALPHA_BITS_ARB is listed, then WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT is assumed to be true, regardless of what the listed value says (we already know it’s true if not listed). Again, this is why the sRGB output is solely controlled by the glEnable call.

@ZeGentzy Thoughts?

@sumit0190 Older/newer implementations might behave differently, idk. Maybe I’m reading the extension wrong. Maybe none of the drivers bothered caring. Nevertheless, users should set with_srgb correctly.

Silly driver issues are silly. Not much we can do. Please file this as a PR: https://github.com/rust-windowing/glutin/issues/1175#issuecomment-506984460

I’m opening separate issues for X11 and macOS, so that this issue doesn’t get too cluttered.

Edit: anyways, I’ll close this issue as wontfix (as there is nothing we can do) once you’ve got that PR filed. Thanks for your time 😃

Can you run this program and check the available pixel formats: http://realtech-vr.com/admin/glview

This just strikes me as shitty intel drivers.

Can you comment out the code specifying things like alpha, stencil bits, etc., then add them back one by one? The goal is to discover if one of the other options is conflicting with or overriding what we set FRAMEBUFFER_SRGB_CAPABLE_* to.

@ZeGentzy So here’s some more interesting stuff.

with_srgb set to false returns this descriptor: [8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 0]

with_srgb set to true returns this descriptor: [8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 8361, 1, 0]

So the with_srgb parameter is doing its job and adding the right attribute (8361). But the pixel_format_id returned in both cases is 10, which is weird, right?
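
For reference, decoding those integer pairs against the standard WGL token values (my own annotation, not glutin output):

    8193 (WGL_DRAW_TO_WINDOW_ARB)               = 1
    8208 (WGL_SUPPORT_OPENGL_ARB)               = 1
    8211 (WGL_PIXEL_TYPE_ARB)                   = 8235 (WGL_TYPE_RGBA_ARB)
    8195 (WGL_ACCELERATION_ARB)                 = 8231 (WGL_FULL_ACCELERATION_ARB)
    8212 (WGL_COLOR_BITS_ARB)                   = 24
    8219 (WGL_ALPHA_BITS_ARB)                   = 8
    8226 (WGL_DEPTH_BITS_ARB)                   = 24
    8227 (WGL_STENCIL_BITS_ARB)                 = 8
    8209 (WGL_DOUBLE_BUFFER_ARB)                = 1
    8210 (WGL_STEREO_ARB)                       = 0
    8361 (WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB/EXT) = 1    (only present in the with_srgb(true) descriptor)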

Then I decide to use my own dummy descriptor, with some random values for all attributes. This is what that looks like:

    let descriptor = [
        gl::wgl_extra::DRAW_TO_WINDOW_ARB as i32, 1,
        gl::wgl_extra::SUPPORT_OPENGL_ARB as i32, 1,
        gl::wgl_extra::DOUBLE_BUFFER_ARB as i32, 1,
        gl::wgl_extra::PIXEL_TYPE_ARB as i32, gl::wgl_extra::TYPE_RGBA_ARB as i32,
        gl::wgl_extra::COLOR_BITS_ARB as i32, 32,
        gl::wgl_extra::DEPTH_BITS_ARB as i32, 24,
        gl::wgl_extra::STENCIL_BITS_ARB as i32, 8,
        // gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as i32, pf_reqs.srgb as i32,
        0, // End
    ];

Still the same result though; a format of 9 is returned, and glutin still thinks sRGB is set.

But if I uncomment the commented line, suddenly things change: now, a format of 9 is returned when with_srgb is set to true, but when set to false, a format of 105 is returned, and glutin behaves correctly.

Now that makes me think: it looks like FRAMEBUFFER_SRGB_CAPABLE_EXT should always be defined and, just as you recommended, set to 0 or 1 depending on whether sRGB is requested.

So I make these modifications:

        // Find out if sRGB is needed and explicitly set its value to 0 or 1.
        if extensions
            .split(' ')
            .find(|&i| i == "WGL_ARB_framebuffer_sRGB")
            .is_some()
        {
            out.push(
                gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_ARB as raw::c_int,
            );
        } else if extensions
            .split(' ')
            .find(|&i| i == "WGL_EXT_framebuffer_sRGB")
            .is_some()
        {
            out.push(
                gl::wgl_extra::FRAMEBUFFER_SRGB_CAPABLE_EXT as raw::c_int,
            );
        } else {
            return Err(());
        }
        out.push(pf_reqs.srgb as raw::c_int);

And this should do the same thing as my dummy descriptor. But even though I can see the attribute (8361) added with a 0 or 1 depending on my with_srgb, it gives me the same pixel_format_id regardless (10). Needs some more investigation, I guess.