glutin: [Windows] Unable to create non-SRGB framebuffer
Hello there!
I have found a few other issues related to sRGB, but they simply do not help me and I am still incredibly confused. My problem is that ContextBuilder::with_srgb does not seem to have any effect. Check out this minimal example (using glium):
```rust
#[macro_use]
extern crate glium;

fn main() {
    #[allow(unused_imports)]
    use glium::{glutin, Surface};

    let events_loop = glutin::EventsLoop::new();
    let wb = glutin::WindowBuilder::new();
    let cb = glutin::ContextBuilder::new().with_srgb(false); // MARK A -------
    let display = glium::Display::new(wb, cb, &events_loop).unwrap();

    #[derive(Copy, Clone)]
    struct Vertex {
        position: [f32; 2],
    }
    implement_vertex!(Vertex, position);

    // A full-screen triangle strip, so every pixel is covered by the shader.
    let shape = vec![
        Vertex { position: [-1.0, -1.0] },
        Vertex { position: [-1.0,  1.0] },
        Vertex { position: [ 1.0, -1.0] },
        Vertex { position: [ 1.0,  1.0] },
    ];
    let vertex_buffer = glium::VertexBuffer::new(&display, &shape).unwrap();
    let indices = glium::index::NoIndices(glium::index::PrimitiveType::TriangleStrip);

    let vertex_shader_src = r#"
        #version 140
        in vec2 position;
        void main() {
            gl_Position = vec4(position, 0.0, 1.0);
        }
    "#;

    let fragment_shader_src = r#"
        #version 140
        out vec4 color;
        void main() {
            color = vec4(vec3(0.5), 1.0);
        }
    "#;

    let program = glium::Program::new(
        &display,
        glium::program::ProgramCreationInput::SourceCode {
            vertex_shader: vertex_shader_src,
            tessellation_control_shader: None,
            tessellation_evaluation_shader: None,
            geometry_shader: None,
            fragment_shader: fragment_shader_src,
            transform_feedback_varyings: None,
            outputs_srgb: false, // MARK B ----------------------------------
            uses_point_size: false,
        },
    ).unwrap();

    loop {
        let mut target = display.draw();
        target.draw(&vertex_buffer, &indices, &program, &uniform!{}, &Default::default()).unwrap();
        target.finish().unwrap();
    }
}
```
There are two places of interest which I will explain later:

- `.with_srgb(_)` // MARK A
- `outputs_srgb: _,` // MARK B
My test setup is the following: I run this program and take a screenshot and then inspect what color value the pixels have. For the four possible configurations, I get the following values:
| | `.with_srgb(false)` | `.with_srgb(true)` |
|---|---|---|
| `outputs_srgb: false` | `#bbbbbb` | `#bbbbbb` |
| `outputs_srgb: true` | `#808080` | `#808080` |
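For what it's worth, `#bbbbbb` is roughly what the standard sRGB transfer function gives for a linear value of 0.5, which is a strong hint that a linear-to-sRGB conversion is happening somewhere along the way. A quick back-of-the-envelope check (plain Rust arithmetic, nothing glium-specific):

```rust
// Encode a linear value with the standard sRGB transfer function.
fn linear_to_srgb(c: f32) -> f32 {
    if c <= 0.003_130_8 {
        12.92 * c
    } else {
        1.055 * c.powf(1.0 / 2.4) - 0.055
    }
}

fn main() {
    let encoded = linear_to_srgb(0.5);
    // Prints roughly 0.735, i.e. about 187/255 = 0xbb,
    // which matches the #bbbbbb seen in the screenshots.
    println!("linear 0.5 -> sRGB {:.3} (~{:.1}/255)", encoded, encoded * 255.0);
}
```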
So as far as I understand: there is the `GL_FRAMEBUFFER_SRGB` flag. If that flag is disabled, OpenGL does not perform any conversion from fragment shader to framebuffer. If it is enabled, however, OpenGL assumes that the shader output is linear RGB and will thus – if the framebuffer has an sRGB format – convert the shader output to sRGB. In glium, `GL_FRAMEBUFFER_SRGB` is controlled by the `outputs_srgb` parameter of the program: if the latter is `false`, `GL_FRAMEBUFFER_SRGB` is enabled, and vice versa.
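To make that mapping concrete, here is a minimal sketch of the behavior as I understand it, written against the raw `gl` crate rather than glium (so this is an illustration of the described relationship, not glium's actual implementation):

```rust
// Sketch only: how a program's `outputs_srgb` flag maps onto GL_FRAMEBUFFER_SRGB
// as described above. Assumes the `gl` crate with its function pointers loaded.
fn apply_srgb_write_mode(outputs_srgb: bool) {
    unsafe {
        if outputs_srgb {
            // The shader already emits sRGB-encoded values: write them through.
            gl::Disable(gl::FRAMEBUFFER_SRGB);
        } else {
            // The shader emits linear values: let GL encode them on write,
            // but only if the framebuffer is actually sRGB-capable.
            gl::Enable(gl::FRAMEBUFFER_SRGB);
        }
    }
}
```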
Additionally, I would expect glutin to create a framebuffer with the format specified by .with_srgb(_). As such, I have the following expectations:
- ✗ `with_srgb(false)` and `outputs_srgb: false`: the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of `GL_FRAMEBUFFER_SRGB`. As such, I would expect `#808080`, but I got `#bbbbbb`.
- ✔ `with_srgb(true)` and `outputs_srgb: false`: the framebuffer should be sRGB and since the shader does not output sRGB, I expect OpenGL to convert. As such, I expect `#bbbbbb`, which I actually got.
- ✔ `with_srgb(false)` and `outputs_srgb: true`: as above, the framebuffer should be linear RGB, meaning that no conversion should happen, regardless of `GL_FRAMEBUFFER_SRGB`. As such, I would expect `#808080`, which I luckily also got.
- ✔ `with_srgb(true)` and `outputs_srgb: true`: the framebuffer is sRGB, but the shader also outputs in that color space, so no conversion should happen. As such, I expect `#808080`, which I got.
This issue is about the first situation. As far as I can tell, this is just wrong.
I tested the above on several platforms: Ubuntu 18.04, macOS and Windows. I always got the same results (well, on macOS and Windows the `#bbbbbb` was slightly off, but still much brighter than `#808080`).
(I also did a test with GLFW in case that’s interesting. I used the simple example and changed line 63 to " gl_FragColor = vec4(vec3(0.5), 1.0);\n". Running that, I get a #808080 triangle.)
Am I doing something wrong? Am I completely misunderstanding color spaces? Is this a bug in glutin/glium/…? Would be super great if someone could help me!
About this issue
- Original URL
- State: closed
- Created 5 years ago
- Comments: 35 (24 by maintainers)
@ZeGentzy Well that’s the thing: I tested this with Nvidia, Intel and AMD and got the same results on all three when testing with Windows 10 (with the latest drivers). OP’s Intel behaves the same way.
All this makes me think that it’s at least partially due to WGL, but I could be wrong. So yeah, I don’t think this has anything to do with the driver version.
@ZeGentzy Not that I can find.
glviewdoesn’t list out alpha-bits, andvisualinfo(fromglew) doesn’t list outSRGB_EXT. I was able to combine information from the two for my experiments though.Anyway, from what I can observe, this behavior is visible outside of
glutintoo - a simple C++ program usingwglthat useswglChoosePixelFormatalso displays the same behavior.I created this table to better document this weird behavior. Note that while this experiment was done with the C++ version,
glutinalso agrees with it. In my test program, I try to render a window withglClearColor(0.5f, 0.5f, 0.5f, 1.0f). Sometimes, the result would be with#808080, which is expected, but often it would show up with#bbbbbb. I have also included certain properties of my pixel formats for easier reference.Without further ado:
Default attributes:
Pixel formats:
WGL_FRAMEBUFFER_SRGB_CAPABLE_EXTObservations:
WGL_ALPHA_BITS_ARBlisted?WGL_FRAMEBUFFER_SRGB_CAPABLE_EXTlisted?glEnable(GL_FRAMEBUFFER_SRGB)?#808080#bbbbbb#808080#808080#bbbbbb#808080#808080#bbbbbb#808080#808080#bbbbbb#bbbbbbSummary: So this pretty much confirms what we see with
glutin. Granted, all of this maybe because my PC has some stupid driver issue, but it’s still odd.WGL_FRAMEBUFFER_SRGB_CAPABLE_EXTis not listed, then it’s assumed to betrue. This is why if you then useglEnable(GL_FRAMEBUFFER_SRGB), you will see sRGB output (which is only possible ifWGL_FRAMEBUFFER_SRGB_CAPABLE_EXTistrueandglEnableis used).WGL_FRAMEBUFFER_SRGB_CAPABLE_EXTis listed, then the behavior is the same as what you’d expect - theglEnablecall will then combine with the value ofWGL_FRAMEBUFFER_SRGB_CAPABLE_EXTto determine if sRGB needs to be shown.WGL_ALPHA_BITS_ARBis listed, thenWGL_FRAMEBUFFER_SRGB_CAPABLE_EXTis assumed to betrue, regardless of what the listed value says (we already know it’strueif not listed). Again, this is why the sRGB output is solely controlled by theglEnablecall.@ZeGentzy Thoughts?
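To illustrate what "listing" `WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT` means here, this is roughly what an attribute list for `wglChoosePixelFormatARB` looks like when the sRGB attribute is spelled out explicitly. This is a Rust sketch using the standard WGL extension constant values, not glutin's actual code:

```rust
/// Sketch: build a zero-terminated attribute list for wglChoosePixelFormatARB
/// that always lists WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT explicitly instead of
/// omitting it. Constant values are the standard WGL extension enums.
fn pixel_format_attribs(srgb_requested: bool) -> [i32; 9] {
    const WGL_DRAW_TO_WINDOW_ARB: i32 = 0x2001;
    const WGL_SUPPORT_OPENGL_ARB: i32 = 0x2010;
    const WGL_DOUBLE_BUFFER_ARB: i32 = 0x2011;
    const WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT: i32 = 0x20A9; // 8361 in decimal
    [
        WGL_DRAW_TO_WINDOW_ARB, 1,
        WGL_SUPPORT_OPENGL_ARB, 1,
        WGL_DOUBLE_BUFFER_ARB, 1,
        // Explicitly listed, 1 or 0 depending on whether sRGB was requested:
        WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT, if srgb_requested { 1 } else { 0 },
        0, // list terminator
    ]
}
```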
@sumit0190 Older/newer implementations might behave differently, idk. Maybe I'm reading the extension wrong. Maybe none of the drivers bothered caring. Nevertheless, users should set `with_srgb` correctly.

Silly driver issues are silly. Not much we can do. Please file this as a PR: https://github.com/rust-windowing/glutin/issues/1175#issuecomment-506984460
I’m opening separate issues for X11 and macOS, so that this issue doesn’t get too cluttered.
Edit: anyways, I’ll close this issue as wontfix (as there is nothing we can do) once you got that PR filed. Thanks for your time 😃
Can you run this program and check the available pixel formats: http://realtech-vr.com/admin/glview
This just strikes me as shitty Intel drivers.
Can you comment out the code specifying things like alpha, stencil bits, etc., then add them back one by one? The goal is to discover if one of the other options is conflicting with or overriding what we set `FRAMEBUFFER_SRGB_CAPABLE_*` to.

@ZeGentzy So here's some more interesting stuff.
`with_srgb` set to `false` returns this descriptor: `[8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 0]`

`with_srgb` set to `true` returns this descriptor: `[8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8, 8226, 24, 8227, 8, 8209, 1, 8210, 0, 8361, 1, 0]`

So the `with_srgb` parameter is doing its job and adding the right attribute (8361, i.e. 0x20A9, `WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT`). But the `pixel_format_id` returned in both cases is `10`, which is weird, right?

Then I decide to use my own dummy descriptor, with some random values for all attributes. This is what that looks like:
Still the same result though; a format of `9` is returned, and `glutin` still thinks `srgb` is set.

But if I uncomment the commented line, suddenly things change: now, a format of `9` is returned when `with_srgb` is set to `true`, but when set to `false`, a format of `105` is returned, and `glutin` behaves correctly.

Now that makes me think: it looks like `FRAMEBUFFER_SRGB_CAPABLE_EXT` should always be defined and, just as you recommended, it should be set to `0` or `1` depending on whether sRGB is requested.

So I make these modifications:

And this should do the same thing as my dummy descriptor. But even though I can see the attribute (`8361`) added with a `0` or `1` depending on my `with_srgb`, it gives me the same `pixel_format_id` regardless (`10`). Needs some more investigation, I guess.
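As a side note, a tiny helper like the following (hypothetical, not part of glutin) makes descriptors like the ones above easier to eyeball, since it prints each attribute in hex - `8361` then shows up as the familiar `0x20a9` (`WGL_FRAMEBUFFER_SRGB_CAPABLE_EXT`):

```rust
/// Hypothetical debugging helper: print a zero-terminated WGL attribute list
/// as (attribute, value) pairs, with the attribute shown in hex.
fn dump_wgl_attribs(desc: &[i32]) {
    // Drop the trailing 0 terminator, then walk the list two entries at a time.
    let body = &desc[..desc.len().saturating_sub(1)];
    for pair in body.chunks_exact(2) {
        println!("{:#06x} = {}", pair[0], pair[1]);
    }
}

fn main() {
    // The descriptor reported for `with_srgb(true)` above.
    dump_wgl_attribs(&[
        8193, 1, 8208, 1, 8211, 8235, 8195, 8231, 8212, 24, 8219, 8,
        8226, 24, 8227, 8, 8209, 1, 8210, 0, 8361, 1, 0,
    ]);
}
```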