Been watching this drama about HDR for a year now, and still can’t be arsed to read up on what it is.
HDR, or High Dynamic Range, is a way for images/videos/games to take advantage of the increased colour space, brightness and contrast of modern displays. That is, if your medium, your player device/software and your display are all HDR capable.
HDR content is usually mastered with a peak brightness of 1000 nits or more in mind, while Standard Dynamic Range (SDR) content is mastered for 80-100 nit screens.
How is this a software problem? Why can’t the display server just tell the monitor "make this pixel as bright as you can (255) and this other pixel as dark as you can (0)"?
In short: because HDR needs additional metadata to work. You can watch HDR content on an SDR screen, but it’s horribly washed out; it looks a bit like log footage. The HDR metadata tells the screen how bright or dark the image actually needs to be, and the software issue is correctly supporting that metadata.
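To give a rough idea of what "additional metadata" means in practice: HDR10, for example, carries static metadata describing the mastering display (SMPTE ST 2086) plus the content light levels MaxCLL and MaxFALL. The sketch below is only illustrative; the field names are mine, not any particular container’s or driver’s API.

```python
from dataclasses import dataclass


@dataclass
class Hdr10StaticMetadata:
    """Illustrative HDR10 static metadata (ST 2086 + content light levels).

    Field names are made up for this sketch; real formats carry the same
    information under their own names.
    """
    # Mastering display colour volume: primaries and white point of the
    # monitor the content was graded on (CIE 1931 x, y coordinates).
    red_primary: tuple[float, float]
    green_primary: tuple[float, float]
    blue_primary: tuple[float, float]
    white_point: tuple[float, float]
    max_mastering_luminance: float  # nits (cd/m^2), e.g. 1000.0
    min_mastering_luminance: float  # nits, e.g. 0.0001
    # Content light level information:
    max_cll: float   # brightest single pixel in the whole stream, nits
    max_fall: float  # highest frame-average light level, nits


# Plausible values for content graded on a 1000-nit mastering display
# with BT.2020 primaries and a D65 white point (example numbers only).
example = Hdr10StaticMetadata(
    red_primary=(0.708, 0.292),
    green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046),
    white_point=(0.3127, 0.3290),
    max_mastering_luminance=1000.0,
    min_mastering_luminance=0.0001,
    max_cll=850.0,
    max_fall=180.0,
)
```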
I’d speculate (I’m not an expert) that the reason for this is that it allows more granularity. Even the 1024 brightness steps that 10-bit colour can produce are nothing compared to the millions-to-one contrast of modern LCDs, or the near-infinite contrast of OLED. Besides, screens come with a wide range of peak brightnesses, and I suppose doing it this way lets each manufacturer interpret the metadata in whatever way looks best on their screens.
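To make the "1024 steps" point concrete: HDR10 doesn’t spread those steps linearly but through the PQ transfer function (SMPTE ST 2084), which maps each 10-bit code value to an absolute luminance between 0 and 10,000 nits. A minimal sketch (the constants come from ST 2084; the function name is mine):

```python
def pq_eotf(code_value: int, bit_depth: int = 10) -> float:
    """Map a PQ-encoded code value to absolute luminance in nits (SMPTE ST 2084)."""
    # Normalise the integer code value to the 0..1 signal range.
    e = code_value / (2 ** bit_depth - 1)
    # ST 2084 constants.
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    # Invert the PQ encoding to recover linear light, then scale to 10,000 nits.
    p = e ** (1 / m2)
    y = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * y


# The spacing is perceptual, not linear: roughly half the code values
# cover everything below ~100 nits, the rest stretch up to 10,000 nits.
for cv in (0, 256, 512, 768, 1023):
    print(cv, "->", round(pq_eotf(cv), 2), "nits")
```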
Also, with your solution, a brightness value of 1023 would always be the maximum brightness of the TV. You don’t always want that if your TV can literally flashbang you. Sure, you want the sun to be peak brightness, but not every white object is as bright as the sun… That’s the true beauty of a good HDR experience: it looks fairly normal, but reflections of the sun or a fire in a dark room just hit differently when the rest of the scene stays much darker yet is still clearly visible.
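That’s also roughly what tone mapping does on the display side: if content was graded for 1000 nits but the panel only reaches, say, 600, a good display reproduces most of the range as mastered and only compresses the top end, rather than dimming everything. A toy sketch of that idea (the knee-and-rolloff below is a common simplification, not how any specific TV actually does it):

```python
def tone_map(content_nits: float, display_peak: float = 600.0,
             knee: float = 0.75) -> float:
    """Toy highlight roll-off: pass most of the range through unchanged,
    squeeze only the part above the knee into the remaining headroom."""
    knee_nits = knee * display_peak
    if content_nits <= knee_nits:
        # Shadows and mid-tones are shown at the brightness they were mastered at.
        return content_nits
    # Compress everything above the knee so it asymptotically approaches the peak.
    headroom = display_peak - knee_nits
    excess = content_nits - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))


# A 200-nit face stays at 200 nits, while a 1000-nit sun glint is
# rolled off to fit under the display's 600-nit peak.
for nits in (10, 200, 450, 700, 1000, 4000):
    print(nits, "->", round(tone_map(nits), 1))
```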
HDR makes stuff look really awesome. It’s super good for real.