incorrect PQ tracking for .hdr images in "no effect" mode #51
Thanks for your patience and for reporting this!

Tentative findings from my initial investigation:

From the RGBE file format spec (https://floyd.lbl.gov/radiance/refer/filefmts.pdf), there is an EXPOSURE header variable:

EXPOSURE
    A single floating point number indicating a multiplier that has been applied to all the pixels in the file. EXPOSURE values are cumulative, so the original pixel values (i.e., radiances in watts/steradian/m^2) must be derived by taking the values in the file and dividing by all the EXPOSURE settings multiplied together. No EXPOSURE setting implies that no exposure changes have taken place.

In this file, that value is:

EXPOSURE=0.11713

which corresponds to a multiplier of about 8.5, the discrepancy between the two apps.

This isn't my area of expertise, but from my reading of the spec, a decoder does need to divide color values by the EXPOSURE, i.e. the 8.5x higher values produced by the HDR image viewer are indeed correct. I am relying on DirectXTex (https://github.com/microsoft/DirectXTex) for RGBE radiance decoding, and it looks like it is applying this divider (https://github.com/microsoft/DirectXTex/blob/23b7fd1f385d33f40093a0786ca5b54147f2f24a/DirectXTex/DirectXTexHDR.cpp#L878).

However, I'm not an expert on the RGBE format. It's entirely possible that the EXPOSURE directive isn't widely used, so in practice it should be ignored; however, I need to do further research on how to achieve this while using DirectXTex.
Attaching a quick and dirty hex-edited version of the repro image with the EXPOSURE header changed to 1.0. When loading this image, my MaxCLL histogram calculation gives 1517 nits, which is within the expected 10% margin of error of the 1400-nit design point.
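The viewer's actual MaxCLL computation is histogram-based; the simplified, hypothetical sketch below just takes the peak color component, assuming the decoded values are scene-linear with the scRGB convention of 1.0 = 80 nits (consistent with the "17.5 (*80 = 1400 nits)" tev reading quoted below):

```cpp
// Hypothetical MaxCLL-style estimate over decoded linear pixels.
// Assumes the scRGB convention where 1.0f maps to 80 nits.
#include <algorithm>
#include <vector>

struct PixelRGB { float r, g, b; };

float estimateMaxCLLNits(const std::vector<PixelRGB>& pixels)
{
    // MaxCLL is conventionally derived from the brightest color component
    // of the brightest pixel in the content.
    float maxComponent = 0.0f;
    for (const auto& p : pixels)
        maxComponent = std::max({maxComponent, p.r, p.g, p.b});
    return maxComponent * 80.0f; // scRGB: 1.0 == 80 nits
}
```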
Hi Simon,
I just tested your 'hacked' version of the Contrast Test image, which edits the header to EXPOSURE 1.0 (whereas my test image had EXPOSURE 0.11713):
* NVIDIA's in-house viewer, HDRshow.exe, appears to ignore the header exposure; the PQ codes I get are the same for both versions!
* 'tev' also ignores the header exposure value; it reports 17.5 (*80 = 1400 nits) for the bright bar in both images!
I think the easiest solution is for me to change my .hdr image generation program to always use exposure 1.0. Then it should give identical results in all viewers (to be confirmed).
Gerrit, NVIDIA
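For what it's worth, omitting the EXPOSURE line entirely is equivalent to EXPOSURE=1.0 per the spec, so a generator can sidestep the disagreement between viewers. A hypothetical sketch of such a header writer (not Gerrit's actual generation program):

```cpp
// Sketch: emit a Radiance (.hdr) header with no EXPOSURE line, which per
// the spec means no exposure changes have taken place, so decoders that
// honor EXPOSURE and decoders that ignore it recover the same radiances.
#include <cstdio>

void writeRadianceHeader(std::FILE* f, int width, int height)
{
    std::fputs("#?RADIANCE\n", f);
    std::fputs("FORMAT=32-bit_rle_rgbe\n", f);
    // Deliberately no "EXPOSURE=..." line here.
    std::fputs("\n", f);                              // blank line ends the header
    std::fprintf(f, "-Y %d +X %d\n", height, width);  // resolution string
    // ... RGBE scanline data follows ...
}
```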
Thanks, Simon. Let me consult with someone inside NVIDIA who's an RGBE expert and who gave me the encoding library.
That's great to hear; let me know how it turns out!
I did some tests using the HDR + WCG viewer with one of our NVIDIA GSYNC HDR10 monitor prototypes. Our debug tools can observe the exact HDR PQ12 codes coming from the PC into the monitor. It looks like the brightness of a .hdr image displayed by HDR + WCG is about 8.5x too high at default brightness when selecting Render Effect = No Effect (all other settings apply special transformations we don't want for our tests).
Consider the attached 4k 'simultaneous contrast' style test image we use. It has a 1400-nit bright bar at the top. At the bottom it has a greyscale with 14 steps, starting at 0.02 nits and increasing by 1.5x with every step to the right. We verified with 'tev' that the .hdr brightness is as we want it (see table).
When we display this with an in-house viewer, we get PQ12 values that translate to the correct nits (within RGBE quantization accuracy). The table below shows the PQ12/nits values we get with the HDR + WCG viewer.
We'd be very interested to see a fix, so we can recommend this viewer to our partners/monitor makers.
ContrastTest_0p02_1p5_3840_2160.zip
PS: we did these experiments with Win10 21H2 (19044.1706), and a 27” 4k 144 Hz 1000 nits GSYNC Ultimate monitor (max 1107 nits).
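For readers reproducing these measurements: PQ12 codes translate to nits via the SMPTE ST 2084 (PQ) EOTF. A sketch, assuming full-range 12-bit codes:

```cpp
// SMPTE ST 2084 (PQ) EOTF: convert a 12-bit PQ code observed at the
// monitor into absolute luminance in nits. Assumes full-range codes.
#include <algorithm>
#include <cmath>

double pq12ToNits(int code12) // code12 in [0, 4095]
{
    const double m1 = 2610.0 / 16384.0;        // 0.1593017578125
    const double m2 = 2523.0 / 4096.0 * 128.0; // 78.84375
    const double c1 = 3424.0 / 4096.0;         // 0.8359375
    const double c2 = 2413.0 / 4096.0 * 32.0;  // 18.8515625
    const double c3 = 2392.0 / 4096.0 * 32.0;  // 18.6875

    const double ep  = code12 / 4095.0;        // normalized PQ signal
    const double p   = std::pow(ep, 1.0 / m2);
    const double num = std::max(p - c1, 0.0);
    const double den = c2 - c3 * p;
    return 10000.0 * std::pow(num / den, 1.0 / m1); // luminance in nits
}
```

For example, pq12ToNits(4095) returns 10000 nits (PQ peak), and the 1400-nit bar should land near the code the in-house tools report for it.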