I've been using DisplayCAL to create monitor profiles for Windows 11 in HDR mode on my OLED laptop. The issue I'm having is that the video card LUT that's created, while targeting sRGB at D6500, is coming out at 7000+ CCT when verifying the profile and when reporting on the calibrated display.

I've worked around this by reducing the difference between each default LUT value and the calibrated LUT value by a factor of 4. So if the default LUT value for red at index 255 is 65535 and the calibrated value is 65531, the "corrected" value would be 65534, and so on for each LUT table index. I then incorporate this corrected LUT into a new profile. While this works reasonably well, it is a clunky work-around and results in calibrations of lower quality.

I'm not sure if this is a bug in Windows 11 HDR mode, or if it's an artifact of how HDR works. My theory goes something like "the LUT table is interpreted in HDR space, not in SDR space", but I'm not really sure. During refinement passes, does Argyll actually measure the results of applying the LUT values so it knows how much to increase or decrease them by? I was wondering if you have any thoughts on what's happening.

Sounds like calibration and profiling is happening in non-HDR mode, and then the calibration LUTs are being applied in HDR mode. This type of problem is pretty typical when such special modes are introduced, rather than being able to slot into an existing mechanism. (Apple have apparently gone even further in the way they've bodged their HDR mode, effectively disabling all the existing calibration and profiling abilities in HDR mode in the process.)

Argyll's measurements always occur through the per-channel LUTs - there's no way of turning that hardware off - but of course it sets the LUTs to unity while determining which sequence of LUT values results in the desired white and brightness. (Actually it's more subtle than that, since typically the LUTs have a higher-resolution output than input.) Best bet is to figure out if this is indeed what's happening, and then see if there's any way of forcing HDR mode during calibration.

When Windows 11 HDR mode is enabled, there are three modes an app can run in:

1. SDR mode. The app is unaware of HDR and its output is "clamped" to sRGB and displayed as it would be on an sRGB monitor. Most apps (e.g. Firefox) run in this mode, and the Windows desktop and windowing system run in this mode.

2. HDR mode. The app is aware of HDR, can "see" the wide gamut display, and can display wide gamut and HDR content. If wide gamut content from the app has an ICC source profile, it is mapped (by Windows) to the display using EDID info from the monitor. Without an ICC profile, sRGB is assumed and mapped to the display.

3. Legacy display ICC color management mode. This mode is for apps like Photoshop that want to do their own color management. The app can "see" the wide gamut display and can map colors from its content to the display profile using ICC color management. The monitor profile is auto-created by Windows from monitor EDID info and passed to the app when it asks Windows for the current profile. This mode is set on the Properties tab of the executable that wants to run in it.

Under Windows 11 HDR mode, I've done DisplayCAL/Argyll calibration in "SDR mode" (which Argyll runs in by default) and in "Legacy display ICC color management mode" (for this I set the "legacy ICC" toggle on dispread.exe, dispcal.exe, and dispwin.exe). In SDR mode, a profile with sRGB gamut is created. In Legacy ICC mode, a wide gamut profile is created. In both cases the LUT produced is very similar to what I get when calibrating with Windows in non-HDR mode. However, when applying the LUT with Windows in HDR mode, the CCT is way off D6500. What seems to be happening is that the LUT created by Argyll doesn't reflect its true effect on the display in Windows HDR mode.
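The factor-of-4 work-around described in the thread can be sketched as follows. This is a minimal illustration, not the poster's actual tool: it assumes a 256-entry, 16-bit per-channel ramp (the common video card gamma ramp layout), and `soften_lut` and `identity_lut` are hypothetical names.

```python
# Sketch of the described work-around: keep only 1/4 of the calibrated
# correction at each LUT index, relative to the default (unity) ramp.
# Assumes a 256-entry, 16-bit per-channel ramp.

LUT_SIZE = 256
MAX_VAL = 65535

def identity_lut():
    """Default (unity) ramp: index i maps linearly to 0..65535."""
    return [round(i * MAX_VAL / (LUT_SIZE - 1)) for i in range(LUT_SIZE)]

def soften_lut(calibrated, factor=4):
    """Blend each entry 1/factor of the way from default toward calibrated."""
    default = identity_lut()
    return [d + round((c - d) / factor) for d, c in zip(default, calibrated)]

# Example from the post: default 65535, calibrated 65531 -> corrected 65534.
cal = identity_lut()
cal[255] = 65531
softened = soften_lut(cal)
print(softened[255])  # 65534
```

Blending toward the identity ramp weakens the calibration uniformly, which matches the poster's observation that the result is usable but of lower quality than a calibration measured in the mode it is applied in.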
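For readers unfamiliar with the "7000+ CCT vs D6500" numbers: correlated color temperature is computed from the measured white point's chromaticity. A common approximation is McCamy's formula, sketched below; note this is a standard textbook formula, not Argyll's internal (more exact) computation.

```python
# McCamy's approximation for correlated color temperature (CCT) from
# CIE 1931 xy chromaticity. A bluer white point gives a higher CCT,
# which is the kind of drift the poster measured (7000+ K vs ~6500 K).

def mccamy_cct(x, y):
    """Approximate CCT in kelvin from xy chromaticity (McCamy, 1992)."""
    n = (x - 0.3320) / (y - 0.1858)
    return -449.0 * n**3 + 3525.0 * n**2 - 6823.3 * n + 5520.33

# The D65 white point (x=0.3127, y=0.3290) comes out close to 6500 K.
print(round(mccamy_cct(0.3127, 0.3290)))
```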