Assassin’s Creed Shadows is as dark as that infamous Game of Thrones episode
-
"Even playing in HDR..."
Maybe that's part of the problem? The HDR implementation on my Samsung sets is garbage; I have to disable it to watch anything. Too bad, too, because the picture is gorgeous without it.
HDR On:
HDR Off:
That's so weird, HDR is supposed to do the exact opposite of this.
Then again, Samsung... Don't buy Samsung anymore; it's been a trash brand for a long time now.
-
Yup, yup. Highly rated when I bought them, but in actual usage? Not so much.
-
I turn off HDR whenever I can. I think it looks bad
It’s one of those things where, in the case of a video game, it only looks good if the GAME’s implementation of it is good AND your console/PC’s implementation is good AND your TV/monitor’s implementation is good. But unless you’ve got semi-deep pockets, at least one of those probably isn’t good, so the whole thing is a wash.
-
I didn’t really understand the benefit of HDR until I got a monitor that actually supports it.
And I don’t mean one that can simply process the 10-bit color values; I mean one with a peak brightness of at least 1000 nits.
That’s how they trick you. They make cheap monitors that can process the HDR signal and so have an “HDR” mode, and your computer will output an HDR signal, but at best it’s not really different from the non-HDR mode because the monitor can’t physically produce a high dynamic range image.
If you actually want to see an HDR difference, you need to get something like a 1000-nit OLED monitor (note that “LED” often just refers to an LCD monitor with an LED backlight). Something like one of these: https://www.displayninja.com/best-oled-monitor/
These aren’t cheap. I don’t think I’ve seen one for less than maybe $700. That’s how much it costs unfortunately. I wouldn’t trust a monitor that claims to be HDR for $300.
See my "set 2" links above. (at the time) $3,200 8K television, "If you want the brightest image possible, use the default Dynamic Mode settings with Local Dimming set to ‘High’, as we were able to get 1666 nits in the 10% peak window test."
HDR still trash.
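For context on what those nit figures actually mean, here is a rough sketch (an illustration, not anything from the posts above) of the PQ transfer function that HDR10 signals use. The signal encodes absolute luminance up to 10,000 nits, and a display clips or tone-maps whatever exceeds its own peak, which is why a ~350-nit "HDR" panel throws away most of the range while a 1,000-nit panel does not:

```python
# Rough sketch: the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# A normalized 10-bit code value maps to an absolute luminance of up to
# 10,000 nits; this toy model assumes the panel simply clips what it can't
# reach (real displays tone-map instead of hard-clipping).

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Convert a normalized PQ code value (0..1) to luminance in nits."""
    e = code ** (1 / M2)
    return 10_000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def shown_on_panel(code: float, panel_peak_nits: float) -> float:
    """Toy model: the panel clips anything above its peak brightness."""
    return min(pq_to_nits(code), panel_peak_nits)

for code in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ {code:.2f}: signal {pq_to_nits(code):7.1f} nits | "
          f"350-nit panel {shown_on_panel(code, 350):6.1f} | "
          f"1000-nit panel {shown_on_panel(code, 1000):7.1f}")
```

The `shown_on_panel` helper and the panel numbers are made up for illustration; the point is only that everything above the panel's physical peak is lost, no matter what the signal says.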
-
My guess (and this is a complete guess) is that they (Ubisoft) just assumed everyone has OLED TVs/Monitors.
Also, you'd want to readjust the white balance or the gamma level/curve, probably not just the brightness.
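To make that distinction concrete, here is a minimal sketch (illustrative only, with made-up numbers) of the difference between nudging brightness and adjusting the gamma curve on normalized pixel values: a flat offset lifts the blacks and clips the top end, while a gamma tweak redistributes the midtones and leaves black and white anchored:

```python
# Minimal sketch (illustrative values only): a flat brightness offset versus a
# gamma adjustment, applied to normalized pixel values in [0, 1].

def brightness_offset(v: float, offset: float = 0.1) -> float:
    """Add a constant offset: lifts blacks and clips highlights."""
    return min(max(v + offset, 0.0), 1.0)

def gamma_adjust(v: float, gamma: float = 1.8) -> float:
    """Apply a gamma curve: brightens midtones while 0 and 1 stay fixed."""
    return v ** (1.0 / gamma)

for v in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"in {v:.2f} -> offset {brightness_offset(v):.2f}, gamma {gamma_adjust(v):.2f}")
```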
-
I didn’t really understand the benefit of HDR until I got a monitor that actually supports it.
And I don’t mean one that can simply process the 10-bit color values; I mean one with a peak brightness of at least 1000 nits.
That’s how they trick you. They make cheap monitors that can process the HDR signal and so have an “HDR” mode, and your computer will output an HDR signal, but at best it’s not really different from the non-HDR mode because the monitor can’t physically produce a high dynamic range image.
If you actually want to see an HDR difference, you need to get something like a 1000-nit OLED monitor (note that “LED” often just refers to an LCD monitor with an LED backlight). Something like one of these: https://www.displayninja.com/best-oled-monitor/
These aren’t cheap. I don’t think I’ve seen one for less than maybe $700. That’s how much it costs unfortunately. I wouldn’t trust a monitor that claims to be HDR for $300.
Nope, it does have a wide color gamut and high-ish brightness; I wouldn't have bought it unless reviews said it was OK. But it does some fuckery to the image that I can only imagine is meant to make non-HDR content pop on Windows, and it ends up messing up the image coming from KDE. I can set it up to look alright in either a light or a dark environment, but the problem is I can't quickly switch between the two without fiddling with all the settings again.
Compared to my Cooler Master, a grayscale gradient on it has a much sharper transition from crushed bright to gray, but then it also gets darker much more slowly, to the point where, unless a color is black, it appears darker on the CM despite that one having an IPS screen. The same gray also shows up as huge, very noticeable red, green, and blue bands on it, again unlike the CM, which also has banding but at least keeps the tones of gray similar.
Also unrelated, but I just noticed while testing the monitors that KDE's max SDR brightness slider seems to have changed again: HDR content gets darker over the last 200 nits while SDR gets brighter. Does anyone know anything about that? I don't think that's how it's supposed to work.
3-month edit: I might've been wrong about this. At the time I had both monitors connected to the motherboard (AMD iGPU) since the NVIDIA driver had washed-out colors. Since the Cooler Master worked, I assumed the AMD drivers were fine. But a while back I ended up plugging both into the NVIDIA GPU and discovered that not only were the NVIDIA drivers fixed, but with them the Samsung didn't have the weird brightness issue either.
Edit edit: Even though the brightness is more manageable, it's still fucked. I've calibrated it with KDE's new screen calibration tool, and according to that the brightness tops out at 250 nits. However, it's advertised and benchmarked to go up to 600, I've measured roughly 800 using my phone's sensor, and it looks much brighter than a 200-nit SDR monitor. That makes me think that even though it's receiving an HDR signal, it doesn't trust the signal to actually be HDR and maps the SDR range to its full range instead, causing all kinds of image issues when the signal actually is HDR.
And just to make sure it's not a Linux issue, I've tried it with Windows 10 too. With the AMD GPU, HDR immediately disables itself when you enable it, and with the NVIDIA GPU, enabling HDR turns off all screens, including ones not connected to that monitor, and they don't come back until you unplug the monitor and reboot. The Cooler Master just works.
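As a rough illustration of that suspicion (a conceptual sketch only, not KDE's or any compositor's actual implementation), putting SDR content on an HDR screen conceptually means linearizing the sRGB value, scaling it by the configured SDR/reference-white brightness in nits, and then PQ-encoding it. The display is expected to treat those PQ values as absolute luminance; a panel that instead stretches the incoming signal across its full native range will look wrong in exactly the way described above:

```python
# Conceptual sketch only (not KDE's actual code): mapping an 8-bit sRGB (SDR)
# pixel into a PQ-encoded HDR signal using an "SDR brightness" setting.
# The 200-nit slider value and the function names are placeholders.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def srgb_to_linear(v8: int) -> float:
    """sRGB EOTF: 8-bit code value -> linear light in [0, 1]."""
    c = v8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def nits_to_pq(nits: float) -> float:
    """Inverse PQ (ST 2084): absolute luminance in nits -> code value (0..1)."""
    y = (nits / 10_000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def sdr_pixel_to_hdr_code(v8: int, sdr_brightness_nits: float = 200.0) -> float:
    """SDR pixel -> PQ code value, with SDR white pinned to the slider's nits."""
    return nits_to_pq(srgb_to_linear(v8) * sdr_brightness_nits)

# SDR white (255) lands at exactly the configured 200 nits (PQ code ~0.58);
# everything above that PQ level is reserved for genuine HDR highlights.
print(sdr_pixel_to_hdr_code(255))  # ~0.58
print(sdr_pixel_to_hdr_code(128))  # mid-gray, ~0.43
```

A display that honors the PQ signal shows SDR white at roughly the slider's brightness; one that remaps the whole signal to its own peak would behave like the monitor described above.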
-
Yeesh, sounds like your monitor’s color output is badly calibrated :/. Fixing that requires an OS-level calibration tool. I’ve only ever done this on macOS, so I’m not sure where it is on Windows or Linux.
Also, in general I wouldn’t use the non-HDR-to-HDR conversion features. Most of them aren’t very good. And a lot of Linux distros don’t have HDR support (at least the one I’m using doesn’t).
-
See my "set 2" links above. (at the time) $3,200 8K television, "If you want the brightest image possible, use the default Dynamic Mode settings with Local Dimming set to ‘High’, as we were able to get 1666 nits in the 10% peak window test."
HDR still trash.
8K TVs are all LCD, and $3,200 is on the low end for 8K TVs. So yeah, of course you’d get a trash image.
-
It’s one of those things where, in the case of a video game, it only looks good if the GAME’s implementation of it is good AND your console/PC’s implementation is good AND your TV/monitor’s implementation is good. But unless you’ve got semi-deep pockets, at least one of those probably isn’t good, so the whole thing is a wash.
Yeah, it's very believable that the tech is finicky and it's very easy for it to look bad.
-