The advent of HDR has been a game changer for image quality on televisions. Console gamers, in particular, have enjoyed incredibly vibrant, high-contrast imagery that goes a long way toward making up for a lack of overall GPU power. With the right TV, games (and movies) can be jaw-dropping.
When it comes to HDR on the desktop using a PC monitor, things have been…less great. There are plenty of reasons for this, which I’ll cover below, but the good news is that getting good HDR using a PC and a computer monitor (or TV) is possible if you put a little work in.
The promise vs. the reality of PC HDR
I’m not going to spend much time on what HDR is; you can read my HDR explainer for that. The most important thing to know is that HDR content expands the range of contrast, color, and brightness compared to SDR, the standard that’s been in use since CRT TVs were a thing.
Lush blacks and eye-searing highlights on things like flames, sunrises, and laser blasts are hallmarks of HDR. The problem is that most PC monitors—even gaming-specific models—don’t have the necessary brightness or contrast to do HDR justice. While they might have an HDR sticker and can go beyond SDR to some extent, the actual result is disappointing. On some monitors, SDR mode simply looks better, and that’s not a good thing at all.
If an LCD monitor doesn’t have a direct backlight with local dimming zones, a peak brightness of at least 400 nits, and support for at least 8-bit color with dithering, then its ability to accept an HDR signal doesn’t mean much. To be clear, these are bare-minimum requirements and far from ideal. But many “HDR” monitors don’t even meet them.
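If it helps to think of it programmatically, that bare-minimum bar can be sketched as a simple check. The function name and parameters here are hypothetical; the thresholds just mirror the ones above:

```python
# Hypothetical helper mirroring the bare-minimum HDR bar described above:
# local dimming (for LCDs), >= 400 nits peak, and >= 8-bit color with dithering.
def meets_hdr_floor(peak_nits: int, bit_depth: int,
                    has_local_dimming: bool, is_oled: bool = False) -> bool:
    # OLED panels control light per pixel, so dimming zones don't apply to them.
    contrast_ok = is_oled or has_local_dimming
    return contrast_ok and peak_nits >= 400 and bit_depth >= 8

# A typical edge-lit monitor with an "HDR" badge fails the check:
print(meets_hdr_floor(peak_nits=350, bit_depth=8, has_local_dimming=False))  # False
```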
Windows isn’t helping—but it’s not the only culprit
Windows gets a lot of flak for bad HDR—and it deserves much of it! Desktop HDR mode still looks terrible thanks to how Windows attempts HDR tone-mapping, which makes it a chore to keep switching modes when you want to watch HDR content or play an HDR game. Microsoft made things a little easier by introducing an HDR keyboard shortcut in Windows 11. However, this only papers over the fact that HDR auto-switching is unreliable: if a game doesn’t trigger it correctly, you have to close the game, toggle HDR on manually, and try again.
However, we can’t lay all the blame on Windows. The fact is that many monitors don’t follow the HDR Electro-Optical Transfer Function (EOTF) correctly. This describes how a display should convert a digital signal into actual light, and if a monitor is bad at this, it doesn’t really matter whether your software is doing its job correctly. So two displays with the same HDR badge and claimed specs can have very different results—one of which just looks wrong.
If the display can’t hit its luminance targets or follow the proper luminance curve, Windows can’t magically fix it.
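For the curious, the EOTF most HDR content relies on is the SMPTE ST 2084 “PQ” curve, which is a fixed mathematical function. A minimal Python sketch (constants come from the ST 2084 standard) shows why accuracy matters: any deviation from this curve translates directly into wrong on-screen luminance:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: maps a normalized signal value in [0, 1]
    to absolute luminance in nits (cd/m^2), up to a ceiling of 10,000 nits."""
    m1 = 1305 / 8192   # 0.1593017578125
    m2 = 2523 / 32     # 78.84375
    c1 = 3424 / 4096   # 0.8359375
    c2 = 2413 / 128    # 18.8515625
    c3 = 2392 / 128    # 18.6875
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# A full-scale signal maps to the PQ ceiling of 10,000 nits:
print(round(pq_eotf(1.0)))  # 10000
```

A display that follows this curve shows, say, a mid-gray pixel at exactly the luminance the content creator intended; one that doesn’t will crush shadows or blow out highlights no matter what Windows does.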
Not all “HDR” labels are equal
A huge source of confusion comes from VESA’s DisplayHDR tiers. These labels have allowed monitors that have no business being anywhere near the “HDR” logo to bear it anyway. It’s a little complicated, but here’s the most important stuff to know:
- DisplayHDR 400: This is not real HDR. The label doesn’t require local dimming, and without that contrast, a 400-nit peak is just too dim.
- DisplayHDR True Black 400: This is where HDR starts on desktop monitors. You need an OLED to achieve the contrast ratio standard for this badge, in which case 400 nits is sufficient to reproduce the HDR range. You just won’t get much peak brightness. If you play in a dark room, that issue isn’t a huge deal.
- DisplayHDR 600/1000: For a lot of people (myself included) this is where HDR actually starts, and 1000 is what most HDR content is mastered to at the moment.
So if you want real HDR, you need DisplayHDR True Black 400 or DisplayHDR 600 at minimum.
Your GPU and cable matter more than you think
So you have a proper HDR monitor—but if you’re using HDMI 1.4 or an older DisplayPort standard, there might not be enough bandwidth to do it right, forcing workarounds that can degrade color and clarity.
HDMI 2.1 or DisplayPort 1.4 are where you should be at minimum if you want to give your monitor what it needs to do the job right. Of course, your GPU also needs to support those standards. This is mainly an issue when using your PC with a TV: there are plenty of perfectly good graphics cards out there that lack HDMI 2.1, and the vast majority of TVs don’t support DisplayPort, which means the older HDMI standard on these cards can bottleneck HDR quality.
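Back-of-the-envelope math shows why the link matters. This rough sketch ignores blanking intervals and encoding overhead (so real-world requirements are even higher), but it compares a 4K 120Hz 10-bit HDR signal against the nominal link rates of HDMI 2.0 (18 Gbit/s) and HDMI 2.1 (48 Gbit/s):

```python
def raw_video_gbps(width: int, height: int, refresh_hz: int,
                   bits_per_pixel: int) -> float:
    """Raw active-pixel data rate in Gbit/s. Ignores blanking intervals
    and encoding overhead, so real-world requirements are higher."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 120 Hz with 10-bit-per-channel color (30 bits per pixel):
need = raw_video_gbps(3840, 2160, 120, 30)
print(f"{need:.1f} Gbit/s")  # 29.9 Gbit/s
print(need > 18)             # True: more than HDMI 2.0's 18 Gbit/s nominal link
print(need < 48)             # True: fits within HDMI 2.1's 48 Gbit/s nominal link
```

Once overhead is added on top of that ~30 Gbit/s of pixel data, an older link has to fall back on compromises like chroma subsampling or lower refresh rates—exactly the degraded color and clarity mentioned above.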
Games make it even messier
One of the big reasons HDR works so well on consoles is that these machines use fixed hardware and software. Many developers seem to have a devil of a time figuring out HDR on a PC. Essentially, you’re going to spend time tweaking each game individually to get HDR that looks good. In some games, HDR calibration is so bad that you might be better off using Windows’ Auto-HDR feature (cribbed from the Xbox) to tone map the SDR version of the game to HDR, because the result can actually look better in those worst-case scenarios.
How to actually fix HDR on your PC
That’s a lot of explaining what the problems are, but how do we fix it? First, accept that you’re not going to get great HDR in every game, but assuming you’re willing to give it the old college try, here’s what to do:
- Start with a good HDR monitor with a DisplayHDR 600+ or True Black 400 badge or better.
- Update your GPU drivers, then set the GPU’s output to 10-bit color. Even if your monitor doesn’t have a true 10-bit panel, it will dither the signal down.
- Enable HDR and then calibrate HDR in Windows 11.
- Make sure you’re using an HDMI 2.0b/2.1 or DP 1.4 cable at minimum, and your card outputs at that standard.
- Calibrate each game individually using its internal tool.
If you really want to go all out, you can get your monitor professionally calibrated, but this is optional: most people will be happy with the monitor’s out-of-the-box calibration, especially if it’s a good-quality model.