@@joaquinarchivaldoguzmanloe1073 check the specs and just make sure it's at least VESA DisplayHDR 1000. Preferably full-array local dimming as well, but if it's not, it should still be decent. You need that brightness.
@@joaquinarchivaldoguzmanloe1073 ok so it can hit 1000 nits but it only has 24 dimming zones. The size and the fact it's 4K 160 Hz is super nice. If you've never tried HDR on PC, this one is actually probably decent. Especially for $1,000.
It seems like a TV might be the way to go for PC HDR... I use a 65-inch LG CX, which looks amazing, and a 50-inch Samsung QN90A, which will do 144 Hz and also looks amazing. Seems like others are attesting to the same with other LG OLED models.
Went with a 48-inch LG OLED as a monitor years ago. So glad I did. I can also use 3840 x 1600 if I want to (with true black bars on top/bottom). But since I have a large desk, I haven't needed to.
HDR makes my PS4 look beautiful, but I can't get it to look good for the life of me on my PC. The darks are always too dark no matter what I do. It's really frustrating.
The Samsung G7 4K 165 Hz mini-LED has very beautiful HDR, but when I start some games on PC (not all), I have to launch in windowed mode and then switch to fullscreen or the game crashes. G-Sync doesn't work either; it makes a black screen.
idk, for me it looks amazing, way better, but there are some annoying bugs: right now when I have a game open and I'm watching YouTube, it keeps switching between HDR and SDR, and each time it switches the screen turns off then on
Samsung mentioned that they should be coming out with HDR10+ gaming monitors in 2022. So you'd think they'd be out by now, but I can't find them myself.
As far as I know, nothing. If the monitor supports all those features, you should be able to use them all at once. Some otherwise high-spec monitors are only HDR400 or 600 because they'd probably have to make cutbacks elsewhere for HDR1000 or above to work.
A year went by and HDR on PC still sucks. EDID data interpretation has A LONG WAY to go. I purchased the most expensive ASUS 4K 240 Hz monitor and it kinda sucks for HDR. It just doesn't get bright when it needs to and constantly reverts to the wrong EDID data of a 450-nit SDR max luminance when it should be at 1,100 nits max instead, thus making scenes very dark when they're not supposed to be.
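For context on what's being misread here: a display advertises its HDR luminance limits in the CTA-861.3 static metadata block of its EDID, where max luminance is stored as a single code value. A minimal sketch of that decoding (the specific code values below are illustrative, not taken from any particular monitor):

```python
# How CTA-861.3 HDR static metadata encodes "Desired Content Max Luminance":
# a single byte CV, decoded as 50 * 2^(CV/32) cd/m2. A driver reading a stale
# or wrong value lands on a very different peak than the panel can do.

def decode_max_luminance(cv: int) -> float:
    """Decode a CTA-861.3 luminance code value (0-255) to cd/m2 (nits)."""
    return 50.0 * 2.0 ** (cv / 32.0)

for cv in (96, 112, 128, 144):
    print(f"CV={cv:3d} -> {decode_max_luminance(cv):7.1f} nits")
# CV=96 -> 400 nits, CV=128 -> 800 nits, CV=144 -> ~1131 nits: a one-byte
# mixup is the difference between a ~450-nit and an ~1,100-nit ceiling.
```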
The bit about brightness not being important is total bollocks, since our eyes are accustomed to perceiving high energies up to hundreds of thousands of nits in real life. Fire, harsh glints on metallic surfaces, water, what have you, and other energy sources require tons of nits to be represented in a physically correct, realistic and convincing way. Localized peak brightness is just as essential as the encoded amount of f-stops/pixel brightness states, contrast and increased color depth and gamut.
In some games it looks awful. Days Gone, for example: everything is washed out and bleached all around. In others it looks great. AC: Odyssey, for example, looks a lot better when I play with Win10 HDR enabled; same goes for Metro Exodus EE & Cyberpunk 2077 with sRGB10. It depends on each game's HDR setup, it seems, without a standard ''middle''. Just play with the game's settings and the monitor's brightness/contrast/saturation.
To me HDR is snake oil. Maybe I'm too poor to get the TV that would contradict my findings, maybe I'm just wired differently, but I see no difference in any content with it on vs off. Definitely not worth, say, $200 more.
I've seen HDR 4K monitors for $1,300. That's more than any VR setup. You could get a high end gaming PC for that price that does ray tracing with VR support.
Cos HDR gaming is a gimmick and PC already has the best picture quality and frame rate. HDR is a disaster of mastering and mixing with no real standards, cos TV manufacturers all have their own way of interpreting it; just look at shadow detail between Sony and Samsung.
I'm a game developer dealing with HDR, and I do HDR video mastering on the side as well. The lack of dynamic metadata isn't really an issue at all, or at least compared to other issues it's so insignificant it isn't worth mentioning. You said good luck covering every lighting situation by setting the lower and upper brightness limits at the start; I've just never had any issues related to this whatsoever. The real problem for game developers is not being able to afford $30,000 for an HDR mastering monitor. Most devs don't have the budget of Rockstar or Activision, so most developers will test their HDR implementation on something not designed for it. One of the actual issues on PC that doesn't get enough attention is that if Windows is running in HDR, it makes SDR content very slightly inaccurate in 10-bit mode, and only works perfectly in 12-bit mode, which some HDR monitors don't support (this is only relevant to those working with graphics). HDR should be something that's enabled by default, automatic, and just works.
Edit: I figured out what the artifacts are and made a video explaining them; it's titled "This is why displaying SDR in HDR causes quantization artifacts in grayscale gradients"
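To make the 10-bit vs 12-bit point above concrete: when the desktop composites SDR into an HDR (PQ/ST 2084) signal, the 8-bit SDR gray steps get re-quantized onto the PQ code grid, and the rounding jitter is what shows up as banding. A rough sketch, assuming SDR white at 200 nits and a plain 2.2 gamma (Windows' actual sRGB-in-HDR mapping differs in detail):

```python
import numpy as np

# SMPTE ST 2084 (PQ) inverse-EOTF constants.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance (cd/m2) to a PQ signal value in [0, 1]."""
    y = np.clip(np.asarray(nits, float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# 8-bit SDR gray ramp shown at a 200-nit reference white via 2.2 gamma
# (an assumption; the real desktop compositor curve differs in detail).
nits = 200.0 * (np.arange(1, 256) / 255.0) ** 2.2

for bits in (10, 12):
    codes = np.round(pq_encode(nits) * (2**bits - 1))
    steps = np.diff(codes)          # PQ codes consumed per SDR gray step
    print(f"{bits}-bit PQ: gray-ramp steps span "
          f"{int(steps.min())}..{int(steps.max())} codes")
# At 10 bits, rounding makes nominally uniform gray steps land unevenly on
# the PQ grid (near white some adjacent grays even collapse onto the same
# code while others jump by one), so the reconstructed ramp is no longer
# even -- visible banding. At 12 bits the grid is 4x finer and the jitter
# drops below the visibility threshold.
```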
I'm not a game developer, but I've been gaming in HDR since 2017. My main problem since moving to OLED is the crushed/grey blacks in certain titles, which is mainly a problem with OLED. Having a $3000 accurate monitor to develop games won't really help anyone, because most displays aren't color accurate and have low peak brightness. In my experience Dolby Vision games look much better on the PC because of the dynamic metadata. Developers should consistently add brightness sliders in their games so end users can calibrate both the bright and dark portions of the game, like Cyberpunk 2077 does.
@@2Drip007 crushed gray blacks? You must have an LG OLED.
Exactly, having to switch back and forth between HDR and SDR manually between Windows and games is very annoying. To the point that I sometimes just ignore HDR.
Has it improved at all in Windows 11? I imagine that providing SDR tone mapping for thousands of monitors is a bit of a challenge, but if that info is provided in the monitor EDID, wouldn't it be feasible to do some sort of offset-based adjustment? I'm a noob to this topic; very interesting insight from a dev, thanks for sharing!
Red Dead Redemption 2 is a good example that even developers with a huge budget can f* it up really bad.
I do run Windows 10 with HDR on my A90J and since it's just a gaming PC it's fine. I would like to get a good HDR monitor but until recently there wasn't a single good one out there.
VESA HDR 400 is just nonsense. My current monitor can get brighter ... but doesn't support HDR. So VESA HDR 600 is the bare minimum.
ASUS ROG Swift OLED PG42UQ and PG48UQ it is!
HDR on Windows 11 on my LG C2 has been amazing. Best upgrade I have ever made.
fr i use it on my c1 and in most games it's implemented in a very nice way, but certain games (ie battlefield 1 and 5) are very poorly implemented. i think it's a lot more down to the game devs, not the technology or windows. this video is super misleading and uninformed, basically just blabbing on and on about hdr standards which anyone with half a brain already knows about. the fact that he didn't even bring up the lg oleds for pc gaming just shows how little he knows about it. like no shit, if you have a bad monitor it's not gonna look good in hdr either...
I game on console, but I just picked up an LG C2 for my first ever OLED.
Honestly HDR is almost laughable on my older display. And it was a pretty good display already.
This C2 is absolute perfection though. Gorgeous HDR, now with a large amount of great HDR games on the consoles.
I'm still a bit apprehensive about image retention on OLED. But I figured I'll give one a go and see how it does.
Now I'm probably not going to be able to move away from OLED. It's just so good.
And the LG can do 4K 120 Hz in Dolby Vision with VRR/FreeSync with extremely low input lag. I'm in freaking heaven. Best image quality I have ever seen.
@@sawdust8691 trust me when I say this, OLED burn-in is not an issue on C2s. I run mine as a monitor for my PC. I use it for work all day and game for 3 hours in the evening every day. When I work it is all static images, and there's not a hint of burn-in after owning it for 6 months. I also have a C1 in my living room and no issues either.
@_numbing sweet. I wonder how long they'll last.
I mainly game on mine, usually 2 to 3 hours a day. Sometimes more on weekends. The TV is off the rest of the time, so I hope it lasts me a good long while.
Yeah, if you want HDR on PC, the only reasonable option is an OLED TV. The smallest one you can get...
Gaming with HDR on PC has been excellent for me.
What monitor are you using? Been shopping for a new one
@@DiazKnutzBenz möbius
I finally got my Alienware QD OLED and it’s amazing. First games I tried were Doom Eternal and Tomb Raider and my goodness…you won’t ever go back to IPS after 🤤
It's an amazing monitor, but it looks so good on its own that HDR doesn't really improve much.
Agree, I have the same screen and there's a clear difference between HDR on and off in games; games look stunning on it.
@@Wft-bu5zc I agree, I find SDR looks great vs HDR
@@Wft-bu5zc It does for me. Alienware here as well. HDR on is leagues better on RDR2 than having it off.
How much did it cost you?
Nice Kemper you got there lol. But yeah, the biggest issue I stumbled into on this HDR journey is that in Windows it is a universal toggle. If you turn it on you will destroy the colors of any SDR content, regardless of where you set the "SDR content brightness" slider, because it's no longer native SDR. This means watching 99% of YouTube videos with messed-up colors, because HDR content just so happens to require a very beefy system to edit.
Another problem related to this is that whenever a YouTube creator masters content for HDR, they are simultaneously degrading the quality of the video for anyone watching in SDR, since it uses the same kind of filter to convert it back to SDR. Combined with the fact that HDR has so many different standards, as you mentioned, the whole thing now feels utterly pointless, since one of the selling points of moving away from SDR was to have a single standard video makers and game developers could master for. I mean, what's even the point if HDR looks worse than SDR half the time and by turning it on you make anything SDR look bad?
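On the automatic HDR-to-SDR downconversion mentioned above: YouTube's actual filter isn't public, but the basic cost is visible in any one-size-fits-all tone curve. A minimal sketch using an extended-Reinhard operator (purely illustrative; the parameter choices are assumptions):

```python
import numpy as np

# Illustrative HDR->SDR downconversion (extended Reinhard). Not YouTube's
# actual filter -- just a sketch of why a single fixed curve flattens an
# HDR master for SDR viewers.

def tonemap_to_sdr(hdr_nits, sdr_white=100.0, hdr_peak=1000.0):
    x = np.asarray(hdr_nits) / sdr_white      # luminance relative to SDR white
    w = hdr_peak / sdr_white
    mapped = x * (1 + x / w**2) / (1 + x)     # reaches 1.0 exactly at hdr_peak
    return np.clip(mapped, 0.0, 1.0)

highlights = np.array([100.0, 400.0, 1000.0])   # nits in the HDR master
print(tonemap_to_sdr(highlights))               # ~[0.51, 0.83, 1.00]
# A 10x spread of highlight luminance gets squeezed into the top half of the
# SDR range -- the flat, washed-out look people complain about.
```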
I'm actually loving HDR on my Alienware AW3423DW QD-OLED. I think it adds more to the image of a game than any other modern tech, specifically ray tracing. HDR in games like Halo Infinite, Doom Eternal, and even Auto HDR in Spider-Man Remastered is incredible looking. Doom Eternal looks like a totally different game with HDR set correctly and the hardware to drive it. As a side note, I'm also running Windows 11 22H2.
Do you have any good source for proper HDR settings?
He's talking rubbish. For example, Spider-Man Remastered on PC doesn't need Auto HDR (SDR-to-HDR conversion); it has proper HDR built in, at least on PS5. I own both versions, and HDR on PC made me play the game on PS5 instead, despite my 4090, as the PC version has incredibly washed-out colors and a grey appearance. Games that have decent HDR are mostly Microsoft titles: Forza Horizon, Sea of Thieves, Gears of War, and a few more that are not totally awful. Red Dead Redemption 2 looks ridiculous apart from peak brightness, for another horrible example.
Damn you must have a really nice setup! What gpu & cpu you running in your build?
@@Shaggii_ I'm running an overclocked RTX 4090 with a 13900KS @ 5.7 GHz all-core OC and 7800 MT/s DDR5 manually tuned RAM.
@@Unobserved07 Nice setup! does the ddr5 ram really make a difference in fps on games?
Dolby Vision should be added to Windows 11, using the same codec as the Xbox Series S|X. Auto HDR works really well on some games, but developers should be encouraged to add native HDR to PC titles as well. I think it could get better. Spatial Audio support on Windows 11 is lackluster as well. Ultimately, though, we need DisplayPort 2.0 to actually launch. That will bring HDR into the light for PC, because high-res HDR is currently only realized on HDMI 2.1 hardware, and lots of monitors are only just getting HDMI 2.1, so the port bandwidth is crucial. DisplayPort 2.0 should be able to do 16K HDR.
I got the HDR option in Windows 11 now.
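The bandwidth point two comments up is easy to sanity-check with back-of-envelope numbers. A rough sketch (the ~10% blanking overhead and the effective link rates are approximations, and real links can fall back to DSC or chroma subsampling):

```python
# Back-of-envelope link budget for uncompressed RGB video. Effective rates:
# HDMI 2.0 is 18 Gbit/s raw with 8b/10b coding, HDMI 2.1 FRL is 48 Gbit/s
# with 16b/18b, DP 2.0 UHBR20 is 80 Gbit/s with 128b/132b.

def video_gbps(w, h, hz, bits_per_channel, blanking=1.10):
    """Approximate Gbit/s needed, assuming ~10% blanking overhead."""
    return w * h * hz * 3 * bits_per_channel * blanking / 1e9

links = {
    "HDMI 2.0 (~14.4 Gbit/s)": 14.4,
    "HDMI 2.1 FRL (~42.7 Gbit/s)": 42.7,
    "DP 2.0 UHBR20 (~77.4 Gbit/s)": 77.4,
}
need = video_gbps(3840, 2160, 120, 10)   # 4K 120 Hz, 10-bit HDR
print(f"4K120 10-bit RGB needs ~{need:.1f} Gbit/s uncompressed")
for name, cap in links.items():
    verdict = "fits" if cap >= need else "needs DSC or chroma subsampling"
    print(f"  {name}: {verdict}")
```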
Once I got the LG CX as a monitor, HDR issues were barely a thing. They're not completely gone, though. Some games look terrible with HDR on, and usually there is no way to disable it in the game itself, so you have to turn it off in Windows just to have it off in that one game. Then you must remember to turn it back on in Windows once you're done with that game.
I suppose those are games where Auto HDR works? If you are on Windows 11 that's a thing, and the new HDR Calibration app would fix this issue if a game runs on Auto HDR, which uses the system's data rather than in-game settings.
All games look better in HDR on consoles. That is why the video states it's bad on PC.
Compare once with a PlayStation 5 attached to the TV and your jaw will drop. It's a vastly different experience; colors REALLY pop and it's so much better.
@@007GoldenLion Not true. When I turned on HDR for every game, games that weren't actually HDR looked weird; colors were off and not as they were intended. Luckily you can have HDR on only for supported games.
@@MarshallZPie that's the point: HDR on games that support it.
I think it's really weird that HDR10+ hasn't taken off in the PC (and mobile) gaming space, because it's an open standard like HDR10 and doesn't have a royalty fee, while providing most of the benefits Dolby Vision offers over HDR10. Heck, HDR10+ is even the default for dynamic metadata in the HDMI 2.1 spec, so what's holding it back?
I am really surprised at that as well
HDR10+ does have a license fee, it's a lot cheaper than Dolby Vision though
Yeah, definitely shocking. I wish it would take off. :(
hdr monitors are expensive
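For anyone wondering what the dynamic metadata discussed above actually buys you: with static HDR10, the display tone-maps the whole title against one MaxCLL; per-scene metadata lets it tone-map each scene against that scene's own peak. A schematic sketch (not the real HDR10+ bitstream syntax, and the simple linear compression is an assumption):

```python
# Schematic only: why per-scene (dynamic) metadata beats one static MaxCLL.

def display_tonemap(pixel_nits, assumed_peak, panel_peak=600.0):
    """Naive display-side compression of the assumed range into the panel."""
    return pixel_nits * min(1.0, panel_peak / assumed_peak)

scenes = [("dim interior", 180.0), ("sunlit exterior", 1400.0)]
static_maxcll = 1400.0          # static HDR10: one value covers both scenes
face = 150.0                    # a 150-nit face appearing in each scene

for name, scene_peak in scenes:
    static_out = display_tonemap(face, static_maxcll)
    dynamic_out = display_tonemap(face, scene_peak)
    print(f"{name}: static {static_out:.0f} nits, dynamic {dynamic_out:.0f} nits")
# With static metadata the dim scene is darkened as if it contained 1400-nit
# highlights (the face drops to ~64 nits); per-scene metadata leaves it alone
# and only compresses the scene that actually exceeds the panel's 600 nits.
```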
Nice video. The monitor situation is rapidly improving as of late 2022. The video points out a few standouts, with many more on the way. I personally went the TV route and play PC games on a 42" Sony A90K. I think games will be the main issue moving forward.
That's a huge screen to be gaming on, but I imagine you're not playing competitive games.
As someone who recently made the switch from console to PC, I was amazed at how much of a pain in the ass HDR is on PC.
I'd stay away from the Sony Inzone M9. I had it and sent it back. The picture quality was great for what it was, but it has some serious firmware issues that affect the HDR handshake making it practically unusable. Sony appears to be aware of the firmware issues and only states that they don't know if and when a fix will be available.
0:20 that sync with the music was crazy (if intentional)
i love details like that
HDR = specular highlights and shadow detail on a broader color spectrum. People think 1000 nits is gonna give you better HDR, and that's only true for TVs 50" or bigger, since you're gonna be sitting further away from those TVs than from a 32-inch monitor. I think HDR 400/600 32-inch monitors can produce great HDR, given the HDR is coming from a good source.
No, HDR400/600 is trash. You need at least 1,000 nits.
@@NexGenTek you're not a professional. I am.
@@ToxMod You're obviously not 🚫
@@NexGenTek oh, and you are? Saying you absolutely need 1000 nits? You know they've been making OLEDs for about four years now, maybe longer, with peak brightness of 900 nits. Hollywood masters their HDR movies at 900 nits. The new quantum OLEDs after professional calibration reach about 950 nits on average. Don't say I'm not a professional, because I just killed your comment with receipts.
@@ToxMod What? The G2 and S95B hit over 1,000 nits after calibration, and while OLED is a completely different technology than what most people use as monitors, the contrast is outstanding for HDR. But there's a standard for HDR and it's 1,000 nits for HDR10. Also, my S95B can achieve 1500 nits without calibration. So no, you killed nothing but your own accusations.
HDR 400 shouldn't even be allowed to be called HDR; there's barely any difference. 600 should be the bare minimum.
You guys should check out the Asus ROG Swift PG42/48UQ, it's pretty amazing for a monitor (well, it's basically a "gamerfied" LG C2).
Games definitely need better min/max brightness calibration sliders, especially for OLEDs with 0-nit blacks. Or they should honor the Windows global settings from the new Windows HDR Calibration app.
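On what those two sliders typically feed into: many games remap scene luminance into the user-bracketed [min, max] window before encoding the output. A hedged sketch of one such approach, lerping in PQ space so shadows aren't crushed linearly (details vary a lot per engine; this is not any specific game's implementation):

```python
# Sketch of a two-slider HDR calibration: the user brackets the display's
# real black level and peak, and the game compresses scene luminance into
# that window in PQ (perceptual) space. Engine-specific details will differ.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq(nits: float) -> float:
    """SMPTE ST 2084 inverse EOTF: luminance in cd/m2 -> signal in [0, 1]."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

def calibrated_signal(scene_nits: float, user_min: float, user_max: float,
                      scene_max: float = 10000.0) -> float:
    lo, hi = pq(user_min), pq(user_max)
    t = pq(scene_nits) / pq(scene_max)     # position within the scene range
    return lo + t * (hi - lo)              # PQ value actually sent out

# A 0-800 nit calibration keeps a 4000-nit explosion from clipping flat:
print(calibrated_signal(4000.0, user_min=0.0, user_max=800.0))  # ~0.65
```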
On a QD OLED in Windows 11 and it’s not that bad. HDR looks great in games that support it. Auto HDR looks good too. If you have an LCD display, then yeah it sucks.
For now only OLED can give you a real HDR experience for a decent price, LCD is still a long way from providing an affordable monitor with 5k dimming zones minimum to have a comparable experience.
@@MadOrange644 5,000 dimming zones is STILL a garbage JOKE for any modern display which has MILLIONS of pixels.
😂😂😂
That's why OLED wipes the floor with LCD.
Is there a way to duplicate your screen in HDR on Windows 10? When I duplicate the screen from my main PC to send the image to my recording PC's capture card (4K60 MK2 & OBS), HDR becomes disabled on my main PC; the toggle completely disappears.
Have you tried the Windows HDR Calibration tool that was released on Windows 11? It worked really well on my Neo G8.
I did this and my monitor looks amazing. Gaming, though, is odd in various cases: Cyberpunk looks very weird to me, overbright. As does Apex Legends; it just looks over-colored.
Thank you for suggesting this bud.
So where should I enable HDR on my Odyssey G7: in Windows? In a game? Or in Windows and in a game?
I have the Alienware QD OLED. The Auto HDR on Windows is fine for the most part. Not great, but not terrible. However, on games that support HDR and have a good implementation, the presentation is night and day when compared to SDR. Playing FF7 Remake, GoW, or Horizon Zero Dawn with HDR is frakking amazing. The effect, for me at least, is more impressive than ray tracing on other titles, like RE2/3, Control, etc. I really hope that ~32" 2K and 4K OLED/QD-OLEDs come down in price so more people can see that difference.
That I agree on. Proper HDR is way better than actual ray tracing, with no added performance cost. But with both it obviously looks amazing.
One of the reasons I get some games on my PS5 despite having a fairly powerful PC is because it does HDR so much better than PC. Also literally plug and play with no need to make any weird changes on Windows.
When something new arrives on the technology scene, we usually think it will make everything better with the flip of a switch. Like RTX and HDR. But everything comes down to artistic touch and fine-tuning. The best example I can give is that Crysis Remastered might have RTX, but the original version's first sunrise with its orange hue looks so much better.
I've struggled to get the HDR on my Samsung Odyssey to look good. After much effort calibrating, I still can't get a good picture without a massive color washout.
To me, the juice isn't worth the squeeze yet and people are simply defending their purchases at times.
Have you used the Windows HDR tool from the Microsoft Store? Apparently that helps a lot!
My problem with HDR is that the main monitor I use for work and my Sony TV are connected to the same card (I mean, why wouldn't they be?). This is an issue, as Windows will not let me turn on HDR unless all connected monitors support it. I get why they do it to a degree, but it's frustrating, and it can only be fixed by disabling my monitor when I game on my TV.
You get "why they do it"? What do you mean... "they" don't do it... it is literally how the software and hardware work, not some choice made for no reason.
I don't have that problem at all. My main monitor is an older Dell 30" connected via DP, the TV is a TCL 6-series via HDMI, all connected to a 3080 Ti. Go into display settings, click on monitor 2, enable "Use HDR". It works fine with both enabled.
Works on my laptop with the built in display enabled (no HDR) + an external large monitor which has HDR support. Though there are a few issues as Windows HDR support is woeful. I've not tried it with multiple displays on my main PC.
That's only a problem if you're duplicating displays. You can have hdr on individual monitors if they're extended.
Just thought I'd add that it works for me too. I have two monitors connected to my Windows 11 PC - one is HDR enabled and one is much older and probably can't even spell HDR. However, I am still able to turn HDR on and off in Windows when I need to and it works fine for me. Maybe it's like someone else mentioned - that is only a problem if you are duplicating displays and not extending your desktop? Or, maybe it has to do with using a TV as a monitor? Not sure, but it does work.
I have a 4090 graphics card in my desktop and I'm using the Alienware 27-inch 1440p monitor. I just realized it had HDR and I've been trying it out. I don't see any problems with it, at least in Call of Duty; it kind of makes things look more color-accurate to me. It just bothers me that you can't change the brightness level, only the contrast.
yes it does, it's just an afterthought for developers and i hope that changes as more and more screens support hdr nowadays
If you have HDR400 should you turn HDR on or off?
It will depend on the actual monitor. I would class it as true HDR, though only a small step up from SDR, if you have full-array local dimming and your panel is 10-bit (most are 8-bit, which is just normal SDR, unchanged since the CRT days).
Mixed feelings about HDR on (my) PC monitor. Maybe it's user error, but when I turned on HDR in Win10 the desktop just got too bright, so I installed Win11. In Win11 the desktop does not seem natural; although much better than Win10, it is still off in some way, and maybe a tad too bright too. HDR in videos, well, on YT it honestly looks amazing, but the brightness is a strain on my eyes. I haven't really played any games in true HDR yet. I just started Cyberpunk 2077, and while it looks very good, it is still kind of straining my eyes. The monitor is an Acer X32 FP. So in the end I actually prefer running my monitor in SDR/sRGB, because I want things to look natural, and that looks more natural for all desktop-related content, pictures, etc. Maybe this is just me, I really don't know. I don't regret getting a monitor like this, because the dimming zones still give better blacks in SDR, but HDR on PC is a mixed feeling.
I'm very pleased with the HDR on my AW QD-OLED. It's comparable to LG OLEDs (maybe not quite as natural and crisp a picture). Elden Ring and GoW are amazing. CP2077 is so-so but ultimately worth using. I've found most PC/console titles deliver an acceptable-to-good HDR experience.
@AmpEdition Awesome! TY for the rec!
I like playing The Division 2 a lot more with the Alienware QD-OLED in HDR than without it.
what's your issue with 2077? I find it absolutely incredible on my Alienware in HDR
@paskowitz I'm very happy with my Alienware QD-OLED as well. Best picture quality I have ever had on my PC.
@AmpEdition Thanks for the recommendation to try Doom Eternal. I've been pretty happy with the results that HDR gives in Battlefield 2042 with one slider, but I've never really played Doom Eternal despite receiving it as a gift. I will have to go try it out to see what "the HDR standard" looks like. 🙂
@@srobb68 Than.
NOT then.
No this is incorrect.
In fact, the HDR performance of different TVs varies even more, and a PC can be used with a TV.
I still can’t decide how much 10-bit vs 8 bit dithering matters
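Since the 8-bit-plus-dithering question keeps coming up: dithering adds sub-LSB noise before quantization, so banding turns into fine grain that averages out to the intended value across neighboring pixels. A quick sketch of the effect on a shallow gradient (illustrative numbers only):

```python
import numpy as np

# Quantize a shallow gradient to 8 bits with and without dither, and to
# 10 bits for comparison. Plain 8-bit leaves visible staircase bands; dither
# trades them for noise that averages to the correct value across pixels.

rng = np.random.default_rng(0)
ramp = np.linspace(0.10, 0.12, 4096)                 # subtle dark gradient

q8        = np.round(ramp * 255) / 255
q8_dither = np.round(ramp * 255 + rng.uniform(-0.5, 0.5, ramp.size)) / 255
q10       = np.round(ramp * 1023) / 1023

print("distinct levels, plain 8-bit:", np.unique(q8).size)   # just a handful
print("mean error, plain 8-bit    :", np.abs(q8 - ramp).mean())
print("mean error, 8-bit dithered :", np.abs(q8_dither - ramp).mean())
print("mean error, 10-bit         :", np.abs(q10 - ramp).mean())
# The dithered 8-bit per-pixel error is larger, but it is unstructured, so
# the eye integrates it away; the plain 8-bit error is structured into bands.
```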
The Odyssey Neo G8 is basically perfect for a PC monitor: 1196 local dimming zones, 2000 nits of peak brightness and full HDR10+ support. That's not even mentioning its 240 Hz refresh rate, its large number of options to adjust, or its great console support.
You could say the biggest setback for the G8 is the lack of developer support. But when there is proper support, the HDR on the G8 is breathtaking. It's the same feeling as building a new PC and playing all your games at ultra settings at 4K as opposed to 1080p at medium settings. With that said, at the worst of times things look either too contrasty or too washed out, but that's due to poor developer integration, not the monitor.
The 2000 nits is total BS: it can only hit that for a couple of seconds on a test pattern, and then can't achieve it again until the entire screen goes dark for a while. Plus it's hitting 2000 nits when the content is only asking for 1000 nits, so you end up with blown-out highlights and can't tell the difference between parts of the image that are meant to be 1000 nits and 2000 nits. The real peak is 1100 nits. It can only sustain 300 nits full screen, which doesn't even meet the full-screen long-duration requirement for DisplayHDR 400.
The display doesn't follow the EOTF at all; it greatly over-darkens dark colors and over-brightens highlights, so it will never produce the image the content creator intended. That absolutely is the monitor's fault and not the developer's.
@@Lead_Foot where did you get the 300 nits from? Because you're lying. Anybody with eyes and working retinas will be able to tell you that at max brightness it is NOT just 300 nits for full-screen brightness. SDR brightness alone can be set closer to 600-650 nits, if you are fine with needing to squint when looking at a white background.
Also, yes, it is the developers' fault. I was referring to From Software's Sekiro and Elden Ring, which are both well documented and infamous for their poor HDR support. Literally every other game I've played, and every streaming service, has had proper HDR support.
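"Following the EOTF" in the exchange above is a measurable thing: for a PQ (ST 2084) signal, every 10-bit code value has exactly one correct luminance, and reviewers compare measured patches against it. A sketch of that check; the "measured" readings below are made up purely to illustrate the over-dark-shadows/over-bright-highlights pattern being described:

```python
# Reference PQ EOTF (ST 2084): each 10-bit code value maps to one luminance.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(code: int, bits: int = 10) -> float:
    """10-bit PQ code value -> absolute luminance in cd/m2."""
    n = code / (2**bits - 1)
    p = n ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Hypothetical panel readings (NOT real measurements of any monitor),
# shaped like the complaint: shadows too dark, highlights too bright.
measured = {300: 4.0, 500: 60.0, 700: 900.0}

for code, nits in measured.items():
    ref = pq_eotf(code)
    print(f"code {code}: should be {ref:6.1f} nits, measured {nits:6.1f} "
          f"({100 * nits / ref:.0f}% of target)")
```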
Found this exceptionally helpful, subscribed!
Despite its problems due to G-Sync Ultimate making firmware updates impossible, the AW34 QD-OLED was, imo, well worth the upgrade. I just wish I'd bought it before I sunk another chunk of money into a deceptively worded standard LED TV, which is absolutely not OLED and absolutely gave me a headache to try and watch (I need OLED because my eyes are very light-sensitive, especially when a TV or monitor only emulates black/dark instead of actually having black/dark). It just sucks that there are too many games that don't switch on their in-game HDR settings while I play them connected to the monitor.
The only way to enjoy HDR on PC is with an OLED, or a mini-LED with a VA panel, HDR1000 and at least 384 dimming zones. Anything lower than that sucks.
I have an HDR 2000 VA and it still sucks on PC, I don't know why.
I've played Elder Scrolls Online on both PC and Xbox Series X, and after the Xbox version received a fidelity upgrade, I think the game actually looks better on my Xbox now than it does on PC. The better colors really make the game more attractive and appealing. With the fake HDR on PC, it just looks dull in comparison. And with HDR turned off on PC, it actually looks better than with it on, but still doesn't quite give the appeal of the Series X. You would think with a $5,000 PC I could get that HDR fidelity, but unfortunately I still can't.
I know you commented this a while ago, but I'm curious, did you play both of these systems on the same monitor/display?
@wuzzle5261 yes, but HDR on PC sucks
@wuzzle5261 the only benefit of PC is the mods
@@thecarsonigen also higher resolution, higher graphics settings, 100+ fps, and getting exclusives from both consoles.
As someone with a good PC I see 0% reason to ever get a console again; I have everything I need on my PC.
Crazy how this video released on the very same day the Cooler Master Tempest GP27 was announced.
HDR for Warzone has been a pain; I can't adjust my colors with it on... so I turn it off. What a big change...
Fast forward to 2023: mini-LED full-array local dimming with over 300 zones is now widely available.
Samsung's Neo G7 and G8 have dropped in price quite a fair bit and have 1196 zones. I ordered a Neo G8 yesterday and it's only a little more expensive than what my current Acer X28 cost me, which is 152 Hz, only HDR400, and just a standard IPS. I have a feeling that as newer versions of OLED and QLED monitors come out, those older ones from 1-2 years ago are going to plummet in price.
Great video! It also seems like there is confusion about when HDR is supported by PC games, so users keep HDR off in their OS settings and never experience it. Would also like your thoughts on 1. Auto HDR in Win11 and 2. PC gaming on TVs such as the 40" LG OLEDs. Thanks!
Most PC players keep HDR off because they don't like that it adds up to 16 ms of lag in certain games; they are really weird about their input latency.
@@mightymushroom4980 I did not know that! Do you know which games it has that effect on? I assume these are eSports titles and they don’t have HDR support but HDR on in the OS does that? Is it GPU specific?
@@mightymushroom4980 I'm pretty sure it's not the HDR itself that can add a tiny bit (imperceptible to humans) of input lag, but rather something to do with the HDR and auto-dimming processing of non-OLED monitors. That's why OLED monitors show no increased input lag whatsoever when running in HDR mode.
I have a 7900 XTX and HDR on with 9 ms pixel response, but even with HDR in the OS turned off I still see no difference other than slightly less flicker. When I use SDR it's fine, just terrible peak brightness in certain games 😅
Love the D2 (Vow) wallpaper.
Well, when I got my HDR400 display many years ago, I realized how insincere a lot of tech commentators were, because they were complaining about HDR400, saying how useless it was and how it wasn't even HDR. I'd look at my HDR400 display and then look over at my old display, which was probably about 250 nits, and it was obvious to me they were just lying. I'm not sure what the motivation for doing so is, but there you go. Now I have an HDR600 display and it's obviously better, and I'm starting to see the true intent of HDR. But the fact is, progress is progress. And HDR400 is better than no HDR at all.
Good video. Can you please tell me, is this HDR good for watching movies? I'm not a gamer. I will buy a budget-friendly monitor, but it has HDR 4000?
@@ivonikolov6386 I absolutely love my AOC CU34G2XP/BK. It's primarily for gaming, but watching movies is also very sharp. Paid like 300 euro for it.
HDR even on consoles is still hit and miss. The problem for me is the colours don't pop as you'd imagine from the marketing spiel. In fact, if a bright spot is on the screen, it actually washes out colours, as the HDR focuses its power on creating that bright spot over colour accuracy in the rest of the image. Stick to SDR gaming and you'll see more impressive colour using an HDR TV. Backwards as it sounds, I know.
I think that is a problem with your display, as a proper HDR screen won't have that problem.
this is simply not true, what tv do you have? you are more than likely using incorrect gamma, brightness, contrast, or color depth settings, cuz hdr looks incredible on my c1, and when a game doesn't support it I am disappointed because it simply doesn't look as good.
No... On consoles it looks much better; maybe you have a bad TV or monitor... HDR is a bigger deal than ray tracing, but you need a good TV.
A huge lie. HDR on consoles on both my LG OLED monitor and LG OLED TV is amazing, and blows HDR on PC (specifically Windows 10) away.
Bombshell: there's enough dynamic range in standard dynamic range that, if the game developers/artists wanted to, they could have ENSURED dark areas were suitably corrected where it would benefit.
Completely agree. If a game's too dark, I just increase the contrast in the game's settings and max out the monitor's brightness, and all the dark details suddenly come back. It even looks more realistic than before.
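To put a number on the shadow-recovery idea in these two comments, here is a minimal, hypothetical sketch of a gamma lift, a close cousin of the in-game contrast/brightness sliders mentioned above. The function and values are illustrative only, not anything from the video.

```python
# Toy illustration: even within 8-bit SDR, a simple gamma adjustment
# pulls crushed shadow detail back into the visible range.
# Purely a sketch; real games do this inside their tone-mapping pass.

def shadow_lift(value: int, encode_gamma: float = 2.2, view_gamma: float = 1.8) -> int:
    """Re-encode an 8-bit value so content mastered for gamma 2.2
    looks the way it would on a brighter gamma-1.8 display."""
    v = value / 255.0
    return round(255.0 * v ** (view_gamma / encode_gamma))

# Dark values spread out; bright values barely move:
for v in (8, 16, 32, 200):
    print(v, "->", shadow_lift(v))   # 8->15, 16->26, 32->47, 200->209
```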
Don't agree with title at all. HDR on Win 11 on my AW QD OLED is stunning.
But it's not as good as on consoles; that's the point.
Would the game problem not be solved with an HDR10+ implementation?
I didn't even watch this video. I just clicked to see people saying you're wrong. Haha.
OK, well, glad to hear. I upgraded to an RTX 4070 from a GTX 1070 and was looking for a screen; turns out I don't need to spend 400-500 euros for a great gaming monitor. I just bought the HP X27i 27" gaming monitor for 280 euros, and it will probably blow me away coming from the 65Hz 1080p setup I have right now. It's very easy to spend a lot more than needed in PC gaming.
But you can connect a QD-OLED like the S95B to a PC, and voilà.
Works just fine on my 3090 Ti with an LG C1. Running Mass Effect Definitive Edition in Dolby Vision looks amazing af. It's just game companies that are too lazy to implement it.
There is no Dolby Vision on PC lol
Of all the games, you play Mass Effect with a 3090 Ti? Bruh, if I was a girl I would sell my body to get that video card and play some Quake Champions on nice settings. LG C1, isn't that an OLED TV? Also nice.
June 2023: PC HDR gaming was working fine in Windows 10, and I was just told by Microsoft I needed to upgrade to 11. Now HDR comes on, but two new games, EA Sports Golf and PGA Tour 2K23, are both unplayable with HDR on. I have all new 2022 hardware: a Hisense TV (1,500 nits, 120fps), a Denon receiver that handles HDR and 120fps, and an AMD 6750 GPU. By the way, Windows says the Denon receiver isn't "certified", which it is. I'm very disappointed with Microsoft.
I turned it on and 30 minutes later turned it off and applied my own custom settings. HDR should be set in stone, not variants.
HDR is weird. In some games it looks amazing, then in others it's horrifically bad, and everything in between.
OMG... I've been waiting for a video like this for nearly 5 years, ever since HDR became a thing for gaming. No other reviewer would really go into detail about what's important to look for, or explain what the importance of HDR in gaming is, so we consumers can make an educated decision about what to look for in a gaming monitor.
I think the title of this video is misleading. I run my PC into a Hisense U6H and HDR games look phenomenal. I was playing Ori and the Will of the Wisps today and it looked and sounded stunning. Gaming monitors don't have good HDR support, but using a PC to run a living room setup works great.
HDR has made me prefer to play some games on Xbox Series X on my Sony A95K OLED over my 3080 PC.
Now, for the first time ever, you can buy a true HDR monitor for $500. The GP27Q and M27T20 have 576 dimming zones (as many as Apple's Pro Display XDR lol), excellent color reproduction thanks to a quantum dot layer, and can sustain 1,000 nits of brightness even at full screen. AND they're 1440p, 144Hz. I got the M27T20, and games with HDR on this display look absolutely fantastic!
HDR worked fine on my 2070 Super Founders Edition.
I tried to enable Windows 10 HDR on my Gigabyte 3070, and my LG OLED TV goes black and the video never comes back on; rebooting doesn't bring it back either.
I had to put an old video card in just to get video and disable HDR, and then put the 3070 back in.
After that I've never tried to turn it back on. It's probably Gigabyte that's the problem, and why they had so many revisions of the 3070, on top of selling them dirt cheap while everyone else was selling 3070s for two or three times the price.
HDR is gorgeous in PC gaming as long as you have the hardware. I can’t even play without it anymore. I’m using the PG27UQ.
Is the LG 32GQ950 good?
@@joaquinarchivaldoguzmanloe1073 Check the specs and just make sure it's at least VESA DisplayHDR 1000, preferably with full-array local dimming as well; but if it's not, it should still be decent. You need that brightness.
@@LexLutha Yeah, it just came out this year; it's supposed to be LG's best UltraGear gaming monitor.
@@joaquinarchivaldoguzmanloe1073 I’ll check it out. I’ll get back to you 🤔
@@joaquinarchivaldoguzmanloe1073 OK, so it can hit 1,000 nits but it only has 24 dimming zones. The size and the fact that it's 4K 160Hz is super nice. If you've never tried HDR on PC, this one is actually probably decent, especially for $1,000.
Dude... the audio track you are playing in the background is HORRIBLE. That repeating, waahh waah waahhh, is very annoying.
Does the PS5 support Dolby Vision? My understanding was it only supported basic HDR10.
I don't think so?
My TV has really bad HDR 😅 That, paired with compatibility issues, means I never use it.
Is it alright if I turn on HDR on my ASUS VG27AQ for gaming?
It seems like a TV might be the way to go for PC HDR... I use a 65-inch LG CX, which looks amazing, and a 50-inch Samsung QN90A, which will do 144Hz and also looks amazing.
Seems like others are attesting to the same with other LG OLED models.
Do you think upcoming smaller LG monitors could solve this?
And now, RTX HDR is here to save the day a year later. Please do a video on the RTX HDR app :)
So it's a monitor problem rather than a PC problem
Exactly..
@@Silverbolt1981 Dynamic metadata is also not required; connect a PC to a decent HDR10 TV and you're good to go. Pointless video, tbh.
@Roc How is it a Windows problem?
There's also no HDR available for GeForce Now, even with the new 4080 Ultimate subscription :/
No issue with HDR on Windows 11 and an AW3423DW
So what about PCs that are connected to something like the C2? Still an issue because PC games don't support Dolby Vision?
Battlefield One and Mass Effect Andromeda have Dolby Vision support on PC.
You left out the PG27UQ and PG32UQX; both support HDR.
Went with a 48-inch LG OLED as a monitor years ago. So glad I did. I can also use 3840 x 1600 if I want to (with true black bars on top/bottom), but since I have a large desk, I have not needed to.
HDR makes my PS4 look beautiful, but I can't get it to look good for the life of me on my PC. The darks are always too dark no matter what I do. It's really frustrating.
The Samsung G7 4K 165Hz mini-LED has very beautiful HDR, but when I start some games on PC (not all), I must start in windowed mode and then switch to fullscreen or the game crashes. G-Sync doesn't work either; it makes a black screen.
I have that Alienware monitor at 2:00. It is actually HDR 1000, not 400, and it's an OLED, not micro-LED.
Just bought the LG 32GQ950. Looks amazing, but I don't know if I should keep HDR off or on for my PS5 for the best experience. Please help.
For PS5 always keep HDR on
@@MadOrange644 Even though it looks dark?
I'm convinced that HDR on my Hisense is screwing with my streaming. It gets choppy and slows down and makes it unwatchable.
This is why my gaming PC is an HTPC. I send that thing out to a Sony Vizio TV and it works great.
Idk, for me it looks amazing, way better. But there are some annoying bugs: right now, when I have a game open and I'm watching YouTube, it keeps switching between HDR and SDR, and each time it switches, the screen turns off and then on.
It works fine for me; your title is misleading, btw.
After gaming in HDR I can't really go back, HDR is tooooo goood.
Samsung mentioned that they should be coming out with HDR10+ gaming monitors in 2022, so you would think they would be out by now, but I can't find them myself.
All Odyssey Neo monitors (Neo G7/G8/G9) are HDR10+; it's written in the specifications on Samsung's website.
@@MonkeyDBenny Okay, thanks. I saw something about those models supposedly having that, but I couldn't find anything definite.
What game is that with the space shuttle?
If I game with HDR on an LG UltraGear OLED 27GR95QE-B, what do I lose? (Refresh rate, input delay, frames?)
As far as I know, nothing. If the monitor supports all those features, you should be able to use them all at once. Some higher-spec monitors are only HDR400 or 600 because reaching HDR1000 or above would probably require cutbacks elsewhere.
A year went by and HDR on PC still sucks. EDID data interpretation has A LONG WAY to go. I purchased the most expensive ASUS 4K 240Hz monitor and it kinda sucks for HDR. It just doesn't get bright when it needs to and constantly reverts to the wrong EDID data of a 450-nit SDR max luminance when it should be at 1,100 nits max instead, thus making scenes very dark when they're not supposed to be.
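For anyone curious what "wrong EDID data" means in practice: displays advertise their HDR brightness range in a CTA-861.3 HDR static metadata block inside the EDID, as luminance code values rather than raw nits. A minimal sketch of the decoding follows; the formulas are the published CTA-861.3 ones, but the sample code values are invented for illustration.

```python
# Sketch: decoding the luminance fields of a CTA-861.3 HDR Static
# Metadata block, the EDID data an OS reads to learn a panel's range.
# Formulas are from CTA-861.3; the example code values are made up.

def max_luminance_nits(cv: int) -> float:
    """Desired content max luminance: 50 * 2^(CV/32) cd/m^2."""
    return 50.0 * 2.0 ** (cv / 32.0)

def min_luminance_nits(cv: int, max_nits: float) -> float:
    """Desired content min luminance: max * (CV/255)^2 / 100 cd/m^2."""
    return max_nits * (cv / 255.0) ** 2 / 100.0

# CV 96 advertises exactly 400 nits; CV ~138 advertises ~1,000 nits.
# If the OS latches onto the wrong block (or an SDR fallback), it
# tone-maps to the wrong peak, matching the symptom described above.
for cv in (96, 115, 138):
    print(cv, "->", round(max_luminance_nits(cv), 1), "nits")
```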
The bit about brightness not being important is total bollocks, since our eyes are accustomed to perceiving high energies, up to hundreds of thousands of nits, in real life. Fire, harsh glints on metallic surfaces, water, what have you, and other energy sources require tons of nits to be represented in a physically correct, realistic, and convincing way. Localized peak brightness is just as essential as the encoded number of f-stops/pixel brightness states, contrast, and increased color depth and gamut.
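As background to this point about encoded brightness states: HDR10 maps code values to absolute nits with the SMPTE ST 2084 PQ curve, which spends its top range on exactly those fire/glint highlights. A minimal sketch is below; the constants come straight from ST 2084, while the printed code values are just illustrative.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: it maps a code
# value to absolute luminance on a perceptual scale.

M1 = 2610 / 16384        # PQ constants from ST 2084
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: int, bits: int = 10) -> float:
    """Convert a full-range PQ code value to luminance in cd/m^2."""
    e = code / (2 ** bits - 1)            # normalized signal, 0..1
    p = e ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)        # PQ tops out at 10,000 nits

# Roughly the bottom half of the 10-bit range covers 0-100 nits (SDR
# territory); code ~769 already means ~1,000 nits.
for code in (512, 769, 1023):
    print(code, "->", round(pq_to_nits(code), 1), "nits")
```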
I made it all the way…until he said “my personal favorite is Destiny 2”. Um..well..he’s the only person that likes that game.
Which game is shown in the beginning, with a craft that looks like the space shuttle?
In some games it looks awful. Days Gone, for example: everything is washed out and bleached all over. In others it looks great. AC: Odyssey, for example, looks a lot better when I play with W10 HDR enabled; same goes for Metro Exodus EE and Cyberpunk 2077 with sRGB10. It seems to depend on each game's HDR setup, without a standard "middle". Just play with the game's settings and the monitor's brightness/contrast/saturation.
Yeah, it depends on the game. On an HDR400 LG Nano IPS, Forza 5 looks great, but the new Resident Evil 2 or BG3 look terrible and way too bright.
Wow, good video, this explains so much.
HDR on PC is a fantasy not yet lived 😂 The peak brightness is amazing, but pixel scattering and lagging occur 😅
I gave up on gaming monitors a few years ago, they’re overpriced and TVs left them behind years ago. Now I just use a Samsung mini-led TV.
To me, HDR is snake oil. Maybe I am too poor to get the TV that contradicts my findings, maybe I am just wired differently, but I see no difference in any content with it on vs off. Definitely not worth, say, $200 more.
I've seen HDR 4K monitors for $1,300. That's more than any VR setup. You could get a high end gaming PC for that price that does ray tracing with VR support.
'Cos HDR gaming is a gimmick, and PC already has the best picture quality and frame rate. HDR is a disaster of mastering and mixing with no real standards, 'cos TV manufacturers all have their own way of interpreting it; just look at the shadow detail differences between Sony and Samsung.
Still waiting on the day LG releases a 24-inch OLED monitor (at a reasonable price, of course).
HDR10+ Gaming will automatically calibrate your PC games for your TV or monitor, if it supports it.
It's annoying that only Samsung supports it, and they slap a matte coating on their displays.
@@MSayeed It's available on Sony TVs.
Why does nobody mention HLG?