HDR is a Hot Mess

  • Published: 1 Jan 2025

Comments • 1.6K

  • @noclick
    @noclick 1 year ago +125

    New personal record for Philip not mentioning Half-Life 2: 1 minute 12 seconds!

  • @PebsBeans
    @PebsBeans 1 year ago +1981

    when I was younger I thought the HDR in HL2's episodes was just a fun little detail to represent Gordon's eyes adjusting, and had no idea it was a side effect of an actual proper graphics-related feature

    • @spiderjerusalem8505
      @spiderjerusalem8505 1 year ago +61

      I always hated HDR in the first Source engine because it never used a correct color transformation to display the image. Source 2 does... and it does it quite poorly :/

    • @justaweeb1884
      @justaweeb1884 1 year ago +50

      @@2kliksphilip It looks like you're on Windows 10. Why are you complaining about HDR when you're not on Windows 11? Windows 11 has far better HDR support.

    • @koye4427
      @koye4427 1 year ago +1

      @@justaweeb1884 They backported all the Windows 11 HDR features to Windows 10 after backlash. The "improved" HDR support has been on both 10 and 11 for as long as the feature has existed.

    • @vitalsignscritical
      @vitalsignscritical 1 year ago

      @@justaweeb1884 I'm going back to Windows 10; Windows 11 has too many problems, and the HDR is more than serviceable on Windows 10.
      We should count our blessings: Linux doesn't even have HDR yet, though it's coming soon. It will take some time for all distros to update after Ubuntu gets it first, probably through GNOME. I'm not too familiar a Linux user, as I've never run it as my daily driver, so there's only so much I can know and learn as I go, but the Linux command structure is much easier to get used to and remember than the commands in Windows terminal programs, CMD, PowerShell, etc.

    • @Frille512
      @Frille512 1 year ago

      @@justaweeb1884 Because Windows 11 sucks ass

  • @IAhmadGT
    @IAhmadGT 1 year ago +1437

    You know it's gonna be a good video when Philip starts it with caboosing

    • @epj0211
      @epj0211 1 year ago +43

      Caboosi blz

    • @Flashv28
      @Flashv28 1 year ago +85

      As soon as I heard it, I busted right into the caboossy

    • @Trovosity
      @Trovosity 1 year ago +3

      Lmao

    • @epj0211
      @epj0211 1 year ago +4

      @@Flashv28 Oh my!

    • @WangleLine
      @WangleLine 1 year ago +1

      Yesss :]

  • @HalfDayHero
    @HalfDayHero 1 year ago +896

    I get the impression at times that not even game devs understand HDR. Seeing only a brightness slider in the HDR settings is deeply concerning.

    • @ernestochang1744
      @ernestochang1744 1 year ago +124

      With the way things are going today I'm surprised they even know what ambient occlusion is anymore lol. It looks like the old guard from back in 2005 forgot to teach the new generation of coders, programmers, and computer nerds how to make proper games, because the amount of shit spewing out of the gaming industry HAS been deeply concerning

    • @EggEnjoyer
      @EggEnjoyer 1 year ago +137

      @@ernestochang1744 They're two different groups of people.
      The old leaders of the industry were tech enthusiasts and fanatics.
      The new people in the industry are just kids who went to college and then got a job. When things get institutionalized, you get more mediocrity.
      You've now got game devs who don't even really play games much, but hey, they went to college for it.

    • @it-s-a-mystery
      @it-s-a-mystery 1 year ago +45

      @@ernestochang1744 This is a weird comment to read in such a fantastic year for video games...

    • @ernestochang1744
      @ernestochang1744 1 year ago +6

      @@it-s-a-mystery Tell that to Forza Motorsport. Other than that you're right, it has been a solid year for gaming; maybe the tides are turning

    • @KiraSlith
      @KiraSlith 1 year ago +21

      @@it-s-a-mystery It'd be more honest to say 2022 was a good year; 2023 has been mostly dead water. Even then, 2022 was still a ways off any single year of the golden era from 2000 to 2013, when Microsoft, Sony, and Nintendo were all releasing top-tier titles like miniguns spit lead. A time when "I don't have time to play all these games" was a legitimate problem rather than a sign of an untreated "games as a service" addiction destroying your life.
      Don't get me wrong, an improvement is an improvement, but there have been scant few actually good AAA games since, most of which came from either Bandai Namco or Square Enix. Most everything else has been indie releases serving niche genres that had been dead since the early 90s, and survival-crafting games. Which is good for the people who were looking forward to the return of the platformer and of rogues/rogue-lites, but they're also not the genres the bulk of gamers from that 2000-2013 era are looking for. Oh, and Hi-Fi Rush fits in here somewhere.
      Meanwhile, where are the tactical action shooters like Rainbow Six Vegas? Pulse-pounding racing games like Full Auto and Burnout? Third-person action-adventures had a pair of R&C games. FPS RPGs had Cyberpunk 2077 (after 2 months of patching) and Starfield at a stretch. The straight FPS genre just has CoD left, which devolved into insulting its audience, and Warzone. The only 2000s-era genre not just surviving but thriving is action RPGs, courtesy of the previously mentioned Bandai Namco and Square Enix.

  • @Skazzy3YT
    @Skazzy3YT 1 year ago +1211

    When Windows 11 first launched, I was under the impression that Auto HDR would be a feature that turned on HDR on your monitor when an HDR source like a game was running, and switched back to SDR when viewing SDR content. Imagine my surprise when I found out what it actually does.

    • @mateboros952
      @mateboros952 1 year ago +141

      I turned on Auto HDR on Windows 11 and I have an HDR screen, but it does nothing... My screen doesn't get the signal to change to HDR mode... It's useless unless I manually turn on HDR on my screen or force HDR in Windows; then my screen instantly switches to HDR mode.

    • @samcerulean1412
      @samcerulean1412 1 year ago +72

      They are working on that; I think HDR10+ supports that feature, which Windows is switching to at some point.

    • @montagepl
      @montagepl 1 year ago +7

      I use it with my 3 monitors on Nvidia and it works great in Windows 11 and Battlefield 2042

    • @Web720
      @Web720 1 year ago +28

      What does it do?
      I also remember Xbox talking about how they made OG Xbox games backwards compatible on the Series X with full HDR support.

    • @flamingscar5263
      @flamingscar5263 1 year ago +70

      Auto HDR is really good though; I upgraded to Windows 11 specifically for it, and in 9/10 scenarios Auto HDR makes the image look better than what my display would show in SDR mode

  • @Gwilo
    @Gwilo 1 year ago +340

    trying to actually get an HDR video rendered in the correct format is just a nightmare on its own - I've had to upload my YouTube videos directly and use the built-in YouTube editor to trim them lately, otherwise nobody can watch my greyed-out video!

    • @Ham24brand
      @Ham24brand 1 year ago +21

      I want to make videos in HDR, but no free video editors support it. I can use OBS or GeForce Experience to record it, but then it all has to be in one take. It's annoying how little support there is for it. With the constant marketing of it on almost every TV, monitor, and phone, you would think it would have some good support.

    • @Sammysapphira
      @Sammysapphira 1 year ago +22

      @@Ham24brand Use DaVinci Resolve... it supports it

    • @thenosid951
      @thenosid951 1 year ago +1

      Use the HLG color profile

    • @SlenderSmurf
      @SlenderSmurf 1 year ago +5

      @@Sammysapphira I got the free version downloaded, only for it not to be able to open my Shadowplay files... apparently the paid version can do it

    • @Fadexz
      @Fadexz 1 year ago +2

      You might be able to use ffmpeg to handle HDR stuff fine. YouTube will re-encode when trimming as far as I know, so that's not ideal.

  • @molsen7549
    @molsen7549 1 year ago +639

    Doom Eternal was the first game I played when I got my new OLED monitor, and I was blown away. Unfortunately, it set up unreal expectations for me for the future.

    • @MedicAthlete24W
      @MedicAthlete24W 1 year ago +44

      Surprisingly, Modern Warfare 2019 has a pretty good HDR implementation. Its settings aren't as in-depth as Doom Eternal's, but they make the game pop when calibrated properly.

    • @thenonexistinghero
      @thenonexistinghero 1 year ago +9

      Dunno if you played it, but Gears 5 has incredible HDR as well, and I think the best I've seen in a game so far. The reason it looks so good is that they changed the way the lighting works when you turn on HDR, to properly take advantage of it.

    • @tomatosbananas1812
      @tomatosbananas1812 1 year ago +4

      AC Origins and Odyssey too, with the "max luminance" and "paper white" settings 👍

    • @ProexProduction
      @ProexProduction 1 year ago +10

      Try Metro Exodus too, also insane HDR implementation

    • @Big_Red1
      @Big_Red1 1 year ago +8

      The only game I've played on my OLED that seems to have proper black levels in HDR mode is Shadow of the Tomb Raider (haven't played Doom since getting it, will have to try it). Most that I've tried don't have true black and it is REAAAAAAAAALY annoying.

  • @HQ_Default
    @HQ_Default 1 year ago +388

    Despite the Source Engine's HDR being fake, using a simulation of the player's eyes adjusting to differing light levels is actually a really clever way of replicating HDR in the context of a video game.

    • @Qwertworks
      @Qwertworks 1 year ago +15

      That also happened in Bad Company 2. I generally really like the look of the old Frostbite 2 engine in Bad Company 2

    • @ezraho8449
      @ezraho8449 1 year ago +21

      It's enabled by default in Unreal Engine; it's called adaptive exposure. The problem is that devs don't get much control over it. In most cases it's easier to build two lighting environments, one indoor and one outdoor, and program a transition between the two when you walk through the doorway.
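
The adaptive-exposure mechanic described in this thread can be sketched in a few lines. A minimal illustration in Python (the function name, the 0.18 mid-grey "key", and the adaptation rate are illustrative assumptions, not any engine's actual API):

```python
import math

def adapt_exposure(current_ev, scene_luminance, key=0.18, speed=1.5, dt=1/60):
    """Ease the camera's exposure value (EV) toward the setting that maps the
    scene's average luminance to a mid-grey 'key', like eyes adjusting."""
    target_ev = math.log2(scene_luminance / key)  # EV that exposes this scene correctly
    blend = 1.0 - math.exp(-speed * dt)           # exponential smoothing per frame
    return current_ev + (target_ev - current_ev) * blend

# Walking from a dark room (1 nit average) into daylight (1000 nits average):
ev = math.log2(1.0 / 0.18)
for _ in range(240):          # ~4 seconds at 60 fps
    ev = adapt_exposure(ev, 1000.0)
# ev has now nearly converged on the bright target; the briefly blown-out
# picture during convergence is exactly the Lost Coast effect.
```

The exponential blend means adaptation is fast at first and slows as it converges, which is roughly how eye adaptation is presented in games.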

    • @smergthedargon8974
      @smergthedargon8974 1 year ago +7

      I fucking hate that shit in video games.
      My eyes will do the adjusting for me!

    • @Bluelightzero
      @Bluelightzero 11 months ago +4

      @@smergthedargon8974 Wouldn't work unless you had an HDR monitor.

    • @smergthedargon8974
      @smergthedargon8974 11 months ago +1

      @@Bluelightzero But we're talking about Source Engine eye adaptation, AKA fake HDR, so I _don't need an HDR monitor_

  • @coldfire6869
    @coldfire6869 1 year ago +30

    HDR breaks my warm-tint screen filter and results in a sudden flashbang every time

    • @cupid3890
      @cupid3890 10 days ago +1

      Ugh, I hate the iPhone implementation of HDR on social media, especially on Instagram.
      Every so often a post comes up that blinds me. It doesn't help that Instagram's compression worsens the quality of every post, which makes it a blinding and indecipherable mess where I have no idea what's even going on.

  • @PhonicUK
    @PhonicUK 1 year ago +56

    It's probably worth mentioning that the 'HDR' shown in Lost Coast is expressly designed to try and mimic how the human eye sees things to make the experience more immersive. If we're in a dark room then indeed we can't resolve details in the bright outdoors until our eyes adjust.

    • @Case_
      @Case_ 11 months ago +3

      Yeah, it's more eye adaptation than HDR. But they did advertise it as "HDR" back in the day, and quite extensively at that.

  • @anonony9081
    @anonony9081 1 year ago +354

    Ori and the Will of the Wisps is probably the best example of HDR implementation. It looks amazing, and the contrast between the bright areas and the dark areas really makes the image pop

    • @poppysilver
      @poppysilver 1 year ago +31

      plus the game is bussin

    • @oscarfriberg7661
      @oscarfriberg7661 1 year ago +26

      Second this. I never understood the point of HDR until I played Ori. That game really showed me the true potential of the images my monitor can produce. It's sad no other media has come close to it ever since.

    • @BlackParade01
      @BlackParade01 1 year ago +10

      @@oscarfriberg7661 I recommend Horizon Forbidden West and Cyberpunk 2077 in HDR.

    • @Exponaut_R-01
      @Exponaut_R-01 1 year ago +7

      Ori supports HDR??
      Surprising, but I guess it makes sense; that game is full of light and dark emphasis.

    • @BadhamKnibbs
      @BadhamKnibbs 1 year ago +6

      I found Tetris Effect was a surprisingly good example of HDR too, the metallic materials in some of the backgrounds floored me with how good they look with HDR on

  • @mariokotlar303
    @mariokotlar303 1 year ago +37

    I feel your frustration, and to do my part I did my best to make HDR in Talos Principle 2 look the way HDR is supposed to look. Given that you really enjoyed the first game, I look forward to hearing how you feel about the HDR implementation in the sequel.

    • @mariokotlar303
      @mariokotlar303 11 months ago +2

      @the_grand_secgend2134 Thanks :) I designed the Activator mechanic, the gameplay for ~10% of the puzzles, and the entire South 1 map including all the architecture on it, and handled the technical stuff regarding rendering, light, color science, and of course HDR.

    • @mariokotlar303
      @mariokotlar303 11 months ago

      @the_grand_secgend2134 ❤️

  • @CxWxD
    @CxWxD 1 year ago +65

    Personally I think that well-presented HDR on a high-quality HDR screen has a transformative effect, not a subtle one. I really wish it was more readily available in monitors and more widely supported in games and content as a result, like you mentioned in the video. It is very much not a gimmick, and I think it makes images pop far more than any further increase in resolution would (surely 8K TVs have reached the official threshold of diminishing returns in that space). I have no knowledge of the amount of extra work or expertise needed for games to present HDR well, but it seems to be significant based on how rare it is lol

    • @guitarzilla555
      @guitarzilla555 1 year ago +5

      "Well-presented" being the key. I thought 3D was transformative when done correctly (like Avatar and Hugo.) But studios just wanted to cash in with cheap 3D conversions and used it as a gimmick. And that's how people remember it.

    • @LegendaryPlank
      @LegendaryPlank 1 year ago +2

      I'm of a similar mind. Chasing mega-high resolutions for a minute improvement that a limited number of people even notice seems silly when good HDR representation can improve visuals tenfold, in my opinion.

    • @drunkhusband6257
      @drunkhusband6257 1 year ago

      It is very much a gimmick... it doesn't make things pop. It's darker, with worse colors; it's a huge marketing scam to sell displays.

    • @drunkhusband6257
      @drunkhusband6257 1 year ago

      @@joechapman8208 Same BS I've heard for the last 10 years. HDR is overrated junk tech that needs to go away. Properly calibrated SDR is always better, brighter, and more detailed.

    • @KAIJUCHOMPS
      @KAIJUCHOMPS 1 year ago +4

      Yeah, I frankly find that HDR has been more transformative of my gaming and movie viewing experiences than 4K is, even at large screen sizes.

  • @superslash7254
    @superslash7254 1 year ago +16

    If you think being left behind on PC feels bad now, imagine what it's like for those of us old enough to remember CRTs. In 2005 I was gaming on a 24" screen with zero input lag, zero pixel response time, an enormous color gamut, incredible brightness and contrast, and a resolution of 2048x1536. It's taken us nearly 20 years just to get CLOSE to that in resolution alone, and we're still nowhere near the smoothness and speed of the electrons-in-a-vacuum refresh of CRTs.

    • @asinglefrenchfry
      @asinglefrenchfry 14 days ago +5

      1440p and 4K monitors have existed for a while, so we have caught up and surpassed on the resolution front. OLEDs are the closest thing to CRTs in terms of input lag and pixel response time. I’d personally rather have an OLED than a CRT of the same resolution and Hz.

    • @deadwhitedragon9751
      @deadwhitedragon9751 11 days ago

      @@asinglefrenchfry OLEDs are great, but god are they expensive

    • @notlNSIGHT
      @notlNSIGHT 11 days ago

      @@deadwhitedragon9751 Yeah. Especially the fancy new QD-OLED monitors. But the price is absolutely worth it. That Alienware 34-inch ultrawide QD-OLED is like $700 compared to its typical $1100 on Amazon.

    • @user-ym8zp6ld4e
      @user-ym8zp6ld4e 3 hours ago

      Those games had 100,000 times fewer polygons in 3D and were mostly 2D; Superman 64 and E.T. are two of the worst launches of all time

  • @raffdraws
    @raffdraws 1 year ago +99

    This is literally what I'm dealing with after buying an HDR monitor. Although I'm happy with Cyberpunk 2077's implementation of it, and even certain areas in Mass Effect: Legendary Edition look great with it.

    • @LexiHearts3
      @LexiHearts3 1 year ago +4

      Have you tried Spider-Man 2? The HDR looks really good in all the YouTube clips of it I've seen

    • @raffdraws
      @raffdraws 1 year ago

      @@LexiHearts3 No, I don't have a PS5 sadly. I'll try it once it's on PC

    • @1chrisanderson
      @1chrisanderson 1 year ago +8

      Cyberpunk needs Reshade with Lilium's black floor fix shader to REALLY look correct IMO. I have an LG C2 OLED and it was night and day after tweaking it this way.

    • @GCCarsCA
      @GCCarsCA 1 year ago

      ​@@LexiHearts3 I've played Spider-Man 1 on PC in HDR and it's probably the best implementation I have personally seen in a PC game

    • @mopilok
      @mopilok 1 year ago +2

      @@GCCarsCA forza 5 is pretty good as well

  • @Catzzye
    @Catzzye 1 year ago +27

    The more I learn about HDR the less I feel like I actually know about HDR :(

    • @kendokaaa
      @kendokaaa 1 year ago +6

      Here's a simple explanation: an HDR display can show deep darkness and very high brightness at the same time, and HDR content takes advantage of that. That's it. The stuff about colour banding and deeper colours isn't inherent to HDR.
      HDR displays usually use a bunch of lighting zones to increase contrast, unless they're OLED, where each pixel acts as its own lighting zone. Some monitors are advertised as HDR but only meet the DisplayHDR 400 standard, which might as well not be HDR, since it doesn't require local dimming (lighting zones) and only requires 400 nits, which isn't that bright.
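
To put numbers on that explanation: HDR10 signals encode brightness with the SMPTE ST 2084 "PQ" curve, which maps a 0-1 code value to an absolute luminance of up to 10,000 nits. A sketch of the decoding (EOTF) side in Python, using the constants from the standard:

```python
def pq_eotf(e):
    """SMPTE ST 2084 PQ EOTF: non-linear code value e in [0, 1] -> nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

peak = pq_eotf(1.0)   # full signal: 10,000 nits
mid = pq_eotf(0.5)    # ~92 nits: half the code range is spent on the dark end
```

That bottom-heavy allocation is why a 400-nit panel only reproduces a thin slice of what the signal can describe.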

    • @SkandiTV
      @SkandiTV 1 year ago +3

      SDR shows a dark to bright image, HDR can show very very dark to very very bright on the same image.

    • @peaceable55
      @peaceable55 14 days ago +1

      ​@@kendokaaait's been so long since I've seen that profile picture

  • @SoggyMicrowaveNugget
    @SoggyMicrowaveNugget 1 year ago +41

    You've perfectly encapsulated trying to keep up with the latest tech... most of the time it's better to just stick to whatever is mainstream.

    • @kuabarra
      @kuabarra 1 year ago +9

      Yeah, I'm tired of paying an early-adopter tax and still getting a suboptimal experience. I could save a lot of money and time messing around by waiting a few years for new tech

    • @kendokaaa
      @kendokaaa 1 year ago

      What sucks is HDR has been a thing for a while; it's just still so expensive that not much effort goes into it, because few people have HDR monitors. There are tons of "HDR-supporting" monitors that can accept the signal, but few that display it well enough to be worth it

    • @SkandiTV
      @SkandiTV 1 year ago +2

      @@kendokaaa The dumbest thing is every TV and phone nowadays is fully OLED or better and perfectly supports HDR; gaming monitors are just so far behind the curve it's laughable.

    • @mimimimeow
      @mimimimeow 1 year ago +1

      @@kendokaaa Even suckier: 2016 upper-range TVs already did in excess of 1000 nits of HDR; manufacturers found out people have no idea about it, so they skimped on brightness until 2021 or so. It's even worse in the monitor market; back then, the price of a proper HDR monitor with passable panel quality could buy you a Sony Master Series TV

  • @comradestinger
    @comradestinger 1 year ago +248

    I think a big factor in lack of game support is game developers being paranoid about what their game will look like on different HDR monitors. With SDR, there's a well established standard to aim for, where artists can go "if it looks good on my screen, it will look good for the player"
    With HDR, all that goes out the window and you kind of have to do the lighting, camera and postprocess tweaking *all over again* for HDR, with no guarantee it will even look right once it gets to the player.

    • @LostPhysx
      @LostPhysx 1 year ago +27

      100% the same for ray tracing, but you don't see devs ducking RTX either

    • @jwhi419
      @jwhi419 1 year ago +72

      That's nonsense. You don't film a movie and worry about people's TVs not being able to display it. You shoot and take advantage of the full Rec. 2020 gamut with a range of 10,000 cd/m². If no one without a mastering monitor can see a lot of it, you still release it anyway and wait for TVs to catch up.
      Game devs just don't want to spend the money on color mastering at all.
      Devs don't even adjust their mastering for ray tracing on. They just use the same colors when it's turned on, and then gamers wonder why the game doesn't look any different at all. Like The Witcher 3: no color adjustments.

    • @cube2fox
      @cube2fox 1 year ago +53

      The devs of Alan Wake 2 said the artists optimized everything both for SDR and HDR. But reviewers didn't even mention HDR.

    • @Exoclypse
      @Exoclypse 1 year ago +35

      Most devs don't have HDR screens.

    • @NickJerrison
      @NickJerrison 1 year ago +8

      @@cube2fox I was about to bring up Alan Wake here. I truly feel like HDR was THE intended way to play it. In SDR everything feels... washed out? It's almost like they made the color range even more restricted: the blacks aren't black enough, the brights aren't bright enough. Almost like those recent Capcom games where the brightness calibration has you select the darkest and brightest range first, but you just go with the defaults and everything feels gray.
      Tried it later in HDR and it's by far the superior way to play it, although it sort of fits the "over-darkens" category that Philip mentioned here, so it's not perfect either.

  • @marcasrealaccount
    @marcasrealaccount 1 year ago +77

    It's crazy how many "HDR" monitors are lacking in both brightness and bit depth. Some monitors claim HDR400 with 8 bits per channel when in reality they might only be 6+2 (6 bits of actual voltage regulation + 2 bits simulated with a strobing effect), and even then, 400 nits is almost the same as standard monitors can achieve...
    A proper HDR monitor, imho, is one with at least 10 real bits of precision per channel and at least 1000 nits of brightness; however, brightness is less important than color accuracy.
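
The "6+2" scheme mentioned above is frame-rate control (FRC, temporal dithering): the panel flickers between the two nearest 6-bit levels so that the time-average approximates the requested 8-bit level. A toy simulation in Python (purely illustrative; it ignores gamma and panel limits):

```python
def frc_simulate(target_8bit, frames=4):
    """Fake an 8-bit level on a 6-bit panel by alternating between the two
    nearest 6-bit levels (4 eight-bit steps apart) over a short frame cycle."""
    lo = target_8bit // 4        # nearest 6-bit level below, as an 8-bit value: lo * 4
    frac = target_8bit % 4       # how far toward the next 6-bit level (0..3)
    shown = [(lo + 1) * 4 if f < frac else lo * 4 for f in range(frames)]
    return sum(shown) / frames   # the time-averaged level the eye perceives

# 8-bit level 141 sits between 6-bit levels 140 and 144; one bright frame in
# four lands the average right on target:
perceived = frc_simulate(141)    # 141.0
```

The average is right, but the flicker itself is why 6+2 panels can look shimmery in dark gradients compared to true 8-bit or 10-bit panels.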

    • @NeurodivergentSuperiority
      @NeurodivergentSuperiority 1 year ago

      is this good advice?

    • @superslash7254
      @superslash7254 1 year ago +22

      The problem is the standards bodies have no spine and no teeth. No spine to say "No, we don't care that it's good advertising; anything below HDR10 is not HDR," and no teeth to actually ENFORCE that in any meaningful way.

    • @rawhide_kobayashi
      @rawhide_kobayashi 1 year ago +11

      Really, you should just avoid any "HDR" monitors that aren't OLED at this point. The only ones that are actually good reach OLED pricing anyway, and OLEDs are superior in every way except the one everyone knows about, which is increasingly a non-issue, particularly if you are just using it for gaming / media consumption.

    • @marcasrealaccount
      @marcasrealaccount 1 year ago +8

      @@rawhide_kobayashi For me an OLED panel would actually be an extremely dumb choice, primarily because I do development, so I tend to have the same UI open for upwards of 8 to 10 hours a day, and that is generally a bad idea for OLED panels; even on my current LCD panel I end up with a tiny bit of image retention. But yeah, if you use your panel to look at changing content it's not really an issue, unless said content has some static elements... (transparent elements pose a smaller risk)

    • @michaelr8189
      @michaelr8189 1 year ago +18

      The "HDR 400" certification just feels like a massive scam, and I can't think of any good reasons for why it exists. The best it can do is accept HDR input, but it will always display in SDR. If you're not that literate in the tech (me), then it can be easy to assume a VESA HDR Certified label will get you what you need, even though the 400 label literally can't perform the job. Why is it a thing?

  • @angusthenerd
    @angusthenerd 1 year ago +7

    Editing HDR is absolutely the hardest part. With the correct settings, OBS looks identical to the real thing during playback (when VLC isn't lagging from the high bitrate), but exporting through an editor with settings identical to OBS leaves the lovely bright colours slightly too pale, and I don't have the patience to spend hours tweaking colour and balance levels. HDR is quite the rabbit hole.

  • @mohc6865
    @mohc6865 1 year ago +18

    Thanks Philip, love this kind of video where you go over a subject I know very little about. Explained excellently

  • @G7ue
    @G7ue 1 year ago +26

    BF1 also had some good HDR as far as I remember.
    My issue with HDR on PC is how half the games that support it require you to enable it in the Windows settings first, which always causes issues during my normal PC use if I leave it on, like apps and websites being washed out. I don't want to have to go into that setting and manually toggle it every time I load up specific games, so I just end up not using it in those games, despite paying so much for my monitors.

    • @SkyBorik
      @SkyBorik 1 year ago +1

      Just tried BFV, and it's great there too

    • @tobcyxob
      @tobcyxob 1 year ago

      In W11 you can adjust SDR brightness to minimize that 'washed out' effect. While my monitor is only HDR10, I now always run in HDR mode.

    • @raffdraws
      @raffdraws 1 year ago +1

      There's a shortcut so you don't have to go to that setting each time. It's Windows Key + Alt + B. I turn it on right before launching a game, turn off during work hours.
      Works well for me.

    • @mimimimeow
      @mimimimeow 1 year ago +1

      The thing about W10/11 HDR tone mapping is that if your display has a bang-on accurate EOTF curve, the HDR desktop slider (paper white) at 10% would be indistinguishable from an actual calibrated SDR screen at 120 nits, so you could just enable HDR all the time. It's just that 95% of consumer HDR displays have terrible EOTF at the dark end, so it looks horrible on most displays.

    • @IAmNeomic
      @IAmNeomic 1 year ago +2

      BF1's HDR is actually phenomenal. In SDR mode, the game has the typical video game look: lots of details that pop out, everything's nice and sharp, and the colors are boosted a touch to give everything more of a "hyper-real" look. With HDR on, it's almost like the color palette is totally different. The saturation is pulled back a bit, the environment almost seems to have more depth, grass and foliage pop in a different way because they stand out more against the more realistic color tones, and everything just has a more natural look to it. Sadly, with BFV, they sort of went back to that "hyper-real" look and even the HDR keeps it, really just affecting the contrast.

  • @existentialselkath1264
    @existentialselkath1264 1 year ago +108

    It's next to impossible to take a screenshot in HDR, especially one with a large dynamic range or in standard formats like EXR. That alone is insane.

    • @twentysixbit
      @twentysixbit 1 year ago +33

      God, the amount of times I forgot I had HDR on and uploaded a screenshot to Discord that looked incredibly grey

    • @lionelpotatoe1048
      @lionelpotatoe1048 1 year ago +9

      If you have an Nvidia GPU you can use Ansel to take HDR screenshots; they came out good for me in AC Odyssey.

    • @SlenderSmurf
      @SlenderSmurf 1 year ago +9

      Windows game bar of all things can do this properly. Win+G and click the picture of an old timey camera.

    • @fennnb
      @fennnb 1 year ago +4

      Game Bar gives you an HDR image file you can view in the default image viewer, and also gives you a properly converted SDR image

    • @NickJerrison
      @NickJerrison 1 year ago +2

      @@SlenderSmurf Game Bar turned out to be more useful to me than I expected. Its performance overlay is nice and compact compared to GeForce's, which takes like 15% of the screen, AND Game Bar's has VRAM stats as well. The ability to pin almost anything on top of your screen (which includes customizable external crosshairs) is also fantastic.

  • @ShinobuX99
    @ShinobuX99 1 year ago +15

    I made a significant effort to understand the details behind how HDR works back when I got my first HDR-capable TV.
    I was very enthusiastic about this tech, as all the demos and such I had seen seemed so striking and vivid.
    But when I actually tried to understand how to use it, I found almost no solid documentation, and eventually I more or less gave up.
    Today I use the media player MPC with a plug-in called MadVR to view HDR content on my TV via my PC. No idea how "correct" of a result this provides though.
    I've tried cross-referencing the appearance by playing the same movies on my phone (via Disney+ for example) and trying to change all the MadVR and TV settings to match that look.
    I think it's mostly the same now, but honestly who bloody knows.
    I've tried doing similar things with games. I'm usually very pleased with how PS5 games look in HDR, so in the cases where I own a game on both PS5 and PC, I've tried matching the HDR appearance of the PS5 on the PC.
    But this is kinda hit-and-miss, because even if one game looks good with one set of settings, others may not.
    Maybe there just aren't that many PC games that have proper HDR implementations, or maybe I just don't know what I'm doing.
    I will say though that the PC port of "Ratchet & Clank: Rift Apart" looks spectacular. I feel that it looks more or less the same on PC as it does on PS5, which is to say absurdly brilliant.
    I wouldn't say this is a very good game overall, but it's likely the most visually striking one that I've played, and my go-to example when I want to show off HDR to people who visit.
    Anyway, cool video. Thanks for providing some more insight into this rather messy topic.

    • @MrSnaztastic
      @MrSnaztastic 1 year ago +1

      When using MPC with MadVR, it performs tone mapping to compress the colour range like Philip mentioned in the video, so it looks roughly "correct" when viewed in SDR mode. Pretty clever, honestly. It's made me feel like I'm not particularly missing out by watching my 4K Blu-rays on PC, which in itself is a difficult thing to get working, but dammit, I want to stay on one device.
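
Tone mapping of the sort described here can be sketched with the textbook extended-Reinhard curve. MadVR's actual algorithm is more sophisticated; this only shows the shape of the operation, and the 203-nit reference white and 1000-nit peak are illustrative assumptions:

```python
def reinhard_tonemap(nits, sdr_white=203.0, hdr_peak=1000.0):
    """Compress HDR luminance into [0, 1] relative SDR output. Extended
    Reinhard: maps hdr_peak exactly to 1.0 while leaving the dark end
    nearly linear, so shadow detail survives the squeeze."""
    x = nits / sdr_white
    w = hdr_peak / sdr_white
    return x * (1 + x / (w * w)) / (1 + x)

shadow = reinhard_tonemap(5.0)        # ~0.024: dark detail passes through nearly linearly
highlight = reinhard_tonemap(1000.0)  # ~1.0: the peak is rolled off to the SDR ceiling
```

The curve is nearly linear near zero and saturates at the peak, which is why shadows survive while highlights get compressed.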

    • @ShinobuX99
      @ShinobuX99 1 year ago

      @@MrSnaztastic MadVR also has the option to pass the HDR signal through to the display instead of tone-mapping it into SDR.
      This is how I do it when playing HDR media on my TV, at least, and in this scenario I *think* the TV gets a proper HDR signal, the same way a 4K Blu-ray player would send it.
      But again, who knows, haha.

  • @KingKrouch
    @KingKrouch 1 year ago +63

    Valve claims that the Steam Deck has better HDR support than Windows, I'd be willing to believe them. Although some games with HDR support like Strange Brigade don't let you turn it on when playing in Gamescope.
    That being said, have you tried out Special K's HDR retrofitting? It's a game changer for games that don't natively support HDR.

    • @Omega-mr1jg
      @Omega-mr1jg 1 year ago +7

      Nope, Linux does not have a proper HDR implementation yet, and Gamescope (which is able to run HDR games) still isn't perfect at it.
      In the future, Wayland promises HDR as a feature, but that's a long way off.

    • @mechanicalmonk2020
      @mechanicalmonk2020 1 year ago +29

      @@Omega-mr1jg Gamescope's HDR _is_ the proper implementation.

    • @mar2ck_
      @mar2ck_ 1 year ago +18

      @@Omega-mr1jg What's wrong with Gamescope's HDR? It handles color profiles and screen EDID better than Windows, and they've used Proton to fix the HDR output in a lot of games so they don't look washed out like they do on Windows.

    • @comradeklaymore
      @comradeklaymore 1 year ago +26

      @@Omega-mr1jg Normal Linux computers don't have HDR support, yeah (although KDE Plasma 6 comes out in February and will have basic HDR support). However, since the Steam Deck uses Gamescope all the time and Valve has put a lot of work into HDR, the Steam Deck OLED does have HDR support. If you watch the LTT or Digital Foundry videos you can see them running games in HDR, like Cyberpunk and Ori and the Will of the Wisps.

    • @DissyFanart
      @DissyFanart 1 year ago +3

      @@comradeklaymore but but google but google said linux hdr not real how could programmers made it??? Kernel compositor ummm
      or something like that

  • @owencmyk
    @owencmyk 1 year ago +18

    "Dynamic range" refers to a very simple concept: the difference between the blackest blacks and the whitest whites. The tricky part of talking about it is that it applies in several different ways. For example:
    - A camera can be HDR in the sense that it can capture really dark and really bright colors with great detail. (Though the term "HDR" usually isn't used here; instead the dynamic range is measured in "stops", which is more or less the number of times you can double the brightness before you reach the maximum.)
    - A game can be HDR if it renders things with colors brighter than white. (even if they get tonemapped back down to SDR later)
    - Content can be HDR if it stores colors brighter than white.
    - A monitor can be HDR in the sense that the darkest black and lightest white it displays are very far apart in brightness. (even if it doesn't actually display "HDR content")
    - And finally, what people usually mean when they say "HDR", is that the game rendered in native HDR, stored the image in HDR, and sent it to a monitor which can interpret colors brighter than white, and has enough dynamic range to display them at a satisfying brightness. It's that combination that's incredibly rare.
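The "stops" measure mentioned above is just a base-2 logarithm of the contrast between the brightest and darkest levels a device can handle. A minimal sketch (the function name and the nit figures are illustrative, not measurements of any particular display):

```python
import math

def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
    """Dynamic range in photographic 'stops': the number of times the
    brightness doubles between the darkest and brightest levels."""
    return math.log2(brightest_nits / darkest_nits)

# Illustrative figures: an LCD with a 300-nit peak and 0.3-nit black level
# spans ~10 stops. Lowering the black floor raises the range far faster
# than raising the peak does, which is why OLED blacks matter so much.
lcd_stops = dynamic_range_stops(300, 0.3)
oled_stops = dynamic_range_stops(1000, 0.001)
```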

  • @ZestyRanchDressing
    @ZestyRanchDressing 1 year ago +76

    It's worth noting that HDR support in Windows 11 is much better than Win10, although still far from perfect. Using the HDR Calibration Tool also helps a lot. With the way I have my system set up, HDR is always on, but SDR content and the Windows desktop still look good, and HDR media/games look great in my opinion. AutoHDR is also pretty good for games that don't have native HDR.

    • @BrotherO4
      @BrotherO4 1 year ago +3

      This. With W11 I now just leave HDR always on, AutoHDR on, and simply make the SDR brightness nice and low for everyday viewing.

    • @devlin1991
      @devlin1991 1 year ago +5

      Agreed. I have an LG C9 (760 nit peak), I created the calibration profile in Windows and set it as the default color profile. That + HDR on and SDR set to a comfortable nit value via the HDR settings looks great and "just works". I run the TV on HGiG mode. Baldurs Gate 3 and Talos Principle 2 both work great.
      Edit: I do not use auto-HDR, I try to follow creators intent and play SDR games in SDR.

    • @zxbc1
      @zxbc1 1 year ago +8

      Between Windows 11 AutoHDR and Special K, HDR gaming on PC is anything but a hot mess; it's actually quite good now. The terrible triple-A games are just a drop in the bucket in terms of the number of games on PC, so whenever I hear people talk about ___ being terrible because "triple-A games are bad at it" I just cringe. It shows a stark disconnect with reality.

    • @LeegallyBliindLOL
      @LeegallyBliindLOL 1 year ago +5

      It should prompt you once you enable HDR tho. I still don't get why we have to manually download an app and do it

    • @BrotherO4
      @BrotherO4 1 year ago +2

      Because of two reasons. Reason 1: some games base their max nits on the OS-level value while others ignore it, i.e. the ones that give you manual settings are the ones that skip the OS.
      Reason 2: many TVs/monitors use per-scene tone mapping, so the default "max nits" Windows reads from the device is based on that tone-mapping system and is completely false.
      So when you turn tone mapping off or use HGiG, we stop the TV/monitor from adjusting the brightness, leaving only the OS/game to control everything. Also note you should do this even on consoles: let the OS/game handle the tone mapping and keep the TV/monitor out of it.
      Side note: AutoHDR uses the OS information. For example, my LG C2 42" would normally tell Windows that my max brightness is 4,000 nits, which is not even close to the real maximum of 800. After setting HGiG and using the calibration app, I get no more clipping. For games that do have manual settings, the good news is that you are now adjusting them against a static value as well.

  • @LiterallyPolio
    @LiterallyPolio 1 year ago +10

    My best HDR experience as someone who bought an LG Ultragear monitor about 6 months ago has been the Dead Space remake. The implementation in that game is just ridiculous. Looks amazing.

    • @mycelia_ow
      @mycelia_ow 1 year ago +1

      The Ultragears are only DisplayHDR 400; it would look even more incredible on a display with true HDR capability.

    • @LiterallyPolio
      @LiterallyPolio 1 year ago

      @@kevinsedwards it's the OLED one. 1440p 240Hz 27", unsure of the model

    • @LiterallyPolio
      @LiterallyPolio 1 year ago

      @@mycelia_ow been looking into upgrading it since I'm starting to see a little burn in already and I bought the replacement plan at Best Buy. Bought it at nearly full price so I'd get like $850 towards something new. Was looking at the Samsung models since they boast much higher brightness.

    • @LiterallyPolio
      @LiterallyPolio 1 year ago

      @@kevinsedwards yea that's what I'm looking at. The QDOLEDs look awesome. I'm on an RTX 3080 so I don't want to go too crazy with the resolution but we'll see where we end up.

  • @wesleywittekiend423
    @wesleywittekiend423 1 year ago +78

    I'd be interested to see a follow-up on how Windows 11's Auto HDR feature factors into this. As someone who has a high-end HDR TV connected to my gaming PC, I'm wondering if Windows 11 is worth the upgrade.

    • @MarlinHunter673
      @MarlinHunter673 1 year ago +33

      It actually works FAR better than built-in HDR in some games. I don't play too many "HDR-enabled" games that come out these days, and the ones that I have played I did before getting an HDR monitor, but it absolutely is the case in War Thunder right now (it wasn't until an update a couple of months ago, which just shows that HDR support is so fickle it gets broken where it previously worked).

    • @Lagger01
      @Lagger01 1 year ago +12

      It's literally the only reason to get Windows 11. I've been playing emulated Switch games with Auto HDR on my OLED and they look so clean.

    • @raffdraws
      @raffdraws 1 year ago +6

      Auto HDR is actually neat

    • @richardhunter9779
      @richardhunter9779 1 year ago +9

      It's kind of worthless. For most games, they still look overexposed like normal SDR content does, but the bright spots are way brighter, leading to far too much of the screen being far too bright. Banding is about the same.
      The only purpose of it is getting extra brightness that your display maybe does not support in SDR mode.
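Microsoft hasn't published how Auto HDR expands SDR frames, but the complaint above (midtones stay overexposed while highlights explode) is exactly what any naive inverse tone map does. A toy sketch, purely illustrative and not the actual Auto HDR algorithm:

```python
def naive_inverse_tonemap(sdr_signal: float, peak_nits: float = 800.0,
                          exponent: float = 2.0) -> float:
    """Toy SDR-to-HDR expansion: a power curve pushes the top of the SDR
    range toward the display peak. Midtones move modestly, but anything
    near white gets boosted enormously -- hence 'far too much of the
    screen being far too bright' when the curve is badly tuned."""
    return peak_nits * (max(0.0, min(1.0, sdr_signal)) ** exponent)

# A mid-grey signal of 0.5 lands at 200 nits, while full white hits all
# 800 -- a 4x gap that simply doesn't exist in the original SDR image.
```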

    • @TheCloverskull
      @TheCloverskull 1 year ago +2

      I have a Samsung Neo G7, and Windows HDR with proper colour calibration looks really dang good with most content.

  • @mooistheroo
    @mooistheroo 1 year ago

    I love this channel! The voice along with the music theme is just so great. Every single one of the videos I watch from you just feels right at home. Thank you!

  • @utfigyii5987
    @utfigyii5987 1 year ago +35

    Hey Philip, I don't know if you've already heard, but the new Steam Deck OLED is capable of HDR too. This is big news because there has not been proper HDR for Linux until now.

    • @BAGG8BAGG
      @BAGG8BAGG 1 year ago +2

      Are you sure that is right? From what I understand, X11 is completely unable to display HDR content; it would need a complete rewrite to work, which is why they are hoping to include it in Wayland instead.
      It's genuinely impossible to find out about HDR stuff in regards to Linux right now.

    • @BAGG8BAGG
      @BAGG8BAGG 1 year ago +1

      @@Sevenhens Cheers for the info :)

    • @utfigyii5987
      @utfigyii5987 1 year ago +2

      @@Sevenhens Plasma 6 is planned to have some level of HDR support and it should be released early next year. When SteamOS will upgrade to Plasma 6 is a whole different story. Exciting stuff!

    • @zocker1600
      @zocker1600 1 year ago +4

      @@Sevenhens
      > The desktop mode runs on wayland
      At least on my Steam Deck with the latest stable update, desktop mode is still running on X11.

    • @gobgonson8053
      @gobgonson8053 1 year ago +13

      @@BAGG8BAGG The way Valve is getting around this is by bypassing the desktop environment entirely through Gamescope, their Wayland-based compositor. In desktop mode you're still stuck with SDR.

  • @ctcwired
    @ctcwired 1 year ago +5

    Professional video colorist & HDR mastering engineer here. Agreed with most points in the video. Besides the technical challenges, there are also cultural ones. Just like when stereo came out and musical instruments would get panned wildly all over the place, or when surround sound came out and every movie had to have something rotating behind you for "wow" factor, HDR has similar issues. Often the brands / marketing / directors either hate HDR or love HDR, and thus push things in various directions: attempting to "wow" customers for their money by overdoing it, or, if they hate HDR, publishing HDR content that's identical to the SDR content, sometimes even out of spite.
    Even if identical content gets published in SDR/HDR containers, it often won't look identical, because the controls for scaling brightness (called the diffuse white level) are either missing or poorly explained on most consumer displays. For instance the SDR mode on your TV might be set twice as bright as reference, but then the HDR mode might be bang on reference, thus looking weirdly darker. Even worse if your viewing environment isn't appropriate for reference brightness levels (trust me it's not!). That's assuming the HDR mode on the display even performs as well as the SDR mode, and that the playback device is properly switching modes, or properly compositing the mixed content within the UI. The only consumer product that gets this right out of the box at the moment (including mixed content UI compositing and brightness scaling) is modern iOS devices. (The standard is BT.2408 by the way). Chromium browsers & the latest Pixel devices are on their way to getting it right soon as well, from the looks of their developer forums at least. (Google is making a big push for BT.2408 and Gain Map HDR (Ultra HDR) photos support.)
    Just like how a good surround sound mix should immerse you but not distract you, an HDR grade is good when you can barely tell the difference. It is better if you directly compare, but not in a way where anyone sitting next to you would notice or care unless you pointed it out. When HDR is used in this way, it is more likely to give all users a better & more consistent experience, and there will be little to no difference between distribution standards either (HDR10, Dolby Vision, etc.). It is only when pushing to extremes that the standards differ, and displays struggle.
    Finally, HDR video will almost always need to be carefully curated by the hand of an artist / developer / filmmaker. The idea of cameras immediately outputting HDR-ready video is likely to never work all that great, because just like how surround sound needs context to be done well, HDR video needs context to be done well.
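For reference, the BT.2408 diffuse-white figure mentioned above (203 nits) can be located on the PQ curve with the SMPTE ST 2084 inverse EOTF. A sketch (the function name is mine; the constants are the published ST 2084 ones):

```python
def pq_inverse_eotf(nits: float) -> float:
    """SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits to a
    normalized 0-1 signal value, as encoded in HDR10 streams."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    y = max(0.0, nits) / 10000.0  # PQ tops out at 10,000 nits
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2

# BT.2408's 203-nit reference white lands at roughly 58% of the signal
# range, leaving the upper ~42% of code values purely for highlights.
```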

    • @Timothy003L
      @Timothy003L 1 year ago

      Does iOS use scene-referred or display-referred mapping?

    • @ctcwired
      @ctcwired 1 year ago

      @@Timothy003L In terms of the display? It's just following BT.2408, where diffuse white for SDR & HDR is anchored at 203 nits and is user-adjustable from there. I suppose you could call that a display-referred scaling.
      In terms of video capture, most video modes on consumer devices are display-referred. HLG inherited the same ambiguity that Rec. 709 did of being *technically* scene-referred, but no-one actually uses it that way. So on most camera systems, HLG mode will include a bit of that manufacturer's "baked look". Even broadcast cameras tend to add sweeteners for achieving the "broadcast color" look (more saturation). The iPhone camera's HDR video mode (which is HLG DV Profile 8.4) still uses all kinds of local tonemapping, sharpening, etc. across the image, and uses an unusually high ambient viewing environment level, and thus will look several stops overexposed unless corrected manually with editing tools or the Dolby Vision downconversion is applied (some of Apple's tools can do this for you).
      The new iPhone 15 Pro camera does have a "Log video" mode, which is truly scene-referred and ACES-compatible. Works great in my testing. It disables local tonemapping as well. The PDF document of the log encoding specification is available on Apple's developer site downloads page. (requires login)

    • @Timothy003L
      @Timothy003L 1 year ago

      ​@@ctcwired Sorry, I meant in terms of displaying SDR content. I assume it uses the method specified in "5.1.1 Display referred mapping." Does it use per-component gamma 2.2 like macOS? Or does it use gamma 2.4 and then adjust the gamma of the luminance according to section 6.2 of BT.2390? The latter would preserve colors but consume more battery.
      And what about sRGB content? Windows uses the piecewise sRGB EOTF to map SDR to HDR. Although technically accurate, it looks washed out compared to a pure power gamma function.

    • @ctcwired
      @ctcwired 1 year ago +1

      @@Timothy003L Ahh okay. So unlike AVFoundation/QuickTime on macOS, which still to this day uses a strange & archaic OETF inverse for BT.709 decoding (effectively gamma 1.961, or the EOTF minus the dim surround compensation), iOS/iPadOS seem to do things much more appropriately imo. The EOTF of the iPhone display varies based on ambient brightness levels, but in my testing it seems to decode most SDR content at pure 2.2. The sRGB spec has always been a bit ambiguous in that regard (several pages of engineers arguing about it on ACEScentral), but from my understanding piecewise encoding was meant for file encoding/decoding, while the physical display (CRT) was assumed to be roughly 2.2 power.
      At the time (1990s) this mismatch in the darkest areas between the encoding math and the physical display's characteristics was accepted as a reasonable tradeoff for the distortion issues caused by attempting to deal with an infinite slope at low bit depths on old GPUs. This is probably not necessary anymore, so in the modern day there's an argument to be made of "what the standard says" vs "what everyone does". That's on top of color management & environment compensation methods being leaps and bounds better than they were at the time the spec was written. It's a bit of a mess, and hopefully the spec gets updated.
      In terms of how iOS composites SDR with HDR content, Apple uses their "EDR display" system: it effectively ramps the display brightness up while ramping the SDR UI content down at the same time. Sometimes you can see a subtle flickering when this happens. Android calls this same process "SDR dimming". I believe (and hope!) they still use pure power laws for that.
      The Chromium developers meanwhile actually wrote a guide on how they plan to composite HLG still-images in-browser, which you might find interesting: docs.google.com/document/d/17T2ek1i2R7tXdfHCnM-i5n6__RoYe0JyMfKmTEjoGR8/edit#heading=h.q1k46in92kx1
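The piecewise-vs-pure-2.2 disagreement discussed above is easy to quantify near black. A sketch (both transfer functions are the standard definitions; the comparison values are illustrative):

```python
def srgb_eotf(signal: float) -> float:
    """Piecewise sRGB decode (IEC 61966-2-1): signal value to linear light."""
    if signal <= 0.04045:
        return signal / 12.92  # linear toe avoids an infinite slope at 0
    return ((signal + 0.055) / 1.055) ** 2.4

def gamma22_eotf(signal: float) -> float:
    """Pure 2.2 power law, roughly what a CRT (and most displays) actually do."""
    return signal ** 2.2

# Deep in the shadows the curves diverge hard: at a signal of 0.02 the
# piecewise toe emits several times more light than pure 2.2, which is
# why content graded on a 2.2 display looks lifted/washed out when a
# compositor assumes the piecewise curve instead.
```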

  • @jayceneal5273
    @jayceneal5273 10 months ago +2

    RTX HDR has single-handedly made HDR worth enabling at this point (at least for games and videos). It really works awesomely well.

  • @jarowskis
    @jarowskis 1 year ago +8

    There's a cool new feature on the Pixel 8 where you can take HDR photos that map the brightness. When you display those photos, the screen is brighter in the bright areas and darker in the dark ones. Looks amazing.

    • @Soandnb
      @Soandnb 7 days ago

      Samsung also has that, as of the S24. And I think iPhone is going to be able to get it as well. Chromium-based browsers like Google Chrome or Brave will also support this gainmapped JPG format which they call "Ultra HDR"
      And I found a tool that lets me convert the scene-referred EXRs of my artwork into the exact same HDR gainmap format.
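Conceptually, these gain-map formats store a normal SDR image plus a per-pixel log2 gain; the viewer scales that gain by the display's actual headroom. A sketch of the idea (the names and the linear weighting are mine; Ultra HDR's real metadata has more knobs):

```python
def apply_gain_map(sdr_linear: float, log2_gain: float,
                   display_headroom_stops: float, map_max_stops: float) -> float:
    """Reconstruct a displayable value from an SDR base pixel and its
    log2 gain. On an SDR screen (zero headroom) the base image shows
    unchanged; with full headroom the entire stored gain is applied."""
    weight = min(1.0, max(0.0, display_headroom_stops / map_max_stops))
    return sdr_linear * (2.0 ** (log2_gain * weight))

# Same file, two screens: an SDR laptop shows the pixel at 0.5, while a
# display with 2 stops of headroom shows it at 2.0 (0.5 * 2^2).
```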

  • @Maxoverpower
    @Maxoverpower 1 year ago +2

    You showed CS2 for a moment there. That game does not support HDR, that's why it looks bad if you try. The ingame settings that mention HDR are just referring to how the game is rendering the visuals internally before tonemapping to SDR.

  • @SirBlicks
    @SirBlicks 1 year ago +11

    Never thought I would be blinded by my phone until I watched HDR content in the dark 💀

    • @inqizzo
      @inqizzo 12 days ago

      Now that websites have HDR content on them, it's frustrating when the brightness changes every time you enter a certain website.

  • @Pepso8P
    @Pepso8P 1 year ago +2

    I don't know much about HDR, but what I know is that anything on our new TV looked horrendous regardless of the settings, until we disabled HDR and suddenly everything looked super nice. Not a good marketing move IMO.

    • @willuigi64
      @willuigi64 1 year ago

      Yeah poor displays still attempting to accept HDR is just shit HDR. You don't *really* have a proper HDR capable TV if it looks horrendous.

  • @getsideways7257
    @getsideways7257 1 year ago +3

    0:33 Some phones do have proper HDR (HDR10, HDR10+ or Dolby Vision) and can sometimes even shoot in one of those formats. Quite a few games even on PC *also* have proper HDR support (and then there is SpecialK for those that don't).

    • @getsideways7257
      @getsideways7257 1 year ago

      Also, it's not "fake HDR", it's called tone mapping.

    • @getsideways7257
      @getsideways7257 1 year ago

      Also, there is no "HDR 1000" - it's just that at least 1000-nit peak brightness is preferred for HDR10 content, but you can get away with just 600 or even 400 (not recommended though).

    • @getsideways7257
      @getsideways7257 1 year ago

      2:15 and onwards for a while: actually, it's more about the color space at that point than it is about the brightness. Don't forget that one of the most exciting things about an HDR screen is how deep the color is. While a typical SDR screen is mostly limited to about 100% sRGB color space coverage, some of the most advanced HDR ones can get pretty close to covering the whole of Rec. 2020.
      As for the banding problem, it would not be as extreme as you are describing, since the SDR screen would naturally be limited to the same brightness levels as the HDR one, and you are forgetting about GAMMA. That's why some HDR TVs still use 8-bit panels, not to mention that almost all of the 10-bit ones are really 8-bit + 2-bit FRC.
      Also, you can force lower bit depth on your HDR monitor as well...
      Of course, I'm not saying that there is no difference whatsoever; I'm simply saying that you are making the difference look more extreme than it actually is.
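The bit-depth point is easy to demonstrate: quantize the same subtle gradient at 8 and 10 bits and count how many distinct steps survive. A minimal sketch (ignoring FRC/dithering, which exists precisely to hide these steps):

```python
def quantize(value: float, bits: int) -> float:
    """Round a 0-1 signal to the nearest code value at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A dark gradient covering only the bottom 10% of the signal range:
gradient = [i / 1000 for i in range(101)]
steps_8 = len({quantize(v, 8) for v in gradient})    # ~27 distinct bands
steps_10 = len({quantize(v, 10) for v in gradient})  # every sample distinct
```

Roughly four times as many steps at 10 bits over the same stretch, which is why stretching an 8-bit signal across an HDR brightness range makes banding so obvious.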

  • @FightStreeting
    @FightStreeting 1 year ago

    You have one of the most consistently interesting outputs of videos across all your channels. Thanks!

  • @hedgeearthridge6807
    @hedgeearthridge6807 1 year ago +9

    I just got the Pixel 8, which actually has an HDR screen, and I'm blown away by it. Whether it's completely genuine or not is beside the point: in a picture of a landscape the clouds actually glow white. It looks so much more realistic and almost tricks you into thinking it's slightly 3D. Colors are so much more nuanced; they're clear while being neither oversaturated nor undersaturated. So now I want a computer monitor with actual HDR, and I'm finding it's not really possible without emptying my bank account and risking ruining an OLED with burn-in, just to get a patchy, broken experience like Philip is describing.

  • @clshortfuse_
    @clshortfuse_ 1 year ago +1

    Excellent video. I work on the Luma mod for Starfield to address that lack of good HDR in video games you mentioned. Just some things I want to mention from watching your video. Tone mapping is when you take the raw scene (e.g. the sun at 10K nits) and map it to your output (e.g. 100-nit SDR, 1000-nit HDR). If you do it linearly, then you just stretch, but that's a bad tone map. You basically want the first 0-15 nits to look identical and roll off the rest. Something at 5K nits shouldn't be 50 nits in SDR; it should be close to 95 nits. It's nerdy math, but good tone mapping has a curve.
    If you want to make your HDR mastered video look good in SDR, you want to rebuild that curve again, but for SDR. The BT.2446a is a good way to tone map HDR down to SDR without making everything washed out. But yes, you as the video producer or gamer, or even video game designer, really shouldn't have to concern yourself with all this. We all need better tools.
    Great video!
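The curve described above (shadows untouched, highlights rolled off) can be sketched with a linear toe plus a Reinhard-style shoulder. This is an illustration of the shape, not the Luma mod's actual curve:

```python
def tonemap_nits(scene_nits: float, peak_nits: float,
                 linear_until: float = 15.0) -> float:
    """Map scene luminance to a display peak: pass the first ~15 nits
    through unchanged, then roll off smoothly toward the peak instead of
    scaling everything linearly (which would crush a 5,000-nit highlight
    down to 50 nits on a 100-nit SDR display)."""
    if scene_nits <= linear_until:
        return scene_nits
    x = scene_nits - linear_until
    headroom = peak_nits - linear_until
    return linear_until + headroom * x / (x + headroom)  # asymptote at peak
```

With a 100-nit SDR target, a 5,000-nit highlight comes out in the high 90s instead of at 50; pointing the same function at a 1,000-nit HDR target leaves far more highlight detail intact.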

  • @stevethepocket
    @stevethepocket 1 year ago +4

    The resolution thing hits home pretty hard. Even now, monitors with a minimum of 192ppi are rare and never the ones they're advertising. Instead they just keep pushing bigger, wider, and/or curved monitors that have the same mediocre resolutions and only work at 150-175% scaling which looks terrible.

    •  1 year ago +1

      To be fair, most phones are held at less than half the distance in comparison to the distance that people sit away from a monitor. But you are right. It has bothered me many times by now that monitors still have so little PPI. One should think that a PPI of 200 would be a minimum in 2023 for monitors.

  • @FightStreeting
    @FightStreeting 1 year ago

    Thanks!

  • @YezzyHD
    @YezzyHD 1 year ago +21

    Omg, thank you for this. The HDR support on most monitors claiming to have it is also horrible. There are so many displays with horrible HDR that can give you ghosting or flickering. Truly good HDR is a gem that's hard to find.

    • @ZX3000GT1
      @ZX3000GT1 1 year ago +13

      Or those that claim HDR support but only support it up to a certain brightness limit.
      Looking at you, HDR400 displays.

    • @NeurodivergentSuperiority
      @NeurodivergentSuperiority 1 year ago

      @@ZX3000GT1 Better than nothing I guess, even if it's barely anything.

    • @michaelr8189
      @michaelr8189 1 year ago

      @@ZX3000GT1 Seriously, I feel as if I got scammed by that label a few years ago, and I'm still bitter as hell about it.

    • @mimimimeow
      @mimimimeow 1 year ago

      It's hard in the monitor market, quite easy in the TV market. Most upper-mid-range TVs since 2017 exceed 600 nits. Though you have to deal with their sizes and VA smearing (but let's be honest, IPS is horrible for HDR).

  • @CastledCard
    @CastledCard 1 year ago +5

    I used to use HDR and fight it in my editing, but gave up, and now I don't use a selling point of my monitor anymore lol. The only thing holding HDR back is that editing software hasn't caught up yet, even though it's had enough time to.

    • @IAmNeomic
      @IAmNeomic 1 year ago

      The main issue is that the industry pushed 4K really heavily when, in reality, HDR should have been the selling point. Most people in a typical living-room TV setup aren't going to see much difference between 1080p and 4K, outside of content that was poorly mastered in 1080p, like early Blu-rays. Even though HDR is pretty standard now, there are still a ton of TVs being made today that don't have it, or have a very poor implementation of it, especially a lot of cheaper TVs where "HDR" mode just cranks the backlight up.
      That in itself is causing a lot of issues with people even understanding what HDR is supposed to be, which is not only a greater range between black and white, but also the tens of millions of new shades of color that previously had to be squished into our old formats with "tone mapping" software.

  • @Ularg7070
    @Ularg7070 1 year ago +22

    HDR is one of those things where one bad experience makes you just want to not bother. For me that was RE2 Remake, it felt like I could get the monitor to run in HDR but not the game, or the game to run in HDR but not the monitor. Between that and how funky auto HDR is in windows for playing older non-HDR content, I just leave it set to off now because it's not worth having to juggle these settings depending on which game im playing that day.

    • @hbaabes
      @hbaabes 1 year ago +2

      True! When I got my monitor that supported HDR, I was very excited to see how it would turn out, and only ended up being disappointed by the first experience. But one day I tried to play Doom Eternal, and that game turned on HDR automatically, and I was like DAMN THIS IS BEAUTIFUL! I realized the HDR was on and was actually looking very good. No other experience besides Doom Eternal was ever good though.

    • @abeidiot
      @abeidiot 1 year ago

      With the latest Windows, most games don't have this problem. In fact, RE2 Remake is the only game that made me tear my eyes out. It does weird things with the Windows setting for some reason.

    • @lukelmaooooo
      @lukelmaooooo 1 year ago

      yeah the hoops you have to jump through sometimes to get a game looking good in HDR is honestly a slap in the face when you consider that HDR10 has existed for almost a decade now

    • @mycelia_ow
      @mycelia_ow 1 year ago

      DisplayHDR 400 should not exist; it is not true HDR.

    • @boboboy8189
      @boboboy8189 1 year ago

      @@mycelia_ow Sadly, that's where monitor manufacturers sell cheap monitors. It looks horrendous when I turn HDR on; SDR looks beautiful.
      HDR needs at least 600 nits, but even then you can barely see a difference from SDR; only at 1000 nits does it look great.

  • @Krushking99
    @Krushking99 1 year ago +2

    Thank you for not making this video HDR.

  • @eduardbalfego
    @eduardbalfego 1 year ago +4

    This video represents me very well. I also went through a lot of frustration. I have an HDR-compatible setup, and after months and months of trying to get it to work I gave up... for now I am sticking with SDR. Maybe in the next five years HDR will have more support and become a standard, but for now it seems that the next improvement is OLED screens.

    • @sunyavadin
      @sunyavadin 1 year ago +1

      If I could just get any two HDR-capable games to both look good AT THE SAME TIME on my HDR monitor I'd call it a win, but the ability to configure the HDR settings within any given game is basically nonexistent, and none of them are consistent in which monitor settings work best with them.


  • @Graylord88
    @Graylord88 1 year ago +2

    The effect you demonstrated in HL2 isn't really an example of how modern games typically do HDR. That effect is more commonly known as automatic exposure and has nothing to do with what we'd call HDR these days; it just emulates a camera (or eyes) adjusting to light levels. Games can do both automatic exposure and HDR separately.
    Modern games, especially on console, actually do the real deal, handshaking with the TV and making use of its HDR capability to display a much wider range of contrast the way the developers intended.
    HDR on Windows, however, my lord is it bad.

  • @TonyDaExpert
    @TonyDaExpert 13 days ago +3

    HDR when done wrong is nothing special; when done right, it's amazing.

  • @dedli_midi
    @dedli_midi 1 year ago +1

    posted in november 2023. so bold. so relevant and topical

  • @Symthos
    @Symthos 1 year ago +5

    No seriously, recording in HDR is a NIGHTMARE! Countless clips thrown away because something cool happened, but either I couldn't capture it because Shadowplay disabled itself, or Shadowplay broke midway through because of HDR. Funnily enough, the only capture that seems to work is the default Windows Key + G "Game Bar", but don't leave it running too long, else it suffers a RAM leak and I have to force-close it in Windows 😮‍💨

    • @KARLOSPCgame
      @KARLOSPCgame 1 year ago

      What about OBS? I was recording QuickSync 10-bit 1440p AV1 BT.2020 just fine the other day.

  • @Sgtcrazyeyes235
    @Sgtcrazyeyes235 1 year ago +1

    The real killer for HDR content on PC is the lack of reasonable high-refresh HDR displays. Me personally, I don't want a 32-inch curved 4K screen just because it's proper HDR.

  • @ISAK.M
    @ISAK.M 1 year ago +4

    I just bought my first HDR-capable monitor today and now this pops up. Coincidence? I think not.

  • @AmoLaMusicaIAC
    @AmoLaMusicaIAC 14 days ago +1

    And the big problem is that you need to buy a €500-1000 monitor to view HDR content correctly. I got a decent IPS monitor for €150 (on sale; it was €250 originally) from LG that has no contrast but really good color accuracy, no latency or ghosting, 144 Hz... but it loses 50% of its color accuracy in HDR mode. All the images become blueish and kind of dead. When connecting the screen to my latest MacBook Pro 14 I can really notice the abysmal difference in colors (in SDR mode they are really similar; in HDR, colors are dead on the LG). I appreciate the HDR lighting in foliage etc. when playing on my PlayStation, but the total loss of color vibrancy makes me turn it off, and I'm even sad I took this decision after I played RDR2. I think it has to do with what you say at 2:19: the screen isn't capable of showing a big amount of light AND all the colors in the middle at the same time. It's a shame, as you say, because HDR is the future and has a lot of potential, but for now it just isn't supported by games, apps and screens.

  • @quinndepatten4442
    @quinndepatten4442 1 year ago +4

    The examples of HDR were really cool. If I make a 3D game I'd love to make use of real HDR. My eyes are immediately drawn to the brighter areas; it could make for intuitive level design.

  • @Talik13
    @Talik13 1 year ago

    Finally! This is such a concise way to explain it. As a photographer and digital artist, I've had to deal with the pains of HDR and the joys of color spaces like sRGB, Pro Photo, AdobeRGB, and the fun new Display P3 and what screens can display them properly. (And I refuse to try and deal with printer inks and color calibrations.)

  • @Lucknutxbl
    @Lucknutxbl 1 year ago +5

    SpecialK does a pretty nice job of converting SDR games over. Not perfect, but a lot better than AutoHDR anyway, and you can also dial in the brightness you want with it. Sometimes I actually prefer the HDR from SK over the one included with some games too.

  • @hmm9962
    @hmm9962 a year ago +2

    My phone has a 12-bit HDR Dolby Vision screen, but it isn't really useful for the same reason lol... no content is made in HDR and DCI-P3.

  • @beardalaxy
    @beardalaxy a year ago +5

    My experience with HDR has been things just looking more washed out and blown out. Maybe it technically looks more realistic? IDGAF if that's the case, though; I just want my game to look nice. My 10 bpc full dynamic range IPS monitor does that just fine.

    • @TechandDoc
      @TechandDoc a year ago +2

      I think most people have shared this experience. Problem is less than 1% are videophiles, and those of us that are can't be bothered to adjust display settings for each individual piece of HDR content. In my case I end up fiddling with it so much it just takes me out of the content being displayed.

    • @MaaZeus
      @MaaZeus a year ago +2

      Then you haven't experienced real HDR, as it is exactly the opposite when correctly displayed on a screen that actually shows a real HDR image. You have dark blacks, clear whites and bright highlights without losing detail in any of them. The image is closer to how you see light in real life, instead of the postcard-like image that SDR is.
      The problem is that it is a hardware feature and requires a good screen, but because VESA decided to chicken out on their HDR standards there are a lot of computer monitors on the market that claim HDR support but do not actually show it properly; they just compress it down to a washed-out SDR image. IPS is immediately ruled out: too low a contrast ratio, even with local dimming. VA can do it, but even then it needs good direct-LED local dimming; on the plus side, it can achieve the brightest highlights. OLED, however, does HDR the best for most content.
      The other problem is that game developers are half-assing HDR implementation. Movies do not have this problem and you do not have to set or calibrate a god damn thing on your TV (beyond using movie mode if you are a videophile like me): all the information (brightness of the scene etc.) is baked into the video and your TV tone maps the image to its best capabilities.

    • @TechandDoc
      @TechandDoc a year ago +1

      @@MaaZeus The HDR information encoded into movies only works properly with direct Blu-ray discs. Streaming and other encoded media all use tone mapping, which is just a best guess. So no, until digital streaming HDR is standardized, it's dog shit.

  • @Sevenhens
    @Sevenhens a year ago +2

    Did you use an OLED monitor though? Or at least a mini-LED one? An LCD monitor will have problems with the backlight raising the black level, and it won't even get close to a proper HDR implementation no matter what the manufacturer says.
    Another huge issue with good HDR monitors is that the best ones are OLEDs, which unfortunately have pretty low peak brightness compared to standard OLED TVs. For example, my LG OLED UltraGear monitor only goes up to about 600 nits, while my LG C1 TV goes up to 1000 nits. Another issue is the auto-dimming on 70-100% white-level content that happens on the LG monitor, once again destroying the dynamic range the manufacturer advertised. It's all just a mess that made me sort of regret getting an OLED monitor this early on.

  • @romanr1592
    @romanr1592 a year ago +3

    All they had to do was increase the bit depth of SDR, which would have been perfectly backwards compatible. Instead they made a system where your TV decides for you what kind of picture you are going to see; of course that made a hot mess.

  • @edragyz8596
    @edragyz8596 a year ago +2

    Man, this video didn't even go into the Windows tonemapping issue. It gets WORSE.
    Essentially all consumer displays have an ABL curve (yes, even LCDs), and this curve tells you how bright your display is at a given picture level. For example, if 10% of your screen is white and 90% is black, the white portion will be brighter than in a 50%/50% image, and even brighter than in a 100%/0% image.
    If you use that 10% peak brightness value for your games, you'll end up blowing out parts of the image if the APL of the scene gets too high. The solution would be to set your brightness lower than that or, if you want to avoid these blowouts at ALL COST, to set the worst-case brightness.
    Translating from BS to English, this means that shiny new HDR1000 monitor you got is only really an HDR300 monitor. And that's not far-fetched or an exaggeration. Take Samsung's Odyssey Neo displays as an example: they claim a max of 2000 nits, output 1200 nits, and have a fullscreen brightness of about 350 nits. If you're familiar with HDR displays, you should recognize the Odyssey Neo lineup as some of the better HDR displays available, which really drives my point home.
    There is a true solution, called tonemapping, but this solution is sort of shunned amongst enthusiasts for some reason. Tonemapping essentially allows my display to adjust luminance in accordance with my ABL curve in real time, allowing for an accurate picture AND for that sweet sweet advertised peak brightness. The downside is games need to support 3500 nits in order to give me my full 1565 nits after conversion. If a game only supports 3000 nits I end up a little short, at 1400ish.
    Oh, except one small problem: Windows either intentionally or unintentionally breaks the HDR metadata handshake needed for this tonemapping to occur, no matter the GPU vendor. Consoles don't have this issue and work flawlessly.
    The fix for THAT is finicky, but it works. Close everything (I'll go into more detail in a moment), open your game with HDR ON, turn HDR OFF (Windows Key + ALT + B), turn HDR ON (Windows Key + ALT + B), open an HDR video in Movies & TV, pause the video. Repeat this until it works; sometimes it takes me a couple of tries.
    What makes this finicky is how fragile the tonemapping state is. Open a web browser and it breaks (repeat the steps above), accidentally preview a program window on your taskbar and it breaks, etc. etc. If you do ANYTHING other than play your game, always check to see if it's still working.
    For the video I just press the Windows key while still tabbed into the game, which pulls up recently used files, and I then click the video I want. It's actually a custom-made video I used to troubleshoot this two years ago; this is more consistent for me than opening File Explorer. The video is a 3000-nit box with 10000-nit text that reads, "IF YOU CAN READ THIS THE MONITOR IS WORKING."
    I DM'd Philip on Twitter two years ago when I was having this crisis for the first time on my PG32UQX. Crazy how fast time flies.
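    The window-size-dependent brightness and the 3500-to-1565-nit scaling described above can be sketched roughly like this (the ABL table is hypothetical and the numbers are taken from this comment, not from any real panel's datasheet; a real display's tonemap curve is also not a simple linear scale):

```python
# Illustrative sketch: a display's achievable peak depends on how much of the
# screen is lit (the ABL curve), so a tonemapped output level can be modelled
# as the source level scaled by display peak / content mastering peak.

# Hypothetical ABL curve: window size (fraction of screen lit) -> peak nits
ABL_CURVE = {0.10: 1565.0, 0.25: 1000.0, 0.50: 600.0, 1.00: 350.0}

def display_peak(window: float) -> float:
    """Peak nits for the smallest ABL window that covers this picture level."""
    for size in sorted(ABL_CURVE):
        if window <= size:
            return ABL_CURVE[size]
    return ABL_CURVE[1.00]

def tonemap(nits: float, content_peak: float, window: float) -> float:
    """Naive linear scale from the content's range into the display's range."""
    peak = display_peak(window)
    return min(nits * peak / content_peak, peak)

# A 3500-nit-mastered highlight in a small 10% window maps to the full 1565:
print(tonemap(3500.0, 3500.0, 0.10))  # 1565.0
# The same highlight as a full-screen flash is limited by ABL to 350 nits:
print(tonemap(3500.0, 3500.0, 1.00))  # 350.0
```

    With this model, a game mastered to only 3000 nits tops out below the display's 10%-window peak, matching the "a little short" behaviour described above.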

  • @H.DISTRICT
    @H.DISTRICT a year ago +3

    The Steam Deck OLED is going to be my first HDR display. Looking forward to seeing what it’s all about.

    • @jm27232
      @jm27232 a year ago

      You can't even play HDR videos on it.

  • @philiphansen1103
    @philiphansen1103 a year ago

    Thank you. I really thought it was just me. I'm glad I'm not alone on this.

  • @astrea555
    @astrea555 a year ago +20

    Great video; it would have been even better if you had explained the HDR color gamuts as well. SDR is all Rec.709 colors; HDR also adds DCI-P3 and Rec.2020 colors. It's not just about more brightness.
    It's also about pure blacks. I'm guessing this is why you showed an OLED TV, and you did represent the lack of pure blacks on most LCDs in your graph, but yeah, it's worth insisting on.

    • @MaMuSlol
      @MaMuSlol a year ago +3

      It is also worth noting how limited the color gamut of Rec.709 and sRGB (which shares Rec.709's primaries) is. They have only about 50% of the colors of NTSC! Yes, the analog television standard from about 70 years ago. We actually got a big downgrade in color going from the analog video signal to HDTV. It's no wonder most TV manufacturers jack up the colors in SDR mode by default.

    • @Meinhardt.Erschwans
      @Meinhardt.Erschwans a year ago +4

      That's not quite right. HDR is CAPABLE of using a wider color gamut, but no game makes use of this. In the worst case, colors are displayed highly oversaturated when the game is not identified as color-managed software. Most developers use HDR exclusively for brightness and contrast. And usually even that quite poorly...

  • @ajko000
    @ajko000 a year ago +1

    I noticed that if I enable Windows 10 HDR, everything gets really bland except in a game that supports it. Why is this? I heard mention of a color profile or something, but I want to know more. Or is this just because HDR is such a hot mess that it'll only look right if the displayed media is explicitly "working with HDR", to not make it a dim grey soup?

  • @dolan_plz
    @dolan_plz a year ago +4

    I really like what Special K can do with games with broken HDR implementations!
    DOOM Eternal really has the best HDR I've ever seen. The colors just pop; there's so much detail that you can't see with SDR.

  • @azenyr
    @azenyr a year ago +1

    This problem is made far worse when you consider that many modern monitors that claim to support HDR are not capable of displaying HDR content AT ALL. All they mean by "HDR support" is that they can receive an HDR signal and decode it correctly to show on their non-HDR display panel. It's extremely misleading and frustrating, and most casual users consider HDR a useless gimmick on PC because of these lies from display and PC manufacturers. The only screens capable of doing proper HDR are mini-LED and OLED screens; no plain LCD will ever be able to do it, no matter how many HDR words they spam in the marketing materials. And most if not all mini-LED and OLED monitors still cost close to or more than a thousand dollars, so when 99% of PC users use a $150-200 monitor and will probably keep using it until it dies, HDR will not become a PC standard anytime soon.

  • @kylekermgard
    @kylekermgard a year ago +12

    I think the lack of mainstream HDR monitors is the biggest issue. I've been using an OLED TV as a monitor because the other options suck.

    • @jekaufo
      @jekaufo a year ago +5

      Yes, because HDR400 monitors are a scam, or come with a very low dimming zone count.

    • @crestofhonor2349
      @crestofhonor2349 a year ago

      Luckily the newer MiniLED and OLED monitors on the market actually have some really good HDR

    • @mimimimeow
      @mimimimeow a year ago

      I've been using a Samsung NU8000 TV as an oversized ultrawide+tall monitor since 2019. It does 800 nits and 120Hz and cost 600 EUR at the time. No monitor came close to that value until 2022 or so.

  • @Tenmo8life
    @Tenmo8life a year ago +4

    5:00 just wanted to inform you *mr klikphilip* , you appear to be missing a model and texture file for one of your desk ornaments 🫤

  • @OverNine9ousend
    @OverNine9ousend a year ago +1

    Massive disclaimer: if content isn't made for HDR and doesn't have those contrasting colors, HDR just kind of ruins it?

  • @cantti
    @cantti a year ago +3

    Thanks for putting this into words and out there. For the longest time it has bothered me. A lot. I have enjoyed HDR in the living room since the days of the PS4. Currently even most streaming services put out Dolby Vision like it's the norm. Even my iPhone hops in and out of HDR like a champ. And it's all great.
    But at the same time my PC, with state-of-the-art hardware and an OLED monitor, is a hot mess of a user experience when trying to use HDR. Manually toggling it on and off. Half of the "HDR" being some sort of pseudo-HDR shenanigan. And all the while the PC Master Race seems to act like it's all fine and dandy.

  • @Tenraiden
    @Tenraiden Год назад +1

    This is one area where Macs have been light years ahead of PC/Windows. The entire Apple hardware lineup and their software pipeline has supported hiDPI greater than 1080, HDR and deep color gamuts for years while PC users are still stuck in the 1080 sRGB cave pounding rocks 😅

  • @jayzn1931
    @jayzn1931 a year ago +3

    Well, the new Steam Deck now supports HDR. I haven't tried it myself, but I do think it works well. This is great, because it's also the first time I've heard of HDR on Linux. As Lightroom also just got an HDR mode, I am confident that things are getting better. But I also have problems with it all the time, so I mostly just leave it off.

  • @Nightmare78hAlo
    @Nightmare78hAlo a year ago +2

    At this point I'm rather glad I didn't pursue an HDR monitor in the end. Back in 2019 I was looking for a new monitor and wanted to go HDR, because I love deep blacks and bright colors (it's why I often went with VA panel monitors, since they tend to have the best contrast), but all of the HDR monitor offerings were either still too expensive or simply lacked other things I wanted from them.
    At this point it almost feels like HDR will never get properly adopted, and instead a fundamentally different display panel such as OLED will be the main way to get better and deeper colors.

  • @CarbonOtter
    @CarbonOtter a year ago +4

    Another thing to add to this: it's not just highlight and shadow details that HDR enables. There is... another. Color gradients in SDR/8-bit color necessitate color banding due to the lack of subtle color representations. HDR's 10/12-bit encoding adds many, MANY more color steps to reduce that banding. Without it, the only way to make banding less apparent is dithering, which is how most games today handle it, and they hide the dithering with awful generic film grain.
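    The banding argument above can be made concrete by counting the code values available across a gradient (a minimal sketch; the 0.20-0.30 signal range is an arbitrary example, and this ignores the different transfer functions of SDR and HDR):

```python
# Sketch of why 8-bit gradients band: count the distinct code values
# available across a narrow gradient at different bit depths.

def steps(lo: float, hi: float, bits: int) -> int:
    """Distinct quantized levels between two normalized signal values."""
    levels = (1 << bits) - 1          # 255 for 8-bit, 1023 for 10-bit
    return round(hi * levels) - round(lo * levels) + 1

# A gradient spanning 10% of the signal range:
print(steps(0.20, 0.30, 8))   # 26 shades -> visible bands
print(steps(0.20, 0.30, 10))  # 103 shades -> much smoother
print(steps(0.20, 0.30, 12))  # 410 shades
```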

    • @cube2fox
      @cube2fox a year ago

      HDR dithering should mostly look pretty good even without film grain. Perhaps better than SDR without dithering.
      Anyway, HDR doesn't add "shadow detail". SDR black is exactly the same as HDR black. The difference is that HDR allows for brighter colors and a wider color gamut. E.g. super white and super pink.

    • @CarbonOtter
      @CarbonOtter a year ago +3

      @@cube2fox That's not *quite* right. HDR color encodings have more steps of color in addition to more range of brightness. Shadows tend to be cleaner due to better representation: think of an SDR shadowed area only having 12 shades to work with, while the HDR representation might have 50. There is indeed more detail in the shadows; however, the ability of your screen to render it (OLED vs LCD, for example) and the responsiveness of the human eye viewing it mean the effect is much more subtle in perception. The flashy, easily noticed benefits are at the bright end, but the dark end does stand to gain from HDR... though most game developers don't bother with shadow details because the majority of gamers use LCD variants, so good luck there.
      Cyberpunk 2077 with path tracing enabled has extra shadow detail (procedurally using the full HDR range), but the baked shadows, for instance, are a bit crushed since the devs didn't make HDR a design priority.

  • @andygrundman
    @andygrundman a year ago +1

    It's slowly getting better, 2 of the year's best games feature great HDR implementations: Baldur's Gate 3 and Alan Wake 2. AW2 in particular really stands out, especially when using ray tracing. It's also the only game I'm aware of that makes use of Windows 11's HDR calibration profile. BG3 makes great use of HDR for things like spell effects and shafts of light in underground caves. Another very good sign is the Steam Deck OLED: a proper HDR display that is more accessible to the average gamer. Let's hope it's easy to use....

  • @pandiem
    @pandiem a year ago +3

    For now at least, when using an OLED monitor you feel like you're getting the benefit of HDR even in SDR alone, because of how much contrast there is. It sort of simulates the HDR spectrum: you can go to absolute zero blacks, sometimes making your eyes themselves need to adjust. To me, turning HDR on is a much less noticeable difference than the monitor itself.

    • @mycelia_ow
      @mycelia_ow a year ago

      Depends on the panel. The $1200+ OLED HDR monitors go well above 1000 nits and show even more color. Just a very high price to pay. The cheaper OLED monitors are nearly as good though.

    • @pandiem
      @pandiem a year ago

      @@mycelia_ow There are no gaming OLED monitors currently that go above 260 nits for SDR or 1000 for HDR (HDR10).
      I think you're thinking of OLED TVs; they definitely get way brighter.
      Despite the low brightness of monitors, though, it's not noticeably dim whatsoever after regular use.

    • @_Slach_
      @_Slach_ a year ago

      @@pandiem I don't understand why brightness is limited in SDR. Is that an anti burn-in measure or what? If so, wouldn't you just get burn-in faster by consuming HDR, since it's much brighter?

    • @ericthomas2388
      @ericthomas2388 a year ago

      @@_Slach_ It's not limited. It's just beyond the standard, which you can blissfully ignore.

  • @Retalak
    @Retalak a year ago +1

    HDR on W11 actually breaks my ShadowPlay and causes extreme lag.

  • @dustycarrier4413
    @dustycarrier4413 a year ago +9

    There's HDR processing, and then there's HDR content. The two are drastically different concepts, even if they need one another for HDR content to work.

  • @AL2009man
    @AL2009man a year ago +1

    This is great timing, given the Steam Deck OLED just came out, simultaneously and casually solving the HDR problem Windows has by bringing it in line with console HDR.

  • @SyntaxDaemon
    @SyntaxDaemon a year ago +4

    HDR is super overrated. Better tonemapping for SDR is where it's at. Maybe eventually the support will be good enough to be worth it.

  • @hoo2042
    @hoo2042 a year ago +1

    Games I've played that had good HDR on PC:
    - Doom (2016)
    - Horizon: Zero Dawn
    - Uncharted
    - Shadow of the Tomb Raider
    That may be it...
    I want to put Cyberpunk 2077 on this list, but there's something weird about the HDR in that game that still feels a bit off even after 2.0. I think it's something about the mid-gray tone mapping, although it's hard to describe even though this is literally my job.
    And two of those are ports of PlayStation games. On that note, consoles have had some great HDR content, mostly due to the consistent set of capabilities across users and broader support for HDR on TVs rather than computer monitors (I play on an HDR TV in my living room, even for mouse/keyboard PC games).

  • @BrotherO4
    @BrotherO4 a year ago +4

    I see you use CS2; that means it uses AutoHDR, which means it uses the OS max nits. So: 1. Make sure to turn off any tone mapping feature on your monitor, and if you're using a TV, turn on HGiG. 2. Use the Windows HDR Calibration tool to set your max nits. Then open CS2 again and it will be correctly mapped.

    • @Blue_Soul_Slayer
      @Blue_Soul_Slayer a year ago

      Thank you, I'll have to try that. I was wondering why it didn't look any different. When I bought my new monitor and switched from 1080p low-end HDR to 4K HDR1000, I was very eager to test everything. Some YouTube videos were great, and movies too. Gaming is meh; there's only a small number of HDR games. But since I'm on Windows 11, I like how HDR is handled compared to Win 10. I found out AMD Adrenalin couldn't record in HDR, which was a bummer, so I'm currently using OBS. But I eventually gave up on chasing the perfect HDR experience, since nothing is cohesive yet (especially games or mainstream YouTube).

    • @Wozza365
      @Wozza365 a year ago +3

      The fact that you have to calibrate things for a decent experience makes it almost inevitable that this will never become a standard feature of every monitor until the process requires no setup for the average user.

  • @ultrapotassium
    @ultrapotassium a year ago +1

    HDR IS NOT WIDE COLOR GAMUT (WCG)! HDR alone will NOT get you a greater number of colors in an image, only different colors. You can have 8-bit HDR, or 10-bit/WCG non-HDR. HDR encoding only affects the OETF/EOTF (commonly known as "gamma"). Color space (e.g., Adobe RGB) is yet another element. Unfortunately, these three distinct elements are often conflated, which adds to the confusion and causes broken pipelines when one or more of them is not properly conveyed.
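    To make the OETF/EOTF point concrete, this is the PQ (SMPTE ST 2084) EOTF that HDR10 signaling switches a display to, independent of bit depth or color space (a minimal sketch; the constants come from the ST 2084 specification):

```python
# The "gamma" that HDR10 actually changes: the SMPTE ST 2084 PQ curve,
# which maps a normalized 0..1 signal to absolute luminance in nits.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """ST 2084 EOTF: normalized PQ code value -> luminance in cd/m^2 (nits)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_eotf(1.0)))  # 10000 nits at full signal
print(pq_eotf(0.5))         # ~92 nits: most of the PQ range encodes dark tones
```

    Note how half the signal range only reaches about 92 nits, which is why a display that interprets a PQ signal with an SDR gamma (or vice versa) looks so badly washed out or crushed.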

  • @FallingDMG
    @FallingDMG a year ago +6

    I wonder if the Steam Deck OLED will lead to more developers implementing HDR properly. The early reviews all seem to praise HDR content on the new screen, *if* the developer has implemented it properly. Games with bad implementations, on the other hand, were often called out for that.
    Maybe having a popular, HDR capable all-in-one device, like the Deck, will a) make it easier for developers to implement and test HDR and b) give an actual incentive to do so.

  • @__alves_
    @__alves_ 13 days ago +1

    Same, I have the same problem: an HDR monitor but no content, games, etc.

  • @NeoCyrus777
    @NeoCyrus777 a year ago +6

    Imagine not pirating Adobe products...

    • @MelkisgoedvoorJan
      @MelkisgoedvoorJan a year ago

      I have been searching for it but couldn't find a pirated version of Illustrator. Where do I have to look?

  • @FDM-xu4wt
    @FDM-xu4wt a year ago

    Really glad I watched this. I've been contemplating buying an HDR monitor so I can finally make use of that disabled "HDR" option in games, feeling like I'm missing out on the 'real' experience. Knowing it's just another fake experience takes some of that FOMO out of the decision.

  • @ri0t263
    @ri0t263 a year ago +9

    I think most PC gamers are still on IPS displays with terrible contrast ratios. Even though they are "HDR400 certified", content looks dark and underwhelming with HDR enabled. Maybe when OLED or at least decent local dimming displays become more mainstream we'll see more HDR content and support.

  • @pucktoad
    @pucktoad a year ago +1

    I will be honest, I don't care about most graphical improvements. They all feel like gimmicks.
    1080p, 60fps, vsync on. That's all I need, from RDR2 to Deus Ex.

  • @pastyboi2789
    @pastyboi2789 a year ago +7

    We need to go back to CRTs, man. No native res, good contrast. Shit was good (minus the weight and size).

  • @kaden-sd6vb
    @kaden-sd6vb a year ago +1

    When I turned off HDR in Armored Core 6 (it was enabled by default), I was stunned at the world of color I had been deprived of by HDR before,
    and greatly annoyed at the fact that the HDR did nothing but make the game look significantly worse. It was so *grey* with HDR on; everything was so faint and whited-out color-wise.
    Now I sort of know why it did that.

    • @mimimimeow
      @mimimimeow a year ago

      Armored Core 6 actually has decent HDR, but do not push the brightness too much or it lifts the black floor, making everything grey. You can even use the slider to push certain bright highlights, like lasers and flames, into the DCI-P3 color space. That is assuming you have a proper HDR display to begin with.

  • @burger406
    @burger406 a year ago +9

    don't care, my monitor is SDR TN 165hz
    -burger40

  • @croneryveit9070
    @croneryveit9070 a year ago

    It's criminal that this video isn't in HDR format on YouTube.

  • @shivanshagarwal12
    @shivanshagarwal12 a year ago +5

    just saying hi to Philip because he said he reads every comment :)

  • @kendokaaa
    @kendokaaa a year ago +1

    There's some conflation, or a lack of explanation, in this video regarding HDR (brightness), colour (bit) depth and colour gamut. You get less banding on a 10-bit display with 10-bit content without needing HDR. Additionally, the colours we don't usually see on displays are missing because most content is sRGB. Monitors with super wide colour gamuts that include very deep colours have existed for a while, to the point where it's been a problem: a lot of monitors lack an sRGB clamp mode, so sRGB content gets stretched to the much wider gamut, resulting in oversaturation. HDR doesn't necessarily have anything to do with that. A game could be made for higher-gamut monitors just like some applications are, and this doesn't require HDR to be supported or enabled.
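    The stretching/oversaturation described above can be shown numerically. A minimal sketch using the sRGB and Rec.2020 conversion matrices (coefficients as published in ITU-R BT.2087; all values are linear light, with transfer functions omitted):

```python
# Why sRGB content looks oversaturated on a wide-gamut panel without a clamp:
# the same (R, G, B) triple means a more saturated color in Rec.2020 primaries.

SRGB_TO_XYZ = [
    [0.412391, 0.357584, 0.180481],
    [0.212639, 0.715169, 0.072192],
    [0.019331, 0.119195, 0.950532],
]
XYZ_TO_REC2020 = [
    [ 1.716651, -0.355671, -0.253366],
    [-0.666684,  1.616481,  0.015769],
    [ 0.017640, -0.042771,  0.942103],
]

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def srgb_to_rec2020(rgb):
    """Correctly remap a linear sRGB triple into Rec.2020 coordinates."""
    return mat_vec(XYZ_TO_REC2020, mat_vec(SRGB_TO_XYZ, rgb))

# Pure sRGB red, expressed correctly in Rec.2020, is NOT (1, 0, 0):
r2020 = srgb_to_rec2020([1.0, 0.0, 0.0])
print([round(c, 4) for c in r2020])  # ~[0.6274, 0.0691, 0.0164]
# A panel that skips this conversion shows (1, 0, 0) in its own, wider
# primaries instead -- a far more saturated red than the content intended.
```

    An sRGB clamp mode is essentially the panel applying this remap for you; without it, every sRGB value is reinterpreted as native wide-gamut.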

  • @Pocketkid2
    @Pocketkid2 a year ago +3

    A more accurate video title would be "HDR is a hot mess on computers". 4K Blu-ray and streaming services have had HDR for movies and TV shows since 2016, and since the process is so streamlined it rarely goes wrong; HDR improves the image over SDR far more often than not.

  • @BoloH.
    @BoloH. a year ago +2

    Bought a 240Hz monitor about two years ago. "Wow, this has HDR!" Tried it once or twice and haven't bothered with it since. You could make an alignment chart of the various HDR implementations.

    • @tflp1950
      @tflp1950 a year ago

      That is absolutely not a true HDR-capable monitor like an OLED.