Dear DragonRiderNetwork! Would a Samsung Q90R give better picture quality if it were Dolby Vision (DV) capable and the film was produced in DV? Listening to your advice, HDR10 and HLG (the two static-metadata formats) seem more than enough, because the color space of today's films is almost always DCI-P3, so there is no need for DV. Also, most displays cannot exceed 1000 nits, and below 1000 nits of peak brightness you don't need dynamic metadata (DV or HDR10+) for dynamic tone mapping, right? But what happens if I play DV content on a Samsung Q90R (2000 nits peak), which has no DV? And what happens if an LG C9 tries to play HDR10+ content? The C9 is DV-capable only, with no HDR10+. I haven't gotten a correct answer yet; can you answer these questions, please?
The biggest problem I have with this is that, as a photographer, I know that anything over 8 bpc gives no noticeable increase in color quality; 8 bpc is all you need to see everything beautifully. So what's the point of 10 and 12 bpc? The point is flexibility when developing RAW photos. When you make certain changes to shadows and highlights, darks and lights, contrast, brightness, etc. while developing a RAW photo, the more color information you have the better, because you can make more extreme changes without losing information. But as far as displaying the finished photo, 8 bpc is enough. And that picture you showed of the sun with the color banding? That was NOT 8 bpc; it was a lot less. A picture like that would be smooth and look perfect displayed at 8 bpc. So this explanation is not really right. Or at the very least, it's misleading and confusing 😔
After 8 bpc there's no improvement... until you switch to HLG photo, which basically uses them fancy ass new HDR video features like luminance on the photo
Contrast is exponentially improved, both local and global. Local contrast improvement leads to higher perceived details. More details can also be displayed in the highlights of the image.
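For anyone curious, the banding claim a couple of comments up is easy to check with a quick Python sketch (illustrative only): quantize a smooth ramp at different bit depths and count how many distinct levels survive.

```python
# Quantize a smooth 0..1 horizontal ramp to n bits per channel and
# count the distinct levels produced; few levels = visible bands.
def visible_levels(width_px: int, bits: int) -> int:
    max_code = 2 ** bits - 1
    codes = {round(x / (width_px - 1) * max_code) for x in range(width_px)}
    return len(codes)

print(visible_levels(1920, 4))   # 16 coarse steps: obvious banding
print(visible_levels(1920, 8))   # 256 steps across a 1920px ramp
```

So a heavily banded sky or sun image really is showing far fewer than 256 levels per channel, whether from low bit depth or heavy compression.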
haloharry97 There are a few Dolby Vision Blu-rays out on the market (which you can search up). 4K Blu-rays are either HDR10, Dolby Vision, or both, each getting a separate layer on the disc.
HDR10+ is dynamic, though, if I remember correctly. But still... there is no UHD Blu-ray that supports HDR10+; all support Dolby Vision. Netflix and Amazon Prime support Dolby Vision too. It really is an industry standard.
It's still 10-bit color depth. While that's better than the static HDR10 that all content has, it's still inferior to Dolby Vision, which I still think will win this HDR "war".
The main benefit of HDR is the brightness level; studies show viewers are far more sensitive to increased contrast than to color. This is because humans have poor color memory.
For anyone who's wondering how 8-bit, 10-bit, and 12-bit translate to 16M, 1B, and 68B colors: the bit count applies to each of the 3 RGB channels, so you multiply all 3 together to get your full color count (n bits for Red, n bits for Green, n bits for Blue).
So
8-bit: (2^8)^3 = 256^3 = 16,777,216
10-bit: (2^10)^3 = 1024^3 = 1,073,741,824
12-bit: (2^12)^3 = 4096^3 = 68,719,476,736
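The same arithmetic in a few lines of Python, for anyone who wants to check it:

```python
# Each channel has 2**n levels; the three channels combine
# independently, so total colors = (2**n)**3.
def total_colors(bits_per_channel: int) -> int:
    levels = 2 ** bits_per_channel
    return levels ** 3

for bits in (8, 10, 12):
    print(f"{bits}-bit: {total_colors(bits):,}")
# 8-bit: 16,777,216
# 10-bit: 1,073,741,824
# 12-bit: 68,719,476,736
```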
I've been out of school for 7 years. This hurt my brain 😂😂😂
Oh wow, so in other words, 12-bit is waaaaaaaay better
"We _just_ got through the whole thing [format war] with Blu-Ray vs. HD-DVD!"
"The HD DVD Promotion Group was dissolved on March 28, 2008." -Wikipedia
10 years hasn't really qualified as recent in...well, about a decade.
Yeah I was like 'you just wake up bruh?'
Thank you! That's exactly what I said.
Tell that to toshiba, they lost a billion dollars on hd dvd
I like most of your explanation, except you're incorrect about Dolby Vision's HDR abilities. Dolby Vision does have a 12-bit color depth that no current TV has, but it downsamples to the 10-bit that top-of-the-line 10-bit televisions can do. Plus, Dolby Vision's HDR is dynamic, which means it can adjust on the fly frame by frame, whereas HDR10 does not. HDR10 uses static HDR with a set tone-mapping range based on the TV or the media format's set nit brightness.
freddywayne I'm actually doing an update to my Sony 930e to be able to display Dolby vision
freddywayne hmmm 12 bit and dynamic seems way better future wise. Dolby vision wins. But then Betamax video was better quality.
@Hiro Nito you're the one that's stupid. No movie is over 1000 nits , directors don't want to blind the viewer when watching a movie.
@Old Man hendo no.
@Hiro Nito Actually, the chipset in the TV adjusts the Dolby Vision content to its capabilities. So it's totally different from HDR10, which is displayed as it comes. The Dolby Vision experience is nearly identical on different TVs and as close as the producer wanted it. HDR10 is displayed however the TV brand likes, with its own tone mapping.
Who’s watching this cause u wanna know what Dolby vision is for the iPhone 12 pro
Bro
Mee 😁
The question is can the display on the 12 Pro show us those true 12-bit or just HDR10+ ones
My phone also have
😂
When this video was made 5 years ago, Dolby Vision was not available. But now it is.
So basically, Dolby Vision is like 8K in that you can shoot with it, but there is almost no reasonable way to get the most out of the format yet with current technology. For Dolby Vision it's color; for 8K it's resolution. There are not a lot of TVs or monitors that can reproduce the full potential of the image. Great explanation!
Thanks!
Not true. 8K is resolution-based, and it's harder to see the difference with something as simple as a sharper, more detailed image, especially since we don't have many things filmed in 8K (and they're normally downscaled to 4K in that case). Movies with CGI, like Marvel movies, may be filmed in 4K or higher at times, but they're downscaled to a 2K digital intermediate every time because the CGI cannot be rendered at 4K. Anyway, back to the point: you can notice a color difference from pretty much any viewing distance. With that being said, HDR10 is static HDR, while Dolby Vision adjusts frame by frame; I would say it looks 10 times better than HDR10. Maybe I'm conflating that with going from a standard LED/QLED to OLED, but I recently upgraded to a Sony OLED and the difference is astronomical in my opinion. And OLEDs are compatible with Dolby Vision. Basically, I just recommend an OLED in any case if possible.
BS. Dolby Vision is dynamic so it scales to the capabilities of the TV while maintaining the creative intent.
@@Janken_Pro the TV by definition can’t show what was captured so…. Okie dokie
Apple has Dolby vision now😂
😂😂😂
Haha i was comparing iphone 12 pro with one plus 8t😂😂
@@HARRYMELBURNIAN damn lmao
Apple's new iPhone can ONLY record in Dolby Vision.
How is that funny ?
Okay, this was 3 years ago and he says no television on the market has Dolby Vision, and now in 2020 the iPhone 12 can record 4K Dolby Vision video, which is a crazy upgrade within 3 years.
Yeah, no kidding! :0
Ya just because your phone can record 4k dolby vision doesn't mean it looks as good as a professional grade camera.
He looks like Dustin from stranger things.
This is definitely a first.
Also somewhat similar to Seth Rogen, lol.
DragonRiderNetwork lmmfao
A cross between Chris Pratt and Seth Rogen may be.
I can’t unsee it now
why does seth rogen sound so different in this video?
He doesn't look enough like Seth Rogen for that joke to be funny... you just ruined Christmas.
@@gerald2540 yeah.. .im like... 😑 but apparently 22 others agree.
3:58 - I think 4 years later, the tech is here, yes?
Bravia 9
@@siimmrx What about the Bravia 7?
LG Super UHD TVs and OLED flagship TVs have had Dolby Vision technology since 2016... As of the 2018 models, LG Super UHD TVs and OLED TVs support HDR10, Dolby Vision, HLG, and Advanced HDR by Technicolor as well...
But are they 12bit panels?
LG also support Dolby Atmos!
@@red5standingby419 no 12-bit panels yet, 4-5 years away at least
@@red5standingby419 Dolby Vision is good not because of 12 bit, but because of dynamic metadata. HDR10 uses static metadata which is not that good. HDR10+ has dynamic but lacks support (it never will get widespread support).
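To make the static-vs-dynamic point concrete, here's a toy Python sketch (not real HDR10/Dolby Vision code; the scene peak numbers are made up): static metadata gives every scene the same tone-mapping ceiling, while dynamic metadata lets each scene carry its own.

```python
# Toy model: the "ceiling" is the peak luminance the TV tone-maps against.
def mapping_ceilings(scene_peaks_nits, dynamic: bool):
    if dynamic:
        # Dolby Vision / HDR10+ style: per-scene (or per-frame) metadata,
        # so a dark scene isn't compressed against a bright scene's peak.
        return list(scene_peaks_nits)
    # HDR10 style: one static value (akin to MaxCLL) for the whole title.
    title_max = max(scene_peaks_nits)
    return [title_max] * len(scene_peaks_nits)

scenes = [120, 950, 400]                        # made-up per-scene peaks
print(mapping_ceilings(scenes, dynamic=False))  # [950, 950, 950]
print(mapping_ceilings(scenes, dynamic=True))   # [120, 950, 400]
```

With the static ceiling, the 120-nit scene gets tone-mapped as if it might hit 950 nits, which wastes the display's range on headroom it never uses.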
Honestly I am fine with HDR 10. My tv does do Dolby vision but it doesn't look right because of the 10 bit color scheme. My tv is preset to HDR 10 and everything looks amazing on it.
Dolby Vision HDR on an LG C8 OLED tv is honestly...a magical experience.
Hi, my TV has Dolby Vision and HDR10, but I switch off Dolby Vision. Do you have both on?
Mate, a magical experience to me is a night in the sack with 3 Asian beauties, now that's feckin magical! You're talking about a TV set, which at best is bloody good.
@@paulmidgley8040 nah, this is better screen quality than cinemas
@@118halospartan well whatever floats your boat dude.
And a dull one... my QLED is better
Is the intro a remixed dubstep version of "where is my mind"? Am I the only one that hears it?
Yeah, it's literally said in the description
You had the idea of HDR completely wrong. HDR isn't about the colours. Yes, HDR supports 10-bit, but it has more to do with dynamic range; colour depth just comes along with it.
Dolby says it in their whitepaper, which can be found on their website. The 12-bit color depth is important even if the panel can only display 10-bit. The important part is the PQ EOTF curve, which can show banding in dark areas even with HDR10 (0-10 nits with a Delta E over 4).
And Dolby Vision ensures a good picture on all TVs, because every Dolby Vision enabled TV gets sent to Dolby to measure its capabilities. These parameters are written to a chip that interprets the DV stream and adjusts the values to the TV, so the picture is as close to the producer's intent as possible.
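A rough sketch of what that per-TV adjustment buys you (illustrative Python, not Dolby's actual math): naive clipping throws away everything above the panel's peak, while a knee/roll-off curve keeps some gradation in the highlights. The 75% knee position is an assumption for the example.

```python
def clip_map(nits: float, panel_peak: float) -> float:
    # Naive static mapping: everything above the panel's peak is lost.
    return min(nits, panel_peak)

def knee_map(nits: float, source_peak: float, panel_peak: float,
             knee_frac: float = 0.75) -> float:
    # Pass shadows/midtones through unchanged, then compress
    # [knee, source_peak] into the panel's headroom [knee, panel_peak].
    knee = knee_frac * panel_peak
    if nits <= knee:
        return nits
    t = (nits - knee) / (source_peak - knee)
    return knee + t * (panel_peak - knee)

# A 4000-nit master on a 700-nit panel: clipping flattens 1000 and
# 4000 nits to the same value; the knee curve keeps them distinct.
print(clip_map(1000, 700), clip_map(4000, 700))
print(knee_map(1000, 4000, 700), knee_map(4000, 4000, 700))
```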
Just simple as that...no discuss. Good and direct job.
Thanks!
Thankfully, as of 2020, there isn't much to worry about anymore regarding the whole HDR10 vs Dolby Vision "format war", because reasonably priced TVs that support *both* are becoming more and more common. It's actually possible to get a decently sized display that supports at least both of those two standards, if not *all three* (when you include HLG, which doesn't even really need to be natively supported by your display, since it can easily be converted in real time to HDR10 when played through compatible software and devices), for less than $500 USD now.
In about 2-3 years we will have 8K micro-LED 12 bit Dolby Vision TVs :D
Nah, MicroLED won't be ready for consumers before ~2025, and it will take a few years before it's mature (it will have issues for sure at launch). Samsung will probably push them ASAP because LCD is so dated and they never got OLED right (or don't want to buy LG panels). All top-end TVs use OLED at this point, but Samsung keeps insisting on OLD TECH.
Yea and it'll cost you your first born child.
Are you Seth Rogan’s son 🤓
Well, the technology is here now!
HDR: the generic term
HDR10: open standard, 10-bit
HDR10+: Samsung's dynamic-metadata extension of HDR10
Dolby Vision: Dolby's proprietary dynamic-metadata HDR
Fantastic explanation. At last I understand the differences. Thank you
*_68 Gazillion COLORS!!!! Looks at a color wheel, there's about 25 colors on there, where are the other 68,999.... Gazillion?_* 🤣
You know a director hates it when he sees his film on a TV with just a measly one billion colours....."Ugh, that should be bubblegum pink not electric pink!!"
When your feet are in the air and your head's on the ground
And what about HDR 10+ vs Dolby Vision?
What would be very interesting to make an update video.
All I know is...my new LG CX OLED supports Dolby Vision and so does Netflix, Disney Plus etc. So I'm a happy visual viewer all around!!
Is Dolby Vision better than HDR10? Like, do the colors pop out better? I plan on getting an LG OLED BX soon to replace my Samsung NU6900 TV.
Hey, I don’t know your name, but I would like to say that, I am from RECIFE in Brazil South America, and your video was very elucidating, or I could say explanatory if you will, thanks for sharing that to this part of the world...and keep up the good work.
Thank you very much! Glad to have viewers from around the world!
You can have both on a TV right? I mean HDR10 and Dolby vision and the TV adapts to the content right?
Yes that's true!
I really, really love the 4K restorations from Arrow, Kino Lorber, Criterion Collection, Second Sight, Eureka Entertainment, Shout Factory, Studio Canal, WB Archive, etc., but I also like my older (good) Blu-rays on my LG OLED Evo TV with Filmmaker Mode.
Man that solves the problem, thank you for the explanation bud!
No problem!
Thanks a lot for the explanation. I was looking for something easy to grasp and you did a great job. Other videos always use terms like "peak brightness, deep blacks, etc.", making the difference tougher to understand.
Cheers. 👍🏾
I have a TV that supports both hdr10 and Dolby vision...
the name of the tv?
@@kygz2969 Vizio model number M55-D0
Any TV that supports DV automatically supports HDR10.
Icke Greada same I got the 50 4K hdr
LOL, that set has terrible reviews. GL with that.
Actually, HDR is about light as well as color. For a screen to be true HDR, at minimum it needs to be able to display 1000 nits. That's because the bigger the distance between the darkest and brightest spots in an image, the more detail comes forward, creating a higher dynamic range. That's also why an OLED TV capable of producing 1000 nits would be the best TV screen out there, since the distance between its blacks and whites would be so huge.
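That "distance between blacks and whites" can be put in photographic stops with a one-line formula; the black-level figures below are assumptions just for illustration.

```python
import math

def dynamic_range_stops(peak_nits: float, black_nits: float) -> float:
    # One stop = a doubling of luminance, so range = log2(peak / black).
    return math.log2(peak_nits / black_nits)

# Assumed figures: a 1000-nit LCD with 0.05-nit blacks vs a 1000-nit
# OLED whose measured black is closer to 0.0005 nits.
print(round(dynamic_range_stops(1000, 0.05), 1))     # ~14.3 stops
print(round(dynamic_range_stops(1000, 0.0005), 1))   # ~20.9 stops
```

Same peak brightness, but the deeper black floor is what multiplies the usable range.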
This is about the best explanation that I've read of how true HDR works. A lot of fellas throw shade on TVs like the Samsung QN90 and try to compare it with the likes of the LG C1 and Sony A80J, with their mere 700 maximum nits, when the QN90 can get up to 2000, which gives you amazing HDR.
My TV is only 250 nits peak brightness. Nature programs are bright and vibrant, so you don't need super brightness. Some programs are dull though, same as on expensive TVs.
@@neilcampbell4833 that is different. A proper HDR screen would only brighten a portion of the screen when needed like when showing bright reflections on the chrome fittings of a car. The other portions like the interior would not be bright and would be dark. This creates that amazingly high contrast. Another benefit of HDR is wider color gamuts. So saturated things like neon lights would look as such.
Haven't found a 12-bit TV panel anyway. Apple is unfortunately supporting Dolby Vision and does not include HDR10 metadata in any of its Dolby Vision movie lineup.
Yash .. if you have a HDR10 only tv, you get the HDR 10 version of the movies.
It says Dolby Vision before you buy or rent it, but after purchase you get the HDR10 version if your tv doesn't support Dolby Vision.
Actually it's just dolby vision. Apple TV processes the Dolby Vision data to create a fake HDR10 output. It's not authentic HDR10. :(
And I can actually see a difference when watching HDR10 from Netflix on my PS4 against the same video on apple tv.
I'm looking into the same topic at the moment and like your video. People still (3 years later) don't understand that the problem is the panel, not the format.
Video starts at 1:15
Woah, so based off this information, the iPhone 12 has 12-bit HDR because it’s equipped with Dolby Vision?! o_o That’s incredible.
Dope intro song. I love it
Thanks!
Ok but why are you staring directly in my soul dude.. blink or something ffs!
6 years later we still can't take full advantage of DV
And now we have a sweet HDR 10+
Would Dolby Vision change to HDR 10 by the Apple TV 4K
Can you please help, anyone? I purchased an LG NanoCell 90 TV, and when Dolby Vision is on, the image looks so dark and gray that nothing pops out. But when I use my PS5, which doesn't have Dolby Vision, it looks amazing: super 4K, colors bright and clear. Is my TV defective?
Didn't know Seth Rogen knows so much about TV's and picture quality.
Everyone's got a hobby.
I got one question. Will all hdr10 content look the same, better or worse on a Dolby vision tv?
idk why it would look worse
If anything, I can imagine it will look better, as most Dolby Vision TVs are high-end TVs.
haloharry97 That's not true.
Some low end TVs(TCL and Vizio) have support for Dolby Vision.
And to answer his questions. No it would not look worse as all Dolby Vision TVs support HDR10.
It will look the same.
The problem is that the content we can get today is more available in Dolby Vision, and it is not compatible with HDR10. So although the TV can't support all of DV, it can support it partially. Whereas with HDR10, we don't have much content.
How does the Vu Premium Android 55" TV have an 8-bit panel and still have Dolby Vision certification??? Can you explain?
Marketing.
My OPPO Blu-ray player has a 36-bit color setting that seems to work with my new 4K TV; that must not be the same thing you are talking about here?
if I was to stream a movie on Vudu using Dolby Vision vs the disc version which is HDR10, which one is better?
So, can a Dolby Vision TV play an HDR video? Or does it just play the 4K resolution?
Thank You for knowledge!
You're welcome!
What about cable or satellite TV relay? I don't think it's there yet for either HDR10 or Dolby Vision; I think at max it can support 1080p only. So do these technologies have any impact on the quality of cable TV relay?
Yeah. I think a lot of this will be more focused on the media you play at home rather than the videos you gets over cable or satellite.
@@DragonRiderNetwork yes , I hope at least these premium TVs have some AI chips to enhance I mean upscale the cable or satellite tv relay
So with my new Sony 9000H TV that supports Dolby Vision, there's still no point, as it's only a 10-bit panel, so it can't take true advantage of the Dolby Vision content out there?
If a movie is in DV and I play it on a Samsung, will it play in SDR or HDR?
good vid. I finally understood some technical nuances that I did not understand before.
Thanks! Glad I was able to help.
Great explanation. By the way, what is the name of the song in the background?
I think the 2022 Samsung QN90B, with the latest HDR10+, is going to be my next big purchase.
Why not the S90B
Samsung haha. They don't support Dolby vision
HDR10 = Basic HDR. Dolby Vision = High-end HDR (Jawdropping on OLED when implemented correctly - HDR10 can be too tho, but DV seems to impress more because of dynamic metadata).
In the market for either an LG (Dolby Vision) or Panasonic (HDR) OLED. I’ve just purchased an Apple TV 4K. Want to stick with Panasonic as current plasma has better colours out of the box (THX). Am I missing out on Dolby Vision or is it still ok to go with HDR?
So..........
I'm unsure if to get an LG OLED or a Samsung QLED. One of the reasons of the OLED was that it supports Dolby Vision and HDR 10, the QLED only supports HDR 10.
Does this mean that if I got a QLED but watched my content through an Apple Tv 4k that supports Dolby Vision too. I would essentially bypass this problem?
Is it just a licence issue rather than the screens capability difference between QLED and OLED?
Playing 4K HDR movies on my PC, the colors were inaccurate until I found out I needed software like madVR (I guess to convert the color gamut from BT.2020). Does anyone know the proper software, or what settings one would need to adjust? I understand I need an HDR monitor, but if I play something in VLC nightly 4.0, the brightness is much better and the color looks amazing. Only VLC nightly 4.0 seems to give this result, not any other version or player that I have tried.
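If it helps anyone: the gamut half of that problem boils down to a 3x3 matrix. Linear-light BT.2020 RGB uses different primaries than BT.709, so a player has to convert (the coefficients below are the commonly cited ITU-R BT.2087 matrix; everything else here is just an illustrative sketch).

```python
# Linear-light BT.2020 -> BT.709 conversion matrix (ITU-R BT.2087).
# Out-of-gamut results still need clipping or compression afterwards.
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def to_bt709(rgb2020):
    return [sum(m * c for m, c in zip(row, rgb2020))
            for row in BT2020_TO_BT709]

# A fully saturated BT.2020 red falls outside BT.709 (note the
# negative green/blue components), which is why a player that skips
# gamut mapping shows wrong, over- or under-saturated colors.
print(to_bt709([1.0, 0.0, 0.0]))
```

Renderers like madVR do this conversion (plus tone mapping) properly, which is why they fix the colors where a naive player doesn't.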
As of 2022, most TVs still don't support the peak brightness required by these technologies (Dolby Vision and HDR10+), but over the years panels have improved massively and become more affordable than before. Also, HDR10+ is dynamic now, which gives a viewing experience close to Dolby Vision; normal users won't notice the difference.
This video is BS. Dolby Vision enabled TVs come with a chip that down-converts the luminance range to that of the TV, so the highlights do not get clipped and you still get an image as close as possible to the creator's intent.
Very simple and informative explanation. Thank you.
Please tell me why, on my Samsung NU8000, it looks horrible when HDR is enabled, but the colors look beautiful with it off.
I have a 4k tv with dolby vision and my Xbox series X has some games which support Dolby vision and they look incredible. I feel like I'm in a museum looking at a real canvas painting sometimes. I can tell a difference between the hdr10 games and the dolby vision games.
Excited to test out this tech on my new iphone 12 pro max 512gb!!
My man be flexing!
What do you mean when you say Dolby Vision technology doesn't really exist yet but will over the next few years? There are TVs with 12-bit HDR that support Dolby Vision, and also movies/series in Dolby Vision. You said it yourself that Netflix has titles shown in Dolby Vision, so watching a title in Dolby Vision on Netflix on a Dolby Vision TV, how is that not a Dolby Vision experience? So what do you mean?
I think he means the video quality is technically downgraded since it's not on a physical disc, and disc quality is superior to streaming.
But YouTube does not accept Dolby Vision. They convert your Dolby video to HDR10. Right?
This intro/outro gives me Mr Nobody nostalgia
I don't completely get the point of Dolby Vision on TVs that cannot produce 12-bit color. Could you elaborate? (I'm a noob at this) :)
The 12-bit is used by Dolby to support brightness up to 10,000 nits in the future. You would need 14 bits for that brightness range, but they have the PQ EOTF curve, which puts a lot more info into the dark areas than the bright ones, without artifacts. With 10 bits and a PQ curve, the difference in brightness across large areas is noticeable: from 0-10 nits, a PQ curve with 10-bit has a Delta E of 6 to 4, compared to 2 to 1 for 12-bit (1 being the threshold for the human eye). This is why you can even have banding in HDR video in nearly uniform areas like a blue sky.
@@florichi Totally clears it up
@@florichi Thank you man, this was really useful. So there still is a noticeable difference with 10-bit, and 12-bit will create near-perfect HDR.
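To put rough numbers on florichi's point, here's a small Python sketch of the ST 2084 (PQ) EOTF that Dolby Vision and HDR10 both use. The constants come from the published SMPTE ST 2084 spec; the 25%-signal comparison point is just an arbitrary illustration, not anything from the video. It shows that at the same relative signal level, adjacent 10-bit code values are several times further apart in nits than adjacent 12-bit ones, which is where banding comes from.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(n: float) -> float:
    """Map a normalized PQ signal n in [0, 1] to luminance in nits (0..10,000)."""
    p = n ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)

def step_size_nits(code: int, bits: int) -> float:
    """Luminance difference between two adjacent code values at a given bit depth."""
    levels = 2 ** bits - 1
    return pq_eotf((code + 1) / levels) - pq_eotf(code / levels)

# Compare the quantization step at the same relative signal level (25%)
for bits in (10, 12):
    code = int(0.25 * (2 ** bits - 1))
    print(f"{bits}-bit step near 25% signal: {step_size_nits(code, bits):.5f} nits")
```

The full 0..1 signal maps to 0..10,000 nits either way; 12-bit just slices that curve into four times as many steps, so each step is small enough to stay under the visibility threshold even in smooth gradients.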
Hi guys, I have a query: HDMI 2.0 is an 18 Gbps bandwidth system that supports 4K@60Hz at 8 bits per color, so how can it carry HDR10, which means 10 bits per color? Guide me.
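The usual answer is chroma subsampling: HDMI 2.0 carries 4K60 HDR10 as 10-bit YCbCr 4:2:0 (or 4:2:2) rather than full 10-bit RGB. A back-of-the-envelope Python sketch, using the standard CTA-861 4K@60 timing of 4400x2250 total pixels (including blanking) and TMDS 8b/10b coding overhead; the exact numbers are illustrative:

```python
def tmds_gbps(h_total: int, v_total: int, fps: int, bits_per_pixel: int) -> float:
    """Raw TMDS bit rate in Gbit/s: pixel clock x bits/pixel x 10/8 (8b/10b coding)."""
    pixel_clock = h_total * v_total * fps
    return pixel_clock * bits_per_pixel * 10 / 8 / 1e9

HDMI20_LIMIT = 18.0  # Gbit/s

# Bits per pixel for a few pixel formats
cases = {
    "8-bit RGB (4:4:4)":  3 * 8,   # 24 bpp: the classic 4K60 case
    "10-bit RGB (4:4:4)": 3 * 10,  # 30 bpp: does NOT fit in HDMI 2.0
    "10-bit YCbCr 4:2:0": 15,      # 10 luma + 10/2 chroma bits per pixel on average
}
for name, bpp in cases.items():
    rate = tmds_gbps(4400, 2250, 60, bpp)
    print(f"{name}: {rate:.2f} Gbit/s, fits in HDMI 2.0: {rate <= HDMI20_LIMIT}")
```

So 4K60 at 10-bit full RGB needs about 22 Gbit/s and doesn't fit, but subsampled 10-bit does, which is why consoles and players fall back to 4:2:0/4:2:2 for 4K60 HDR over HDMI 2.0.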
Dear DragonRiderNetwork!
Would there be any difference in picture quality if the Samsung Q90R were Dolby Vision (DV) capable and the film was produced in DV?
Going by your advice, HDR10 and HLG (the two static-metadata formats) are more than enough, because the
color space of films today is almost always DCI-P3. There is no need for DV, and most displays cannot exceed 1000 nits.
And with dynamic tone mapping at brightness under 1000 nits, you don't need dynamic metadata (DV or HDR10+)..??
But what happens if I play DV content on the Samsung Q90R (peak brightness 2000 nits), which has no DV???
And what happens with an LG C9 if you try to play HDR10+ content??? The LG C9 is only DV capable, not HDR10+???
I did not get a correct answer! Can you answer these questions, please?
New iPhone support Dolby Vision right?
Correct.
What’s the song in the beginning? And when will DV come to Vizio on Xbox?
Here because I have an LG HDR10 monitor. Everyone thinks it sucks, but Resident Evil 2 looked good on it in HDR10.
The biggest problem I have with this is that, as a photographer, I know that anything over 8 bpc will have no noticeable increase in color quality. 8 bpc is all you need to see everything beautifully. So what's the point of 10 and 12 bpc? The point is flexibility when developing RAW photos. When you make certain changes to shadows and highlights, darks and lights, contrast, brightness, etc. while developing a RAW photo, the more color information you have the better, because you can make more extreme changes and not lose any information. However, as far as displaying the photo once you're finished, 8 bpc is enough. And that picture you showed of the sun with the color banding? That was NOT 8 bpc; that was a lot less. A picture like that would be smooth and look perfect displayed at 8 bpc. So this explanation is not really right. Or at the very least, it's misleading and confusing 😔
After 8bpc there's no improvement... until you switch to HLG photo, which basically uses them fancy-ass new HDR video features like luminance on the photo.
8bpc is fine for sRGB and Rec. 709 content. Any larger color space like Rec. 2020 requires 10-bit values to accommodate the higher number of colors.
Contrast is dramatically improved, both local and global. Local contrast improvement leads to higher perceived detail. More detail can also be displayed in the highlights of the image.
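A tiny sketch of the banding point being argued above: quantize a subtle gradient (like a clear sky) to different bit depths and count how many distinct code values it actually lands on. The 5%-of-signal-range gradient here is an arbitrary choice for illustration; the fewer distinct levels, the more visible each step (band) becomes.

```python
def quantize(value: float, bits: int) -> float:
    """Round a 0..1 signal to the nearest code value at the given bit depth."""
    levels = 2 ** bits - 1
    return round(value * levels) / levels

def distinct_steps(lo: float, hi: float, bits: int, samples: int = 10_000) -> int:
    """Count the distinct code values a smooth ramp from lo to hi lands on."""
    seen = {quantize(lo + (hi - lo) * i / samples, bits) for i in range(samples + 1)}
    return len(seen)

# A subtle sky-like gradient spanning 5% of the signal range
for bits in (6, 8, 10):
    print(f"{bits}-bit: {distinct_steps(0.50, 0.55, bits)} distinct levels")
```

This is also why the photographer's point and the video point can both be true: 8 bpc is usually enough for a finished sRGB photo, but a wider gamut or a PQ brightness curve stretches the same code values over a bigger range, so the extra bits of 10/12-bit HDR are what keep gradients smooth.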
Do you know if 4K Blu-rays have 12-bit color encoded on the disc for Dolby Vision, or is Dolby Vision currently encoded at 10-bit on the discs?
I do not know of any Dolby Vision Blu-rays; I thought all Dolby HDR was digital only.
haloharry97 There are a few Dolby Vision Blu-rays out on the market (which you can search up).
4K Blu-rays are either HDR10, Dolby Vision, or both. Each getting a separate layer on the disc.
Thank you for simplifying this, whew!
My LG C8 OLED 4K TV supports both HDR10 and Dolby Vision, and also Dolby Atmos.
ur video not in 4k or hdr?
I have an LG OLED65G6P TV and it supports both HDR and Dolby Vision, so no problem here.
“We just got through the whole thing between BluRay and HD DVD” - yeah like 10 years ago? 😂 also you forgot HLG so there’s actually 3 types of HDR
Considering it's 2020 now, I'm looking at this video to understand if it's still valid for Samsung's tech.
What if I use Dolby Vision on a 10-bit screen? Is it still better than HDR10?
Yes. DV supports both 10 and 12 bit
HDR10+ is dynamic though, if I remember correctly. But still... there is no UHD Blu-ray which supports HDR10+. They all support Dolby Vision. Also, Netflix and Amazon Prime support Dolby Vision. It really is an industry standard.
It's still a 10-bit color depth. While it's better than the static HDR10 that all content has, it's still inferior to Dolby Vision. I still think Dolby Vision will win this HDR "war".
Isn't the intro music Where Is My Mind, by any chance? BTW thanks, it's very instructive ;) Greetings from France!
Is Dolby Vision hardware or software?
Now a phone is capable of recording Dolby Vision, after watching the iPhone 12 cinematography.
Amazing how fast technology changes.
What's that song/melody from the intro?
HDR and Dolby Vision don't matter to me. I prefer a perfectly calibrated OLED or IPS panel with 10-bit color.
So, LG TV with or without Dolby?
With, it is amazing
But Dolby Vision can't do 12-bit, since there is no 12-bit content or TVs yet. The advantage of Dolby Vision is the scene-by-scene dynamic HDR.
And now here we are today, Dolby Vision TVs everywhere, and it is damn better than HDR10.
Also, Dolby Vision will convert itself to HDR10 if that's all you have. In a sense, anyway.
Great explanation!, Thanks for the video.
Glad it was helpful!
Thank you soo much for such a beautiful explanation
My pleasure. Glad you found it helpful!
The main benefit of HDR is the brightness level; studies show viewers are far more sensitive to increased contrast than to color. This is because humans have poor color memory.
Thanks for the video
HDR10+ is on par with Dolby Vision, HDR10 isn't.