The C9 does accept a 12-bit video signal, but it is only a 10-bit panel: it will downsample the signal to 10-bit to display it. If you are also using PC RGB with 4:4:4 chroma output and 120Hz, your Xbox or PS5 is reducing the chroma output to 4:2:2 or 4:2:0 (lowering picture quality), because they cap total output at 40Gbps (Xbox) or 32Gbps (PS5), not the C9's full 48Gbps maximum input. Assuming you are using the new consoles and not a high-end PC rig, you may want to lower the bit depth on the console to 10-bit.
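To put numbers on why the consoles drop chroma when you force 12-bit, here is a rough sketch of the arithmetic (my own estimate, using the standard CTA full-blanking 4K raster of 4400 x 2250 and HDMI 2.1's 16b/18b FRL coding overhead; the 40 Gbps figure is the Xbox cap mentioned above):

```python
# Uncompressed 4K120 data rates vs. what a 40 Gbps HDMI 2.1 link can
# actually carry after FRL 16b/18b coding overhead. Raster totals and
# coding overheads are assumptions based on CTA-861 / HDMI 2.1 norms.
CHROMA = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # effective components/pixel

def raw_gbps(bpc, chroma, h=4400, v=2250, hz=120):
    # Uncompressed video data rate, full blanking included
    return h * v * hz * bpc * CHROMA[chroma] / 1e9

payload_cap = 40 * 16 / 18  # ~35.6 Gbps of real data on a 40 Gbps FRL link

for bpc, chroma in [(10, "4:4:4"), (12, "4:4:4"), (12, "4:2:2"), (12, "4:2:0")]:
    print(f"{bpc}-bit {chroma}: {raw_gbps(bpc, chroma):4.1f} Gbps "
          f"(payload cap ~{payload_cap:.1f})")
```

12-bit 4:4:4 works out to about 42.8 Gbps, clearly more than a 40 Gbps link can carry, so the console falls back to 4:2:2 or 4:2:0; 10-bit 4:4:4 sits right at the limit (it fits in practice with reduced blanking), which is why 10-bit is the practical ceiling.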
@@MrKenn1230 Maybe. It's kinda true, tbh, but since the C9 has 48Gbps it runs smoother and doesn't max out or overload the TV's system. If I change it to 10-bit, the image isn't pitch black but kinda grey. I have tested all the possible Series X/C9 configurations, and 12-bit certainly gives the best blacks you can get.
@@MrKenn1230 no, it is not 12 or 10 bit. It is WOLED, so it can as well be 10 * 4 = 40 bit. Also for HDR 12 bit is just better black and white gradation.
@@lolerie In my example I was speaking about the panel's native color depth. Most TVs today that support HDR are 10-bit panels, meaning they can reproduce 1024 shades of each of the primary colors (red, green, blue). There are other factors in HDMI video and audio bandwidth like you mentioned, but the ultimate limiting factor in the number of colors the TV can display is the native bit count of the panel.
The Samsung QN90B has 14-bit picture processing, but it's still a 10-bit panel. Even if the source content isn't 8-bit, is there any gain to selecting 10-bit?
I saw Vincent say that allowing YCC 4:2:2 on HDMI 2.0 TVs actually outputs 12-bit 4:2:2 even though the signal info displays 8-bit 4:2:2. My question is: does this same process work on non-Dolby-Vision TVs, specifically HDR through 4K60?
I have a Samsung 50" 7 Series RU7100 and I'm wondering what color depth I should set my Series X to. I noticed that if I set it to 8-bit, the game tile for Assassin's Creed Valhalla looks pixelated on the Xbox home screen. I've heard that it changes to 10-bit automatically when viewing content; I've heard setting it to 10-bit uses unnecessary HDMI bandwidth because it's converting to 10-bit twice; and I've heard 12-bit can cause crushed blacks if the panel doesn't support it. So I'm completely lost on what to set it to.
I'm having issues with my Xbox Series X displaying Dolby Vision gaming on my TCL 5 Series TV. There is a grey haze showing on my TV when HDR is enabled, but only when the Dolby Vision box is checked.
I have 2 questions I hope someone can help with. The first is more of a confirmation: I bought the new LG NanoCell 85 Series 65". For my Xbox Series X, do I want to use 8-bit, 10-bit, or 12-bit? 10-bit would be best, correct? My second question: my new TV has "AMD FreeSync Premium". Should I turn this option on? If so, what's the best setting, high or wide? Thanks everyone. Any advice is greatly appreciated.
Hello. You can set your bit depth to 8 or 10; either will work, I'd just look to see which one looks better. I use 10-bit since the Xbox has to go to 10-bit for HDR anyway. Make sure you uncheck YCC 4:2:2 on your Series X. Also, you may want to skip AMD FreeSync and just use VRR: FreeSync on the LG can disable Dolby Vision in some instances without added benefit. Reviewers seem to be saying that VRR and FreeSync have similar performance (but VRR doesn't have the Dolby Vision loss).
@@MrKenn1230 Thanks man. This is the first bit of advice I've read with some explanations to help. I'm going to jump on and check my settings again. Thanks again!
It shouldn't be. Until 12-bit panels come out, we all have 10-bit panels (on recent TVs). 12-bit uses bandwidth that most people would not see reflected in their picture output (in general).
Great video! Thanks so much for your help! I'm trying to get my LG C1 to play at 120 fps with Ori and the Will of the Wisps. I'm changing it from 60Hz to 120 in my settings, but when I look at my Game Optimizer it says 59 frames... Nick
Check the settings in the game to see whether it is in 120 fps mode or not. I just checked, and it is a game which can be played at 60 or 120. Make sure it is set to 120 fps (Performance in the game settings).
I have my 2022 55" LG CX on 12-bit running at 118.7Hz, 3840 x 2160p@120. It says YCBCR420 12B 4L6; not sure what that means, but I actually like running RGB instead of the regular color space.
I've been all over the Internet trying to figure out exactly what TV you need for these settings, and I've come to the conclusion that any 4K TV can take them: if it wasn't 4K it wouldn't be able to run those settings, and the Xbox doesn't tell me it can't use them when I choose them. So any 4K TV will work. Crank that shit up to 36 bit and don't let anybody tell you differently.
I have an 8-bit + FRC "10-bit" monitor and I usually go with 10-bit 4:2:2. I get less banding when gaming. I honestly can't tell a difference in anything else except the color banding; text and menus look the same.
I have an 8-bit + FRC panel myself, the Samsung Q70B QLED TV, and I unchecked the 4:2:2 box in settings and have the console on 8-bit. You say that enabling 4:2:2 and going up to 10-bit on my Series X would be better? Isn't all SDR content, like SDR games and YouTube videos, 8-bit anyway, and wouldn't my Series X switch up and down correctly without being forced to 10-bit always? Wouldn't 4:2:2 limit my bits to 10, so I could never achieve 12-bit while it's enabled? I mean, 4:2:0 at 12-bit would always be better than 4:2:2 at 10-bit if there were content available for that bit depth, don't you think?
He did say 32. Maybe his somewhat strong accent made it difficult to hear. You Americans need subtitles when a Brit or an Aussie speaks, which is just ridiculous.
It's 2022 and I was suddenly bugged by this, in terms of which is the better option. I blame my PC (long story). So basically, what I gathered is: PC RGB is best for monitors, Standard is for TVs, and bit-depth-wise just leave it at 8-bit like you said, because the console will work out when it needs to use anything higher. So far, would you say I am on the money? I hope so! Lol
Bro, your English is PERFECT; the kids complaining about your speech have hearing problems.
Thank you brother 🙏
Fax
The people that complain, would happily listen to mumble rap on a daily basis lol
He speaks better than I! AND I can ONLY speak English!
You sound like a beautiful young man
Thanks for the information. Very interesting. You got yourself a new subscriber. I’m a nerd and love this lol. I got to appreciate my new LG CX 65” and my series X even more 😍
This is an interesting topic right now, because all the available (and many of the upcoming) AV Receivers that support HDMI 2.1 have that Panasonic Chipset mismatch issue right now.
It happens on Nvidia cards and Xbox, but not PS5. Now, the question is, whether Microsoft could simply offer a "workaround option" for now that allows us to limit the Series S and X Output to 32Gbit/s like the PS5 currently is, thus enabling AVR users to enjoy 4K120HDR (albeit in 4:2:2)...🤔
🙄🙄🙄 Ridiculous, am I right.
I cannot believe the amount of shit we have to do just to get plasma screen black levels. Absolutely shocking.
Amazing content, nice and unbiased. Thanks, get a like out of me.
You are right. I did the test: I ran an 8-bit game with the console set to 10-bit, and I noticed the colors were not natural, they were completely oversaturated. I went back to 8-bit and it was much better. And for the games that are in 10-bit you are also right, because the console automatically switches to 10-bit. So it is better to leave it at 8-bit; that way we have the best of both worlds.
Thank you
@@MyGadgetsWorld what if your monitor is 10 bit?
8-bit 4:4:4 on every 60Hz panel is the way to go. It's 17.82Gbps, under the 18Gbps HDMI 2.0 threshold.
Nice vid but flawed opinion. The Xbox and TV do have auto settings, but unfortunately they go with the lowest common denominator to ensure a smoother experience, when there is still more to the image that is not being represented, namely in the bit depth. Really: turn on 12-bit, disallow 4:2:2, set your TV to native color space instead of auto (similar effect to the 4:2:2 Xbox option), and you will see a clear difference between 12-bit and 10-bit. Film it, show people; it's as clear as night and day. The only problem I have had with specifying higher settings on my Xbox is with Hulu: it gets a greenish tint, but that is removed by checking 4:2:2 color again. I'm doing this on a 55" H8F too, so your CX should be even better.
Lol
@@dillonweaver2307 Lol at what exactly? I did this and it works on my LG C3. Every other answer left my picture looking like a yellow piss hell with non-existent blacks.
@@Gorgutsforcongress You don't know how to set up the TV and the console together, is all. 12-bit panels don't exist in TVs yet that I know of, so there's no reason to add input lag when the TV has to convert it back down to 10-bit again. It's still 10-bit if you're lucky; a lot of TVs are 8-bit but can accept 10-bit input, and then you might as well be on 8-bit. Up to you, but those consoles are not very powerful, nor are the TVs, so make them do the least work possible. And 4:2:2 is only a setting you should need on a really crappy 4K TV.
@@Gorgutsforcongress What's your gamma at? What are your black levels at? What color settings are you using? What about white levels? If you don't like a warm picture, you don't like accurate colors, but that's another thing; that's just preference. I try to set all mine up as color-accurate as possible, with no additional features turned on, and get the lowest latency I can. My LG CX I can run many ways, but it's mostly about whether it's an 8-bit or 10-bit game and whether it uses HGIG or not.
I calibrate just about everyone's display for them; they always call me when they get a new TV to adjust it for home use and give them options for light- and dark-room use. I do it by eye, but I've done it so long I usually don't need instruments to get a pretty accurate display and fix the other issues they come with, because they're usually set up for being under a bunch of store lighting, usually cooler-colored lighting. So the stupid modes they ship with look like garbage, unless you've only ever seen them that way. I get a lot of people who say their new TV makes them nauseous. Would you happen to know why that is?
The CX is limited to 40Gbps. Putting it in 12-bit will lower the chroma subsampling to 4:2:0; 10-bit will give you RGB 4:4:4, which is what you want. Now put your OLED into PC mode to take advantage of it.
Why do I need to put the TV into PC mode? I know Vincent said it too, but in Gears 5 it also shows RGB... thanks
@@tomekkrr7806 Lower input lag.
@@ReLoadXxXxX no. Input lag is 13ms in game mode and is not lower in pc mode.
@@tomekkrr7806 The TV and your Xbox "have to speak the same language": when you put the TV (LG CX, C9, etc.) in PC mode, it will recognize and process the 4:4:4 signal without conversion; if you don't, it will process all signals from the Xbox as 4:2:2.
@@MrKenn1230 So only in PC mode is it 4:4:4? I tested some pictures, and in console mode the edges are smoother, while in PC mode the edges are less smooth; more steps are visible. Very strange.
I have an Element smart TV and it lets me use 120Hz with 12-bit, but I'm not actually sure if it's doing real 12-bit, since I don't think it has a picture info menu like that one. It may just be outputting 10-bit or even 8-bit while the console says 12-bit.
I set mine to 10-bit to upscale my games, so my TV doesn't have to keep switching between 8 and 10 all the time.
Very helpful, THANK YOU
Welcome Jeremy!
I have a 1080p Asus monitor that supports up to 165Hz. What color depth should I keep it at? I also have an Xbox Series X.
I always hear that people think 8-bit is best, but I don't know for sure; it's just what I hear.
Put your TV at 30 bits per pixel (10-bit) and get your money's worth.
What do you prefer for FPS games (competitive games)? Please explain.
1080p 120hz
I have an LG CX too. I've turned HDR10 off on my Xbox Series X; should I leave it on 8-bit? What should my display settings look like?
Leave it at 8-bit and let the Xbox and TV do the math for you.
Why did you turn off HDR10?
Is there a good reason for turning hdr10 off ?
@@euanm10 With it enabled it looks kinda washed out; I've left it off, adjusted my picture settings, and it looks far more vibrant.
I don't remember if the older LG CX could give you the full 48Gbps. I know the C3 can; for a couple of years they went back to 40Gbps. The Xbox can't output 48Gbps, only 40Gbps, and the PS5 can only output 32Gbps. That's why true 4K@120Hz is impossible on the PS5 and nearly impossible on the XBSX, because it doesn't have enough power. There is one game released two years ago, a 2D scrolling game, that can maintain actual 4K@120Hz on the XBSX. Most games are between 1080p@120Hz and 1800p@120Hz. A 60Hz panel can only really supply 4K@60Hz with 8-bit. An HDMI 2.0 port/cable can only handle up to 18Gbps (not 40Gbps or 48Gbps). The question is 4:2:2 at 12-bit (17.82Gbps) or 4:4:4 at 8-bit (17.82Gbps)? That's the highest for an HDMI 2.0 cable or a 60Hz refresh-rate panel, not for a 120Hz panel or an HDMI 2.1 port/cable. 12-bit at 60Hz is only 24.06Gbps; any HDMI 2.1 TV can do this.
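The 17.82 Gbps figure above checks out; here is a quick sketch of the HDMI 2.0 math (my own arithmetic, assuming the CTA full-blanking 4K60 raster of 4400 x 2250 and TMDS 8b/10b coding, which adds 25% on the wire):

```python
def hdmi20_gbps(bits_per_component, components=3.0, h_total=4400, v_total=2250, hz=60):
    # Uncompressed data rate, times 10/8 for HDMI 2.0's TMDS 8b/10b coding.
    raw = h_total * v_total * hz * bits_per_component * components
    return raw * 10 / 8 / 1e9

# 8-bit RGB/4:4:4 (3 components) and 12-bit 4:2:2 (2 effective components)
# land on exactly the same number, just under the 18 Gbps HDMI 2.0 ceiling:
print(round(hdmi20_gbps(8), 2))                   # 17.82
print(round(hdmi20_gbps(12, components=2.0), 2))  # 17.82
```

That equal trade is why the 4:2:2-at-12-bit versus 4:4:4-at-8-bit question exists at all on HDMI 2.0: both formats saturate the link.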
10-bit is the best option: best colors and deep blacks.
This was an awesome explanation! Thank you!
Thank you. Please visit my new website. Visit my Official Website here:
mgwreviews.com
Mine works on 12-bit. Will that be good for Warzone?
No lol
Thank you so much for this, I was looking at this setting on my Series X and you explained everything perfectly. Answered all of my questions before I had a chance to ask. Subscribed. Thank you!
Dark dark mode _
On the Series X I'm lost. For the best image in SDR and HDR, do you advise setting 8-bit or 10-bit? Because if the console automatically switches to 10-bit, what is the point of setting it to 10-bit? I don't understand. Thank you for enlightening me: should I set 8-bit or 10-bit?
We would really appreciate the same video for PS5 as well, with the LG CX and a Samsung TV...
People are really confused out there..
Hey Sumit, I have a couple of videos lined up; I will make sure to include this.
For Xbox it is easy: just keep it at 8-bit and let it do the math for us. I will see what I can do for PlayStation, as the PS5 settings give you very limited control.
PS5 is capped at 32Gbps apparently, hence 4:2:2.
Vizio M55Q8-H1 2020. Which color bit depth, 8 or 10? And enable or disable YCC 4:2:2? I play 4K HDR 60Hz and 1080p non-HDR 60Hz games.
How are you getting RGB? I keep getting YCbCr 4:2:0.
I am using the TV in PC mode.
How do you get that pc mode?
@@widaflow change the hdmi input from game to pc
@@widaflow change HDMI input icon (icon, not text) to PC.
This really helped. I tried changing the override to HDMI but switched back to auto-detect because the resolution dropped. For whatever reason, I never thought on my own to just increase the resolution back up to 1080p. Then I made the mistake of changing the color bit depth, only to learn that it isn't necessary. So now, years later, I'm finally playing at 60Hz, which wasn't an option before.
I've got 36-bit (12-bit). Is that the best?
No
Hey, is there a way to fix the frames per second counter on the free sync menu when in 120 hz mode? Somehow it always displays 5.5 when the system is in 120hz mode
I've got a Samsung Q60R. Should I set it to 10-bit or 8-bit?
Surprised you can see anything in game. That Gears screen looks to have extremely crushed blacks.
Great video brother. So I have an LG G1; I should definitely change my output to 10-bit, correct? Will everything look better?
I have a Samsung Q80A 4k 120hz tv, seems to work at 12 bit
Don't tick 4:2:2, and leave it at 8-bit. When HDR is played it will automatically change to 10-bit; when SDR is played it will stay locked to 8-bit. Just leave it alone or you will wash out the colours in SDR games.
@wesley 504 SDR=Standard Dynamic Range.
It does not on Windows, in HD color.
I get less color banding in games when I use 10-bit 4:2:2 (HDMI 2.0 monitor with 8-bit + FRC).
@@Mr.Pepperss Not sure about that.
It's not there for me, only 24-bit. But then I change to HDMI overrides and it's there. I need a fix; anyone know?
Thanks for the explanation and advice.... much appreciated mate ✊👊
Welcome. Have a great weekend and happy new year.!
@@MyGadgetsWorld heyyy, you too my brother ✌
Wow smart.. thanks man! How about the color space? Standard or PC RGB?
To use a 4K TV at 60Hz with HDMI 2.0, I have to choose 8-bit, right? And with HDR on, the color depth will change automatically to 10-bit, correct? Do I have to check YCC 4:2:2?
8-bit changes to 10-bit automatically in HDR games.
@@itzdougie9745 No, it does not on PC with Nvidia, in the Windows 10 HD Color menu: it stays 8-bit, but it will be HDR.
@@lolerie I meant 8 bit changes to 10 bit on non hdr games , sorry
@@itzdougie9745 yes, that is true. Games that support HDR will switch you to 10 bit HDR from 8/10/12 bit SDR.
If you set it to 10-bit and then play an Xbox game that doesn't have HDR (which would be 8-bit), does it automatically switch to 8-bit colour?
No, because you are forcing it to stay at 10-bit.
Keep it at 8bit
@@MyGadgetsWorld Xbox should just put in an auto option in that menu so we can just set it and forget it.
@@MyGadgetsWorld Why do you advise keeping it at 8-bit? Normally 10-bit or 12-bit is better?
@@Mck179 Because the Xbox can switch it automatically for you. No need to push it to 10 or 12; SDR content uses 8-bit.
Great video. On the Sony A90K, would you use the Reality Creation setting or just leave it on auto, and if not, how come? And what about the advanced contrast enhancer; would you use this?
I have an Xbox Series X and I have bought some 4K films from the Movies page, but when I come to play them I get a message: "your TV does not support the UltraHD format". I have changed every setting (8-bit, 10-bit, 12-bit), gone into HDMI mode, you name it I have done it, and I have watched many YouTube videos, but I still get the message. My PS5 plays 4K discs perfectly, so it's not the TV (an LG bought last year). Please can you give advice? Thank you!
Why does this dude just sit on the setting and talk while we're waiting for him to change it? From @6:23 to @8:08 he talks the whole time while sitting on the drop-down selection...
12-bit output is pointless, considering there are no 12-bit panels in existence except maybe prototypes. Even professional monitors, like Sony's $30,000 32-inch grading monitor, do not have 12-bit panels, so it's pointless; likewise, 40Gbps HDMI 2.1 is more than enough for any 10-bit panel. Again, I repeat: there is no 12-bit panel in any consumer or professional display at the moment. In the future, when 12-bit is available, the only source will be Dolby Vision, which is mastered in a Rec.2020 container with 12-bit color and can currently only display a little over a billion colors, in 10-bit on a 10-bit panel. When 12-bit panels are available we will need the 48Gbps HDMI 2.1 bandwidth, but right now it's pointless.
10-bit can display a little over a billion colors on a 10-bit panel. TVs with 12-bit panels will be able to display over 68 billion colors in the future, and will need a lot more nits to be able to show that many colors. We're probably looking at around 2024-2026 for 12-bit panels to be mainstream; TBH, if we see some at CES this year, then it could be a bit sooner than 2026.
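The shades-per-channel and total-color figures above are just powers of two; here is a quick check of the arithmetic (note that 12-bit works out to roughly 68.7 billion):

```python
# 2^n shades per channel, cubed across the R, G and B channels.
for bits in (8, 10, 12):
    shades = 2 ** bits
    print(f"{bits}-bit: {shades} shades/channel, {shades ** 3:,} colors")
# 8-bit:  256 shades -> 16,777,216 colors (~16.7 million)
# 10-bit: 1024 shades -> 1,073,741,824 colors (~1.07 billion)
# 12-bit: 4096 shades -> 68,719,476,736 colors (~68.7 billion)
```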
Great info, Marc. Thanks. I'll only add that top-of-the-line PC graphics cards (like the RTX 3090) can produce a 12-bit, 4:4:4, 4K, HDR, 120Hz picture, so when 12-bit panels do go mainstream, Dolby Vision will be in good company!
@@MrKenn1230 I believe it's only disc-based media that carries Dolby Vision 12-bit; whether streaming from Netflix or others has the 12-bit metadata, I just don't know 🤷🏼♂️ Cheers and stay safe 🇨🇦🍁🇨🇦
The Dolby Pulsar is 10-bit + FRC, which is good enough. Also, HDR uses some bits for luminance, which means that 12-bit will be better.
@@marcfavell Netflix and Dolby Vision profile 5 use IPTPQc2, which can encode about 11.5 bits of precision in 10 bits. See the Wikipedia article on ICtCp.
@@lolerie That's just to do with color volume; the panel itself cannot change. There are only 10-bit panels ATM in 2021.
Informative video. I appreciate that. 👍
How do you show that FreeSync info? I pressed the green button (the two-dot button) on my LG remote, but all it does is control my Xbox home screen like a controller. Mind you, I have AMD FreeSync off because it disables Dolby Vision.
You need to press the small green button at the bottom quickly several times until you see it. It's a geek thing :)
So regardless of what the TV is capable of, the bandwidth from the console is limited, you're saying? So if I wanted 12-bit 4:4:4 120Hz 4K, it's just not possible, because the Xbox only outputs 40Gbps?
Awesome work my friend!! Thank you!!
Can the Xbox series x on the LG C9 do 12-bit @120hz ?
No, it can't support 12-bit regardless of panel.
The reason is the bandwidth of the Xbox. It tops out at 40 Gbps, which is enough for 10-bit 4:4:4 at most.
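As a rough sanity check on those caps, here's a back-of-the-envelope estimate of the uncompressed data rate for 4K120 at each bit depth. The ~10% blanking figure and the HDMI 2.1 FRL 16b/18b coding overhead are assumptions for illustration; real HDMI timings vary:

```python
# Ballpark HDMI data-rate estimate for a given video mode.
# chroma_factor: 3.0 for RGB/4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0.
def required_gbps(width, height, hz, bits_per_channel, chroma_factor=3.0):
    active = width * height * hz                      # active pixels per second
    raw = active * bits_per_channel * chroma_factor   # raw bits per second
    with_blanking = raw * 1.10                        # assumed ~10% blanking overhead
    with_frl = with_blanking * 18 / 16                # assumed FRL 16b/18b coding overhead
    return with_frl / 1e9

for bits in (8, 10, 12):
    print(f"4K120 {bits}-bit 4:4:4 ~ {required_gbps(3840, 2160, 120, bits):.1f} Gbps")
# 8-bit  ~ 29.6 Gbps  (under the PS5's 32 Gbps cap)
# 10-bit ~ 37.0 Gbps  (under the Xbox's 40 Gbps cap)
# 12-bit ~ 44.3 Gbps  (over 40; would need the full 48 Gbps)
```

Under these assumptions, 4K120 12-bit 4:4:4 lands above the Xbox's 40 Gbps cap, which lines up with why the console drops chroma when you force 12-bit.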
Two years on and I'm just about to put the hammer down on a Series X, and I found your vid and thought, dude: I'm currently on an Xbox One and a BenQ projector from 2015. It's putting out a crisp 1080p picture and I really love this platform a lot. Where is the Series X going to disappoint? I subbed; I want to know what's going on, so I'll watch. I really need to know: are my settings wack or what? Have a good one. Later days. -951-
What about 1440p@120 12 bit gaming?
Yes, it does work.
How do you do this?
Thank You! The game looks like shit when I select the 10bit and it got me really confused. Thank you once again.
I have the LG OLED C7 TV, and when I play my Xbox in 10-bit there's a lot of cropping on video. Why? By cropping I mean the image takes a long time to resolve.
Quick question. Why is it that when I have my Series X on my LG CX set to 60 Hz it says YCbCr, and when I have it set to 120 Hz it says RGB 8 or RGB 10, but when I have my PS5 on, the LG CX always says RGB 12? Does this mean the PS5 is using better colors, thus giving a superior image? 🤔 Does this mean the Series X cannot do RGB 12?
On PC, HDR is horrid, so I've been staying with SDR. Just wondering if there is any benefit to 10-bit SDR, or if everything should stay at 8-bit. Also, should I be going for 4:4:4 or should I stick with RGB? Thanks!
There is no such thing as 10-bit on SDR 🤣
I'm here to help the community... Set the source/input (HDMI) to PC, or rename it to PC; this deactivates all the TV's processing. Black level high/normal, sharpness is up to you. On the Xbox: TV connection HDMI (resolution depends; Series X 2160p and S 1440p, choose the highest), 8-bit, PC RGB... bye-bye jaggies. But HDR is out too; now your TV is acting like a monitor (PC gamers will understand). Is HDR really worth it, though? In my case, NO! I prefer a natural image.
My C9 supports 12-bit; I'm also using it with PC RGB.
I'll just add that the C9 is a 12-bit, 48 Gbps panel too lol.
The C9 does accept a 12-bit video signal, but it is only a 10-bit panel; it will downsample the signal to 10-bit to display it. If you are also using PC RGB with 4:4:4 chroma output and 120 Hz, your Xbox or PS5 is reducing the chroma output to 4:2:2 or 4:2:0 (lowering picture quality) because they cap total output at 40 Gbps (Xbox) or 32 Gbps (PS5), not the C9's maximum input of 48 Gbps. Assuming you are using the new consoles and not a high-end PC rig, you may want to lower the bit depth on the console to 10 bits.
@@MrKenn1230 Maybe, it's kinda true TBH, but since the C9 has 48 Gbps it runs smoother and doesn't run at full load or overload the TV's system. If I change it to 10-bit, the image is not pitch black but kinda grey. I have tested all the possibilities between the Series X and C9 configuration, and 12-bit certainly gives the best blacks you can get.
@@MrKenn1230 No, it is not 12 or 10 bit. It is WOLED, so it could just as well be 10 × 4 = 40 bit. Also, for HDR, 12-bit just gives better black and white gradation.
@@lolerie In my example I was speaking about the panel's native color depth. Most TVs today that support HDR are 10-bit panels, meaning they can reproduce 1024 shades of each of the primary colors (red, green, blue). There are other factors in HDMI video and audio bandwidth like you mentioned, but the ultimate limiting factor in the number of colors the TV can display is the native bit count of the panel.
Looks like ur menu is high contrast
Yes. I like it like that. :)
I accidentally changed my display overrides and my monitor is all black. I can't see or do anything, and I've tried restarting.
I wish someone would help me out with ASUS ROG Swift OLED monitor settings for the Xbox Series X and PS5.
I just got the Series X, and after two days it now flickers and shuts down. I went to play a game; it got fuzzy, froze, and shut down, and now it flickers.
Skip to 8:40
Samsung QN90B has 14 bit picture processing, but it's still a 10 bit panel. Even if the source content isn't 8 bit, is there any gain to selecting 10 bit?
My LG TV will do RGB 12b TM 1920x1080 at 119.8 fps. Is that good?
Great video bro. Question: can the LG OLED C1 support 12-bit from the Xbox Series X?
I'd prefer to keep it at 8 or 10; no TV supports 12-bit color at the moment.
@@MyGadgetsWorld thanks bro! I appreciate it 👍🏽
The C1 supports 12-bit on my Series S, so I would think the X is no different.
I saw Vincent say that allowing YCC 4:2:2 on HDMI 2.0 TVs actually outputs 12-bit 4:2:2 even though the signal info displays 8-bit 4:2:2. My question is: does this same process work on non-Dolby-Vision TVs, specifically HDR through 4K60?
Do you know why RE Village shows as 8-bit on the Series X/CX?
Hello, on my 4K OLED with my Xbox Series X, do you advise setting the color space to 8-bit, 10-bit, or 12-bit?
10
It depends on your monitor: if it's HDR, then 10-bit is all you need and all you can utilize on today's monitors.
I have a Samsung 50" 7 Series RU7100 and am wondering what color depth setting I should set my Series X to.
I noticed that if I set it to 8-bit, the game tile for Assassin's Creed Valhalla looks pixelated on the Xbox home screen. I've heard it changes to 10-bit automatically when viewing content; I've heard setting it to 10-bit wastes HDMI bandwidth because it's converting to 10-bit twice; and I've heard 12-bit can cause crushed blacks if the panel doesn't support it. So I'm completely lost on what to set it to.
Same dilemma here, same TV and Xbox Series X. Have you solved the problem yet?
My screen doesn't turn dark when I switch to 12-bit. I have an LG C2.
When will we see 8K at 240 Hz, 12-bit, 4:4:4 over DisplayPort? 😂
What would you recommend for Sony 900E owners? I'm running 1080p 120 Hz, and in this mode HDR isn't supported. Please help! Thanks
I wish I had a Sony 900E or 900F, sorry. I hope someone else in the comments can help.
...It’s a 10 bit panel...
Indeed.
I'm having issues with my Xbox Series X displaying Dolby Vision gaming on my TCL 5-Series TV.
There is a grey haze showing on my TV when HDR is enabled.
Only when the Dolby Vision box is checked.
I have two questions I hope someone can help with.
The first question is more a confirmation.
I bought the new LG NanoCell 85 Series 65-inch. For my Xbox Series X, do I want to use 8-bit, 10-bit, or 12-bit? 10-bit would be best, correct?
My second question: my new TV has "AMD FreeSync Premium". Should I turn this option on? If so, what's the best setting, high or wide?
Thanks everyone. Any advice is greatly appreciated.
Hello. You can set your color depth to 8 or 10; either will work, I'd just look to see which one looks better. I use 10-bit since the Xbox has to go to 10-bit for HDR anyway. Make sure you uncheck YCC 4:2:2 on your Series X. Also, you may want to skip AMD FreeSync and just use VRR; FreeSync on the LG can disable Dolby Vision in some instances without added benefit. Reviewers are saying that VRR and FreeSync have similar performance (but VRR doesn't have the Dolby Vision loss).
@@MrKenn1230 Thanks man. This is the first bit of advice I've read with some explanations to help. I'm going to jump on and check my settings again. Thanks again!
so 10bit is the best not 12bit?
What are the best settings for a PS5, Xbox One X, or new Xbox on an LG OLED65C8PUA TV?
Thanks for everything. Can I set the color space to RGB (PC) on the LG C2? What happens if I do this?
So basically 12 bit is better?
It shouldn't be. Until 12-bit panels come out, we all have 10-bit panels (on recent TVs). 12-bit just uses bandwidth for detail that, in general, won't show up in the picture most people see.
The Samsung TV can run 12-bit at 120 Hz, 4:2:2.
Great video! Thanks so much for your help! I'm trying to get my LG C1 to hit 120 fps with Ori and the Will of the Wisps. I'm changing it from 60 Hz to 120 in my settings, but when I look at my Game Optimizer it says 59 frames...
nick
Is it happening in other games or only that specific one?
Check the game's settings to see whether it is in 120 fps mode or not. I just checked, and it is a game which can be played at 60 or 120. Make sure it is set to 120 fps (Performance in the game settings).
@@MyGadgetsWorld I'll get back to you
I have my 2022 55" LG CX on 12-bit running at 118.7 Hz, 3840x2160@120. It says YCBCR420 12B 4L6; not sure what that means, but I actually like running RGB instead of the regular color space.
If there is no point in changing it to 10-bit or 12-bit, why has Microsoft added these options? 🤔
Thanks for the video, you did great.
I've got an Xbox One X, not a Series X. I'm pretty sure it can run at 36, so yeah.
Hi, can you tell me which is best for the Xbox Series X, a 55- or 48-inch TV, please?
65”
Lg c1
I've been all over the Internet trying to figure out exactly what TV you need for these settings, and I've come to the conclusion that if it's a 4K TV, you can use any of these settings. I mean any 4K TV, because if it wasn't 4K it wouldn't be able to run those settings, and the Xbox doesn't tell me it can't use those settings when I choose them. Therefore any 4K TV will work. Crank that shit up to 36 and don't let anybody tell you differently.
I have an Insignia Fire TV. Which bit depth and standard should I put it on?
I have an 8-bit + FRC "10-bit" monitor and I usually go with 10-bit 4:2:2. I get less banding when gaming. I honestly can't tell a difference in anything else except the color banding; text and menus look the same.
Thanks for this info.
I have an 8-bit + FRC panel myself, the Samsung Q70B QLED TV, and I unchecked the 4:2:2 box in settings and have the console on 8-bit. You're saying that enabling 4:2:2 and going up to 10-bit on my Series X would be better? Isn't all the SDR content, like SDR games and YouTube videos, in 8-bit anyway, and wouldn't my Series X switch up and down automatically without being forced to 10-bit all the time? Wouldn't 4:2:2 limit my bit depth to 10 so I could never reach 12-bit while it's enabled? I mean, wouldn't 4:2:0 at 12-bit always be better than 4:2:2 at 10-bit if there were content available for that bit depth, don't you think?
4 lanes at 8 Gbps = 32 Gbps, not 42 Gbps.
He did say 32. Maybe his somewhat strong accent made it difficult to hear. You Americans need subtitles when a Brit or an Aussie speaks, which is just ridiculous.
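The lane math above is just lanes × per-lane rate. As a sketch, here are the commonly reported FRL caps for each device (the per-console mapping is as reported in these comments, not something verified against each device):

```python
# HDMI 2.1 Fixed Rate Link (FRL) raw bandwidth = lanes * Gbps per lane.
def frl_gbps(lanes: int, gbps_per_lane: int) -> int:
    return lanes * gbps_per_lane

print(frl_gbps(4, 8))   # 32 Gbps - PS5's reported cap
print(frl_gbps(4, 10))  # 40 Gbps - Xbox Series X's reported cap
print(frl_gbps(4, 12))  # 48 Gbps - full HDMI 2.1
```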
I have a 2019 Samsung Series 6 4K. Can I use these settings, and how do I get Dolby Vision?
Where'd you get your 4K UHD 120 Hz TV, and how much was it?
$1000 and over. I get mine from Best Buy or Costco.
@@MyGadgetsWorld whats it called bro i need a link maybe lol
@@medievalshadows2836 LG CX?
The XSX dashboard font and settings icon are blurry when you select them. Is anyone else experiencing this on their LG CX?
12-bit works for me on a 4K 60 fps TV that does not support Dolby Vision, VRR, or ALLM.
My Samsung is able to sustain 12-bit and PC RGB. Does that mean my Wi-Fi will be less powerful compared to when I had it at 8-bit?
My series s doesn't allow 12 bit
It's 2022 and I was suddenly bugged by this in terms of which is the better option. I blame my PC (long story). So basically, what I gathered is: PC RGB is best for monitors, otherwise Standard is for TVs, and bit-depth-wise just leave it at 8-bit like you said, because the console will work out when it needs to use anything higher. So far, would you say I am on the money? I hope so! Lol
I'm using a Series S with a 1440p 120 Hz panel. What bit depth would you say is good?
Can we get an update of this for the new LG C1 2021 model?
So 12 bit for my new samsung odyssey 7
thanks..liked and sub'd
Thank you so much.
I got a Samsung Q80B, 65 in... I put it on 12-bit and it works fine.
You set it to 8-bit and keep YCC 4:2:2 unticked. Also, don't use the Xbox HDR calibration; it's shite.
What happens to the color bit depth when watching Netflix? Does it stay on 8-bit or switch to 10-bit automatically?
What if you tested this on the C9? Would it change the results?
Almost same