This has to be one of THE most clear and concise explanations of fps vs. Hz I've ever seen, incl. coverage of the downsides of v-sync and why adaptive sync is relevant today. The first three minutes of this video alone should be required watching for anyone trying to understand gaming monitor refresh rate specs.
Because he skipped the whole context of frame buffering, without which it's barely scratching the surface of the topic. I'd suggest watching something like "Refresh Rates, V-Sync Settings and Frame Buffers Explained" by SixteenThirtyTwo, which actually explains why each issue occurs instead of just stating the outcome.
Thanks, might give it a look, but not everybody needs that level of detail. If you're a consumer trying to inform a purchase decision, you're more likely to care about outcomes than the detailed technical explanation of how they occur. There is a balance that needs to be struck based on your goals and audience.
@@wyntje83 Absolutely. But that's not just theoretical knowledge - I think you do need to understand what's going on to know how to set your frame limits and buffering options to your needs if you're not using variable refresh rate.
@@Tarets may u help me? I have an adaptive sync monitor, 165Hz. What would be the best settings in the nvidia panel and in game for the best experience? Even though my game runs fine 90% of the time, in some scenarios even with high fps the game will lag and stutter all around. Maybe some help?
Maybe something to clarify to all: In the nvidia control panel, there is an option to force v-sync on or off. That has had an 'adaptive' and 'adaptive half refresh' option for _YEARS_ now. That has NOTHING to do with VRR :). It's just an old-school option but with a duplicate name. It basically means 'soft vsync lock'. By setting it to adaptive sync, you force a framepaced fps limit at the refresh rate of your monitor, giving perfect frame pacing without turning vsync on in games (and without the double buffering). But the moment you drop below your refresh rate, vsync is dropped and you get tearing instead of a big latency hit. It's like consoles used to do, and it is the perfect way to get good smooth gameplay at a perfect locked 60 or locked 30 if you have a 60Hz non-VRR screen. But if you have VRR, it's useless, and the name 'adaptive' has nothing to do with 'adaptive sync' in monitors :). (It's 'adaptive on or off' so to speak)
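A minimal sketch of that "soft vsync lock" decision, assuming hypothetical present_synced() / present_immediate() callbacks; this only illustrates the idea, it is not NVIDIA's actual driver logic:

```python
REFRESH_HZ = 60
FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.67 ms per refresh at 60 Hz

def present(frame_render_time_s, present_synced, present_immediate):
    """Choose how to present one frame under old-style adaptive v-sync."""
    if frame_render_time_s <= FRAME_BUDGET:
        # GPU kept up with the refresh rate: hold the frame for the next vblank.
        # Result: perfect frame pacing and no tearing.
        present_synced()
    else:
        # GPU fell below the refresh rate: show the frame right away.
        # Result: a possible tear, but no extra v-sync latency penalty.
        present_immediate()
```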
I have a question, I have a gaming laptop (GTX 1650) but I want to buy a gaming laptop that has a freesync option, will it work anyway even if I have an NVIDIA card?
Ok, that's just adaptive v-sync. It actually works just like normal v-sync (double buffer), but the difference is that it will disable itself and allow the latest frame available to show if the GPU's FPS drops below the target frame rate of the monitor. This is why it causes a tear. It will lock again after performance recovers, giving you double buffering again. I think some fast paced games still use that on consoles in order to get more responsiveness and avoid normal v-sync stutter when dropping FPS.
One thing to note is AMD FreeSync Premium Pro monitors have the display's color gamut / luminance values embedded in the EDID so HDR games can calibrate to the display's capabilities a lot better than some stupid brightness sliders ever could, AFAIK this is something NVIDIA could interoperate with instead of letting Dolby reinvent the same thing and charge royalties for every device in the chain....
Freesync Premium Pro is a completely proprietary extension and isn't a part of the VESA standard, Nvidia couldn't support it even if they wanted to. It's not unique in embedding static metadata, that's just part of the base HDR10 spec, and it doesn't embed dynamic metadata like Dolby Vision/HDR10+. It was just a way for specific AMD-partnered games to tonemap directly to the display to improve latency. It's also been functionally abandoned by AMD, with no games added to the initial list of 8 that it launched with. Freesync Premium Pro is most notable for breaking HDR on supported displays with AMD GPUs, which is why Tim didn't recommend the AW3423DWF for over a year if you owned an AMD GPU as it would completely break tonemapping. Dell has since fixed this in firmware updates by just using the default HDR pipeline with AMD GPUs instead of the broken and abandoned Freesync Premium HDR pipeline, but it's still an issue on a number of Samsung monitors. Your entire comment is made up nonsense.
@RAM_845 AMD can't exceed 600 nits brightness with freesync enabled on monitors that hit 1000 nits with gsync enabled on nvidia. It's been talked about on this channel. You have to disable freesync to exceed 600 nits.
@@MrDutch1e Do you have the name of the video that talks about that? I would like to learn more on that as I have an AMD GPU alongside FPP, but I'm planning to get a Ti Super early next year to go with my DWF.
can anyone help here .. i have an RX 7900 XTX and i play 4K 120Hz .. almost buying an LG monitor with FreeSync Premium Pro .. shall i buy it or not? guys, some advice needed here @@MrDutch1e
can anyone help here .. i have an RX 7900 XTX and i play 4K 120Hz .. almost buying an LG monitor with FreeSync Premium Pro .. shall i buy it or not? guys, some advice needed here @@Jas7520
Thank you so much for this extensive, deep and yet simple explanation of the topic. If there are other resources like this online I have not stumbled upon them. Great work!
Thanks so much for the clear explanation! Things have changed so much that I had myself in knots trying to understand the current differences in G-sync and freesync. Learned some new things too. Great vid!
Thank you for this, I wondered for ages wtf adaptive sync is compared to g-sync and freesync and I couldn't find a decent explanation anywhere. I thought adaptive sync was a frame sync tech for hdmi or something lol
My QD OLED is rated for both, (and tested by Tim on this channel) 1000 nits is too bright for my gaming environment though. HDR on anything but OLED doesn't really work well across the board, especially with fine bright details like a starry sky. Can't beat light emitting pixels.
Here in Portugal, sometimes retailers don't even put basic specs in the spec sheet like Refresh Rate or Panel Type let alone Adaptive Sync or VRR. Fortunately we have your channel to save us from ignorance.
It really does work like magic. I can still comfortably play games at 60 FPS even after being so used to higher refresh rates because of adaptive sync. The image just looks smoother and the lack of input lag and tearing even at higher frames is so nice to have.
I am lucky to have come across this video. I was having a very hard time buying a monitor because I was afraid I needed a G-sync monitor and didn't want to lose out on a feature, but this video helped me realize that what I thought was wrong. In a good way, because now it's much easier for me to pick a monitor since I know I won't miss out on anything if I don't buy a monitor with G-sync.
This video is misinformation. Displays with hardware G-Sync modules are objectively better. FreeSync doesn't handle odd refresh rates with as much stability as a display with a hardware G-Sync module.
@@maxirunpl I got a 1080p 24in TUF gaming monitor that does have G-sync but I think what I regret is not getting a 2k monitor instead because 1080p is just ok. I'm not sure if it's because of the specific monitor I have or if it's because going from my laptop's 2k display to 1080p made me realize that downgrading the resolution was more of a compromise than I thought it would be.
@@boredomkid8271 1080p resolution is great on a 24" monitor, but yeah, once you've tasted 2k on a laptop screen, which is probably smaller than 24", you can't go back.
Besides Freesync and G-sync, there is another form of VRR called HDMI Forum VRR. It is part of the HDMI 2.1 standard, but it doesn't need HDMI 2.1 to work. There are plenty of HDMI 2.0 monitors that support it. HDMI Forum VRR requires at least an AMD 6000 card or an Nvidia RTX 20/GTX 16 series card. Regarding game consoles, the Xbox Series S/X supports both Freesync and HDMI Forum VRR. The PS5 only supports HDMI Forum VRR. For those that have Intel graphics cards, they only support adaptive sync through DisplayPort. Edit: at 9:52, Tim forgot to mention that you need a DisplayPort connection in order to use G-sync on Freesync monitors. But how come I'm using G-sync through HDMI? Your monitor supports HDMI Forum VRR and that's what's being used. (Just like Freesync, HDMI Forum VRR is also labeled as G-SYNC in the Nvidia control panel.)
But it is HDMI - it would be best if that crap was completely abandoned. Their proprietary bullshit with all the requirements like having active chips in cables, needing specific hardware for every single part, disallowing other technologies being used at the same time, or their DRM that makes everything look like crap... nobody should support such scumbags.
The PS5 also does not feature LFC support, which is a major disappointment for me as my 4K 120Hz LG TV with HDMI 2.1 flickers in games that often go below the refresh window and back, like FF XVI. My G-Sync Compatible monitor with HDMI 2.1 does not though, which implies more thorough VRR testing for its certification.
Wait so if I use freesync with a monitor that uses hdmi forum vrr am I even experiencing vrr? Should I turn off freesync? I play on a ps5 with the Dell g2724d. This monitor uses hdmi 2.0 but it supports hdmi 2.1 features such as vrr.
@@cc-fz5ne Try it. maybe it works better, maybe worse. You never know until you try it. It really depends on how Sony, Dell and the HDMI cable maker implemented their parts (and yes - due to how stupidly anti-consumer HDMI is - cables can block such features).
@@cc-fz5ne The PS5 does not support Freesync (with LFC) but supports adaptive sync through HDMI VRR, which means that if the fps gets lower than the refresh window of the monitor (which would most often be 48Hz-120Hz) it will revert back to normal v-sync. This can result in stutter or flicker. You don't need to turn off Freesync, but you still won't get its benefits on the PS5, just the regular HDMI adaptive sync without the software magic.
LFC - Gsync simply had that as it was part of Nvidia's module. So every Gsync monitor at the very least supported 1-60Hz. FreeSync had no such requirements and there were monitors with a VRR range as low as 48-60Hz (the absolute worst of the worst) - such extreme crap was rare, but LFC itself was also rarely a feature of early FreeSync monitors.
That's why you should use FreeSync Premium, as it requires LFC. If I understand correctly, G-Sync Compatible may also not support LFC. Moreover, it is unclear whether TVs with HDMI VRR support do support LFC.
I wish there was more attention given to the variable pixel overdrive of G-sync modules, as it can make a massive difference in ghosting and overshoot reduction for LCD panels while also just making it not a pain in the ass to change overdrive modes every time you switch to something running at a significantly different framerate. It was a shame that as big of a difference as it made, G-sync modules rarely showed up in monitors and added way too much to the cost for most people to justify. I'm honestly glad that we're seeing more OLED gaming monitors hit the market, as there is no need for overdrive settings since each pixel is lit individually.
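Roughly what "variable overdrive" means in practice: the overdrive strength is looked up from the refresh rate VRR is currently running at, instead of one fixed user setting. A small sketch of that idea; the breakpoints below are made-up example values, not any real panel's factory tuning:

```python
OVERDRIVE_TABLE = [
    (48, 0.2),    # low refresh: gentle overdrive to avoid overshoot / inverse ghosting
    (100, 0.5),
    (165, 0.9),   # high refresh: strong overdrive to fight ghosting
]

def overdrive_strength(current_hz: float) -> float:
    """Linearly interpolate overdrive strength for the current VRR refresh rate."""
    lo_hz, lo_val = OVERDRIVE_TABLE[0]
    if current_hz <= lo_hz:
        return lo_val
    for hi_hz, hi_val in OVERDRIVE_TABLE[1:]:
        if current_hz <= hi_hz:
            t = (current_hz - lo_hz) / (hi_hz - lo_hz)
            return lo_val + t * (hi_val - lo_val)
        lo_hz, lo_val = hi_hz, hi_val
    return OVERDRIVE_TABLE[-1][1]
```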
G-sync / freesync implementation in OLED panels is a joke. You get strobing and flickering even with minimal framerate variations and it can't be resolved due to OLED technology itself. That's why there are no OLED panels with a true G-Sync module, as it wouldn't help either.
@@d.ryan96 - That's hyperbole and you know it. If you're talking about VRR gamma changes during large framerate changes, that's not just an OLED issue.
@@40Sec Sorry pal, my old XB272BMIPRZX doesn't have these issues at all, tested it extensively with emulators and other means to push some non standard variable framerate variations. It doesn't strobe even on loading screens. After some experience with LG Oled CX tv, Samsung C27RG54FQR and MSI OPTIX G27C4 I can confidently say that non G-Sync module vrr just doesn't work well with my expectations. I could even argue that vrr is broken outside of TN panels due to ghosting and input lag of IPS panels when they stray too far away from maximum hz range.
I might add that I'm an early adopter of G-Sync and mostly follow the rule of capping the framerate slightly below the max Hz of the screen for consistent input lag and perceived motion. In more demanding games I tend to cap the framerate to something that's achievable 99% of the time. That's something that totally breaks e.g. a 160Hz IPS panel, when you want to cap fps at something closer to 100 fps.
@@d.ryan96 - Really not interested in anecdotes when speaking about inherent properties of technologies. My statement about VRR gamma shifts being an issue on both LCD and OLED wasn't an opinion, regardless of whether you've gotten lucky/unlucky with the panels you bought.
To clear up common misunderstandings of V-Sync:
* It's possible to play without input lag; it doesn't add any until the framerate exceeds the refresh rate or pre-rendered frames are not properly set. Input lag is highly influenced by the frame time, which means that whatever sync technology you're using, you'll be bottlenecked by your monitor's refresh if you want no tearing.
* Duplicated frames have no effect on input; they're only a visual artifact, and the higher the refresh, the less noticeable. This is completely alleviated with Free/G-Sync.
* V-Sync only works in fullscreen.
Finally, V-Sync actually can help adaptive sync technologies by providing full frames, only that Nvidia users should know that Low Latency needs to be set to Ultra for that. It used to be called Max Pre-rendered Frames and the default is 3, which is pretty bad; setting it to Ultra makes it 1, which it should always be tbh. Say for example you have 144Hz: capping with RTSS to 120 fps in fullscreen with v-sync using Free/G-Sync will yield extremely good visuals with no compromises so long as your hardware can keep up.
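The numbers behind that 144Hz / 120fps example, just as arithmetic (the figures come from the comment above):

```python
refresh_hz = 144
cap_fps = 120

refresh_interval_ms = 1000 / refresh_hz  # ~6.94 ms between refreshes
frame_time_ms = 1000 / cap_fps           # ~8.33 ms between rendered frames

print(f"One refresh at {refresh_hz} Hz lasts {refresh_interval_ms:.2f} ms")
print(f"A {cap_fps} fps cap means {frame_time_ms:.2f} ms per frame,")
print("which keeps the frame rate inside the VRR window so v-sync never has to hold a frame back.")
```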
Going to have to edit the title in a dozen days! This was great to hear as someone who has paid very little attention since getting my XB270HU, the first all-boxes-ticked 1440p display.
Thanks so much for making this video and clarifying this for me. I was up in the air about the Asus ROG Strix 27 inch LED panel that just recently came out, the one you reviewed in your top five, and since it didn't say G-sync I was worried it wasn't going to work well.
The fact that these are essentially the same technology with different brand names on them is all I needed to know. I was between 2 of the exact same monitor wondering if I should get the one that’s “G-Sync compatible” since I have an NVIDIA card lol. Thank you sir.
One problem remains though. It isn't a major problem, but it is a problem nonetheless for some of us. While AMD Radeon GPUs can enable Freesync over an HDMI port and cable on some of the 60Hz, 75Hz and 100Hz HDMI-only Freesync monitors, on those same monitors Freesync can't be enabled over HDMI with an Nvidia card.
@@tmsphere ya, I should just throw away my hdmi cable and hdmi freesync monitor just cause some random mor0n on the internet thinks it should 'go' away". kcuf off. Go troll somewhere else.
It's a good thing that things like this are finally getting better; a G-sync monitor was unaffordable at those times, and all of NVIDIA's peripherals were ridiculously expensive. I didn't know that NVIDIA also offered the G-sync technology to other brands; I did know this about FreeSync, but I didn't know that G-sync was also released to other brands.
I would appreciate a clear explanation on v-sync options "fast" and "adaptive" and what you should use. Also the effect of low latency mode. (Nvidia control panel)
Fast v-sync: Also known as true triple buffering. Lets the GPU run the game without an FPS limit, exceeding the monitor's maximum refresh rate to get less input lag without tearing. It does that by always showing the most recent frame rendered by the GPU without any sequential order in its buffers (like normal triple buffering does), discarding the others in each refresh cycle of the monitor. This type of v-sync causes nonlinear motion, meaning that object motion appears to slightly "accelerate" or "decelerate" at different points when moving from point A to B. There's a way to make it less distracting by always running the game with an FPS above the monitor's refresh that's divisible by it. It also works just like normal triple buffered v-sync when running the game below the maximum refresh rate of the monitor.

Adaptive v-sync: Works just like normal v-sync (double buffer) at the monitor's native refresh rate, but the difference is that it will auto-disable and allow the latest frame available to show if the GPU's FPS drops below the target frame rate of the monitor. This causes a tear, but gives less input lag than normal v-sync or regular triple buffered v-sync. It does that in order to get more responsiveness and avoid normal v-sync stutter when dropping FPS, but will lock again after performance recovers, giving you double buffering again. Unlike fast sync it has consistent motion, and it's better to use when your GPU can provide a stable locked frame rate. Personally, I would use adaptive v-sync (from Nvidia Inspector) with a frame limiter locked around 0.01~0.05 below the target refresh to remove the normal sync buffer and get better latency without any screen tearing.

In regards to Low Latency Mode, it tries to reduce the number of frames the CPU can pre-render before the GPU takes over. It may cause stutter in CPU-bound games, but may provide better latency in GPU-bound ones. Turning it On means it only lets the CPU pre-render 1 frame, and Ultra eliminates all frames the CPU can pre-render. Try each to see if you get better results.

TLDR; If you don't have a VRR screen but have a GPU that vomits FPS, use fast. If you want consistent locked frame pacing, but want a little less input lag in performance intensive moments, use adaptive. LLM may cause stutter depending on the type of game.
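A toy model of the fast sync behaviour described above (the "show the most recent completed frame, discard the rest" part), assuming a simple list of finished frames; this is a sketch of the concept, not how any driver actually implements it:

```python
completed_frames = []  # each entry: (finish_time, frame_id); the uncapped GPU keeps appending

def on_vblank():
    """At each monitor refresh, scan out the newest finished frame and drop older ones."""
    if not completed_frames:
        return None  # nothing new finished: the previous image simply stays on screen
    completed_frames.sort(key=lambda f: f[0])
    newest = completed_frames[-1]
    completed_frames.clear()  # the discarded frames are where the uneven motion comes from
    return newest
```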
So there are no disadvantages when I use my gaming PC with an Nvidia GPU instead of an AMD one to play on a new TCL TV? The TV says "AMD FreeSync Premium Pro" and "HDMI 2.1 VRR" are supported but G-Sync is not. But if I got everything right, that should not be a problem, right? Thank you very much :D
So, thx to your recommendation I got myself the 34" AW3423DW. OMG - I am like 2 weeks into this game and even looking at my desktop makes me smile. FreeSync all the way (6900XT): no tearing, no input lag, no ghosting, colors from another world, perfect black and jawdropping HDR content. (driver brightness -15, saturation 135) ❤ Thank you ❤ P.S. The only drawback is the pixel refresh every 4h (a 6-8 minute break) - which is supposed to keep the panel healthy.
You can delay the pixel refresh until standby mode is engaged (at least you can on the DWF model). It has a 3 year warranty; unless you're spending an awful lot of time with static white images I wouldn't worry.
@@oktc68 no warranty for me - bought myself a cheap returned piece. I am at risk so I'd rather not gamble with endurance just yet ;) Once you get used to "recommended" breaks it ain't that bad. I actually turned auto prompt off and just occasionally check if panel health went 🟡 - then I move to side screen for a bit and turn refresh on.
You know there's a shortcut for switching between SDR and HDR on Windows? Windows Key + Alt + B. I don't think you need to change any saturation and brightness settings to see HDR.
@@pxwxontop it's not about HDR, more like personal pref. for oversaturated picture :) + AW released a new firmware and color profile for this screen as they were both somewhat flawed. much better but still running mine bit oversaturated by choice ^^
Thanks for making this video, because having all this info in one well made video is hard to find. I have the G-Sync Ultimate OLED Alienware; I mainly got it because of a great deal at the time and was concerned whether it would work on my AMD GPUs, and it works just fine. You just need to enable Adaptive Sync in the AMD software and you're good to go.
Nice. I have it the other way around. I will buy a monitor with freesync and have a nvidia gpu. Because g-sync is too expensive for me and 99%(not really, but a lot) of the monitors use freesync, so I couldn't find any that would fit me.
I work in film and video post-production and often need to mix different formats, so I'm aware of these issues (plus interlacing, don't get me started on interlacing). I often _avoid_ watching videos on this subject, because even when YouTubers actually have a clue about the technical details (and let's be honest, a lot of them don't), they tend to be sloppy with language in a way that misleads viewers that don't _already_ understand these issues, and my forehead gets sore from all the facepalming. Anyway, long story short: congratulations (and thank you) for wording everything absolutely perfectly. 🤓👍 I would have added a little graphic showing a strip of "frames being rendered" and "frames being shown", with arrows pointing from one to the other (to illustrate the discontinuity caused by mismatched frame and refresh rates), but I think the description and slow motion were clear enough to get the point across.
was loving g-sync until i upgraded to a 45" LG oled. now it's just constant brightness flicker until i turn adaptive sync off. Gonna have to make do with fast sync and low latency mode.
Look out for firmware updates for that monitor which may fix it. Sometimes it's either that, a less than premium cable, using HDMI instead of DP or turning the feature on then fully restarting the monitor. Because it shouldn't do it.
@@j3m638 No, all OLEDs do this with VRR, as the gamma shifts across different refresh rates. I got it on all my LG C-series and the Alienware QD-OLED. Nothing you can fix other than turning off VRR/G-Sync.
@@j3m638 latest 45GR95QE firmware 3.09 installed, using the DP 1.4 cable it came with. I even tried the one my previous 34GN850 came with. I could restart, turn the monitor off, do this every day and it won't change. Love the monitor, but while everyone is touting OLED as 'the way forward', n o b o d y is mentioning that having VRR (and especially a costly power-hungry g-sync module) in an OLED panel is completely pointless when the second you launch any game, half of the frames come out, entirely at random, at a different gamma.
Good to know, thanks. I am changing my setup and may be purchasing a monitor in the near future. Also, I finally understand Vsync and Adaptive sync a bit better now.
In my experience, even with a new FreeSync labeled monitor you can have problems with G-Sync enabled. One example here is the Dell G3223Q (real garbage TBH). I'd recommend not ignoring the G-Sync Compatible label.
I have yet to see a G-Sync monitor with sharpness settings, which is why I go with Freesync instead, as it's nice to have in those games where FXAA or such is forced, or when used with a console where textures might often be blurry because of forced FXAA or other blurry settings.
@@MaaZeus Sure, and that's a semi new feature, but that won't work for consoles on the same monitor. I'm not going to buy 2-3 different screens for the same area/space of gaming.
Some very solid info here on the differences between G-Sync, FreeSync and Adaptive Sync. I am in the market for a 4K monitor in the New Year, so watching this video was very helpful.
Same here, did any catch your eye? I'm barely getting into PC gaming and PC builds. I'm trying to invest in one that can handle all the top end and newest games.
@@BillaBong1 Yes, one did. Unless I see something online that changes my mind in the next few weeks, my money is on the Gigabyte M32U. This is a 32 inch 4K 144Hz IPS monitor. This looks to be a good all rounder which should provide a significant upgrade over my present setup. I will also be buying one of the new Nvidia 4070 Ti Super graphics cards when they come out to power this new monitor. I am eagerly looking forward to some buttery smooth gaming on this new rig.
@@HDEFMAN1 Heck yeah bro!👍 You hit the right checkboxes. I'm also looking for a 32 inch 4K screen to fit in a spot I plan on turning into my lil gaming area. I got the RTX 4070 but I'm gonna save for the Supers when they come out. Got the i9 14900K paired with an MSI Z790 motherboard, still debating between 48GB (2x24GB) or 64GB (2x32GB) Dominator Titaniums, or the G.Skill Trident Z5s. What do you think I should go with? Any other recommendations?
@@BillaBong1 If you plan on doing any video editing or content creation or any task that requires some heavy lifting then I would say that you could find a use for 48GB or even 64GB, if not then 32 GB may well suffice. As to memory brands I imagine that any well known brand will work as long as it is compatible with your motherboard. One thing that often gets overlooked when putting together a new build is sourcing a good psu. I went with Seasonic for my last build. You don't want to spend all that money on expensive components and then use a cheap psu to power everything. Sounds like you are going to have a sweet system when you put it all together.
This is a fantastic video -- thank you. I am looking for PC parts for my next build and I was thinking of leaving the monitor for last, just because I was worried about whether I was going with AMD or Nvidia...
The worst part about monitors: "Hey, the LG 27GP850 seems nice - let's see if I can get it nearby.... LG 27GP850P-B .... 2 other letters - oh, very different specifications". That number/letter salad and dozens of variants just makes it a chore to find a good deal.
I would have liked it if you had shown judder issues with camera-pan demos. Adaptive sync is a very important feature, way more so than more modern and currently marketed ones.
Sad that the problem is very rarely discussed and you have no clue which panels are affected. (In my case it happened when fps dropped below the VRR range; maybe the "low frame rate compensation" feature would fix it?)
In some cases, flickering can be caused by any type of performance overlay (like rivatuner overlay), so you should try disabling it completely and retry. Might do nothing, but it's worth a shot.
@@TheMaximusPride - You're right. I have flicker with GSync on my TCL in some games (but not others) with my Nvidia GPU, while the Freesync Ultimate in my Samsung panel is literally perfect in almost every game with no flicker at all on my AMD GPU. Both of these displays are VA. At least with my two particular panels, Freesync works considerably better than GSync, but it really seems to be on a panel-by-panel basis how well VRR works for a given person. In some displays it's flawless (my Samsung), in some it has varied results (my TCL) and in some it's just plain awful and doesn't work right. I wish the industry would get a better handle on VRR because it seems like right now there's too wide of a variance of VRR performance depending on the panel type, manufacturer and processing in the display itself, and when you do get flickering, it ruins the game.
Boy are you behind the news with respect to FreeSync vs G-Sync. When Nvidia developed G-Sync it was a proprietary technology. The monitor manufacturers had to buy the special G-Sync hardware from Nvidia, making these monitors more expensive. About a year later AMD developed FreeSync, which was an open source and open hardware technology (meaning there were no royalties). When this came out, for every true G-Sync monitor sold, a lot more FreeSync monitors were sold, and all TVs were FreeSync. All the Nvidia GPU owners complained to Nvidia, so Nvidia quietly adopted FreeSync; they were able to make their GPUs compatible with FreeSync through software and included this in their drivers. The other thing Nvidia did was to bribe most TV manufacturers to market their TVs as supporting G-Sync on their packaging. However, when reading the fine print you discover that these TVs also support AMD GPUs. This means the actual technology is FreeSync. I doubt that any true G-Sync monitors are made today, and all TVs are FreeSync.
My monitor is from the early days of G-Sync Ultimate back in 2018, before they quietly lowered the standards for the qualification in 2021. As such, it has an older G-Sync module in it, and I've tested my monitor and found it doesn't work well with AMD cards. I bought it for the incredible HDR performance and color accuracy, and I honestly don't regret buying it, even if it only hits those values at or below 98Hz. I usually frame cap all my games to 90Hz, anyways.
@@Quast LOL Right? According to Nvidia, the qualification would have been impossible for OLED monitors to meet since it required 1000 nits peak brightness as a minimum, so they changed it to "lifelike HDR". I think they could have just added a separate qualifier for OLEDs.
@MasterZoentrobe or just stated that for oleds it's only 1000nits @ max 2% or something, and let's say 300nits @ 100% window size. In my experience, an oled with max 300 nits but peaks of 900 at up to 5 or 10% window size, far beats any other panel type. The ability to have 900nits right next to a 100% black pixel is just something else.
I do think that VRR struggles when the frame-to-frame variance becomes too high though. Might be worth looking into. Additionally, you still need to perform some tricks (RivaTuner frame limit, or Radeon Chill, Nvidia Reflex) to keep the frame rate below the top of the VRR range in high fps scenarios.
Playing below vrr makes no sense (it's probably 48fps most often), and you won't generate more frames - practically every game now has limitations in settings...
I have OLED so I just turn VRR off; the only thing I turn on is the in-game frame cap. VRR relies on games having level performance, which most don't, and if a game has level performance then it's usually good performance. @@The_Noticer.
This channel was great. I have been following it for the past couple of months as a guide to my monitor buying decision. Along the way I learned a lot of stuff: G-Sync vs FreeSync, HDR10 vs HDR400, 10 bit vs 8 bit, HDMI 2.1 vs HDMI 2.0, 165Hz refresh rates, ghosting vs overshooting, sRGB, DCI-P3, response times, 4K vs 2K vs FHD displays (heck that was hard), IPS vs VA vs TN vs OLED vs nano IPS. Yesterday I bought a monitor and now that journey is coming to an end. I feel sad. I will miss these folks and wish them the very best. Go break a leg guys!
i went through THREE different 7900XTX, at least 7 windows re-installs, 4 freesync premium monitors (including a $1000 alienware monitor) and i couldn't get rid of the issue with my monitor randomly flashing a black screen. no matter what i tried, it kept happening. in the end i bought a msi 4090 suprim and it solved everything right away. i didn't even have to do a clean windows install or remove the AMD drivers from my previous GPU. i'm currently using the same alienware QD-OLED freesync monitor that i was having black screen flickering on with 2 different AMD cards, but i'm not having the issue with my nvidia card. the amount of time and money i wasted troubleshooting, packing/shipping/returning/driving/RMAing to get rid of this issue was not worth the "savings" of buying an AMD card over nvidia. i don't really care what the prices are, next time i upgrade i'm buying only nvidia. the time and stress i won't have to put in is worth the extra $600 that the RTX 4090 cost me
I've just recently built a new gaming PC, ended up going full Team Red with AMD for both CPU (R5 7600X) and GPU (RX 7800 XT).. my monitor choice was based more on price than it was on features, and I ended up with the LG 34GP950G, which is branded with "G-Sync Ultimate"... here's the clincher: I have serious stuttering issues with virtually all full-screen videos when I enable "Adaptive Sync" in my AMD driver software... I can't say I've noticed issues with local media files (not many to test with), but in YouTube, Twitch, and Netflix, severe stuttering starts up mere seconds after entering full screen, sometimes getting noticeably worse in stages every few seconds thereafter. this stutter goes away instantly when I leave full screen. I must point out that this issue simply does not happen when I have this setting turned off. furthermore, enabling the "refresh rate overclock" on my monitor (allowing up to 180Hz) or leaving that off ("only" up to 144Hz) does not seem to make any difference, it still happens with Adaptive Sync enabled. FWIW, I have heard the current beta drivers (23.30.something..? I think?) do apparently fix this, though I haven't tried them myself as yet.. I'm currently still on 23.20.01.10. so with regards to the theory behind the implementation of VRR technology, this was indeed a very interesting and informative video, but it definitely doesn't paint the full picture.... at least not smoothly (sorry, couldn't resist that one). regardless, I just jumped from FHD at 60Hz to UWQHD at 180Hz, and it's blowing my mind... I don't think I could ever go back after this. (:
Just a theory, but in my case I have an LG CX OLED and one of the first things I noticed when watching content via the built-in apps on Netflix, YouTube etc is that I have terrible stutter on panning shots. After going down the rabbithole, I found out that because the response time is so instantaneous on OLED panels, anything that doesn't reach its native 120Hz has a really noticeable slideshow effect, since 24Hz, 30Hz and 60Hz content gives a direct presentation of the smoothness of the source. My only saving grace is the motion interpolation on board, but the artifacts can be pretty bad at times, so it has become a pet peeve for me. When reading your comment I just had the idea that when you used adaptive sync, it must have lowered the monitor's refresh to match the fps of the video, which in turn produced the stuttering due to your panel's instant response time not masking those frame changes like a traditional LCD does. In my research I learned that because an LCD's response time for pixels turning on or off is quite a bit higher, it in turn makes any motion or panning a lot smoother, as low refresh rates are basically less perceivable than when the response is instantaneous like on OLED, and possibly because your monitor displays a fixed refresh rate when it's turned off, that helps it look smoother? I may be wrong though, as I don't use dedicated PC monitors. G-sync works with the LG CX & 4090 and I have tested content with it on & off, but it still seems pretty unsmooth to me in both scenarios. Joys of OLED!
@@09RHYS I've never used any super high refresh rate OLEDs before, so that is very interesting indeed.. the weird thing is that I would expect my monitor refresh rate to drop to either 60Hz or 30Hz in the case of YouTube vids in full screen, but not only does it feel slower than that, it also feels inconsistent, like it's constantly "hunting" for the right refresh rate to use. really annoying..
@@The_Noticer. I'm familiar with that issue, but unfortunately that is very different. I've tried that, and tested with numerous browsers (Chrome, Edge, Brave, and Firefox). this is specifically VRR causing havoc.
@@AndreGreeff If I'm not mistaken, I believe non 1080p60 streams from YT are at 24Hz? Maybe with adaptive sync turned on, it could be having a weird mismatch between the G-Sync Ultimate module & your AMD card, possibly overcompensating the sync and not enabling the 1-180Hz VRR range properly? It does sound pretty annoying after paying a good chunk of money for it all and then having to manually switch when you game or play media. It could also well be the browser, or heck even the stream, causing it to happen: maybe it isn't a constant fps but a variable one, thus making the stream feel unsmooth when adaptive sync is turned on, making it seem faster or slower than it is meant to be. I know I have issues with any variable fps media I play using the built-in media player on the CX via DLNA casting from my phone, where because it isn't a fixed fps, I get really bad motion handling with the aforementioned interpolation as well as with any 29Hz/25Hz content, which presents as a tiny stutter every 5 secs and is jarring when things get very fast in motion or panning shots. Honestly, monitors and TVs will never cease to baffle me when it comes to the source material's fps rates and the handling of it! haha Also, for some reason I read your first paragraph thinking you were referring to owning the new Alienware OLED! Sorry, it was a little late and I think the person above your comment at the time said they had one I believe 😅 But I guess it still is relevant to the previous theory due to your panel still having super low response times with a Nano IPS. Also, one last thing: have you tried disabling any of the monitor's own features when using Adaptive Sync? That may be worth a try to see if any of the features are what's causing the issue. Anyways, I hope you have some luck with figuring it out, maybe the beta drivers will be the best solution until an official driver comes along, if all else fails! 👍🏻
I'm still a bit scared that the issues with my HP Omen 25 will repeat - when switching it to FreeSync (using a 3060Ti) it absolutely messes up the colour saturation, especially greens, and lowers the brightness to 90. When manually upping it again, it sometimes switches back to a non VRR 'custom' mode - so I usually don't use that feature. I mainly just noticed that some applications cause it to fizzle out, with the screen going black, which is super annoying, and I'm not sure if that's just cause they're technically 'incompatible'. That's what the nVidia Control Panel says, but still lets me activate G-Sync, and the Pendulum Demo looks fine, too. (why does that peak to 100% GPU usage tho, revving up the fans?)
@@nonyabusiness-f9e I just assumed it was standard behavior (definitely heard of random flickering/black screens in some apps before - disabling HW acceleration fixes them - and the colour issue is just dumb implementation by HP), and I got it on a good discount. It wasn't really new back when I got it, and couldn't check VRR back then cause I was still on my good old 960Ti for some time afterwards. It's on 11000+ hours SOT now, and I'm still quite okay with it until an upgrade that truly feels worthy and worth it, and I actually don't really use/miss VRR on a day to day basis.
Hey there, is there a way to make a guide on how to properly set up freesync/gsync in games? Because there's a lot of conflicting information online, like for example: which kind of vsync you need to use (double buffered, triple buffered or none at all), do I need to keep the game in fullscreen mode or is windowed maximized enough, do I need to limit my frames below a certain threshold via RTSS. Thanks for your work!
Premise: playing without vsync, freesync, and gsync is the gold standard, because it guarantees the best latency in competitive games, and this should be the ideal configuration, unless you have screen tearing that you can notice. If you want to play with gsync active, you have to make sure that the game's frame rate never exceeds the monitor's refresh rate! So if you have a particularly heavy game (or a screen with a lot of Hz) and you want to play with gsync active, there is no problem in setting a limit on the maximum fps. If instead, with gsync active, you expect to exceed the maximum fps, this can definitely introduce a lot of input lag, so it is better to limit the maximum fps. There are different ways to limit the maximum fps. The best one is from the game engine, in the game settings, as it adds latency but less than the alternatives. If this is not possible, the second best method is from the nvidia panel (or from the panel of the gpu brand you are using); it will add more lag than the game limiter, but it's better than not using anything. The last alternative is to limit from RTSS; RTSS can be advantageous in some cases, because it allows very precise frame rate limits. Also, if with gsync you want the maximum reduction of tearing and the maximum consistency of frames, you should enable vsync, and consequently also triple buffering. Obviously this scenario adds latency considerably. The only advice I can give you, that nvidia and amd will never give you, is: do not use technologies like gsync or freesync if you want to play with low latency or competitive games, and above all, use them only if screen tearing is clearly present; if you don't have screen tearing, it does not make much sense to enable these settings. Remember that you are playing games, not watching games, so better latency is preferable in almost all cases.
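For the "limit fps from the game engine" option mentioned above, a minimal sketch of how such an in-game limiter typically works (sleep until the next frame deadline); target_fps and simulate_one_frame are placeholders for illustration, not any real engine's API:

```python
import time

def run_capped(target_fps: float, simulate_one_frame, num_frames: int = 300):
    """Run a fixed number of frames, pacing them to the requested frame rate."""
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        simulate_one_frame()                     # the game's update + render work
        next_deadline += frame_budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)                # wait out the leftover budget for even pacing
        else:
            next_deadline = time.perf_counter()  # missed the budget; resync instead of spiraling
```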
I just want to know when will the VRR flicker be fixed. My Samsung Q80T has intense flickering below 80fps and after spending countless hours online searching for a fix the conclusion is this is still unsolved. It's what is keeping me from switching to OLED, I read the issue is more present there due to deeper blacks and higher contrast.
Of the two adaptive sync monitors I've owned, neither has flickering. The AOC 24G2 is a Freesync Premium product, no flickering. The MSI G2774QPX I bought recently is G-sync Compatible, AKA generic adaptive sync. No issues there.
Plz add a series of videos where u make the same content, how to set up a gaming monitor, for different brands. U could make one such video even every 3 or 6 months. Once an Asus monitor, next AOC, LG, Gigabyte, Acer and so on. This helps a lot and has other benefits as well if u think about it. Thanks
Is VRR bad for competitive games? I mean it is better to see a half frame as soon as possible to see an enemy. Yes, u have this split between frames, but it is the fastest method.
One note: I see people saying to lock a 144Hz monitor at 144fps, and this is WRONG. At the max refresh rate, adaptive sync stops and V-Sync (if you have it on) takes over. Always lock your FPS below the max refresh rate of your monitor, and tbh you should lock at whatever FPS you can HOLD, not peak. This constant lock will help with VRR flickering, especially on OLED panels.
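A tiny helper reflecting that advice (cap a few fps below the panel's maximum so adaptive sync stays engaged). The "3 fps or ~2%" margin is a common community rule of thumb, not an official figure:

```python
def suggested_fps_cap(max_refresh_hz: int) -> int:
    """Return a frame rate cap comfortably below the panel's maximum refresh rate."""
    margin = max(3, round(max_refresh_hz * 0.02))
    return max_refresh_hz - margin

for hz in (60, 144, 165, 240):
    print(hz, "Hz panel ->", suggested_fps_cap(hz), "fps cap")
```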
I just assumed my freesync TV wouldn't work with gsync so I didn't even try turning it on. I've been gaming on my new PC since February but only started using gsync a couple of days ago... what a difference! I kept turning graphics settings down before to be able to hit 120fps consistently. Now I can just max everything out, and even if it's only 90-100fps it still feels smoother than before and looks WAY better.
Are there actually monitors that still use native G-Sync? Not Freesync with G-Sync compatibility, actual native G-Sync. Back then it was a premium feature for expensive monitors. Nowadays even 1000€ monitors just go for Freesync haha
Have had the same 144hz 1080p monitor from ASUS for like 6 years now and never knew FreeSync would work with my then 10 series, and now 20 series GPU, so I just never enabled it/looked into it at all. Super curious to see how it works, and really surprised it's just a pretty simple process. Really glad I found this video.
No mention that AMD introduced Freesync as an open standard and forced Nvidia to abandon their strategy? Ah yeah, good old Tim not telling the whole history and framing this as marketing.
And monopoly is dangerous. Don't get me wrong, there aren't any goodies and baddies. They are companies and their purpose is just to milk the maximum money they can. That's also why it's illegal for them to make agreements (spoiler alert: LOL)
One of the Nvidia G-Sync module's biggest downsides is the lack of support for HDMI 2.1 and DisplayPort 2.0+. Didn't there also used to be a heat issue requiring fans that generate noise in some cases? Recent implementations don't seem to have that problem. Also, the G-Sync Ultimate refresh rate range used to be wider, but that was likely part of a requirement that changed.
You may have mentioned this, but there is a TON of information so maybe I missed it. It's worth noting that if you have one of the first G-Sync monitors (the ones with the chip) and buy an AMD card, you will not be able to activate G-Sync... and your refresh rate could potentially be limited. Found out the hard way when I bought a 7900 XTX (first time I've ever bought an AMD card). My monitor (Asus PG278Q) would no longer go to 144Hz: couldn't use G-Sync, and max refresh was limited to 120Hz. Thanks for the high quality videos and I appreciate all the monitor videos. Really helped me find an upgrade from that Asus monitor.
It felt glossed over, but I can't overstate the benefits of G-Sync's adaptive overdrive, at least on IPS panels. I tried the switch from G to Free, and found myself immediately disappointed with how much overshoot I experienced in the lower refresh range. I ended up selling it and buying a G-Sync panel, and all my woes went away. I don't mean to say Freesync is bad per se; the price difference and better interface support are no small benefit and may tip the decision for the right people. However, as with most things AMD vs Nvidia in my experience, the AMD solution always feels like the "budget" version of Nvidia's implementation. Until monitors all have perfect response times down the entire refresh range, I can't bring myself to switch back to Freesync regardless of price; the experience simply isn't up to par.
I totally agree and went with gsync (hw) for the same reason, even before misbuying. But to clarify, nobody is stopping manufacturers from implementing variable overdrive in their scalers, like it is in hw gsync. There are even a few non-gsync monitors that exist with variable overdrive. Manufacturers just like to omit this feature, because it is added cost on the BOM/engineering.
G-Sync's adaptive overdrive sounds really neat but I feel like these licenses tend to have less and less meaning as time goes on :^( I have a display that switched from the G-Sync HW module (AW2721D) to a FreeSync Premium Pro license (AW2723DF) and under VRR there is no (significant,
That is totally on that particular model of monitor. As Tim said. Several times in fact. And he even spelled it out that a hardware G-Sync module isn't even a guarantee that the overdrive modes are set up properly, though it is less common. So basically you didn't listen to a word he said.
It's important to mention that adaptive sync is so prevalent right now because AMD decided to make their VRR implementation a part of the VESA standard for displays; that is why it's called FREEsync, as opposed to Nvidia's implementation.
" right now because AMD decided to make their VRR implementation a part of VESA standard for displays" No it is the other way around. Vesa VRR was based upon nvidias contributions and then AMD marketed their proprietary implementation as freesync.
@@ABaumstumpf No, you're the one who has that backwards. It was AMD that submitted Freesync to VESA, which later became what we now know as VESA VRR on TVs. It was even part of AMD's original presentation. Nvidia announced G-Sync first. But it required a compatible card, licensing and a dedicated expensive module in the monitor. "Freesync" was literally coined as a direct result of how expensive and proprietary G-Sync was. AMD then announced and demonstrated Freesync shortly after, and then specifically announced during that same presentation that they had submitted Freesync to VESA for inclusion as an open standard so everyone, including TVs, could use VRR. Look, I have an Nvidia card too, but don't be a fanboy. Nvidia doesn't "contribute".... ever. They develop and keep everything proprietary for as long as they can get away with it. The only things they contribute to are the things they don't really have much choice on. It's mostly AMD that contributes to the industry with open standards anyone can adopt. Tessellation, Mantle (now known as DX12 and Vulkan), Freesync/VRR, etc. I could make a huge list but I don't think I need to, because as soon as you think "what features did Nvidia contribute openly?", you basically stop. PhysX? Nope, locked, never opened and dead now. G-Sync? Nope, was dedicated hardware, proprietary and took years of losing to AMD's Freesync before they had no choice but to open it (even after they were already using Freesync on laptops but still calling it G-Sync, while continuing to block their desktop GPUs because "money"). You remember when AMD developed Tessellation and invited Nvidia to use it? Nvidia refused, basically saying it had little value, then Microsoft included it in DirectX 11 and it turned out Nvidia's hardware was better at Tessellation than AMD's hardware. You remember when AMD developed Mantle and literally invited Nvidia to join the endeavor? Now look at RT Cores and Tensor cores for ray tracing and DLSS. Nvidia does NOT "contribute"... Ever. They exploit. Unless they're forced to contribute. Nvidia does NOT play nice nor share... Ever. Nvidia's greed will never allow them to share nor contribute. And Nvidia's pride will never allow them to work with the competition, even when invited and even when said endeavor would directly benefit Nvidia with an advancement that would benefit the industry as a whole (i.e. Mantle). Nvidia does make good products, but don't ever delude yourself into thinking Nvidia is a good company.
@@valhallasashes4354 Nvidia is not a good company, it's a for-profit multi-billion corp. So is AMD. Let's not kid ourselves. AMD does what it does because they are not the market leader, because open source technologies are good for PR and have a bigger chance of getting adopted. You can't demand anything when you have less than 20% market share.
Totally incorrect, VESA Adaptive Sync predates GSync and was not developed by AMD at all. It was used in embedded displayport as a power saving feature, gsync was introduced as a way of getting VRR on external displays specifically for gaming. Until GSync was released, no-one considered using VRR for its gaming benefits, only for its power benefits. A year after the release of GSync, VESA Adaptive Sync was ported to the external displayport standard and was relabeled as Displayport Adaptive Sync. Literally straight out of the original press release: "Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync." Adaptive sync is not "AMD's standard", as much as they would like to claim it.
@@Jas7520 Pulled from AMD's own FAQ page about FreeSync: What is DisplayPort Adaptive-Sync and how does it differ from AMD FreeSync™ technology? DisplayPort Adaptive-Sync is a new addition to the DisplayPort 1.2a specification, ported from the embedded DisplayPort v1.0 specification through a proposal to the VESA group by AMD. DisplayPort Adaptive-Sync is an ingredient feature of a DisplayPort link and an industry standard that enables real-time adjustment of display refresh rates required by technologies like AMD FreeSync™ technology. AMD FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tear-free, low-latency gameplay and video. Bottom line. Everybody is claiming credit and I am not about to get into a he said, she said argument. And I sure as hell am not going to get into a debate about derivatives of technologies in their earliest "forms" regardless of what their capabilities may or may not have been because I have no way of knowing what the full scope of those capabilities were. And just because it "existed" in "some" form, doesn't mean it was "used" in any "meaningful" form. That's like a child screaming "I knew the answer the whole time" even though they never used that answer in the way that mattered to the actual event in question. All I can tell you is what I saw when I was there. The events I listed in my post are what I saw. Both as things were announced, when they were announced and how events developed and transpired over the years as they happened. It doesn't matter who made the propeller first. What matters is who took the propeller off the mobile and put it on the plane first.
Rule of thumb I go by: VA panels: try to aim for a gsync module monitor with dynamic overdrive that scales with refresh rate on the fly. IPS panels: matters less due to lower need for overdrive, but I prefer a gsync module. OLED: doesn't matter, go freesync. OLED has no use for overdrive.
I bought an Alienware AW3423DWF and with g-sync enabled it flickers in everything, but its gone if i disable g-sync. I decided to return the monitor due to the flickering.
Also, AMD Freesync became the VRR standard adopted by VESA; anything VRR, even without Freesync branding, falls under the VESA standard, which adopted AMD's approach of having the regular display scaler do the work instead of the proprietary nVidia module back in the day.
Why can't everything be made much simpler?
0) Limit FPS below the monitor's Hz for selected games.
1) The video card creates a frame.
2) The frame is split into packets and mailed to the monitor.
3) The monitor receives the packets and assembles the frame.
4) The frame is fully assembled, and
5) enough time has passed since the last frame for the monitor to show the new one:
interrupt(new_frame) { if (new_frame_time >= constant_frame_time) show_frame(); }
Thanks for this video. I'm in the process of looking for a wide-screen monitor to replace my current 5:4 ratio LG Flatron 19 inch monitor, as it doesn't perform well with 4K videos. I'm limited by space to 24 inch screens and have several in mind, but didn't understand the terminology of the types and whether they would be compatible with my AMD Ryzen 5 5600G based PC. I'm not a Gamer, so high frame-rate is not a problem to me, as long as it's better than 60 Hz. I'm looking at 75 Hz, or better, with 4K capability. Your video has now clarified the situation and I shall buy whichever monitor my local Computer Warehouse can supply at a reasonable price.
Fun Fact : I just recently bought an Asus FreeSync Monitor that was actually marked as "G-Sync Compatible", it even had a G-Sync Sticker on the front of it.... but it only had one HDMI port... no DP... i never felt so scammed in my life x)
@@Roarscia I swear my "gaming monitor" only has 1 HDMI port and that's it.. this is the first time I've seen a monitor with only 1 port. Even my second old TN office monitor has an HDMI AND a DVI port.. but not my main Asus one
Good stuff. However, I am running an AW34DWF at 165Hz with an RTX 4070, and when enabling VRR in the control panel, invariably (pun intended) flickering ensues. The games are running between 60 and 85 fps, so I wouldn't think it's an fps issue. So I'm a bit confused by the statement that this panel properly supports G-Sync. Should the fps be higher (165)? Closer to the refresh rate? Or should I decrease the max refresh rate (60, 100Hz)?
That's pretty weird. Maybe try a different cable? Try different settings in the monitor OSD? I don't have that monitor, but it doesn't sound like a frame rate issue. The whole point of G-Sync is that it shouldn't matter, doubly so since you are smack dab in the middle of the range.
@@rossmarbow-combatflightsim5922 I think you're right, because it changes from game to game. Some do run great. Others present flickering on specific instances like hints, or menu overlays, and almost all flicker during loading screens (but there I can understand that the fps plummet)
I have AMD Freesync Premium Pro? And it's awful. It's incredibly hard to get it to work at all, and when it does your monitor turns into a stroboscope during loading screens or whenever the fps drops below the supported 48fps. G-sync good, AMD bad. Such a poor experience compared to Apple.
Had these issues too, it's most likely because you use a VA or OLED panel, which can vary in brightness at different Hz (since FPS and Hz stay synchronized when using it). Noticed it a lot on my old Samsung VA, especially in badly optimized games as the FPS would drop quickly and randomly. After upgrading CPU and GPU it was not noticeable in any modern games, because the more stable framerate = stable Hz :) But I switched to an Odyssey G7 and haven't seen the issue anymore
What you're seeing is PWM flicker, which happens on EVERY non flicker free display. Look up reviews before you buy a display (RTINGS for example tests for PWM flicker).
@@Drubonk OLED indeed, why is that an OLED issue? Philips Evnia 34" ultrawide OLED. It's 175Hz. I play at a locked 120Hz as I prefer a stable frame rate, and at that rate it's fine; everything above 48 is fine. Which in games is fine, but menus or loading screens often drop below 48 for whatever reason.
@@MLWJ1993 No, I know what that is, I avoid screens with that due to migraines. I have an OLED monitor which does have the standard OLED flicker, and I can handle that. With Freesync on and below 48fps the monitor flickers very badly. Flashes, almost. Very unpleasant, and in badly developed games menus can be awful.
Very informative, thank you! Things have definitely changed since the last time I was in the market for a monitor and there's a lot of confusion about this online.
I have a question, do G-Sync or FreeSync add latency over no G-Sync or FreeSync? FPS gamers such as myself don't run either because we are running the game at the monitor's max refresh rate and there is no fluctuation, or minimal fluctuation, of frames.
honestly. I have a 4090 and I use gsync in every single kind of game + locking my fps, idk maybe I'm weird. but it also saves power and reduces heat for both components.
@@lawgamingtv9627 Exactly, I limit my 4070 as well. It's already really efficient but I can almost halve the power usage by limiting the fps to e.g. 90. For a lot of games I don't need to hit 120+ and locking it to 90 just makes it more consistent overall. Less power draw, less heat, less noise, barely less smoothness
@@olifantpeer4700 I do that with RTSS too. Cap the framerate to make the .1% and 1% lows a lot better and get a more stable frametime. I don't like fluctuation of frames unless it's only a bit and not 20+, so I still use VRR but capped around my average so it doesn't go up/down by tons of frames.
oh please lol, far from the most concise
@@pajo5014 If you meant to say monitor, yes it will work just fine provided that you enable it in the Nvidia control panel and are using DisplayPort.
@RAM_845 AMD can't exceed 600 nits brightness with freesync enabled on monitors that hit 1000 nits with gsync enabled on nvidia. It's been talked about on this channel. You have to disable freesync to exceed 600 nits.
@@MrDutch1e Do you have the name of the video that talks about that? I would like to learn more on that as I have an AMD GPU alongside FPP, but I'm planning to get a Ti Super early next year to go with my DWF
@@MrDutch1e can anyone help here .. I have an RX 7900 XTX and I play at 4K 120Hz .. almost buying an LG monitor with FreeSync Premium Pro .. shall I buy it or not? guys, some advice needed here
Thank you so much for this extensive, deep and yet simple explanation of the topic. If there are other resources like this online I have not stumbled upon them.
Great work!
Thanks so much for the clear explanation! Things have changed so much that I had myself in knots trying to understand the current differences in G-sync and freesync. Learned some new things too. Great vid!
Thank you for this, I wondered for ages wtf adaptive sync is compared to g-sync and freesync and I couldn't find a decent explanation anywhere. I thought adaptive sync was a frame sync tech for hdmi or something lol
Love how Nvidia removed the 1000 nit brightness requirement for "G-Sync Ultimate" qualifications lol.
Basically because most OLED is rated for 400 true black.
Yeah it's because oled can't make the rating, but are still a better hdr experience. They do put the module in 600 nits IPS displays too, though.
My QD OLED is rated for both (and tested by Tim on this channel); 1000 nits is too bright for my gaming environment though. HDR on anything but OLED doesn't really work well across the board, especially with fine bright details like a starry sky. Can't beat light emitting pixels.
@@oktc68 it's not "rated for both". It has a 1000 nits mode, but that's only in a 2% window. This is not enough for an actual VESA rating.
HDR is still a shit show overall. Needs a few more years in the oven. Said from experience.
Thank you! Adaptive sync is one of these things that got unnecessarily complicated thanks to marketing.
Here in Portugal, sometimes retailers don't even put basic specs in the spec sheet like Refresh Rate or Panel Type let alone Adaptive Sync or VRR. Fortunately we have your channel to save us from ignorance.
Those things are crucial to know about when buying a monitor
In my opinion, Adaptive Sync technology is the greatest addition to gaming in the last ~10 years.
Agreed brother
It really does work like magic. I can still comfortably play games at 60 FPS even after being so used to higher refresh rates because of adaptive sync. The image just looks smoother and the lack of input lag and tearing even at higher frames is so nice to have.
does anyone know why it causes black screen?
does it require display port? sorry for asking.
@@Sarcaustik depends on the resolution and monitor; some monitors can only use DisplayPort for high resolutions
I am lucky to have come across this video, because I was having a very hard time buying a monitor: I was afraid that I needed a G-sync monitor because I didn't want to lose out on a feature, but this video helped me realize that what I thought was wrong. In a good way, though, because now it's way easier for me to pick a monitor since I know I won't miss out on anything if I don't buy a monitor with G-sync.
Same. This should be everywhere
This video is misinformation. Displays with hardware G-Sync modules are objectively better. FreeSync doesn't handle odd refresh rates with as much stability as a display with a hardware G-Sync module.
I would like to buy a monitor with g-sync instead of freesync, but the ratio of g-sync to freesync monitors is like 1 to 80.
@@maxirunpl I got a 1080p 24in TUF gaming monitor that does have G-sync but I think what I regret is not getting a 2k monitor instead because 1080p is just ok. I'm not sure if it's because of the specific monitor I have or if it's because going from my laptop's 2k display to 1080p made me realize that downgrading the resolution was more of a compromise than I thought it would be.
@@boredomkid8271 1080p resolution is great on a 24" monitor, but yeah, once you've tasted 2k on a laptop screen, which is probably smaller than 24", you can't go back.
Besides Freesync and G-sync, there is another form of VRR called HDMI Forum VRR. It is part of the HDMI 2.1 standard, but it doesn't need HDMI 2.1 to work. There are plenty of HDMI 2.0 monitors that support it. HDMI Forum VRR requires at least an AMD 6000 card or an Nvidia RTX 20/GTX 16 series card. Regarding game consoles, the Xbox series s/x supports both Freesync and HDMI Forum VRR. The PS5 only supports HDMI Forum VRR. For those that have Intel graphics cards, they only support adaptive sync through DisplayPort.
Edit: at 9:52, Tim forgot to mention that you need a DisplayPort connection in order to use G-sync on Freesync monitors. But how come I'm using G-sync through HDMI? Your monitor supports HDMI Forum VRR and that's what's being used (just like Freesync, HDMI Forum VRR is also labeled as G-SYNC in the Nvidia control panel).
But it is HDMI - it would be best if that crap was completely abandoned. Their proprietary bullshit with all the requirements like having active chips in cables, needing specific hardware for every single part, disallowing other technologies being used at the same time, or their DRM that will make everything look like crap... nobody should support such scumbags.
The PS5 also does not feature LFC support, which is a major disappointment for me as my 4K 120Hz LG TV with HDMI 2.1 flickers in games that often go below the refresh window and back, like FF XVI. My G-Sync Compatible monitor with HDMI 2.1 does not though, which implies more thorough VRR testing for its certification.
Wait so if I use freesync with a monitor that uses hdmi forum vrr am I even experiencing vrr? Should I turn off freesync? I play on a ps5 with the Dell g2724d. This monitor uses hdmi 2.0 but it supports hdmi 2.1 features such as vrr.
@@cc-fz5ne Try it. maybe it works better, maybe worse. You never know until you try it. It really depends on how Sony, Dell and the HDMI cable maker implemented their parts (and yes - due to how stupidly anti-consumer HDMI is - cables can block such features).
@@cc-fz5ne The PS5 does not support freesync (with LFC) but supports adaptive sync through HDMI VRR, which means that if the fps gets lower than the refresh window of the monitor (Which would most often be 48hz-120hz) it will revert back to normal v-sync. This can result in stutter or flicker.
You don't need to turn off Freesync, but you still won't get its benefits on the PS5, just the regular HDMI adaptive sync without the software magic.
Nice & informative video Tim. As usual my well rounded PC knowledge tree has never spent much time in monitors. So this sort of content is a godsend!
It's not informative. Displays with hardware G-Sync modules are objectively better.
LFC - G-Sync simply had that, as it was part of Nvidia's module. So every G-Sync monitor at the very least supported 1-60Hz. FreeSync had no such requirements, and there were monitors with a VRR range as low as 48-60Hz (the absolute worst of the worst) - such extreme crap was rare, but LFC itself was also rarely a feature of early FreeSync monitors.
That's why you should use FreeSync Premium, as it requires LFC. If I understand correctly, G-Sync Compatible may also not support LFC.
Moreover, it is unclear whether TVs with HDMI VRR support do support LFC.
@@cube2fox”it is unclear whether TVs with HDMI VRR do support LFC” I have a LG C1 and can confirm LFC works all the way to 1hz.
@@EJM07 That doesn't say anything since it supports FreeSync Premium. I'm actually pretty sure HDMI VRR by itself doesn't require LFC.
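For anyone wondering what LFC actually does mechanically, here is a hedged sketch (illustrative numbers and function names, not real driver code): when the game's frame rate falls under the panel's minimum VRR rate, the driver repeats each frame enough times that the effective scan-out rate lands back inside the supported window.

```c
/* Illustrative sketch of Low Framerate Compensation; not a real driver API. */
#define VRR_MIN_HZ 48.0
#define VRR_MAX_HZ 120.0

/* Given how long the last frame took, return how many times to repeat it
   so the effective refresh rate stays inside the VRR window. */
int lfc_repeat_count(double frame_time_ms)
{
    double fps = 1000.0 / frame_time_ms;
    if (fps >= VRR_MIN_HZ)
        return 1;                        /* already inside the window: scan out once */

    int repeats = 2;
    while (fps * repeats < VRR_MIN_HZ)   /* e.g. 30 fps -> 2x = 60 Hz effective */
        repeats++;
    if (fps * repeats > VRR_MAX_HZ)      /* don't overshoot the top of the window */
        repeats--;
    return repeats;
}
```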
I wish there was more attention given to the variable pixel overdrive of G-sync modules, as that can make a massive difference in ghosting and overshoot reduction for LCD panels while also just making it not be a pain in the ass to change overdrive modes everytime you switch to something running at a significantly different framerate. It was a shame that as big of a difference as it made, G-sync modules rarely showed up in monitors and added way too much to the cost for most people to justify.
I'm honestly glad that we're seeing more OLED gaming monitors hit the market, as there is no need for overdrive settings as each pixel is lit individually.
G-sync / freesync implementation in OLED panels is a joke. You get strobing and flickering with minimal framerate variation and it can't be resolved due to OLED technology itself. That's why there are no OLED panels with a true G-Sync module, as it also wouldn't help.
@@d.ryan96 - That's hyperbole and you know it. If you're talking about VRR gamma changes during large framerate changes, that's not just an OLED issue.
@@40Sec Sorry pal, my old XB272BMIPRZX doesn't have these issues at all, tested it extensively with emulators and other means to push some non standard variable framerate variations. It doesn't strobe even on loading screens.
After some experience with an LG CX OLED TV, a Samsung C27RG54FQR and an MSI OPTIX G27C4 I can confidently say that VRR without a G-Sync module just doesn't live up to my expectations.
I could even argue that vrr is broken outside of TN panels due to ghosting and input lag of IPS panels when they stray too far away from maximum hz range.
I might add that I'm an early adopter of G-Sync and mostly follow the rule of capping the framerate slightly below the max Hz of the screen for consistent input lag and perceived motion.
In more demanding games I tend to cap the framerate to something that's achievable 99% of the time. That's something that totally breaks e.g. a 160Hz IPS panel, when you want to cap fps at something closer to 100 fps
@@d.ryan96 - Really not interested in anecdotes when speaking about inherent properties of technologies. My statement about VRR gamma shifts being an issue on both LCD and OLED wasn't an opinion, regardless of whether you've gotten lucky/unlucky with the panels you bought.
Thank you. Very good video. All my doubts are now cleared. I feel better. Cheers Australia ❤
To clear common misunderstandings of V-Sync:
* It's possible to play without input lag; it doesn't add any until the framerate exceeds the refresh rate or pre-rendered frames are not properly set. Input lag is highly influenced by the frame time, which means that whatever sync technology you're using, you'll be bottlenecked by your monitor's refresh rate if you want no tearing.
* Frames duplicated have no effect in the input, it's only a visual artifact which the higher the refresh, the less noticeable. This is completely alleviated with Free/G-Sync.
* V-Sync only works in fullscreen.
Finally, V-Sync actually can help adaptive sync technologies by providing full frames, only that Nvidia users should know that Low Latency Mode needs to be set to Ultra for that; that setting used to be called Max pre-rendered frames, and the default is 3, which is pretty bad. Setting it to Ultra makes it 1, which it should always be tbh.
Say for example you have 144Hz: capping with RTSS to 120 fps in fullscreen with v-sync using free/g-sync will yield extremely good visuals with no compromises, so long as your hardware can keep up.
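As a rough illustration of that kind of external cap (in the spirit of an RTSS-style limiter, with hypothetical now_ms(), sleep_until_ms() and render_and_present() helpers): pace every frame to a fixed interval a little below the monitor's refresh so adaptive sync always stays in control.

```c
/* Sketch of an external frame limiter; the helper functions are hypothetical. */
#define CAP_FPS   120.0                  /* comfortably below a 144 Hz refresh */
#define TARGET_MS (1000.0 / CAP_FPS)     /* ~8.33 ms per frame */

extern double now_ms(void);
extern void   sleep_until_ms(double t);
extern void   render_and_present(void);

void frame_loop(void)
{
    double next = now_ms();
    for (;;) {
        render_and_present();
        next += TARGET_MS;               /* fixed cadence gives even frame pacing */
        sleep_until_ms(next);            /* wait out the rest of the frame slot */
    }
}
```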
Thank you for this. I've been holding off on buying a new monitor for ages because I wasn't sure how any of this worked
Are you saying I have just spent $1,000 more on my monitor for nothing?
Going to have to edit the title in a dozen days!
This was great to hear as someone who has paid very little attention since getting my XB270HU, the first all-boxes-ticked 1440p display.
Wtf how is this video so complete?
Congrats guys! Insanely useful video 👏👏👏👏
Thanks so much for making this video and clarifying this for me, as I was up in the air about the Asus Rog strix 27 in LED panel that just recently came out that you reviewed in your top five, and it didn't say g-sync so I was worried it wasn't going to work well
The fact that these are essentially the same technology with different brand names on them is all I needed to know. I was between 2 of the exact same monitor wondering if I should get the one that’s “G-Sync compatible” since I have an NVIDIA card lol. Thank you sir.
One problem remains though. And this isn't a major problem, but it is a problem nonetheless for some of us. While AMD Radeon GPUs have Freesync functionality over an HDMI port and cable on some of the 60Hz, 75Hz and 100Hz HDMI-only Freesync monitors, on such monitors with an Nvidia card Freesync can't be enabled over HDMI.
Didn't he say in the video that gtx1660 supports freesync via hdmi?
@@Taha-o6s it doesn't though.
HDMI is lame, this tech needs to go away.
@@tmsphere ya, I should just throw away my hdmi cable and hdmi freesync monitor just cause some random mor0n on the internet thinks it should 'go' away".
kcuf off. Go troll somewhere else.
@@tmsphere Lame? wtf are you talking about. Do you play in 10k or something?
It's a good thing that things like this are finally getting better; a G-sync monitor was unaffordable at the time, all of NVIDIA's peripherals were ridiculously expensive. I didn't know that NVIDIA also offered G-sync technology to other brands; I did know this about FreeSync, but I didn't know that G-sync was also released to other brands
G-sync monitors are still unaffordable and 2x the price they should be.
I would appreciate a clear explanation on v-sync options "fast" and "adaptive" and what you should use. Also the effect of low latency mode.
(Nvidia control panel)
Nvidia Profile Inspector is better: force Resizable BAR, set V-Sync to Fast for less latency
Fast v-sync:
Also known as true triple buffering. It lets the GPU run the game without an FPS limit, exceeding the monitor's maximum refresh rate, to get less input lag without tearing. It does that by always showing the most recent frame rendered by the GPU, without any sequential order in its buffers (like normal triple buffering has), discarding the others in each refresh cycle of the monitor. This type of v-sync causes nonlinear motion, meaning that objects' motion speed appears to slightly "accelerate" or "decelerate" at different locations when moving from point A to B. There's a way to make it less distracting: always run the game with an FPS above the monitor's refresh rate that's divisible by it (for example, 288 fps on a 144Hz screen). It also works just like normal triple-buffered v-sync when running the game below the maximum refresh rate of the monitor.
Adaptive v-sync:
Works just like normal v-sync (double buffer) at the monitor's native refresh rate, but the difference is that it will auto disable and allow the latest frame available to show if the GPU's FPS drops from the target frame rate of the monitor. This causes a tear, but makes it have less input lag than normal v-sync or regular triple buffer v-sync. It does that in order to get more responsiveness and avoid normal v-sync stutter when dropping FPS, but will lock again after performance increases making you have double buffer again. Unlike fast-sync it has a consistent motion and it's better to use when your GPU can provide a stable locked frame rate.
Personally, I would use adaptive v-sync (from Nvidia Inspector) with a frame limiter locked around 0.01~0.05 fps below the target refresh, to remove the normal sync buffering at the target refresh and get better latency without any screen tearing.
In regards to Low Latency Mode, it tries to reduce the number of frames the CPU can pre-render before the GPU takes over. It may cause stutter in CPU-bound games, but may provide better latency in GPU-bound ones. Turning it On means it only lets the CPU pre-render 1 frame, and Ultra eliminates all frames the CPU can pre-render. Try each to see if you get better results.
TLDR; If you don't have a VRR screen but have a GPU that vomits FPS, use fast. If you want consistent locked frame pacing, but wants a little less input lag in performance intensive moments, use adaptive. LLM may cause stutter depending on the type of game.
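A loose sketch of the "fast sync" behaviour described above (simplified, hypothetical buffer handling, not a real API): the GPU renders uncapped into spare back buffers, and at each monitor refresh only the newest completed frame is scanned out; anything older is simply dropped.

```c
/* Simplified illustration of fast-sync-style presentation; not a real API. */
#include <stdatomic.h>

extern void scan_out(int buffer_index);   /* hypothetical "show this buffer" call */

static atomic_int latest_ready = -1;       /* index of the newest completed back buffer */

/* Render side: runs uncapped, cycling through spare back buffers. */
void on_render_complete(int buffer_index)
{
    atomic_store(&latest_ready, buffer_index);   /* newer frame replaces the older one */
}

/* Display side: called once per monitor refresh (e.g. every 1/144 s). */
void on_vblank(void)
{
    int idx = atomic_load(&latest_ready);
    if (idx >= 0)
        scan_out(idx);   /* present only the most recent frame; older ones are skipped */
}
```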
So there are no disadvantages when I use my gaming PC with an Nvidia GPU instead of an AMD one to play on a new TCL TV? The TV says "AMD FreeSync Premium Pro" and "HDMI 2.1 VRR" are supported, but G-Sync is not. But if I got everything right, that should not be a problem, right?
Thank you very much :D
So, THX to your recommendation I got myself 34" aw3423dw. OMG - I am like 2 weeks in this game and even looking at my desktop makes me smile. Free Sync all the way (6900XT) no tearing, no input lag, no ghosting, colors from another world, perfect black and jawdropping HDR content. (driver brightness -15, saturation 135)
❤ Thank you ❤
P.S. The only drawback is the pixel refresh every 4h (a 6-8 minute break), which is supposed to keep the panel healthy.
You can delay the pixel refresh until standby mode is engaged (at least you can on the DWF model) 3 year warranty, unless you're spending an awful lot of time with static white images i wouldn't worry.
@@oktc68 no warranty for me - bought myself a cheap returned piece. I am at risk so I'd rather not gamble with endurance just yet ;)
Once you get used to "recommended" breaks it ain't that bad. I actually turned auto prompt off and just occasionally check if panel health went 🟡 - then I move to side screen for a bit and turn refresh on.
You know there's shortcut for switching between SDR and HDR on windows? Windows Key + Alt + B, I don't think you need to change any saturation and brightness settings to see HDR.
@@pxwxontop it's not about HDR, more like personal pref. for oversaturated picture :)
+ AW released a new firmware and color profile for this screen as they were both somewhat flawed. much better but still running mine bit oversaturated by choice ^^
Thank you for making this very educational video.
Excellent video. Thank you very much. It was very clear regarding the various standards etc.
I wish I could like this video twice. The first 3 minutes especially were so concise and helpful.
Thanks for the easy clarification.
You are one of the few tech YouTubers that don't make misleading videos and talk about subjects they don't know.
Thanks for that.
Thanks for making this video because having all this info into one well made video is hard to find. I have the gsync ultimate oled Alienware, mainly got it because of a great deal at the time and was concerned if it would work on my AMD gpu's and works just fine. You just need to enable Adaptive Sync in the AMD Software and you're good to go.
Nice. I have it the other way around. I will buy a monitor with freesync and have a nvidia gpu. Because g-sync is too expensive for me and 99%(not really, but a lot) of the monitors use freesync, so I couldn't find any that would fit me.
I work in film and video post-production and often need to mix different formats, so I'm aware of these issues (plus interlacing, don't get me started on interlacing). I often _avoid_ watching videos on this subject, because even when YouTubers actually have a clue about the technical details (and let's be honest, a lot of them don't), they tend to be sloppy with language in a way that misleads viewers that don't _already_ understand these issues, and my forehead gets sore from all the facepalming.
Anyway, long story short: congratulations (and thank you) for wording everything absolutely perfectly. 🤓👍
I would have added a little graphic showing a strip of "frames being rendered" and "frames being shown", with arrows pointing from one to the other (to illustrate the discontinuity caused by mismatched frame and refresh rates), but I think the description and slow motion were clear enough to get the point across.
was loving g-sync until i upgraded to a 45" LG oled. now it's just constant brightness flicker until i turn adaptive sync off. Gonna have to make do with fast sync and low latency mode.
Look out for firmware updates for that monitor which may fix it. Sometimes it's either that, a less than premium cable, using HDMI instead of DP or turning the feature on then fully restarting the monitor. Because it shouldn't do it.
@@j3m638 No, all OLEDs do this with VRR, as the gamma shifts across different refresh rates. I got it on all my LG C-series and the Alienware QD-OLED. Nothing you can fix other than turning off VRR/G-Sync.
@@j3m638 latest 45gr95QE software 3.09 installed, using the DP1.4 cable it came with. I even tried the one my previous 34GN850 came with. I could restart, turn the monitor off, do this every day and it won't change.
Love the monitor, but while everyone is touting OLED as 'the way forward', n o b o d y is mentioning that having VRR (and especially a costly, power-hungry g-sync module) in an OLED panel is completely pointless, as the second you launch any game, half of the frames are, entirely at random, at a different gamma.
same here, might drop vrr as well.
@@DJHEADPHONENINJA it's not just an OLED issue, VA and IPS have the same issues, it's just more obvious with OLEDs
God bless you! Very comprehensive. These companies are shooting themselves in the foot by making this so confusing!
Freesync is for Nvidia and AMD. No need for extra gsync monitors!
Good to know, thanks. I am changing my setup and may be purchasing a monitor in the near future. Also, I finally understand Vsync and Adaptive sync a bit better now.
In my experience, even with a new FreeSync-labeled monitor you can have problems with G-Sync enabled. One example here is the Dell G3223Q (real garbage TBH). I'd recommend not ignoring the G-Sync Compatible label.
So what is stated in the video (that all current FreeSync monitors are compatible with NVIDIA G-SYNC) isn't 100% true.
Good to know!
Good explanation. But I think you guys should also mention needing to use a DP cable instead of HDMI when using an Nvidia GPU with a Freesync monitor.
I have yet to see a G-Sync monitor with sharpness settings, which is why I go with Freesync instead, as its nice to have in those games where FXAA or such is forced, or when used with a console where textures might often be blurry because of forced FXAA or other blurry settings.
Dude, you are better off using your GPU's sharpening features. Much better sharpening algorithms than what is built into most monitors and TVs.
@@MaaZeus Sure, and that's a semi-new feature, but it won't work for consoles on the same monitor. I'm not going to buy 2-3 different screens for the same gaming area/space.
@Zyanide09 I'm using one right now. I have an HP Omen 27qt with 8 levels of sharpness control.
@@windowsseven8377 Cannot find any info on a 27qt model, just the 27q and it says its a freesync monitor, so no g-sync module.
lg gn800
Some very solid info here on the differences between G-SYNC , FreeSync and Adaptive Sync.. I am in the market for a 4K monitor in the New Year so watching this video was very helpful.
same here, did any catch your eye? I'm barely getting into PC gaming and builds. I'm trying to invest in one that can handle all the top-end and newest games
@@BillaBong1 Yes, one did. Unless I see something online that changes my mind in the next few weeks, my money is on the Gigabyte M32U. This is a 32 inch 4K 144Hz IPS monitor. It looks to be a good all-rounder which should provide a significant upgrade over my present setup. I will also be buying one of the new Nvidia 4070 Ti Super graphics cards when they come out to power this new monitor. I am eagerly looking forward to some buttery smooth gaming on this new rig.
@@HDEFMAN1 Heck yeah bro!👍you hit the right check boxs. im also looking for a 32inch 4k screen to fit in a spot im plan on turning into my lil gaming area. i got the rtx 4070 but gonna save for the supers when they come out. got the i9 14900k paired with a msi z790 mother board still debating on 48gb 2x24gb or 64gb 2x 32 domintator titaniums, or the gskill trident z5s. what u think i should go with? any other recomendations?
@@BillaBong1 If you plan on doing any video editing or content creation or any task that requires some heavy lifting then I would say that you could find a use for 48GB or even 64GB, if not then 32 GB may well suffice. As to memory brands I imagine that any well known brand will work as long as it is compatible with your motherboard. One thing that often gets overlooked when putting together a new build is sourcing a good psu. I went with Seasonic for my last build. You don't want to spend all that money on expensive components and then use a cheap psu to power everything. Sounds like you are going to have a sweet system when you put it all together.
this is a fantastic video -- thank you.
i am looking for PC parts for my next build & i was thinking of leaving the monitors last just because i was worried whether i was going with AMD or NVidia...
The worst part about monitors:
"Hey, the LG 27GP850 seems nice - lets look if i can get it nearby.... LG 27GP850P-B .... 2 other letters - oh, very different specifications".
That number/letter salad and dozens of variants just make it a chore to find a good deal.
I would have liked it if you showed judder issues with camera-pan demos; adaptive sync is a very important feature, way more so than the more modern, currently marketed ones.
The problem with VRR is OLED flicker. Would love to see you make a video about this. It straight up ruined my Alan Wake experience on my LG C1.
My va lg monitor with freesync flickers as well, so not only oleds)
Sad that the problem is very rarely discussed and you have no clue which panels are affected. (In my case it happened when fps dropped below the VRR range; maybe the "low frame rate compensation" feature would fix it?)
In some cases, flickering can be caused by any type of performance overlay (like rivatuner overlay), so you should try disabling it completely and retry. Might do nothing, but it's worth a shot.
@@TheMaximusPride - You're right. I have flicker with GSync on my TCL in some games (but not others) with my Nvidia GPU, while the Freesync Ultimate in my Samsung panel is literally perfect in almost every game with no flicker at all on my AMD GPU. Both of these displays are VA. At least with my two particular panels, Freesync works considerably better than GSync, but it really seems to be on a panel-by-panel basis how well VRR works for a given person. In some displays it's flawless (my Samsung), in some it has varied results (my TCL) and in some it's just plain awful and doesn't work right.
I wish the industry would get a better handle on VRR because it seems like right now there's too wide of a variance of VRR performance depending on the panel type, manufacturer and processing in the display itself, and when you do get flickering, it ruins the game.
I don't get flicker on my C2 in Alan Wake with my rx 6800 but I do get flicker sometimes on my Dell S2721DGF in select games at certain refresh rates.
So glad you're here to answer my questions without me even asking!! 👍
This is a really excellent explanation. Clear and concise. Thanks!
Boy are you behind the news with respect to FreeSync vs G-Sync. When Nvidia developed G-Sync it was a proprietary technology. The monitor manufacturers had to buy the special G-Sync hardware from Nvidia, making these monitors more expensive. About a year later AMD developed FreeSync, which was an open-source and open-hardware technology (meaning there were no royalties); when this came out, for every true G-Sync monitor sold, a lot more FreeSync monitors were sold, and all TVs were FreeSync. All the Nvidia GPU owners complained to Nvidia, so Nvidia quietly adopted FreeSync, made their GPUs compatible with it through software, and included this in their drivers. The other thing Nvidia did was to bribe most TV manufacturers to market their TVs as supporting G-Sync on their packaging. However, when reading the fine print you discover that these TVs also support AMD GPUs. This means the actual technology is FreeSync. I doubt that any true G-Sync monitors are made today, and all TVs are FreeSync.
My monitor is from the early days of G-Sync Ultimate back in 2018, before they quietly lowered the standards for the qualification in 2021. As such, it has an older G-Sync module in it, and I've tested my monitor and found it doesn't work well with AMD cards. I bought it for the incredible HDR performance and color accuracy, and I honestly don't regret buying it, even if it only hits those values at or below 98Hz. I usually frame cap all my games to 90Hz, anyways.
Which monitor is that?
@@johnboylan3832 The ASUS ROG Swift PG27UQ.
Yay, nothing is more fun than when tech companies undermine their own standards >.
@@Quast LOL Right? According to Nvidia, the qualification would have been impossible for OLED monitors to meet since it required 1000 nits peak brightness as a minimum, so they changed it to "lifelike HDR". I think they could have just added a separate qualifier for OLEDs.
@MasterZoentrobe or just stated that for oleds it's only 1000nits @ max 2% or something, and let's say 300nits @ 100% window size.
In my experience, an oled with max 300 nits but peaks of 900 at up to 5 or 10% window size, far beats any other panel type.
The ability to have 900nits right next to a 100% black pixel is just something else.
This was extraordinarily informative and succinct. Thanks!
I do think that VRR struggles when the frame to frame variance becomes too high though. Might be worth looking into.
Additionally, you still need to perform some tricks (a RivaTuner frame limit, Radeon Chill, or Nvidia Reflex) to keep the framerate below the top of the VRR range in high fps scenarios.
Playing below the VRR range makes no sense (the floor is probably 48fps most often), and you won't generate more frames out of nowhere - practically every game now has a frame limiter in its settings...
flickering is such a big issue and no one talks about it enough
@@rossmarbow-combatflightsim5922 Yeah I know, having a VA panel myself its why I use Radeon Chill, because it gives the best frame pacing.
I have oled so I just turn VRR off
only thing I turn on is in-game frame cap
VRR relies on games having level performance which most don't
And if the game has level performance then its usually good performance @@The_Noticer.
This video is so important, ty for making it
g-sync? freesync? adaptative sync?
none of the above works for me since 2018 on multiple hardware
hello vsync my old friend!
This channel was great. I have been following it for the past couple of months as a guide for my monitor buying decision. Along the way I learned a lot of stuff: Gsync vs FreeSync, HDR10 vs HDR400, 10-bit vs 8-bit, HDMI 2.1 vs HDMI 2.0, 165Hz refresh rates, ghosting vs overshooting, sRGB, DCI-P3, response times, 4K vs 2K vs FHD displays (heck that was hard), IPS vs VA vs TN vs OLED vs nano IPS. Yesterday I bought a monitor and now that journey is coming to an end. I feel sad. I will miss these folks and wish them the very best. Go break a leg guys!
i went through THREE different 7900XTX, at least 7 windows re-installs, 4 freesync premium monitors (including a $1000 alienware monitor) and i couldn't get rid of the issue with my monitor randomly flashing a black screen. no matter what i tried, it kept happening. in the end i bought a msi 4090 suprim and it solved everything right away. i didn't even have to do a clean windows install or remove the AMD drivers from my previous GPU. i'm currently using the same alienware QD-OLED freesync monitor that i was having black screen flickering on with 2 different AMD cards, but i'm not having the issue with my nvidia card.
the amount of time and money i wasted troubleshooting, packing/shipping/returning/driving/RMAing to get rid of this issue was not worth the "savings" of buying an AMD card over nvidia. i don't really care what the prices are, next time i upgrade i'm buying only nvidia. the time and stress i won't have to put in is worth the extra $600 that the RTX 4090 cost me
Bullshit.
That's crazy. Never had a problem on my m32q on a 3080 or my current 7900xtx.
I just bought a WOLED ASUS monitor and had the random black screens as well. I didn't know it was FreeSync. I have a 6900 XT
Actually a phenomenal video. Thank you so much for putting this all together!
It's not phenomenal. It's misinformation. Displays with hardware G-Sync modules are objectively better.
I've just recently built a new gaming PC, ended up going full Team Red with AMD for both CPU (R5 7600X) and GPU (RX 7800 XT).. my monitor choice was based more on price than it was on features, and I ended up with the LG 34GP950G, which is branded with "G-Sync Ultimate"...
here's the clincher: I have serious stuttering issues with virtually all full-screen videos when I enable "Adaptive Sync" in my AMD driver software... I can't say I've noticed issues with local media files (not many to test with), but on YouTube, Twitch, and Netflix, severe stuttering starts up mere seconds after entering full screen, sometimes getting noticeably worse in stages every few seconds thereafter. This stutter goes away instantly when I leave full screen. I must point out that this issue simply does not happen when I have this setting turned off.
furthermore, enabling the "refresh rate overclock" on my monitor (allowing up to 180Hz) or leaving that off ("only" up to 144Hz) does not seem to make any difference, it still happens with Adaptive Sync enabled. FWIW, I have heard the current beta drivers (23.30.something..? I think?) do apparently fix this, though I haven't tried them myself as yet.. I'm currently still on 23.20.01.10.
so with regards to the theory behind the implementation of VRR technology, this was indeed a very interesting and informative video, but it definitely doesn't paint the full picture.... at least not smoothly (sorry, couldn't resist that one).
regardless, I just jumped from FHD at 60Hz to UWQHD at 180Hz, and it's blowing my mind... I don't think I could ever go back after this. (:
Just a theory, but in my case I have an LG CX OLED, and one of the first things I noticed when watching content via the built-in apps for Netflix, YouTube etc. is that I have terrible stutter on panning shots. After going down the rabbit hole, I found out it's because the response time is so instantaneous on OLED panels that anything that doesn't reach the native 120Hz has a really noticeable slideshow effect, since 24Hz, 30Hz & 60Hz content is shown as a direct presentation of its actual smoothness. My only saving grace is the on-board motion interpolation, but the artifacts can be pretty bad at times, so it has become a pet peeve for me.
When reading your comment I just had the idea that when you used adaptive sync, it must have lowered the monitor's refresh rate to match the fps of the video, which in turn produced the stuttering, because your panel's instant response time doesn't mask those frame changes like a traditional LCD does. In my research I learned that because an LCD's pixel response time (turning pixels on or off) is quite a bit higher, it in turn makes any motion or panning a lot smoother, as low refresh rates are basically less perceivable than when the response is instantaneous like it is on OLED; and possibly, because your monitor displays a fixed refresh rate when adaptive sync is turned off, that helps it display more smoothly?
I may be wrong though, as I don't use dedicated PC monitors. G-Sync works with the LG CX & 4090 and I have tested it with it on & off in content, but it still seems pretty unsmooth to me in both scenarios. Joys of OLED!
Turn off hardware acceleration. I know it's stupid but it works.
@@09RHYS I've never used any super high refresh rate OLEDs before, so that is very interesting indeed.. the weird thing is that I would expect my monitor refresh rate to drop to either 60Hz or 30Hz in the case of RUclips vids in full screen, but not only does it feel slower than that, it also feels inconsistent, like it's constantly "hunting" for the right refresh rate to use. really annoying..
@@The_Noticer. I'm familiar with that issue, but unfortunately that is very different. I've tried that, and tested with numerous browsers (Chrome, Edge, Brave, and Firefox). this is specifically VRR causing havoc.
@@AndreGreeff If I'm not mistaken, I believe non-1080p60 streams from YT are at 24Hz? Maybe with adaptive sync turned on, it could be having a weird mismatch between the G-Sync Ultimate module & your AMD card, and it's possibly overcompensating the sync and not enabling the 1-180Hz VRR range properly?
It does sound pretty annoying after paying a good chunk of money for it all and then having to manually switch when you game or play media.
It could also well be the browser or heck even the stream causing it to happen where maybe it isn't a constant fps but instead a variable thus making the stream feel unsmooth when adaptive sync is turned on, making it seem faster or slower than it is meant to be, I know I have issues with any variable fps media I play using my built-in media player on the CX via DLNA casting from my phone, where because it isn't a fixed fps, I get really bad motion handling with the aforementioned interpolation as well as with any 29hz/25hz content, which presents as a tiny stutter every 5secs and is jarring when things get very fast in motion or panning shots.
Honestly monitors and TV's will never cease to baffle me when it comes to the source material's fps rates and it's handling of it! haha
Also, for some reason I read your first paragraph thinking you were referring to owning the new alienware oled! Sorry it was a little late and I think the person above your comment at the time said they had one I believe 😅 But I guess it still is relevant to the previous theory due to your panel still having super low response times with a Nano IPS. Also, one last thing is have you tried any kind of disabling of the monitor features itself when using Adaptive Sync? That maybe worth a try to see if any of the features are what's causing the issue.
Anyways, I hope you have some luck figuring it out; maybe the beta drivers will be the best solution until an official driver comes along, if all else fails! 👍🏻
Would you say its always worth it to turn adaptive sync on? There are no downsides to it?
@15:44 lol, basically the differences are irrelevant i've learned now
That was great Tim, thanks a lot!
I'm still a bit scared that the issues with my HP Omen 25 will repeat - when switching it to FreeSync (using a 3060Ti) it absolutely messes up the colour saturation, especially greens, and lowers the brightness to 90.
When manually upping it again, it sometimes switches back to a non VRR 'custom' mode - so I usually don't use that feature.
I mainly just noticed that some applications cause it to fizzle out, with the screen going black, which is super annoying, and I'm not sure if that's just cause they're technically 'incompatible'.
That's what the nVidia Control Panel says, but still lets me activate G-Sync, and the Pendulum Demo looks fine, too.
(why does that peak to 100% GPU usage tho, revving up the fans?)
you should have rma'd that monitor.
@@nonyabusiness-f9e I just assumed it was standard behavior (definitely heard of random flickering/black screens in some apps before - disabling HW acceleration fixes them - and the colour issue is just dumb implementation by HP), and I got it on a good discount.
It wasn't really new back when I got it, and couldn't check VRR back then cause I was still on my good old 960Ti for some time afterwards.
It's on 11000+ hours SOT now, and I'm still quite okay with it until an upgrade that truly feels worthy and worth it, and I actually don't really use/miss VRR on a day to day basis.
Hey there, is there a way to make a guide on how to properly setup freesync/gsync in games? Because there's a lot of conflicting information online, like for example, which kind of vsync you need to use (double, triple buffered or none at all), do i need to keep the game in fullscreen mode or windowed maximized is enough, do i need to limit my frames below a certain treshold via RTSS. Thanks for your work!
This.
Premise: playing without vsync, freesync, and gsync, is the gold standard, because it guarantees the best latency in competitive games, and this should be the ideal configuration, unless you have screen tearing that you can notice.
If you want to play with gsync active, you have to make sure that the game’s frame rate never exceeds the monitor’s refresh rate! So if you have a particularly heavy game (or a screen with a lot of Hz), and you want to play with gsync active, there is no problem in setting a limit to the maximum fps.
If instead, with gsync active, you expect to exceed the maximum fps, this could definitely introduce a lot of input lag, so it is better to limit the maximum fps. There are different ways to limit the maximum fps. The best one is from the game engine, in the game settings, as it adds the least latency of the options. If this is not possible, the second-best method is from the Nvidia panel (or from the panel of the GPU brand you are using); from it you can limit the fps, and it will add more lag than the in-game limiter, but it is better than not using anything. The last alternative is to limit from RTSS; RTSS can be advantageous in some cases because it allows very precise limits on frames per second.
Also, if with gsync you want the maximum reduction of tearing and the maximum consistency of frames, you should enable vsync, and consequently also triple buffering. Obviously this scenario adds latency considerably.
The only advice I can give you, that nvidia, nor amd will ever give you, is: do not use technologies like gsync or freesync, if you want to play with low latency or competitive games, and above all, use them only if screen tearing is well present, if you do not have screen tearing, it does not make much sense to enable these settings. Remember that you are playing games, not watching games, so a better latency is preferable in almost all cases.
I just want to know when will the VRR flicker be fixed. My Samsung Q80T has intense flickering below 80fps and after spending countless hours online searching for a fix the conclusion is this is still unsolved. It's what is keeping me from switching to OLED, I read the issue is more present there due to deeper blacks and higher contrast.
afaik flickering is only a problem for VA panels. OLEDs and IPS exhibit no flickering, correct me if I'm wrong though.
Of the two Adaptive Sync monitors I've owned, neither has flickering. The AOC 24G2 is a Freesync Premium product, no flickering. The MSI G2774QPX I bought recently is G-Sync Compatible, AKA generic adaptive sync. No issues there.
Maybe it's more of an issue with VRR TVs?
@@charno_zhyem Given most of them are VA (my Q70R is) I would say yes, although mine does not flicker at all so who knows.
never its just mostly useless tech
Please add a series of videos where you make the same content, how to set up a gaming monitor, for different brands. You could make one such video even every 3 or 6 months. Once an Asus monitor, next AOC, LG, Gigabyte, Acer and so on. This helps a lot and has other benefits as well if you think about it. Thanks
Thanks for the GSync Compatibility tips. I had no idea I needed to go into control panel to enable the settings for my new monitor 😅
Great video Tim. Very easy to understand.
Is VRR bad for competitive games? I mean it is better to see a half frame as soon as possible to see an enemy. Yes, u have this split between frames, but it is the fastest method.
subjective. try both and see what you prefer.
No, better to have smooth and consistent experience
One note: I see people saying to lock a 144Hz monitor at 144fps, and this is WRONG. At the max refresh rate, Adaptive Sync stops and V-Sync (if you have it on) takes over. Always lock your FPS below the max refresh rate of your monitor, and tbh you should lock at whatever FPS you can HOLD, not your peak. This constant lock will help with VRR flickering, especially on OLED panels.
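A tiny sketch of that capping rule (the exact margin here is a common rule of thumb, not an official figure): keep the limiter a few fps under the panel's maximum so VRR never hands off to V-Sync.

```c
/* Rule-of-thumb cap for a VRR display; the 3% / 3 fps margin is only a convention. */
double vrr_fps_cap(double max_refresh_hz)
{
    double margin = max_refresh_hz * 0.03;   /* ~3% headroom */
    if (margin < 3.0)
        margin = 3.0;
    return max_refresh_hz - margin;          /* e.g. 144 Hz -> roughly 139 fps */
}
```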
I just assumed my freesync TV wouldn't work with gsync so I didn't even try turning it on. I've been gaming on my new PC since February but only started using gsync a couple of days ago... what a difference! I kept turning graphics settings down before to be able to hit 120fps consistently. Now I can just max everything out and even if it's only 90-100fps it still feels smoother than before and looks WAY better.
Are there actually monitors that still use G-Sync? Not Freesync with G-Sync compatibility, but actual native G-Sync. Back then it was a premium feature for expensive monitors. Nowadays even 1000€ monitors just go for Freesync haha
Yes, there are still Native G-Sync monitors, Free-Sync is just the standard and more affordable option for everyone now.
Have had the same 144hz 1080p monitor from ASUS for like 6 years now and never knew FreeSync would work with my then 10 series, and now 20 series GPU, so I just never enabled it/looked into it at all. Super curious to see how it works, and really surprised it's just a pretty simple process.
Really glad I found this video.
No mention that AMD introduced Freesync as an open standard and forced Nvidia to abandon their strategy? Ah yeah, good old Tim not telling the whole history and framing this as marketing.
Thats why competition is good
It's not really relevant
And monopoly is dangerous. Don't get me wrong, there aren't any goodies and baddies. They are companies and their purpose is just to milk the maximum money they can. That's also why it's illegal for them to make agreements (spoiler alert: LOL)
Vsync, Freesync, Freesync Premium Pro, G-sync, G-spots.. this is one big jungle, man. Thank you for providing clarity!
One of the Nvidia G-Sync module's biggest downsides is the lack of support for HDMI 2.1 and DisplayPort 2.0+
Didn't they also use to have a heat issue and so require fans that generate noise in some cases? Recent implementations don't seem to have that issue
Also, the G-Sync Ultimate refresh rate range used to be wider, but that was likely part of a requirement that changed
Wait, is this Asus monitor(VG28UQL1A) a lie ? I remember seeing it has HDMI 2.1 and both gsync and freesync.
@@SOG989 it'll have gsync compatibility, not regular or ultimate, as it won't have a gsync module
You may have mentioned this, but there is a TON of information so maybe I missed it. It's worth noting that if you have one of the first GSYNC monitors (that have the chip) and buy an AMD card, you will not be able to activate gsync.. and your refresh rate could potentially be limited. Found out the hard way when I bought a 7900 XTX (first time I've ever bought an AMD card). My monitor (Asus PG278Q) would no longer go to 144Hz, couldn't use gsync, and max refresh was limited to 120Hz.
Thanks for the high quality videos and I appreciate all the monitor videos. Really helped me find an upgrade from that Asus monitor.
It felt glossed over, but I can't overstate the benefits of G-Sync's adaptive overdrive, at least on IPS panels. I tried the switch from G to Free, and found myself immediately disappointed with how much overshoot I experienced in the lower refresh range. I ended up selling it and buying a gsync panel, and all my woes went away.
I don't mean to say Freesync is bad per se, the price difference and better interface support are no small benefit and may tip the decision for the right people. However as with most things AMD vs Nvidia in my experience, the AMD solution always feels like the "budget" solution to Nvidias implementation. Until monitors all have perfect response times down the entire refresh range, I can't bring myself to switch back to Freesync regardless of price, the experience simply isn't up to par.
i totally agree and went with gsync (hw) for the same reason, even before misbuying.
but to clarify, nobody is stopping manufacturers from implementing variable overdrive in their scalers, like it is in hw gsync. there are even a few non-gsync monitors that exist with variable overdrive. manufacturers just like to omit this feature because it's added cost on the BOM/engineering.
G-Sync's adaptive overdrive sounds really neat but I feel like these licenses tend to have less and less meaning as time goes on :^(
I have a display that switched from the G-Sync HW module (AW2721D) to a FreeSync Premium Pro license (AW2723DF) and under VRR there is no (significant,
That is totally on that particular model of monitor. As Tim said. Several times in fact. And he even spelled it out that a hardware G-Sync module isn't even a guarantee that the overdrive modes are set up properly, though it is less common.
So basically you didn't listen to a word he said.
@@andersjjensen I don't want or need to listen to facts when I'm justifying my purchasing decisions, buddy.
@@andersjjensentypical Nvidia shill lmao 😂
So clear! Extensively deep and simple to understand. Awesome work!
It's important to mention that adaptive sync is so prevalent right now because AMD decided to make their VRR implementation a part of the VESA standard for displays; that is why it's called FREEsync, as opposed to Nvidia's implementation.
" right now because AMD decided to make their VRR implementation a part of VESA standard for displays"
No, it is the other way around. VESA VRR was based upon Nvidia's contributions, and then AMD marketed their proprietary implementation as FreeSync.
@@ABaumstumpf No, you're the one who has that backwards. It was AMD that submitted Freesync to VESA, which later became what we now know as VESA VRR on TVs. It was even part of AMD's original presentation. Nvidia announced G-Sync first. But it required a compatible card, licensing and a dedicated expensive module in the monitor. "Freesync" was literally coined as a direct result of how expensive and proprietary G-Sync was. AMD then announced and demonstrated Freesync shortly after and then specifically announced during that same presentation that they had submitted Freesync to VESA for inclusion as an open standard so everyone, including TVs could use VRR.
Look, I have an Nvidia card too, but don't be a fanboy. Nvidia doesn't "contribute".... ever. They develop and lock down everything as proprietary for as long as they can get away with it. The only things they contribute to are the things they don't really have much choice on. It's mostly AMD that contributes to the industry with open standards anyone can adopt. Tessellation, Mantle (now known as DX12 and Vulkan), Freesync/VRR, etc. I could make a huge list but I don't think I need to, because as soon as you think "what features did Nvidia contribute 'openly'?", you basically stop. PhysX? Nope, locked, never opened and dead now. G-Sync? Nope, it was dedicated hardware, proprietary, and it took years of losing to AMD's Freesync before they had no choice but to open it (even after they were already using Freesync on laptops but still calling it G-Sync, while continuing to block their desktop GPUs because "money"). You remember when AMD developed tessellation and invited Nvidia to use it? Nvidia refused, basically saying it had little value; then Microsoft included it in DirectX 11 and it turned out Nvidia's hardware was better at tessellation than AMD's. You remember when AMD developed Mantle and literally invited Nvidia to join the endeavor? Now look at RT cores and Tensor cores for ray tracing and DLSS.
Nvidia does NOT "contribute"... Ever. They exploit. Unless they're forced to contribute. Nvidia does NOT play nice nor share... Ever. Nvidia's greed will never allow them to share nor contribute. And Nvidia's pride will never allow them to work with the competition even when invited and even when said endeavor would directly benefit Nvidia with an advancement that would benefit the industry as a whole (ie Mantle).
Nvidia does make good products, but don't ever delude yourself into thinking Nvidia is a good company.
@@valhallasashes4354 Nvidia is not a good company, it's a for-profit multi-billion-dollar corp. So is AMD. Let's not kid ourselves. AMD does what it does because they are not the market leader, and because open-source technologies are good for PR and have a bigger chance of getting adopted. You can't demand anything when you have less than 20% market share.
Totally incorrect, VESA Adaptive Sync predates GSync and was not developed by AMD at all.
It was used in embedded DisplayPort as a power-saving feature; G-Sync was introduced as a way of getting VRR on external displays specifically for gaming. Until G-Sync was released, no one considered using VRR for its gaming benefits, only for its power benefits. A year after the release of G-Sync, VESA Adaptive Sync was ported to the external DisplayPort standard and relabeled as DisplayPort Adaptive-Sync. Literally straight out of the original press release:
"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync."
Adaptive sync is not "AMD's standard", as much as they would like to claim it.
@@Jas7520 Pulled from AMD's own FAQ page about FreeSync:
What is DisplayPort Adaptive-Sync and how does it differ from AMD FreeSync™ technology?
DisplayPort Adaptive-Sync is a new addition to the DisplayPort 1.2a specification, ported from the embedded DisplayPort v1.0 specification through a proposal to the VESA group by AMD. DisplayPort Adaptive-Sync is an ingredient feature of a DisplayPort link and an industry standard that enables real-time adjustment of display refresh rates required by technologies like AMD FreeSync™ technology. AMD FreeSync™ technology is a unique AMD hardware/software solution that utilizes DisplayPort Adaptive-Sync protocols to enable user-facing benefits: smooth, tear-free, low-latency gameplay and video.
Bottom line. Everybody is claiming credit and I am not about to get into a he said, she said argument. And I sure as hell am not going to get into a debate about derivatives of technologies in their earliest "forms" regardless of what their capabilities may or may not have been because I have no way of knowing what the full scope of those capabilities were. And just because it "existed" in "some" form, doesn't mean it was "used" in any "meaningful" form. That's like a child screaming "I knew the answer the whole time" even though they never used that answer in the way that mattered to the actual event in question. All I can tell you is what I saw when I was there. The events I listed in my post are what I saw. Both as things were announced, when they were announced and how events developed and transpired over the years as they happened.
It doesn't matter who made the propeller first. What matters is who took the propeller off the mobile and put it on the plane first.
Rule of thumb I go by:
VA panels: try to aim for a G-Sync module monitor with dynamic overdrive that scales with refresh rate on the fly.
IPS panels: matters less due to the lower need for overdrive, but I prefer a G-Sync module.
OLED: doesn't matter, go FreeSync. OLED has no use for overdrive.
I bought an Alienware AW3423DWF and with G-Sync enabled it flickers in everything, but it's gone if I disable G-Sync. I decided to return the monitor due to the flickering.
You can buy the AW3423DW and enable G-Sync Ultimate and it will still flicker. It's an issue with OLED.
Nah, it's an issue with all monitors.
VRR is just useless most of the time @@MrAlex26069114
Wow, very interesting video!
Super clear explanations of a rather complicated topic!
where is N-Sync ?
IT AIN'T NO LIE BABY BYE BYE BYE, BYE BYE
Also, AMD FreeSync became the VRR standard adopted by VESA. Anything VRR, even without FreeSync branding, falls under the VESA standard, which adopted AMD's approach of having the regular display scaler do the work instead of the proprietary Nvidia module back in the day.
It's not a comparison but there's "vs" in the title🤦
He spent the entire video comparing the two.
Why can't everything be made much simpler?
0) limit FPS below the monitor's Hz for selected games
1) the video card renders a frame
2) the frame is split into packets and sent to the monitor
3) the monitor receives the packets and assembles the frame
4) enough time has passed since the monitor finished assembling the frame
5) enough time has passed since the last refresh, so it can show the new frame:
interrupt(new_frame) {
    if (new_frame_time >= constant_frame_time) show_frame();
}
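If it helps, here is a rough, self-contained sketch of that fixed-frame-time idea in plain C. Everything in it (TARGET_FPS, render_frame, present_frame) is a made-up placeholder rather than a real display or driver API; on real hardware the swap chain, driver and monitor handle this for you.

/* Minimal sketch of "show a new frame only once a full refresh interval
 * has passed". All names are hypothetical placeholders, not a real API. */
#include <stdio.h>
#include <time.h>

#define TARGET_FPS  60.0
#define FRAME_TIME  (1.0 / TARGET_FPS)   /* seconds per refresh */

static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

static void render_frame(int n)  { (void)n; /* GPU would draw frame n here */ }
static void present_frame(int n) { printf("present frame %d\n", n); }

int main(void) {
    double last_present = now_seconds();
    for (int frame = 0; frame < 300; frame++) {
        render_frame(frame);
        /* wait until one full refresh interval has elapsed - this is the
         * "if (new_frame_time >= constant_frame_time)" check above */
        while (now_seconds() - last_present < FRAME_TIME) { /* spin */ }
        last_present += FRAME_TIME;   /* keep pacing locked to the target */
        present_frame(frame);
    }
    return 0;
}

The only point is the pacing: a frame is presented no sooner than one refresh interval after the previous one, which is the behaviour a fixed-Hz monitor without VRR enforces.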
can I get an RTX 4090
Yes definitely just work a job for 2 months with no expenses
Thanks for this video. I'm in the process of looking for a wide-screen monitor to replace my current 5:4 ratio LG Flatron 19 inch monitor, as it doesn't perform well with 4K videos. I'm limited by space to 24 inch screens and have several in mind, but didn't understand the terminology of the types and whether they would be compatible with my AMD Ryzen 5 5600G based PC. I'm not a Gamer, so high frame-rate is not a problem to me, as long as it's better than 60 Hz. I'm looking at 75 Hz, or better, with 4K capability. Your video has now clarified the situation and I shall buy whichever monitor my local Computer Warehouse can supply at a reasonable price.
Fun fact: I just recently bought an Asus FreeSync monitor that was actually marked as "G-Sync Compatible", it even had a G-Sync sticker on the front of it... but it only had one HDMI port, no DP. I never felt so scammed in my life x)
Nah, I call BS… every monitor released after FreeSync/G-Sync came out has native DP support… DP has been standard for like a decade :D
@@Roarscia I swear my "gaming monitor" only has one HDMI port and that's it... this is the first time I've seen a monitor with only one port.
Even my second, older TN office monitor has an HDMI AND a DVI port... but not my main Asus one.
Good stuff. However, I am running an AW3423DWF at 165 Hz with an RTX 4070, and when enabling VRR in the control panel, invariably (pun intended) flickering ensues. The games are running between 60 and 85 fps, so I wouldn't think it's an fps issue. So I'm a bit confused by the statement that this panel properly supports G-Sync.
Should the fps be higher (165)? Closer to the refresh rate?
Or should I decrease the max. refresh rate (60, 100Hz)?
That's pretty weird. Maybe try a different cable? Try different settings in the monitor OSD? I don't have that monitor, but it doesn't sound like a frame rate issue. The whole point of G-Sync is that it shouldn't matter, doubly so since you are smack dab in the middle of the range.
hardware can't fix bad software
if the game can't run straight and level, your only option is to turn off VRR or put up with the flickering
@@rossmarbow-combatflightsim5922 I think you're right, because it changes from game to game. Some do run great. Others present flickering on specific instances like hints, or menu overlays, and almost all flicker during loading screens (but there I can understand that the fps plummet)
Nvidia G-Sync = dirty marketing
This guide was so helpful... I hadn't even enabled FreeSync and I thought it was working...
I have AMD FreeSync Premium Pro? And it's awful. It's incredibly hard to get it to work at all, and when it does, your monitor turns into a stroboscope during loading screens or whenever the fps drops below the supported 48 fps. G-Sync good, AMD bad. Such a poor experience compared to Apple.
Had these issues too, it's most likely because you use a VA or OLED panel, which can vary in brightness at different Hz (since FPS and Hz stay synchronized when using it).
Noticed it a lot on my old Samsung VA, especially in badly optimized games, as the FPS would drop quickly and randomly. After upgrading my CPU and GPU it was not noticeable in any modern games, because of the more stable framerate = stable Hz :) But I switched to an Odyssey G7 and haven't seen the issue since.
What you're seeing is PWM flicker, which happens on EVERY non-flicker-free display.
Look up reviews before you buy a display (RTINGS for example tests for PWM flicker).
@@Drubonk OLED indeed, but why is that an OLED issue? Philips Evnia 34" ultrawide OLED, 175 Hz. I play at a locked 120 Hz as I prefer a stable frame rate, and at that rate everything above 48 fps is fine. In games that's fine, but menus or loading screens often drop below 48 for whatever reason.
@@MLWJ1993 No, I know what that is, I avoid screens with that due to migraines. I have an OLED monitor which does have the standard OLED flicker, and I can handle that. With FreeSync on and below 48 fps the monitor flickers very badly. Flashes, almost. Very unpleasant, and in badly developed games the menus can be awful.
Pure bs
Very informative, thank you! Things have definitely changed since the last time I was in the market for a monitor and there's a lot of confusion about this online.
I have a question: do G-Sync or FreeSync add latency over running neither? FPS gamers such as myself don't run either because we are running the game at the monitor's max refresh rate and there is little to no fluctuation in frame rate.
I think there are videos being made on the subject. But if I remember correctly, FreeSync/G-Sync did add slight latency.
nahh, just get a GPU so powerful that you get more fps than your refresh rate :D :D
Honestly, I have a 4090 and I use G-Sync in every single kind of game plus locking my fps, idk, maybe I'm weird. But it also saves power and reduces heat for both components.
@@lawgamingtv9627 Exactly, I even limit my 4070 as well. It's already really efficient, but I can almost halve the power usage by limiting the fps to e.g. 90. For a lot of games I don't need to hit 120+, and locking it to 90 just makes it more consistent overall. Less power draw, less heat, less noise, barely less smoothness.
@@olifantpeer4700 I do that with RTSS too. Capping the framerate makes the 0.1% and 1% lows a lot better and the frametimes more stable. I don't like fluctuation of frames unless it's only a bit and not 20+, so I'm still using VRR but capped around my average so it doesn't swing up and down by tons of frames.
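Since "0.1% and 1% lows" keep coming up in this thread, here's a rough sketch of one common way they're computed (different tools define it slightly differently): take the slowest 1% of frametimes and convert their average back to FPS. The frametime numbers below are made-up sample data, not a real capture.

/* Rough sketch: "1% low" as the average FPS of the slowest 1% of frames.
 * Sample frametimes are invented for illustration only. */
#include <stdio.h>
#include <stdlib.h>

static int slowest_first(const void *a, const void *b) {
    double x = *(const double *)a, y = *(const double *)b;
    return (x < y) - (x > y);              /* descending: longest frametime first */
}

int main(void) {
    double frametimes_ms[] = { 8.3, 8.4, 8.2, 9.1, 8.3, 25.0, 8.5, 8.3,
                               8.4, 8.2, 8.3, 8.6, 8.4, 8.3, 8.2, 8.5 };
    size_t n = sizeof frametimes_ms / sizeof frametimes_ms[0];

    qsort(frametimes_ms, n, sizeof frametimes_ms[0], slowest_first);

    size_t worst = n / 100;                /* slowest 1% of frames */
    if (worst == 0) worst = 1;             /* tiny sample: take the single worst frame */

    double sum = 0.0;
    for (size_t i = 0; i < worst; i++) sum += frametimes_ms[i];
    double avg_worst_ms = sum / (double)worst;

    printf("1%% low: %.1f fps (avg of the slowest %zu frame(s), %.1f ms)\n",
           1000.0 / avg_worst_ms, worst, avg_worst_ms);
    return 0;
}

Note how one long 25 ms frame barely moves the average FPS but drags the 1% low right down, which is why capping the framerate (keeping frametimes flat) makes a game feel smoother even when the average drops.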
This explanation is great; browsing about the topic had me so confused. It's a bit long, but now all questions are answered :)
Thank you this answered all the questions I had on the subject
Thanks for this! Really helpful and easy to understand.