40% Less Input Lag Without AMD Anti-Lag or NVidia Ultra-Low Latency
- Published: 17 Sep 2019
- What should have been a simple and straightforward test of AMD's Anti-Lag and NVidia's Ultra-Low Latency mode took an unforeseen turn when I discovered something else that has a major impact on input lag.
► Support Battle(Non)Sense:
/ battlenonsense
► Stable Frame-Time / Input-Lag: • How To Fix Stutter In ...
► Previous video: • Bufferbloat & Lag - Wh...
► Next Video: • CoD Modern Warfare Bet...
► Connect with me:
➜ FB: BattleNonSense
➜ twitter: BattleNonSense
➜ email: chris@battlenonsense.com
► Outro Music:
Many of you asked for the name of the outro song. Sadly I have to tell you that it is not a "song". It is a custom-made piece of music for intros/outros that I bought a while ago.
#InputLag #Anti-Lag #Ultra-LowLatency
► Stable Frame-Time / Input-Lag: ruclips.net/video/xsXFUVYPIx4/видео.html
11:20
So the reason you get more input lag when you're GPU bound is this. The game sends a bunch of commands to the GPU, which get executed asynchronously. Then the game calls present(). At this point the driver can either force the game to wait until the frame has completed, or just let the game render the next frame even if the current frame isn't completed yet (leading to better GPU utilization, as there are always more rendering commands in the pipeline).
How often the game can call present() while the first frame still hasn't finished rendering, without being stopped by the driver, is essentially the "pre-rendered frames" setting. If the driver *always* stopped the game at present() even when no other frame was in flight, performance would be terrible because you would potentially give up a huge amount of GPU/CPU asynchronicity. (I hope that's a word.) But stopping the game when a single frame is in flight usually only incurs a small performance penalty. I guess what low-latency mode is trying to do is estimate how long it has to block in present() so that the next present() comes in just as the GPU finishes the previous frame.
Of course if you're CPU bound (or simply *not* GPU bound through a frame rate limit), none of this really does much, because every time you call present() the previous frame is already done anyway. It's essentially perfect low-latency mode.
What can be done about this? Well, nothing really, except for low-latency mode or frame rate caps. You could be even more aggressive than the current modes, but that would incur an even bigger performance penalty. And then you have to ask yourself: in (probably competitive) games where people care so much about latency, aren't users playing on low settings to get the most FPS, and therefore usually CPU bound anyway? It's probably not worth it to provide an even more aggressive mode.
Does this apply to CPUs as well? I mean, if we limit CPU usage to 95%, would I get lower input lag than with the CPU running at 99%?
@@futmut1000 with better CPUs it's hard to get over 50%
@@Linets. Some games like BF4 hit one CPU core really hard, sometimes near 100%
So basically... can we get a GPU load limiter? Essentially a variable FPS limit, so as not to max out GPU load but still get max FPS at each moment?
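The render-queue behaviour described in this thread can be sketched with a toy simulation. This is my own simplification, not driver code: CPU frames take `cpu_ms`, GPU frames take `gpu_ms`, and the driver lets at most `max_in_flight` frames queue up before blocking present(). It shows how a deeper queue inflates latency when GPU bound and does nothing when CPU bound:

```python
def simulate(cpu_ms, gpu_ms, max_in_flight, frames=1000):
    """Toy model of a render queue. Input is sampled at the start of a
    CPU frame; present() blocks while `max_in_flight` frames are queued."""
    t = 0.0          # CPU clock (ms)
    gpu_free = 0.0   # when the GPU finishes its current backlog
    pending = []     # completion times of submitted-but-unfinished frames
    latencies = []
    for _ in range(frames):
        sampled = t                    # input is read here
        t += cpu_ms                    # game simulates + records the frame
        pending = [c for c in pending if c > t]   # drop finished frames
        while len(pending) >= max_in_flight:      # driver blocks present()
            t = pending.pop(0)
        start = max(t, gpu_free)       # GPU renders frames serially
        done = start + gpu_ms
        gpu_free = done
        pending.append(done)
        latencies.append(done - sampled)
    tail = latencies[200:]             # steady state only
    return sum(tail) / len(tail)

# GPU bound (2 ms CPU, 10 ms GPU): a deep queue roughly doubles latency
print(simulate(2, 10, 3), simulate(2, 10, 1))   # ~40 ms vs ~20 ms
# CPU bound (10 ms CPU, 2 ms GPU): queue depth makes no difference
print(simulate(10, 2, 3), simulate(10, 2, 1))   # ~12 ms both
```

In this model a frame cap that keeps the queue empty behaves exactly like the `max_in_flight = 1` case, which matches what the video measures.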
DAMN battle(non)sense, you need to do more videos..... your videos are some of the best. As a competitive gamer (not a pro, just decent player) i absolutely love the topics and information you provide.
I see that you tested games other than Overwatch, but did you test those other games with the low latency features turned on? You only showed graphs where they were off and you were controlling the framerate. I ask because not all games use the lowest possible render queue, and I think by default BFV is set kind of high compared to other games. I don't doubt your results, I'm just curious what the results would be with the low latency modes turned on for other games.
Oh hey Ravic, fancy bumping into you here
Doing this test with CS:GO would be interesting, because input lag is so crucial for that game and also because it is so different from the other games tested (heavily CPU limited, DX9, and very high FPS)
Yeah, my GPU is chilling at 40% even at 300+ FPS, so I guess this doesn't really affect me, or most CS players with decent PCs. It is interesting that Anti-Lag increased input lag under these circumstances though. I just left it on all this time as I didn't feel a massive difference.
If it's heavily CPU limited, why would it matter? GPU will never reach 99%.
@@allongur Right now we don't know if the drop in latency is due to fewer frames being rendered or less GPU utilization. We also don't know how Anti-Lag affects you at really high framerates. Testing CS can help answer these.
@@mt441 I initially thought I felt a difference, but then I switched back and saw no difference, soooo yeah, it's impossible to tell without actual data. Never underestimate the power of placebo. That, or when you're already getting 350-500 FPS the change in input lag is so small that it's impossible to notice.
Excellent work. Thank you so much to your patreons for funding this! And thank you to you!
what an awesome test, thank you for taking the time to do this
Thanks for this video. Your work is extremely important and helpful for a whole scene and has helped me a lot, this time as well. Keep it up!
Did you test this in fullscreen exclusive mode?
Windows updates have started making some games run in their own optimized mode when you select fullscreen (it allows faster alt-tabbing but causes input lag). To fix this you have to go into the properties of the game's .exe file and disable fullscreen optimizations. That works for most games (with some exceptions) and should put you in fullscreen exclusive mode and reduce input lag.
Nani?!
@@Kissislove17 Basically Windows runs games with DWM triple buffering and calls it *fullscreen optimizations*, to allow faster response to Alt+Tab and window switching
Welp, guess testing will need to be performed yet again.
Even alt-tabbing causes Overwatch's Reduce Buffering to break. The three dots on the FPS counter in Overwatch mean that Reduce Buffering is broken and you have to re-enable it.
@@AssassinKID happen to know if disabling FSO still works in 21H2? Or if there's something else you have to do now to fully disable it? I used to disable it on every game when I was on 2004. Stupidly decided to format and update to the latest Win 10 build, and now disabling it feels like it does absolutely nothing tbh... It's bugging the hell out of me. My average accuracy has gone down in games.
I've always capped my FPS in Overwatch to 140 because it feels like it has lower lag than uncapped, despite everyone saying uncapped is better. I never knew if I was imagining it, or why it would work like that. But this seems to confirm that I was right.
I always wondered about this when I played back in 2016. I read somewhere that when your GPU starts struggling you get bursts of input lag. I'm pretty sure that was correct for Overwatch. Mind you, I used an AMD 7870 and later a GTX 770 back then.
Uncapped still gives the best input lag, just make sure Anti-Lag or Reflex is on so it manages your GPU % to not go too high
This is why I searched for your video when I was curious about this. You're the only who truly does in depth testing. Thanks Chris
Your RivaTuner video was a godsend! I was getting stutter that I couldn't figure out how to get rid of, and using Riva and your tips to limit frame rate eliminated it entirely. Your videos are incredible and always a pleasure to watch
I just have to say that your videos are incredibly interesting and helpful !
Thank you so much for all this helpful information, which would be very hard to get without your channel.
You do some of the best work in the industry. Bravo man. I wish I could get as much value for the community out of my hardware as you are able to. Fantastic work. Thank you again
Really glad that there are still people like you who dig into the real data and not just opinions. Your work here is awesome.
This video and Battle(non)sense is probably why NVIDIA and AMD developed Reflex/Boost technologies which keep the GPU load below max. You sir are a treasure to the gaming community. Thank you for your content.
Amazing work. TY for the info. Input lag is one of my favorite issues to tackle when it comes to gaming optimization and you always surprise me with new findings!
To achieve the least input lag you want a high FPS for better frametimes, even on a low-Hz screen, but you should never let the GPU reach 99% usage; when the GPU is overloaded it adds a lot of input lag. So, for example, it's better to have 120 FPS with the GPU at 80% than 144 FPS with the GPU at 99%
thank you
Wrong
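Whether the comment above is right depends on how many frames a maxed-out GPU actually queues, which is what the video measures. But its 120-vs-144 example is easy to put into rough numbers. Assuming (simplistically) that a maxed-out GPU keeps one extra frame queued while a capped GPU keeps none:

```python
def approx_latency_ms(fps, frames_in_flight):
    """Very rough model: latency ≈ frametime × frames in flight."""
    return 1000.0 / fps * frames_in_flight

capped = approx_latency_ms(120, 1)   # ~80% GPU load, no queue build-up
maxed  = approx_latency_ms(144, 2)   # 99% GPU load, one extra queued frame
print(f"120 FPS capped: {capped:.1f} ms, 144 FPS maxed: {maxed:.1f} ms")
```

Under that assumption the capped 120 FPS case (~8.3 ms) beats the maxed-out 144 FPS case (~13.9 ms), despite the lower frame rate.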
It's insane that nobody has discovered this insane flaw yet! Nice job
But... it is discovered ?
@@norpanmekk yeah lol, i think he worded it wrong xd
You deserve much praise for this video... great work. How could AMD or Nvidia not explain this more clearly for people? Subbing based on this video.
Hey Chris, thanks for the great content. Much love from Canada
this video is HUGE value
I always use capped FPS because I prefer consistent frame times and a cooler GPU. It's amazing to know this also gives me less lag.
capping framerate is something all gamers need to learn about
Agreed!
I hope they paid him for this. Nvidia just came out with the Reflex option, enabled automatically, which I suspect might have come from what he told them; I think it keeps the load below ~97%
I would have really liked to see the 5 tests shown at 9:20 repeated with G-Sync active.
EDIT: Also perhaps the first and last test repeated with Ultra Low Latency set to On (previously Maximum Pre-rendered Frames = 1) just for posterity.
Hope you do a follow up on this based on possible responses from AMD/Nvidia and factor in using RTSS vs in-game limiters, because my head hurts right now. RTSS limiting with those frametimes feels so smooth though. I must have missed your earlier video. Great work as always!
Battle(non)sense we miss you
keep the content coming
Love your testing. In your tests you capped your frame rate, so basically you are not GPU limited. For the sake of testing, I would love to see the effect of being CPU limited on input lag. Your video has shown that higher FPS does not mean lower input lag; it also depends on your GPU load. I am wondering if it applies to CPU load as well.
esport companies should hire this guy to do more testing of this sort.
That's insane! I love this scientific approach to gaming and how to improve one's experience. You're awesome!
Chris, if you are reaching high GPU usage, that means the CPU pipeline is faster than the GPU at its maximum. The obvious conclusion is that it will pre-render frames, so latency will increase. That explains why Anti-Lag worked. When you cap your FPS you run with essentially 0 pre-rendered frames, as both GPU and CPU are able to render in time and even sleep. This could be observed using GPUView. What is really interesting is Anti-Lag giving a higher response time when the expected behaviour is that it should not affect things at all.
That makes intuitive sense, but the difference in frame rate is extremely small compared to the difference in lag when capping at slightly below the max gpu-limited fps. It seems like those extra buffers must not be adding much to the average fps, or maybe something else is going on? I'm also curious why capped 80 fps had more lag than capped 60 or capped 140 (with 150% render target).
It also seems like the anti-lag mode must be doing more than just eliminating buffers, since as you say you'd expect no effect for anti-lag off/on while capped if that was it.
But still, having the GPU deliver the highest FPS (which should lower the input lag) and then delivering that frame immediately to the monitor without storing it in the buffer should give you the lowest input lag possible, but for some reason this is not the case. It's super weird imo
Oh boy, you must enable reduce buffering in Overwatch.
Tested that with G-Sync, V-Sync and G-Sync + V-Sync, and trust me, it reduced lag in every case .
You should have that enabled at all time.
Dyils
Yes, but also no.
Whether NULL is on or off, I experience the lowest latency with the in-game setting.
NULL works best when the FPS are exceeding monitor’s refresh rate.
It only makes a minor improvement when the FPS are capped at refresh rate or lower (4ms according to this video).
The G-Sync + NULL video shows NO improvement at all when FPS are capped to refresh rate with G-Sync + V-Sync ON.
I'm giving my point of view about something that hasn't been specified in any test.
Maybe, just give it a try and make your own conclusions ;)
Dyils I'm sorry for being confusing in my explanation.
You don't need to do the LED trick with a high-speed camera to measure that.
I'm simply saying that, in Overwatch, there's a noticeable input lag reduction when you enable the in-game Reduce Buffering, and this applies whatever your sync settings are.
When enabling low latency through the control panel, I see no difference at all, as if there's still one frame in the buffer.
I really see the difference when I move my mouse as fast as possible over a short distance. It's more reactive by one frame (7 ms for me).
You see it best by focusing your eyes on a point between your screen and your mouse, so you have both in your sight.
Gameplay feels less spongy.
(I can hear a 7 ms delay when making music, so why couldn't my eyes see one? Everyone is different)
You forgot maybe the most important comparison: in-game frame limiter with Anti-Lag/ULL vs RTSS frame limiter with Anti-Lag/ULL. RTSS is by far the best solution to cap FPS and get stable frametimes, however it is known to add 1 extra frame of delay. Since RTSS keeps both the CPU and GPU from reaching 100%, it may well work together with Anti-Lag/ULL and give the ideal maximum.
Nice video BTW, thank you!
Do you have csgo?
You should run the desktop at 1024x768 and run CS:GO at the same resolution with all settings as low as possible. See if you can feel the difference between the two.
Seems to be the best real world comparison without the equipment to actually test.
I already do this, but my frames sometimes drop a bit from the cap value to a lower value and I get stuttering, or frame stuttering, or whatever you call it.
Lime I noticed CS:GO with fps_max 0 has much less latency than capping at 300 FPS. The issue people forget is that Source's timer will break and cause clock drift once FPS hits 1000, even if only for a second (menu/loading map). So if you use fps_max 0, only do so once loaded into a server; until then keep it capped at 300 or whatever. It's a pain, but it keeps CS:GO from breaking. P.S. fps_max 999 is trash!
@@PixelGod240 Without the equipment, I'm the "margin of error". We are talking about 8.3-16.7 ms differences. Hard to notice, impossible to judge with the bare eye.
I've just subscribed to your channel because you have this particular, clever pedagogical way of getting to the point. Thanks a lot for the quality and intellectual honesty of your presentation. You save us a thousand hours of checking the unintelligible output of pretend experts. Best regards and all the best wishes for this year.
super interesting results! I always noticed a reduced input lag in OW, when my gpu wasn't maxed out, since I did several tests in the past as well. But I always thought this was an OW related (engine) thing, so thanks a lot for testing this with other games, too!
Was waiting for this since the discord announcement, cheers
Yet another essential video that is changing the way i'm playing. Thank you for bringing those complicated topic to the masses!
I'm trying this low latency mode coupled with Fast Sync and the results are promising. I can't measure precisely, it's an overall feeling, yet it feels stable and the picture is good.
My situation: 4K with a 1080 Ti, and it's not good. Really bad tearing, and the input lag is atrocious with V-Sync, not to mention the stutter when going below 60 FPS; I have a 60 Hz screen.
Well, Fast Sync works as intended and eliminates tearing while not stuttering if I go below 60 FPS. What I like the most, though, is the stabilized input lag. I didn't realize it was fluctuating that much, and it deeply affected my gameplay. Thank you so much battle(non)sense!
Bro, Fast Sync only works when your FPS goes above the monitor's refresh rate.
what overlay do u use to check gpu usage?
watch the video...
Theus I did dumbass, I tried setting it up in the program and it wouldn’t show what his does, it showed only FPS and a few other things
Zeddbtw he’s using rivatuner
Thanks Chris! Dunno how I missed this one! Great info, as usual!
Cap the frame rate to your 1% low and/or lower the settings until your GPU is running under 95% utilization. This will give you the smoothest experience AND the most responsive one. Thank you B(non)S
ya anti-lag is pointless... better to just lower settings or cap frame rate lower so you are below 99% gpu utilization... which will reduce input lag more than anti-lag.
@@resresres1 yeah but I'm not capping my 2080 to 40-50 fps to play warzone.
@@mryellow6918 cap it higher than that and make sure it doesnt reach 95%
@@Oo8SNAKE8oO The game runs like shit, it doesn't reach that anyway
@@Oo8SNAKE8oO Where can you find your GPU utilization?
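For anyone wondering how to pick a cap like the "1% low" mentioned at the top of this thread: it can be computed from a frametime log (e.g. exported from RTSS or CapFrameX). The helper name and sample numbers below are made up for illustration:

```python
def one_percent_low_fps(frametimes_ms):
    """1% low = average FPS over the worst (slowest) 1% of frames."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # worst 1% of samples
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# example log: mostly ~7 ms frames (~144 FPS) with occasional 12 ms spikes
samples = [7.0] * 990 + [12.0] * 10
cap = one_percent_low_fps(samples)
print(f"suggested cap: ~{cap:.0f} FPS")   # ~83 FPS for this log
```

Capping at roughly that value means the GPU almost never has to render faster than it can sustain, which keeps utilization below the danger zone.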
Wow. That's awesome. Great video, as always! I noticed this too when locking the FPS in Overwatch; it somehow felt snappier than using a higher but variable framerate (GPU maxed out). Keep up the good work!
Yup. I don't play Overwatch, but I noticed that in other games.
I've noticed the same thing (also OW) when using FreeSync + a 142 FPS cap on my 144 Hz monitor; it felt smoother than uncapping it to an average of around 210 FPS. I just thought it was a side perk of FreeSync rather than being below 99% GPU load.
this video is so well done, good research and nice presentation. good job man
You put an incredible amount of effort into this video, great job.
Sounds like an optimization issue somewhere along the frame processing pipeline, within the GPU driver, DirectX or the game engine itself. Obviously, frame rate limiting will not be an optimal solution since fps can vary wildly depending on gameplay situation. Now I'm pretty curious to hear AMD/Nvidia's explanation.
This was totally unexpected for me and I don't think many are aware of this, not even enthusiasts.
Me too, I expected the opposite! I play a lot of Rainbow Six Siege and I was using Low Latency mode. I will try turning it off right now and see what happens! I have a 144 Hz monitor (AOC) and a GTX 1080. I get around 200 frames, jumping between 138-250 FPS!
Anyone who has to pinch pennies to get by and yet wants to play PC games is aware of this, because of the research they do to make their gaming experience as good as possible with only the hardware they have, since upgrading isn't an option.
Great video, I had been waiting for your analysis of these new features ever since they were announced and now I'm glad you took the time to test them properly instead of just hyping them up at release like everyone else.
Props for all the work you put in. Liked and subbed.
Now test input lag with new Nvidia In-Driver FPS limiter in 441.87 drivers.
he just did 10 min ago ;)
Hardware Unboxed just made a video response to this video. Cool stuff!
Thanks for your hard work on this. Will definitely be trying ultra low latency mode on halo reach when it drops.
So if I have a 165 Hz monitor but can only get 140 FPS, do I leave the refresh rate the same and just cap frames a bit lower plus turn off Low Latency? Or turn the Hz down to match an FPS cap at, say, 90% GPU load with LL off? Or do I run 140 FPS at 165 Hz and turn on LL (Low Latency) mode?
Great video dude. This really helps
This is by far the best video I've watched on this. I've been researching for hours, thank you for this.
I was just looking for a video like this today. Great timing. :)
Just found out about the low input lag feature in the latest Nvidia drivers, thank you for letting me know about the new features :)
Everything you say is true and well documented. Input latency is maybe the hardest concept to grasp in gaming because you don't have metrics to measure it. You must understand the math behind it, limit your FPS, and test it yourself.
It can also be counterintuitive, because uncapping the FPS makes the game 'feel' more responsive, but in reality what you see on screen is no longer exactly what is happening, as massive input lag is introduced.
It's better to cap FPS just below what you need to attain 95% GPU usage.
Dude, thanks for this info! Very useful!!
finally the video I have been looking for about input lag, thank you for providing all of the data
Love your videos. I would never understand most of this stuff without your easy to understand explanations.
Fantastic testing. It echoes my experiences playing Battlefront 2 and Apex Legends recently.
If I leave the framerate uncapped, both games feel very jittery and uneven. On my 60hz, non-freesync monitor, 90-100 fps seems to worsen responsiveness rather than improve it.
However, as soon as I lock to 60 fps using RivaTuner, both suddenly feel really fluid and responsive. My GPU is no longer being tapped out.
Having consistent frame times in-line with your monitor refresh rate really is the top priority.
That's because of the heavy tearing lines that you get. If you have a 60 Hz monitor and get FPS anywhere between 61-119, you will get tearlines that actually jitter around your whole screen. Those tearlines will separate one image into many smaller ones, and those smaller sections will MASSIVELY stutter. Plus you see the tearlines moving up and down your screen. Then 60 FPS won't actually look like 60; it almost feels like 30-50 FPS in both image and responsiveness. I just tested this 2 days back, and I play MUCH better at 60 FPS on 60 Hz than 70 FPS on 60 Hz... always cap the FPS to 0.5, 1 or 2 times your Hz. Btw, this tearline-stutter effect is also called "stutter beat frequencies".
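The 0.5×/1×/2× rule of thumb above has simple arithmetic behind it: the frame clock and the refresh clock only stay aligned when one rate divides the other. A toy calculation of the "beat" between the two clocks (my own simplification, ignoring real-world timing jitter):

```python
def beat_hz(fps, hz):
    """Rough rate at which the frame and refresh clocks drift in and out
    of alignment; 0 when one rate is an integer multiple of the other."""
    hi, lo = max(fps, hz), min(fps, hz)
    r = hi % lo
    return min(r, lo - r)

for fps in (30, 60, 70, 90, 120):
    print(f"{fps} FPS on a 60 Hz screen: {beat_hz(fps, 60)} Hz beat")
```

30, 60, and 120 FPS give a 0 Hz beat (stable tear pattern or none), while 70 FPS produces a visible 10 Hz beat, which is the jittering tearline the comment describes.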
I went through like 4 videos and this is the only one I understood thank you
Really good video, as always! Concise and informative as usual. Well done ^^
This helped me so much, Sir you are a Legend, Salute, and thanks. Hope this motivates you to keep doing good work :)
How bizarre.... I wonder how nobody noticed this before?! So strange.
Some people did notice; it's just that most benchmarks indicate that the product with the highest FPS is the better product, so most people believe higher FPS = better. IMO this throws shade on the whole "Intel is better for gaming" thing. AMD's Ryzen CPUs have always gotten very consistent frame-times, which makes me wonder how good they are in actual input lag in ms.
5-10 ms is mostly an academic difference. I doubt most users can notice that; I know I can't.
Still people regularly pay hundreds or even thousands of dollars to shave off 5-10 ms from latency. They deserve to know why they are losing those milliseconds unexpectedly.
@@JeanFrancoisPambrun i can notice 15ms on an input lag test and im sure there are people that are more sensitive than me
@@rdmz135 5-10 ms is only about 10% of the typical input latency. Feeling such a small delta is usually difficult for our senses. I wonder how you can state with such confidence that you would notice, since testing this hypothesis is quite involved. I should probably say that I have studied human visual acuity as part of postgraduate work.
Edit: In the end, the fact that nobody even noticed this before strongly suggests that it is imperceptible to most, if not all, users.
So a basic summary would be: don't let your GPU get fully utilized in competitive games, to reduce input lag. You want to be CPU bound instead of GPU bound, basically?
Don't use ULL or Anti-Lag if your GPU isn't being fully utilized, as it will just increase input lag.
So capping CS:GO at 300 FPS (which I roughly maintain about 80-90% of the time) and using the default option for pre-rendered frames (I believe it's 2 or 3 in GO) would be the best solution for input lag AND frame time stability, is that right? Or did I miss something?
Edit: I remember hearing that capping your FPS also increases your input lag. Have you ever tested the difference in input lag in a game like CS:GO if you're getting 500+ FPS and capping it at 144, 300, 500 and 999? With the in-game command fps_max.
This is what I really wanna know
@@TerpsiKo Some in-game limiters can give you inconsistent frametimes. RTSS will give you smooth, consistent ones. The trade-off is a tiny increase in input lag, but that can be well worth it depending on the situation.
This is extremely interesting! Great video as always.
You are the channel I needed in my life. Thank you. Subbed
That's a sick test! Thanks for this.
Hope AMD/NVIDIA will do something, because if I understand it correctly, if you limit your GPU usage (and so get fewer FPS) you get better input lag, and that sucks...
Great analysis of Nvidia's new low latency feature. Can you also test this with 'Low Latency Mode' set to 'On'? It's great to know what Ultra yields, but I'm very curious how 'On' will perform.
Very cool, that’s worth a sub right there. Great work
yo thanks for this man... was waiting for this video... helped out a lot.
I knew before that keeping your gpu load lower kept input delay lower. I didn’t have any numbers though, but this was preached on blur busters forum for some time. It’s great to have some numbers behind it.
Could you test out games that have both DX12 and DX11 and run input lag tests where you are GPU bound in both cases? In those games the developer, not the driver, has control over pre-rendered frames.
Some input on the subject (I'm no game engine expert, so take this with a grain of salt):
In the past most game would work in the following way:
In each frame the game would calculate the logic step that evolved from the previous frame, then it would draw the result of this logical evolution. Let's call these steps the logical frame and the graphical frame.
As game engines evolved, some elements which were part of the logical frame moved to the graphical frame, for example simple animations that had little to no effect over the game state besides graphics. Developers realized that you could improve the graphical experience by making the graphical frame independent from the logic one.
The logical frame in many situations needs to run at a fixed pace (e.g. 60 FPS), otherwise some of the collision steps would become too complex.
Engines like Unity separate them. You can have your game running at 240 FPS while the logical frames are set to 30.
This allows very smooth animations and better GPU utilization than if you locked both frame stages together.
The reason you may be getting better input lag when reducing the frame rate could be that the logical frame rate is set to a much lower value than the graphical frame rate, so by increasing the graphical frame rate you are also increasing the CPU utilization needed for the graphical frames (in the best case it should only be the bureaucracy of sending information to the GPU). So by increasing the graphical frame rate, we may actually be hurting the logical frame rate...
Games that have both of them locked together should always improve latency when you increase frame rate.
Otherwise it becomes complicated.
Well... that's my 2 cents.
Interesting. I wonder what the logical frame rates for the games tested were, or if they are tied to the GPU rates.
That makes a lot of sense, thanks for the explanation
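The logical/graphical split described in this thread is the classic fixed-timestep loop. A minimal sketch of the idea (my own toy version, not Unity's actual implementation): logic runs at a fixed 30 Hz no matter how fast rendering goes.

```python
LOGIC_DT = 1.0 / 30   # fixed logic step: 30 Hz; input is consumed here

def run(render_hz, seconds):
    """Render `render_hz` frames per second for `seconds` seconds and
    return how many logic steps executed. The logic rate stays ~30 Hz
    regardless of how fast we render."""
    accumulator = 0.0
    logic_steps = 0
    render_dt = 1.0 / render_hz
    for _ in range(render_hz * seconds):   # one iteration per rendered frame
        accumulator += render_dt
        while accumulator >= LOGIC_DT:     # catch the logic clock up
            logic_steps += 1
            accumulator -= LOGIC_DT
        # here the engine would draw state interpolated between logic steps
    return logic_steps

# rendering at 240 FPS or 60 FPS both execute ~60 logic steps in 2 seconds
print(run(240, 2), run(60, 2))
```

This is why, in such engines, raising the graphical frame rate doesn't make input get sampled any more often; it only makes what you see smoother.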
Very interesting topic; it's fascinating how many things there are that we don't know about yet
Wow that was really unexpected. Great content!
I wish there was a way to have the feature automatically turn on every time you're in a scenario with 99% GPU usage, and then turn off again when below that. It would make more sense than manually going back and forth.
Thank you for doing this, I’ve never seen this correlation in a single other report, but I’ve FELT it. I’ve been capping my frame rates for years and things just feel smoother that way, it’s interesting to know the reduced GPU load is a big part of that effect. Even using Gsync, I try to set my in game settings so that they seldom drop below 60 (whatever your refresh rate is), so there’s plenty of “headroom” for the GPU left most of the time. Awesome findings
What is your monitor refresh rate and fps cap? Do you use RTSS?
darth_frager I used to use RTSS for a lot of games, but since getting a G-Sync monitor I haven't found it as necessary, mainly because when you have G-Sync set up properly it's already limiting your FPS to your monitor refresh rate. I'm on a 60 Hz monitor so that's 60 FPS. From there I just try to adjust in-game settings so that I'm always hitting the 60 FPS cap outside of brief dips, maybe to 55 FPS in demanding but infrequent places. This way it feels like you're running with no vsync, zero noticeable input lag, but you don't get any screen tearing.
@@mrnelsonius5631 You say it feels like "you're running with no vsync", but exceeding your refresh rate does just that. I'm confused. I have a GTX 1080 + 6700K and a FreeSync 2 monitor, 144 Hz WQHD. In the games where it matters there's usually a frame limiter present, so that would make RTSS obsolete for me, I think. I think I should use G-Sync/FreeSync. I am getting avg 110, standard deviation 5 frames. This means 99.7% is within 95 and 125 FPS. I wonder if I should limit FPS somehow and how that would affect lag. I feel like constant higher lag is preferable to lower average but variable lag. Or should I only limit it if some game exceeds my refresh rate?
darth_frager I’ve never used a high refresh monitor, but my understanding is that you shouldn’t need to frame limit unless you’re going over 144 FPS. Some games will stutter without any cap though, apparently. And you’re right about Gsync having a vsync wall to an extent, but I also frame limit to 60 in game settings when I can, and for whatever reason that combined with Gsync creates no noticeable input lag for me. I can turn on vsync proper and disable gsync and instantly feel it. I think gsync behaves differently as long as frames are being limited to refresh somehow. The site Blurbusters has the best info on all this. Check it out if you haven’t
@@mrnelsonius5631 Limiting the in-game framerate to 60 might effectively mean your frametime can't be lower than 16.67 ms. With RTSS, some margin is needed to achieve this.
I think I settled for G-Sync on + V-Sync ON in NVCP. I understand that no V-Sync results in tearing even with G-Sync on.
I don't know yet whether to limit in-game to 141, or lock the framerate at an always-achievable number so the GPU isn't maxed out causing this bizarre lag presented in the video (depending on the game: for Apex Legends and other demanding games I'd go for around 100 and lower graphics settings if needed; Quake Champions or CS:GO 141, but I gave up on these anyway). I'll test both ways as soon as I can.
As for Nvidia anti-lag I think I'll settle for ON. Worst case it still shouldn't hurt much with the GPU not maxed out, and it could prevent queuing multiple frames, as I understand it. What's the default BTW?
I own a FreeSync 2 144 Hz monitor and a Pascal GPU
@Battle(non)sense Does Low Latency Mode mean the GPU skips frames if it's not ready? And will that show visually (even slightly)?
Ooof. some Interesting findings! Think I need to invest some time to test my games. Thanks for your great videos!
The advantage of a stable framerate/frametime and lower input lag is obvious, therefore capping the framerate to target 95% GPU use is the way to go. I was capping my framerate previously, only to have stable FPS. Knowing I also get lower input latency is awesome. Thank you very much for your detailed and helpful analysis.
Definitive answer please: when not maxing out the GPU, or when playing a CPU-bound game, are there any advantages to leaving this on (not Ultra, On)? For years everyone (including battlenonsense himself) suggested max pre-rendered frames = 1 as the best option (the equivalent of the On setting, not Ultra). Is this info outdated? Thanks
Wish someone would answer this
Great vid as always. I find this super interesting for two reasons. The best method you found here is similar to what I already use, since I frame cap for G-Sync to work, so it is good to know this was a good setup for input lag as well. But I see a lot of people who swear by uncapped frames and don't care about the gsync/freesync options, as they claim the higher frames give them better input lag results. You seem to be shattering that idea.
Wow, great video, i can imagine the work put into this.
This explains why ultra low latency mode made CSGO feel not so smooth to play.
Surprising discovery, congratulations. So we were wrong to think that playing at maximum fps decreases input lag; limiting the fps actually gives us lower latency and better fluidity in the game.
Nice video. I've been doing this for a long time. I did it for pacing, and lower temps, but I always adjust resolutions and settings, so that the highest usage still doesn't max out the GPU. Frame capping is great. Even on adaptive sync. Just find a per game cap that makes sense.
Very interesting. You're the first to go in-depth on this topic; everyone else just says turn it on and move on. I'm wondering now if I should turn it back off, as I cap my fps to 160 for most games.
Good analysis but incomplete. Perhaps it's best to wait for answers from AMD/Nvidia, but more work is needed here. How does Radeon Chill compare to a capped framerate? How about testing a game where the input polling rate isn't asynchronous from the framerate? What is a better gaming experience for competitive gamers: lower framerates with lower input lag, or higher framerates with more input lag? Do the lag characteristics change with G-Sync/FreeSync enabled? This is an interesting area to keep digging at, please keep up the great work!
Great video. Will you also be able to research monitors that use G-Sync, FreeSync, and DyAc, and how the anti-lag modes help there?
You just gained a new subscriber, Thank you so much for this sir.
You, man, are moving the boundaries of our understanding of gaming. THANK YOU
What happens if the CPU is the bottleneck at 100% load? If you cap the fps lower than what the CPU can handle, would it make a difference in CPU-heavy games too?
OK, I am CPU bound, so clearly using Ultra Low Latency would be a bad idea. However, should I turn Low Latency off entirely, or just have it set to On? Please respond, as this is the only uncertainty I have after watching your video.
Good job Chris, thanks for sharing this knowledge!
Holy nuts, I was like "what?? how???" while watching this. It's super interesting, gonna give it a try. Thanks dude, I'm a sub now
I recently discovered this for myself tuning and testing my Acer Nitro 5 with a GTX 1050: the frametime delay difference of a locked 60 fps 1050 at 80% GPU usage vs an unlocked 150+ fps 1080 at 100% GPU usage. It legitimately broke my brain a bit, having one be so much smoother while the other was so much more blatantly responsive. Input lag reduction is where it's at, great video.
can you help a brother out? I play R6 Siege on 60hz. Should I use this mode? and if yes then should I do it with V sync off or on?
I got the same Acer Nitro 5, how do I do all that crap?
@@W4leed. yeah you can use low latency mode, but I would turn off vsync for sure
@@W4leed. vsync literally always adds input lag unless you use gsync or freesync so it's recommended to keep it off.
Yo
I've known about this for a long time. It's mysterious. But about NULL/Anti-Lag: testing with an uncapped framerate should do very little, since the render queue only fills up when the CPU outpaces the GPU. Limiting the framerate is not a valid case, because then the engine waits and does not prepare a new frame. The worst case is when you are limited by Vsync: the engine will then prepare new frames until the queue is fully filled up. This is the real cause of Vsync input lag. You can then understand why the "58fps trick" works: by making the engine wait instead of filling up the queue, you reduce input lag.

If you know how most game engines are put together, you will know that calling the Swap() function in your graphics API with Vsync enabled will cause the thread to stall until the frame is presented. Since the driver and hardware are a black box and we don't know exactly what happens when you call this function, this is where the mystery begins. In order for the render queue to work, the driver has to present a false backbuffer so that the engine continues with the next frame instead of stalling. For the render queue to fill up, the CPU framerate has to be higher than the GPU framerate. If Vsync truly capped your framerate to exactly 60, then the queue could not fill up.

This presents another mystery: the game engine measures the framerate by using the high precision timer to count the nanoseconds between calls. If the CPU produces frames faster than 60 to fill up the queue, then the delta time information should also be lower than 16.6ms, but this doesn't appear to be the case. Does the driver fudge the timer as well?

Now the most mysterious part, which is what you've run into here. Input is taken by the engine before the frame is processed so that it can be used in the next state of the game. Input is taken instantaneously at that moment and does not update until the next frame.

The lag should always be at least the time between when the frame is processed and when it appears on the screen, which is Input + Wait + Swap() + Queue + Sync (even without Vsync the monitor still has to cycle) + Pixel Response Time. Capping the framerate should make the engine wait a certain period of time before processing the next frame, which should be extra lag. BUT when you cap the framerate, you get lower input lag than if you are limited by hardware, and this reduction appears to be proportional to your potential maximum framerate. This implies that the frame is actually processed just before it appears on the screen, *even in the case of Vsync with a queue of 1!!!!!* But the thread should be stalling, which should add lag, not reduce it. Still, the result is that you can have input lag comparable to 100fps with a cap of 20fps, but only if you could reach 100fps without the cap.
This is what you'll truly have to explain to me, please forward my comment to your contacts.
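The queue behaviour described in the comment above can be toy-modelled: when the CPU submits frames faster than the GPU retires them and the driver allows a queue of depth N, each displayed frame carries roughly N extra GPU frametimes of input age; when the engine is made to wait (frame cap, or a CPU that can't outrun the GPU), the queue never fills. A rough back-of-envelope sketch — all numbers illustrative, not measurements from the video:

```python
def display_latency_ms(cpu_frametime_ms, gpu_frametime_ms, queue_depth):
    """Toy model: once the render queue is full, a frame's input was sampled
    queue_depth GPU frametimes before it reaches the screen."""
    if cpu_frametime_ms >= gpu_frametime_ms:
        # CPU-bound or frame-capped: the engine waits, the queue stays empty,
        # latency is roughly a single GPU frametime.
        return gpu_frametime_ms
    # GPU-bound: every displayed frame sat behind queue_depth queued frames.
    return gpu_frametime_ms * (1 + queue_depth)

# GPU-bound at 60 fps (16.7 ms/frame) with 3 pre-rendered frames queued:
print(display_latency_ms(5.0, 16.7, 3))   # ~66.8 ms from input sample to display
# Same GPU, but framerate capped so the engine waits instead of queuing:
print(display_latency_ms(16.7, 16.7, 3))  # ~16.7 ms
```

This simplification ignores scanout and pixel response (the "+ Sync + Pixel Response Time" terms above), but it shows why shrinking or bypassing the queue dwarfs every other tweak.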
Dude...I'll TRY to read this, but you NEED to make paragraphs. PLEASE. PLEASE make paragraphs. About every 8 lines of text, you should divide your text... I'll even show you. I'll copy and paste this, and then use notepad to screenshot it and link it here.
I love people like you.
Thanks for this detailed explanation and breakdown. Liked and subscribed.
Wow, good job! Best vid in a really long time!
You sir are a Legend ! Thx for making this Video
If your GPU usage is less than 99%, you can still be limited by the GPU a certain percentage of the time.
It's the individual frames that are GPU-limited or not, so an "immediate GPU usage" or "average GPU usage over the last second" will not tell you what's up.
Which is exactly why with 80% GPU usage you can gain fps by upgrading your GPU, with 99% GPU usage you can gain fps by upgrading your CPU, and you can often gain fps by upgrading your RAM.
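The point about averages can be illustrated with per-frame timings: a trace can sit well below 100% average GPU usage while half its frames are still GPU-limited. A toy sketch with invented timings:

```python
def bottleneck_per_frame(cpu_times, gpu_times):
    """Classify each frame by whichever side took longer to produce it."""
    return ["gpu" if g > c else "cpu" for c, g in zip(cpu_times, gpu_times)]

cpu = [6.0, 6.0, 6.0, 6.0]   # ms the CPU needed per frame
gpu = [4.0, 9.0, 4.0, 9.0]   # ms the GPU needed per frame
print(bottleneck_per_frame(cpu, gpu))  # ['cpu', 'gpu', 'cpu', 'gpu']
# Average GPU usage here is far from 100%, yet half the frames are GPU-limited,
# so a faster GPU would still raise the framerate of this trace.
```

Which is why per-frame profiling (frametime graphs) reveals bottlenecks that a single usage percentage hides.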
omg , good job testing these!!
This cleared many doubts! great video