Tech Focus - V-Sync: What Is It - And Should You Use It?
- Published: 29 Sep 2024
- V-Sync is one of the most important - and controversial - options in PC gaming, but what does it do, how does it work and should you use it? Join Alex for a new Tech Focus.
Subscribe for more Digital Foundry: bit.ly/DFSubscribe
Support the DF Patreon to help the team more directly and get pristine quality video downloads: www.digitalfou...
20 years of tearing, hopefully my grandchildren won't face this monster.
It has been a long battle, brother.
50,000 people used to experience tearing here... Now it's a ghost town.
Hmmm... 20 years of vsync, never had to worry about tearing! Of course, I don't pretend I'm going to be a pro CS:GO player and obsess over negligible amounts of input lag, like it even matters unless you're a pro player!
gsync
We just need those G-Sync or Freesync or whatever variable refresh rate monitors to improve.
Input latency with VSync happens when the input polling is blocked on the same thread as the rendering (inputs during Vsync are held until the start of the frame). id Tech 5 is a good example of an engine that moves the input polling onto its own thread; inputs are consumed the moment they happen rather than the start of the next frame, so there should be less perceived input latency.
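The cost of "inputs held until the start of the frame" can be put in numbers with a toy model. This is a simplified sketch with illustrative numbers, not id Tech 5's actual scheduler: it just assumes a 60 Hz loop that only reads input at frame boundaries.

```python
import math

REFRESH_HZ = 60
FRAME_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def held_input_latency_ms(event_ms):
    """Latency added when input is only polled at frame boundaries.

    An event arriving mid-frame is invisible to the game until the
    next frame starts; a dedicated input thread (as described above)
    would timestamp and consume it immediately instead.
    """
    next_boundary = math.ceil(event_ms / FRAME_MS) * FRAME_MS
    return next_boundary - event_ms
```

An event arriving 4 ms into a frame waits another ~12.7 ms before the simulation even sees it; averaged over random arrival times, the penalty is roughly half a frame.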
Another cool thing I was investigating a couple of years ago was using late-frame re-projection with mouse-input to have absolutely, incredibly smooth first-person mouse-look - but this came at the cost of some subtle pixel smearing and increased GPU work-load, but it's something I think competitive FPS games should consider as an option - poll mouse input at the start of the frame, then poll a second time just before VSync and do re-projection. VR games use this technique (as does the PSVR).
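The reprojection step described above can be sketched with a simplified planar-projection model. Everything here is an assumption for illustration (the function name, the 90° FOV, the 1920 px width); real engines reproject in a shader and VR runtimes do considerably more than a horizontal shift.

```python
import math

def late_yaw_shift_px(late_delta_yaw_deg, hfov_deg=90.0, width_px=1920):
    """Horizontal pixel shift that re-aims a finished frame by the
    mouse movement polled just before vblank.

    Planar projection: x = (w / 2) * tan(yaw) / tan(hfov / 2).
    The edges of the shifted frame have no rendered data, which is
    one source of the subtle smearing mentioned above.
    """
    half_fov = math.tan(math.radians(hfov_deg) / 2.0)
    return (width_px / 2.0) * math.tan(math.radians(late_delta_yaw_deg)) / half_fov
```

Even a fraction of a degree of late mouse movement maps to a visible shift of several pixels, which is why the technique makes mouse-look feel so immediate.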
Personally, I can't stand the input delay of VSync. It just absolutely ruins FPS games for me; as much as I hate tearing I will live with it for the sake of avoiding weird floaty mouse input.
Visiting in 2024 - this video answered sooo many questions I had. You covered all the bases - VSync AND adaptive sync/VRR. An evergreen video. Great job on this one, Alex!!
Really hope television sets with VRR become commonplace soon. AMD freesync on some Samsung TVs is a start but it's nowhere near as polished as PC freesync/gsync yet. HDMI 2.1 was supposed to lead the way but the industry is dragging its collective feet on it.
I feel the same way about HDR with monitors. One day we will get the perfect display.
HDMI 2.1 won't even be on 2019 TVs, very frustrating... HDMI 2.1 won't be ready for consumers till the end of 2019, so 2020 TVs will get it. VRR on my Samsung Q8 and Q9 is amazing. Sometimes I hook up the PC and play 1440p @ 120fps (120Hz); at 4K it holds 40-60 really well... Xbox One X games are starting to take advantage too with unlocked frame rates.
Very helpful. Had no idea what v-sync did before this and why it said it would up my frame rate when it actually dipped it. Thank you.
i wish they would teach us cool stuff like this at school
A noteworthy disadvantage of using variable refresh rates (G-Sync, FreeSync) is that you can no longer use backlight strobing / motion blur reduction.
How about turning off motion blur in the first place?
This is an amazing explanation. Been looking for something like this video for years. Finally I can refer to this video every time someone asks what vsync is or does.
Digital Foundry never misses a beat with their videos explaining technology.
Great video and thank you very much.
Next topic, optimization of PC components, choosing the right ones for the right resolution, airflow, positioning of the fans, A to Z guys, A to Z ... I want more like this ... Hell of a good work ;)
Another great Alex video with great music.
V-Sync is almost always on for me, I much prefer the judder from dropping frames to screen tearing, I can’t stand that. I only have it off if my game has issues with it and my PC can run it solidly and it has a framerate cap option.
Using a 144 Hz display has pretty much eliminated tearing for me. The display is fast enough that even if the GPU output doesn't match perfectly, the offset is so small it's no longer visually distracting in most situations. Also, modern displays are often smart enough to wait for a full frame before displaying it, pretty much syncing on the monitor's end.
Frame limiting with MSI Afterburner works best with vsync, because it smooths out frame times.
Dante GTX Can i use Afterburner to lock a 60 FPS game into 30 FPS?
@@CanaldoZenny You can type your desired fps in settings
What should I set the FPS limiter and vsync to on a 75 Hz monitor?
@@simorx580 The same as your monitor's refresh rate (in Hertz).
@@DanteGTX I set 75 Hz and I've still got tearing.
This video is too underappreciated.
I have been a fan of adaptive v-sync myself
I noticed that you aren't using the Combat Evolved Refined mod for Halo on PC to restore the lost visual effects. I highly recommend it along with the Chimera mod.
I want to know why, in certain games, enabling V-sync messes with the mouse control. If you move the mouse too quickly, which is kind of the point of using it instead of a gamepad, the actual input slows down instead of accelerating. Sometimes this can be ameliorated by enabling V-sync at the driver level instead of ingame, so that makes me question how the developer tied mouse input to frame output.
I seem to remember this being a problem with the first Dead Space, but they had fixed it by the second one.
Watching this video made me rethink the knowledge I have about V-Sync and double/triple buffering. My knowledge is pretty much primitive and obsolete, I guess, as I learned all this in the DirectX 5 era (and my game developer years were 10 or so years ago). I still remember that we would draw to a surface and then blit it so it displayed on the screen (hence double buffering).
Personally, I can't think of a reason why, with double buffering, the frame rate drops by half. I'd have thought the frame would simply be swapped at the next vblank. Is it because of how the hardware is implemented?
Blitting to the screen implies that actually no double-buffering is actually being done at all in the sense meant by this video, just a single framebuffer that's being scanned out at the display's frame rate, into which you are expected to draw. This is typical in non-fullscreen mode when not using a compositing window manager (e.g. Windows XP or classic Mac OS). In this case, to ensure no unfinished render-in-progress ends up being displayed you need* to render off-screen and once the render has finished blit it to the framebuffer, either immediately (no "VSync") or at the next vsync (or more accurately, "vblank", i.e. the start of the vertical blanking interval). The analogue of triple buffering would be using two offscreen buffers to be able to start rendering the next frame instead of just waiting.
* Of course technically you don't *need* to render off-screen if you can ensure you can render fast enough to do it entirely during the vertical blanking interval. If your game renders top-to-bottom (not typical for 3D games, but more common for 2D) you can actually still be rendering the frame _while_ it's being scanned out, as long as you stay ahead of it ("racing the beam"). This eliminates tearing, has less input latency than any of the techniques in this video, and avoids the memory use of double/triple buffering, but is incompatible with the way 3D is rendered (unless raytracing) and the very strict timing requirements tend to tie the game to specific hardware and require that there's no OS getting in the way of things. It was a typical technique on old consoles (e.g. SNES).
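The framerate-halving question above has a one-line answer: with double-buffered V-Sync, a finished frame can only be swapped at a vblank, so the effective frame time is the render time rounded up to a whole number of refresh intervals. A minimal sketch of that arithmetic (illustrative only, not tied to any real graphics API):

```python
import math

def vsynced_fps(render_ms, refresh_hz=60):
    """Effective framerate under double-buffered V-Sync.

    The swap waits for the next vblank, so a frame occupying any part
    of a refresh interval costs the whole interval. Missing 60 Hz by
    even 1 ms therefore drops you straight to 30 fps, then 20, 15...
    """
    interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(render_ms / interval_ms)
    return 1000.0 / (intervals * interval_ms)
```

So 15 ms of work yields 60 fps, but 17 ms yields 30 fps: nothing in the hardware is "halving" anything, it's the quantization to vblank boundaries.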
DF should test out other PC options like Enhanced / Fast sync, along with Rivatuner for capping to 60 / 30 fps.
Fast Sync was covered. 10:57
Love the video; my only crit is that the moving gameplay while you're trying to show something else on screen is super distracting.
With adaptive vsync I'm able to run Doom 2016 on a GTX 1050 at near constant 60 FPS at 1080p with everything on ultra, and the tearing isn't so distracting
It really is a miracle
I'm pretty sure you get lagless vsync if you can run a game at a high FPS in a window. I do this with Source engine games sometimes because they run fast. The way I tested for screen tearing was with my camera, since I usually don't notice screen tearing. No tears, and net_graph 1 showed hundreds of FPS, so there wasn't a cap. I had always overlooked the fact that the Windows desktop doesn't have screen tearing by default, but honestly screen tearing isn't very distracting to me at such a high FPS, and fullscreen is nicer unless you have borderless windowed.
I couldn't follow the technical stuff but at least I got the gist of it.
Love these sort of videos!
Do one for G-Sync, FreeSync and adaptive sync. Then one for AA and all the types of AA.
There are two things I don't get. What's that additional frame of latency in the first type of triple buffered vsync, and why does an adaptive refresh rate cause strobing?
Ah, another Tech Focus video - awesome!
Every game in existence should have an option for adaptive V-sync, it's objectively the best solution outside of investing in a G-sync monitor.
The game doesnt have to support it only your monitor and gpu does.
@@_Thred_ No. For example: Batman Arkham Knight uses adaptive vsync. No need for an AMD graphics card or FreeSync monitor.
I don't completely agree; if you truly hate tearing (even infrequently) it's not necessarily the best. I noticed this on the Xbox One X version of Call of Duty: WWII, where the in-game cutscenes were plagued with tearing, whereas the PS4 Pro version used triple-buffered vsync, which meant stutter instead. I'd personally take the stutter over tearing, but that's why I consider it more a subjective matter than an objective one.
If you have an Nvidia card you can use the control panel to turn it on in the vast majority of games.
@@Archivist42 I don't believe that's the same thing as adaptive vsync.
Input lag with vsync can be adapted to, but constant frame tears are horrible no matter how long you play.
Thank you so much. How does VSync work with 240hz panels?
That Condemned music at the start took me back a bit, wish Monolith would make a new one. A new FEAR would be cool as well, but good this time.
This is the best channel!!!
Is there also an "H-sync"? Could horizontal syncing maybe lessen tearing, since vertical lines mostly run straight up everywhere on the screen?
And could a variable starting sync point help anything? Like masking where the tear will come in.
H-Sync is also a parameter of video displays, quite significant in the analog CRT days. For example, VGA consists of 5 lines: R, G, B, H-Sync, V-Sync. But it's very different from what you're trying to visualize. H-Sync happens after every (horizontal) pixel line is drawn. V-Sync only happens once per frame. If H-Sync goes, well, out of sync, then the display will look like skewed garbage. It can't be used to fix tearing.
Hey, Alex! Great video, but isn't "True" Triple-Buffered V-Sync, as you said, actually called Fast-Sync? I tried looking up "True" Triple-Buffered V-Sync but I couldn't find anything.
Yep.
Wait, was that a Heidegger reference in the beginning?
I could seldom use high-end hardware, so games not finishing rendering within one frame time was a very common problem for me. Well, tearing, sudden brutal drops in frame rate and increases in input/visual latency are all problems for me, but I think when frame rates go down, tearing is the lesser evil. So I often ended up disabling vsync.
I hope for adaptive sync gaining ground so even cheap displays get it by the time I replace my current one. My last monitor lasted for like 11 years (but I repaired it and have it as a backup) so there's time. Probably.
The very first game where I experienced screen tearing was Ninja Gaiden (2004) on the original Xbox. Back then I had NO IDEA wtf this was. I thought my Xbox was about to die or something...
Great work!!!
17:15 Nvidia now supports FreeSync; not all monitors are compatible, though.
Screw Nvidia for ruining the experience of the majority of gamers by forcing them to buy a G-Sync monitor. G-Sync could have just been a standard for high-quality monitors with "superior smoothness". But it should be a choice, not shoved down my throat. Thanks for holding gamers back. Now I'm saving up for an AMD... I will have to wait for that variable sync smoothness.
You shouldn't even be able to ignore open standards. That is anti-consumer and anti-competitive.
You're trying to argue that, because Nvidia cards only support Gsync, if you want a variable refresh monitor and own an Nvidia card, you have to buy a gsync monitor, which is anti-competitive and anti-consumer, but it's actually the complete opposite, as is evidenced by the fact that you've said you're going to buy an AMD graphics card next time you upgrade. Nvidia having gsync has created a competition between free sync and gsync and further fuelled the competition between nvidia and AMD, which increases competition in the market, which will then drive prices down as they try and compete for consumers, which is pro-consumer. In other words, it's the exact opposite of what you've claimed it to be and it is an excellent thing for gamers.
@@David-ud9ju This might be true if both companies were on even footing to start with, but they're not. Nvidia is the only viable option for high-end gaming GPU's, and their introduction of exclusive proprietary technology was designed to make the status quo as sturdy as possible. People buy G-Sync monitors because they have Nvidia GPU's, and then choose Nvidia for their next GPU upgrade because they have a G-sync monitor. This is anti-competitive and pro-status quo.
I used vsync when I had a 60Hz monitor. At 144+ its useless.
True brøther
I have a 240hz monitor. Microstutters and tearing are still there, even if they are less noticeable. Vsync is still needed.
Same with 120Hz. Rarely do my games exceed 120fps on ultra settings. Usually 60-100 depending on game
not true. why do you think gsync monitors exist?
@@Relex_92 Minimising the problem to the point where you don't notice it.. is the same as fixing the problem completely
15:33 is that Timmy Tencent?
Enabling vsync in multiplayer games is a joke. The amount of input lag you can get is insane.
@@Neiva71 Depends on the game, and if you're also capping to 60fps (or whatever your screen refresh rate is), it can reduce the lag significantly
I'd prefer nearly anything over tearing.
You can also try this trick: disable some poorly implemented in-game V-Sync, then force V-Sync from NVIDIA/AMD control panel, works for some games.
@@Neiva71 as i said
My cousin doesn't feel the input lag either... but I definitely do when I play on his PC... his senses just aren't that sensitive... I'm sure it's the same with you people... some people still say there is no difference between 60 and 144 Hz.
So you should still have vsync turned on even if you are using G-Sync? Vsync kicks in if you pass the refresh rate of the monitor, and if you are below the refresh rate it's G-Sync that's working?
Hi @Digital Foundry, please tell me how I can limit my games to a smooth, console-style 30FPS with VSync on a Radeon RX 480. My double-buffered VSync has never hard-locked to 30FPS as you described; it averages any framerate less than or equal to 60FPS, even in games that support it. I have only seen my Intel HD 3000 graphics do this with in-game VSync.
Yeah, it took me a while to find out. I remember playing Doom and seeing lines appear, thinking my new laptop was broken, until I found this out.
Anyone else having problems with DF videos on Chromecast? The video plays fine but the audio sometimes freezes and comes back after 2-3 seconds. Only with Digital Foundry videos. Weird.
Not sure it informed me when to use it...
Unless you play very fast paced FPS games, none of this matters, just go with regular vsync or adaptive sync.
I am eternally pissed off that Nvidia pulled the G-Sync upgrade module from the market. I have one of the few monitors that can take it, and would love to have it installed. But that's a big no-no. At the time, G-Sync-enabled monitors were way out of my price range...
Great video.
Make a video about running FreeSync on an Nvidia GPU. Also get a CRT, problem solved.
So that's why Serious Sam looked so jittery, I was always curious about it.
15:15 I don't see adaptive sync here, it's just all standard v-sync options, iirc triple buffering.
Unless the game has insanely maddening tearing (only RAGE comes to mind) I usually play with it off. I've grown to accept tearing through the years, similar to AA. The methods to get rid of them (both tearing and aliasing) introduce a whole other host of issues (weird fps jumps for vsync, very high performance cost for MSAA, vaseline screen for FXAA/SMAA and vaseline plus ghosting with TAA), so I look at those issues and go: eeeh, I'll just stick with the jaggies and the tearing, they ain't too bad compared to the side effects.
No longer have this issue, adaptive sync is a nice luxury.
But adaptive vsync is pretty useless, because it deactivates vsync below your refresh rate and you get tearing again instead of micro-stuttering. Not a good trade for me.
@@Sultansekte adaptive sync, not adaptive vsync.
A.K.A. VESA adaptive sync (Freesync) / G-sync. Otherwise known as variable refresh rate.
Ironically I'm watching this on my work notebook with some arcane OPTIMUS setup... it has screen tearing down to 360p.
Informative video. Thanks.
Ah yes, nothing like waking up and learning about V-Sync.
but what is g sync
This literally just happened to me haha woke up, brewed coffee, checked youtube, opened this xD
@@melxb It's basically an Nvidia version of vsync that's built into the hardware, and it has less input lag than vsync.
@@t4ky0n how does free sync compare
@@melxb if you're asking about what it is, it's the AMD version of G-sync im not certain about the difference in performance but im about to look it up anyway so ill get back to you on that.
It’s shocking how many last-gen games suffered from screen tearing and the average person had no idea it was a thing
Loving the pc focused content keep up the good work DF.
Why does this "PC master race" thing shine through every PC-focused comment? I don't get why so many PC players are such platform Nazis.
It shouldn't matter where one plays. Can't we all just enjoy games and technology?
@@Ample17 seriously dude?? Have you read the console wars comments on their console content? btw nothing in that comment hinted at the "master race "perspective.
for this comment to be true you would have to include consoles in the category of 'pc focused content', this comment is a self-contradictory statement!
Ermmm. Not just pc
@@Ample17 Lmao I don't know how you got all of that from his comment but OK.
It's really funny though, console gamers are more rabid when it comes to being "platform nazis". But when PC gamers do it OH GOD NO PLZ THINK OF THE CONSOLE CHILDREN!!
I had to explain VSync & triple buffering to a developer the other day when they were perplexed about their engine not exceeding 60 FPS and their profiling tools reporting a mysterious chunk of CPU activity that took up more than 12 milliseconds of their 16.666ms frame. It was a bit of a weird conversation to have with someone who was in charge of a game's development.
@@Relex_92 There's actually too much information to keep up with. If the developer has a small team then it's possible.
Unfortunately that, or usually even worse, is very common among software devs. I know many who've made applications for Windows for 20 years and STILL somehow don't even know the very basics of how the OS works (and therefore have no knowledge of how to make a decent application that does things "right" for that OS)... and I'm talking not knowing things like "you need to reboot your PC every once in a while to prevent glitches and poor performance, or as one of the first things to try if something stops working properly". It's that bad for MOST developers today, who - keep in mind - are usually developing apps and not games, since the talented ones often go into game development.
@@zombievac the talented developers going into game dev is something I noticed recently. I was out with a friend and I met the owner of a software company, he asked me what languages do I know and I mentioned C/C++ and he lost his shit - saying that there's such huge demands for C/C++ developers right now. I thought it was weird because I'm used to everyone around me knowing C/C++, not knowing the language is a weird thing, but then I realised that the problem is the C/C++ developers these days move immediately into high performance game development or some critical application development for very large projects, leaving the rest of the software industry oddly starved of low-level, hard-core developers.
I honestly find it not that hard to believe. I can run a car but I had no idea how the thing actually works, for example
To be fair, the manual for most game engines will just say it prevents tearing and limits your framerate to the device's refresh rate.
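The "mysterious chunk of CPU activity" in the story above is easy to model: when a present/swap call blocks until vblank, the wait is billed to the calling thread, so profilers show it as CPU time. A generic back-of-envelope sketch (not any specific profiler or engine API):

```python
import math

def present_block_ms(work_ms, refresh_hz=60):
    """Time a blocking present() spends waiting for the next vblank.

    A profiler attributes this wait to the render thread, which is why
    ~4 ms of real work at 60 Hz can show up as a ~12.7 ms "mystery"
    chunk filling out the 16.67 ms frame.
    """
    interval_ms = 1000.0 / refresh_hz
    deadline = math.ceil(work_ms / interval_ms) * interval_ms
    return deadline - work_ms
```

The faster the real work finishes, the bigger the mystery chunk looks, which is exactly the counter-intuitive signature that confuses people.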
It would be great to have an updated video on this topic, since now we have things like Fast Sync and RTSS Scanline Sync. Both options are great to use with motion blur reduction (which can't be used with FreeSync/G-Sync).
what DF refers to as "true triple buffering" is nvidia fast sync / amd enhanced sync.
But yeah, I use RTSS Scanline Sync and it is like black magic to me. I've tried to look into how it works multiple times, but it's like a black box with zero documentation anywhere I could find online.
I feel like I mostly understand it... but how the hell does it just know how to move the tear off screen like that? And why does it work exactly the same on classic games and new games where I can have 60-400 FPS, but when I try to play a really old game like Half-Life it breaks and needs to be reconfigured? It's so bizarre.
I’m surprised this video was only made now
This video didn't need to be made. You just need a ten second clip saying "Vsync is the work of Satan and developers should just optimize performance better."
@@GuySocket they could do that if everyone had the exact same hardware, and wanted the same resolution and framerate. We don't.
@@GuySocket With high refresh rates you get even more tearing without V-Sync (or some other sync method).
@@GuySocket sounds like you need to watch the video because that's not what causes tearing.....
I've wondered what it does and have seen suggestions about what it does everywhere for performance.
It turns out the only correct option is to disable it, for casual players who don't have those expensive graphics cards.
Should have mentioned the new screen tear eliminating feature of Riva Tuner Statistics Server called "Scanline Sync". Works wonders, especially for older games!
Thunder Run explain si
This shit's perfect for emulators.
Sounds interesting, thanks for letting me know about it.
Yeah, but it took SO MUCH time to arrive. They should have included this back in the late 2000s.
Yeah, for older games definitely. I believe you need 2x or more of your refresh rate for a smooth experience; otherwise I get judder. But it's great when it works perfectly.
I'm loving those Serious Sam tracks.
SS in VR is pretty awesome :)
SS in VR is SERIOUSLY awesome! I'm doing a live VR walkthrough these days, check my channel for it!
Serious Sam VR is magical. The ability to aim dual guns in any direction like you're in Equilibrium? Hell yeah!
VR is something I've never tried. Serious Sam in VR looks really awesome though!
I'm a PC gamer who pretty much *always* turns V-sync on no matter what simply because I legitimately *despise* screen tearing (it actually legit hurts my eyes)!
I know it adds input lag but honestly, I don't really care too much about that because since I'm not at all interested in speedrunning or competitive online multiplayer, my gaming tastes are such that the lag V-sync adds doesn't really affect me all that much.
I find that it depends on what framerate you are getting. If you are getting just a bit above your refresh rate, the tearing is horrible. If you are getting way above, I don't notice it. I always turn vsync off, and try to adjust my settings so I get 80-100 fps.
To get best results, before even changing in game settings get your PC performance settings and GPU control panel settings right for your setup.
I use it for singleplayer games. One thing that could help you out is Fast Sync, which eliminates screen tearing while not capping your fps to your monitor's refresh rate, and it doesn't add input lag. It's only for Nvidia GPUs from what I recall, though; you could also just buy a G-Sync monitor if you have a big wallet.
Funnily enough I haven't had screen tearing in about 8 years or so. Not sure how but I guess I am doing something right.
FreeSync/GSync is the only sync I'll ever use, I hate screen tearing but I hate input lag even more
FreeSync/GSync is *better* than VSync, it makes the game feel smooth even when I hit low FPS.
I've always been using V-sync, whatever the outcome. Tearing is an absolute no-no for me.
ive used gsync for the past 3 years and never looked back.
Me too. V sync can be borked on some games, g-sync just bypasses all that nonsense and runs your games smooth and lag free, as long as you cap fps a few under the monitor hz limit.
Same. Been using a 1440p 165hz overclocked screen /w gsync. Its faaaantastic
word, gsync is the ssd of monitor upgrades.
@ so for 144hz you cap it at 142? or is it guesswork? Most games just have a 144 fps limit so do you use RTSS?
X34 predator with gsync here! Love it!!
On my pc, if I can reach over 90 fps I just disable vsync entirely as it generally doesn't distract me on the games I play. If I play a game that runs at 60-70fps on average with the chance of it going below that, I use adaptive vsync. If a game goes below 60fps, I care more about maintaining the best input latency I can rather than if there's tearing or not.
My PC currently has a gtx 980 in it, and since Nvidia seems to not want to make gsync affordable, I'll probably switch to an AMD gpu on my next PC so that I can get variable refresh rates without having to destroy my wallet.
Agreed, I just turn down settings until I get a solid 120fps, I would rather play 900p 120hz than 1440p 60hz
But why? Are you a pro with lots of money on the line? Vsync has been good enough for a long time now for the input lag to be nearly negligible if your PC has decent specs like you mentioned, and tearing is so much more distracting (and vsync has the nice benefit of capping your FPS to your refresh rate, because variable FPS, and therefore variable input lag, is even worse since you can't adjust to it).
Well pro or not I still dont want to be handicapped by the input lag it introduces. It feels sluggish and I don't like that feeling.
On a 60 hz monitor in many games, if you get over 100FPS it can still look way way way more jittery than 60fps vsync. Sad reality.
@@zombievac That's probably why most games since its inception gave the option to turn V-Sync off, right?
You don't have to be a pro to appreciate occasionally lower input lag over the same, but higher, input lag all the time. Also, that's not correct: if you drop down to the next V-Sync step, your input lag doubles within a single frame. That's more inconsistent than V-Sync off will ever be.
What I still don't fully understand is why I get tearing on my 144 Hz screen even when I have a constant 144 fps, no matter if vsync is on or if I use a frame lock.
Turn vsync on in nvidia settings
I love that you're using Serious Sam games in the background.
Criminally underrated gems! SS3:BFE alone was like the FAR superior, less casual nu-DOOM.
G-Sync & Free-Sync.
Thank you.
Yea, I did not like V-Sync. Free-Sync is amazing.
The higher the refresh rate of your display, the better vsync is. 120 Hz gives you 60, 40, 30, 24 FPS etc. without the input lag penalty of running up against your display's max refresh rate, or the stutter of running between those framerates with vsync.
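The divisor arithmetic behind that comment fits in a couple of lines. This is an illustrative helper with an assumed name, just to show where the 60/40/30/24 list comes from:

```python
def lockable_framerates(refresh_hz, max_divisor=5):
    """Framerates that divide the refresh rate evenly.

    Each such framerate can be frame-paced perfectly under V-Sync,
    because every game frame is held for an exact integer number of
    refreshes. 120 Hz covers 60/40/30/24; plain 60 Hz only 30/20/15/12.
    """
    return [refresh_hz // d for d in range(1, max_divisor + 1)
            if refresh_hz % d == 0]
```

This is also why 120 Hz and 144 Hz panels behave differently with capped console-style framerates: 144 divides into 72/48/36 rather than 60/40/30.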
This was explained very well. I’m someone with a rather limited knowledge of these behind the scenes type of information, and I found this easy to digest. Good job Alex!
Variable refresh rate is the way, G-sync cost a lot but damn it's good !!
My gaming experience totally changed when I got my G-Sync monitor. No more tearing or stuttering when I am below my refresh rate. I would not go back.
Wyatt Cheng I can't wait! 😂
Wyatt Cheng go play candy crush
Agree man, g-sync is glorious
I'm getting tired of trying to find a compromise between tearing and lag, lol. Think I might just get one too.
Hey Digital Foundry can you do a video on the new Scan Line Sync in Rivatuner Statistics Server?
Rivatuner Statistics server has now an option to control the tearline, it's quite interesting.
Really? I need to check that out.
Yes, I even heard you can get the tearline to appear "nowhere" using that method, so it's not on your screen at all.
Great video!
I didn't know how triple buffering worked before - it's great to finally understand it.
HOLY SHIT. SOMEONE FIXED THE ANIMATION INTERPOLATION IN HALO PC????
You can use Chimera and the 60FPS animation feature built into it.
@@KhangHoang-ks3nm I know it's stingy, but it's actually full on interpolation. So whatever your fps is, that is how many times the animations will update per second.
The down side to this shift to VRR is going to be the frame pace issues when recording a game… trust me it’s been an absolute pain, only fix so far is to disable G-Sync and lock V-Sync to fixed refresh of the recording capture framerate… so 60fps. ☹️
Yeah it is something we struggle with too
I find screen tearing and frame-pacing stutters incredibly distracting. Years ago I got a G-Sync monitor, but I dunno if it's because the frame rates aren't right for G-Sync - I really don't notice a difference compared to vsync or adaptive vsync.
If you never get dips in fps you will not notice a difference. It's when the framerate fluctuates that you notice the power of G-Sync. With a G-Sync monitor, 45 fps will feel about the same as 60 fps does on a normal monitor.
@@underflip2 That's not what G-Sync does, lol.
I think the video should have included more timeline graphs of the time it takes to render/refresh, to make it easier to understand. Also for people that are not familiar with this stuff, it should have been pointed out, that higher refresh rates significantly reduce tearing.
I know I am answering after a long time, but is higher fps = lower tearing still a thing when the fps gets higher than the monitor's refresh rate?
@@anteksadkowski5166 You may get more tearlines, but the difference between the frames - the offset - is smaller, so each tearline should be a bit less noticeable.
Alex this was an absolutely awesome video! This will be the definitive video I lead people to so they can better understand what V-sync is, its variations, and how they should be used. V-sync has to be the most misunderstood aspect about PC gaming because it's believed by many to be the enemy or just something that caps your frame rate. Great job again, Alex.
awesome work Alex, this year i started using G-Sync on 144hz display, its glorious.
I'm using Fast Sync for all my games and haven't used vsync since... pls do a video on Fast Sync vs FreeSync and G-Sync...
Actually, what Alex called "True" Triple-Buffered V-Sync is Fast Sync. I'm not sure why he called it that to be honest.
@Wyatt Cheng im not playing diablo games.. and i dont like mobile games either..
@@sd19delta16 Yep, it's kind of triple buffering. The GPU produces unlimited frames because it thinks vsync is off, and then Fast Sync chooses the best frame and shows it on screen... correct me if I'm wrong!
Fast Sync needs the frame rate to be at least double the refresh rate, preferably 3 times faster or more, so it's best used for games like CS:GO. It massively reduces delay and removes tearing, as it shows the most up-to-date frame every time. Say you have a 120 Hz display but the game runs at 480 FPS: the frame delay is not 8.333 ms but 2.083 ms. Good stuff. But remember, you need at least double the frame rate of the display's refresh rate; if not, it's pretty much just normal vsync, so useless.
sd19delta, basically the difference is that Fast Sync doesn't limit your GPU's frame rate while still delivering a viewable frame to your monitor (to reduce stuttering and input lag).
I cannot play a single game without vsync
i use G-sync. I can live without V-sync
Yup me neither, screen tearing is horrendous.
I use freesync, no need for vsync.
I also can't. But it's because my laptop overheats and v-sync locks it at 60 fps
Yeah, I am also very sensitive to tearing. Quake was the first game I turned VSync on in (on a 3DFX Voodoo1). I still remember the console command vid_wait 1. Since then I haven't played a single game without VSync, because it turns out I hate tearing much more than slow frame rates, input lag, or low resolutions. One of the worst and ugliest things that can happen to a game, for me, is when it starts tearing. It's also the reason I don't bother with consoles, where you are basically left at the discretion of developers in this matter.
The clips with tearing were painful to watch
freesync master race (30-144 Hz range)
When business monitors adopt it, it will be the end of tearing
@@madson-web And hopefully the end of overpriced G-Sync technology.
@@madson-web No need in business, unless your powerpoint presentation really needs it lol. Creative yes.
@@tomtalk24 so yeah, it will never become a standard this way. This tech needs to break free from the "gamer" tag somehow
Not everyone plays games, and not all gamers play on PC. So it won't ever be a "standard", as it has no use anywhere else other than rendering graphics in real time — a very small portion of the display market. We're talking about G-Sync/FreeSync here, not HD/4K/HDR or whatever else that's in use (and usable) across the entire market.
I use it in every game, or the screen tearing gets really bad; it's distracting
@Wyatt Cheng where did diablo immortal came from???
my friend calls vsync shit, but his torn ass screen makes me sick
I remember last gen when console players swore VSync didn't exist — the human eye couldn't see it, just like above 720p, just like above 30fps.
Funny how they change their tune when consoles get it. Textures: I remember "debating" with console players about improved textures and they said "who stands and looks at rocks when playing a game" — now they bicker and argue with each other about which plastic box's rocks look better. ha ha
console hypocrite race
Hilarious how they chop and change every other minute: they pretend not to see stuff until they have it, then all of a sudden it's like the second coming of Christ if the other console doesn't have it. Then when they don't have it, it's back to being unable to see the difference.
Most console gamers are content with what they have. Engaging in a technical discussion is mostly useless.
So a game in borderless window mode automatically switches to triple buffering? That explains a lot. I always get micro-stuttering in borderless! :O
I usually turn on v-sync to get rid of screen tearing.
No shit! (-_-)'
Same if playing on TV instead of my G-sync monitors. But I also make sure I hit the constant 60 fps.
I usually use v-sync to lock the game at 60fps so my laptop won't overheat
Yeah lmao
I usually breathe air to survive. I know, I know... that’s way too interesting for this thread!
Nah, you should use Scanline Sync with RivaTuner Statistics Server (included with MSI Afterburner). All the benefits of vsync without the input lag, at least for someone like me on a 1080p60 panel.
I can't be the only one who uses v-sync to lock the game at 60 fps so my pc won't overheat
Usually, people lock their framerate using rivatuner
No, only on older games like the first Witcher and Divinity.
Cap at 60, force v sync.
Otherwise they run terribly.
@@mrmagoo-i2l Exactly.
Sometimes when my laptop gets really hot
you can get RivaTuner or, even better, NVIDIA Inspector
Digital Foundry: "[...] und auf Wiedersehen." ("...and goodbye.")
Me, a German: "Wait a minute"
Yeah man, a German on the DF team :D
G-Sync is the best, but it comes with the high licensing costs of Fu-King Nvidia…
Freesync to the rescue. All you have to do is give even less money to Nvidia and give some to AMD instead.
@Ignignokt did you delete your comment?
@@nextlifeonearth FreeSync isn't as good as G-Sync, and you can't use Nvidia GPUs with FreeSync, which means you'll have to use an AMD GPU — and they're just not up to the standard of Nvidia's GPUs. AMD has really been playing catch-up with Intel and Nvidia for years, and until they start actually coming out with products that can compete with Nvidia and Intel, we should avoid them, or otherwise they won't change their business practices.
Not sure how FreeSync is not as good as G-Sync: it 100% works, and its measured delay matches G-Sync's best results (with G-Sync showing inconsistent measured delays; source: Bitwit and Hardware Unboxed). The RX 580 competes directly with the 1060 on price and performs on par with it. The next step up, the Vega 56 vs the GTX 1070: they're the same price and the Vega beats it convincingly in most games. Add to that the fact that you can buy a Vega 56 and a FreeSync monitor for the same money as a 1060 and a G-Sync monitor, and that the only reason you can't use Nvidia GPUs with FreeSync is that Nvidia specifically won't let you (it would take nothing more than a firmware update) — and then consider who should be avoided until they change their business practices.