Tech Focus - V-Sync: What Is It - And Should You Use It?

  • Published: 29 Sep 2024
  • V-Sync is one of the most important - and controversial - options in PC gaming, but what does it do, how does it work and should you use it? Join Alex for a new Tech Focus.
    Subscribe for more Digital Foundry: bit.ly/DFSubscribe
    Support the DF Patreon to help the team more directly and get pristine quality video downloads: www.digitalfou...

Comments • 947

  • @felipeedu7980
    @felipeedu7980 5 лет назад +842

    20 years of tearing, hopefully my grandchildren won't face this monster.

    • @rodrigobasoaltoc.1743
      @rodrigobasoaltoc.1743 5 лет назад +44

      It has been a long battle, brother.

    • @peterGchaves
      @peterGchaves 5 лет назад +44

      50.000 people used to experience tearing here ... Now it's a ghost town.

    • @zombievac
      @zombievac 5 лет назад +16

      Hmmm... 20 years of vsync, never had to worry about tearing! Of course, I don't pretend like I'm going to be a pro CS:GO player and obsess over negligible amounts of input lag, like it even matters unless you're a pro player!

    • @misterio10gre
      @misterio10gre 5 лет назад +7

      gsync

    • @danielgomez7236
      @danielgomez7236 5 лет назад +6

      We just need those G-Sync or Freesync or whatever variable refresh rate monitors to improve.

  • @Xilefian
    @Xilefian 5 лет назад +16

    Input latency with VSync happens when the input polling is blocked on the same thread as the rendering (inputs during Vsync are held until the start of the frame). id Tech 5 is a good example of an engine that moves the input polling onto its own thread; inputs are consumed the moment they happen rather than the start of the next frame, so there should be less perceived input latency.
    Another cool thing I was investigating a couple of years ago was using late-frame re-projection with mouse input to get absolutely, incredibly smooth first-person mouse-look. It came at the cost of some subtle pixel smearing and increased GPU workload, but it's something I think competitive FPS games should consider as an option - poll mouse input at the start of the frame, then poll a second time just before VSync and do re-projection. VR games use this technique (as does the PSVR).
    Personally, I can't stand the input delay of VSync. It just absolutely ruins FPS games for me; as much as I hate tearing I will live with it for the sake of avoiding weird floaty mouse input.
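
    A minimal sketch of the decoupled input polling described above, in Python. The read_mouse_delta() helper is hypothetical, standing in for a real raw-input API; the point is that the render thread can block on a vsync'd present without ever delaying when input is sampled.

    import threading
    import time

    latest_input = {"dx": 0.0, "dy": 0.0}    # written by the input thread, read by the render thread
    input_lock = threading.Lock()

    def read_mouse_delta():
        return 0.0, 0.0                      # hypothetical: a real engine would read raw device deltas here

    def input_thread(stop):
        while not stop.is_set():
            dx, dy = read_mouse_delta()
            with input_lock:                 # inputs are consumed the moment they happen
                latest_input["dx"], latest_input["dy"] = dx, dy
            time.sleep(0.0005)               # ~2 kHz polling, independent of the frame rate

    def render_loop(frames=300, refresh_hz=60):
        for _ in range(frames):
            with input_lock:
                dx, dy = latest_input["dx"], latest_input["dy"]
            # ... simulate and render with the freshest input here ...
            time.sleep(1.0 / refresh_hz)     # stand-in for a blocking, vsync'd present

    stop = threading.Event()
    threading.Thread(target=input_thread, args=(stop,), daemon=True).start()
    render_loop()
    stop.set()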

  • @seth4321
    @seth4321 4 месяца назад

    Visiting in 2024 - this video answered sooo many questions I had. You covered all the bases - VSync AND adaptive sync/VRR. An evergreen video. Great job on this one, Alex!!

  • @zedsdeadbaby
    @zedsdeadbaby 5 лет назад +78

    Really hope television sets with VRR become commonplace soon. AMD freesync on some Samsung TVs is a start but it's nowhere near as polished as PC freesync/gsync yet. HDMI 2.1 was supposed to lead the way but the industry is dragging its collective feet on it.

    • @jbap6981
      @jbap6981 5 лет назад +12

      I feel the same way about HDR with monitors. One day we will get the perfect display.

    • @DeeBatch
      @DeeBatch 5 лет назад +9

      HDMI 2.1 won't even be on 2019 TVs, very frustrating. HDMI 2.1 won't be ready for consumers till the end of 2019, so 2020 TVs will get it. VRR on my Samsung Q8 and Q9 is amazing. Sometimes I hook up the PC and play 1440p @ 120fps (120Hz); at 4K it holds 40-60 really well... Xbox One X games are starting to take advantage too with unlocked frame rates.

  • @guy3nder529
    @guy3nder529 5 лет назад

    Very helpful. Had no idea what v-sync did before this and why it said it would up my frame rate when it actually dipped it. Thank you.

  • @omarsy5085
    @omarsy5085 5 лет назад +3

    i wish they would teach us cool stuff like this at school

  • @Sp00kyFox
    @Sp00kyFox 5 лет назад +7

    a noteworthy disadvantage when using variable refresh rates (G-Sync, FreeSync) is that you can no longer use backlight strobing / motion blur reduction.

    • @GolemShadowsun
      @GolemShadowsun 5 лет назад

      how about turning off motion blur in the first place?

  • @pandaumk3
    @pandaumk3 5 лет назад

    This is an amazing explanation. Been looking for something like this video for years. Finally I can refer to this video every time someone asks what vsync is or does.

  • @moustiboy
    @moustiboy 5 лет назад

    Digital Foundry never misses a beat with their videos explaining technology.
    Great video and thank you very much.

  • @SmiFF_
    @SmiFF_ 5 лет назад

    Next topic, optimization of PC components, choosing the right ones for the right resolution, airflow, positioning of the fans, A to Z guys, A to Z ... I want more like this ... Hell of a good work ;)

  • @d_b_n_
    @d_b_n_ 5 лет назад

    Another great Alex video with great music.

  • @justanotheryoutubechannel
    @justanotheryoutubechannel 2 года назад +1

    V-Sync is almost always on for me, I much prefer the judder from dropping frames to screen tearing, I can’t stand that. I only have it off if my game has issues with it and my PC can run it solidly and it has a framerate cap option.

  • @StrazdasLT
    @StrazdasLT Год назад

    Using a 144 Hz display has pretty much eliminated tearing. The display is fast enough that even if the GPU output does not match perfectly, the distance is so short it's no longer visually distracting in most situations. Also, modern displays are often smart enough to wait for a full frame before displaying, pretty much syncing on the monitor end.

  • @DanteGTX
    @DanteGTX 5 лет назад +24

    Frame limiting with MSI Afterburner works best with vsync, because it smooths out frame times.

    • @CanaldoZenny
      @CanaldoZenny 5 лет назад +2

      Dante GTX Can i use Afterburner to lock a 60 FPS game into 30 FPS?

    • @DanteGTX
      @DanteGTX 5 лет назад +3

      @@CanaldoZenny You can type your desired fps in settings

    • @simorx580
      @simorx580 5 лет назад +1

      What should I set the fps limiter and vsync to on a 75Hz monitor?

    • @DanteGTX
      @DanteGTX 5 лет назад

      @@simorx580 same as monitor refresh rate(Hertz)

    • @simorx580
      @simorx580 5 лет назад

      @@DanteGTX I have set it to 75Hz and I've still got tearing

  • @Unexpectedstuff
    @Unexpectedstuff 5 лет назад +2

    This video is too underappreciated.

  • @goldgriffin100
    @goldgriffin100 5 лет назад

    I have been a fan of adaptive v-sync myself

  • @frostygraybob
    @frostygraybob 5 лет назад

    I noticed that you arent using the combat evolved refined mod for halo on PC to restore the lost visual effects. I highly recommend it with the chimera mod.

  • @moosemaimer
    @moosemaimer 5 лет назад

    I want to know why, in certain games, enabling V-sync messes with the mouse control. If you move the mouse too quickly, which is kind of the point of using it instead of a gamepad, the actual input slows down instead of accelerating. Sometimes this can be ameliorated by enabling V-sync at the driver level instead of ingame, so that makes me question how the developer tied mouse input to frame output.
    I seem to remember this being a problem with the first Dead Space, but they had fixed it by the second one.

  • @WutipongWongsakuldej
    @WutipongWongsakuldej 5 лет назад

    Watching this video made me rethink the knowledge I have about V-Sync and double buffering/triple buffering. My knowledge is pretty much primitive and obsolete I guess, as I learned this in the DirectX 5 era (and my game developer years were 10 years ago or so). I still remember that we would draw to a surface and then blit it so it displayed on the screen (thus double buffering).
    Personally I can't think of the reason why, with double buffering, the frame rate drops by half. I would think it would only be swapped at the next frame. Is it because of how the hardware is implemented?

    • @MatthijsvanDuin
      @MatthijsvanDuin 5 лет назад

      Blitting to the screen implies that no double-buffering is actually being done at all in the sense meant by this video - just a single framebuffer that's being scanned out at the display's frame rate, into which you are expected to draw. This is typical in non-fullscreen mode when not using a compositing window manager (e.g. Windows XP or classic Mac OS). In this case, to ensure no unfinished render-in-progress ends up being displayed you need* to render off-screen and, once the render has finished, blit it to the framebuffer, either immediately (no "VSync") or at the next vsync (or more accurately, "vblank", i.e. the start of the vertical blanking interval). The analogue of triple buffering would be using two offscreen buffers to be able to start rendering the next frame instead of just waiting.
      * Of course technically you don't *need* to render off-screen if you can ensure you can render fast enough to do it entirely during the vertical blanking interval. If your game renders top-to-bottom (not typical for 3D games, but more common for 2D) you can actually still be rendering the frame _while_ it's being scanned out, as long as you stay ahead of it ("racing the beam"). This eliminates tearing, has less input latency than any of the techniques in this video, and avoids the memory use of double/triple buffering, but is incompatible with the way 3D is rendered (unless raytracing) and the very strict timing requirements tend to tie the game to specific hardware and require that there's no OS getting in the way of things. It was a typical technique on old consoles (e.g. SNES).
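
      On the question at the top of this thread (why double-buffered vsync halves the frame rate): with a blocking swap, a frame that takes even slightly longer than one refresh interval has to wait for the next vblank, so the effective rate snaps to whole-number divisors of the refresh rate. A rough sketch of that arithmetic in Python, assuming a 60 Hz display:

      def effective_fps(render_ms, refresh_hz=60):
          interval_ms = 1000.0 / refresh_hz                 # 16.67 ms per refresh at 60 Hz
          vblanks_per_frame = -(-render_ms // interval_ms)  # ceiling: whole refreshes each frame occupies
          return refresh_hz / vblanks_per_frame

      for ms in (10, 16, 17, 25, 34):
          print(f"{ms:>2} ms of render work -> {effective_fps(ms):.1f} fps")
      # 10 and 16 ms stay at 60.0 fps; 17 and 25 ms both drop to 30.0 fps; 34 ms drops to 20.0 fps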

  • @Belfoxy
    @Belfoxy 5 лет назад

    DF should test out other PC options like Enhanced / Fast sync, along with Rivatuner for capping to 60 / 30 fps.

  • @salcarreiro6756
    @salcarreiro6756 5 лет назад

    love the video, my only crit is the moving gameplay when you are trying to show something else on screen is super distracting.

  • @arsnakehert
    @arsnakehert 4 года назад

    With adaptive vsync I'm able to run Doom 2016 on a GTX 1050 at near constant 60 FPS at 1080p with everything on ultra, and the tearing isn't so distracting
    It really is a miracle

  • @norisgello2523
    @norisgello2523 3 года назад

    I'm pretty sure you get lagless vsync if you can run a game at a high FPS and in a window. I do this with Source engine games sometimes because they run fast. The way I tested whether there was screen tearing for sure was using my camera, since I usually don't notice screen tearing. No tears, and net_graph 1 showed me hundreds of FPS, so there wasn't a cap. I have always overlooked the fact that the Windows desktop doesn't have screen tearing by default, but honestly screen tearing isn't very distracting to me at such a high FPS, and fullscreen is nicer unless you have borderless windows.

  • @PSspecialist
    @PSspecialist 5 лет назад +1

    I couldn't follow the technical stuff but at least I got the gist of it.

  • @uniworkhorse
    @uniworkhorse 5 лет назад

    Love these sort of videos!

  • @GemayelDaniel
    @GemayelDaniel 5 лет назад

    Do one for G-Sync, FreeSync and adaptive sync. Then for AA and all the types of AA.

  • @orangeguy5374
    @orangeguy5374 5 лет назад

    There are two things I don't get. What's that additional frame of latency in the first type of triple buffered vsync, and why does an adaptive refresh rate cause strobing?

  • @II2old4thisII
    @II2old4thisII 5 лет назад

    Ah, another Tech Focus video - awesome!

  • @vaugna1620
    @vaugna1620 5 лет назад +12

    Every game in existence should have an option for adaptive V-sync, it's objectively the best solution outside of investing in a G-sync monitor.

    • @_Thred_
      @_Thred_ 5 лет назад

      The game doesn't have to support it, only your monitor and GPU do.

    • @naimcool36
      @naimcool36 5 лет назад +1

      @@_Thred_ No. For example: Batman Arkham Nightmare uses adaptive v-sync. No need for an AMD graphics card or FreeSync monitor.

    • @Archivist42
      @Archivist42 5 лет назад +4

      I don't completely agree - if you truly hate tearing (even infrequently) it's not necessarily the best. I noticed this on the Xbox One X version of Call of Duty WWII, where the in-game cutscenes were plagued with tearing, whereas the PS4 Pro version used triple buffered vsync, which meant stutter instead. I'd personally take the stutter over tearing, but that's why I consider it to be more a subjective matter than an objective one.

    • @SteelSkin667
      @SteelSkin667 5 лет назад +1

      If you have an Nvidia card you can use the control panel to turn it on in the vast majority of games.

    • @madalinradion
      @madalinradion 5 лет назад

      @@Archivist42 I don't believe that's the same thing as adaptive v-sync

  • @Birmanncat
    @Birmanncat 5 лет назад

    Input lag with vsync can be adapted to, but constant frame tears are horrible no matter how long you play.

  • @fablesquad9210
    @fablesquad9210 5 лет назад

    Thank you so much. How does VSync work with 240hz panels?

  • @AskeGuitar
    @AskeGuitar 5 лет назад

    That Condemned music at the start took me back a bit; wish Monolith would make a new one. A new FEAR would be cool as well, but good this time.

  • @luisleon2124
    @luisleon2124 5 лет назад

    This is the best channel!!!

  • @TheTrumanZoo
    @TheTrumanZoo 4 года назад

    is there also a "h" sync? horizontal syncing could lessen tearing maybe as vertical lines are mostly straight up everywhere on a plane?
    and could a variable starting sync point help anything? like masking where it will come in.

    • @joesterling4299
      @joesterling4299 4 года назад +1

      H-Sync is also a parameter of video displays, quite significant in the analog CRT days. For example, VGA consists of 5 lines: R, G, B, H-Sync, V-Sync. But it's very different from what you're trying to visualize. H-Sync happens after every (horizontal) pixel line is drawn. V-Sync only happens once per frame. If H-Sync goes, well, out of sync, then the display will look like skewed garbage. It can't be used to fix tearing.

  • @sd19delta16
    @sd19delta16 5 лет назад

    Hey, Alex! Great video, but isn't "True" Triple-Buffered V-Sync, as you said, actually called Fast-Sync? I tried looking up "True" Triple-Buffered V-Sync but I couldn't find anything.

  • @midge_gender_solek3314
    @midge_gender_solek3314 2 года назад

    Wait, was that a Heidegger reference in the beginning?

  • @alvaroacwellan9051
    @alvaroacwellan9051 5 лет назад

    I could seldom use high end hardware in my life so the issue of games not finishing rendering in one frame time was a very common problem to me. Well, both tearing and sudden brutal drops in frame rate and increases in input/visual latency are problems to me but I think when frame rates go down tearing is the lesser evil. So I often ended up disabling v-sync.
    I hope for adaptive sync gaining ground so even cheap displays get it by the time I replace my current one. My last monitor lasted for like 11 years (but I repaired it and have it as a backup) so there's time. Probably.

  • @iPlaySEGA
    @iPlaySEGA 5 лет назад +1

    The very first game I experienced screen tearing in was Ninja Gaiden (2004) on the original XBOX. Back then I had NO IDEA wtf this sh*t was. I thought my XBOX was about to die or something...

  • @ledjon
    @ledjon 5 лет назад

    Great work!!!

  • @albiss1164
    @albiss1164 5 лет назад

    17:15 Nvidia now supports Freesync; Not all monitors are compatible.

  • @y2everything223
    @y2everything223 5 лет назад +3

    Screw Nvidia for ruining the majority of gamers' experience by forcing them to buy a G-Sync monitor. G-Sync could have just been a standard for high quality monitors with "superior smoothness". But it should be a choice, not shoved down my throat. Thanks for holding gamers back. Now I'm saving up for an AMD... I will have to wait for that variable sync smoothness.
    You shouldn't even be able to ignore open standards. That is anti-consumer and anti-competitive.

    • @David-ud9ju
      @David-ud9ju 5 лет назад

      You're trying to argue that, because Nvidia cards only support Gsync, if you want a variable refresh monitor and own an Nvidia card, you have to buy a gsync monitor, which is anti-competitive and anti-consumer, but it's actually the complete opposite, as is evidenced by the fact that you've said you're going to buy an AMD graphics card next time you upgrade. Nvidia having gsync has created a competition between free sync and gsync and further fuelled the competition between nvidia and AMD, which increases competition in the market, which will then drive prices down as they try and compete for consumers, which is pro-consumer. In other words, it's the exact opposite of what you've claimed it to be and it is an excellent thing for gamers.

    • @halkon4412
      @halkon4412 5 лет назад +2

      @@David-ud9ju This might be true if both companies were on even footing to start with, but they're not. Nvidia is the only viable option for high-end gaming GPU's, and their introduction of exclusive proprietary technology was designed to make the status quo as sturdy as possible. People buy G-Sync monitors because they have Nvidia GPU's, and then choose Nvidia for their next GPU upgrade because they have a G-sync monitor. This is anti-competitive and pro-status quo.

  • @rdmz135
    @rdmz135 5 лет назад +69

    I used vsync when I had a 60Hz monitor. At 144+ its useless.

    • @dkmarcusdk867
      @dkmarcusdk867 5 лет назад +4

      True brøther

    • @Imgema
      @Imgema 5 лет назад +19

      I have a 240hz monitor. Microstutters and tearing are still there, even if they are less noticeable. Vsync is still needed.

    • @MysteryD
      @MysteryD 5 лет назад +2

      Same with 120Hz. Rarely do my games exceed 120fps on ultra settings. Usually 60-100 depending on game

    • @haggisman0812
      @haggisman0812 5 лет назад +2

      not true. why do you think gsync monitors exist?

    • @garytallowin6623
      @garytallowin6623 5 лет назад +2

      @@Relex_92 Minimising the problem to the point where you don't notice it.. is the same as fixing the problem completely

  • @bozitojugg3rn4ut
    @bozitojugg3rn4ut 5 лет назад

    15:33 is that Timmy Tencent?

  • @funbrute31
    @funbrute31 5 лет назад +37

    Enabling vsync in multiplayer games is a joke. The amount of input lag you can get is insane.

    • @jc_dogen
      @jc_dogen 5 лет назад

      @@Neiva71 Depends on the game, and if you're also capping to 60fps (or whatever your screen refresh rate is), it can reduce the lag significantly

    • @SupertoastGT
      @SupertoastGT 5 лет назад +7

      I'd prefer nearly anything over tearing.

    • @danielgomez7236
      @danielgomez7236 5 лет назад +5

      You can also try this trick: disable some poorly implemented in-game V-Sync, then force V-Sync from NVIDIA/AMD control panel, works for some games.

    • @4reazon331
      @4reazon331 5 лет назад

      @@Neiva71 as i said

    • @4reazon331
      @4reazon331 5 лет назад +1

      My cousin doesn't feel the input lag either... but I definitely do when I play on his PC... his senses are just not that sensitive... I'm sure it's the same with you people... some people still say there is no difference between 60 and 144Hz.

  • @prahstik
    @prahstik 5 лет назад

    So you should still have vsync turned on even if you are using G-Sync? Vsync is enabled if you pass the refresh rate of the monitor, and if you are below the refresh rate it is G-Sync that is working?

  • @hitbm4755
    @hitbm4755 3 года назад +1

    Hi @Digital Foundry, please tell me how I can limit my games to a smooth console-style 30FPS with VSync on a Radeon RX 480. My double buffered VSync has never hard locked to 30FPS as you described and averages any framerate less than or equal to 60FPS, even in games that support it; I have only seen my Intel HD 3000 graphics do this with in-game VSync.

  • @TCapButterCup
    @TCapButterCup 4 года назад

    Yeah took me a while to find out, I remember playing doom and seeing lines appearing thinking my new laptop is broken until I found this out.

  • @netocavalcanti
    @netocavalcanti 5 лет назад

    Anyone else having problems with DF videos on Chromecast? The video plays fine but the audio sometimes freezes and comes back after 2-3 seconds. Only with Digital Foundry videos. Weird.

  • @Tjorriemorrie
    @Tjorriemorrie Год назад +1

    Not sure it informed me when to use it...

  • @Magnulus76
    @Magnulus76 2 года назад

    Unless you play very fast paced FPS games, none of this matters, just go with regular vsync or adaptive sync.

  • @hexpunK1
    @hexpunK1 5 лет назад

    I am eternally pissed off that nVidia pulled the GSync upgrade module from the market. I have one of the few monitors that can take it, and would love to have it installed. But that's a big no-no. At the time GSync enabled monitors were way out of my available income...

  • @osvaldot2257
    @osvaldot2257 5 лет назад

    Great video.

  • @rohan6454
    @rohan6454 5 лет назад

    Make a video about running FreeSync on an Nvidia GPU. Also get a CRT, problemo solved.

  • @FinalLuigi
    @FinalLuigi 5 лет назад

    So that's why Serious Sam looked so jittery, I was always curious about it.

  • @ExtremalMetal
    @ExtremalMetal 5 лет назад

    15:15 I don't see adaptive sync here, it's just all standard v-sync options, iirc triple buffering.

  • @judas_physicist
    @judas_physicist 5 лет назад

    Unless the game has insanely maddening tearing (only RAGE comes to mind) I usually play with it off. I've grown to accept tearing over the years, similar to aliasing. The methods to get rid of them (both tearing and aliasing) introduce a whole other host of issues (weird fps jumps with v-sync, very high performance cost for MSAA, vaseline screen for FXAA/SMAA and vaseline+ghosting with TAA), so I look at those issues and go: eeeh, I'll just stick with the jaggies and the tearing, they ain't too bad compared to the side effects.

  • @MLWJ1993
    @MLWJ1993 5 лет назад +1

    No longer have this issue, adaptive sync is a nice luxury.

    • @Sultansekte
      @Sultansekte 3 года назад

      but adaptive vsync is pretty useless, because it deactivates vsync below your refresh rate and you get tearing again instead of micro-stuttering. Not a good trade for me.

    • @MLWJ1993
      @MLWJ1993 3 года назад +1

      @@Sultansekte adaptive sync, not adaptive vsync.
      A.K.A. VESA adaptive sync (Freesync) / G-sync. Otherwise known as variable refresh rate.

  • @Marci124
    @Marci124 5 лет назад

    Ironically I'm watching this on my work notebook with some arcane OPTIMUS setup... it has screen tearing down to 360p.

  • @thybigballs
    @thybigballs 3 года назад

    Informative video. Thanks.

  • @ANTANTANT360
    @ANTANTANT360 5 лет назад +724

    Ah yes, nothing like waking up and learning about V-Sync.

    • @melxb
      @melxb 5 лет назад +5

      but what is g sync

    • @PrayTellGaming
      @PrayTellGaming 5 лет назад +2

      This literally just happened to me haha woke up, brewed coffee, checked youtube, opened this xD

    • @t4ky0n
      @t4ky0n 5 лет назад +4

      @@melxb it's basically an Nvidia version of v-sync that's built into the hardware, and it has less input lag than v-sync

    • @melxb
      @melxb 5 лет назад +1

      @@t4ky0n how does free sync compare

    • @t4ky0n
      @t4ky0n 5 лет назад +1

      @@melxb if you're asking what it is, it's the AMD version of G-Sync. I'm not certain about the difference in performance, but I'm about to look it up anyway so I'll get back to you on that.

  • @Poever
    @Poever 5 лет назад +150

    It’s shocking how many last-gen games suffered from screen tearing and the average person had no idea it was a thing

  • @Tasso-d2
    @Tasso-d2 5 лет назад +515

    Loving the pc focused content keep up the good work DF.

    • @Ample17
      @Ample17 5 лет назад +15

      Why does this "pc masterrace" thing shine through every PC focused comment? I don't get why so many PC players are such platform Nazis.
      It should not matter where one plays. Can't we all just enjoy games and technology?

    • @snozbaries7652
      @snozbaries7652 5 лет назад +32

      @@Ample17 Seriously dude?? Have you read the console wars comments on their console content? Btw, nothing in that comment hinted at the "master race" perspective.

    • @georganatoly6646
      @georganatoly6646 5 лет назад +2

      for this comment to be true you would have to include consoles in the category of 'pc focused content', this comment is a self-contradictory statement!

    • @ReLapse85
      @ReLapse85 5 лет назад +1

      Ermmm. Not just pc

    • @DatGrunt
      @DatGrunt 5 лет назад +11

      @@Ample17 Lmao I don't know how you got all of that from his comment but OK.
      It's really funny though, console gamers are more rabid when it comes to being "platform nazis". But when PC gamers do it OH GOD NO PLZ THINK OF THE CONSOLE CHILDREN!!

  • @Xilefian
    @Xilefian 5 лет назад +294

    I had to explain VSync & triple buffering to a developer the other day when they were perplexed about their engine not exceeding 60 FPS and their profiling tools reporting a mysterious chunk of CPU activity that took up more than 12 milliseconds of their 16.666ms frame (a rough sketch of what that looks like follows after this thread). Was a bit of a weird conversation to have with someone who was in charge of a game's development.

    • @XenogearsPS
      @XenogearsPS 5 лет назад +98

      @@Relex_92 There's actually too much information to keep up with. If the developer has a small team then it's possible.

    • @zombievac
      @zombievac 5 лет назад +38

      Unfortunately that, or usually even worse, is very common in software devs. I know many who’ve made applications for Windows for 20 years and STILL somehow don’t even know the very basics of how the OS works (and therefore no knowledge of how to make a decent application for it that does things “right” for that OS)... and I’m talking not knowing shit like “you need to reboot your PC every once in a while to prevent glitches and poor performance, or as one of the first things to try if something stops working properly”. Its that bad for MOST developers today, who - keep in mind - are usually developing apps and not games, since the talented ones often go into gaming development

    • @Xilefian
      @Xilefian 5 лет назад +36

      @@zombievac the talented developers going into game dev is something I noticed recently. I was out with a friend and I met the owner of a software company, he asked me what languages do I know and I mentioned C/C++ and he lost his shit - saying that there's such huge demands for C/C++ developers right now. I thought it was weird because I'm used to everyone around me knowing C/C++, not knowing the language is a weird thing, but then I realised that the problem is the C/C++ developers these days move immediately into high performance game development or some critical application development for very large projects, leaving the rest of the software industry oddly starved of low-level, hard-core developers.

    • @tomstonemale
      @tomstonemale 5 лет назад +8

      I honestly find it not that hard to believe. I can run a car but I had no idea how the thing actually works, for example

    • @rasheedqe
      @rasheedqe 5 лет назад +2

      To be fair, in the manual for most game engines it will just say that it prevents tearing and limits your framerate to the device refresh rate.
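
      A rough sketch in Python of the situation described at the top of this thread; the 4 ms of CPU work is an assumed figure. With vsync on, the blocking present pads every frame out to the 16.666 ms refresh interval, and that wait shows up in a CPU profile as a mysterious stall:

      def profiled_frame(cpu_work_ms, refresh_hz=60):
          interval_ms = 1000.0 / refresh_hz                 # 16.666... ms at 60 Hz
          blocked_ms = max(0.0, interval_ms - cpu_work_ms)  # time spent waiting inside the swap/present call
          return cpu_work_ms + blocked_ms, blocked_ms

      total_ms, blocked_ms = profiled_frame(cpu_work_ms=4.0)
      print(f"frame time: {total_ms:.3f} ms, blocked in present: {blocked_ms:.3f} ms")
      # frame time: 16.667 ms, blocked in present: 12.667 ms -> the engine can never exceed 60 fps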

  • @daniloberserk
    @daniloberserk 2 года назад +22

    It would be great to have an updated video on this topic, since now we have things like Fast Sync and RTSS Scanline Sync. Both options are great to use with motion blur reduction (which can't be used with Freesync/Gsync).

    • @turmspitzewerk
      @turmspitzewerk Год назад +1

      what DF refers to as "true triple buffering" is nvidia fast sync / amd enhanced sync.
      but yeah, i use RTSS scanline sync and it is like black magic to me. i've tried to look how it works multiple times, but its like a black box with zero documentation anywhere i could find online.
      i feel like i mostly understand it... but how the hell does it just know how to move the tear off screen like that? and why does it work exactly the same on classic games and new games where i can have 60-400 FPS, but when i try to play a really old game like half life it breaks and needs to be reconfigured? its so bizarre.

  • @Edmundostudios
    @Edmundostudios 5 лет назад +169

    I’m surprised this video was only made now

    • @GuySocket
      @GuySocket 5 лет назад +5

      This video didn't need to be made. You just need a ten second clip saying "Vsync is the work of Satan and developers should just optimize performance better."

    • @forestR1
      @forestR1 5 лет назад +6

      @@GuySocket they could do that if everyone had the exact same hardware, and wanted the same resolution and framerate. We don't.

    • @nextlifeonearth
      @nextlifeonearth 5 лет назад +2

      @@GuySocket With high refresh rates you get even more tearing without V-Sync (or some other sync method).

    • @plutterton
      @plutterton 5 лет назад +1

      @@GuySocket sounds like you need to watch the video because that's not what causes tearing.....

    • @sand0decker
      @sand0decker 5 лет назад

      I've wondered what it does and have seen suggestions about what it does everywhere for performance.
      It turns out the only correct option is disable for casual players who don't have those expensive graphic cards

  • @thunderrun1528
    @thunderrun1528 5 лет назад +35

    Should have mentioned the new screen tear eliminating feature of Riva Tuner Statistics Server called "Scanline Sync". Works wonders, especially for older games!

    • @-AmirulNaim
      @-AmirulNaim 5 лет назад

      Thunder Run explain si

    • @pepengpantal1363
      @pepengpantal1363 5 лет назад +5

      This shit's perfect for emulators.

    • @GoldSrc_
      @GoldSrc_ 5 лет назад

      Sounds interesting, thanks for letting me know about it.

    • @leruty
      @leruty 5 лет назад

      Yeah, but it took SO MUCH time to arrive; they should have included this back in the late 2000s.

    • @coppercanyonhealth4594
      @coppercanyonhealth4594 4 года назад

      yea for older games def, i believe you have to have 2x or more of your refresh rate for a smooth experience. Otherwise I experience judder, but it's great when it works perfectly.

  • @xMTxcameron
    @xMTxcameron 5 лет назад +176

    I'm loving those Serious Sam tracks.

    • @snozbaries7652
      @snozbaries7652 5 лет назад +6

      SS in VR is pretty awesome :)

    • @butterbean_01
      @butterbean_01 5 лет назад +5

      Ss in VR Is SERIOUSLY awesome! I'm doing a vr LIVE walkthrough in those days, check my channel for those!

    • @fireaza
      @fireaza 5 лет назад +4

      Serious Sam VR is magical. The ability to aim dual guns in any direction like you're in Equilibrium? Hell yeah!

    • @xMTxcameron
      @xMTxcameron 5 лет назад +3

      VR is something I've never tried. Serious Sam in VR looks really awesome though!

  • @bluedragon9925
    @bluedragon9925 5 лет назад +81

    I'm a PC gamer who pretty much *always* turns V-sync on no matter what simply because I legitimately *despise* screen tearing (it actually legit hurts my eyes)!
    I know it adds input lag but honestly, I don't really care too much about that because since I'm not at all interested in speedrunning or competitive online multiplayer, my gaming tastes are such that the lag V-sync adds doesn't really affect me all that much.

    • @lobsterbark
      @lobsterbark 5 лет назад

      I find that it depends on what framerate you are getting. If you are getting just a bit above your refresh rate, the tearing is horrible. If you are getting way above, I don't notice it. I always turn vsync off, and try to adjust my settings so I get 80-100 fps.

    • @HalfdeadRider
      @HalfdeadRider 4 года назад

      To get best results, before even changing in game settings get your PC performance settings and GPU control panel settings right for your setup.

    • @OutlawedPoet
      @OutlawedPoet 4 года назад

      I use it for singleplayer games. One thing that could help you out is Fast Sync, which eliminates screen tearing while not capping your fps to your monitor's refresh rate, and it doesn't add input lag. It's only for Nvidia GPUs from what I recall though; you could also just buy a G-Sync monitor if you have a big wallet.

    • @Hanfgurkenhasser
      @Hanfgurkenhasser 4 года назад

      Funnily enough I haven't had screen tearing in about 8 years or so. Not sure how but I guess I am doing something right.

    • @guser436
      @guser436 4 года назад

      FreeSync/GSync is the only sync I'll ever use, I hate screen tearing but I hate input lag even more
      FreeSync/GSync is *better* than VSync, it makes the game feel smooth even when I hit low FPS.

  • @Jonehoo
    @Jonehoo 5 лет назад +31

    I've always been using V-sync, whatever the outcome. Tearing is an absolute no-no for me.

  • @xyz2theb
    @xyz2theb 5 лет назад +192

    ive used gsync for the past 3 years and never looked back.

    •  5 лет назад +25

      Me too. V sync can be borked on some games, g-sync just bypasses all that nonsense and runs your games smooth and lag free, as long as you cap fps a few under the monitor hz limit.

    • @jonathanjariani6029
      @jonathanjariani6029 5 лет назад +16

      Same. Been using a 1440p 165hz overclocked screen /w gsync. Its faaaantastic

    • @haggisman0812
      @haggisman0812 5 лет назад +14

      word, gsync is the ssd of monitor upgrades.

    • @metalpizza123
      @metalpizza123 5 лет назад +5

      @ so for 144hz you cap it at 142? or is it guesswork? Most games just have a 144 fps limit so do you use RTSS?

    • @AlexElectric9001
      @AlexElectric9001 5 лет назад +5

      X34 predator with gsync here! Love it!!

  • @goggamanxp
    @goggamanxp 5 лет назад +78

    On my pc, if I can reach over 90 fps I just disable vsync entirely as it generally doesn't distract me on the games I play. If I play a game that runs at 60-70fps on average with the chance of it going below that, I use adaptive vsync. If a game goes below 60fps, I care more about maintaining the best input latency I can rather than if there's tearing or not.
    My PC currently has a gtx 980 in it, and since Nvidia seems to not want to make gsync affordable, I'll probably switch to an AMD gpu on my next PC so that I can get variable refresh rates without having to destroy my wallet.

    • @chrisrichfield8906
      @chrisrichfield8906 5 лет назад +3

      Agreed, I just turn down settings until I get a solid 120fps, I would rather play 900p 120hz than 1440p 60hz

    • @zombievac
      @zombievac 5 лет назад +4

      But why? Are you a pro with lots of money on the line? Vsync has been good enough for a long time now for the input lag to be nearly negligible if your PC has decent specs like you mentioned, and tearing is so much more distracting (and vsync has the nice benefit of capping your FPS to your refresh rate, because variable FPS and therefore variable input lag is even worse because you can’t adjust to it)

    • @DatGrunt
      @DatGrunt 5 лет назад +16

      Well pro or not I still dont want to be handicapped by the input lag it introduces. It feels sluggish and I don't like that feeling.

    • @awesomeferret
      @awesomeferret 5 лет назад

      On a 60 hz monitor in many games, if you get over 100FPS it can still look way way way more jittery than 60fps vsync. Sad reality.

    • @nextlifeonearth
      @nextlifeonearth 5 лет назад

      ​@@zombievac That's probably why most games since its inception gave the option to turn V-Sync off, right?
      You don't have to be a pro to appreciate occasional lower input lag over the same, but higher input lag all the time. Also that's not correct. If you drop down to the next V-Sync step your input lag is doubled within a single frame. That's more inconsistent than V-Sync off will ever get you.

  • @dharkbizkit
    @dharkbizkit 5 лет назад +7

    What I still don't fully understand is why I get tearing on my 144Hz screen even when I have a constant 144 fps, no matter if vsync is on or if I use a frame lock.

  • @GugureSux
    @GugureSux 5 лет назад +7

    I love that you're using Serious Sam games in the background.
    Criminally underrated gems! SS3:BFE alone was like the FAR superior, less casual nu-DOOM.

  • @Venom_Byte
    @Venom_Byte 5 лет назад +49

    G-Sync & Free-Sync.
    Thank you.

    • @brajuhani
      @brajuhani 4 года назад

      Yea, I did not like V-Sync. Free-Sync is amazing.

  • @tygeezy
    @tygeezy 4 года назад +2

    The higher the refresh rate of your display, the better vsync is. 120Hz gives you 60, 40, 30, 24 FPS etc. without the input lag penalty of running up against your display's max refresh rate, or the stutter of running between those framerates with vsync.
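
    A quick check of the divisors mentioned above (Python): with vsync on a fixed-refresh display, the frame rates that hold steady are the whole-number divisors of the refresh rate.

    refresh_hz = 120
    print([refresh_hz // n for n in range(1, 6) if refresh_hz % n == 0])
    # [120, 60, 40, 30, 24] - each divides 120 Hz evenly, so frame pacing stays consistent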

  • @balaam_7087
    @balaam_7087 5 лет назад +29

    This was explained very well. I'm someone with rather limited knowledge of this behind-the-scenes type of information, and I found it easy to digest. Good job Alex!

  • @tontonzeed
    @tontonzeed 5 лет назад +8

    Variable refresh rate is the way, G-sync cost a lot but damn it's good !!

  • @alexanabolic5099
    @alexanabolic5099 5 лет назад +32

    My gaming experience totally changed since I got my G-Sync monitor. No more tearing or stuttering when I am below my refresh rate. I would not go back.

    • @alexanabolic5099
      @alexanabolic5099 5 лет назад +1

      Wyatt Cheng I can't wait! 😂

    • @znubionek
      @znubionek 5 лет назад +3

      Wyatt Cheng go play candy crush

    • @andrerevolution455
      @andrerevolution455 5 лет назад +2

      Agree man, g-sync is glorious

    • @coppercanyonhealth4594
      @coppercanyonhealth4594 4 года назад

      im getting tired of trying to find a compromise between tearing and lag lol think i might just get one too

  • @nicwalsh9537
    @nicwalsh9537 5 лет назад +6

    Hey Digital Foundry can you do a video on the new Scan Line Sync in Rivatuner Statistics Server?

  • @Zock3rB3ast
    @Zock3rB3ast 5 лет назад +5

    RivaTuner Statistics Server now has an option to control the tearline; it's quite interesting.

    • @bodombeastmode
      @bodombeastmode 5 лет назад

      Really? I need to check that out.

    • @leruty
      @leruty 5 лет назад +1

      Yes, I even heard you can get the tearline to appear "nowhere" by using that method, so it's not on your screen at all.

  • @chillnagasden6190
    @chillnagasden6190 5 лет назад +6

    Great video!
    I didn't know how Triple Buffering worked before - it's great to know how it works

  • @icyjiub2228
    @icyjiub2228 5 лет назад +24

    HOLY SHIT. SOMEONE FIXED THE ANIMATION INTERPOLATION IN HALO PC????

    • @KhangHoang-ks3nm
      @KhangHoang-ks3nm 5 лет назад

      You can use Chimera and the 60FPS animation feature built into it.

    • @ShellyHerself
      @ShellyHerself 5 лет назад

      @@KhangHoang-ks3nm I know it's stingy, but it's actually full on interpolation. So whatever your fps is, that is how many times the animations will update per second.

  • @VideoGameKillCounts
    @VideoGameKillCounts Год назад +1

    The downside to this shift to VRR is going to be the frame pacing issues when recording a game… trust me, it's been an absolute pain; the only fix so far is to disable G-Sync and lock V-Sync to a fixed refresh at the recording capture framerate… so 60fps. ☹️

  • @EastyyBlogspot
    @EastyyBlogspot 5 лет назад +6

    I find screen tearing and frame pacing stutters incredibly distracting. Years ago I got a G-Sync monitor, but I dunno if it's because the frame rates are not right for G-Sync - I really do not notice a difference compared to v-sync or adaptive v-sync.

    • @underflip2
      @underflip2 5 лет назад +4

      If u never get dips in fps u will not notice a difference. Its when the framerate fluctuates u notice the power of g-sync. With a gsync monitor 45 fps will feel about the same as 60 fps with a normal monitor.

    • @brajuhani
      @brajuhani 4 года назад

      @@underflip2 That's not what G-Sync does, lol.

  • @EVPointMaster
    @EVPointMaster 5 лет назад +21

    I think the video should have included more timeline graphs of the time it takes to render/refresh, to make it easier to understand. Also, for people who are not familiar with this stuff, it should have been pointed out that higher refresh rates significantly reduce tearing.

    • @anteksadkowski5166
      @anteksadkowski5166 9 месяцев назад

      I know I am answering after a long time, but is higher fps = lower tearing still a thing when the fps gets higher than the monitor's refresh rate?

    • @EVPointMaster
      @EVPointMaster 9 месяцев назад

      @@anteksadkowski5166 You may get more tearlines, but the difference between the frames at the tear - the offset - is smaller, so each tearline should be a bit less noticeable.
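
      A rough illustration of the reply above in Python; the 1920 pixels-per-second pan speed is an assumed figure. The visible step at a tearline is roughly how far the image moved between the two frames meeting there, so shorter frame intervals mean smaller, less noticeable tears:

      def tear_offset_px(fps, pan_speed_px_per_s=1920):   # assumed: a pan crossing the screen in one second
          return pan_speed_px_per_s / fps                 # image shift between consecutive frames

      for fps in (60, 144, 240):
          print(f"{fps:>3} fps -> ~{tear_offset_px(fps):.0f} px offset at a tearline")
      # 60 fps -> ~32 px, 144 fps -> ~13 px, 240 fps -> ~8 px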

  • @OldsXCool
    @OldsXCool 5 лет назад +4

    Alex this was an absolutely awesome video! This will be the definitive video I lead people to so they can better understand what V-sync is, its variations, and how they should be used. V-sync has to be the most misunderstood aspect about PC gaming because it's believed by many to be the enemy or just something that caps your frame rate. Great job again, Alex.

  • @andrerevolution455
    @andrerevolution455 5 лет назад +4

    awesome work Alex, this year i started using G-Sync on 144hz display, its glorious.

  • @Ajstyle48
    @Ajstyle48 5 лет назад +30

    im using fast sync for all my games and after that never used vsync... pls do a video on fast sync vs free sync and g sync...

    • @sd19delta16
      @sd19delta16 5 лет назад +11

      Actually, what Alex called "True" Triple-Buffered V-Sync is Fast Sync. I'm not sure why he called it that to be honest.

    • @Ajstyle48
      @Ajstyle48 5 лет назад

      @Wyatt Cheng im not playing diablo games.. and i dont like mobile games either..

    • @Ajstyle48
      @Ajstyle48 5 лет назад +5

      @@sd19delta16 Yep, it's kind of triple buffering. The GPU produces frames without limit because it thinks vsync is off, and then Fast Sync chooses the best frame and shows it on screen... correct me if I'm wrong!

    • @prich0382
      @prich0382 5 лет назад +4

      FastSync needs the frame rate to be at least double the refresh rate, preferably 3 times faster or more, so it's best used for games like CS:GO. It massively reduces delay and removes tearing since it shows the most up-to-date frame every time: say a 120 Hz display but the game runs at 480 FPS, then the frame delay is not 8.333 ms but 2.083 ms (those numbers are worked through in the sketch after this thread). Good stuff it is. But remember, you need at least double the frame rate of the display refresh rate; if it's not at least double, it's pretty much just normal vsync and therefore useless.

    • @92trdman
      @92trdman 5 лет назад +4

      sd19delta, basically the difference is that Fast Sync does not limit your GPU frame rate while still presenting a viewable frame to your monitor (to reduce stuttering and input lag).
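
      A back-of-envelope check of the numbers a few comments up (Python, same 120 Hz / 480 FPS figures): when the newest complete frame is shown at each scanout, its worst-case age is roughly one render interval rather than one refresh interval.

      refresh_hz, render_fps = 120, 480
      print(f"refresh interval:            {1000 / refresh_hz:.3f} ms")   # 8.333 ms
      print(f"newest frame age at scanout: {1000 / render_fps:.3f} ms")   # 2.083 ms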

  • @kaushikkampli6911
    @kaushikkampli6911 5 лет назад +78

    I cannot play a single game without vsync

    • @Inf7cted
      @Inf7cted 5 лет назад +9

      i use G-sync. I can live without V-sync

    • @hazy33
      @hazy33 5 лет назад +23

      Yup me neither, screen tearing is horrendous.

    • @nikobellic7455
      @nikobellic7455 5 лет назад +2

      I use freesync, no need for vsync.

    • @SummonerArthur
      @SummonerArthur 5 лет назад +2

      I also cant. But its because my laptop overheats and v-sync locks it at 60fps

    • @kosmosyche
      @kosmosyche 5 лет назад +7

      Yeah, I am also very sensitive to tearing. Quake was the first game I turned VSync on (on 3DFX Voodoo1). Still remember the console command vid_wait 1. Since then I haven't played a single game without Vsync, cause it turns out I hate tearing much more than slow framerates or input lag or low resolutions. One of the worst and ugliest things that can happen to a game for me is if it starts tearing. Also the reason I don't bother with consoles, where you are basically left at the discretion of developers in this matter.

  • @SirMatthew
    @SirMatthew 5 лет назад +15

    The clips with tearing were painful to watch

  • @antraxbeta23
    @antraxbeta23 5 лет назад +41

    freesync master race (30-144Hz range)

    • @madson-web
      @madson-web 5 лет назад +2

      When business monitors adopt it, it will be the end of tearing

    • @d1oftwins
      @d1oftwins 5 лет назад +5

      @@madson-web
      And hopefully the end of overpriced G-Sync technology.

    • @tomtalk24
      @tomtalk24 5 лет назад +3

      @@madson-web No need in business, unless your powerpoint presentation really needs it lol. Creative yes.

    • @madson-web
      @madson-web 5 лет назад

      @@tomtalk24 so yeah, it will never be a standard this way. This thing needs to be freed from the "gamer" tag somehow

    • @tomtalk24
      @tomtalk24 5 лет назад +1

      Not everyone plays games, and not all gamers play on PC. So it won't ever be a "standard" as it has no use anywhere else other than rendering graphics in real time - thus a very small portion of the display market. We're talking about G-Sync/FreeSync, not HD/4K/HDR or whatever else that's in use (and usable) across the entire market.

  • @saruboss18
    @saruboss18 5 лет назад +4

    I use it in every game, or the screen tearing gets really bad, it's distracting

    • @saruboss18
      @saruboss18 5 лет назад

      @Wyatt Cheng where did diablo immortal came from???

  • @aalidas8951
    @aalidas8951 4 года назад +2

    my friend calls vsync shit, but his torn ass screen makes me sick

  • @psndude101
    @psndude101 5 лет назад +7

    I remember last gen when consolers swore Vsync didn't exist - the human eye couldn't see it, just like above 720p, just like above 30fps.
    Funny how they change their tune when consoles get it. Textures: I remember "debating" with consolers about improved textures and they said "who stands and looks at rocks when playing a game"; now they bicker and argue with each other about which plastic box's rocks look better. ha ha
    console hypocrite race

    • @bonjovi7399
      @bonjovi7399 5 лет назад +2

      Hilarious how they chop and change every other minute, pretend not to see stuff until they have it, then all of a sudden it's like the second coming of Christ if the other console doesn't have it. Then when they don't have it, it's back to being unable to see the difference.

    • @snetmotnosrorb3946
      @snetmotnosrorb3946 5 лет назад +1

      Most console gamers are content with what they have. Engaging in a technical discussion is mostly useless.

  • @Gamer2Key
    @Gamer2Key 5 лет назад +1

    So a game in borderless window mode automatically switches to triple buffering? That explains a lot. I always have micro stuttering in borderless! :O

  • @pepeiann
    @pepeiann 5 лет назад +39

    I usually turn on v-sync to get rid of screen tearing.

    • @dlbutters7164
      @dlbutters7164 5 лет назад +12

      No shit! (-_-)'

    • @madfinntech
      @madfinntech 5 лет назад +1

      Same if playing on TV instead of my G-sync monitors. But I also make sure I hit the constant 60 fps.

    • @SummonerArthur
      @SummonerArthur 5 лет назад +4

      I usually use v-sync to lock the game at 60fps so my laptop wont overheat

    • @pepeiann
      @pepeiann 5 лет назад

      Yeah lmao

    • @zombievac
      @zombievac 5 лет назад +6

      I usually breathe air to survive. I know, I know... that’s way too interesting for this thread!

  • @jimmybrite
    @jimmybrite 5 лет назад +1

    Nah, you should use Scanline Sync with RivaTuner Statistics Server (included with MSI Afterburner). All the benefits of vsync without the input lag, at least for someone like me on a 1080p60 panel.

  • @SummonerArthur
    @SummonerArthur 5 лет назад +14

    I cant be the only one that uses v-sync to lock the game at 60 fps so my pc wont overheat

    • @danyCD17
      @danyCD17 5 лет назад +6

      Usually, people lock their framerate using rivatuner

    • @mrmagoo-i2l
      @mrmagoo-i2l 5 лет назад +4

      No, only on older games like the first Witcher and Divinity.
      Cap at 60, force v sync.
      Otherwise they run terribly.

    • @SummonerArthur
      @SummonerArthur 5 лет назад

      @@mrmagoo-i2l Exactly.

    • @chrisrichfield8906
      @chrisrichfield8906 5 лет назад

      Sometimes when my laptop gets really hot

    • @locxha96
      @locxha96 5 лет назад +3

      you can get rivatuner or even better, nvidia inspector

  • @MHT-112
    @MHT-112 5 лет назад +4

    Digital Foundry: "[...] und auf Wiedersehen."
    Me, a german: "Wait a minute"

    • @leruty
      @leruty 5 лет назад

      Yeah man, a german in the team of DF :D

  • @thingsiplay
    @thingsiplay 5 лет назад +12

    G-Sync is the best, but it comes with the high licensing costs of Fu-King Nvidia…

    • @nextlifeonearth
      @nextlifeonearth 5 лет назад +4

      Freesync to the rescue. All you have to do is give even less money to Nvidia and give some to AMD instead.

    • @thingsiplay
      @thingsiplay 5 лет назад

      @Ignignokt did you delete your comment?

    • @David-ud9ju
      @David-ud9ju 5 лет назад +1

      @@nextlifeonearth Freesync isn't as good as gsync and you can't use Nvidia GPUs with free sync, which means you'll have to use an AMD GPU and they're just not up to the standard of Nvidia's GPUs. AMD have really been playing catch up with Intel and NVidia for years and, until they start actually coming out with products that can compete with Nvidia and Intel, we should avoid them or otherwise they won't change their business practice.

    • @forestR1
      @forestR1 5 лет назад

      Not sure how FreeSync is not as good as G-Sync; it 100% works, and the measured delay is the same as G-Sync's best results (with G-Sync having inconsistent measured delays - source: Bitwit and Hardware Unboxed). The RX 580 competes directly with the 1060 in price and performs on par with it. The next step up, Vega 56 and GTX 1070: they are the same price and the Vega beats it convincingly in most games. Add to that the fact that you can buy a Vega 56 and FreeSync monitor for the same money as a 1060 and G-Sync monitor, and that the only reason you can't use Nvidia GPUs with FreeSync is because Nvidia specifically won't let you (it would take nothing more than a firmware update), and then consider who should be avoided until they change their business practices.