People keep saying things along these lines, so let me just inform you all that what happens in one engine has no bearing on another engine. Patch 1.1: removed the bit where I spoke poorly about the downsides of delta timing.
"skyrim was made by bethesda" 💀 great video! this is the best explanation i've heard for what ticks actually are. i've never fully understood the difference between ticks and frames before, and people talk about tickrates quite a bit when it comes to setting up servers for bhopping (in CSS at least).
Minecraft with the Carpet mod gives an excellent demo of ticks. /tick freeze stops all game ticks while still allowing movement, /tick step runs a single tick, and you can speed up or slow down time by changing the tick rate: /tick rate 20 is the game's default, so /tick rate 10 is half speed. It's an excellent way to introduce the concept of ticks to people, and very helpful for debugging redstone contraptions.
/tick freeze stops the internal server ticks, and therefore doesn't harm or affect the client itself in any way. I think it's quite important to make clear how Minecraft works: your world or save is actually a separate server. The client doesn't really rely on ticks, only on the information provided by the internal server (your world). This means that while the server might be dying, your client is likely going to be just fine, as it's not affected by the server's tick timings. It's pretty much like getting the official Minecraft server JAR and running it on localhost; the results would be the same.
In Minecraft, however, there is no delta timing; everything assumes there will be 20 ticks per second, and changing that will make everything run slower or faster.
@@ShadowTheAge delta timing is ticks, and ticks are delta timing. What I mean is that the code has to explain to the CPU what a "tick" is, and at a low level the meaning of a tick is a delta time (as said in the video). Some games may call ticks "ticks" in their game code, but that's merely an abstraction; underneath, there's lower-level code that actually defines what a tick is, and that definition is a delta time. For example, take 10 ticks per second, which is a delta time of 0.1 seconds. In low-level code, such as C++, you'd run an infinite loop and add a "sleep" of 0.1 seconds on each pass (to let the CPU rest for 0.1 seconds or do other tasks), and you'd define that as one tick for higher-level code like Lua, so modders and devs only need to write Lua.
Minecraft sets ticks on the server, not the client. This is also how most multiplayer games do it: basically all of the game code runs on servers, and the client is just a dummy, drawing the graphics and sending out inputs.
idk why but i have a soft spot for these types of technical explanations of video game mechanics, especially in the Source engine. looking forward to seeing what vids u come up with next!
Something else to consider: when dt is extremely small (extremely high frame rates), floating-point accuracy issues may occur, or in other words, rounding errors. If tiny values are being accumulated into a large value, the tiny value may be too small compared to the large value and get rounded away, so the large value doesn't change. For example, if your velocity is tiny (0.001 units/sec), your position is large (1000 units away from 0), and the delta time is tiny (1000 fps), your position may not change: 1000 (position) + 0.001 (velocity) * 0.001 (deltaTime) = 1000.000001 (next frame's position), but it might just get rounded back to 1000. This is why, when you fly extremely far from the center of a map, you can start to notice bugs. There are ways to fix this, but sometimes devs don't bother.
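The rounding failure described above is easy to reproduce. Here's a small Python sketch (the `f32` helper and the specific numbers are just for illustration) that emulates the 32-bit floats many engines use for positions:

```python
import struct

def f32(x):
    """Round a Python float to the nearest 32-bit float, like a C `float`."""
    return struct.unpack('f', struct.pack('f', x))[0]

position = f32(1000.0)   # far from the world origin
velocity = 0.001         # units per second
dt = 0.001               # ~1000 fps, so ~1 ms per frame

step = f32(velocity * dt)            # 1e-6: representable on its own...
new_position = f32(position + step)  # ...but it vanishes next to 1000

print(new_position == position)  # the position never moves
```

The spacing between adjacent 32-bit floats near 1000 is about 6e-5, so any increment smaller than half of that simply rounds away.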
A lot of problems are caused when divisions are involved too. I remember seeing bugs in some old games where values proportional to dt were used as the divisor in certain algorithms, and when they got too small, that's how you get objects that fly around too quickly and so on. I believe in The Simpsons: Hit & Run, dt was calculated with such low precision that it could be 0 on modern hardware, which would instantly set your coordinates to NaN and make the world disappear.
I love how you explained this! I didn't know things like the airboat gun or the screen shaking! Splatoon's game engine also uses a tick rate, but it runs at 60 frames while the game itself plays at around 15 ticks online, which can result in very wacky experiences where one player can splat another and, half a second later, be splatted by the dead player. I think the reason it's so low is that Splatoon uses peer-to-peer rather than server-client like Source, and a lower tick rate helps keep connections more stable.
Since it's peer-to-peer, the issue with being splatted by a dead player is more likely that the other player's screen showed them killing the first player before they received the info that they themselves had died, resulting in both players saying "I killed you". Since there's no server to sanity-check or to prioritize one player based on timestamps, both kills end up going through regardless.
Splatoon's tickrate is 60, but it only sends and receives data packets from other players every 4 ticks (so 15 times a second). This is why replays in Splatoon 3 look really floaty and not 100% accurate in terms of the movements and actions you made during a battle: they show how you would have appeared to other players (you and the other players were only being captured every 4 ticks). The packets sent do still contain certain info about events that occurred during the other 3 ticks when packets weren't being sent, however. OatmealDome has an excellent article on Splatoon 2's netcode if you're interested. It's a bit technical and has more info than what I've mentioned here, but it was an interesting read.
It deserves to be pointed out how unique GoldSrc, Source, and (to a lesser extent) the id Tech engines are in how they process game logic in general. Everything, even the menus and single player, runs on a server — a local one in this case. In a way, when you open Half-Life 2 you effectively start a local dedicated server just for your game session. This is why enabling cheats is a server command, and the same goes for changing the tickrate. id Tech games tie some game logic to the frame rate even to this day, whereas Source relies on the tickrate. Frame-rate-dependent movement is why you have super-bounce bugs in Quake 3 and Call of Duty 4.
This is because Quake was made with multiplayer in mind, so rather than maintaining separate singleplayer and multiplayer versions of the same game, it was easier to just start a local server, which would have zero lag anyway. Minecraft does exactly the same thing.
interesting, i knew there was a physics server, but didn't know these games were running entirely on a local server. Makes sense considering you can't run more than one Source or GoldSrc game at once. It's kinda like Minecraft creating a dedicated internal server for singleplayer: so they wouldn't have to add features to singleplayer and multiplayer separately, they merged singleplayer into multiplayer.
In my experience with modding older Unreal games, which are basically a really strict implementation of delta time with the tickrate matching your current FPS, every single "high FPS issue" has boiled down to bad code or, very rarely, just not enough precision. A big example: one of the old Unreal games I mod, Harry Potter and the Chamber of Secrets (a classic PC adventure game if any of y'all never played it — it's actually really well made), has a lot of random high-FPS bugs. Most of the game-specific ones were made by one guy who always referred to DeltaTime (the normal name) as DTime. Basically all of his equations would produce highly inconsistent results, to the point we've nicknamed him DTime guy, and some of his comments very confidently stated that "DeltaTime isn't accurate", so he wrote way worse code because he just didn't quite understand what he was doing wrong lol. Wonder if he ever learned what was wrong in these 20 years haha. But yeah, this is a super common problem when people overthink things.
@@richardvlasek2445 Literally everything, because we were given source code to a proto version of the game lmfao. There's a lot of really cool shit going on in the modding Discord all the time if you're interested at all. As for what content people want to make, that depends on your creativity ofc. People are getting more and more into custom code and full game mods, which weren't possible with previous tools.
I didn't know Harry Potter and the Chamber of Secrets was considered a part of the Unreal series, but surely it makes sense when you think about it!!!!!!111
@@OmegaRC59 Remember Unreal is a game(series) and Unreal Engine is a game engine, if you ever disrespect the Unreal series like this again I will find you, and one day stand beside you at the bus stop making a mildly annoying noise for a few seconds! You hearing me dude!!!
Minecraft is, weirdly enough, an interesting example of this technique. As far as I know, it not only separates the gameplay update from the rendering update, it also puts them on completely different threads. Therefore Minecraft is, like, the only game I know of that can lag without drops in FPS: stuff just won't move, but it still renders at 60 fps. ...Though maybe I'm misunderstanding why exactly those things happen.
minecraft is a weird example. as you suggested, the client and server are isolated from each other. if the server runs at 1/5 the TPS (ticks per second; the default is 20 btw, so 1/5 would be 4), the game runs 4 times slower on the server end, so mobs move 4 times slower and so on, but the player still moves normally. (it's also worth mentioning MSPT, milliseconds per tick, which measures how long each tick takes to calculate; useful for lag testing and benchmarking.) it's also possible you'll experience rubber banding, just because of the nature of lagging, but there are some very cursed things you can do because of this.

similarly, your client can be running at 2 fps while the server runs at 20 tps perfectly. this is really annoying from the player's side, but the server can be practically idling. for instance (idk about newer versions, but in 1.17) if you place about 125000 composters, place a piston, and start clocking it at a 5gt interval (a fast observer clock), the client will absolutely kill itself, while basically nothing happens to the server. why the client kills itself is a mystery to me, but the suggestion that the client for some reason checks 125k composters on every piston activation (on the client side) is fucking hilarious.

in recent versions the lighting has been decoupled from the server thread, and for a while you could artificially slow down the lighting thread so much that it just quit processing updates altogether. this let you get cursed light sources, as well as objects that should emit light not emitting light (portals, for instance). that specific issue was patched fairly shortly after 1.16 iirc, but the threads are still decoupled.

excuse the long-winded rant, im a bit of a technical player and have seen some shit, so i know a bit about the game lol.
The Minecraft client's tickrate is actually tied to FPS, indirectly. The game attempts to tick every frame, and it will run at most 10 ticks per frame. This means that at 1 FPS your game runs at half speed; at 0.5 FPS, quarter speed. No clue what the fuck Mojang was thinking.
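A catch-up cap like the one described above can be sketched in a few lines of Python. The constants mirror the numbers in the comment (20 TPS, 10 ticks per frame); this is an illustration of the idea, not Mojang's actual code:

```python
TICK_DT = 1.0 / 20.0       # Minecraft targets 20 ticks per second
MAX_TICKS_PER_FRAME = 10   # the catch-up cap described above

def ticks_to_run(frame_time):
    """How many ticks a single frame gets to run under the cap."""
    ticks_due = int(frame_time / TICK_DT)
    return min(ticks_due, MAX_TICKS_PER_FRAME)

# at 1 FPS, 20 ticks are due but only 10 run: the game plays at half speed
print(ticks_to_run(1.0))   # 10
print(ticks_to_run(0.05))  # 1, the normal case at 20+ FPS
```

The cap keeps a slow frame from triggering an ever-growing backlog of ticks, at the cost of the simulation visibly slowing down.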
IIRC the Quake III engine not only has separate threads for the server & client, they also have to use the OS's network stack to communicate, as if the server were a dedicated instance on another system. That makes sense for a multiplayer-focused game: it saves a lot of unnecessary code, and apparently it worked fine for the singleplayer games that ran on the engine as well.
@@liquidextal iirc OSes are smart enough to detect programs using the network stack to communicate on the same machine, and they run the communication through internal messaging structures instead of the full network stack.
I'm familiar with how this works in Unity, so maybe I can offer a little insight as well. Like any modern game engine, including Source, Unity has a variable frame rate as well as a tick rate. This results in two different ways your scripts can run code regularly (well, there are others, but these are the types discussed in this video).

"Updates" happen every frame. These are useful for updating things that only change visually. For example, if I am running at 30 FPS, I don't need to update my HUD a full 60 times per second, since it is only rendered 30 times, so it makes sense to use the Update functionality. I can add an Update() function to any script and use Time.deltaTime to figure out how long it's been since the last Update() call.

"Fixed Updates" happen every 0.02 seconds (50 times per second) by default, but this can be changed per project in Unity. This allows for code that must run regularly at a fixed rate, such as physics updates. It calls the FixedUpdate() function on any script, and Time.fixedDeltaTime can be referenced to get the time since the last FixedUpdate() call. Since the interval is configured per project, it makes sense for plugins, or even your own project code, to use fixedDeltaTime to properly handle fixed intervals, despite it being unchanging (well, if the code is slow enough it could potentially miss a call, I suppose).

Relatedly, Unity also has a time scale, which allows you to slow down and speed up the game (Source has a similar concept). The Time class can handle this too and provide unscaled real-time intervals (the ones I documented above are scaled by the time scale). It's useful for any time effects in your game, as well as pausing the game with a zero time scale for a pause screen and such.
@Hoovy Simulator 2 host_framerate is a descendant from GoldSrc. There, if it's not zero, the engine overrides the "delta time" variable used for game logic with that command's value. So if you set it to 0.1, GoldSrc considers each frame to have taken 0.1 seconds. If you're running at 60 FPS, which is 1/60 ≈ 0.0166 seconds per frame, the game logic runs 0.1/0.0166 ≈ 6 times faster than intended. In fact, if your FPS is a constant value like 60, you can set host_framerate to the equivalent value of 1/60 and the experience will seem basically the same. That's all for Half-Life 1's GoldSrc, though. In Half-Life 2's Source, I'd imagine it's the same thing, but in relation to ticks and the tickrate lol
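The override described above can be modeled in a couple of lines. This is a hypothetical Python sketch of the behavior, not Valve's code; `HOST_FRAMERATE` stands in for the console variable:

```python
HOST_FRAMERATE = 0.1  # stand-in for the console variable; 0 disables the override

def logic_dt(real_dt):
    """The dt game logic sees: the override wins whenever it is nonzero."""
    return HOST_FRAMERATE if HOST_FRAMERATE > 0 else real_dt

# at a real 60 FPS, logic advances 0.1 s per ~0.0166 s of wall time: ~6x speed
speedup = logic_dt(1.0 / 60.0) / (1.0 / 60.0)
print(round(speedup, 1))  # 6.0
```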
It’s worth noting that the tick system was made specifically for multiplayer. The server runs the game simulation and replicates tick snapshots across the network so multiple Source clients can view roughly the same game state. I suspect they use that model for single player simply because it’s easier to have unified game logic, not because they thought it would fix framerate-linked bugs. You can read more about this tick system and Source networking on the Valve software wiki. This also means the assumption that the tick rate will always be 66 is very sane: you shouldn't need to change the tick rate of a single-player game at all, so the option being exposed is a bit silly.
While it's true that this video is just a simplified explanation of the client/server architecture, there's a reason games like Minecraft also use it in single player: it solves performance problems and is much more predictable. Before Minecraft separated the server and client, chunk generation used to briefly freeze the game. (Unrelated, but wayyy back when, Minecraft used to use the SYSTEM CLOCK to calculate delta time.)
@pinsplash I mean that the Source engine’s tick system is much, much more than a simple fixed-rate function system. Other game engines have fixed-rate functions, but that alone is nowhere near enough to make multiplayer possible. You mentioned that Source does interpolation, but you didn’t explain why. Source does interpolation because the game state is only updated on each tick. The vast majority of games update the game state every frame, so it’s always deterministic and there is no jitter. So why does Source do it? Because you can’t instantly broadcast every state update over the network; it’s just too expensive and error-prone. It’s a model developed solely for multiplayer and reused for single player because it’s easier than maintaining two distinct systems. I’m not saying that you’re wrong — it does help fix framerate-dependency issues — but talking about the broader context is fun and informative. :)
@Hunter M Using clocks to synchronize the client and server is really quite difficult. It means the client and server need tightly synchronized clocks, and all clients need a timer with high resolution (1 ms or less). Also, Source clients don’t really “run” the server ticks; they just store the snapshots and use them for interpolation. The client can sample the ticks at a rate different from the server's without much issue if tuned correctly. In fact, the interpolation period is 100 ms so clients can continuously interpolate even if they’re receiving ticks slower than 66 per second. Why? Valve was accounting for packet loss, network jitter, etc.
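That 100 ms interpolation window amounts to rendering slightly in the past, so that two snapshots usually bracket the render time. A minimal Python sketch of the idea (not Source's actual implementation; the snapshot format and delay value are illustrative):

```python
INTERP_DELAY = 0.100  # render 100 ms behind the newest snapshot

def interp_position(snapshots, now):
    """snapshots: list of (server_time, position) pairs sorted by time.
    Linearly interpolate between the two snapshots bracketing (now - delay)."""
    t = now - INTERP_DELAY
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return p0 + (p1 - p0) * alpha
    return snapshots[-1][1]  # nothing brackets t: hold the newest state

snaps = [(0.00, 0.0), (0.10, 10.0), (0.20, 20.0)]
print(interp_position(snaps, 0.25))  # halfway between the last two snapshots
```

Because the render time trails the newest data, a snapshot or two can arrive late without the client ever running out of data to interpolate between.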
What Bethesda games do is tie the framerate to game logic: essentially, the higher the framerate, the faster the game thinks, and the lower, the slower. Framerate = tickrate. They actually fixed this problem in Fallout 76, most likely using a separate parameter similar to Valve's tickrate parameter, in order to prevent people from getting free speedhacks by uncapping their framerate and looking at the ground — like GoldenEye, except to an extreme degree.
I remember playing Danganronpa 1 on game pass and my pc ran it at like 1200 fps, it was so funny watching how fast it moved. It’s a lot less funny and more embarrassing when it’s done by a triple A company in the modern age (Bethesda)
This leads me to believe that using ticks almost single-handedly future-proofed most Valve titles. It works unbelievably well with high FPS, and if the game is like CSGO, where the devs intended users to be able to switch tickrates, it also scales very well with higher tickrates, essentially being objectively superior to the frametime-based method most other games use. I think it starts to break when you get really close to 1000 FPS: your movement speed becomes out of sync with the server in CSGO, and your movement gets stuttery because the game is constantly fixing the desynchronization between client and server. However, such high FPS isn't something any player realistically can't live without, and it could probably be fixed as well if the devs cared about it. I wonder what causes THAT bug, but I guess it isn't easy to tell, as CSGO's source code has never been made available. By the way, as a competent CSGO player who happens to understand the basics of programming, I always knew what ticks were and how tickrates worked, but I never connected the dots and figured out that they were meant to separate game logic from FPS. Thanks for the video!
You don't need to know how long a frame will take when using delta time; you only need to know the time the previous frame started processing and the time the current frame starts processing, and this doesn't cause any instability by itself. It does mean that what you see on screen is actually X milliseconds in the past, where X is the time it took to process and display the frame (you COULD try compensating for that by guesstimating the time the frame will take, but that's not mandatory, and I frankly have no idea how well it works). The main sources of instability, afaik, are the delta time being too low (tickrate too high), causing floating-point inaccuracies, or the delta being too high (tickrate too low), causing things like physics engines to freak out as objects start moving such large distances per frame that they can phase through one another without technically colliding.
I use delta time in my games, and you actually can check how long the difference between each frame was (you can store the value in a variable and compare it). The issue with delta timing in the Spyro trilogy is that the game simply does not take it into account correctly; it was implemented, but not carefully. When coding with delta timing you should always limit how much effect it has where you apply it, to avoid such things. Oh, also, Donkey Kong 64 has the same issue.

For example, a common solution to lag breaking gameplay (allowing super high speeds or even clipping through walls) is to cap how much the game compensates for it: if the frame duration is bigger than a certain amount, clamp it (if it's bigger than 0.20 seconds, force it to NOT be bigger than that number). Capping it at about 5 frames' worth (roughly 0.2 seconds) works well, effectively making it so low framerates don't break things, fixing most of the issues. I say 5 frames since the clipping issues and extreme boosts are usually related to the framerate going below that, or even the game freezing completely (for example, a lag spike causing a really long frame difference, making the game think it's "fine" to make you go super fast to compensate). But again, it requires proper knowledge to apply it efficiently. Which, funnily enough, isn't too different from what you talked about in the video: making a proper implementation that takes the system into account correctly. lol
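The cap described above is a one-liner in practice. A sketch in Python (the 0.2-second limit is just the example number from this comment):

```python
MAX_DT = 0.2  # ~5 frames at 25 fps; anything longer is treated as a hitch

def clamped_dt(raw_dt):
    """Clamp the measured frame time so a freeze or lag spike can't
    catapult objects through walls when the game resumes."""
    return min(max(raw_dt, 0.0), MAX_DT)

print(clamped_dt(1.0 / 60.0))  # a normal frame passes through unchanged
print(clamped_dt(2.5))         # a 2.5 s freeze is treated as only 0.2 s
```

The trade-off is that the game visibly slows down during a long hitch instead of teleporting everything forward, which is almost always the lesser evil.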
@@SuperShortAndSweet Yeah, that happens because windows normally just stop rendering and stop "thinking" completely while you move them around (to save on resources, I guess). Applying the fix I mentioned helps with that, but of course it isn't perfect. One method I've seen that fixes it at the root is to fake the window itself: make the app borderless but render the window with the engine, adding all the functionality real windows have, so it looks exactly like a normal window but isn't, and you can move it around without causing issues (since engine-side window movement doesn't freeze the game).
@@Chillaxowo i think what's happening is that the Windows drag event is constantly being called, and there's speed or something tied to the event-handler function. making it borderless is a smart fix; i think that's why it works. another thing: i've heard filtering out the event completely also works
The issue with simply relying on delta time, even with failsafe implementations, is that the physics will be non-deterministic. For example, collisions will be checked more frequently on faster hardware, so interactions will still play out differently on different machines.
I think describing DeltaTime (DT) as the time that has passed since the previous frame would be a little more accurate. A simple implementation of an object moving based on DT would be to move the object DT * speed studs each frame.
@@Pinsplash DT is, from my understanding, the time from the start of the previous frame to the start of the current frame. It's the same for all calculations within a frame.
@Pinsplash Yes, the "delta" refers to the time elapsed since the previous frame. If your framerate is a fixed 60 fps, then your deltatime value every frame will be about 0.0166666. This is generally useful, because it means that if your framerate is halved, your deltatime is doubled, so objects move twice as far per frame to compensate for each frame happening half as often. However, it is not perfect by itself. Extremely high frame rates can lead to very tiny deltatime values, which might cause weirdness due to floating-point precision issues when fed into a highly sensitive physics engine. Very low framerates can result in collision detection issues, because objects move too far in a single frame to correctly detect collisions with surfaces: on one frame they're on one side of the wall, on the next frame they're on the other side, with no chance to check for a collision in between. This is often exploited in older games like Donkey Kong 64, where players deliberately lower the framerate by spamming projectiles so they can clip through doors.
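The compensation described above is easy to verify: scale the per-frame movement by dt, and the distance covered per second comes out the same at any framerate. A minimal sketch (the numbers are arbitrary):

```python
def distance_after(fps, seconds, speed):
    """Integrate speed * dt over `seconds` worth of simulated frames."""
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(seconds * fps)):
        pos += speed * dt
    return pos

# halving the framerate doubles dt, so the totals barely differ
print(distance_after(60, 1.0, 100.0))  # ~100 units
print(distance_after(30, 1.0, 100.0))  # ~100 units
```

The "barely" matters: the two sums differ by a tiny floating-point residue, which is exactly the kind of non-determinism other comments here mention.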
"I like speedrunning" *>Has a creepy yet incredibly soothing voice and makes videos that explain tiny game mechanics in a granular style, whilst also ensuring the video has an eerie vibe* Nooooooo you don't say.
The tick rate in the original HL1 engine was interesting and caused some interesting issues with some popular mods of the time (CS springs to mind), similar to the airboat. Unreal Tournament also kinda used a tickrate, but it was more tied to the FPS, though not limited by it (i'm not exactly sure). Striving for a higher tickrate/FPS in either HL1 or Unreal Tournament when running a server actually made the server run better, for obvious reasons :D there was a small app I had once called mmtimer.exe that I used to run to 'unlock' the "multimedia timer" on Windows to allow the fps to max out ;)
I played with tickspeed in Minecraft and got some amusing results. Setting the randomtickspeed to a high value will cause anything that relies on random ticks to happen _very_ quickly. I watched an automatic sugarcane farm begin to push out thousands of sugarcane every minute.
Pretty sure this isn't talking about that, I play minecraft after all :P randomTickSpeed is simply the number of random block updates (plants, liquids, etc.) per ACTUAL tick. The Minecraft equivalent is the Carpet mod
I've written a few engines (well, it's more like one that was heavily reworked), the first using deltaT and the second using a tickrate-style system. I haven't bothered to introduce deltaT into the game logic, since the plan is to never change the number of times the game updates per second, which in turn makes so many things _vastly_ simpler to program — collisions being a prime example, along with anything else where differences in rounding can have large impacts on determinism. This system also has the added benefit of being bananas-crazy more performant: if you only ever update game logic 60 times a second, then no matter how fast your computer runs the game, the game-logic overhead doesn't change. It effectively means that under most circumstances the game becomes very GPU-bound, and it can kind of just crank out interpolated frames like crazy. It's been very satisfying taking a still somewhat unoptimized engine from running at hundreds of fps to well over 1000 fps.
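The fixed-rate update plus interpolated rendering this comment describes is commonly built around an accumulator loop, in the spirit of the well-known "fix your timestep" pattern. A sketch (the names and the 60 Hz rate are illustrative, not this commenter's actual engine):

```python
TICK_DT = 1.0 / 60.0  # game logic always advances in steps of exactly this

def tick(pos, vel):
    """Fixed-rate game logic: no dt term anywhere, fully deterministic."""
    return pos + vel * TICK_DT

def advance(pos, vel, accumulator, frame_time):
    """Consume a variable frame time as whole fixed ticks; leftover time
    becomes an interpolation factor for the renderer."""
    accumulator += frame_time
    while accumulator >= TICK_DT:
        pos = tick(pos, vel)
        accumulator -= TICK_DT
    alpha = accumulator / TICK_DT  # blend last and current tick states when drawing
    return pos, accumulator, alpha

# one 30 fps frame carries exactly two 60 Hz ticks' worth of time
pos, acc, alpha = advance(0.0, 60.0, 0.0, 1.0 / 30.0)
print(pos)  # two ticks of movement
```

Because `tick()` never sees a variable dt, the simulation produces identical results on any machine; only the rendering rate varies.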
We use delta time quite a lot in Unity, although the technical details are a lot more complicated, because the engine actually runs many loops to make devs' lives easier. For deterministic code, we use fixed delta time (or fixed updates): a loop that runs a fixed, predictable number of times per second (50 calls per second by default).

So we use these different loops at the same time, depending on what we're trying to achieve. For example, physics should run at a consistent speed, so we use fixed dt. But we want rendering to be as fast as possible so the game looks smooth, so the engine handles that on a delta-time loop, since changes in FPS likely won't have serious adverse effects there. Same with user input: we want that to feel buttery smooth, so we check for it every frame (but don't perform any physics-based changes until the next fixed-dt loop).

Usually when a Unity game dev talks about ticks, they're referring specifically to an artificial loop used in networked games to handle client-side prediction and server reconciliation: a way for the server and client to communicate about when certain game actions were performed, so they can stay synchronized and reconcile any desync when needed. So I find it very interesting to hear how other games/engines handle similar issues.
Doom 1 & 2 ran at 35 gameticks (which means a tickrate of 35, I guess — not a programmer in any way), and AFAIK this idea was carried over to later iterations of id's engines. Given that Valve's GoldSrc engine is a heavily modified Quake engine (not sure which id Tech specifically), methinks they used ticks because they were already there? Btw, the whole 35-gameticks thing is actually important for DOOM speedrunning. Just saying, since you're interested in both. ;) Also, savage Bethesda burn 😂
"somebody had surely thought of this before and i just don't know them" i mean to be honest if you knew john carmack personally you'd be like, a legend by association
Correct me if I am wrong. Delta timing is usually implemented by setting last_time (at the start of the program) to the C standard library function clock(), then setting current_time to clock() every frame. The delta is then calculated as (current_time - last_time) / CLOCKS_PER_SEC, and at the end of the frame last_time is set to current_time.

clock_t current_time = 0;
clock_t last_time = clock();
double delta_time = 0;
for (;;) {
    current_time = clock();
    /* CLOCKS_PER_SEC converts clock ticks to seconds; note that clock()
       measures CPU time, so a real engine would use a wall-clock timer */
    delta_time = (current_time - last_time) / (double)CLOCKS_PER_SEC;
    frame_begin();
    frame_end();
    last_time = current_time;
}

this is how i'd implement it in my own game engine.
for deltatime games I usually keep a rolling average of the last 4 or 8 frame times. It smooths spikes and keeps systems more stable. Not perfect, but it helps if the tickrate or framerate fluctuates
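A rolling average like this takes only a few lines. A sketch in Python (the window size and class name are arbitrary):

```python
from collections import deque

class DtSmoother:
    """Average the last few frame times to damp one-frame spikes."""
    def __init__(self, window=8):
        self.samples = deque(maxlen=window)

    def smooth(self, raw_dt):
        self.samples.append(raw_dt)
        return sum(self.samples) / len(self.samples)

s = DtSmoother(window=4)
for _ in range(3):
    s.smooth(0.016)      # three ordinary ~60 fps frames
avg = s.smooth(0.100)    # one big spike gets diluted to ~0.037
print(avg)
```

The cost is a little latency: after a spike, the smoothed dt stays slightly high for a few frames, which is usually much less noticeable than the spike itself.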
As far as I'm aware, the use of ticks is something Source inherited from Quake. Id Software used ticks in Doom and Quake, so Valve used them in Goldsrc and then Source.
Yeah, I always thought tickrate was just the framerate, but for the physics simulation instead of the graphics, when the two are decoupled. Also, time is probably quantized in real life too
In the Godot engine, the process function updates every frame, while the physics process runs at a specific rate the developer can choose, and it always stays there no matter what fps the player has. That makes the delta time seen by code in the physics process more stable, so bugs like the one in GTA shouldn't happen afaik.
awesome video, this felt like the final piece of the puzzle that helped me, a complete noob, understand tickrates. the audio sounds lovely, your voice is soothing, and im looking forward to more technical videos in the future! just one small thing: the narration pace feels a touch too fast. relax the speed a little and make the stops a tad longer; it will make the information more digestible, especially for non-native speakers :)
Funny thing: while GoldenEye is largely tickless, it actually attempts framerate compensation with delta timing, but it doesn't get it quite right. The character controller's acceleration logic is quite complex and doesn't quite align across the range of achievable framerates, which causes the character to run slightly faster at increased framerates, though not to the same extent that the framerate increases. You get on the order of a few percent speed difference even though the framerate difference is much, much larger, which is why this trait went unnoticed for years. Physics is normally extremely tickrate-dependent even with delta timing due to sigma sensitivity; but third-party middleware is just fixed at whatever tickrate you set during initialisation and can be fed asynchronously, as it has its own interpolation logic at the interface to the game code.
Do you know more precisely what happens in this "quite complex acceleration logic", or know somewhere one can read about it? I've wondered about this stuff for a while now; I've searched a bit on Google in the past but haven't found anything (might just be a search n00b though XD).
Older PCs had to have "turbo" buttons that would limit the processor to 4.77 MHz, the speed of the original PC. Not only games but software in general suffered issues from things not running at that speed. 3kliksphilip did a video recently on getting over 1000 fps in CSGO: with more than 1000 fps, the game would stutter and teleport you around if you were on a server, despite the tickrate. God knows why that happens.
Usually when I think of interpolation I think of Quake 1 models: they basically have the option to interpolate instead of the frame-by-frame model animation the game was developed with, so characters animate and move between keyframes, which makes them look kind of wobbly.
The original Doom games (except for some console ports) are a good example of an earlier game using game ticks :P

One way I've seen games calculate the DT is using the time between when the last game frame started and when the current game frame started. (This is basically only doable if you have something else running on the same thread as the game sim, like the renderer.)

On the topic of issues with DT, I'd say they can be narrowed down to a few categories: assumptions, timestep instability, and precision issues. (These actually affect ticks too, but I'll get to that later.)

Assumptions are where the developer assumes something about the delta time (e.g., that it'll always be roughly X, or over Y, or under Z). A good example is applying velocity to something every update without actually taking the DT into account.

Timestep instability is when the DT varies wildly. Some things are sensitive to changes in the timestep and can just break if you change it constantly, or by too much or too little (or even if the timestep becomes too small!). An example of this is physics engines: they don't like it when you constantly change the length of the timestep.

Precision issues can be split into two subcategories. Unit precision is where your calculations break down because your timestep is too big or too small for the unit you're using (e.g., if you're using floats and have minuscule timesteps). Time precision is where the timestep is too large and things start to break down. Physics is also an example of time precision issues: motion that should be there can be "dampened" and disappear. Due to how physics engines work, this dampening can also cause stacked objects to start "bouncing", as gravity causes one object to go partially inside another, then those objects push each other, just slightly at first, then harder and harder as the physics fail to keep the objects both stacked and not intersecting.
A typical way to deal with this issue in the case of physics is "substepping", where the timestep is split into X smaller steps instead of being used as a single span of time. (Bouncing, and that phenomenon where a coin or bowl spins around on its rim after landing, are good examples of motion that gets dampened by long timesteps.)

Another example is when a gun fires every X frames, but the framerate falls under that number of frames. There are ways to deal with this, like firing multiple bullets per frame to account for the too-low framerate, but it's never perfect.

In the case of ticking, these can all manifest too, but of course only when you change the tickrate. Timestep instability is pretty rare and usually only happens if you're constantly changing the tickrate for whatever reason. Assumptions, as you've noticed yourself, can still happen, such as the developer not accounting for the fact that the tickrate can be changed. Precision issues happen when the tickrate is set too high or too low; typically unit precision issues pop up with too-high tickrates and time precision issues pop up with too-low tickrates. (These tend to behave exactly the same as with DT.)
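The "assumption" failure mode described above is easy to demonstrate: move something by a fixed amount per update and the distance traveled depends on the framerate. A minimal Python sketch (function names are made up for illustration, not from any real engine):

```python
def move_assuming_60fps(pos, velocity):
    # bug: bakes in the assumption that every update covers 1/60 s
    return pos + velocity * (1 / 60)

def move_with_dt(pos, velocity, dt):
    # fix: scale motion by however long the frame actually took
    return pos + velocity * dt

# simulate one real second of movement at 10 units/sec, two framerates
pos_30fps, pos_120fps = 0.0, 0.0
for _ in range(30):
    pos_30fps = move_assuming_60fps(pos_30fps, 10.0)
for _ in range(120):
    pos_120fps = move_assuming_60fps(pos_120fps, 10.0)
# buggy version: 5.0 vs 20.0 units covered in the same real second

fixed_30fps = 0.0
for _ in range(30):
    fixed_30fps = move_with_dt(fixed_30fps, 10.0, 1 / 30)
# dt-aware version covers 10.0 units regardless of framerate
```

The dt-aware function gives the same result at any framerate, which is exactly the property the buggy one lacks.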
@@hermitgreenn That's only the case with Quake and later games. Doom didn't have Client/Server netcode, too, only peer-to-peer, so it wasn't able to do that either.
You really said reality doesn't work in intervals, 1 thing at a time very fast, while you think a word a time before you speak a word at a time. The brain is truly oblivious to its design.
(afaik, I might be wrong) Source saves specific variables defined in the game code, named in a "DATADESC", for the server. When loading the save, Source restores the datadesc and calls every entity's Restore() function.
good video, basically the same explanation that everyone else gave as well, i thought you'd go into detail of how it is processed using which functions and INetworkable etc. they did it better in csgo
dt is typically some kind of rolling average of the last n frames. Using the previous frame's value causes some very obvious stuttering when a single frame takes longer than expected, since the "jump" is delayed until the next frame; it looks very silly. Using a rolling average is much smoother.
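The rolling-average idea in that comment can be sketched in a few lines (this is a generic illustration, not any particular engine's smoothing scheme):

```python
from collections import deque

class SmoothedDelta:
    """Rolling average of the last n raw frame times."""
    def __init__(self, n=10):
        self.samples = deque(maxlen=n)

    def update(self, raw_dt):
        self.samples.append(raw_dt)
        return sum(self.samples) / len(self.samples)

sd = SmoothedDelta(n=4)
# one long frame (50 ms) among normal ~16 ms frames
dts = [0.016, 0.016, 0.050, 0.016]
smoothed = [sd.update(dt) for dt in dts]
# the 0.050 spike is spread across several frames instead of causing
# one big visible jump in everything that scales with dt
```

A tradeoff worth noting: averaging makes the game briefly run at slightly the wrong speed after a spike, which is why some engines clamp instead of (or in addition to) averaging.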
I emulated the PS2 SpongeBob game and found out quickly just how much game logic was tied to FPS. Who would have thought 1440p & 360 FPS would nuke PS2-era software?
There are vastly more ways to write code that gives different results at different tick rates than ways to write code that is almost tickrate-independent. A lower tickrate is a bigger time step and a worse approximation. If you want similar results at a low framerate you need quite a bit of care and complication, like RK4 is to Euler's method.
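The "bigger time step = worse approximation" point is easy to see with explicit Euler on a simple decay equation, dy/dt = -y, whose exact value at t=1 is 1/e. Same code, two "tickrates", very different error:

```python
import math

def euler_decay(dt, t_end=1.0, y0=1.0):
    """Explicit Euler integration of dy/dt = -y from t=0 to t_end."""
    y = y0
    for _ in range(round(t_end / dt)):
        y += -y * dt   # one Euler step: y(t+dt) ~ y(t) + f(y)*dt
    return y

exact = math.exp(-1.0)                     # true answer: ~0.3679
err_coarse = abs(euler_decay(0.5)  - exact)  # "2 ticks per second"
err_fine   = abs(euler_decay(0.01) - exact)  # "100 ticks per second"
# err_coarse is roughly 60x larger than err_fine: lowering the tickrate
# changed the result, even though the code is nominally dt-aware
```

A higher-order integrator like RK4 shrinks this error dramatically at the same step size, which is the extra "care and complication" the comment refers to.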
8:04 this fix still has issues related to tickrate: if the tickrate were 11, it would still fire 11 times per second, because the code executes every tick. I think that's why Valve decided to just leave the fire method without calculating shooting intervals.
I76 Nitro Riders had an odd yet hilarious uncapped-tickrate problem. They'd managed to get the game speed consistent, but not the force of gravity on the vehicle: the higher the CPU speed, the stronger gravity was. If your computer was too fast you couldn't make some of the jumps, making the game impossible to complete.
The frame system you described actually was the standard for most games all the way up until around 2015-2016, with many titles like Halo, Gears of War and so on using this method, because the consoles' software was only ever expected to run on that exact hardware model. It wasn't until we started porting games and remaking them for newer console generations that this practice stopped, once there was a recognition that the same games needed to run on multiple models of hardware.

PC games like Half-Life didn't use this method because the PC platform is ultimately too varied: most PCs have completely different specs, and not just custom-built ones; even store-bought pre-made builds often changed parts around within the same 'model' being sold. Valve, Gearbox, and other developers around this time had an edge here because they recognized a problem that wouldn't affect other developers and publishers until much later.

I imagine this made it easier for them to ship their games simultaneously on several platforms, like the Orange Box, whereas other developers might have to rewrite their code to make an engine work on a different console. The modularity of a tick system became very convenient for someone like Valve, who was able to release the same game (Half-Life 2) on Xbox, Xbox 360, PlayStation 3, Mac, Linux and Windows without much hassle. Meanwhile, a developer like 343, post-Bungie split, was tasked with bringing games that used this framerate system onto modern hardware, and probably found rather quickly that they ran poorly and had to be substantially rewritten.
People keep saying things along the lines of this so let me just inform you all that what happens in one engine has no bearing on another engine.
Patch 1.1
Removed bit where I spoke poorly about the downsides of delta timing
"You won't understand what ticks until you understand the issues with the alternatives."
Ah yes, _lice._
I know exactly what a tick rate is. It's how many ticks bite you per minute and it influences how quickly you get Lyme Disease.
lmaoo
exactly
A disease that turns you into a lime? Horrifying
The midwest be like
I must have a really high tick rate then 😿
"skyrim was made by bethesda" 💀
great video! this is the best explanation i've heard for what ticks actually are. i've never fully understood the difference between ticks and frames before, and people talk about tickrates etc. quite a bit when it comes to setting up servers for bhopping capabilities (in css at least).
the credits at the end were funny as well lmao
2 days ago?
@@nodrognameerf unforeseen consequences of being a sponsor
@@olegmoki lol
That gave me a good laugh
Minecraft with the Carpet mod gives an excellent demo of ticks.
/tick freeze allows you to stop all game ticks while still allowing movement
/tick step does one step
You can speed or slow time by changing the tick rate
/tick rate 20 is the default rate of the game
/tick rate 10 makes 1/2 speed
It's an excellent way to introduce the concept of ticks to people, and very helpful for debugging redstone contraptions
/tick freeze stops the internal server ticks, and therefore doesn’t harm or affect in any way the client itself.
I think it’s quite important to make clear how Minecraft works: your world or save is actually a separate server. The client doesn’t necessarily rely on ticks, only on the information provided by the internal server (your world). This means that while the server might be dying, your client is likely going to be just fine, as it’s not affected by the server’s tick timings. It’s pretty much like getting the official Minecraft server JAR and running it on localhost; the results would be the same.
In Minecraft, however, there is no delta timing; everything assumes there will be 20 ticks per second, and changing that makes everything slower or faster.
@@ShadowTheAge delta timing is ticks, and ticks are delta timing. I mean, the code has to explain to the CPU what a "tick" is, and the meaning of a tick at the low level is delta time (as said in the video). Maybe some games call ticks "ticks" in their game code, but that's merely an abstraction; there's lower-level code that actually defines what a "tick" is, and it's delta time. For example, let's do 10 ticks per second, which is a delta time of 0.1 seconds. In low-level code, such as C++, you run an infinite loop, and on each pass you add a "sleep" command for 0.1 seconds (to let the CPU rest or do other tasks), and you define this as 1 tick for higher-level code like Lua, so modders and devs only need to code in Lua.
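The loop that comment describes can be sketched like this (Python for illustration). One detail worth adding: a real engine sleeps for the *remaining* time of the tick, not a flat 0.1 s, otherwise simulation work makes every tick run long:

```python
import time

TICK_RATE = 10             # ticks per second
TICK_DT = 1.0 / TICK_RATE  # fixed delta time handed to game logic: 0.1 s

def run_ticks(n, simulate):
    """Run n fixed-rate ticks, sleeping away whatever time the
    simulation didn't use so ticks stay roughly on schedule."""
    for tick in range(n):
        start = time.perf_counter()
        simulate(tick, TICK_DT)                 # game logic for one tick
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, TICK_DT - elapsed)) # sleep the remainder

ticks_seen = []
run_ticks(3, lambda tick, dt: ticks_seen.append((tick, dt)))
# ticks_seen == [(0, 0.1), (1, 0.1), (2, 0.1)], spread over ~0.3 s
```

Note that the game logic always receives the same fixed dt, which is precisely what distinguishes ticking from per-frame delta timing, where dt varies frame to frame.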
so now you can TAS minecraft, neat
Minecraft sets ticks on the server, not the client. This is also how most multiplayer games do it, as basically all of the game code runs on the server, and the client is just a dummy, drawing the graphics and sending out inputs.
idk why but i have a soft spot for these type of technical explanations of video game mechanics, especially in the source engine. looking forward to see what vids u come up with next!
You would love shounic's videos, if you don't already watch them
@@yarknark i'm aware of that channel and really like the vids but thank u anyway :)
same
Ooh, nice MSN Messenger PFP.
Something else to consider: When dt is extremely small, (extremely high frame rates) floating point accuracy issues may occur, or in other words, rounding errors.
If tiny values are being accumulated into a large value, the tiny value may be too small compared to the large value and get rounded down, and the large value doesn't change.
For example, if your velocity is tiny (0.001 units/sec), your position is large (1000 units away from 0), and the delta time is tiny (0.001 at 1000 FPS), your position may not change: 1000 (position) + 0.001 (vel) * 0.001 (deltaTime) = 1000.000001 (nextFramePosition), but it might just get rounded to 1000.
This is why when you fly away from the center of a map extremely far, you can start to notice bugs.
There are ways to fix this but sometimes devs don't bother.
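That rounding effect can be reproduced exactly by modeling 32-bit floats (the numbers below assume an engine storing positions as single-precision floats, as the comment does):

```python
import struct

def f32(x):
    """Round a Python float to IEEE-754 single precision, modeling an
    engine that stores positions and velocities as 32-bit floats."""
    return struct.unpack('f', struct.pack('f', x))[0]

pos      = f32(1000.0)   # far from the origin
velocity = f32(0.001)    # units per second
dt       = f32(0.001)    # 1000 FPS

new_pos = f32(pos + velocity * dt)
# the 1e-6 step vanishes: float32 spacing near 1000 is ~6.1e-5, so
# 1000 + 0.000001 rounds straight back to 1000.0 -> the object is stuck
```

The same step applied near the origin would register fine, which is why the bugs show up specifically when you fly far from the center of the map.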
and what you said can be seen on GTA 3 and VC's engine
also, the dt system is often the culprit of nondeterministic physics.
The valve equivalent to the farlands
A lot of problems are caused when divisions are involved too. I remember seeing bugs in some old games where values proportional to dt were being used as the divisor for certain algorithms and when they get too small that's how you get the objects that fly around too quickly and so on. I believe in the simpsons hit and run, dt was calculated with such low precision that it could be 0 on modern hardware, which would just instantly set your coordinates to NaN and the world would disappear.
@@jetfaker6666
>instantly set your coordinates to NaN and the world would disappear.
Lmao
I love how you explained this ! I didn't know things like the airboat gun or the screen shaking!
Splatoon's game engine also uses a tick rate, but runs at 60 frames while the game itself plays at around 15 ticks online. which can result in very wacky experiences where 1 player can splat another and half a second later be splatted by the dead player. I think the reason why it's so low is because Splatoon uses Peer to Peer rather than Server-Client like Source, and a lower tick rate helps keep connections more stable.
Since it's peer to peer, the issue with being splatted by a dead player is more likely that the other player's screen showed that they killed the first player before they received info that they died, resulting in both players saying "I killed you".
Since there's no server to sanity check or to prioritize one player based off timestamps, both kills end up going through regardless.
Splatoon's tickrate is 60, but it only sends and receives data packets from other players every 4 ticks (so 15 times a second). This is why replays in Splatoon 3 look really floaty and not 100% accurate in terms of the movements and actions you made during a battle: they show how you would have appeared to other players (since you and the other players were only being captured every 4 ticks). The packets sent still contain certain info about events that occurred during the other 3 ticks when packets weren't being sent, however.
OatmealDome has an excellent article on Splatoon (2)'s netcode if you're interested. It's a bit technical and has more info than what I've mentioned here, but it was an interesting read.
When a company doesn't host servers but still asks you to pay for online
I guess it helps that ink moves a lot slower than bullets which are usually implemented as a single raycast
It deserves to be pointed out how unique the GoldSrc, Source and (to a lesser extent) id Tech engines are regarding how they process game logic in general. Everything, even the menus and single player, runs on a server, a local server in this case. When you open Half-Life 2 you effectively start a local dedicated server just for your game session. This is the reason enabling cheats is a server command, same with changing the tickrate. id Tech games tie some game logic to the frame rate even to this day, whereas Source relies on the tickrate. Frame-rate-dependent movement is why you have super bounce bugs in Quake 3 and Call of Duty 4.
This is because Quake was made with multiplayer in mind, so instead of maintaining separate singleplayer and multiplayer versions of the same game, it's easier to just start a local server, which has essentially zero lag either way. Minecraft does exactly the same.
@@sbritorodr
It's all just a really well made Quake mod
@@davisdf3064At this point all first person shooters are essentially a heavily modded Wolfenstein 3D game
interesting, i knew there was a physics server, but didn't know these games were running entirely on a local server. Makes sense considering how you can't run more than one Source or Goldsrc game at once. It's kinda like Minecraft creating a dedicated internal server for singleplayer so they wouldn't have to add features to singleplayer and multiplayer separately; they just merged singleplayer with multiplayer.
In my experience modding older Unreal games, which are basically a really strict implementation of delta time with the tickrate matching your current FPS, every single "high FPS issue" has boiled down to bad code, or very rarely just not enough precision. A big example: one of the old Unreal games I mod, Harry Potter and the Chamber of Secrets (a classic PC adventure game if any of y'all never played it; it's actually really well made), has a lot of random high-FPS bugs. Most of the game-specific ones were made by one guy who always referred to DeltaTime (the normal name) as DTime. Basically all of his equations would produce highly inconsistent results, to the point we've nicknamed him DTime guy, and some of his comments very confidently stated that "DeltaTime isn't accurate"; he wrote way worse code because he just didn't quite understand what he was doing wrong lol. Wonder if he ever learned what was wrong in these 20 years haha
But yeah this shit is like a super common problem when people overthink things
@@richardvlasek2445 Literally everything because we were given source code to a proto version of the game lmfao. There's a lot of really cool shit going on in the modding discord all the time if you're interested at all. As for what content people want to make, depends on your creativity ofc. People are getting more and more into custom code and full game mods, which weren't possible with previous tools
I didn't know Harry Potter and the Chamber of Secrets was considered a part of the Unreal series, but surely it makes sense when you think about it!!!!!!111
@@TarenGarond Lmfao I get the joke but I mean it's a UE1 game, same with HP1PC and HP3PC is on UE2
@@OmegaRC59 Remember Unreal is a game(series) and Unreal Engine is a game engine, if you ever disrespect the Unreal series like this again I will find you, and one day stand beside you at the bus stop making a mildly annoying noise for a few seconds!
You hearing me dude!!!
@@TarenGarond you know what i take it back hp2 is part of the unreal series
The Half-Life version of shounic
Or decino
Minecraft is, weirdly enough, an interesting example of this technique. As far as I know, it not only separates the gameplay update from the rendering update, it also puts them on completely different threads. Therefore Minecraft is, like, the only game I know of that can lag without a drop in FPS: stuff just won't move, but still renders at 60 FPS.
.....Though maybe I'm misunderstanding why exactly those things happen
minecraft is a weird example. As you suggested, the client and server are isolated from each other. If the server runs at 1/5 the TPS (ticks per second; the default is 20, so 1/5 would be 4), the game will run 4 times slower on the server end, so mobs will move 4 times slower and so on, but the player will still move normally. (It's also worth mentioning MSPT, milliseconds per tick, which measures how long each tick takes to calculate; it's useful for lag testing and benchmarking.) You may experience rubber banding as well, just because of the nature of lag, but there are some very cursed things you can do because of this.
Similarly, your client can be running at 2 FPS while the server is running at a perfect 20 TPS. This is really annoying from the player's side, but the server can be practically idling. For instance (idk about newer versions, but in 1.17), if you place about 125,000 composters, then place a piston and start clocking it at a 5gt interval (a fast observer clock), the client will absolutely kill itself, yet basically nothing happens to the server. Why the client kills itself is a mystery to me, but the suggestion that the client for some reason checks 125k composters on every piston activation (client-side) is hilarious.
In recent versions the lighting has been decoupled from the server thread, and for a while you could artificially slow down the lighting thread so much that it just quit processing updates altogether. This let you get cursed light sources, as well as objects that should emit light not emitting light (portals, for instance). That specific issue was patched fairly shortly after 1.16 iirc, but the threads are still decoupled.
excuse the long-winded rant, im a bit of a technical player and have seen some shit, so i know a bit about the game lol.
The minecraft client's tickrate is actually tied to FPS, indirectly. The game attempts to tick every frame. It will do a max of 10 ticks per frame. This means that at 1 FPS, your game will run at half speed. 0.5 FPS, quarter speed. No clue what the fuck mojang was thinking.
@@DefineOutside yet the server is still independent from the client lol. Mojang does a lot of stupid shit.
IIRC the Quake III engine not only has separate threads for server & client, but they have to use the OS's network stack to communicate, as if the server was a dedicated instance on another system
makes sense for a multiplayer focused game, saves a lot of unnecessary code, and apparently worked fine for the singleplayer games that ran the engine as well.
@@liquidextal iirc OSes are just smart enough to detect programs using network stack to communicate on the same machine and run the communications through internal messaging structures instead of full network stack.
I'm familiar with how this works in Unity so maybe I can offer a little insight as well.
Like any modern game engine, including Source, Unity has a variable frame rate as well as a tick rate... this results in two different ways your scripts can run code regularly (well, there are others, but these are the two discussed in this video).
"Updates" happen every frame. These are useful for things that only change visually. For example, if I'm running at 30 FPS I don't need to update my HUD a full 60 times per second, since it's only rendered 30 times, so it makes sense to use the Update functionality. I can add an Update() function to any script and use Time.deltaTime to figure out how long it's been since the last Update() call.
"Fixed Updates" happen every 1/60 of a second by default, but this can be changed per project in Unity. This allows for code that must be run regularly at a fixed rate, such as physics updates. This calls the FixedUpdate() function on any script and Time.fixedDeltaTime can be referenced to determine the time since the last FixedUpdate() call. Since the interval is configured per-project, it makes sense to use fixedDeltaTime despite it being unchanging (well, if the code is slow enough, potentially it could miss a call I suppose) for plugins or even just your own project code to properly handle fixed intervals.
Related, Unity also has time scale, which allows you to slow down and speed up the game (Source has a similar concept). The Time class can handle this too and provide unscaled real time intervals (the ones I documented above are scaled based on time scale). It's useful for any time-effects in your game as well as pausing the game with a zero time scale for a pause screen and such.
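The Update/FixedUpdate split described above is an instance of the common fixed-timestep accumulator pattern. A minimal sketch in Python (illustrative only, not Unity's actual implementation):

```python
def frame(render_dt, accumulator, fixed_dt, fixed_update, update):
    """One rendered frame: run however many fixed steps fit into the
    accumulated time, then do the once-per-frame update."""
    accumulator += render_dt
    while accumulator >= fixed_dt:
        fixed_update(fixed_dt)   # physics etc. always sees the same dt
        accumulator -= fixed_dt
    update(render_dt)            # visual-only work runs once per frame
    return accumulator           # leftover time carries into next frame

counts = {"fixed": 0, "update": 0}
def fixed_update(dt): counts["fixed"] += 1
def update(dt): counts["update"] += 1

acc = 0.0
for _ in range(30):              # one second's worth of 30 FPS frames
    acc = frame(1 / 30, acc, 1 / 60, fixed_update, update)
# counts == {"fixed": 60, "update": 30}: the physics step still ran
# 60 times in that second, two fixed steps per slow rendered frame
```

This is why physics stays stable at any framerate: FixedUpdate's dt never changes, and the accumulator absorbs the mismatch between frame time and tick time.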
@Hoovy Simulator 2
host_framerate is a descendant from Goldsource. In there, if it's not zero, the engine overrides the "delta time" variable used for game logic with that command's value. So if you set it to 0.1, Goldsource would consider each frame to have taken 0.1 seconds.
If you're running at 60FPS, which is 1/60 ≈ 0.0166 seconds per frame, this would result in the game logic running at 0.1/0.0166 ≈ 6 times faster than intended.
In fact, if your FPS is a constant value like 60, you can set host_framerate to the equivalent value 1/60 and the experience will seem basically the same.
That's all for Half Life 1's Goldsource though. In Half Life 2's Source, I'd imagine it's the same thing, but in relation to the ticks and the tickrate lol
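The arithmetic in that comment checks out, and it's worth seeing the relationship spelled out (a sketch of the math, not engine code):

```python
# numbers from the comment above: rendering at a real 60 FPS while
# host_framerate forces the logic dt to 0.1 s per frame
real_frame_dt = 1 / 60   # seconds of real time per rendered frame
host_framerate = 0.1     # forced "delta time" fed to game logic

# every real frame advances 0.1 s of game time, so per real second:
speedup = host_framerate / real_frame_dt
# speedup == 6.0, matching the ~6x figure above; setting
# host_framerate to exactly 1/60 instead gives a factor of 1.0,
# which is why the game looks normal at a locked 60 FPS in that case
```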
great explanation. i'd occasionally hear about the tick system in hl2/tf2 discussion but i never understood how it worked until now
It’s worth noting that the tick system was made specifically for multiplayer. The server runs the game simulation and replicates the tick snapshots across the network so multiple source clients can view roughly the same game state. I suspect they use that model for single player simply because it’s easier to have unified game logic not because they thought it would fix bugs that are linked to framerate.
You can read more about this tick system and source networking on the valve software wiki.
This also means that the assumption that the tick rate would always be 66 is very sane. You shouldn't need to change the tick rate of a single player game at all so the option being exposed is a bit silly.
This video contains 5 examples of singleplayer games that do have issues stemming from frame rate, so this doesn't make sense.
Games don't need to run the client and server at the same TPS to function correctly, as long as they track stuff with real time instead of ticks
While it's true that this video is just a simplified explanation of the client/server architecture, there's a reason games like Minecraft also use it in single player: it solves performance problems and is much more predictable. Before Minecraft separated the server and client, chunk generation used to briefly freeze the game. (Unrelated, but way back when, Minecraft used to use the SYSTEM CLOCK to calculate delta time.)
@pinsplash I mean that the Source engine’s tick system is much, much more than a simple fixed-rate function system. Other game engines have fixed-rate functions, but that alone is nowhere near enough to make multiplayer possible.
You mentioned that Source does interpolation but you didn’t explain why. Source does interpolation because the game state is only updated on each tick. The vast majority of games update the game state on every frame so it’s always deterministic and there is no jitter. So why does source do it? Because you can’t instantly broadcast every state update over the network. It’s just too expensive and error prone. It’s a model that is developed solely for multiplayer and reused for single player because it’s easier than maintaining two distinct systems.
I’m not saying that you’re wrong. It does help fix framerate dependency issues, but talking about the broader context is fun and informative. :)
@Hunter M
Using clocks to synchronize the client and server is really quite difficult. It means that the client and server need to have their clocks tightly synchronized and all clients need to have a timer with high resolution (1ms or less).
Also, Source clients don’t really “run” the server ticks. They just store the snapshots and use them for interpolation. The client can sample the ticks at a rate different from the server without much issue if tuned correctly. In fact, the interpolation period is 100 ms so clients can continuously interpolate even if they’re receiving ticks slower than 66 per second. Why? Valve was accounting for packet loss, network jitter, etc.
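Snapshot interpolation as described above can be sketched in a few lines. The 100 ms delay mirrors the interpolation period mentioned in the comment; the code itself is a generic illustration, not Source's actual implementation:

```python
def interpolate_position(snapshots, render_time, interp_delay=0.1):
    """Render slightly in the past so two bracketing snapshots usually
    exist to blend between. snapshots: (server_time, position) pairs,
    oldest first."""
    t = render_time - interp_delay
    for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)   # 0..1 between the snapshots
            return p0 + (p1 - p0) * alpha
    return snapshots[-1][1]   # no bracketing pair: hold the latest value

snaps = [(0.00, 0.0), (0.05, 5.0), (0.10, 10.0)]
shown = interpolate_position(snaps, 0.125)
# rendering at t=0.125 shows the state from t=0.025: about 2.5 units,
# halfway between the first two snapshots
```

The built-in delay is exactly the tradeoff Valve made: everything you see is ~100 ms old, but motion stays smooth even when a snapshot arrives late or is dropped.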
Half-Life was based on the Quake engine. The multiplayer "QuakeWorld" did ticks, which is how the first speed cheats worked.
what bethesda games do is tie game logic to the framerate: essentially, the higher the framerate, the faster the game thinks, and the lower, the slower (framerate = tickrate). they actually fixed this problem in fallout 76, most likely using a separate parameter similar to valve's tickrate parameter, in order to prevent people from getting free speedhacks by uncapping their framerate and looking at the ground, like goldeneye except to an extreme degree
didn't they just fix it by adding more preset framerates?
I remember not being able to play Skyrim because my frame rate was too high and the horse would always go way off course in the beginning
When you turn off the fps limit in Skyrim the game just goes mad
I remember playing Danganronpa 1 on game pass and my pc ran it at like 1200 fps, it was so funny watching how fast it moved. It’s a lot less funny and more embarrassing when it’s done by a triple A company in the modern age (Bethesda)
This leads me to believe that using ticks almost singlehandedly futureproofed most Valve titles. It works unbelievably well with high FPS, and if the game was like CSGO where the devs intended the users to be able to switch tickrates, it also scales very well with higher tickrates, essentially being objectively superior to the frametime-based method most other games use.
I think it "starts" to break when you get really close to 1000 FPS; for example, your movement speed becomes out of sync with the server in CSGO and your movement gets stuttery due to the game constantly fixing the desynchronization between client and server. However, such high FPS isn't something any player realistically needs, and it could probably be fixed as well if the devs cared about it. I wonder what causes THAT bug, but I guess it isn't easy to tell, as CSGO's source code has never been made available.
By the way, as a competent CSGO player who happens to understand the basics of programming, I always knew what ticks were and how tickrates worked, but I never connected the dots and figured out that they were meant to seperate game logic from FPS. Thanks for the video!
You don't need to know how long a frame will take when using delta time; you only need the time the previous frame started processing and the time the current frame starts processing. That doesn't cause any instability by itself.
It also means that what you see on the screen is actually X milliseconds in the past, where X is the time it took to process and display the frame. (You COULD try compensating by guesstimating the time the frame will take, but that's not mandatory, and I frankly have no idea how well it works.)
The main sources of instability, afaik, are the delta time being too low (tickrate too high), causing floating-point inaccuracies, or the delta being too high (tickrate too low), causing stuff like physics engines to freak out as objects start moving such large distances per frame that they can phase through one another without technically colliding.
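That phase-through failure at large deltas (tunneling) is easy to reproduce with a naive overlap-only collision test. A Python sketch with made-up numbers:

```python
def step_with_overlap_check(pos, velocity, dt, wall=(10.0, 10.2)):
    """Naive collision scheme: the object 'hits' the wall only if its
    NEW position lands inside the wall slab. No swept test is done."""
    new_pos = pos + velocity * dt
    hit = wall[0] <= new_pos <= wall[1]
    return new_pos, hit

# normal frame: 10 u/s * 0.01 s = 0.1 u step lands inside the wall
_, hit_normal = step_with_overlap_check(9.95, 10.0, 0.01)
# lag spike: same speed, dt ten times larger -> a 1.0 u step that
# jumps clean over the 0.2 u thick wall in a single update
_, hit_spike = step_with_overlap_check(9.95, 10.0, 0.1)
# hit_normal is True, hit_spike is False: the object tunneled through
```

Real engines avoid this with swept (continuous) collision tests or by substepping the physics, both of which come at a CPU cost.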
I use Delta Time in my games, and you actually can check how long the gap between each frame is (you can store the value in a variable and compare it).
The issue with Delta Timing in the Spyro Trilogy is that the game simply doesn't take it into account correctly; it was implemented, but not carefully. When coding with Delta Timing you should always limit how much effect the delta can have where you apply it, to avoid such things. Oh, also, Donkey Kong 64 has the same issue.
For example, a common fix for lag breaking gameplay (allowing super high speeds or even clipping through walls) is to cap how much the game will compensate for a long frame: if the measured delta is bigger than a certain threshold, clamp it to that threshold. You could cap it at the equivalent of 5 FPS (a delta of about 0.2 seconds), as that's about the lowest you want to allow, effectively making it so low framerates don't break things and fixing most issues. I say 5 FPS since the clipping issues and extreme boosts usually come from the framerate going below that, or even from the game freezing completely (a lag spike produces a really long frame, which makes the game think it's "fine" to move you super far to compensate).
But again, it requires proper knowledge to apply it efficiently. Which, funnily enough, isn't too different from what you talked about in the video: giving it a proper implementation that takes the system into account correctly. lol
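The clamping fix described above is tiny in code; the numbers here match the comment's suggested 5 FPS floor (any real engine would tune the cap to taste):

```python
MAX_DT = 0.2   # never simulate more than 0.2 s in one step (~5 FPS floor)

def clamped_dt(raw_dt):
    """Cap the frame's delta time: a lag spike or a window-drag freeze
    then slows the game down briefly instead of teleporting objects
    (potentially through walls) in one giant step."""
    return min(raw_dt, MAX_DT)

print(clamped_dt(2.0))    # a 2-second freeze is treated as a 0.2 s step
print(clamped_dt(0.016))  # a normal ~60 FPS frame passes through unchanged
```

The tradeoff is that the game runs in slow motion during sustained lag, but that is almost always preferable to physics blowing up.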
I use delta time as well. Have you ever run into the problem where grabbing the window (like when you move a window around) makes the delta time get all fucked up?
I fixed it once, so I get what's going on
@@SuperShortAndSweet Yeah, that happens since windows normally just stop rendering and stop "thinking" completely while you move them around (to save on resources, I guess). Applying the fix I mentioned helps with that, but of course it isn't perfect.
One method that I saw to fix it from the root is to fake the window itself; basically making the app borderless but rendering the window with the engine, and adding all the functionalities real windows have, making it look exactly like the normal window but it isn't, and you can move it around without causing issues (since engine window movement doesn't really freeze the game when doing it).
@@Chillaxowo i think what's happening is that the Windows grab event is constantly being called, and there's a speed or something tied to the event-handler function. You making it borderless is a smart fix; i think that's why it works. Another thing: i've heard filtering out the event completely also works
The issue with simply relying on deltatime, even with failsafed implementations, is that the physics will be non-deterministic. For example, collision will be checked more frequently on faster hardware, so interactions will still play out differently from machine to machine.
Damn this is a whole programming lesson... I like it, do more technical stuff explanation
I think describing DeltaTime (DT) as the time that has passed since the previous frame would be a little more accurate. A simple implementation of an object moving based on DT would be to move the object (DT * speed) studs each frame, where speed is in studs per second.
i think that would actually cause more non-deterministic behavior since objects getting iterated on later would be using a slightly higher DT
@@Pinsplash DT is from my understanding the time from the start of the previous frame to the start of the current frame, it is same for all calculations in the frame.
so just the length of the previous frame?
@@Pinsplash yeah, i've only ever seen it as the length of the previous frame
@Pinsplash Yes, the "delta" refers to the time elapsed since the previous frame. If your framerate is at a fixed 60fps, then your deltatime value every frame will be 0.01666666.
This is generally useful, because it means that if your framerate is halved then your deltatime is doubled, meaning that objects will be moved twice as far per tick to compensate for the fact that each tick is happening half as frequently.
However, it is not perfect by itself. Extremely high frame rates can lead to very tiny deltatime values which might cause weirdness due to floating point precision issues when fed into a highly sensitive physics engine.
Very low framerates can result in collision detection issues because objects are moving too far in a single frame to correctly detect collisions with surfaces. On one frame they're on one side of the wall, the next frame they're on the other side, no chance to check for a collision in between. This is often exploited in older games like Donkey Kong 64 where they deliberately lower the framerate by spamming projectiles so they can clip through doors.
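To make the scaling described above concrete, here's a toy version in C (`simulate` is a made-up helper, not engine code): moving by speed * dt each frame covers the same total distance whether you run 60 small frames or 30 big ones.

```c
/* Advance a position by speed * dt for a number of frames.
   Halving the framerate doubles dt, so total distance stays the same. */
double simulate(double speed, double dt, int frames)
{
    double pos = 0.0;
    for (int i = 0; i < frames; i++)
        pos += speed * dt;   /* delta-time-scaled movement */
    return pos;
}
```

Running one simulated second at 60 fps (dt = 1/60) and at 30 fps (dt = 1/30) both end up at ~100 units for speed = 100, which is exactly the compensation described in the comment above.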
Wow! I'm amazed about the explanation on how the tick rate works in the Source engine now. This is such a helpful insight.
this guy, never fails to amaze me
"I like speedrunning"
*>Has a creepy yet incredibly soothing voice and makes videos that explains tiny game mechanics in a granular style; whilst also ensuring the video has an eerie vibe*
Nooooooo you don't say.
I have to give props to your writing style! Very clear to understand and to the point :)
I can finally show this to people so they understand why a tick is the way it is!
I'm so happy you made this
I feel like using a tick rate probably worked its way up from Quake to GoldSource to Source
fun fact: DooM (1993) and DooM 2 use tics for their logic, and they are 1/35th of a second, as the framerate cap _was_ 35
I like how most bugs are narrowed down to the programmer forgetting something..
Time to resubscribe.
I pretty much know all this stuff, but I enjoy this kind of content.
Delta timing is actually a perfect system if devs don't fuck up the math (which they always do)
This was a well made explanation of game ticks in general, not just for source engines. Well done!
The tick rate on the original HL1 engine was interesting, and caused some interesting issues with some popular mods of the time (CS springs to mind), similar to the airboat.
Unreal Tournament also kinda used a tickrate, but it was more tied to the FPS, though not limited by it (I'm not exactly sure). But striving for a higher tick/FPS in either HL1 or Unreal Tournament when running a server actually made the server run better, for obvious reasons :D
there was a small app I had once called mmtimer.exe that I used to run to 'unlock' the "multimedia timer" on windows to allow the fps to max out ;)
I was expecting a classic "pls fix" under the valve logo at the end.
I love gamedev topics. Good job!
4:22 - I love that no more needs to be said here xD
insanely good video as always
I played with tickspeed in Minecraft and got some amusing results. Setting the randomtickspeed to a high value will cause anything that relies on random ticks to happen _very_ quickly. I watched an automatic sugarcane farm begin to push out thousands of sugarcane every minute.
Pretty sure this isn't talking about that, I play minecraft after all :P
randomTickSpeed is simply the amount of block updates (plants, liquids, etc.) per ACTUAL tick
The minecraft equivalent is the carpet mod
I've written a few engines (well, it's more like one that was heavily reworked), the first using deltaT and the second using a tickrate-style system. I haven't bothered to introduce deltaT into the game logic since the plan is to never modify the number of times the game updates in a second, which in turn makes so many things _vastly_ simpler to program; collisions are a prime example, along with anything else where differences in rounding can have large impacts on determinism. This system also has the added benefit of being bananas crazy more performant. If you only ever update game logic 60 times a second, that game-logic overhead doesn't change no matter how fast your computer is running the game. It effectively means that under most circumstances the game becomes very GPU-bound, and it can kind of just crank out interpolated frames like crazy. It's been very satisfying getting a still somewhat unoptimized engine from 100s of fps to well over 1000 fps.
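The fixed-update-plus-interpolated-rendering approach described above is commonly written as an accumulator loop. This is a generic sketch of that well-known pattern in C, not this commenter's actual engine code; all names are made up:

```c
#define TICK_DT (1.0 / 60.0)   /* game logic always steps at 60 Hz */

/* Process one rendered frame's worth of real time: run as many fixed
   ticks as fit, and report a 0..1 fraction of the next tick that the
   renderer can use to interpolate between the last two game states. */
int advance(double *accumulator, double frame_time, double *alpha)
{
    int ticks = 0;
    *accumulator += frame_time;
    while (*accumulator >= TICK_DT) {
        *accumulator -= TICK_DT;
        ticks++;                  /* a real engine steps game logic here */
    }
    *alpha = *accumulator / TICK_DT;  /* leftover fraction of a tick */
    return ticks;
}
```

The renderer can then draw as fast as it likes, blending old and new game states by `alpha`, which is how a 60 Hz simulation can still produce 1000+ fps.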
This is actually pretty cool, thanks for explaining it!
We use delta time quite a lot in Unity, although the technical details are a lot more complicated because the engine actually runs many loops to make devs' lives a lot easier. But for deterministic code, we use fixed delta time (or fixed updates): a loop that runs a fixed, predictable number of times per second (50 calls per second by default). So we use these different loops at the same time, depending on what we are trying to achieve. For example, physics should run at a consistent speed, so we use fixed dt. But we want rendering to be as fast as possible so the game looks smooth, so the engine handles that on a delta-time loop, since changes in FPS likely won't have serious adverse effects there. Same with user input; we want that to feel buttery smooth, so we check for it every frame (but don't perform any physics-based changes until the next fixed dt loop)
Usually when a Unity game dev is talking about ticks they are referring specifically to an artificial loop used in networked games to handle client side prediction and server reconciliation. That is, a way for the server and client to communicate on when certain game actions were performed so they can stay synchronized and reconcile for any desync when needed. So I find it very interesting to hear how other games/engines handle similar issues.
Super interesting video, I've always wondered what the purpose of a tick system is after I saw your previous videos on it :)
Doom has a tick system, so I assume that's yet another thing that carried over from Quake to Half-Life
Doom 1 & 2 ran at 35 gametics (which means a tickrate of 35, I guess; not a programmer in any way), and AFAIK this idea was carried over to later iterations of id's engines.
Given that Valve's GoldSource engine is a heavily modified Quake engine (not sure which id Tech specifically), methinks they used ticks because they were already there?
Btw, the whole 35-gametics thing is actually important for DOOM speedrunning. Just saying, since you're interested in both. ;)
Also, savage Bethesda burn 😂
I really liked this video! It was well edited and very entertaining! good job!
I love these technical videos, with a touch of humor.
3:57 THATS WHY THAT HAPPENS??? holy shit that was so annoying wondering wtf I was doing wrong and eventually just had to skip that section
"somebody had surely thought of this before and i just don't know them"
i mean to be honest if you knew john carmack personally you'd be like, a legend by association
wow! this is a super crazy, great video that is changing the way I view reality itself! I love it a lot!
As always I really enjoyed the video. Can't wait for more content.
Correct me if I am wrong. Delta timing is usually implemented by setting last_time (at the start of the program) to the POSIX function clock(), then setting current_time to clock() every frame. Delta is then calculated as (current_time - last_time) / CLOCKS_PER_SEC, and at the end of the frame last_time is set to current_time.
#include <time.h>

clock_t current_time = 0;
clock_t last_time = clock();
double delta_time = 0;

for (;;) {
    current_time = clock();
    /* CLOCKS_PER_SEC converts clock() units to seconds; it is not always 1000 */
    delta_time = (double)(current_time - last_time) / CLOCKS_PER_SEC;
    frame_begin();
    frame_end();
    last_time = current_time;
}
this is how i'd implement it in my own game engine. (one caveat: clock() measures CPU time rather than wall-clock time, so in practice a monotonic wall clock like clock_gettime(CLOCK_MONOTONIC) is a better fit for frame timing)
for deltatime games I usually keep a rolling average of the last 4 or 8 frames. It smooths spikes and keeps systems more stable. Not perfect, but it helps if the tick or framerate fluctuates
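A sketch of that rolling average in C (the `DtSmoother` name and the window size of 4 are just illustrative):

```c
#define DT_WINDOW 4

/* Ring buffer of recent frame times; the reported delta is their
   average, so one slow frame only nudges the result instead of
   causing a visible hitch. */
typedef struct {
    double samples[DT_WINDOW];
    int index;   /* next slot to overwrite */
    int count;   /* how many samples are valid so far */
} DtSmoother;

double smooth_dt(DtSmoother *s, double raw_dt)
{
    s->samples[s->index] = raw_dt;
    s->index = (s->index + 1) % DT_WINDOW;
    if (s->count < DT_WINDOW)
        s->count++;

    double sum = 0.0;
    for (int i = 0; i < s->count; i++)
        sum += s->samples[i];
    return sum / s->count;
}
```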
I love that you used KQ3 in this video. I put a lot of time into KQ3 on my 286 in the 80s lol
As far as I'm aware, the use of ticks is something Source inherited from Quake. Id Software used ticks in Doom and Quake, so Valve used them in Goldsrc and then Source.
Well, this was very interesting. I'm studying programming and I love to watch content like this.
We do experience the world in intervals too - think about the maximum framerate an eye can perceive!
Yeah, I always thought tickrate was just framerate but for the physics simulation instead of graphics, when the two are decoupled. Also, time is probably quantized in real life too
4:20 "Skyrim was developed by Bethesda" *doesn't explain any further*
In the Godot Engine, alongside the process function (used to update every frame), there is a separate physics process. It runs at a specific framerate the developer can choose and will always stay there, no matter what fps the player has. That makes delta time in physics-process code more stable, so bugs like the one in GTA shouldn't happen, afaik.
awesome video, this felt like that final piece of the puzzle that helped me, a complete noob, understand tickrates. the audio sounds lovely, your voice is soothing, and im looking forward to more technical videos in the future!
just one small thing, the narration pace feels a touch too fast, relax the speed a little, and make the stops a tad longer, it will make the information more digestible, especially for non-native speakers :)
Funny thing: while GoldenEye is largely tickless, it actually attempts framerate compensation with delta timing, but doesn't get it quite right. The character controller's acceleration logic is quite complex and doesn't quite align across the range of achievable framerates, which causes the character to run slightly faster at increased framerates, but not to the same extent that the framerate is increased. You get on the order of a few percent speed difference even though the framerate difference is much, much larger, which is why this trait went unnoticed for years.
Physics is normally extremely tickrate-dependent even with delta timing, due to step-size sensitivity; but third-party middleware is just fixed at whatever tickrate you set during initialisation and can be fed asynchronously, as it has its own interpolation logic for the interface to the game code.
Do you know more precisely what happens in this "quite complex acceleration logic", or know somewhere one can read about it?
I've wondered about this stuff for a while now; I've searched a bit on Google in the past but haven't found anything (might just be a search n00b though XD).
i don't even know where the channel is going anymore and i don't think that that's a bad thing
The thumbnail to this video is just epic.
A lot of game engines implement both delta time and physics tick frames. (sometimes with a delta to double check the work of the physics process loop)
Doom source ports made tickrates first
"did you know that i like speedruns" came out of nowhere
Reminds me of the guide that used the source code leaks to show in painstaking detail why bunnyhopping and airstrafing work
I enjoy the way you explain things
2:49
That was so beautiful.
Doom and Quake also use ticks - Source probably originally used them because it was licensed off of Quake's code
The only thing that we care about is how far we can push the tick rate and see if crazy crap happens!
Older PCs had "turbo" buttons that would limit the processor to 4.77 MHz, the same speed as the original 8086-era PC. Not only games but software in general suffered issues from things not running at that speed.
3Kliksphillip did a video recently on getting over 1000fps in CSGO. With more than 1000fps, the game would stutter and teleport you around if you were on a server despite the tickrate. God knows why that happens.
> Skyrim was developed by Bethesda
Honestly that was a perfect explanation
Usually when I think of interpolation, I think of how Quake 1 models basically have the option to do it instead of the frame-by-frame model animation the game was developed with, so a model animates and moves between frames, which makes characters kind of wobbly to look at.
The original Doom games (Except for some console ports) are a good example of an earlier game using game ticks :P
One way I've seen games calculate the DT is using the time between the last game frame and when the current game frame started. (this is basically only doable if you have something else running on the same thread as the game sim, like the renderer)
On the topic of issues with DT, I'd say they can be narrowed down to a few categories: assumptions, timestep instability, and precision issues. (These actually affect ticks too, but I'll get to that later.)
Assumptions are where the developer assumes something about the delta time (e.g., it'll always be roughly X, or over Y or under Z). A good example is applying velocity to something every update without actually taking the DT into account.
Timestep instability is when the DT varies wildly. Some things are sensitive to changes in the timestep and can just break if you change it constantly, or by too much or too little. (Or even if the timestep becomes too small!) An example of this is physics engines, they don't like it when you constantly make changes to the length of the timestep.
Precision issues can be split into two subcategories: Unit precision, where your calculations break down because your timestep is too big or too small for the unit you're using (e.g., if you're using floats and have minuscule timesteps), and time precision, where the timestep is too large and things start to break down.
Physics is also an example of time precision issues: motion that should be there can be "dampened" and disappear. Due to how physics engines work, this dampening can also cause stacked objects to start "bouncing", as gravity causes one object to go partially inside another, then those objects push each other, just slightly at first, then harder and harder as the physics fail to keep the objects both stacked and not intersecting. A typical way to deal with this issue in the case of physics is "substepping", where the timestep is split into X smaller steps instead of being used as a single span of time. (Bouncing, and that phenomenon where a coin or bowl spins around on its rim after landing, are good examples of motion that gets dampened by long timesteps.)
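The substepping idea described above can be sketched like this in C, using semi-implicit Euler on simple gravity (all names and numbers here are illustrative, not from any real engine):

```c
/* Integrate gravity over dt using `substeps` smaller internal steps.
   More substeps give a better approximation of the true trajectory,
   which is exactly why one big timestep damps or breaks motion. */
void step_physics(double *pos, double *vel, double dt, int substeps)
{
    double h = dt / substeps;            /* the smaller internal step */
    for (int i = 0; i < substeps; i++) {
        *vel += -9.8 * h;                /* gravity, semi-implicit Euler */
        *pos += *vel * h;
    }
}
```

Integrating one second of free fall in a single step gives a drop of 9.8 m; four substeps give 6.125 m, much closer to the exact 4.9 m.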
Another example of this is when a gun fires every X frames, but the framerate falls under that amount of frames. There's ways to deal with this, like firing multiple bullets per frame to account for the too-low framerate, but it's never perfect.
In the case of ticking, these can all manifest too, but of course, it only happens when you change the tickrate.
Timestep instability is pretty rare and usually only happens if you're constantly changing the tickrate for whatever reason.
Assumptions, as you've noticed yourself, can still happen, such as the developer not accounting for the fact that the tickrate can be changed.
Precision issues happen when the tickrate is set too high or too low; typically, unit precision issues pop up with too-high tickrates and time precision issues pop up with too-low tickrates. (These tend to behave exactly the same as with DT)
I might be wrong but every single IDTech game runs on a server, be it local or remote for single and multiplayer respectively.
@@hermitgreenn That's only the case with Quake and later games. Doom didn't have client/server netcode, only peer-to-peer, so it wasn't able to do that either.
a Tick is an annoying bug that ticks you when you walk through the woods
You really said reality doesn't work in intervals, one thing at a time very fast, while you think one word at a time before you speak one word at a time. The brain is truly oblivious to its own design.
0:35 Are you sure? It seems we are in a simulation here too!
The higher, the buggier; the lower, the worse.
7:04 Look over there!
man, I should be studying and writing my sh*t.
3 in the morning:
"oh look, sauce engine shenanigans"
4:22 underrated joke of the century
I guess I am the hypothetical viewer because that's exactly the question I asked myself even before the video
Have you thought of researching how save/load system works? I'm personally curious how Source manages to preserve the entire state of the game
Who said they save the entire state of the game?
It takes a snapshot of whatever the server state was during that tick.
(afaik, i might be wrong) Source saves specific variables defined in the game code named "DATADESC" for the server. When loading the save, Source restores the datadesc, and calls every entity's (Restore()) function.
If i recall right, Quake 1 is where the tick system for Source/GoldSource started.
Crap like that even happened in Grim Fandango, which made me mad.
good video, basically the same explanation that everyone else gave as well. i thought you'd go into detail on how it's processed, using which functions, INetworkable, etc.
they did it better in csgo
I may not understand ticks but I understand tucks. Let’s just say I had an unusual one night stand experience
dt is typically some kind of rolling average of the last n frames, since using the previous frame's value causes some very obvious stuttering when a single frame takes longer than expected: the "jump" is delayed until the next frame, and it looks very silly. using a rolling average is much smoother
This would not be enough to get deterministic behavior and creates issues with larger variations in FPS.
I emulated the PS2 SpongeBob game and found out quickly just how much game logic was tied to FPS. Who would have thought 1440p & 360 FPS would nuke PS2-era software?
Well, you could argue that not even real life is truly continuous. At least that's what Planck and Einstein tell us.
There are vastly more ways to write code that gives different results at different tick rates than ways to write code that is almost "tickrate independent". A lower tickrate is a bigger time step and a worse approximation. If you want similar results at a low framerate you need quite a bit of care and complication, like RK4 is to Euler's method.
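The point above can be seen with the simplest possible example: explicit Euler on the decay equation y' = -ky (a hypothetical toy system, not anything from the video). Bigger steps give worse answers, and past h = 2/k the numerical solution doesn't just drift, it blows up:

```c
/* Explicit Euler on y' = -k*y, i.e. y_{n+1} = y_n * (1 - k*h).
   The true solution is y0 * exp(-k*t); a larger step h (a lower
   "tickrate") approximates it worse, and h > 2/k is unstable. */
double euler_decay(double y0, double k, double h, int steps)
{
    double y = y0;
    for (int i = 0; i < steps; i++)
        y += -k * y * h;
    return y;
}
```

With k = 1 over one second: 100 steps of h = 0.01 land close to the exact exp(-1) ≈ 0.368, two steps of h = 0.5 give 0.25, and h = 3 oscillates with growing magnitude.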
Please make more of these types of videos!
8:04 this fix still has issues related to tickrate: if the tickrate were 11, it would still fire 11 times per second, because the code executes every tick. i think that's why valve decided to just leave the fire method without calculating shooting intervals
that's true, though the same applies to all guns i'm pretty sure and 11 is just a terrible tick rate to begin with
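For what it's worth, making a fire rate tickrate-independent usually means accumulating elapsed time instead of counting ticks. A hypothetical sketch in C (the 0.125 s interval, i.e. 8 rounds/sec, is just an example value):

```c
#define FIRE_INTERVAL 0.125   /* seconds between shots (8 rounds/sec) */

/* Accumulate elapsed time and fire once per full interval. At any
   tickrate the long-run rate stays ~8/sec; one long tick can even
   fire twice to catch up instead of silently dropping shots. */
int update_gun(double *cooldown, double dt)
{
    int shots = 0;
    *cooldown += dt;
    while (*cooldown >= FIRE_INTERVAL) {
        *cooldown -= FIRE_INTERVAL;
        shots++;
    }
    return shots;
}
```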
"Skyrim was developed by Bethesda" is one of the jabs at Bethesda ever
I76 Nitro Riders had an odd yet hilarious uncapped tickrate problem. They'd managed to get the game speed consistent, but not the force of gravity on the vehicle. The higher the CPU speed the stronger gravity was. If your computer was too fast you couldn't make some of the jumps making the game impossible to complete.
Easiest way to describe tick rate:
It's frame rate for the in game engine.
Or:
It's frame rate for physics.
no
The frame system you described actually was the standard for most games all the way up until around 2015-2016, with many titles like Halo, Gears of War and so on using this method, because each console's hardware was guaranteed to be exactly one model. It wasn't until afterwards, when we started porting games and remaking them for newer console generations, that this practice stopped, because there was a recognition that the same games needed to run on multiple models of hardware. PC games like Half-Life didn't use this method because the PC/desktop platform is versatile: most PCs have completely different specs, not just custom builds; even store-bought pre-made machines often changed parts around within the same "model" being sold.
Valve, Gearbox, and other developers around this time had an edge when it came to things like this because they recognized a problem that wouldn't come into effect for other developers and publishers until much later. I imagine this made it easier for them to port their games simultaneously on several engines whenever they released a game, like the Orange Box, whereas other developers may have to rewrite their code to make an engine work on a different console- the modularity of a tick system became very convenient for someone like Valve, who was able to release the same game (Half-Life 2) on Xbox, Xbox 360, Playstation 2, Playstation 3, Mac, Linux and Windows all without much hassle.
Meanwhile, a developer like 343, post Bungie split, was tasked with bringing the games that used this framerate system onto modern hardware- and probably found it rather quickly that the game ran poorly and had to be completely rewritten.
The solution to this, by the way, is to use fixedDeltaTime for gameplay physics and deltaTime for visual stuff.