Modern tech is in dire need of simpler solutions, not over-engineered crap. It's Wirth's law at its finest: better hardware just leads to worse software.
And software quality will keep deteriorating as AI solutions become ever more prevalent. Take a look at how every game developer uses DLSS, XeSS, FSR, and so on as a crutch for bad optimization and terrible image clarity and anti-aliasing.
@@barrupa Good opportunity to tell you about Threat Interactive, a YouTube channel about graphics optimization. They made a video about how TAA is used for the worse in modern games; it's the first video on their channel.
My sentiments exactly. What they need to do is return to old, time-tested, tried-and-true game design and optimization techniques. Old techniques combined with the horsepower of new technology and hardware are a win-win combination. And I'm glad you mentioned Threat Interactive, because I love his videos! He's singing my song! I have gotten into the habit of calling AI upscaling, frame generation, etc. "software duct tape," because that's how devs are using them, hoping they can slap them over their lazy, shoddy work and call it a day.
@@guyg.8529 I had already seen some of his videos. I am glad I am not the only one who hates the overuse of TAA as a crutch for low-resolution per-pixel shaders and their artifacts. It's no wonder old forward-rendered games look far clearer and sharper. Everything nowadays looks like I forgot to wear my glasses.
@@guyg.8529 Hardware manufacturers are in on this. If engines were optimized, you would not need an i9 and a 4080 to run decent settings. Take a look at some indie games boasting great graphics that you can run at decent settings even on a 1060... meanwhile nshitia and AMDumb are pushing $1000+ hardware down your throats for shitty UE5 stutterfests.
As someone who has worked with the Build Engine for 20 years at this point: another reason why Build is a bit slower is that everything is dynamic; there's no static BSP tree. This can be considered an engine feature, as all parts of a level can change. For example, you can add and remove sectors from a level in realtime if you know what you're doing, something idTech1 has no hope of doing. There are also things like planar sprites (floor/ceiling and wall-aligned sprites, which can be used as platforms, decoration, etc. For Doom, the only ports that can do this are ZDoom-based.) Another thing to note about Build is that every two-sided line is a portal. This allows crazy, overlapping, non-euclidean geometry without the need for teleportation. The only condition is that you can't render two overlapping pieces at the same time, requiring one to chop up overlapping sectors into smaller bits to prevent rendering issues. Duke3D uses teleportation in a lot of instances since it was easier and less intensive. The level "Tier Drops" is an example of this feature taken to the extreme: all of the different biomes in that level occupy the same physical space in the map.
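The sector-and-portal traversal the comment above describes can be sketched in a few lines of C. This is a simplified illustration, not Build's actual code; the struct and function names here are hypothetical, and a visited flag stands in for the screen-window clipping that real Build uses to terminate recursion (and which is what lets overlapping, non-euclidean sectors coexist):

```c
#include <assert.h>

#define MAX_PORTALS 8

/* A sector links to its neighbors through its two-sided walls (portals). */
typedef struct {
    int portal_count;
    int neighbors[MAX_PORTALS]; /* sector index on the other side of each portal */
} Sector;

/* Depth-first visit of all sectors reachable through portals from the
   camera's sector, recording a draw order. Real Build clips each portal
   against the current screen window instead of marking sectors visited;
   the flag here just guarantees termination in the sketch. */
static void visit_sector(const Sector *sectors, int idx,
                         int *visited, int *order, int *count)
{
    if (visited[idx]) return;
    visited[idx] = 1;
    order[(*count)++] = idx;
    for (int i = 0; i < sectors[idx].portal_count; i++)
        visit_sector(sectors, sectors[idx].neighbors[i], visited, order, count);
}
```

Because traversal starts at the camera's sector and flows only through connected portals, geometry that overlaps in map space but isn't portal-connected is simply never reached, which is how "Tier Drops" stacks biomes in the same coordinates.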
That explains it. I remember watching a (I think it was) GamesDoneQuick TAS of Duke3D that was able to break the sectors in a hilarious way, and what you explained has now made me realize what was actually happening in that TAS. I try not to pit Doom against Build, since the Doom engine definitely became more demanding after Raven added those extra features for Hexen, and it still used static BSP maps... I think. I just like how similar and yet different these two engines are from each other, BSP vs portals, etc. Even though I prefer speed over feature set, during my research I realized that Build isn't as demanding as I initially thought. A few weeks ago, I finally got my 486 running properly, and when I ran Duke3D I was surprised how much faster it ran than expected. I say this because when LGR first built his ultimate woodgrain 486 DX2 PC and tried to run Duke3D, he got terribly low frames. My 33 MHz DX ran slightly better than that. I think there's something wrong with his woodgrain PC, since it couldn't even run Quake after he installed a Pentium Overdrive. Go figure. I had to rewrite that section of my script after testing it on real hardware. I really appreciate the comment.
@@GTXDash Yeah, Build's speed is very dependent on the video adapter used rather than the CPU itself. It really prefers hardware capable of VESA LFB (Linear Framebuffer) mode, which is part of VBE 2.0 (VESA BIOS Extensions). With hardware that doesn't have it natively, SciTech Display Doctor (aka. UniVBE) can help a lot with improving framerates, but it won't be as good as the real thing obviously. Using a PCI or AGP video card rather than ISA will also go a long way to improving performance, if the system has the slots available. And yes, Hexen still uses the BSP tree. The swinging doors and whatnot are a trick called "PolyObjects". They are extremely limited in how they can move. I'm also pretty sure they need to be convex in shape or they'll have sorting issues.
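The linear-framebuffer point above comes down to bank switching. Without VBE 2.0's LFB, video memory is only visible through a 64 KB window, and every time a write crosses a 64 KB boundary the driver has to issue a bank-switch call (a slow BIOS or port operation). A toy sketch of that cost in C, just counting the switches; the function names are hypothetical and this is not real VBE API code:

```c
#include <assert.h>
#include <stdint.h>

static int current_bank = -1;   /* currently mapped 64 KB bank */
static int bank_switches = 0;   /* how many (expensive) switches we did */

/* Write one pixel through a simulated 64 KB banked window. With a linear
   framebuffer (VBE 2.0 LFB), the branch and counter disappear entirely:
   you just write to vram[offset]. */
static void banked_write(uint8_t *vram, long offset, uint8_t value)
{
    int bank = (int)(offset >> 16);  /* which 64 KB bank this offset lives in */
    if (bank != current_bank) {      /* real code: an INT 10h / port write here */
        current_bank = bank;
        bank_switches++;
    }
    vram[offset] = value; /* real code writes to window_base + (offset & 0xFFFF) */
}
```

A full-screen blit at 640x480x8 spans several banks per frame, so software like UniVBE that exposes an LFB (or at least a fast protected-mode bank switch) pays off exactly as described above.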
@StrikerTheHedgefox yeah, I remember opening a hexen map in Doom Builder and noticing a bunch of duplicate swinging doors just outside of bounds. It took me a minute or two to realize what I was looking at.
@@StrikerTheHedgefox I had a feeling you were going to mention Tier Drops 😅. Build was fun to work with! Moving from it to Unreal Engine blew my childhood mind.
Pretty sure the root issue is that a good, optimized game engine (be it in-house or 3rd party) needs a few very passionate and involved developers who are so comfortable with the engine, and have enough charisma and influence, that they can identify and prevent performance issues and suggest alternatives already in the early stages of development. But these kinds of people are difficult in a big corporation setting, because if you have 1000 employees and your entire game can collapse if those 3 core people leave the company, you are fucked. So all big companies either a) are disincentivized to cultivate such tech, b) don't even try, or c) tried but lost it along the way. id Software was always technology driven and they always had enough success to stay alive. So it's a rare gem.
And it only gets worse. Given the direction the industry is heading, traditional optimization techniques will soon be replaced by relying on AI upscalers as a crutch for achieving decent performance, at the cost of image quality, blurriness, and artifacts. It's no surprise that older games with traditional forward rendering, or per-vertex shaders instead of per-pixel, often appear much sharper and clearer. This is especially true when traditional MSAA or other forms of multisampling are used, compared to modern games that employ a multitude of deferred techniques and low-resolution per-pixel effects, which require temporal accumulation and spatial upscaling to avoid looking like a dithered mess. To be honest, the technology available when Crysis was released (2007) is more than sufficient to create good-looking games, especially if subsurface scattering and distance-based diffuse shadow maps (like those used in GTA V) are added. Everything developed since then seems to exist either to sell more hardware or to reduce the burden on developers and studios, such as AI upscalers or ray tracing. I think the future looks grim, especially considering how mainstream GPUs have become power-hungry monsters ever since NVIDIA realized it could push enterprise-level hardware to consumers at insanely high clock speeds in the hopes of making real-time ray tracing viable for game development. This shift places the burden on consumers, who are now expected to shell out $300 or more for a 250W "space heater" that can barely run a game at 1080p60 with upscaling.
Man, Quake 3 and id Tech 3 truly defined a generation. So many great games from my childhood ran on that engine: Call of Duty, MOHAA, American McGee's Alice, Star Trek: Elite Force, Return to Castle Wolfenstein.
I've been in game development since the PlayStation 2 days, and one thing I can say contributes to this is the move to those algorithmic optimization solutions over manual, artist-driven methods. They are faster but sloppier in many ways. Nothing beats creating LODs by hand, when the artist has way more discernment on how to go about turning a 200k mesh into a 50k mesh without losing the base silhouette of the model and keeping UVs clean. I've met many younger 3D modelers that can't optimize as well because they're leaning on this auto-decimation tech too much. It works in a pinch, but you don't want to depend on it too much. DLSS has made us lazy with optimization as well. There's also the move to 100% realistic lighting over the Gen 7 (PS3/Xbox 360) hybrid lighting, where you had some realtime-lit environments blended with pre-baked lighting. That alone is a major resource hog. When Unreal first introduced Nanite, I knew it would lead to games running way slower. lol
Oh, finally! Someone that actually works in the industry. I have a question. So, before modern methods, did LOD work kind of like mipmaps, where you have multiple models of the same object or 3D mesh with varying degrees of complexity? I know how id Tech 3 does this, but I'm not entirely sure what the industry standard is.
@@GTXDash Yeah, it's similar. It's distance-based from the player camera, and you can create as many LODs as you need (LOD0 is the default model, followed by LOD1, LOD2, LOD3, etc.). For skeletal meshes, it's multiple meshes skinned to the same skeleton, but the optimization can also happen with bones and skin influence count (how many bones can deform each vert). So you can cull fingers on a human mesh at LOD2, for example, as you wouldn't see the finger deformation/animation from that far away. Even on AAA games, I can always tell when the devs used solutions like SimplyGon or InstaLOD. The algo method doesn't look as clean, especially on VR games where the polymesh is more noticeable. InstaLOD still IS a very great solution and saves you so much time in production, so I'm not knocking it entirely, but when you have the time to do optimization right across all disciplines in the pipeline (not just art but also the engineering side), you can do magic with performance. A good example is some of these "impossible ports" on the Switch, like Wolfenstein 2. As far as industry standard? I'd say that is largely relative to your budget, team size, and what your deadlines happen to be. Thanks for the interesting video! Cheers
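The distance-based selection described above is mechanically tiny; the hard part is the artist's judgment in authoring each mesh. A minimal C sketch, with hypothetical names, where `thresholds[i]` is the camera distance at which the next coarser LOD kicks in (real engines also add hysteresis so models don't pop back and forth right at a boundary):

```c
#include <assert.h>

/* Pick an LOD index from camera distance using per-model thresholds.
   thresholds must be ascending; lod_count includes LOD0. */
static int select_lod(float distance, const float *thresholds, int lod_count)
{
    int lod = 0; /* LOD0 = full-detail, artist-authored base mesh */
    while (lod < lod_count - 1 && distance >= thresholds[lod])
        lod++; /* far enough away: step to the next coarser mesh */
    return lod;
}
```

The bone/skin-influence culling mentioned above works the same way: the chosen index also selects a skeleton subset, so e.g. finger bones simply aren't evaluated at LOD2 and beyond.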
what you've mentioned here is the number one reason i love doom eternal's id tech 7. the lighting uses very efficient shadow maps, but mainly relies on baked lighting to get the job done, as well as cubemaps for all of the blurrier reflections and viewmodel lighting. it's blazing fast, and it's done with artistic intent everywhere you look. i genuinely feel that this produces better visuals than something like immortals of aveum's unreal engine 5, which pushes for entirely on-the-fly realtime lighting. it looks fine, but runs terribly. in my experience, doom eternal runs about 4x the speed of immortals of aveum on the same GPU...
@@railroadism I played through Sekiro again recently, and that game uses 90% pre-baked lighting, cubemaps, all the older methods, and it not only runs buttery smooth at over 100fps in many cases, but it's visually a more beautiful game, with a more painterly look to the render, than something like Immortals of Aveum. The juice sometimes isn't worth the squeeze with the newer lighting.
Absolutely. Even to this day I am still impressed when I see artist-driven LOD done right. And by see, I mean done in such a way that it cannot be seen: you know there is LOD going on, but you cannot see the transitions. I fall into the 'artist driven' ideal far more than the advanced-tech one. Both have their place, but it feels like we now rely on tech to solve problems rather than figuring out more elegant solutions. We are brute-forcing so much nowadays. Which is why when you see something that is artist driven (Doom is a great example), it is a reminder that we can still do great things while using great tech at the same time. And I agree, things like Nanite, reconstructive detail shaders, temporal upscaling, etc. have led to some very sloppy work over the last few years. Why optimize things as much as we used to when you can just throw some upscaler at it, complete with detail fizzle? The amount of games now that have laughable base render resolutions is astounding. Some are approaching PS2-level render resolutions, and that is just wild to see. That said, there are things like that being done on titles like Doom (2016) on Switch that you can only catch in rare moments. It was using that tech to near perfection where it made sense. That I can get on board with.
A lot of old games looked amazing because they would "bake" the lighting effects into the map. Basically, it does a bunch of calculations at map compile time to see how the lighting from the lava should look on the walls, and it colors the walls like that. It would take like 20 hours to compile a map for Counter-Strike or Team Fortress Classic because of all these lighting calculations. The only downside is that baked lighting can't cast dynamic shadows from moving objects. Most gamers really don't care about those shadows. I can enjoy the environment with the shadows cast by the environment (baked) even if it doesn't have shadows cast by the player (dynamic). Modern games seem to skip all of that and just force the GPU to calculate everything on the fly. You get a game that looks the same but the GPU demands are 10x higher.
Rage is greatly underappreciated. The way they utilized large textures broke up the static, flat look of most engines at the time and made it look... real. The game itself was critically panned, but it offered something unlike what other games were bringing.
@@crestofhonor2349 That's true, but it was really the defining technology Rage brought. It helped sell the immersion really well, but, perhaps in a way relevant to this video, that feature alone wasn't contributing to gameplay mechanics that would let the game stand out. You spent a lot of time interacting with other characters in hubs, and folks wanted a Doom 2016. I think Rage was trying to be a blend of Fallout and Borderlands. No one was accusing id of having too much story and dialogue from Wolfenstein through Quake, and those additional story elements were partly to blame for the public reception. I liked it, but I know that's not always a shared view.
The fact that they could do megatextures on consoles with only half a gigabyte of total memory and on a disc with only 9GB of space is also insane. Nowadays every game insists on having every surface be its own massive bespoke texture generated by Substance Painter's procedural generation with no tiling, and it's the main reason games have ballooned to over 100GB apiece. Whatever Id was doing in 2011 is still way beyond the rest of the industry.
@@stevethepocket Megatextures had a ton of flaws, which is why they were dropped in later iterations of id Tech. Also, they would not help modern games at all. How small do you expect modern games to be?
@@crestofhonor2349 Small enough that I can fit my whole collection onto one drive would be nice, especially given the risk that they'll be gone forever if they get delisted and I don't already have them downloaded.
I half remember a quote from Marcus Beer on Epic War Axe when Rage came out: "Id have some of the best technical artists and programmers in the industry; but they desperately need to hire some fucking good game designers." Which is funny, because id has a long history of mistreating its designers and id Tech clients like Splash Damage, particularly when John was CEO. I think the best games get a balance right between artists, designers, and programmers. It's just a shame the industry splits these disciplines up, leading to imbalances in power.
And now they wanna replace these essential people with AI and sensitivity reading narrative designers. While it's hilarious to watch, I can still have fun with old games.
@@realhet I don't mind LLMs being used as assistive tools for artists to be more productive, but we ain't there yet due to issues with creating valid, lean, and legal models. It's a bit like needing a million monkeys on a million typewriters to produce the works of Shakespeare at random: statistically possible, but the computation time is almost infinite. As for sensitivity-reading narratives, I'd say it's good for addressing historic misogyny, racism, and propaganda tropes in games; but I find the focus on "relatable empathetic characters" is too much and ends up creating "cinematic character action games". The Dad of Boy reboot I find went too far in changing Kratos from a tragic anti-hero to a redemptive father figure. Had they gone with a new Norse viking protagonist I'd be all for it; but it felt like a large retconning of the IP for "modern audiences". So I get what you mean. Like I said, it's all about getting the balance right, and that's a difficult thing to do in the games industry.
@@OdysseyHome-Gaming I think there are more people to balance: management, sales, marketing, CEOs, and shareholders. When you finish a game, their names scroll for 20 minutes. :D If they are able to replace the artists, developers, writers, and programmers fully or partially with AI, then they will get more profit. They don't give a damn about how good the game is as long as it makes exponentially growing income each year. Those people are essential to execute a business plan; artists, developers, programmers, writers, and actors are needed to invent stuff that is OUTSIDE the boundaries of common knowledge and imagination. Rage 2: another game that utilized the color 0xFF00FF was Far Cry New Dawn. But I should not have an opinion because I played neither. The start of Rage 1 when you step out of the crashed spaceship was amazing, tho'. The end of it indeed was just a grind.
While idTech is great for smaller maps, it's not feasible for large seamless open worlds. Most of their releases have been linear, smaller maps connected by loading screens.
It's all about matching a game with the right engine. With that said, I thought Rage pulled it off pretty well... as long as you use an NVMe drive to better hide texture pop-in.
@@GTXDash Rage does have large maps, but again, broken up by loading screens. It would suit games with linear progression and large levels, such as The Evil Within series.
So far, they have not interfered with id Tech. In fact, I wish they would adopt it more into the lineup of games than using... whatever hell is powering Starfield.
@@GTXDash It's true that they haven't interfered yet, but they're certainly not licensing it out to other companies like they frequently used to. I wish their other subsidiaries would use it too, LOL
@@skew5386 It'd be interesting to see the Elder Scrolls team collab with iD to replace Creation Engine with something based on iD's work but with the tooling Bethesda needs.
@@skew5386 They can't license something that has middleware from others; it's not as easy as you think. Why do you think idTech4 was the last engine id released the source code for? idTech4 had no code from others; it was all id's stuff. And yes, I know about the Creative BS, but after Carmack rewrote the code for the shadows, it was all good to be released.
idTech is so awesome.. I ran Doom Eternal at Ultra Nightmare settings, 1440p (DLSS Quality), with Ray Tracing at 100+ FPS at all times... had the same experience with the Wolfenstein Games, nothing else performs like idTech games...
Good video breakdown of this important game engine history. I say it's a shame that id Tech is no longer headed up by Carmack, no longer open-sourcing its old engines, and no longer competing with Unreal as an engine that even independents can use. Your game looks great, I will be buying on GOG once it's out. Keep up the good work!
This. Of course, there are amazing graphics programmers that work on stuff that runs badly, but they squeezed as much as they could to even get the thing to run.
the games that stick with me are the ones with in house engines. rent an engine leads to a bunch of games that look and feel the same with different skins. so you end up with a soulless conveyor belt of product. this is ok for a business but not in a games as an artform. that said ive been writing my own engine on and off for almost 20 years now and its still very rudimentary. its 95% lua and runs on a dead stock interpreter with a few custom modules to handle apis, low level stuff or things that need to be fast. i want as little 3rd party code as possible (though i am using a few 3rd party modules as place holders).
Great video. I was using Allegro until a few years ago, but ended up switching to Godot purely because it allowed me to prototype faster (and incidentally, it's the only engine with which I've actually gotten games to completion) - but I have to say, I really do miss working with C. The games I make are so low-power that this hasn't ended up mattering too much, but Godot have recently added a very intuitive way to produce a build with all of the features your game doesn't use stripped out. Might be giving this a go.
While I like and agree with most of what was said in this video, I don't think the truck vs. bike analogy was the best. Game engines can be big, heavy, and slow, but that doesn't mean the output game has to be. Compilation settings, when done correctly, should be able to remove many of the libraries and components from the game binary if they aren't required by the game. Now, the amount that can be removed will still depend on the engine and the feature set in use, but I think it's important to illustrate that you CAN get good results on modern engines.
This has been a very long on going argument from both sides and it really is context specific. Do you boost up a simpler but lean engine or strip down a complex one for performance? Depending on the scenario - both options are viable.
There is a good talk about how the rendering tech in a Unreal Engine 4.27 (pre-Nanite) game was stripped down to the necessities and modified with some extra features for what the game needed. The game was not released, but the talk is very detailed and useful. The video title is: A Taste of Chocolate: Adding a Rendering Fast Path without Breaking Unreal Engine
Those games look like games of their era. Battlefield 1 and Star Wars Battlefront look far better and use far more realtime rendering than those games, as well as other new features like PBR and wide use of photogrammetry for their materials.
Some developers heavily rely on TAA to smooth out some of the effects used. So they don’t add the ability to turn off TAA as it will show flaws (pixelated or grainy looking shadows and effects)
@@SinisterOyster I actually don't mind TAA. I'm just not a fan of how many devs use it as a crutch for fixing issues that shouldn't have been there in the first place.
I think the biggest issue with modern engines is that they are basically tailor-made for bigger corporations that want to make impressive-looking games fast. The whole existence of TAA, upscaling tech, and Nanite is aimed at making games that look good in still frames, but not in fast-paced games or with a frame-time graph enabled. It's really sad to see companies abandoning their own engines (especially Frostbite and RED Engine) to move to Unreal.
TAA is actually pretty old. Using data from previous frames to anti-alias new ones dates back to at least the 7th generation. I believe Crysis 2 had a form of TAA that was pretty bad by today's standards. Nvidia has had another form called TXAA that is in some early 8th-gen PC titles too, and SMAA T2X was a temporal variant as well. The point of it wasn't to make games look good in still frames; it was a way to smooth over certain things that MSAA and sometimes SMAA missed. In its early days it was just another AA option like MSAA, FXAA, SSAA, and SMAA.
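The accumulation step shared by all the temporal variants named above is just an exponential moving average of the reprojected history color and the current jittered sample. A minimal C sketch of that one step (hypothetical function name; real implementations also reproject with motion vectors and clamp the history to the current pixel's neighborhood, which is exactly the machinery that, done badly, produces the ghosting and blur this thread complains about):

```c
#include <assert.h>
#include <math.h>

/* One channel of one pixel per frame: blend the reprojected history color
   toward the current frame's sample. alpha is the blend weight for the new
   sample (commonly around 0.05-0.2), so edges converge over several frames. */
static float taa_blend(float history, float current, float alpha)
{
    return history + alpha * (current - history); /* lerp(history, current, alpha) */
}
```

Run per frame, the pixel converges geometrically toward the true (supersampled-over-time) value, which is why TAA smooths things MSAA misses but also why fast motion can leave trails when the history isn't rejected.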
I think we've all felt the frustration of how frequently big-budget games perform poorly, and this was a truly refreshing video to watch, as well as eye-opening to just how great a job id Tech has been doing and how influential it has been.
It does. That's why we still get games that look like older games, sometimes even made in the old tech like Hell Denizen made in a modified version of Quake1 engine.
Holy shit, didn't expect to see Tilengine here! Such an awesome little framework :D I used it as the base for my last game project, and it was surprisingly easy to bind to a different language (I used Odin instead of C), so super happy to see it get some love :)) (Falconet is looking amazing too)
I'm hoping to find something similar for 3D graphics, with enough control for me to create the underlying systems how I want, but not getting stuck on the more complicated side of vertex buffers, pipelines and whatnot. Raylib seems a bit too simple, maybe sokol-gfx could have what I want? Meanwhile I'll still be using Godot for 3D stuff...
I just discovered your channel and I need to say you are a true gem! I am an aspiring game developer and want to learn so much about all the areas of game development, and this video definitely opened my eyes to the world of engines and how interesting and important they are! Thank you very much, and hopefully your game becomes successful in the near future!
If there's just one thing that can be learned from games like Doom and Quake, it's that (aside from making a good game) front-loading the technology aspect to make creation easier, and allowing the players to make stuff, leads to insane longevity. Doom and Quake will be played for a long, long time, and people will be making stuff for those games for a long, long time. If you only had Doom and Quake and access to mods, you could easily play those games for the rest of your life and never run out of new "content" (I hate that term, but it's fitting here).
People are still playing my two full episodes ("Darkhell" & "Blakhell") via the D1 engine. Great design is simple yet simultaneously robust. Poor design stacks layers upon layers upon layers upon itself to mask janky flaws and poor choices.
10x faster hardware -> 10x better visuals is just not how it works. So many techniques used in games do not scale, because they have fundamental issues that can't be fixed with faster hardware and instead require an entirely new approach that is often much more expensive.
@@sulfur2964 An engine can be bloated; that doesn't mean the compiled game is bloated. Engines are still tools, and there's a lot of room to work within most: features can be disabled, code can be streamlined, packages can replace default functionality or be ignored entirely. I have only used 2 engines in depth (GameMaker and Godot, which to be fair are fairly light engines for undemanding games), but I seriously doubt either had much effect on performance in a relevant way. On top of that, they allow much faster design, as well as communities and support.
@ymi_yugy3133 Yes, I am aware that if you have a model of a bowling ball that has a million polygons, increasing it to 10 million won't make it look 10 times better. But keeping it to a million and improving reflections, adding higher-quality shadows, higher res textures, ambient occlusion, global illumination, etc etc... it can look more than just 2x better.
@@GTXDash All these techniques are good examples of why dialing up the settings doesn't work. Shadows, ambient occlusion, reflections: we have more or less fast ways to get some pretty good approximations (shadow maps, HBAO, screen-space reflections), but they all suffer from visual flaws that you can't fix by just turning up dials. The typical solution is ray tracing, and that is just really expensive.
@ymi_yugy3133 Dialing up the detail is not what I'm saying makes good graphics. I really don't think expecting more than a 2x or 3x visual and performance improvement over what an Xbox 360 can do is too insane of an ask.
Great video. I agree there is a huge gap for simple and fast engines. Another engine I would recommend is Love2D with g3D. It can run on a Raspberry Pi Zero, which has only 512MB of RAM. I was shocked when I saw a full FPS game powered by g3D run like butter on the Pi Zero.
This is a very well-written and presented video! I also share the same thoughts you listed. I hope your game gets the same recognition and success both of you are investing in!
The microcontroller in the Joy-Cons of the Switch is more powerful than the Neo Geo. The problematic performance rests squarely on the shoulders of the engine and the game development. With 128 times more RAM and CPU power than 20 years ago (and probably the same with the GPU), we only get slightly nicer-looking games with comparable enjoyment. While I am not a graphics whore, if a game requires twice the hardware of another one for the same performance, I expect a nicer-looking screen in front of my eyes. I am fine with the graphics of Serious Sam SE, but then I expect at least 32 simultaneous VMs on my R9 5950X with 128GB of DDR4-3600 and an RX 7800 XT to run the game just as well as my P4 2.4GHz with a Radeon 9800XT (from 21 years ago) ran it on 2GB of DDR1. As you can probably tell, no new game that looks like that runs like that today. You could argue that RAM speed has only improved about 10 times (comparing those two systems), but not even 8 VMs would actually run something looking like that.
I'm only a quarter of the way in, but I'm liking this so far. One thing that always bothered me is how the 360 generation was usually targetting 30fps. I prefer higher than 30, so I felt: "If performance was better, then they can finally increase framerates!" And then it just... never happened. I always wonder where that power ran off to, especially in the era of the PS5 and XBSX
Funnily enough, there was enough GPU grunt to potentially get the frame rates up, but memory bandwidth hampered them significantly. Multiple render buffers simply dragged down the frame rate, and GPUs could be left just twiddling their thumbs waiting for data to be shifted. RSX alone would probably have been better off with faster RAM and a lower clock rate, as bandwidth was the primary bottleneck. This wasn't such a big issue on the 360, as you had the EDRAM to allocate render targets, so long as you managed the geometry splits well. This is why you would regularly see it performing better. EDIT: A big issue then and even now is that GPU utilization tools aren't terribly granular. You could take a top-of-the-line GPU today, max out the bandwidth, and it will show the GPU at 100% utilization even if the GPU is actually doing next to nothing. That made it very difficult to figure out where performance issues were coming from.
Id really took the Crysis 2 & 3 era to heart and went: "Okay then, FUCK YOU. Superb graphics AND Superb performance!" with the new Doom and then Eternal.
The Doom reboot looks worse than the Crysis series. They failed at art direction; it's utterly inconsistent. And nuDoom has completely static environments: you can't interact with items like you do in the Crysis games. NuDoom also uses baked lighting, unlike Crysis with Light Propagation Volumes, which allows you to change everything and still get good lighting. And what's bitter is that Doom Eternal won't run buttery smooth on cards that will run Crysis 3 fine.
One thing to keep in mind is that just because the end result of a modern game engine looks like a game from the 90s doesn't mean it is doing anything similar to a game from the 90s, even if it's rendering at the same resolution. Just a surface-level example: the Genesis (Mega Drive) and SNES had purpose-built hardware for blitting sprites to the screen. Modern GPUs don't work this way; they work on geometry primitives and bitmaps. What was a handful of assembly instructions handed off to bespoke hardware circuits then is reams of code for projecting geometry and a full pass through a fully programmable rendering and rasterization process today. As with everything else, we're moving towards generalization at the expense of the performance benefits that specialization brings.
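For contrast with the modern pipeline described above, the operation the old blitter silicon performed is conceptually tiny. A software sketch in C (hypothetical names, color index 0 as the transparency key, as was common on palettized hardware); a modern GPU draws the "same" sprite as a textured quad through vertex projection, rasterization, and a fragment shader:

```c
#include <assert.h>
#include <stdint.h>

/* Copy a paletted sprite into a paletted framebuffer, skipping the color
   key. This is roughly what dedicated sprite/blitter hardware did per
   scanline, with zero general-purpose code in the loop. No clipping here;
   the caller must keep the sprite inside the framebuffer bounds. */
static void blit_sprite(uint8_t *fb, int fb_w,
                        const uint8_t *spr, int spr_w, int spr_h,
                        int dst_x, int dst_y)
{
    for (int y = 0; y < spr_h; y++)
        for (int x = 0; x < spr_w; x++) {
            uint8_t px = spr[y * spr_w + x];
            if (px != 0) /* color index 0 = transparent */
                fb[(dst_y + y) * fb_w + (dst_x + x)] = px;
        }
}
```

The point of the comparison is not that this loop is fast on a modern machine (it is), but that the specialized path had essentially no overhead, while the generalized path pays a fixed pipeline cost even for trivially simple images.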
Not really what I was saying. I'm not asking that a retro style game runs 100x better on modern hardware than the original hardware it's mimicking, just that it doesn't run worse than the original. I REALLY don't think that is too much of an ask.
@GTXDash it's a fundamentally different process. While retro style graphics really shouldn't tax modern hardware, it's also just not optimised for it. You'll never get the same efficiency as purpose built hardware out of software.
@@Salabar_ fair approach. You're still running software to do what would have been hardware on older consoles, but it's going to be less overhead. Cool though.
10:20 I've asked Deck 13 if they would consider licensing out their PINA point & click engine, considering it hasn't been in use for more than a decade. They politely declined, but I'm pretty damn sure the reason they said "no" is that I was just a student at the time. They have since given the same engine to another company specializing in Linux ports (RuneSoft), so I suspect it's more of a "don't see the point in licensing" thing.
Software not made for the public may be subject to licensing issues because of the libraries it uses. For example, if the engine contains code for consoles, they can only give that to you as a registered console developer, which some years ago was not possible for individuals, only companies. It may also have been something as simple as RAD Game Tools: everyone used Bink video, and Bink still has no price tag on their website, which means "if you are a company, write to us; if you are not, you can't afford it anyway".
Love the video! FYI, id Tech Engine versions 0-4 were all released free and open source under GPL licenses. Versions 5-7 (and after) are still proprietary, though. Wish id Software would license it out to third-party studios again, like they used to.
Sadly without Carmack at the company anymore I doubt any more versions of iD tech make it into the wild. He was the driver of open sourcing their prior iterations.
Engines became much more complicated and now use a lot more third party tools that may be under license and can not be open sourced. Also we do not know what the build pipeline looks like. Just pressing F5 to run the game is not standard on engines. There are rumours of days long build times on some AAA games.
@@WizardofWestmarch Sad, but true. I just wish Bethesda Game Studios would drop the Creation Engine and fork id Tech for their projects going forward. They actually considered using it for _Skyrim_, but Carmack dissuaded them, since MegaTextures was a major part of its pipeline and he didn't think, at that time, that it was suitable for the type of open worlds they created for their games. Interestingly, id released _RAGE_ the same year BGS released _Skyrim..._ using id Tech... with a semi-open world. Since then, however, they have brought in the Vulkan graphics API (they had been using OpenGL) and gotten rid of MegaTextures completely (with id Tech 7). And the world spaces in _DOOM Eternal_ are quite massive, even if they don't feel like it because the player moves at such high speed. So I'm confident it's perfectly capable of an explorable open world. BGS would certainly have to create all kinds of new mechanics to customize the engine for their particular use case, but I believe that time and effort would be worth it. (It might also be possible to "strip [the Creation Engine] for parts" and incorporate them into their id Tech fork, considerably reducing the workload. But "dropping in" features like that would really depend on how modular id Tech's architecture is.) Still, it would be worth it, because BGS would finally shed their mountain of technical debt and have a solid foundation on which to move forward. Caveat: all the above having been said, once they got their id Tech fork where they wanted it, BGS would be wise to set aside a team whose sole job, at least until the project is done, would be to create a new modding kit for the new engine, to be released day one with the game.
Reminds me of when I ran Doom 2016 on my Surface Pro 6 tablet (an i5 CPU with integrated UHD Graphics 620). On Medium graphics with a slightly lower render scale it ran smooth as butter, even in busy scenes. Was impressed and it still looked great!
What kind of resolution did you use? Could you record with another device some scenes like Vega Processing Level? I remember having troubles with it. Please, do it as I want to see the experience.
Graphics programmer here. Quoting "Threat Interactive" here was a big mistake. The guy speaks authoritatively, but that's it. Most of what he says are falsehoods resulting from a lack of knowledge and the laziness to find out. I watched the video in question and was impressed by how bad it was. For reference, pretty much every single thing he says about Nanite is the complete opposite of what the technology actually is. Brian Karis of Epic Games, the mind behind Nanite, has given a number of technical talks on the subject. I myself have implemented similar technology in the past, so I'd say I have some authority on the subject.
I think it would be great if there was some form of benchmark that can settle the argument once and for all. Nanite with LOD, LOD only with higher poly meshes, and Nanite only with no LOD. If someone can do a test like that, I would love to see the results.
@@GTXDash Generally speaking, Nanite will be faster when all other things are the same. When we do a LOD transition, we're actually drawing 2 models at different LOD levels to mask the "pop". Nanite doesn't have to do that, as its transitions are much more fine-grained, so "pops" don't happen. That said, Nanite will be slower than a custom LOD hierarchy in practice. Why? Because custom LODs will just have much lower triangle density, whereas Nanite will attempt to give you as dense a mesh as it can, but no denser than 1 triangle per pixel. You can adjust that "error" factor to any number of pixels, though. The implementation I worked on used a 4-pixel area, as I saw no noticeable difference and it pushed far fewer triangles to the GPU. I think the comparison in general is not very useful. Nobody creates meshes with every vertex placed by hand; we use tools. Sometime in the 90s we did place every triangle coordinate by hand. Today we don't, and there are plenty of bad topologies (triangle layouts) in production. Why do we not care about individual triangles today? To save time, and because hardware can manage an extra 100 or 1000 triangles without much issue. Nanite is basically that: instead of investing time and slowing your art pipeline with custom LODs and paying for expensive LOD tools such as Simplygon, you just tick a box saying "enable Nanite" and you get amazing results. There are tradeoffs, and there are configurable parameters, but it's not about runtime performance; it's about developer productivity in general. Say you want to make a good-looking game as a solo dev. Do you know how to make good LODs? Do you have a spare $10,000 per year for a Simplygon license? If not: "go home kid, let the adults do this."
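For readers curious about the "error factor" mentioned above, here is a rough Python sketch of the screen-space-error idea behind LOD selection. The formula, names, and numbers are illustrative assumptions, not Epic's actual Nanite implementation.

```python
def screen_space_error(geometric_error, distance, screen_height_px=1080, fov_tan=1.0):
    """Project a LOD's geometric error (world units) onto the screen, in pixels."""
    return geometric_error * screen_height_px / (2.0 * fov_tan * distance)

def pick_lod(lod_errors, distance, error_budget_px=1.0):
    """Choose the coarsest LOD whose projected error fits the pixel budget.
    lod_errors[0] is the full-res mesh (error 0); higher indices are coarser."""
    chosen = 0
    for level, geo_err in enumerate(lod_errors):
        if screen_space_error(geo_err, distance) <= error_budget_px:
            chosen = level  # coarser, and still within the budget
    return chosen

lods = [0.0, 0.05, 0.2, 0.8]  # hypothetical geometric error per LOD, in meters
near = pick_lod(lods, distance=10.0)    # up close: full-res mesh (level 0)
far = pick_lod(lods, distance=500.0)    # far away: coarsest level fits the budget
```

Raising `error_budget_px` from 1 to 4, like the 4-pixel area described in the comment, simply lets coarser levels pass the test sooner, so fewer triangles reach the GPU.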
I have a feeling that the more experienced game devs have moved on or burned out, and what we're left with is the average developer who struggles with all the details and complexity.
this whole video really lays bare that optimization isn't just some long-winded party trick; it's still a necessity, and players will appreciate it whether or not they can articulate it, especially compared to a lot of the poorly optimized bloat-fests we get fed for upwards of $50.
You're pretty off the mark about the comparison between the Doom engine and the BUILD engine. Duke Nukem 3D was fully two years after DOOM and it has MANY features that are simply not present and not possible in the vanilla DOOM engine. It's true that later ports of the engine, particularly those that implement Hexen features like ACS scripting, are capable of doing the same things as BUILD, but those are VERY far removed from the vanilla DOOM game. Your example of "getting a sector to lower with many explosions" isn't something you can do in the vanilla DOOM engine without using the hard-coded Icon of Sin death effect, which has other effects, like ending the game. Which brings me to my other complaint about that section of the video, which is that DOOM was by no means "split" into "engine" and "game". The DOOM executable file is completely full of assumptions about what game it is; all of the "rules" of the game are contained, hard-coded, in the binary. The fact that add-on files can modify the rules of the game is only possible in later ports of the game that support loading integrated DeHackEd files, or more advanced modding systems introduced in ports like ZDoom. The concept of a "game engine" really wasn't discussed among gamers until the conversation became Duke3D vs. DOOM, and that was almost entirely because many documents surrounding Duke3D included a copyright message that said something along the lines of "The BUILD Engine is Copyright Ken Silverman (C) 1995". Nobody talked about Heretic or Hexen being "Doom engine" games until '96 or so; before that, they were just "based on Doom". Ultimately I do think the thesis of your video is reasonably sound, but you have a lot of problems in your fact-checking.
The comparison wasn't between OG Doom, but the Doom engine itself. It doesn't really matter that on release Duke3D could do things like spinning doors, or lowering multiple sectors, when that was already possible in Hexen's usage of the Doom engine.
Sure, you're right about the hardcoded features. But when discussing the ratio of performance to visual quality, it doesn't matter as much whether something is hardcoded. The DOOM rendering engine vs. the BUILD rendering engine is a very close comparison. What's not a close comparison is the BUILD rendering engine vs. the QUAKE rendering engine: the addition of mipmaps, the ability to stop rendering anything that isn't potentially in view, the interesting new uses for the 256-color palette (such as full-bright pixels within textures), the 3D mesh view-models rendered separately from the game world, the lightmap calculations done while compiling the map. It was all fantastic stuff. These were revolutionary techniques, and they changed, or even defined, how real-time rendering progressed from that point on.
@@azazelleblack i agree with both of you- i still agree with the video because it's still true. id software's engines do and have historically outperformed any competing games or game engines targeting the same level of visual quality. id software and panic button have put tons of work into making sure their games run as fast as they know how to make them run- and this is obviously not unique to id software or panic button, but they are objectively doing better in this regard than other studios with other engines. - that's what other developers can learn from id tech.
@azazelleblack so yeah, this is not a contest of which is better, Build or Doom engine. We are all aware of the features Build has over Doom. The thing I wanted to point out is that Build tends to get a lot of the credit for things that Doom can also do, but because the original map authors didn't think to add those effects in their maps, people just assume it's an engine limit. Of course there are things that Build can do that id tech 1 never will, it's just not as extensive of a list as most people think it is. The Doom EXE does also store additional code that one wouldn't nowadays call as part of the engine. But the thing is, back in the early 90s, game devs were still trying to define what an engine is. Is it just the renderer, or is it also the physics, controls and game mechanics?
I doubt it matters, but Halo was originally supposed to launch on PC before Microsoft bought Bungie and had them make it an Xbox exclusive. From a regular dumb gamer's perspective you'd think "porting" it back to PC would be easy then.
33:26 That's why I kinda hope Valve would release Source 2 to fill that gap. The engine itself still inherits the id Tech 2 architecture while being modern enough to work with.
Funny thing is, RPG Maker seems to have a similar design philosophy of being mainly used for a single genre, but it gets a lot of flak for being too restrictive, even though, if you're creative, you can use it to make some crazy unique stuff. And it seems to hit nearly every criterion you have on your list lol
RPG Maker games that bend it too far tend to run pretty poorly though --- as should be expected. It's really made for a specific type of game and hyper-focused on that.
Actually, old VGA cards did support hardware scrolling. What you couldn't have was hardware scrolling using the popular Mode 13h, because you wouldn't be able to access the extra video memory required to scroll around.
@BaddeJimme Yes, certain video cards had limited hardware scrolling, but not in other display modes like CGA and MCGA. Commander Keen still ran fast even on a non-EGA/VGA card, even when the scrolling was handled in software.
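A toy model of the hardware-scrolling trick being discussed, making no claims about real VGA register programming: video memory holds an image wider than the screen, and the display controller's start address picks which window of it is shown. Scrolling changes one register-like value instead of copying any pixels.

```python
# Hypothetical dimensions: a 512-wide map shown through a 320x200 "screen".
VRAM_W, SCREEN_W, SCREEN_H = 512, 320, 200

# Fill "video memory" with a recognizable pattern.
vram = [[(y * VRAM_W + x) % 256 for x in range(VRAM_W)] for y in range(SCREEN_H)]

def visible(start_x):
    """What the monitor shows for a given horizontal start address."""
    return [row[start_x:start_x + SCREEN_W] for row in vram]

# "Scrolling" two pixels right is just start_x += 2; no pixel data moves.
frame_a = visible(0)
frame_b = visible(2)
assert frame_b[0][0] == frame_a[0][2]
```

This is why hardware scrolling was so cheap: the CPU never touches the pixels at all, which is exactly the work a software scroller (like Keen's adaptive tile refresh, in spirit) has to fake.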
I've felt that devs have lost perspective for a long, long time now. The "universal" engines are great, awesome tools, and you can slap together a game really fast... But the longer these engines are used, the more talent is lost for creating a simple purpose-built engine for whatever game you're trying to make. Personally, I love trying to do as much as I possibly can from scratch, and it shows in my engine, both in how long I've been working on it and the quality of the code (long and poor lol), but it's also just a hobby of mine, so that doesn't matter... The point is I learn from it, and I wish more devs would do the same: learn how to do some of the old-school stuff and work up from there... You can even use that knowledge to make Unity and Godot run better on low-spec hardware.
Me as a beginner developer, I love engines. But I do recognize that engines have a lot of features, and honestly bloat, that we don't need and that only bogs down our games. I have watched a few talks and interviews with developers involved in making really old classic games, and the one that really stood out for me was the developer behind Crash Bandicoot. The things they had to do and optimize just to get Crash onto a PS1 are INSANE. I'm using GameMaker and know a bit of Unity, but I'm currently learning Love2D, which is a library and not so much an engine. I'm looking to optimize my games as much as I can this way, and even extend this to any future games I make.
28:36 I feel kinda gaslighted on my childhood, because I remember UT2004 and SWAT4 running really well on my potato PC, while Doom 3 remained borderline unplayable until we upgraded the rig.
Doom 3 had dynamic lighting, stencil shadows and normal/specular/diffuse maps interacting with each other. It was a groundbreaking game engine. Nothing else looked like it when it was released.
@JennyTheNerdBat it's complicated. If you use an older GFX card like a geforce4 ti4600 or even a geforceFX 5800 Ultra, Unreal 2004 will run faster, but the opposite is true if you use hardware that id tech 4 takes advantage of like a 6600 even though a 6600 (non GT model) is often slower than a 5800 ultra in raw performance. If you pause at the benchmarks, you'll see "6600GT, high settings, 1024x768". You can't really lower Doom 3 graphics a whole lot. But in Unreal 2004, you can turn the visuals down so low that it's enough to hurt someone's eyes. Like I said in the video. It's not an apples to apples comparison. But a lot of people see Doom 3 as the Crysis of its day, which it really wasn't.
Making a game engine is tough, it's one step away from making a physics engine and even big game studios usually don't bother with making a physics engine themselves.
Loved the video. Want to point out, though, that Riddick: Escape From Butcher Bay was released on Xbox before Doom 3 (and on Windows slightly after Doom 3). It had all the same features that Doom 3 had, but also some that Doom 3 didn't (cubemap environmental lighting). My favorite being direction-based ambient sounds: in the beginning you hear the wind blowing outside, and depending which way you rotate the camera, the sound of the wind changes. If I recall from the dev documentary, they recorded some of the ambient soundscapes using multiple directional microphones and blended them at runtime based on orientation. This feature is still very uncommon in games and engines today. So it was not just Doom 3 that had all these cool features; Riddick was a much less known game, but it was a technical beast. It was not just a technical marvel, it was and still is a really good game, whereas Doom 3 was a disappointment to many, me included. I recommend looking into Riddick: Escape From Butcher Bay; it might blow your mind how technologically advanced it was at just about the same time as Doom 3.
I'm already aware of Riddick. A lot of games used many of the techniques that defined Doom 3, but Doom 3 combined all of them, including full dynamic shadow volumes at a global scale. Not to say that Riddick didn't look great. It may not be as dynamic as id Tech 4, with its simpler environments and shadows, but it's still impressive.
@@GTXDash Thank you for answering. But Riddick used the exact same shadow technique that Doom3 did (full stencil volumes extended from polygon edges and shaded per-pixel, including objects being able to self-shadow). Both Doom3 and Riddick used per-pixel shaded normal mapping with full dynamic lighting. Riddick had the extra capacity of pre-calculated environmental cubemap lighting and reflections that Doom3 did not support during launch.
@ristopaasivirta9770 that may be true. But the only way I personally can know that for sure is if I can experiment with the engine... unless... Does the PC version allow for custom maps? If so, I would love to look into it to see what it is truly capable of.
@@GTXDash Yeah, it never was very moddable or editable. There used to be some tools back in the day, but they are probably really hard to come by nowadays. So it could be really hard to verify yourself, and I can only give anecdotal statements, having tinkered with game engines my whole life. Anyway, I just wanted to give a shout-out to the game, to give some counterweight when people say that "Doom 3 was ahead of its time." There were others that did the same things, and in my opinion Riddick did them better. Doom 3 just happened to be the most popular and visible one.
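Purely as a guess at how direction-blended ambience like the wind effect described above might work (not Riddick's actual implementation, whose details aren't public here), one could weight two directional recordings by the cosine of the listener's angle to each microphone's facing:

```python
from math import cos, radians

def blend_ambience(yaw_deg, north_sample, east_sample):
    """Blend two directional ambience recordings by listener orientation.
    Weights come from the cosine of the angle to each (hypothetical)
    microphone direction, clamped so a mic behind you contributes nothing."""
    wn = max(0.0, cos(radians(yaw_deg)))        # weight for the north-facing mic
    we = max(0.0, cos(radians(yaw_deg - 90)))   # weight for the east-facing mic
    total = (wn + we) or 1.0                    # avoid division by zero
    return (wn * north_sample + we * east_sample) / total

# Facing north you hear the north recording; at 45 degrees, an even mix.
front = blend_ambience(0, 1.0, 0.0)
side = blend_ambience(90, 1.0, 0.0)
mix = blend_ambience(45, 1.0, 0.0)
```

Real engines would do this per audio frame over full sample buffers, and likely with more than two capture directions, but the orientation-weighted crossfade is the core idea.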
This is the stuff I've been saying to my friends in the gamedev space for quite some time. I do create my own engines for the games I make, and that's not a brag, because the truth is, it's not as hard as people think to make an engine for a single game. And I'm so glad that I've gone this route. Every time I go try to use a general purpose engine for something, I'm disappointed.
As a complete hobby dev, I know from personal experience that GMS has two compile modes: VM and YYC. VM is as it sounds: the game is compiled to bytecode that a virtual machine runs, which is often a much slower way of running a game. YYC, meanwhile, compiles to essentially a native C++-style executable with faster performance.
"The new doom engine was much more futureproofed." Me, loading up e1m1 for the hundredth time with a laundry list of mods. even so far into the future, the engine is so versatile that it holds a dedicated development and modding community to this day.
I love the deep dive on the engines here. I definitely agree that specialized engines would do a lot to make games more optimized, but I think the economic reality of AAA will push towards a few "general use" engines. For example, CD Projekt RED and Bethesda are both switching away from their custom engines to UE5, largely because it's easier to hire for.
I think we need more competition besides Unreal Engine 5, ngl. Unity kinda nuked themselves in the foot. More choices of powerful 3D engines that could expose Unreal Engine's weaknesses would be really nice for consumers. It would mean Unreal was forced to optimize their engine once a competitor made a well-optimized engine that people would actually use for their projects.
Just wanted to say, give credit to the fact that there is now an official update to Doom on SNES coming out soon that will run at around the mid-20s in fps! That's really impressive.
Modern Game engines are suffering from the same thing that RISC-V is trying to solve on processors. Backwards compatibility, legacy code and a fork JUST in case someone wants to connect a CRT or serial Joystick. But for game engines, you have 3D features included even though the game is only using 2D assets, BUT in a 3D plane to fake parallax. Like making a Mass Effect game in a First Person Shooter engine (Andromeda).
To be fair andromeda had the best gameplay of all the mass effect games, not that it was a high bar, I love ME but it's carried a lot by its setting and story, and that was the problem with andromeda, the writing was so bad, and the setting kinda got change with the change of galaxy. But the movement was great and the gunplay was better than the other games, at least until you max your damage skills but your character keeps leveling and the enemies scaling with it, turning them into sponges, awful balancing, but easily fixable with mods. Also not letting you give commands to your squad? Come on.
@@ERECTED_MONUMENT Pretty sure they still believed in "Bioware magic" at that point. And they were being forced to use the EA FPS engine to suddenly make a 3rd Person RPG. They had to write a lot of code from scratch, WHICH did benefit future EA projects immensely, BUT is also why EA just keeps axing their studios.
@@hunn20004 Yeah, the "Bioware magic" didn't truly fall apart until Anthem, but didn't Dragon Age Inquisition come before Andromeda? It also used Frostbite, so they must've had some tools from that to use in Andromeda, unless there was zero communication between the teams.
@@ERECTED_MONUMENT True, Here's what Leo AI found on it: The Frostbite engine was criticized for its limitations in handling the RPG genre, particularly in Mass Effect: Andromeda. In contrast, Unreal Engine 3 was praised for its flexibility and suitability for RPGs. The use of Frostbite 3 led to some technical issues and visual inconsistencies in Andromeda, which were noticeable compared to the original trilogy. So i've mentally compressed the issues to "Frostbite was bad for Bioware" so a lot of intricacies did get lost during the last 7 years.
@@hunn20004 Yeah, at minimum Bioware should've spent some time retraining to get the studio ready for the engine change. Maybe they should've worked as a support team for one of EA's other titles. That being said, at some point they needed to change from UE3 anyways, UE4 only released in 2014, and wouldn't have been a smooth transition either since so much changed from UE3 to 4.
Thanks for sharing this video. I love everything id does. Btw, I wonder why you were mentioning Godot so frequently, considering how abysmal its performance is.
I just use Unity, Unreal and Godot as strawmen for "other competing engines" :P I'm actually fairly familiar with Godot since we used it once for a game jam a few years ago, and I thought Godot was... fine. It gets the job done.
Fair enough 😅 I suppose for 2D that is accurate. I am considering building my next project from scratch, using raylib and C++, more as a learning experience than anything else.
I had to double check if Build really couldn't legitimately do rooms over rooms. But if I'm still wrong then it seems that there's a fan wiki or two that needs correcting. 😅
The newest example of this is Penny's Big Breakaway: a game engine tailored for this specific type of game, built from the ground up, that doesn't use any more graphics features than necessary. It runs amazingly well.
32:15 - you mention that idtech used OpenGL rather than DirectX in the early 2000s, but Unreal Engine 1 and 2 both featured OpenGL renderers at launch, no? (They also both featured a software renderer).
Unreal 1 ran best and was most optimized for Glide, but it also supported D3D, OpenGL, S3 MeTaL, etc. It was very API-rich and thus very compatible. Unreal 2 used D3D, but you could switch to OpenGL; both were supported.
@unfa00 Technically, yes. But have you actually used OpenGL in Unreal? It's a buggy mess, such an afterthought. Not surprising, because Tim Sweeney openly hates OpenGL. Unreal 1 was built specifically for Glide and DirectX; Unreal 2 and 3 targeted only DirectX, though after some modifications they could do OpenGL. But that was a lot of work, which is one of many reasons people dreaded porting Unreal 3 games to the PS3.
@@GTXDash In Unreal 1, OpenGL became a must. Initially it ran best under Glide; remember, the very first versions of Unreal were Glide-only, and D3D was always buggy though stable. The common quirks in D3D were the added contrast/saturation when running in 16-bit mode, and the weird detail texturing, which was off by default, but when you enabled it in the ini, the textures were far more dense than they should have been. Later, when the S3TC high-res texture pack arrived, it worked only under S3 MeTaL and OpenGL. That is, of course, me speaking about the improved OpenGL renderer that became available later, as the original one was quite meh. But later it was a must-have. UT2004 had OpenGL as well and worked fine except for one issue: it only supported blob shadows. I still used it, as my 6800LE back then ran out of VRAM when everything was maxed at 1600x1200 under D3D, which did not happen with OGL, so I played with OGL. With Unreal 3, however, OpenGL was dead, and porting that to the PS3 must have been a pain.
23:10 The PlayStation 1 only worked with integer-based coordinates, so it had less to calculate and could be cheaper, but those vertices had to jump around; that's why the original PlayStation looks as if it wobbles and jiggles a lot. By contrast, the N64 did everything much more granularly and thus had to do far more calculations for the same "graphical fidelity". So the tradeoffs were built into the console for the PlayStation and built into the games for the N64. The N64 was a better play experience when it worked, but the PlayStation was a better experience if you didn't care about all the smaller details, or the massive amount of jiggling.
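A toy Python illustration of the integer-snapping effect described above (the real PS1 GTE's fixed-point details differ): a vertex drifting slowly across the screen holds still for several frames and then pops a whole pixel over, which is exactly the "wobble".

```python
from math import floor

# A vertex drifting rightward by a tenth of a pixel per frame.
positions = [10 + i / 10 for i in range(15)]

# PS1-style: screen coordinates snapped to integers before rasterization.
snapped = [floor(x) for x in positions]

# The vertex sits at pixel 10 for ten frames, then jumps to 11:
assert snapped == [10] * 10 + [11] * 5
```

Subpixel rasterization (as on the N64 or modern GPUs) keeps the fractional part, so motion stays smooth, at the cost of more math per vertex and per pixel.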
Yeah. Ken Kutaragi's priority was for the console to generate as many realtime UV-mapped polygons as fast as possible, and this was a pretty clever way to get there: ditch the Z-buffer and have the verts snap only to integer coordinates. You didn't even notice it as much at the time on 480i interlaced CRT TVs, so it was a good compromise, and seeing the demos of Ridge Racer running on a loop at an Electronics Boutique store convinced me to buy a PS1. I was playing PC games at the time, but even then, I hadn't seen 3D polygons run that smoothly before.
@@rodneyabrett It was definitely very noticeable back in the day on a CRT. Maybe not _as_ apparent, but most PSX games were very wriggly and janky-looking even back then.
@@rodneyabrett Oh, yes, upscaled old 3D games look terrible and break all art direction in general, but it's even worse with PSX games due to all the glitchiness. I can't get my head around why people insist on doing that, much less why they insist on it being an improvement.
It was intensive, yes. But it was also far easier to understand and reason about. The rendering pipelines nowadays are so complex that you can't really keep a complete mental model of your renderer in your head. I can't, at least.
Check out "Second Reality" by Future Crew. That production came out before Doom, and it's a showcase of what was possible in those days. IMHO the best thing ever coded on PC.
There was graphics acceleration since the 80s in consoles, arcades or the Amiga. But specifically for 2D rendering (sprites, blitter). It was using specific hardware, but was not the GPUs we know today.
@@phojamantirasoontrakul That is very true. The background/sprite-based systems on consoles were designed to minimise the amount of work the CPU had to do. By focusing purely on deltas, they could punch way beyond what it looked like on paper. You had the bizarre situation where PCs had CPUs way beyond what the consoles had, but struggled to really put that power on screen. Consider that the original Mac and the Genesis/Mega Drive have the same CPU but very different results.
nicely written video, I think it's one of the best overviews I've seen of all the different iterations of id Tech and their strengths and innovative features. 37:36 surely you can run the game in a VM to check if it runs e.g. in a slow single core with limited RAM for fun?
I think a better version of the van and motorbike example: the van needs to be big and featured enough for anything it could possibly need to carry, but building a brand-new motorcycle that is slimmer and almost perfectly efficient for delivering your single specific package is generally too expensive and time-consuming.
Just like how shipping companies don't need to design the vehicles they use, a publisher should be able to purchase a license to a specific engine they need for that specific game, without always going through the process of building their own engine.
Writing your own engine (like Dead Cells did) can be an advantage, but you need strong programmers to do that in A/AA production. Unity or Unreal are great for most games, and most games built in them skimp on optimization; there's a lot of work to be done in the optimization department that could make those games run better. Only AAA studios can purchase a license for something like id Tech, or maintain their own tech, but that's expensive and hard; e.g., even a big company like CDPR ditched their REDengine for Unreal after the Cyberpunk fiasco.
@roklaca3138 The problem is it takes a lot of time, effort, and resources to create and maintain an engine. So much that some even say that once you start making your own engine, you're not a game dev anymore; you're an engine dev.
Once you've watched any video from Threat Interactive, DF's tech analysis looks anything but in-depth. It's the most plebeian tech analysis you can find, one step above "it's pretty", and the most "catering to the establishment" as well.
I don't think it's all or nothing. I try to look at problems from multiple perspectives. Threat Interactive have been very critical of DF's opinions of TAA and such. DF is very useful for data collecting, not necessarily their opinions on the direction that the industry needs to go.
Really liked the part where old games are compared to new ones, especially the example about a retro-style game for modern consoles that is simply built differently for them.
Great video. I’ve always been super interested in how games like half life 2 and other source engine games get their look. I’ve seen lots of super high quality animations with incredible art direction come out of G mod and it makes me wonder how source works
I'm a baby game dev... but this is literally why I use Bevy over any big game engine. I take the pieces I need and have low-level control. Does it suck to roll your own tooling? Sure... but ffs it's nice to be modular and slim.
GameMaker is quite optimized for drawing sprites. The last couple of projects I've worked on run at a "real" FPS (though capped at 30) of over 1000 when compiled, and include a small bit of 3D as well. Granted, I was working on them on my gaming PC, but they run fine on cheaper, older computers without dedicated graphics, in the hundreds of FPS. GameMaker has been around since '99... running on 90s / early-2000s hardware just fine (I remember playing GameMaker games as a kid, as well as the demos that came with it). Unless you're drawing many hundreds of dynamic objects, the FPS for a pixel-art side scroller shouldn't dip unless there is something seriously wrong. Maybe "non-code" GameMaker is slower, no idea about that, but I seriously doubt a single professional game has shipped using much "code-free" GameMaker slop.
@@lloyd011721 Honestly no idea, the game doesn't appear to have shaders or lighting. It is possible to draw things offscreen without restriction - like drawing the entire level regardless of the viewport and not disabling / deleting things outside the viewport. That would be kind of an insane oversight, and the dips seem to appear only in combat. Maybe porting is terrible in GM; I haven't ever heard that on the forums, but if it were terrible for one console it'd be the Switch lol. Edit: also just remembered some people do pixel-perfect scaling by literally tripling or quadrupling the size of the images rather than scaling. I have used actual scaling; that's another possibility, but it couldn't explain it completely.
Lion Entertainment did a great port of Doom to the Macintosh. That was the only time I dabbled with a map editor. Around that time, Marathon on the Mac was an exceptional game. Never understood why, at the time, there were always more audio channels to be heard on Mac ports. Enemy Territory: Quake Wars lasted for like a minute in gaming. Still remember its accomplishment: one of the largest multi-texture sets to be pre-cached for rendering. Was always impressed with the Guild Wars 1 engine for an MMO. Tabula Rasa, Tera, and Defiance were other games where I felt the engine ran well too.
Modern tech is in dire need of simpler solutions, not over-engineered crap. It's Wirth's law at its finest: better hardware just leads to worse software.
And software quality will keep deteriorating even more once AI solutions become ever more prevalent. Take a look at how every game developer uses DLSS, XeSS, FSR, and so on as a crutch for bad optimization and terrible image clarity and antialiasing.
@@barrupa Good opportunity to tell you about Threat Interactive, a YouTube channel about graphics optimisation. They made a video about how TAA is misused in modern games; it's the first video on their channel.
My sentiments exactly. What they need to do is return to old, time-tested, tried and true game design and optimization techniques. Old techniques combined with the horsepower of new technology and hardware are a win-win combination.
And I'm glad you mentioned Threat Interactive, because I love his videos! He's singing my song!
I have gotten into the habit of calling AI upscaling, frame generation, etc. "software duct tape," because that's how devs are using them, hoping they can slap them over their lazy, shoddy work and call it a day.
@@guyg.8529 I had already seen some of his videos. I'm glad I'm not the only one who hates the overuse of TAA as a crutch for low-resolution per-pixel shaders and their artifacts. It's no wonder old forward-rendered games look far clearer and sharper. Everything nowadays looks like I forgot to wear my glasses.
@@guyg.8529 Hardware manufacturers are in on this: if engines were optimised, you would not need an i9 and a 4080 to run decent settings. Take a look at some indie games boasting great graphics that you can run on decent settings even on a 1060... meanwhile nshitia and AMDumb are pushing $1000+ hardware down your throats for shitty UE5 stutterfests.
As someone who has worked with the Build Engine for 20 years at this point: Another reason why Build is a bit slower, is because everything is dynamic, there's no static BSP tree. This can be considered an engine feature, as all parts of a level can change. For example, you can add and remove sectors from a level in realtime if you know what you're doing, something IDTech1 has no hope of doing. There's also things like planar sprites (floor/ceiling and wall-aligned sprites, which can be used as platforms, decoration, etc. For Doom, the only ports that can do this are ZDoom-based.)
Another thing to note about Build is that every two-sided line is a portal. This allows crazy, overlapping, non-Euclidean geometry without the need for teleportation. The only condition is that you can't render two overlapping pieces at the same time, which requires chopping overlapping sectors into smaller bits to prevent rendering issues. Duke3D uses teleportation in a lot of instances since it was easier and less intensive. The level "Tier Drops" is an example of this feature taken to the extreme: all of the different biomes in that level occupy the same physical space in the map.
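Since no actual Build code appears here, this is a hypothetical C sketch of the idea described above: visibility is discovered at runtime by flooding outward from the camera's sector through two-sided walls (portals), rather than read from a precomputed BSP tree, which is why sectors can be added or removed on the fly. The `Sector` struct and function names are made up for illustration.

```c
#include <string.h>

#define MAX_SECTORS 64

/* Hypothetical sector: just its portal connectivity. In a real portal
   engine, each two-sided wall stores the sector index on its far side. */
typedef struct {
    int neighbor_count;
    int neighbors[4];
} Sector;

/* Flood from the camera's sector through portals, marking every sector
   that could be reached. A renderer would additionally clip each portal
   against the screen-space window it is seen through, which is what
   keeps two overlapping sectors from drawing at the same time. */
int mark_visible(const Sector *sectors, int num, int start, char *visited)
{
    int stack[MAX_SECTORS], top = 0, seen = 0;
    memset(visited, 0, (size_t)num);
    stack[top++] = start;
    while (top > 0) {
        int s = stack[--top];
        if (visited[s])
            continue;               /* already reached via another portal */
        visited[s] = 1;
        seen++;
        for (int i = 0; i < sectors[s].neighbor_count; i++)
            if (top < MAX_SECTORS)
                stack[top++] = sectors[s].neighbors[i];
    }
    return seen;
}
```

Because the graph is walked fresh each frame, editing `neighbors` at runtime (adding or deleting a sector) needs no recompile step, unlike a static BSP.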
That explains it. I remember watching a (I think it was) Games Done Quick TAS of Duke3D that was able to break the sectors in a hilarious way, and what you explained has now made me realize what was actually happening in that TAS.
I try to not pit Doom against Build since the Doom engine definitely became more demanding after Raven added those extra features for Hexen, and it still used static BSP maps... I think. I just like how similar and yet different these two engines are from each other, BSP vs portals, etc.
Even though I prefer speed over feature set, during my research I realized that Build isn't as demanding as I initially thought. A few weeks ago, I finally got my 486 running properly, and when I ran Duke3D I was surprised how much faster it ran than expected. I say surprised because when LGR first built his ultimate woodgrain 486 DX2 PC and tried to run Duke3D, he got terribly low frame rates. My 33 MHz DX ran slightly better than that. I think there's something wrong with his woodgrain PC, since it couldn't even run Quake after he installed a Pentium OverDrive. Go figure. I had to rewrite that section of my script after testing it on real hardware.
I really appreciate the comment.
@@GTXDash Yeah, Build's speed is very dependent on the video adapter used rather than the CPU itself. It really prefers hardware capable of VESA LFB (Linear Framebuffer) mode, which is part of VBE 2.0 (VESA BIOS Extensions). With hardware that doesn't have it natively, SciTech Display Doctor (aka. UniVBE) can help a lot with improving framerates, but it won't be as good as the real thing obviously.
Using a PCI or AGP video card rather than ISA will also go a long way to improving performance, if the system has the slots available.
And yes, Hexen still uses the BSP tree. The swinging doors and whatnot are a trick called "PolyObjects". They are extremely limited in how they can move. I'm also pretty sure they need to be convex in shape or they'll have sorting issues.
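To illustrate the linear-framebuffer point a couple of comments up: in a banked VGA mode, every pixel write may first need the right 64 KB bank mapped in via a slow BIOS/port call, while VBE 2.0's LFB exposes the whole screen as one flat array. This is a toy C model with a stand-in bank-switch counter, not real VBE code.

```c
#define SCREEN_W  320
#define BANK_SIZE 65536   /* classic 64 KB VGA memory window */

static int bank_switches;        /* counts simulated (slow) bank changes */
static int current_bank = -1;

static void set_bank(int bank)   /* stand-in for a VBE far call */
{
    if (bank != current_bank) {
        current_bank = bank;
        bank_switches++;
    }
}

/* Banked write: must ensure the right 64 KB window is mapped first. */
void put_pixel_banked(unsigned char *window, int x, int y, unsigned char c)
{
    long offset = (long)y * SCREEN_W + x;
    set_bank((int)(offset / BANK_SIZE));
    window[offset % BANK_SIZE] = c;
}

/* Linear framebuffer write: one flat address space, a single store. */
void put_pixel_lfb(unsigned char *lfb, int x, int y, unsigned char c)
{
    lfb[(long)y * SCREEN_W + x] = c;
}
```

Any row-spanning blit in a banked mode pays for those extra `set_bank` calls (and the branch mispredictions around them), which is roughly why Build's software renderer prefers LFB-capable adapters.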
@StrikerTheHedgefox yeah, I remember opening a hexen map in Doom Builder and noticing a bunch of duplicate swinging doors just outside of bounds. It took me a minute or two to realize what I was looking at.
@@StrikerTheHedgefox I had a feeling you were going to mention Tier Drops 😅. Build was fun to work with! Moving from it to Unreal Engine blew my childhood mind.
Pretty sure the root issue is that a good, optimized game engine (be it in-house or third-party) needs a few very passionate and involved developers who are so comfortable with the engine, and have enough charisma and influence, that they can identify and prevent performance issues and suggest alternatives already in the early stages of development. But these kinds of people are a problem in a big corporation setting, because if you have 1000 employees and your entire game can collapse when those 3 core people leave the company, you are fucked. So all big companies either a) are disincentivised, b) don't even try, or c) tried but lost it along the way when it comes to cultivating such tech. id Software was always technology-driven and always had enough success to stay alive. So it's a rare gem.
And it only gets worse. Given the direction the industry is heading, traditional optimization techniques will soon be replaced by relying on AI upscalers as a crutch for achieving decent performance, at the cost of image quality, with blurriness and artifacts. It's no surprise that older games with traditional forward rendering, or per-vertex shaders instead of per-pixel, often appear much sharper and clearer. This is especially true when traditional MSAA or other forms of multisampling are used, compared to modern games that employ a multitude of deferred techniques and low-resolution per-pixel effects, which require temporal accumulation and spatial upscaling to avoid looking like a dithered mess.
To be honest, the technology available when Crysis was released (2007) is more than sufficient to create good-looking games, especially if subsurface scattering and distance-based diffuse shadow maps (like those used in GTA V) are added. Everything developed since then seems to exist either to sell more hardware or to reduce the burden on developers and studios, such as AI upscalers or ray tracing.
I think the future looks grim, especially considering how mainstream GPUs have become power-hungry monsters ever since NVIDIA realized it could push enterprise-level hardware to consumers at insanely high clock speeds in the hopes of making real-time ray tracing viable for game development. This shift places the burden on consumers, who are now expected to shell out $300 or more for a 250W "space heater" that can barely run a game at 1080p60 with upscaling
Man, Quake 3 and id Tech 3 truly defined a generation. So many great childhood games ran on that engine: Call of Duty, MOHAA, American McGee's Alice, Star Trek: Elite Force, Return to Castle Wolfenstein.
I've been in game development since the PlayStation 2 days, and one thing I can say contributes to this is the move to algorithmic optimization solutions over manual, artist-driven methods. They are faster but sloppier in many ways. Nothing beats creating LODs by hand, where the artist has far more discernment about how to turn a 200k mesh into a 50k mesh without losing the base silhouette of the model and while keeping UVs clean. I've met many younger 3D modelers who can't optimize as well because they lean on this auto-decimation tech too much. It works in a pinch, but you don't want to depend on it. DLSS has also made us lazy with optimization.
There's also the move to 100% realistic lighting over the Gen 7(PS3/Xbox 360) hybrid lighting where you had some realtime lit environments blended with pre-baked lighting. That alone is a major resource hog.
When Unreal first introduced Nanite, I knew it would lead to games running way slower. lol.
Oh, finally! Someone who actually works in the industry. I have a question. Before modern methods, did LOD work kind of like mipmaps, where you have multiple models of the same object or 3D mesh with varying degrees of complexity? I know how id Tech 3 does this, but I'm not entirely sure what the industry standard is.
@@GTXDash Yeah, it's similar. It's distance based from the player camera and you can create as many LODs as you need(LOD0 is the default model, followed by LOD1, LOD2, LOD3, etc.).
For skeletal meshes, it's multiple meshes skinned to the same skeleton, but the optimization can also happen with bones and skin influence count (how many bones can deform each vert). So you can cull fingers on a human mesh at LOD2, for example, as you wouldn't see the finger deformation/animation from that far away. Even in AAA games, I can always tell when the devs used solutions like Simplygon or InstaLOD. The algo method doesn't look as clean, especially in VR games where the polymesh is more noticeable.
InstaLOD is still a great solution and saves you so much time in production, so I'm not knocking it entirely. But when you have the time to do optimization right across all disciplines in the pipeline (not just art but also the engineering side), you can do magic with performance. A good example is some of these "impossible ports" on the Switch, like Wolfenstein II.
As far as industry standard? I'd say that's largely relative to your budget, team size, and what your deadlines happen to be.
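The distance-based LOD scheme described in this thread (LOD0 as the full-detail mesh, swapped out as the camera gets farther) can be sketched in C like this. The thresholds and triangle counts are made-up illustrative values, not any engine's defaults.

```c
/* One LOD entry: the camera distance at which this level takes over.
   Levels must be sorted by ascending min_distance, with LOD0 first. */
typedef struct {
    float min_distance;   /* use this LOD at or beyond this distance */
    int   triangle_count; /* illustrative only */
} LodLevel;

/* Pick the coarsest LOD whose threshold the camera has passed.
   Real engines add hysteresis or cross-fades so the swap point
   doesn't flicker when the camera hovers near a threshold. */
int select_lod(const LodLevel *levels, int count, float camera_distance)
{
    int chosen = 0;  /* default to full-detail LOD0 */
    for (int i = 1; i < count; i++)
        if (camera_distance >= levels[i].min_distance)
            chosen = i;
    return chosen;
}
```

The "cull fingers at LOD2" point above maps onto the same idea: each `LodLevel` would also carry a reduced bone and skin-influence set, not just fewer triangles.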
Thanks for the interesting video!
Cheers
what you've mentioned here is the number one reason i love doom eternal's id tech 7. the lighting uses very efficient shadow maps, but mainly relies on baked lighting to get the job done, as well as cubemaps for all of the blurrier reflections and viewmodel lighting. it's blazing fast, and it's done with artistic intent everywhere you look. i genuinely feel that this produces better visuals than something like immortals of aveum's unreal engine 5, which pushes for entirely real-time, on-the-fly lighting. it looks fine, but runs terribly. in my experience, doom eternal runs at about 4x the speed of immortals of aveum on the same GPU...
@@railroadism I played through Sekiro again recently, and that game uses 90% pre-baked lighting, cubemaps, all the older methods, and it not only runs buttery smooth at over 100 fps in many cases, but it's visually a more beautiful game with a more painterly look to the render than something like Immortals of Aveum. The juice sometimes isn't worth the squeeze with the newer lighting.
Absolutely. Even to this day I am still impressed when I see artist-driven LOD done right. And by see, I mean done in such a way that it cannot be seen: you know there is LOD going on, but you cannot see the transitions. I fall into the 'artist-driven' camp far more than the advanced-tech one. Both have their place, but it feels like we now rely on tech to solve problems rather than figuring out more elegant solutions. We are brute-forcing so much nowadays. Which is why, when you see something that is artist-driven (Doom is a great example), it is a reminder that we can still do great things while using great tech at the same time.
And I agree: things like Nanite, reconstructive detail shaders, temporal upscaling, etc. have led to some very sloppy work over the last few years. Why optimize things as much as we used to when you can just throw an upscaler at it, complete with detail fizzle? The number of games now with laughable base render resolutions is astounding; some are approaching PS2-level render resolutions, and that is just wild to see. That said, there are things like that being done in titles like Doom (2016) on Switch that you can only catch in rare moments. It used that tech to near perfection, where it made sense. That I can get on board with.
Sounds like 39 minutes of "use the right tool for the job," and id is a beast at doing what it does best.
Not sure if Arkane uses id Tech well (the Void Engine is less than 10% id Tech 5).
A lot of old games looked amazing because they would "bake" the lighting effects into the map. Basically, the compiler does a bunch of calculations at map compile time to see how the light from the lava should fall on the walls, and it colors the walls accordingly. It would take like 20 hours to compile a map for Counter-Strike or Team Fortress Classic because of all these lighting calculations. The only downside is that baked lighting can't cast dynamic shadows, and most gamers really don't care. I can enjoy the environment with the shadows cast by the environment (baked) even if it doesn't have shadows cast by the player (dynamic).
Modern games seem to skip all of that and just force the GPU to calculate everything on the fly. You get a game that looks the same but the GPU demands are 10x higher.
Most games are open world now, baked lighting just doesn't work
@@AdrianMeredith True. Modern games also try to do dynamic lighting (a fire that flickers), and that makes it so you can't bake the lights.
Rage is greatly underappreciated. The way they utilized large textures broke up the static, flat look of most engines at the time and made it look... real. The game itself was critically panned, but it offered something unlike what other games were bringing.
Mega Textures were dropped after Doom 2016
@@crestofhonor2349 That's true, but it was really the defining technology Rage brought. It helped sell the immersion really well, but, perhaps in a way relevant to this video, that feature alone wasn't contributing to the gameplay mechanics that would have let the game stand out. You spent a lot of time interacting with other characters in hubs, and folks wanted a Doom 2016. I think Rage was trying to be a blend of Fallout and Borderlands. No one was accusing id of having too much story and dialogue from Wolfenstein through Quake, and those additional story elements were partly to blame for the public reception. I liked it, but I know that's not always a shared view.
The fact that they could do megatextures on consoles with only half a gigabyte of total memory and on a disc with only 9GB of space is also insane. Nowadays every game insists on having every surface be its own massive bespoke texture generated by Substance Painter's procedural generation with no tiling, and it's the main reason games have ballooned to over 100GB apiece. Whatever Id was doing in 2011 is still way beyond the rest of the industry.
@@stevethepocket mega textures had a ton of flaws which is why they were dropped in later iterations of ID Tech. Also they would not help modern games at all.
How small do you expect modern games to be?
@@crestofhonor2349 Small enough that I can fit my whole collection onto one drive would be nice, especially given the risk that they'll be gone forever if they get delisted and I don't already have them downloaded.
I half remember a quote from Marcus Beer on Epic War Axe when Rage came out.
"Id have some of the best technical artists and programmers in the industry; but they desperately need to hire some fucking good game designers"
Which is funny, because id has a long history of mistreating its designers and id Tech clients like Splash Damage, particularly when John was CEO.
I think the best games get the balance right between artists, designers, and programmers. It's just a shame the industry splits these disciplines up, leading to imbalances in power.
And now they wanna replace these essential people with AI and sensitivity reading narrative designers. While it's hilarious to watch, I can still have fun with old games.
Yeah Rage was kinda disappointing. Especially that nothing of an ending.
@@realhet I don't mind LLMs being used as assistive tools for artists to be more productive, but we ain't there yet due to issues with creating valid, lean, and legal models. It's a bit like needing a million monkeys on a million typewriters to produce the works of Shakespeare at random: statistically possible, but the computation time is almost infinite.
As for sensitivity-read narratives, I'd say it's good at addressing historic misogyny, racism, and propaganda tropes in games, but I find the focus on "relatable, empathetic characters" goes too far and ends up creating "cinematic character action games". The Dad of Boy reboot, I find, went too far in changing Kratos from a tragic anti-hero to a redemptive father figure. Had they gone with a new Norse Viking protagonist, I'd be all for it, but it felt like a large retconning of the IP for "modern audiences". So I get what you mean.
Like I said, it's all about getting the balance right, and that's a difficult thing to do in the games industry.
@@sealsharp oh another seal! 😅🦭🐟
For me the disappointing thing was avalanche making Rage 2. 🤭
@@OdysseyHome-Gaming I think there are more people to balance: management, sales, marketing, CEOs, and shareholders. When you finish a game, their names scroll for 20 minutes. :D If they are able to replace the artists, developers, writers, and programmers fully or partially with AI, then they will get more profit. They don't give a damn about how good the game is as long as it makes exponentially growing income each year.
These people are essential to execute a business plan.
Artists, developers, programmers, writers, actors are needed to invent stuff that is OUTSIDE the boundaries of common knowledge and imagination.
Rage 2: another game that utilized the color 0xFF00FF, as did Far Cry New Dawn. But I should not have an opinion because I played neither. The start of Rage 1, when you step out of the crashed spaceship, was amazing, though. The end of it indeed was just a grind.
While id Tech is great for smaller maps, it's not feasible for large seamless open worlds. Most of their releases have been linear, smaller maps connected by loading screens.
It's all about matching a game with the right engine. With that said, I thought Rage pulled it off pretty well... as long as you use an NVMe drive to better hide texture pop-in.
@@GTXDash Rage does have large maps but again, broken up by loading screens. It would suit games with linear progression and large levels such as the The Evil Within series
@trblemayker5157 Personally, I don't see "loading screens" as an issue since other open world games have tons of loading screens like in Skyrim.
I completely forgot that this is the spirit of game dev. Getting the most out of hardware with what you had. Limitations are key.
I've always said: Zenimax having full control over idtech is one of the worst things that have happened in relation to game engines.
So far, they have not interfered with id Tech. In fact, I wish they would adopt it across more of their lineup rather than using... whatever the hell is powering Starfield.
@@GTXDash It's true that they haven't interfered yet, but they're certainly not licensing it out to other companies like they frequently used to. I wish their other subsidiaries would use it too, LOL
@@skew5386 It'd be interesting to see the Elder Scrolls team collab with iD to replace Creation Engine with something based on iD's work but with the tooling Bethesda needs.
This is really what needs to happen.
@@skew5386 They can't license something that has middleware from others, it's not as easy as you think.
Why do you think id Tech 4 was the last engine whose source code id released? id Tech 4 had no code from others; it was all id's stuff.
And yes, I know about the Creative BS, but after Carmack rewrote code for the shadows, it was all good to be released.
idTech is so awesome.. I ran Doom Eternal at Ultra Nightmare settings, 1440p (DLSS Quality), with Ray Tracing at 100+ FPS at all times...
had the same experience with the Wolfenstein Games, nothing else performs like idTech games...
Good video breakdown of this important game-engine history. It's a shame that id Tech is no longer headed by Carmack, no longer open-sourcing its old engines, and no longer competing with Unreal as an engine that even independents can use. Your game looks great; I will be buying it on GOG once it's out. Keep up the good work!
Game devs need to release more stuff on GOG. I love that site.
@@GTXDash Agreed, and that's why I'm going to get it there and not steam.
the way forward is back
simplified, custom engines
I agree; it's actually a lot easier to write a custom engine these days.
your mistake here is comparing normal, mortal, developers to John Carmack
This. Of course, there are amazing graphics programmers that work on stuff that runs badly, but they squeezed as much as they could to even get the thing to run.
the games that stick with me are the ones with in-house engines. rent-an-engine leads to a bunch of games that look and feel the same with different skins, so you end up with a soulless conveyor belt of product. this is ok for a business, but not for games as an art form.
that said, ive been writing my own engine on and off for almost 20 years now and its still very rudimentary. its 95% lua and runs on a dead-stock interpreter with a few custom modules to handle apis, low-level stuff, or things that need to be fast. i want as little 3rd-party code as possible (though i am using a few 3rd-party modules as placeholders).
The problem, here, is that there’s only one John Carmack.
@seanys Some of the best engines ever made didn't come from Carmack 😉
Be the next one, or even better, the first you!
That's actually good. If we'll have two Carmacks, we'll have two shitty Megatexture implementations.
Great video. I was using Allegro until a few years ago, but ended up switching to Godot purely because it allowed me to prototype faster (and incidentally, it's the only engine with which I've actually gotten games to completion) - but I have to say, I really do miss working with C.
The games I make are so low-power that this hasn't ended up mattering too much, but Godot has recently added a very intuitive way to produce a build with all of the features your game doesn't use stripped out. Might be giving this a go.
While I like and agree with most of what was said in this video, I don't think the truck vs bike analogy was the best.
Game engines can be big, heavy, and slow, but that doesn't mean the output game has to be.
Compilation settings, when done correctly, should be able to remove many of the libraries and components from the game binary if they aren't required by the game. The amount that can be removed will still depend on the engine and the feature set in use, but I think it's important to illustrate that you CAN get good results with modern engines.
This has been a very long on going argument from both sides and it really is context specific. Do you boost up a simpler but lean engine or strip down a complex one for performance? Depending on the scenario - both options are viable.
There is a good talk about how the rendering tech in an Unreal Engine 4.27 (pre-Nanite) game was stripped down to the necessities and modified with some extra features for what the game needed. The game was not released, but the talk is very detailed and useful.
The video title is: A Taste of Chocolate: Adding a Rendering Fast Path without Breaking Unreal Engine
do you mean hyenas by creative assembly?
I feel like this video was made specifically for me. Super video!
I think older games looked better because they used a lot of prebaked shadows etc. Look at Battlefield 3 and 4... both games still look great, I think.
Those games look like games of their era. Battlefield 1 and Star Wars Battlefront look far better and use far more real-time rendering than those games, as well as other new features like PBR and wide use of photogrammetry for their materials.
Battlefields 3 & 4 couldn't use baked lighting, because they have destructible environments. Everything is dynamic.
thanks, intergalactic supreme overlord of megatextures, John Carmack.
Some developers rely heavily on TAA to smooth out some of the effects used, so they don't add the ability to turn off TAA, as doing so would expose the flaws (pixelated or grainy-looking shadows and effects).
@@SinisterOyster I actually don't mind TAA. I'm just not a fan of how many devs use it as a crutch for fixing issues that shouldn't have been there in the first place.
TAA is fine but I wish it was used differently
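For reference, the core of any temporal accumulation scheme like TAA is just an exponential moving average per pixel. This toy C version (a scalar instead of a full color buffer, with an assumed blend factor) shows why undersampled, dithered effects smooth out over a few frames, and also hints at why ghosting happens when stale history isn't rejected.

```c
/* history <- history + (current - history) * alpha
   Small alpha = heavier smoothing (and more ghosting/blur);
   alpha = 1.0 disables accumulation entirely. Real TAA also
   reprojects history along motion vectors and clamps it against
   the current frame's neighborhood to limit ghosting. */
float taa_accumulate(float history, float current, float alpha)
{
    return history + (current - history) * alpha;
}
```

With alpha around 0.1, a noisy effect converges to its stable value within a couple dozen frames, which is exactly the behavior devs lean on when they ship quarter-resolution effects and let TAA "fix" them.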
I think the biggest issue with modern engines is that they are basically tailor-made for bigger corporations that want to make impressive-looking games fast. The whole existence of TAA, upscaling tech, and Nanite is aimed at making games that look good in still frames, but not in fast-paced games or with a frametime graph enabled. It's really sad to see companies abandoning their own engines (especially Frostbite and RED Engine) to move to Unreal.
TAA is actually pretty old. Using data from previous frames to antialias new ones dates back to at least the 7th generation; I believe Crysis 2 had a form of TAA that was pretty bad by today's standards. Nvidia had another form called TXAA that appears in some early 8th-gen PC titles, and SMAA T2x was yet another form of temporal antialiasing. The point of it wasn't to make games look good in still frames; it was a way to smooth over certain things that MSAA and sometimes SMAA missed. In its early days it was just another AA option alongside MSAA, FXAA, SSAA, and SMAA.
@@crestofhonor2349 I mean the usage of TAA. Epic uses it everywhere to cover up shortcomings like in Lumen, shadows, foliage, hair etc.
I know less and less engines can keep up with Unreal but holy crap I didn't expect Frostbite to bite the dust.
Are you a dummy or something? EA is still using Frostbite to this day. What a dumbass argument hahaha
i think we've all felt the frustration of how frequently big-budget games perform poorly, and this was a truly refreshing video to watch. as well as eye-opening as to just how great a job id Tech has been doing and how influential it has been.
is it just me, or does old tech have some sort of charm to it?
It does. That's why we still get games that look like older games, sometimes even made with the old tech, like Hell Denizen, made in a modified version of the Quake 1 engine.
Holy shit, didn't expect to see Tilengine here! Such an awesome little framework :D
I used it as the base for my last game project, and it was surprisingly easy to bind to a different language (I used Odin instead of C), so I'm super happy to see it get some love :))
(Falconet is looking amazing too)
I'm hoping to find something similar for 3D graphics, with enough control for me to create the underlying systems how I want, but not getting stuck on the more complicated side of vertex buffers, pipelines and whatnot. Raylib seems a bit too simple, maybe sokol-gfx could have what I want?
Meanwhile I'll still be using Godot for 3D stuff...
I just discovered your channel and I need to say you are a true gem! I am an aspiring game developer and want to learn as much as I can about all areas of game development, and this video definitely opened my eyes to the world of engines and how interesting and important they are! Thank you very much; I hope your game becomes successful in the near future!
If there's just one thing that can be learned from games like Doom and Quake. It's that (aside from making a good game), front loading the technology aspect to make creation easier and allowing the players to make stuff leads to insane longevity. Doom and Quake will be played for a long long time, people will be making stuff for those games for a long long time. If you only had Doom and Quake and access to mods, you could easily play those games for the rest of your life and never run out of new "content" (I hate that term, but it's fitting here).
All mainline Quake games still have people playing online to this day, and that would def not be the case if the games weren't open source
People are still playing my two full episodes ("Darkhell" & "Blakhell") via the D1 engine. Great design is simple yet simultaneously robust. Poor design stacks layers upon layers upon layers upon itself to mask janky flaws and poor choices.
10x faster hardware -> 10x better visuals, is just not how it works. So many techniques used in games do not scale, because they have fundamental issues that can't be fixed with faster hardware but require an entirely new approach that is often much more expensive.
That doesn't change the fact that many games and engines are bloated, unoptimized piles of crap
@@sulfur2964 An engine can be bloated, doesn't mean the compiled game is bloated. Engines are still tools, there's a lot of room to work with most. Features can be disabled, code can be streamlined, packages can replace default functionalities if not ignored entirely. I have only used 2 engines at depth (GameMaker and Godot, which to be fair are kinda light engines for undemanding games) but I seriously doubt either had much effect on performance in a relevant way. At that, they allow much faster design as well as communities and support.
@ymi_yugy3133 Yes, I am aware that if you have a model of a bowling ball that has a million polygons, increasing it to 10 million won't make it look 10 times better. But keeping it to a million and improving reflections, adding higher-quality shadows, higher res textures, ambient occlusion, global illumination, etc etc... it can look more than just 2x better.
@@GTXDash All these techniques are good examples of why dialing up the settings doesn't work.
Shadows, ambient occlusion, reflections: we have more or less fast ways to get some pretty good approximations (shadow maps, HBAO, screen-space reflections), but they all suffer from visual flaws that you can't fix by just turning up dials.
The typical solution is ray tracing and that is just really expensive.
@ymi_yugy3133 Dialing up the detail is not what I'm saying makes good graphics. I really don't think expecting more than 2x or 3x visual and performance improvement over what an XBox 360 can do is too insane of an ask.
5 mins in about my fav game engine and Threat Interactive gets used as reference, this video is going to be so goated
Great vid, really enjoyed learning about it all. Thanks for uploading
UT2004 mentioned! Very well edited and made video dude.
Great video. I agree there is a huge gap for simple and fast engines. Another engine I would recommend is Love2D with g3D. It can run on a Raspberry Pi Zero, which has only 512MB of RAM. I was shocked when I saw a full FPS game powered by g3D run like butter on the Pi Zero.
I haven't played around with love2d for a few years, but g3d looks like a great new development :)
This is a very well-written and presented video! I also share the same thoughts you listed. I hope your game gets the same recognition and success both of you are investing in!
This video is so much better than I expected.
That's a very well produced video! There are a lot of videos on old-game and engine basics; we need more advanced stuff like yours. Subscribed!
The microcontroller in the joycons of the Switch is more powerful than the NeoGeo. The problematic performance is simply on the shoulders of the engine and the game development.
With 128 times more RAM and CPU power than 20 years ago (and probably the same with the GPU) we only get a bit nicer looking games with comparable enjoyment. While I am not a graphics whore, if a game requires twice the hardware of another one for the same performance, I expect a nicer looking screen in front of my eyes. I am fine with the graphics of Serious Sam SE, but I expect on my R9 5950X with 128GB of DDR4 3600 and RX 7800XT at least 32 simultaneous VMs to run the game just as well as my P4 2.4GHz ran it 20 years ago with a Radeon 9800XT (of 21 years ago) on 2GB of DDR1. As you can probably tell, no new game that looks like that runs like that today. You could argue that RAM speed has only improved about 10 times (comparing those two systems), it's not even 8 VMs that would actually run something looking like that.
I'm only a quarter of the way in, but I'm liking this so far.
One thing that always bothered me is how the 360 generation usually targeted 30fps. I prefer higher than 30, so I felt: "If performance was better, then they could finally increase framerates!"
And then it just... never happened. I always wonder where that power ran off to, especially in the era of the PS5 and XBSX
Funnily enough, there was enough GPU grunt to potentially get the frame rates up, but memory bandwidth hampered them significantly. Multiple render buffers simply dragged down the frame rate, and GPUs could be left twiddling their thumbs waiting for data to be shifted. RSX alone would probably have been better off with faster RAM and a lower clock rate, as bandwidth was the primary bottleneck. This wasn't such a big issue on the 360, since you had the EDRAM to allocate render targets, as long as you managed the geometry splits well. This is why you would regularly see it performing better.
EDIT: A big issue then, and even now, is that GPU utilization tools aren't terribly granular. You could take a top-of-the-line GPU today, max out the bandwidth, and it will show the GPU at 100% utilization even if the GPU is actually doing next to nothing. That made it very difficult to figure out where performance issues were coming from.
Id really took the Crysis 2 & 3 era to heart and went: "Okay then, FUCK YOU. Superb graphics AND Superb performance!" with the new Doom and then Eternal.
The Doom reboot looks worse than the Crysis series. The art direction failed; it's utterly inconsistent. And nuDoom has completely static environments: you can't interact with items like you can in the Crysis games. NuDoom also uses baked lighting, unlike Crysis with its Light Propagation Volumes, which let you change everything and still get good lighting.
And what's bitter is that Doom Eternal won't run butter-smooth on cards that run Crysis 3 fine.
One thing to keep in mind is that just because the end result of a modern game engine looks like a game from the 90's, doesn't mean it is doing anything close to similar to a game from the 90's, even if it's rendering at the same resolution.
Just as a surface-level example, the Genesis (Mega Drive) and SNES had purpose-built hardware for blitting sprites to the screen. Modern GPUs don't work this way; they work on geometry primitives and bitmaps. What was once a handful of assembly instructions handed off to bespoke hardware circuits is now reams of code for projecting geometry and a full pass through a fully programmable rendering and rasterisation pipeline.
As with everything else, we're moving towards generalisation at the expense of the performance benefits that specialisation brings.
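The blit-versus-pipeline contrast above can be made concrete. Here's a minimal Python sketch of the operation the old sprite hardware performed in dedicated silicon (function and variable names are illustrative, not any console's actual API):

```python
# Toy software blit: roughly what Genesis/SNES sprite hardware did in
# silicon -- copy a small pixel block into the framebuffer, skipping a
# designated transparent color. No projection, no rasterisation pass.
def blit(dest, dest_w, src, src_w, src_h, dx, dy, transparent=0):
    for sy in range(src_h):
        for sx in range(src_w):
            pix = src[sy * src_w + sx]
            if pix != transparent:
                dest[(dy + sy) * dest_w + (dx + sx)] = pix

framebuffer = [0] * (8 * 8)          # tiny 8x8 "screen"
sprite = [0, 1,
          1, 1]                      # 2x2 sprite, color 0 = transparent
blit(framebuffer, 8, sprite, 2, 2, dx=3, dy=3)
print(framebuffer[3 * 8 + 4])        # 1 -- sprite pixel landed here
```

On a modern GPU the same sprite would instead be a textured quad pushed through vertex transform and fragment shading, which is the generalisation-over-specialisation tradeoff described above.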
Not really what I was saying. I'm not asking that a retro style game runs 100x better on modern hardware than the original hardware it's mimicking, just that it doesn't run worse than the original. I REALLY don't think that is too much of an ask.
@GTXDash it's a fundamentally different process.
While retro style graphics really shouldn't tax modern hardware, it's also just not optimised for it. You'll never get the same efficiency as purpose built hardware out of software.
I have this sprite engine at extremely barebones stage and it simply writes pixels into swapchain from a compute shader. No rasterisation is involved.
@@Salabar_ fair approach. You're still running software to do what would have been hardware on older consoles, but it's going to be less overhead. Cool though.
I’m a simple man. You make a good video. You don’t ask me to like and subscribe. I like and subscribe.
Exactly! I hate it when people waste my time like that.
9:30 that van analogy was Socrates-brain brilliant
10:20 I've asked Deck 13 if they consider licensing out their PINA point&click engine,
considering it hasn't been in use for more than a decade.
They politely declined, but I am pretty damn sure, that the reason they said "no",
is because I really was just a student at that time.
They have given the same engine to another company specializing in Linux ports (RuneSoft),
so it's more of a "don't see the point in licensing" thing, I suspect.
Software not made for the public may be subject to licensing issues because of used libraries.
For example if the engine contains code for consoles, they can only give that to you as a registered developer for consoles, which some years ago was not possible for individuals, just companies.
May have been something simple like RAD Game Tools; everyone used Bink videos. And Bink still has no price tag on their website, which means "if you are a company, write us; if you are not, you can't afford it anyway."
I dont know dick about computers, but this was entertaining and simply stated enough for me to watch the whole way through and enjoy. Good job.
RUclips recommendation win. Great video!
hearing "python" and "natively" regarding the same thing makes me kind of uncomfortable 😅
Pretty cool video, thanks for all the detailed explanations.
I'll buy it when it comes out, u guys made an effort to have the game run natively on linux, which I use, so really appreciate that.
Raylib is also an amazing engine-like library you can use. Its brilliant in all aspects.
Surprised to see you had 2k subs; this is the quality I'd expect from a much bigger channel. Great video 👍
Love the video!
FYI, id Tech Engine versions 0-4 were all released free and open source under GPL licenses. Versions 5-7 (and after) are still proprietary, though. Wish id Software would license it out to third-party studios again, like they used to.
Sadly without Carmack at the company anymore I doubt any more versions of iD tech make it into the wild. He was the driver of open sourcing their prior iterations.
Engines became much more complicated and now use a lot more third party tools that may be under license and can not be open sourced. Also we do not know what the build pipeline looks like. Just pressing F5 to run the game is not standard on engines.
There are rumours of days long build times on some AAA games.
@@WizardofWestmarch Sad, but true.
I just wish Bethesda Game Studios would drop the Creation Engine and fork id Tech for their projects going forward. They actually had considered using it for _Skyrim_, but Carmack dissuaded them from using it, since MegaTextures was a major part of their pipeline and he didn't think, at that time, that it was suitable for the type of open worlds that they created for their games.
Interestingly, id released _RAGE_ the same year BGS released _Skyrim..._ using id Tech... with a semi-open world. Since then, however, they have brought in the Vulkan graphics API (they had been using OpenGL) and gotten rid of MegaTextures completely (with id Tech 7). And the world spaces in _DOOM Eternal_ are quite massive, even if they don't feel like it because the player is moving at such a fast speed. So, I'm confident that it's perfectly capable of an explorable open world. BGS would certainly have to create all kinds of new mechanics to customize the engine for their particular use case, but I believe that time and effort would be worth it. (And it might be possible to "strip [the Creation Engine] for parts" and incorporate them into their id Tech fork, considerably reducing the workload. But "dropping in" features like that would really depend on how modular id Tech's architecture is.) Still, it would be worth it, because BGS would finally get rid of their mountain of technological debt and have a solid foundation on which to move forward.
Caveat: All the above having been said, once they got their id Tech fork where they wanted it for their next game, BGS would be wise to set aside a team whose sole job, at least until the project is done, would be to create a new modding kit for the new engine, to be released Day 1 with the game.
Reminds me of when I ran Doom 2016 on my Surface Pro 6 tablet (an i5 CPU with integrated UHD Graphics 620). On Medium graphics with a slightly lower render scale it ran smooth as butter, even in busy scenes. Was impressed and it still looked great!
What kind of resolution did you use? Could you record with another device some scenes like Vega Processing Level? I remember having troubles with it.
Please, do it as I want to see the experience.
I'd like for the video creator to turn on auto-captioning for this insightful video.
28:00 - 29:00 bro. what is this. massively triggering my tinnitus
Graphics programmer here. Quoting "Threat Interactive" here was a big mistake. The guy speaks authoritatively, but that's it. Most of what he says is falsehoods resulting from a lack of knowledge and a laziness to find out.
I watched the video in question and was impressed by how bad it was. For reference, pretty much every single thing he says about Nanite is the complete opposite of what the technology actually is. Brian Karis of Epic Games, the mind behind Nanite, has given a number of technical talks on the subject.
I have implemented similar technology myself in the past, so I'd say I have some authority on the subject.
I think it would be great if there was some form of benchmark that can settle the argument once and for all. Nanite with LOD, LOD only with higher poly meshes, and Nanite only with no LOD. If someone can do a test like that, I would love to see the results.
@@GTXDash Generally speaking, Nanite will be faster when all other things are the same. When we do an LOD transition, we're actually drawing two models at different LOD levels to mask the "pop". Nanite doesn't have to do that, as its transitions are much more fine-grained, so "pops" don't happen. That said, Nanite will be slower than a custom LOD hierarchy in practice. Why? Because custom LODs will just have much lower triangle density, whereas Nanite will attempt to give you as dense a mesh as it can, but no denser than one triangle per pixel.
You can adjust that "error" factor to any number of pixels, though. The implementation I wrote used a 4-pixel area, as I saw no noticeable difference and it pushed far fewer triangles to the GPU.
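The screen-space-error idea described here can be sketched in a few lines. This is a hedged illustration, not Epic's actual Nanite code: it picks the coarsest LOD whose world-space geometric error, projected to pixels under a simple pinhole camera model, stays under a threshold. All names and numbers below are hypothetical.

```python
import math

def projected_error_px(geometric_error, distance, viewport_h_px, fov_y):
    """Project a mesh's world-space error into screen pixels (pinhole model)."""
    scale = viewport_h_px / (2.0 * distance * math.tan(fov_y / 2.0))
    return geometric_error * scale

def pick_lod(lods, distance, viewport_h_px=1080,
             fov_y=math.radians(60), threshold_px=1.0):
    """Choose the coarsest LOD whose on-screen error is <= threshold_px.
    `lods` is a list of (name, geometric_error), ordered fine -> coarse."""
    chosen = lods[0]  # fall back to the finest LOD
    for lod in lods:  # keep the coarsest one that still passes the test
        if projected_error_px(lod[1], distance, viewport_h_px, fov_y) <= threshold_px:
            chosen = lod
    return chosen

lods = [("lod0", 0.001), ("lod1", 0.01), ("lod2", 0.1), ("lod3", 1.0)]
print(pick_lod(lods, distance=5.0))    # near the camera: fine LOD wins
print(pick_lod(lods, distance=500.0))  # far away: a coarser LOD passes
```

Raising `threshold_px` from 1 to 4, as described in the comment above, lets an even coarser LOD pass the test at the same distance, pushing fewer triangles at the cost of a small (often invisible) error.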
I think the comparison in general is not very useful. Nobody places every vertex by hand anymore; we use tools. Sometime in the 90s we did place every triangle coordinate by hand. Today we don't, and there are plenty of bad topologies (triangle layouts) in production. Why do we not care about individual triangles today? To save time, and because hardware can manage an extra 100 or 1000 triangles without much issue. Nanite is basically that: instead of investing time and slowing your art pipeline with custom LODs and paying for expensive LOD tools such as Simplygon, you just tick a box saying "enable Nanite" and you get amazing results. There are tradeoffs, and there are configurable parameters, but it's not about runtime performance; it's about developer productivity in general.
Say you want to make a good-looking game as a solo dev. Do you know how to make good LODs? Do you have a spare $10,000 per year for a Simplygon license? If not: "go home kid, let the adults do this."
I have a feeling that the more experienced game devs have aged out or even burned out, and what we're left with are average developers who struggle with all the details and complexity.
this whole video really lays bare that optimization isn't just some long-winded party trick; it's still a necessity, and players will appreciate it whether or not they can articulate it, especially compared to a lot of the poorly optimized bloat-fests we get fed for upwards of $50.
You're pretty off the mark about the comparison between the Doom engine and the BUILD engine. Duke Nukem 3D was fully two years after DOOM and it has MANY features that are simply not present and not possible in the vanilla DOOM engine. It's true that later ports of the engine, particularly those that implement Hexen features like ACS scripting, are capable of doing the same things as BUILD, but those are VERY far removed from the vanilla DOOM game. Your example of "getting a sector to lower with many explosions" isn't something you can do in the vanilla DOOM engine without using the hard-coded Icon of Sin death effect, which has other effects, like ending the game.
Which brings me to my other complaint about that section of the video, which is that DOOM was by no means "split" into "engine" and "game". The DOOM executable file is completely full of assumptions about what game it is; all of the "rules" of the game are contained, hard-coded, in the binary. The fact that add-on files can modify the rules of the game is only possible in later ports of the game that support loading integrated DeHackEd files, or more advanced modding systems introduced in ports like ZDoom.
The concept of a "game engine" really wasn't discussed among gamers until the conversation became Duke3D vs. DOOM, and that was almost entirely because many documents surrounding Duke3D included a copyright message that said something along the lines of "The BUILD Engine is Copyright Ken Silverman (C) 1995". Nobody talked about Heretic or Hexen being "Doom engine" games until '96 or so; before that, they were just "based on Doom".
Ultimately I do think the thesis of your video is reasonably sound, but you have a lot of problems in your fact-checking.
The comparison wasn't between OG Doom, but the Doom engine itself. It doesn't really matter that on release Duke3D could do things like spinning doors, or lowering multiple sectors, when that was already possible in Hexen's usage of the Doom engine.
sure, you're right about the hardcoded features- but when discussing the ratio of performance to visual quality- it doesn't matter as much if something is hardcoded. the DOOM rendering engine vs the BUILD rendering engine is a very close comparison. what's not a close comparison- is the BUILD rendering engine, to the QUAKE rendering engine. the addition of mipmaps, the ability to stop rendering anything that isn't potentially in view, the interesting new uses for the 256 color palette, such as full-bright pixels within the textures, the 3D mesh view-models rendered separately from the game world- the lightmap calculations done while compiling the map- it was all fantastic stuff- these were all revolutionary techniques, and they did change or even define how real-time rendering progressed from that point on
@@railroadism Your post seems pretty orthogonal to the discussion prior.
@@azazelleblack i agree with both of you- i still agree with the video because it's still true. id software's engines do and have historically outperformed any competing games or game engines targeting the same level of visual quality. id software and panic button have put tons of work into making sure their games run as fast as they know how to make them run- and this is obviously not unique to id software or panic button, but they are objectively doing better in this regard than other studios with other engines. - that's what other developers can learn from id tech.
@azazelleblack so yeah, this is not a contest of which is better, Build or Doom engine. We are all aware of the features Build has over Doom. The thing I wanted to point out is that Build tends to get a lot of the credit for things that Doom can also do, but because the original map authors didn't think to add those effects in their maps, people just assume it's an engine limit.
Of course there are things that Build can do that id tech 1 never will, it's just not as extensive of a list as most people think it is.
The Doom EXE does also store additional code that one wouldn't nowadays call as part of the engine. But the thing is, back in the early 90s, game devs were still trying to define what an engine is. Is it just the renderer, or is it also the physics, controls and game mechanics?
I doubt it matters, but Halo was originally supposed to launch on PC before Microsoft bought Bungie and had them make it an Xbox exclusive. From a regular dumb gamer's perspective you'd think "porting" it back to PC would be easy then.
Wow, that was informative. Good work. So many games in this video i played as a kid.
33:26
that's why I kinda hope that Valve would release Source 2 to fill that gap
the engine itself is still inherently id Tech 2 architecture while being modern enough to work with
Funny thing is, RPG Maker seems to have a similar design philosophy of being mainly built for a single genre, but it gets a lot of flak for being too restrictive, even though if you're creative you can use it to make some crazy unique stuff, and it hits nearly every criterion on your list lol
RPG Maker games that bend it too far tend to run pretty poorly though --- as should be expected. It's really made for a specific type of game and hyper-focused on that.
Actually, old VGA cards did support hardware scrolling. What you couldn't have was hardware scrolling using the popular Mode 13h, because you wouldn't be able to access the extra video memory required to scroll around.
@BaddeJimme Yes, certain video cards had limited scrolling, but not in other display modes like CGA and MCGA. And Commander Keen still ran fast even on a non-EGA/VGA card, even with the scrolling handled in software.
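For context on how hardware scrolling works: the VGA CRTC scans video memory from a programmable start address, so scrolling is a couple of register writes rather than a full redraw. Here is a rough Python sketch of just the arithmetic involved; the 80-bytes-per-row figure is illustrative of a planar mode, and real code would also use the pel-panning register for sub-byte horizontal moves.

```python
# Sketch of the arithmetic behind VGA hardware scrolling: the card scans
# video memory starting at a programmable offset, so "scrolling" means
# changing that start address -- no pixels are copied.
BYTES_PER_ROW = 80  # e.g. 640 px wide / 8 px per byte in a planar mode

def start_address(scroll_x_px, scroll_y_px):
    """Compute the display start offset for a given scroll position.
    Sub-byte horizontal panning is ignored in this sketch."""
    return scroll_y_px * BYTES_PER_ROW + scroll_x_px // 8

def crtc_register_values(offset):
    """Split the offset into the high/low bytes that real VGA code would
    write to CRTC registers 0x0C and 0x0D."""
    return (offset >> 8) & 0xFF, offset & 0xFF

off = start_address(scroll_x_px=16, scroll_y_px=10)
print(off, crtc_register_values(off))
```

The actual hardware writes would go out through ports 0x3D4/0x3D5; the point of the sketch is that a full-screen scroll costs a handful of byte writes instead of redrawing every pixel.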
I've felt that devs have lost perspective for a long long time now, the "universal" engines are great, awesome tools and you can slap together a game really fast... But the longer these engines are used, the more and more talent is lost when creating a simple purpose built engine for whatever game you're trying to make.
Personally, I love trying to do as much as I possibly can from scratch and it shows in my engine, both in terms of how long I've been working on it and the quality of code (long and poor lol), but it's also just a hobby of mine, so that doesn't matter... The point is I learn from it and I wish more devs would do the same, learn how to do some of the old school stuff and work up from there... You can even use that knowledge to make Unity and Gadot run better on low spec hardware.
As a beginner developer, I love engines. But I do recognize that engines have a lot of features, and honestly bloat, that we don't need and that only bogs down our games. I have watched a few talks and interviews with developers of really old classic games, and the one that really stood out for me was the developer behind Crash Bandicoot. The things they had to do and optimize just to get Crash running on a PS1 are INSANE. I'm using GameMaker and know a bit of Unity, but I'm currently learning Love2D, which is a library and not so much an engine. I'm looking to optimize my games as much as I can this way, and to extend this approach to any future games I make.
28:36 I feel kinda gaslighted on my childhood, because I remember UT2004 and SWAT4 running really well on my potato PC, while Doom 3 remained borderline unplayable until we upgraded the rig.
I also doubt those claims. Lighting in those games was baked. They should have run better on low end hardware for that reason alone.
Doom 3 had dynamic lighting, stencil shadows and normal/specular/diffuse maps interacting with each other. It was a groundbreaking game engine. Nothing else looked like it when it was released.
@JennyTheNerdBat it's complicated. If you use an older GFX card like a geforce4 ti4600 or even a geforceFX 5800 Ultra, Unreal 2004 will run faster, but the opposite is true if you use hardware that id tech 4 takes advantage of like a 6600 even though a 6600 (non GT model) is often slower than a 5800 ultra in raw performance. If you pause at the benchmarks, you'll see "6600GT, high settings, 1024x768". You can't really lower Doom 3 graphics a whole lot. But in Unreal 2004, you can turn the visuals down so low that it's enough to hurt someone's eyes.
Like I said in the video. It's not an apples to apples comparison. But a lot of people see Doom 3 as the Crysis of its day, which it really wasn't.
And unlike Doom 3, they had really good gameplay.
Brilliant video! I've been thinking of something similar for a while and I love how you tied it to how you picked the tilengine for your game.
Making a game engine is tough, it's one step away from making a physics engine and even big game studios usually don't bother with making a physics engine themselves.
Loved the video.
I want to point out, though, that Riddick: Escape From Butcher Bay was released on Xbox before Doom3 (and on Windows slightly after Doom3). It had all the same features Doom3 had, but also some that Doom3 didn't (cubemap environmental lighting). My favorite is the direction-based ambient sound: in the beginning you hear the wind blowing outside, and depending which way you rotate the camera, the way the wind sounds changes. If I recall from the dev documentary, they recorded some of the ambient soundscapes using multiple directional microphones and blended them at runtime based on orientation. This feature is still very uncommon in games and engines today.
So it was not just Doom3 that made all these cool features, Riddick was much less known game, but it was a technical beast.
It was not just a technical marvel; it was and still is a really good game, whereas Doom3 was a disappointment to many, me included.
I recommend to look into Riddick: Escape From Butcher Bay, it might blow your mind how technologically advanced it was just around the same time as Doom3.
I'm already aware of Riddick. A lot of games used many of the techniques that defined Doom 3, but Doom 3 combined all these techniques including full dynamic shadow volumes at the global scale. Not to say that Reddick didn't look great. It may not be as dynamic as id tech 4 with its simpler environments and shadows, but still impressive.
@@GTXDash Thank you for answering. But Riddick used the exact same shadow technique that Doom3 did (full stencil volumes extended from polygon edges and shaded per-pixel, including objects being able to self-shadow). Both Doom3 and Riddick used per-pixel shaded normal mapping with full dynamic lighting. Riddick had the extra capacity of pre-calculated environmental cubemap lighting and reflections that Doom3 did not support during launch.
@ristopaasivirta9770 that may be true. But the only way I personally can know that for sure is if I can experiment with the engine... unless... Does the PC version allow for custom maps? If so, I would love to look into it to see what it is truly capable of.
@@GTXDash Yeah, it never was very moddable or editable. There used to be some tools back in the day, but they're probably really hard to come by nowadays. So it could be really hard to verify yourself, and I can only give anecdotal statements, having tinkered with game engines my whole life.
Anyway, I just wanted to give a shout-out to the game, to provide some counterweight when people say that "Doom3 was ahead of its time." There were others that did the same things, and in my opinion Riddick did them better. Doom3 just happened to be the most popular and visible one.
riddick looked better than doom3 in my mind. maybe the time gap playing tricks though.
This is the stuff I've been saying to my friends in the gamedev space for quite some time. I do create my own engines for the games I make, and that's not a brag, because the truth is, it's not as hard as people think to make an engine for a single game.
And I'm so glad that I've gone this route. Every time I go try to use a general purpose engine for something, I'm disappointed.
always remember:
the N64's issue was Rambus RAM.
@@lexibigcheese Definitely. The CPU absolutely wasn't the problem.
@@GTXDash i've seen kaze push loads of vertices through it.
@@lexibigcheese That's witchcraft right there :D
As a complete hobby dev, I know from personal experience that GMS has two compile modes, VM and YYC. VM is as it sounds: the game is compiled to bytecode that a virtual machine runs, which is often a much slower way of running a game. YYC, on the other hand, compiles to essentially a C++-style executable with faster performance.
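The VM-versus-compiled gap comes down to dispatch overhead: an interpreter fetches and decodes every operation before doing any work, while compiled code just does the work. A toy sketch of the difference (nothing like GameMaker's actual bytecode; purely illustrative):

```python
# Toy illustration of interpreter overhead: the same computation, once as
# a tiny bytecode program run by a dispatch loop, once as straight-line
# "compiled" code.
def run_vm(program, x):
    """Dispatch loop: each op pays a fetch/decode cost before its work."""
    acc = x
    for op, arg in program:      # fetch
        if op == "ADD":          # decode
            acc += arg           # execute
        elif op == "MUL":
            acc *= arg
    return acc

program = [("MUL", 3), ("ADD", 4)]  # computes x * 3 + 4

def compiled(x):
    # What a YYC-style compile boils that same program down to:
    return x * 3 + 4

print(run_vm(program, 5), compiled(5))  # both print 19
```

Both produce identical results; the VM just burns cycles on the loop, the tuple unpacking, and the string comparisons for every single operation, which is exactly the overhead native compilation removes.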
Falconet looks right up my alley --- wishlisted!
You did Duke 3D and Build a bit dirty, though.
Based Doom engine using it for my game
"The new doom engine was much more futureproofed." Me, loading up e1m1 for the hundredth time with a laundry list of mods. even so far into the future, the engine is so versatile that it holds a dedicated development and modding community to this day.
I love the deep dive on the engines here. I definitely agree that specialized engines would do a lot to make games more optimized, but I think the economic reality of AAA will push towards a few "general use" engines. For example, CD Projekt RED and Bethesda are both switching away from their custom engines to UE5, largely because it's easier to hire for.
I think we need more competition besides Unreal Engine 5, ngl; Unity kinda shot themselves in the foot. More choices of powerful 3D engines that can expose Unreal Engine's weaknesses would be really nice for consumers.
It would mean Unreal Engine is forced to optimize when there's a well-optimized competing engine that people would actually use for their projects.
Just wanted to say: give credit to the fact that there is now an official update to SNES Doom coming out soon that will run at around mid-20s fps! That's really impressive.
Very interesting, did not expect from a random 20k views video.
Modern Game engines are suffering from the same thing that RISC-V is trying to solve on processors.
Backwards compatibility, legacy code and a fork JUST in case someone wants to connect a CRT or serial Joystick.
But for game engines, you have 3D features included even though the game is only using 2D assets, BUT in a 3D plane to fake parallax.
Like making a Mass Effect game in a First Person Shooter engine (Andromeda).
To be fair, Andromeda had the best gameplay of all the Mass Effect games; not that it was a high bar. I love ME, but it's carried a lot by its setting and story, and that was the problem with Andromeda: the writing was so bad, and the setting kinda got changed with the change of galaxy. But the movement was great, and the gunplay was better than in the other games, at least until you max your damage skills while your character keeps leveling and the enemies scale with it, turning them into sponges. Awful balancing, but easily fixable with mods. Also, not letting you give commands to your squad? Come on.
@@ERECTED_MONUMENT Pretty sure they still believed in "Bioware magic" at that point.
And they were being forced to use the EA FPS engine to suddenly make a 3rd Person RPG.
They had to write a lot of code from scratch, WHICH did benefit future EA projects immensely, BUT is also why EA just keeps axing their studios.
@@hunn20004 Yeah, the "Bioware magic" didn't truly fall apart until Anthem, but didn't Dragon Age Inquisition come before Andromeda? It also used Frostbite, so they must've had some tools from that to use in Andromeda, unless there was zero communication between the teams.
@@ERECTED_MONUMENT True, Here's what Leo AI found on it:
The Frostbite engine was criticized for its limitations in handling the RPG genre, particularly in Mass Effect: Andromeda. In contrast, Unreal Engine 3 was praised for its flexibility and suitability for RPGs. The use of Frostbite 3 led to some technical issues and visual inconsistencies in Andromeda, which were noticeable compared to the original trilogy.
So I've mentally compressed the issues down to "Frostbite was bad for BioWare," and a lot of the intricacies got lost over the last 7 years.
@@hunn20004 Yeah, at minimum Bioware should've spent some time retraining to get the studio ready for the engine change. Maybe they should've worked as a support team for one of EA's other titles.
That being said, at some point they needed to change from UE3 anyways, UE4 only released in 2014, and wouldn't have been a smooth transition either since so much changed from UE3 to 4.
Thanks for sharing this video. I love everything Id does.
Btw, I wonder why you mentioned Godot so frequently, considering how abysmal its performance is.
I just use Unity, Unreal and Godot as strawmen for "other competing engines" :P
I'm actually fairly familiar with Godot since we used it once for a game jam a few years ago, and I thought Godot is... fine. It gets the job done.
Fair enough 😅
I suppose for 2d that is accurate.
I am considering building my next project from scratch, using raylib and c++
More as a learning experience than anything else.
20:16 If I recall correctly, that limitation is more a map editor limitation than a rendering limitation.
I had to double check if Build really couldn't legitimately do rooms over rooms. But if I'm still wrong then it seems that there's a fan wiki or two that needs correcting. 😅
@@GTXDash the physics engine has ceiling and floor properties, instead of arbitrary collision planes, hence the limitation
The newest example of this is Penny's Big Breakaway: an engine tailored for this specific type of game, built from the ground up, that doesn't use any more graphics features than necessary. It runs amazingly well.
32:15 - you mention that idtech used OpenGL rather than DirectX in the early 2000s, but Unreal Engine 1 and 2 both featured OpenGL renderers at launch, no? (They also both featured a software renderer).
no. they used glide
Unreal 1 ran best and was most optimized for Glide, but it also supported D3D, OpenGL, S3 MeTaL, etc.; it was very API-rich and thus very compatible. Unreal 2 used D3D, but you could switch to OpenGL; both were supported.
@unfa00 technically, yes. But have you actually used OpenGL in Unreal? It's a buggy mess, such an afterthought. Not surprising, because Tim Sweeney openly hates OpenGL. Unreal 1 was built specifically for Glide and DirectX; Unreal 2 and 3 were DirectX-only, but after some modifications could do OpenGL. That was a lot of work, which is one of many reasons people dreaded porting Unreal 3 games to the PS3.
@@GTXDash In Unreal 1, OpenGL became a must. Initially it ran best under Glide; remember, the very first versions of Unreal were Glide-only, and D3D was always buggy though stable. The common quirks on D3D were the added contrast/saturation when running in 16-bit mode, and the weird detail texturing, which was off by default, but when you enabled it in the ini the textures were far denser than they should have been. Later, when the S3TC high-res texture pack arrived, it worked only under S3 MeTaL and OpenGL. That is, of course, me speaking about the improved OpenGL renderer that became available later, as the original one was quite meh. But eventually it was a must-have. UT2004 had OpenGL as well and worked well except for one issue: it only supported blob shadows. But I still used it, as my 6800LE back then ran out of VRAM with everything maxed at 1600x1200; it did so in D3D but not in OGL, so I played with OGL. With Unreal 3, however, OpenGL was dead, and porting that to PS3 must have been a pain.
@@Lady_Zenith I'm probably thinking of UT99 when they worked out the kinks with DirectX.
23:10 The PlayStation 1 only worked with integer-based coordinates, so it could calculate less and be cheaper, but those vertices had to jump around. That's why the original PlayStation looks like it wobbles and jiggles a lot. By contrast, the N64 did everything much more granularly and thus had to do far more calculations for the same "graphical fidelity".
so the tradeoffs were built into the console for the playstation and built into the games for the n64
the n64 was a better play experience when it did work, but the playstation was a better experience when you didn't care about all of the smaller details, or the massive amount of jiggling
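The snapping effect described above can be sketched in a few lines of Python. This is purely illustrative (a toy projection, not real PS1 or N64 code): snapping projected vertices to integer pixels makes a smoothly moving vertex jump a whole pixel at a time, while sub-pixel precision lets it glide.

```python
# Toy sketch of PS1-style integer vertex snapping vs N64-style
# sub-pixel precision. The projection math here is a made-up
# example, not either console's actual pipeline.

def project(x: float, z: float, screen_half: int = 160, focal: float = 256.0) -> float:
    """Toy perspective projection onto a 320-pixel-wide screen."""
    return screen_half + focal * x / z

def ps1_style(x: float, z: float) -> int:
    # Snap to whole pixels, as the PS1 rasterizer required.
    return round(project(x, z))

def n64_style(x: float, z: float) -> float:
    # Keep sub-pixel precision, as the N64 hardware supported.
    return project(x, z)

# A vertex sliding slowly sideways: the PS1-style value sits still,
# then jumps a full pixel, while the N64-style value moves smoothly.
for step in range(5):
    x = 0.001 * step
    print(ps1_style(x, 2.0), round(n64_style(x, 2.0), 3))
```

The "wobble" people remember is exactly this: every vertex in the scene doing that full-pixel jump independently, every frame.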
Yeah. Ken Kutaragi's priority was that he wanted the console to generate as many realtime UV-mapped polygons as fast as possible, and this was a pretty clever way to get there: ditch the Z-buffer and have the verts snap only to integer values. You didn't even notice it as much at the time on 480i interlaced CRT TVs, so it was a good compromise, and seeing the demos of Ridge Racer running on a loop at an Electronics Boutique store convinced me to buy a PS1. I was playing PC games at the time, but even then, I hadn't seen 3D polygons run that smoothly before.
@johnm9263 yeah, I wasn't implying that the PS1 didn't make sacrifices to get there or that it was in anyway objectively better than the N64.
@@rodneyabrett It was definitely very noticeable back in the day on a CRT. Maybe not _as_ apparent, but most PSX games were very wriggly and janky-looking even back then.
@@todesziege Yeah, totally.. but upscaled and it gets way more egregious. This is also why I don't like to upscale emulation when it comes to PS1
@@rodneyabrett Oh, yes, upscaled old 3D games look terrible and break all art direction in general, but it's even worse with PSX games due to all the glitchiness.
I can't get my head around why people insist on doing that, much less why they insist on it being an improvement.
Games without graphics acceleration must have been so programming-intensive. I can't imagine being in the gamedev industry in the 90s.
It was intensive but the scope was also more limited. Work to what you have.
It was intensive, yes. But it was also far easier to understand and reason about. The rendering pipelines nowadays are so complex that you can't really keep a complete mental model of your renderer in your head. I can't, at least.
Check out "Second Reality" by Future Crew. That production came out before Doom, and it's a showcase of what was possible in those days. IMHO the best thing ever coded on PC.
There was graphics acceleration since the 80s in consoles, arcades or the Amiga. But specifically for 2D rendering (sprites, blitter). It was using specific hardware, but was not the GPUs we know today.
@@phojamantirasoontrakul That is very true. The background/sprite based systems on consoles were designed to minimise the amount of work the CPU had to do. By focusing purely on deltas, they could punch way above what they looked like on paper.
You had the bizarre situation where PCs had CPUs way beyond what the consoles had, but struggled to really put that power on screen. Consider that the original Mac and the Genesis/Mega Drive have the same CPU but very different results.
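The "deltas" idea mentioned above can be sketched in plain Python. This is a toy framebuffer of my own invention, not any real console's API; the point is just that repainting only the cells a sprite left and entered touches vastly fewer pixels than a full-screen repaint, which is roughly the advantage sprite hardware gave consoles for free.

```python
# Toy "delta" renderer: repaint only what changed between frames,
# instead of the whole framebuffer (what a PC CPU of the era had
# to do when software-blitting a full screen).

WIDTH, HEIGHT = 16, 8

def blank():
    """A fresh framebuffer of background cells."""
    return [["." for _ in range(WIDTH)] for _ in range(HEIGHT)]

def move_sprite(fb, old, new, glyph="@"):
    """Move a 1-cell sprite; return how many cells were touched."""
    ox, oy = old
    nx, ny = new
    fb[oy][ox] = "."      # erase only the old position
    fb[ny][nx] = glyph    # draw only the new position
    return 2              # two cells touched, not WIDTH * HEIGHT

fb = blank()
touched = move_sprite(fb, (0, 0), (1, 0))
print(touched, "cells updated vs", WIDTH * HEIGHT, "for a full repaint")
```

Real sprite chips went further (the compositing happened per-scanline in hardware, so the CPU touched no framebuffer at all), but the work-avoidance principle is the same.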
awesome video, but while I find the whispering and some music to be very fitting for the topic, it's really grating on the ears
nicely written video, I think it's one of the best overviews I've seen of all the different iterations of id Tech and their strengths and innovative features.
37:36 surely you can run the game in a VM to check if it runs e.g. in a slow single core with limited RAM for fun?
@@technoguyx we could try that.
I think a better version of the van and motorbike example: the van needs to be big and featured enough for anything it could possibly need to carry, but building a brand new motorcycle that is slimmer and almost perfectly efficient for delivering your single specific package is generally too expensive and time-consuming.
Just like how shipping companies don't need to design the vehicles they use, a publisher should be able to purchase a license to a specific engine they need for that specific game, without always going through the process of building their own engine.
Based on your criteria, what's the best engines for 3D games?
Writing your own engine (like Dead Cells did) can be an advantage, but you need strong programmers to do that in A/AA production. Unity or Unreal are great for most games, and most games built in those skip optimization; there's a lot of work that could be done in the optimization department to make those games run better. Only AAA studios can purchase a license for something like idTech, or maintain their own tech, but that's expensive and hard. Even a big company like CDPR ditched their REDengine for Unreal after the Cyberpunk fiasco.
not unity
Custom engines
@roklaca3138 problem is it takes a lot of time, effort, and resources to create and maintain an engine. So much that some even say that once you start making your own engine, you're not a game dev anymore. you're an engine dev.
Source
If you've watched any video from Threat Interactive, DF's tech analysis is anything but in-depth. It's the most plebeian tech analysis you can find after "it's pretty", and the most "catering to the establishment" possible as well.
DF mostly do framerate and image quality analysis, real technical content is very rare in their analysis videos. That's quite a shame.
I don't think it's all or nothing. I try to look at problems from multiple perspectives. Threat Interactive have been very critical of DF's opinions of TAA and such. DF is very useful for data collecting, not necessarily their opinions on the direction that the industry needs to go.
Really liked the part comparing old games to new ones, especially the example about the retro-style game for modern consoles that was just built differently for them.
Love your videos ❤
I hope your game sells amazingly 💓
Excellent video! Very informative and very well explained, keep it up!
Great video. I’ve always been super interested in how games like half life 2 and other source engine games get their look. I’ve seen lots of super high quality animations with incredible art direction come out of G mod and it makes me wonder how source works
im a baby game dev... but this is literally why i use bevy over any big game engine. I take the pieces i need and have low level control. Does it suck to roll out all your tooling, sure... but ffs its nice to be modular and slim
Fun video! Would be great if subtitles/captions were added.
Game Maker is quite optimized for drawing sprites. The last couple of projects I've worked on run at a "real" FPS (though capped at 30) of over 1000 when compiled, and include a small bit of 3D as well. Granted, I was working on them on my gaming PC, but they run fine in the hundreds of FPS on cheaper, older computers without dedicated graphics. Game Maker has been around since '99, running on 90s / early 2000s hardware just fine (I remember playing Game Maker games as a kid, as well as the demos that came with it). Unless you're drawing many hundreds of dynamic objects, the FPS for a pixel art side scroller shouldn't dip unless there is something seriously wrong.
Maybe "non-code" game maker is slower, no idea about that, but I seriously doubt a single professional game has shipped using much "code-free" game maker slop.
what would cause that game to have such fps drops on the Switch then? just poorly written / managed code?
@@lloyd011721 Honestly no idea; the game doesn't appear to have shaders or lighting. It is possible to draw things offscreen without restriction, like drawing the entire level regardless of the viewport and not disabling / deleting things outside it. That would be kind of an insane oversight, and the dips seem to appear only in combat. Maybe porting is terrible in GM; I've never heard that on the forums, but if it was terrible for one console it'd be the Switch lol.
Edit: also, I just remembered some people do pixel-perfect scaling by literally tripling or quadrupling the size of images instead of actually scaling. I have used actual scaling. That's another possibility, but it couldn't explain it completely.
Lion Entertainment did a great port of Doom to the Macintosh. That was the only time I dabbled with a map editor. Around that time, Marathon on the Mac was an exceptional game. I never understood why, at the time, there always seemed to be more audio channels on Mac ports.
Enemy Territory: Quake Wars lasted for like a minute in gaming. I still remember its claim to fame was having one of the largest multi-texture samples pre-cached for rendering.
Was always impressed with the Guild Wars 1 engine for an MMO. Tabula Rasa, Tera, and Defiance were other games where the engine felt like it ran well too.
Guild Wars looked great for its time. I preferred it over WOW.