What Game Developers can Learn from id Tech

  • Published: 13 Oct 2024
  • The release date for Falconet has been pushed to November 2024.
    Wishlist on Steam ⭐ bit.ly/3bWPN2E
    Link to the GOG page will be added as soon as it goes live.
    Game music used in the video:
    • Commander Keen 4 music...
    www.moddb.com/...
    • Doom II OST (SC55) - I...
    • Doom OST - Victory Music
    • Quake one - Slipgate C...
    • 6_idkill.Vega.Cih
    Original music used in the video:
    • Calm Cosmos | Free Mus...
    • Science background mus...
    • Vibe Mountain - Operat...
    • William Rosati - Float...
    • Lupus Nocte - Arcadewa...
    • Copyright Royalty-Free...
    • NO COPYRIGHT MUSIC Amb...

Comments • 384

  • @guyg.8529
    @guyg.8529 2 дня назад +127

    Modern tech is in dire need of simpler solutions, not over-engineered crap. It's Wirth's law at its finest: better hardware just leads to worse software.

    • @barrupa
      @barrupa 2 дня назад +20

      And software quality will keep deteriorating even more once AI solutions become ever more prevalent. Take a look at how every game developer uses DLSS, XeSS, FSR, and so on as a crutch for bad optimization and terrible image clarity and anti-aliasing.

    • @guyg.8529
      @guyg.8529 День назад +16

      @@barrupa Good opportunity to tell you about Threat Interactive, a YouTube channel about graphics optimization. They made a video about how TAA is used for the worst in modern games; it's the first video on their channel.

    • @HickoryDickory86
      @HickoryDickory86 День назад +7

      My sentiments exactly. What they need to do is return to old, time-tested, tried and true game design and optimization techniques. Old techniques combined with the horsepower of new technology and hardware are a win-win combination.
      And I'm glad you mentioned Threat Interactive, because I love his videos! He's singing my song!
      I have gotten into the habit of calling AI upscaling, frame generation, etc. "software duct tape," because that's how devs are using them, hoping they can slap them over their lazy, shoddy work and call it a day.

    • @barrupa
      @barrupa День назад +4

      @@guyg.8529 I had already seen some of his videos. I'm glad I'm not the only one who hates the overuse of TAA as a crutch for low-resolution per-pixel shaders and their artifacts. It's no wonder old forward-rendered games look far clearer and sharper. Everything nowadays looks like I forgot to wear my glasses.

    • @roklaca3138
      @roklaca3138 День назад +2

      @@guyg.8529 Hardware manufacturers are in on this; if engines were optimised, you would not need an i9 and a 4080 to run decent settings. Take a look at some indie games boasting great graphics that you can run at decent settings even on a 1060... meanwhile nshitia and AMDumb push $1000+ hardware down your throats for a shitty UE5 stutterfest.

  • @R.B.
    @R.B. День назад +20

    Rage is greatly underappreciated. The way they utilized large textures broke up the static, flat look of most engines at the time and made it look... real. The game itself was critically panned, but it offered something unlike what other games were bringing.

    • @crestofhonor2349
      @crestofhonor2349 13 часов назад

      Mega Textures were dropped after Doom 2016

    • @R.B.
      @R.B. 12 часов назад

      @@crestofhonor2349 That's true, but it was really the defining technology Rage brought. It helped sell the immersion really well, but, perhaps in a way relevant to this video, that feature alone didn't contribute to gameplay mechanics that would let the game stand out. You spent a lot of time interacting with other characters in hubs, and folks wanted a Doom 2016. I think Rage was trying to be a blend of Fallout and Borderlands. No one was accusing id of having too much story and dialogue from Wolfenstein through Quake, and those additional story elements were partly to blame for the public reception. I liked it, but I know that's not always a shared view.

  • @seanys
    @seanys 2 часа назад +6

    The problem here is that there's only one John Carmack.

  • @giedmich
    @giedmich День назад +9

    Man, Quake 3 and id Tech 3 truly defined a generation. So many great games from childhood ran on that engine: Call of Duty, MOHAA, American McGee's Alice, Star Trek: Elite Force, Return to Castle Wolfenstein.

  • @StrikerTheHedgefox
    @StrikerTheHedgefox День назад +10

    As someone who has worked with the Build engine for 20 years at this point: another reason why Build is a bit slower is that everything is dynamic; there's no static BSP tree. This can be considered an engine feature, as all parts of a level can change. For example, you can add and remove sectors from a level in realtime if you know what you're doing, something id Tech 1 has no hope of doing. There are also things like planar sprites (floor/ceiling and wall-aligned sprites, which can be used as platforms, decoration, etc. For Doom, the only ports that can do this are ZDoom-based.)
    Another thing to note about Build is that every two-sided line is a portal. This allows crazy, overlapping, non-Euclidean geometry without the need for teleportation. The only condition is that you can't render two overlapping pieces at the same time, which requires chopping overlapping sectors into smaller bits to prevent rendering issues. Duke3D uses teleportation in a lot of instances since it was easier and less intensive. The level "Tier Drops" is an example of this feature taken to the extreme: all of the different biomes in that level occupy the same physical space in the map.
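
    A minimal sketch of the portal idea described above, using hypothetical structures rather than Build's actual code: rendering floods outward from the camera's sector through two-sided walls, so no precomputed BSP tree is needed and sectors can be reshaped at runtime.

    ```c
    /* Hypothetical portal-flood traversal sketch -- not Ken Silverman's code.
       Real Build clips each portal against the on-screen span of its wall;
       a simple "visited" flag is used here just to keep the sketch short. */
    #include <stdbool.h>

    #define MAX_SECTORS 1024

    typedef struct Wall   { int next_sector; } Wall;   /* -1 = solid, else portal */
    typedef struct Sector { Wall *walls; int wall_count; } Sector;

    static Sector sectors[MAX_SECTORS];
    static bool   visited[MAX_SECTORS];

    static void draw_sector(int id) { (void)id; /* draw floor/ceiling/walls here */ }

    /* Render everything reachable from the camera's sector. Because adjacency
       is just "which sector lies behind this wall", sectors can be added,
       removed, or reshaped on the fly -- no static BSP tree required. */
    static void render_from(int sector_id)
    {
        if (sector_id < 0 || visited[sector_id])
            return;
        visited[sector_id] = true;
        draw_sector(sector_id);

        for (int i = 0; i < sectors[sector_id].wall_count; ++i) {
            int behind = sectors[sector_id].walls[i].next_sector;
            if (behind >= 0)            /* two-sided wall: follow the portal */
                render_from(behind);
        }
    }
    ```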

    • @GTXDash
      @GTXDash  День назад +1

      That explains it. I remember watching a (I think it was) GamesDoneQuick TAS of Duke3D that was able to break the sectors in a hilarious way, and what you explained has now made me realize what was actually happening in that TAS.
      I try not to pit Doom against Build, since the Doom engine definitely became more demanding after Raven added those extra features for Hexen, and it still used static BSP maps... I think. I just like how similar and yet different these two engines are from each other, BSP vs portals, etc.
      Even though I prefer speed over feature set, during my research I realized that Build isn't as demanding as I initially thought. A few weeks ago I finally got my 486 running properly, and when I ran Duke3D I was surprised how much faster it ran than expected. I say this because when LGR first built his ultimate woodgrain 486 DX2 PC and tried to run Duke3D, he got terribly low framerates. My 33 MHz DX ran it slightly better than that. I think there's something wrong with his woodgrain PC, since it couldn't even run Quake after he installed a Pentium Overdrive. Go figure. I had to rewrite that section of my script after testing it on real hardware.
      I really appreciate the comment.

    • @StrikerTheHedgefox
      @StrikerTheHedgefox День назад

      @@GTXDash Yeah, Build's speed is very dependent on the video adapter used rather than the CPU itself. It really prefers hardware capable of VESA LFB (Linear Framebuffer) mode, which is part of VBE 2.0 (VESA BIOS Extensions). With hardware that doesn't have it natively, SciTech Display Doctor (aka. UniVBE) can help a lot with improving framerates, but it won't be as good as the real thing obviously.
      Using a PCI or AGP video card rather than ISA will also go a long way to improving performance, if the system has the slots available.
      And yes, Hexen still uses the BSP tree. The swinging doors and whatnot are a trick called "PolyObjects". They are extremely limited in how they can move. I'm also pretty sure they need to be convex in shape or they'll have sorting issues.
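
      For anyone curious what the LFB check looks like in practice, here is a rough real-mode DOS sketch (Borland-style int86x/segread and a large memory model are assumed; this is not Build's or UniVBE's actual code): VBE function 4F01h returns a mode info block, and bit 7 of ModeAttributes says whether a linear framebuffer is available.

      ```c
      /* Rough sketch: ask the VESA BIOS whether a mode offers a linear framebuffer.
         Assumes a 16-bit real-mode DOS compiler (Borland/Turbo C style) and a
         large memory model so FP_SEG/FP_OFF give the buffer's real segment:offset. */
      #include <dos.h>
      #include <string.h>

      struct VbeModeInfo {                  /* only the fields we care about */
          unsigned int  ModeAttributes;     /* bit 7 set -> LFB available */
          unsigned char pad[0x26];
          unsigned long PhysBasePtr;        /* physical LFB address (offset 0x28) */
          unsigned char pad2[212];
      };

      int mode_has_lfb(unsigned int mode)
      {
          struct VbeModeInfo info;
          union REGS r;
          struct SREGS s;

          memset(&info, 0, sizeof info);
          segread(&s);
          r.x.ax = 0x4F01;                  /* VBE: return mode information */
          r.x.cx = mode;
          s.es   = FP_SEG(&info);
          r.x.di = FP_OFF(&info);
          int86x(0x10, &r, &r, &s);

          if (r.x.ax != 0x004F)             /* VBE call failed or unsupported */
              return 0;
          return (info.ModeAttributes & 0x80) != 0;
      }

      /* To actually use it, the mode is set with bit 14 of BX (mode | 0x4000) via
         AX=0x4F02, and PhysBasePtr is mapped into the address space -- which is
         exactly the service UniVBE/SciTech added on cards whose BIOS lacked it. */
      ```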

    • @GTXDash
      @GTXDash  День назад

      @StrikerTheHedgefox yeah, I remember opening a hexen map in Doom Builder and noticing a bunch of duplicate swinging doors just outside of bounds. It took me a minute or two to realize what I was looking at.

  • @DaBigBoo_
    @DaBigBoo_ 22 часа назад +14

    the way forward is back
    simplified, custom engines

    • @leezhieng
      @leezhieng 20 часов назад

      I agree; it's actually a lot easier to write a custom engine these days.

  • @WhiskeyTF-iu2bj
    @WhiskeyTF-iu2bj 18 часов назад +6

    "The new doom engine was much more futureproofed." Me, loading up e1m1 for the hundredth time with a laundry list of mods. even so far into the future, the engine is so versatile that it holds a dedicated development and modding community to this day.

  • @gstreetgames2530
    @gstreetgames2530 День назад +9

    Good video breakdown of this important game engine history. It's a shame that id Tech is no longer headed up by Carmack, no longer open-sourcing its old engines, and no longer competing with Unreal as an engine that even independents can use. Your game looks great; I will be buying on GOG once it's out. Keep up the good work!

    • @GTXDash
      @GTXDash  22 часа назад +2

      Game devs need to release more stuff on GOG. I love that site.

    • @gstreetgames2530
      @gstreetgames2530 20 часов назад +2

      @@GTXDash Agreed, and that's why I'm going to get it there and not steam.

  • @OdysseyHome-Gaming
    @OdysseyHome-Gaming День назад +11

    I half remember a quote from Marcus Beer on Epic War Axe when Rage came out.
    "Id have some of the best technical artists and programmers in the industry; but they desperately need to hire some fucking good game designers"
    Which is funny, because id has a long history of mistreating its designers and id Tech clients like Splash Damage, particularly when John was CEO.
    I think the best games get the balance right between artists, designers, and programmers. It's just a shame the industry splits these disciplines up, leading to imbalances in power.

    • @realhet
      @realhet День назад +4

      And now they wanna replace these essential people with AI and sensitivity reading narrative designers. While it's hilarious to watch, I can still have fun with old games.

    • @sealsharp
      @sealsharp 20 часов назад

      Yeah Rage was kinda disappointing. Especially that nothing of an ending.

  • @snoopdouglas
    @snoopdouglas День назад +7

    Great video. I was using Allegro until a few years ago but ended up switching to Godot, purely because it allowed me to prototype faster (and incidentally, it's the only engine with which I've actually gotten games to completion). I have to say, though, I really do miss working with C.
    The games I make are so low-power that this hasn't ended up mattering too much, but Godot has recently added a very intuitive way to produce a build with all of the features your game doesn't use stripped out. Might be giving this a go.

  • @B-Titor
    @B-Titor Час назад +3

    Alternative title: "Why Doom can run on anything"

  • @highestsettings
    @highestsettings 2 дня назад +7

    If there's just one thing that can be learned from games like Doom and Quake, it's that (aside from making a good game) front-loading the technology to make creation easier and allowing the players to make stuff leads to insane longevity. Doom and Quake will be played for a long, long time, and people will be making stuff for those games for a long, long time. If you only had Doom and Quake and access to mods, you could easily play those games for the rest of your life and never run out of new "content" (I hate that term, but it's fitting here).

    • @miguelpereira9859
      @miguelpereira9859 День назад

      All mainline Quake games still have people playing online to this day, and that would def not be the case if the games weren't open source

    • @Novastar.SaberCombat
      @Novastar.SaberCombat День назад

      People are still playing my two full episodes ("Darkhell" & "Blakhell") via the D1 engine. Great design is simple yet simultaneously robust. Poor design stacks layers upon layers upon layers upon itself to mask janky flaws and poor choices.

  • @AndrewTSq
    @AndrewTSq День назад +16

    I think older games looked better because they used a lot of prebaked shadows etc. Look at Battlefield 3 and 4... both games still look great, I think.

    • @crestofhonor2349
      @crestofhonor2349 14 часов назад

      Those games look like games of their era. Battlefield 1 and Star Wars Battlefront look far better and use far more real-time rendering than those games, as well as other new features like PBR and wide use of photogrammetry for their materials.

  • @SomeKittyCat
    @SomeKittyCat День назад +7

    Id really took the Crysis 2 & 3 era to heart and went: "Okay then, FUCK YOU. Superb graphics AND Superb performance!" with the new Doom and then Eternal.

  • @trblemayker5157
    @trblemayker5157 День назад +8

    While id Tech is great for smaller maps, it's not feasible for large seamless open worlds. Most of their releases have been linear, smaller maps connected by loading screens.

    • @GTXDash
      @GTXDash  21 час назад

      It's all about matching a game with the right engine. With that said, I thought Rage pulled it off pretty well... as long as you use an NVMe to better hide texture pop-in.

    • @trblemayker5157
      @trblemayker5157 6 часов назад

      @@GTXDash Rage does have large maps, but again, broken up by loading screens. It would suit games with linear progression and large levels, such as The Evil Within series.

  • @deltacx1059
    @deltacx1059 День назад +10

    Sounds like 39 minutes of "use the right tool for the job", and id is a beast at doing what they do best.

  • @skew5386
    @skew5386 День назад +22

    I've always said: ZeniMax having full control over id Tech is one of the worst things that has happened in relation to game engines.

    • @GTXDash
      @GTXDash  День назад +8

      So far, they have not interfered with id Tech. In fact, I wish they would adopt it across more of their lineup instead of using... whatever the hell is powering Starfield.

    • @skew5386
      @skew5386 День назад +11

      @@GTXDash It's true that they haven't interfered yet, but they're certainly not licensing it out to other companies like they frequently used to. I wish their other subsidiaries would use it too, LOL

    • @WizardofWestmarch
      @WizardofWestmarch День назад +1

      @@skew5386 It'd be interesting to see the Elder Scrolls team collab with iD to replace Creation Engine with something based on iD's work but with the tooling Bethesda needs.

    • @VoyivodaFTW1
      @VoyivodaFTW1 День назад

      This is really what needs to happen.

    • @GoldSrc_
      @GoldSrc_ 12 часов назад

      @@skew5386 They can't license something that has middleware from others; it's not as easy as you think.
      Why do you think id Tech 4 was the last engine id released the source code for? id Tech 4 had no code from others; it was all id's stuff.
      And yes, I know about the Creative BS, but after Carmack rewrote the code for the shadows, it was all good to be released.

  • @rakhoo5236
    @rakhoo5236 2 дня назад +6

    5 mins in about my fav game engine and Threat Interactive gets used as reference, this video is going to be so goated

  • @Raynhardx
    @Raynhardx Час назад +3

    Pretty sure the root issue is that a well-optimized game engine (be it in-house or 3rd party) needs a few very passionate and involved developers who are so comfortable with the engine, and have enough charisma and influence, that they can identify and prevent performance issues and suggest alternatives in the early stages of development. But these kinds of people are a difficult fit in a big corporation setting, because if you have 1000 employees and your entire game can collapse if those 3 core people leave the company, you are fucked. So all big companies either a) are disincentivized, b) don't even try, or c) tried but lost it along the way when it comes to cultivating such tech. id Software was always technology-driven and always had enough success to stay alive. So it's a rare gem.

    • @barrupa
      @barrupa 49 минут назад +1

      And it only gets worse. Given the direction the industry is heading, traditional optimization techniques will soon be replaced by relying on AI upscalers as a crutch for achieving decent performance, at the cost of image quality through blurriness and artifacts. It's no surprise that older games with traditional forward rendering, or per-vertex shaders instead of per-pixel, often appear much sharper and clearer. This is especially true when traditional MSAA or other forms of multisampling are used, compared to modern games that employ a multitude of deferred techniques and low-resolution per-pixel effects, which require temporal accumulation and spatial upscaling to avoid looking like a dithered mess.
      To be honest, the technology available when Crysis was released (2007) is more than sufficient to create good-looking games, especially if subsurface scattering and distance-based diffuse shadow maps (like those used in GTA V) are added. Everything developed since then seems to exist either to sell more hardware or to reduce the burden on developers and studios, such as AI upscalers or ray tracing.
      I think the future looks grim, especially considering how mainstream GPUs have become power-hungry monsters ever since NVIDIA realized it could push enterprise-level hardware to consumers at insanely high clock speeds in the hopes of making real-time ray tracing viable for game development. This shift places the burden on consumers, who are now expected to shell out $300 or more for a 250W "space heater" that can barely run a game at 1080p60 with upscaling.

  •  День назад +4

    Holy shit, didn't expect to see Tilengine here! Such an awesome little framework :D
    I used it as the base for my last game project, and it was surprisingly easy to bind to a different language (I used Odin instead of c), so super happy to see it get some love :))
    (Falconet is looking amazing too)

    •  День назад +1

      I'm hoping to find something similar for 3D graphics, with enough control for me to create the underlying systems how I want, but not getting stuck on the more complicated side of vertex buffers, pipelines and whatnot. Raylib seems a bit too simple, maybe sokol-gfx could have what I want?
      Meanwhile I'll still be using Godot for 3D stuff...

  • @CyberWolf755
    @CyberWolf755 2 дня назад +3

    There is a good talk about how the rendering tech in an Unreal Engine 4.27 (pre-Nanite) game was stripped down to the necessities and modified with some extra features for what the game needed. The game was not released, but the talk is very detailed and useful.
    The video title is: A Taste of Chocolate: Adding a Rendering Fast Path without Breaking Unreal Engine

    • @gibsson
      @gibsson День назад

      do you mean hyenas by creative assembly?

  • @widrolo
    @widrolo 22 часа назад +7

    I think the biggest issue with modern engines is that they are basically tailor-made for bigger corporations that want to make impressive-looking games fast. The whole existence of TAA, upscaling tech and Nanite is aimed at making games that look good in still frames, but not in fast-paced games or with the frame-time graph enabled. It's really sad to see companies abandoning their own engines (especially Frostbite and RED Engine) to move to Unreal.

    • @crestofhonor2349
      @crestofhonor2349 14 часов назад

      TAA is actually pretty old. Using data from previous frames to anti-alias new ones dates back to at least the 7th generation. I believe Crysis 2 had a form of TAA that was pretty bad by today's standards. Nvidia has had another form called TXAA that is in some early 8th-gen PC titles too, and SMAA T2X was another form of temporal anti-aliasing. The point of it wasn't to make games look good in still frames; it was a way to smooth over certain things that MSAA and sometimes SMAA missed. In its early days it was just another AA option alongside MSAA, FXAA, SSAA, and SMAA.
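
      As a rough single-pixel illustration of that "data from previous frames" idea (a toy sketch, not any particular engine's TAA): the new frame is blended into a history buffer, which smooths edges and noise over time, and which is also where the ghosting/blur complaints come from when the history can't be trusted.

      ```c
      /* Toy sketch of temporal accumulation for one pixel -- illustrative only.
         Real TAA also reprojects the history with motion vectors, clamps it
         against the current frame's neighbourhood, jitters samples, etc. */
      typedef struct { float r, g, b; } Color;

      static float lerp(float a, float b, float t) { return a + (b - a) * t; }

      /* Blend the freshly rendered color into the reprojected history.
         A small alpha keeps ~90% of the history, which hides aliasing and noise
         but smears detail when the history is stale (the "TAA blur" complaint). */
      Color taa_resolve(Color history, Color current, float alpha /* e.g. 0.1f */)
      {
          Color out;
          out.r = lerp(history.r, current.r, alpha);
          out.g = lerp(history.g, current.g, alpha);
          out.b = lerp(history.b, current.b, alpha);
          return out;
      }
      ```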

    • @widrolo
      @widrolo 8 часов назад

      @@crestofhonor2349 I mean the usage of TAA. Epic uses it everywhere to cover up shortcomings like in Lumen, shadows, foliage, hair etc.

  • @BaddeJimme
    @BaddeJimme 17 часов назад +7

    Actually, old VGA cards did support hardware scrolling. What you couldn't have was hardware scrolling using the popular Mode 13h, because you wouldn't be able to access the extra video memory required to scroll around.
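
    For context, a rough sketch of what "hardware scrolling" means on VGA (assumes a DOS compiler providing outportb(); the register indices are the standard CRTC start-address registers): you point the CRT controller's display start somewhere else in video memory instead of redrawing anything, which is exactly what plain Mode 13h can't exploit because only one screen's worth of memory is addressable.

    ```c
    /* Sketch: VGA hardware scrolling by reprogramming the CRTC start address. */
    #include <dos.h>

    #define CRTC_INDEX 0x3D4
    #define CRTC_DATA  0x3D5

    /* Make the visible screen begin at byte 'offset' of video memory.
       No pixels are copied -- the CRT controller just fetches from a new spot,
       which is why scrolling this way is essentially free. */
    void vga_set_start(unsigned int offset)
    {
        outportb(CRTC_INDEX, 0x0C);
        outportb(CRTC_DATA, (offset >> 8) & 0xFF);   /* start address high */
        outportb(CRTC_INDEX, 0x0D);
        outportb(CRTC_DATA, offset & 0xFF);          /* start address low  */
    }

    /* In chained Mode 13h (320x200x256) the CPU only addresses 64 KB -- exactly
       one screen -- so there is no off-screen memory to scroll into. Unchain the
       VGA ("Mode X") and all 256 KB become usable, making smooth hardware
       scrolling practical; Commander Keen did the equivalent on EGA. */
    ```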

  • @SinisterOyster
    @SinisterOyster День назад +6

    Some developers heavily rely on TAA to smooth out some of the effects used. So they don’t add the ability to turn off TAA as it will show flaws (pixelated or grainy looking shadows and effects)

    • @GTXDash
      @GTXDash  День назад +3

      @@SinisterOyster I actually don't mind TAA. I'm just not a fan of how many devs use it as a crutch for fixing issues that shouldn't have been there in the first place.

    • @crestofhonor2349
      @crestofhonor2349 13 часов назад

      TAA is fine but I wish it was used differently

  • @ymi_yugy3133
    @ymi_yugy3133 17 часов назад +15

    10x faster hardware -> 10x better visuals is just not how it works. So many techniques used in games do not scale, because they have fundamental issues that can't be fixed with faster hardware but require an entirely new approach that is often much more expensive.

    • @sulfur2964
      @sulfur2964 15 часов назад +6

      That doesn't change the fact that many games and engines are bloated, unoptimized piles of crap

    • @BlueMesaCable
      @BlueMesaCable 14 часов назад +2

      ​@@sulfur2964 An engine can be bloated, doesn't mean the compiled game is bloated. Engines are still tools, there's a lot of room to work with most. Features can be disabled, code can be streamlined, packages can replace default functionalities if not ignored entirely. I have only used 2 engines at depth (GameMaker and Godot, which to be fair are kinda light engines for undemanding games) but I seriously doubt either had much effect on performance in a relevant way. At that, they allow much faster design as well as communities and support.

  • @jamesgphillips91
    @jamesgphillips91 Час назад +3

    I'm a baby game dev... but this is literally why I use Bevy over any big game engine. I take the pieces I need and have low-level control. Does it suck to roll out all your own tooling? Sure... but ffs it's nice to be modular and slim.

  • @helix0rz
    @helix0rz День назад +2

    This is a very well-written and presented video! I also share the same thoughts you listed. I hope your game gets the same recognition and success both of you are investing in!

  • @gibsson
    @gibsson День назад +3

    33:26
    that's why I kinda hope that Valve would release Source 2 to fill that gap
    the engine itself still inherits id Tech 2 architecture while being moderate enough to work with

  • @AndreiNeacsu
    @AndreiNeacsu День назад +8

    The microcontroller in the joycons of the Switch is more powerful than the NeoGeo. The problematic performance is simply on the shoulders of the engine and the game development.
    With 128 times more RAM and CPU power than 20 years ago (and probably the same with the GPU), we only get slightly nicer-looking games with comparable enjoyment. While I am not a graphics whore, if a game requires twice the hardware of another one for the same performance, I expect a nicer-looking screen in front of my eyes. I am fine with the graphics of Serious Sam SE, but on my R9 5950X with 128GB of DDR4 3600 and an RX 7800 XT, I expect at least 32 simultaneous VMs to run the game just as well as my P4 2.4GHz ran it 20 years ago with a Radeon 9800 XT (of 21 years ago) on 2GB of DDR1. As you can probably tell, no new game that looks like that runs like that today. You could argue that RAM speed has only improved about 10 times (comparing those two systems), but even then, not even 8 VMs could actually run something looking like that.

  • @SeishukuS12
    @SeishukuS12 День назад +6

    I've felt that devs have lost perspective for a long, long time now. The "universal" engines are great, awesome tools, and you can slap together a game really fast... But the longer these engines are used, the more the talent for creating a simple, purpose-built engine for whatever game you're trying to make is lost.
    Personally, I love trying to do as much as I possibly can from scratch, and it shows in my engine, both in terms of how long I've been working on it and the quality of the code (long and poor lol), but it's also just a hobby of mine, so that doesn't matter... The point is I learn from it, and I wish more devs would do the same: learn how to do some of the old-school stuff and work up from there... You can even use that knowledge to make Unity and Godot run better on low-spec hardware.

    • @altridev
      @altridev 23 часа назад +1

      As a beginner developer, I love engines. But I do recognize that engines have a lot of features and, honestly, bloat that we don't need and that only bogs down our games. I have watched a few talks and interviews with developers involved in the making of really old classic games, and the one that really stood out for me was the developer behind Crash Bandicoot. The things they had to do and optimize just to get Crash onto a PS1 are INSANE. I'm using GameMaker and know a bit of Unity, but I'm currently learning Love2D, which is a library and not so much an engine. I'm looking to optimize my games the most I can this way and even extend this to any future games I make.

  • @azazelleblack
    @azazelleblack 8 часов назад +12

    You're pretty off the mark about the comparison between the Doom engine and the BUILD engine. Duke Nukem 3D was fully two years after DOOM and it has MANY features that are simply not present and not possible in the vanilla DOOM engine. It's true that later ports of the engine, particularly those that implement Hexen features like ACS scripting, are capable of doing the same things as BUILD, but those are VERY far removed from the vanilla DOOM game. Your example of "getting a sector to lower with many explosions" isn't something you can do in the vanilla DOOM engine without using the hard-coded Icon of Sin death effect, which has other effects, like ending the game.
    Which brings me to my other complaint about that section of the video, which is that DOOM was by no means "split" into "engine" and "game". The DOOM executable file is completely full of assumptions about what game it is; all of the "rules" of the game are contained, hard-coded, in the binary. The fact that add-on files can modify the rules of the game is only possible in later ports of the game that support loading integrated DeHackEd files, or more advanced modding systems introduced in ports like ZDoom.
    The concept of a "game engine" really wasn't discussed among gamers until the conversation became Duke3D vs. DOOM, and that was almost entirely because many documents surrounding Duke3D included a copyright message that said something along the lines of "The BUILD Engine is Copyright Ken Silverman (C) 1995". Nobody talked about Heretic or Hexen being "Doom engine" games until '96 or so; before that, they were just "based on Doom".
    Ultimately I do think the thesis of your video is reasonably sound, but you have a lot of problems in your fact-checking.

    • @Razumen
      @Razumen 7 часов назад +2

      The comparison wasn't with OG Doom, but with the Doom engine itself. It doesn't really matter that on release Duke3D could do things like spinning doors or lowering multiple sectors, when that was already possible in Hexen's usage of the Doom engine.

  • @hunn20004
    @hunn20004 День назад +5

    Modern game engines are suffering from the same thing that RISC-V is trying to solve on processors:
    backwards compatibility, legacy code, and a fork JUST in case someone wants to connect a CRT or a serial joystick.
    But for game engines, you have 3D features included even though the game is only using 2D assets, BUT in a 3D plane to fake parallax.
    Like making a Mass Effect game in a First Person Shooter engine (Andromeda).

    • @ERECTED_MONUMENT
      @ERECTED_MONUMENT 19 часов назад

      To be fair, Andromeda had the best gameplay of all the Mass Effect games. Not that it was a high bar; I love ME, but it's carried a lot by its setting and story, and that was the problem with Andromeda: the writing was so bad, and the setting kind of got changed with the change of galaxy. But the movement was great and the gunplay was better than in the other games, at least until you max your damage skills while your character keeps leveling and the enemies keep scaling with it, turning them into sponges. Awful balancing, but easily fixable with mods. Also, not letting you give commands to your squad? Come on.

    • @hunn20004
      @hunn20004 3 часа назад

      @@ERECTED_MONUMENT Pretty sure they still believed in "Bioware magic" at that point.
      And they were being forced to use the EA FPS engine to suddenly make a 3rd Person RPG.
      They had to write a lot of code from scratch, WHICH did benefit future EA projects immensely, BUT is also why EA just keeps axing their studios.

  • @RedSntDK
    @RedSntDK День назад +8

    I doubt it matters, but Halo was originally supposed to launch on PC before Microsoft bought Bungie and had them make it an Xbox exclusive. From a regular dumb gamer's perspective you'd think "porting" it back to PC would be easy then.

    • @ZanaGBYT
      @ZanaGBYT 22 часа назад +4

      Actually... Halo was supposed to be a Mac OS 9 exclusive! Running in OpenGL! It was to have been the last hurrah for the platform.
      And after Steve Jobs declined buying Bungie to help finish Halo for a carbonized release (OS 9 and OS X), they sold to Microsoft under the terms that it had to be an Xbox exclusive.

  • @Disthron
    @Disthron 18 часов назад +8

    Not-so-fun fact: the Crysis engine used tessellation to increase the mesh density of 3D models just so it would perform worse on AMD hardware. Nvidia had some proprietary tech that could handle it. It was actually unoptimized on purpose.

    • @crestofhonor2349
      @crestofhonor2349 14 часов назад +1

      AMD GPUs are technically still worse at tessellation to this day but most of the time tessellation is very cheap

  • @JennyTheNerdBat
    @JennyTheNerdBat 6 часов назад +6

    28:36 I feel kinda gaslit about my childhood, because I remember UT2004 and SWAT 4 running really well on my potato PC, while Doom 3 remained borderline unplayable until we upgraded the rig.

    • @forasago
      @forasago 4 часа назад

      I also doubt those claims. Lighting in those games was baked. They should have run better on low end hardware for that reason alone.

  • @jabbahiggs7039
    @jabbahiggs7039 День назад +2

    That's a very well-produced video! There are a lot of videos on old games and engine basics; we need some more advanced stuff like you do. Subscribed!

  • @HickoryDickory86
    @HickoryDickory86 День назад +4

    Love the video!
    FYI, id Tech Engine versions 0-4 were all released free and open source under GPL licenses. Versions 5-7 (and after) are still proprietary, though. Wish id Software would license it out to third-party studios again, like they used to.

    • @WizardofWestmarch
      @WizardofWestmarch День назад +4

      Sadly without Carmack at the company anymore I doubt any more versions of iD tech make it into the wild. He was the driver of open sourcing their prior iterations.

    • @sealsharp
      @sealsharp 20 часов назад +1

      Engines became much more complicated and now use a lot more third-party tools that may be under license and cannot be open-sourced. Also, we do not know what the build pipeline looks like. Just pressing F5 to run the game is not standard on engines.
      There are rumours of days-long build times on some AAA games.

    • @HickoryDickory86
      @HickoryDickory86 18 часов назад

      @@WizardofWestmarch Sad, but true.
      I just wish Bethesda Game Studios would drop Creation Engine and fork id Tech for their projects going forward. They had actually considered using it for _Skyrim_, but Carmack dissuaded them from using it, since MegaTextures was a major part of id's pipeline and he didn't think, at that time, that it was suitable for the type of open worlds they created for their games.
      Interestingly, id released _RAGE_ the same year BGS released _Skyrim..._ using id Tech... with a semi-open world. Since then, however, they have brought in the Vulkan graphics API (they had been using OpenGL) and gotten rid of MegaTextures completely (with id Tech 7). And the world spaces in _DOOM Eternal_ are quite massive, even if they don't feel like it because the player is moving at such a fast speed. So, I'm confident that it's perfectly capable of an explorable open world. BGS would certainly have to create all kinds of new mechanics to customize the engine for their particular use case, but I believe that time and effort would be worth it. (And it might be possible to "strip [the Creation Engine] for parts" and incorporate them into their id Tech fork, considerably reducing the workload. But "dropping in" features like that would really depend on how modular id Tech's architecture is.) Still, it would be worth it, because BGS would finally get rid of their mountain of technological debt and have a solid foundation on which to move forward.
      Caveat: All the above having been said, once they got their id Tech fork where they wanted it for their next game, BGS would be wise to set aside a team whose sole job, at least until the project is done, would be to create a new modding kit for the new engine, to be released Day 1 with the game.

  • @ThatKidBobo
    @ThatKidBobo 23 часа назад +7

    Based Doom engine using it for my game

  • @baddragonite
    @baddragonite 21 час назад +2

    Funny thing is, RPG Maker seems to have a similar design philosophy of being built mainly for a single genre, but it gets a lot of flak for being too restrictive, even though, really, if you're creative you can use it to make some crazy unique stuff. And it seems to hit nearly every criterion you have on your list lol

  • @3d1e00
    @3d1e00 День назад +5

    I just don't think Crytek thought that single-core processors were going to keep up-clocking. That wasn't what anyone was thinking then at all. We have fundamental limits on clock speed; the multi-threading solution was the only option, and everyone knew it.

    • @GTXDash
      @GTXDash  День назад +2

      @3d1e00 I didn't say Crysis uses one core. I tried to convey that they assumed the speed of a single core would continue to increase at the same rate it had for the decade leading up to that point. Crysis does indeed improve performance on 2 cores, at least a little, but not on 4 or higher. With that said, 2-core support had to be added to the engine because much of CryEngine 2's development happened before the Athlon 64 X2 or Pentium D were even released. It took a long time to build a new game engine, even in the 2000s.
      Edit: But looking back, I definitely should've worded it differently.

  • @the_disco_option
    @the_disco_option День назад +4

    I love the deep dive on the engines here. I definitely agree that specialized engines would do a lot to make games more optimized, but I think the economic reality of AAA will push towards a few "general use" engines. For example, CD Projekt RED and Bethesda are both switching away from their custom engines to UE5, largely because it's easier to hire for.

    • @gibsson
      @gibsson День назад

      I think we need more competition besides Unreal Engine 5, ngl; Unity kinda nuked themselves in the foot. More choices of powerful 3D engines that can expose Unreal Engine's weaknesses would be really nice for consumers.
      That means Unreal Engine would be forced to optimize their engine when a competitor makes a well-optimized engine that people would actually use for their projects.

  • @xxkairzzxx1994
    @xxkairzzxx1994 19 часов назад +1

    Surprised to see you have 2k subs; this is quality I expected from a much bigger channel. Great video 👍

  • @Definesleepalt
    @Definesleepalt 8 часов назад +4

    Games without graphics acceleration must be so programming-intensive, I can't imagine being in the gamedev industry in the 90s.

    • @Jabjabs
      @Jabjabs 6 часов назад +1

      It was intensive but the scope was also more limited. Work to what you have.

    • @hudo108
      @hudo108 4 часа назад +1

      It was intensive, yes. But it was also far easier to understand and reason about. The rendering pipelines nowadays are so complex that you can't really keep a complete mental model of your renderer in your head. I can't, at least.

    • @MarcShake
      @MarcShake 27 минут назад

      Check out "Second Reality" by Future Crew. That production came out before Doom, and it's a showcase of what was possible in those days. IMHO the best thing ever coded on PC.

  • @WaveSmash
    @WaveSmash День назад +3

    I think a better version of the van and motorbike example: the van needs to be big and featured enough for anything it could possibly need to carry, but building a brand-new motorcycle that is slimmer and almost perfectly efficient for delivering your single specific package is generally too expensive and time-consuming.

    • @GTXDash
      @GTXDash  День назад +4

      Just like how shipping companies don't need to design the vehicles they use, a publisher should be able to purchase a license to a specific engine they need for that specific game, without always going through the process of building their own engine.

  • @axa993
    @axa993 23 часа назад +2

    I remember Door Kickers devs building their very simple game engine.

  • @Littlefighter1911
    @Littlefighter1911 22 часа назад +4

    10:20 I've asked Deck 13 if they would consider licensing out their PINA point & click engine,
    considering it hasn't been in use for more than a decade.
    They politely declined, but I am pretty damn sure that the reason they said "no"
    is because I really was just a student at that time.
    They have given the same engine to another company specializing in Linux ports (RuneSoft),
    so it's more of a "don't see the point in licensing it" thing, I suspect.

    • @sealsharp
      @sealsharp 20 часов назад +2

      Software not made for the public may be subject to licensing issues because of used libraries.
      For example if the engine contains code for consoles, they can only give that to you as a registered developer for consoles, which some years ago was not possible for individuals, just companies.
      It may have been something simple like RAD Game Tools; everyone used Bink video. And Bink still has no price tag on their website, which means "if you are a company, write us. If you are not, you can't afford it anyway".

  • @SvenHeidemann-uo2yl
    @SvenHeidemann-uo2yl 23 часа назад +4

    Thanks for sharing this video. I love everything Id does.
    Btw, I wonder why you were mentioning Godot so frequently, considering how abysmal its performance is.

    • @GTXDash
      @GTXDash  23 часа назад +1

      I just use Unity, Unreal and Godot as strawmen for "other competing engines" :P
      I'm actually fairly familiar with Godot since we used it once for a game jam a few years ago, and I thought Godot was... fine. It gets the job done.

    • @SvenHeidemann-uo2yl
      @SvenHeidemann-uo2yl 21 час назад +1

      Fair enough 😅
      I suppose for 2d that is accurate.
      I am considering building my next project from scratch, using raylib and c++
      More as a learning experience than anything else.

  • @arthurcuesta6041
    @arthurcuesta6041 20 часов назад +2

    This video is so much better than I expected.

  • @Kruku666
    @Kruku666 38 минут назад +2

    Is it just me, or does the old tech have some sort of charm to it?

  • @iammichaeldavis
    @iammichaeldavis День назад +3

    9:30 that van analogy was Socrates-brain brilliant

  • @DanielMircea
    @DanielMircea День назад +8

    I am really surprised to see Doom 3 ahead of Far Cry and UT 2004. I had an FX 5200 back in the day, and Doom 3 was outright unplayable no matter the settings, whereas Far Cry and UT 2004 worked really well.

    • @DigitalMoonlight
      @DigitalMoonlight День назад +3

      This era (roughly 1998 - 2006) of PC gaming was particularly strange because unified shaders didn't exist yet, but we had standard graphics APIs, so aside from Glide in the tail end of the 90s very little was programmed specifically for a particular video card. As a result, different GPU architectures would be better or worse on a game/engine by game/engine basis, with driver versions sometimes having drastically different performance characteristics. A common way to boost FPS in Quake 2 engine games was to rename a game's executable to Quake2.exe; SiN is a particularly good example because the game was practically broken without doing this.

    • @GTXDash
      @GTXDash  День назад

      @DanielMircea Older cards like certain models of the FX and GeForce 4 ran Doom 3 badly. It's kinda like how a really fast 486, which ran Duke3D just fine, couldn't run Quake very well due to the lack of certain features that were introduced with the Pentium.
      Also, I think UT2004 and Far Cry's graphics settings could go a lot lower than Doom 3's.

  • @betaradish9968
    @betaradish9968 День назад +1

    As a complete hobby dev, I know from personal experience that GMS has two compile modes, VM and YYC. VM is as it sounds: the game is compiled to bytecode that a virtual machine runs, which is often a much slower way of making a game. YYC, on the other hand, compiles to essentially a C++-style executable with faster performance.

  • @TheOneAndOnlySame
    @TheOneAndOnlySame День назад +7

    Once you've watched any video from Threat Interactive, DF tech analysis is anything but in-depth. It's the most plebeian tech analysis you can find, short of "it's pretty", and the most "catering to the establishment" possible as well.

    • @guyg.8529
      @guyg.8529 День назад +7

      DF mostly do framerate and image quality analysis, real technical content is very rare in their analysis videos. That's quite a shame.

    • @GTXDash
      @GTXDash  22 часа назад +2

      I don't think it's all or nothing. I try to look at problems from multiple perspectives. Threat Interactive have been very critical of DF's opinions of TAA and such. DF is very useful for data collecting, not necessarily their opinions on the direction that the industry needs to go.

  • @johnm9263
    @johnm9263 5 часов назад +3

    23:10 The PlayStation 1 only worked with integer-based coordinates, so it could calculate less and be cheaper, but those vertices had to jump around; that's why the original PlayStation looks as if it wobbles and jiggles a lot. By contrast, the N64 did everything much more granularly and thus had to do far more calculations for the same "graphical fidelity".
    So the tradeoffs were built into the console for the PlayStation and built into the games for the N64.
    The N64 was a better play experience when it did work, but the PlayStation was a better experience when you didn't care about all of the smaller details, or the massive amount of jiggling.
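
    A toy sketch of why that causes wobble (hypothetical numbers, not actual PS1 GTE code): after projection the screen position is truncated to whole pixels, so a slowly moving vertex visibly jumps between adjacent pixels instead of gliding, and without subpixel precision the whole mesh appears to jitter.

    ```c
    /* Toy illustration of PS1-style vertex snapping -- not real GTE code. */
    #include <stdio.h>

    typedef struct { float x, y, z; } Vec3;

    /* Project a camera-space point to the screen and snap to integer pixels,
       the way the PS1 rasterizer consumed coordinates. */
    static void project_snapped(Vec3 v, float focal, int *sx, int *sy)
    {
        float px = (v.x * focal) / v.z;
        float py = (v.y * focal) / v.z;
        *sx = (int)px;            /* truncation: no subpixel precision */
        *sy = (int)py;
    }

    int main(void)
    {
        /* Slide a vertex sideways in tiny steps: the float position moves
           smoothly, but the snapped screen position jumps a whole pixel at a
           time -- the visible "wobble"/"jitter" of PS1 geometry. */
        for (float x = 1.00f; x < 1.05f; x += 0.01f) {
            Vec3 v = { x, 0.5f, 2.0f };
            int sx, sy;
            project_snapped(v, 160.0f, &sx, &sy);
            printf("x=%.2f -> screen (%d, %d)\n", x, sx, sy);
        }
        return 0;
    }
    ```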

    • @rodneyabrett
      @rodneyabrett 2 минуты назад

      Yeah. Ken Kutaragi's priority was that he wanted the console to generate as many realtime UV-mapped polygons as fast as possible, and this was a pretty clever way to get there: ditch the Z-buffer and have the verts snap only to integer numbers. You didn't even notice it as much at the time on 480i interlaced CRT TVs, so it was a good compromise, and seeing the demos of Ridge Racer running on a loop at an Electronics Boutique store convinced me to buy a PS1. I was playing PC games at the time, but even then, I hadn't seen 3D polygons run that smoothly before.

  • @JessicaFEREM
    @JessicaFEREM 3 дня назад +5

    Yeah, I could run Doom 2016 on a laptop with a very thermally throttled CPU and not enough RAM, and it could still run at 720p at 60fps. It runs on a potato and doesn't look bad while doing so.

    • @Capewearer
      @Capewearer День назад +1

      But Crysis 3 looks better and runs better too. And Crysis 3 doesn't even use lightmaps and baked light probes, meaning that lighting can be fully dynamic, unlike Doom 2016.
      If Doom 2016 threw out megatextures, it would be much better. And changed its environment design; it's very blocky, and the horizontal-to-vertical ratio is ridiculous.
      But yeah, in 2024 Id Software are probably wizards for doing so.

  • @rhrabar0004
    @rhrabar0004 День назад +3

    YouTube recommendation win. Great video!

  • @uhuhyesm435
    @uhuhyesm435 18 часов назад +4

    Wait UT 2004? Am I remembering this wrong? I remember DOOM 3 being much harder to run than UT 2004.

    • @crestofhonor2349
      @crestofhonor2349 14 часов назад

      Doom 3 was very expensive to run at the time thanks to its heavy use of dynamic lighting and stencil shadows. It was very innovative for its time.

  • @worldother2080
    @worldother2080 11 часов назад +1

    The first time I started getting interested in game development was with a YouTube video about Doom. The indie scene encourages learning the Doom style of game development because of how significant it is; it's like a paradigm.

  • @deaglan6641
    @deaglan6641 23 часа назад +1

    Great video. I’ve always been super interested in how games like half life 2 and other source engine games get their look. I’ve seen lots of super high quality animations with incredible art direction come out of G mod and it makes me wonder how source works

  • @Blzut3
    @Blzut3 2 дня назад +4

    A bit nitpicky, but the Quake engine, as far as I can tell, was given the title of "id Tech 2." At the time of the retronym, the Quake and Quake II engine licensing page was the same, and example id Tech 2 games included Quake engine games like Hexen II and Half-Life. I don't know for sure, but I assume they were licensed together based on that. However, colloquially there is of course reason to want to distinguish between Quake/QuakeWorld and Quake II's engine, so that led to a battle (mostly in Wikipedia edits) between the Quake and Doom communities over the unofficial "id Tech 1" name, which went on for a long time. These days I feel like most people agree it's more useful to call Doom id Tech 1 and Quake just an early revision of id Tech 2, but of course I'm a bit biased.
    Also not that it really would have added much, but you did forget the Raven Engine!

    • @GTXDash
      @GTXDash  2 дня назад +2

      @@Blzut3 Yep. I left out a lot; as the script kept getting bigger and bigger, I eventually realised I can't cover everything. I'm admittedly one of those "Wikipedia editors" who support the notion of the Quake engine being a variant of id Tech 2.

  • @AlexGoldring
    @AlexGoldring День назад +19

    Graphics programmer here. Quoting "Threat Interactive" here was a big mistake. The guy speaks authoritatively, but that's it. Most of what he says are falsehoods resulting from a lack of knowledge and laziness to find out.
    I watched the video in question and was impressed by how bad it was. For reference, pretty much every single thing he says about Nanite is the complete opposite of what the technology actually is. Brian Karis of Epic Games, the mind behind Nanite, has given a number of technical talks on the subject.
    I myself have implemented similar technology in the past, so I'd say I have some authority on the subject.

    • @GTXDash
      @GTXDash  22 часа назад +1

      I think it would be great if there was some form of benchmark that can settle the argument once and for all. Nanite with LOD, LOD only with higher poly meshes, and Nanite only with no LOD. If someone can do a test like that, I would love to see the results.

    • @AlexGoldring
      @AlexGoldring 19 часов назад +3

      @@GTXDash Generally speaking, Nanite will be faster when all other things are the same. When we do a LOD transition, we're actually drawing 2 models of different LOD levels to mask the "pop". Nanite doesn't have to do that, as transitions are much more fine-grained, so "pops" won't happen. That said, Nanite will be slower than a custom LOD hierarchy in practice. Why? Because custom LODs will just have much lower triangle density, whereas Nanite will attempt to give you as dense a mesh as it can, but no denser than 1 triangle per pixel.
      You can adjust that "error" factor to any number of pixels, though. The implementation I had used a 4-pixel area, as I saw no noticeable difference and it pushed far fewer triangles to the GPU.
      I think the comparison in general is not very useful. Nobody creates meshes with every vertex placed by hand; we use tools. Sometime in the 90s we did place every triangle coordinate by hand. Today we don't, and there are plenty of bad topologies (triangle layouts) in production. Why do we not care about individual triangles today? To save time, and because hardware can manage an extra 100 or 1000 triangles without much issue. Nanite is basically that: instead of investing time and slowing your art pipeline with custom LODs and paying for expensive LOD tools such as Simplygon, you just tick a box saying "enable nanite" and you get amazing results. There are tradeoffs, and there are configurable parameters, but it's not about runtime performance, it's about developer productivity in general.
      Say you want to make a good-looking game as a solo dev. Do you know how to make good LODs? Do you have a spare $10,000 per year for a Simplygon license? Well, if not: "go home kid, let adults do this."
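
      For anyone curious, a minimal sketch of the "error in pixels" idea being described (hypothetical structures and numbers, not Epic's code): each LOD stores the geometric error its simplification introduced, that error is projected to screen space, and you pick the coarsest version whose projected error stays under the threshold (1 px by default, 4 px in the setup described above). Nanite does this per cluster in a hierarchy; this does it per whole mesh for clarity.

      ```c
      /* Sketch of threshold-based LOD selection by projected screen-space error. */
      #include <math.h>

      typedef struct {
          int   triangle_count;
          float geometric_error;   /* world-space error introduced by this LOD */
      } LodLevel;

      /* Convert a world-space error at 'distance' into pixels on screen. */
      static float projected_error_px(float geometric_error, float distance,
                                      float screen_height_px, float vertical_fov_rad)
      {
          float pixels_per_world_unit =
              screen_height_px / (2.0f * distance * tanf(vertical_fov_rad * 0.5f));
          return geometric_error * pixels_per_world_unit;
      }

      /* Pick the coarsest LOD whose error, seen from 'distance', stays below
         'max_error_px' (e.g. 1.0f for roughly "a triangle per pixel", 4.0f to
         push far fewer triangles). LODs are ordered finest to coarsest. */
      int select_lod(const LodLevel *lods, int lod_count, float distance,
                     float screen_height_px, float vertical_fov_rad, float max_error_px)
      {
          int chosen = 0;
          for (int i = 0; i < lod_count; ++i) {
              if (projected_error_px(lods[i].geometric_error, distance,
                                     screen_height_px, vertical_fov_rad) <= max_error_px)
                  chosen = i;      /* coarser and still under the error budget */
              else
                  break;
          }
          return chosen;
      }
      ```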

  • @technoguyx
    @technoguyx День назад +2

    nicely written video, I think it's one of the best overviews I've seen of all the different iterations of id Tech and their strengths and innovative features.
    37:36 surely you can run the game in a VM to check if it runs e.g. in a slow single core with limited RAM for fun?

    • @GTXDash
      @GTXDash  День назад +1

      @@technoguyx we could try that.

  • @rodneyabrett
    @rodneyabrett 15 минут назад

    I've been in game development since the PlayStation 2 days, and one thing I can say that contributes to this is the move to algorithmic optimization solutions over manual, artist-driven methods. They are faster but sloppier in many ways. Nothing beats creating LODs by hand, where the artist has much more discernment about how to turn a 200k mesh into a 50k mesh without losing the base silhouette of the model and while keeping the UVs clean. I've met many younger 3D modelers who can't optimize as well because they're leaning on this auto-decimation tech too much. It works in a pinch, but you don't want to depend on it too much. DLSS has also made us lazy with optimization.
    There's also the move to 100% real-time lighting over the Gen 7 (PS3/Xbox 360) hybrid approach, where you had some realtime-lit environments blended with pre-baked lighting. That alone is a major resource hog.
    When Unreal first introduced Nanite, I knew that it would lead to games running way slower. lol.

    • @GTXDash
      @GTXDash  2 минуты назад

      Oh, finally! Someone who actually works in the industry. I have a question: before modern methods, does LOD work kind of like mipmaps, where you have to have multiple models of the same object or 3D mesh with varying degrees of complexity? I know how id Tech 3 does this, but I'm not entirely sure what the industry standard is.

  • @jonny-nava-367
    @jonny-nava-367 День назад +6

    Based on your criteria, what are the best engines for 3D games?

    • @jabbahiggs7039
      @jabbahiggs7039 День назад +1

      Writing your own engine (like Dead Cells did) will be an advantage, but you need strong programmers to do that in A/AA production. Unity and Unreal are great for most games, and most games built in those skimp on optimization; there's a lot of work to be done in the optimization department that could make the games run better. Only AAA studios can purchase a license for something like id Tech, or maintain their own tech, but that's expensive and hard, e.g. even a big company like CDPR ditched their REDengine for Unreal after the Cyberpunk fiasco.

    • @HentaiSpirit
      @HentaiSpirit День назад +1

      not unity

    • @roklaca3138
      @roklaca3138 День назад

      Custom engines

    • @cinderguard3156
      @cinderguard3156 День назад +4

      ​@roklaca3138 problem is it takes a lot of time, effort, and resources to create and maintain an engine. So much that some even say that once you start making your own engine, you're not a game dev anymore. you're an engine dev.

    • @xXx_Oshino_xXx
      @xXx_Oshino_xXx День назад +1

      Source

  • @Rudimaentaer
    @Rudimaentaer 21 час назад +2

    I'm playing Witchfire right now. It looks like a UE5 game, but it is in fact a UE4 game. And there are no LOD pop-ins. Yes, it has stutters and traversal stutter, but it is still in EA. And I get triple the frames I would get with any similar-looking UE5 game. I shudder every time a game company announces that they're switching to UE5.
    UE5 is so overbloated that the baseline load is so high that my old combo of a 9900K and a GTX 1080 is only just enough for a tiny game like Bramble: The Mountain King to run at 1080p with 60fps.

  • @Superchunk-k2h
    @Superchunk-k2h 14 часов назад +8

    Modern professional software development emphasizes doing it fast instead of doing it right; I can only imagine how sloppy 3D pipeline code is with low stakes like video games.

    • @coondog7934
      @coondog7934 12 часов назад

      If it works, it works. There is not much more to it. Engines are just tools in order to create a certain image you have in mind.

    • @Ch4nKyy
      @Ch4nKyy 9 часов назад

      I worked in other industries with "higher stakes" and let me tell you, game engines are cutting edge.

    • @Razumen
      @Razumen 7 часов назад

      @@Ch4nKyy Engine stutter and performance issues beg to differ.

    • @Ch4nKyy
      @Ch4nKyy 7 часов назад

      @@Razumen I'm not saying they are perfect, I am saying that software development everywhere else is just as bad or worse.

    • @Razumen
      @Razumen 4 часа назад

      @@Ch4nKyy Not where outcomes actually matter.

  • @SR-388
    @SR-388 2 дня назад +2

    One thing I feel should be mentioned with Doom 3 on PC and its original Xbox port is that the levels and environments are pretty cramped; this can help a lot with performance. Metroid Prime 1 also did this and ran at 60fps on the GameCube.

    • @GTXDash
      @GTXDash  День назад

      Definitely. Not an apples-to-apples comparison when compared to something like Half-Life 2. It's not so much "what can the engine do"; it's more about "what a game needs from the engine".

  • @randommidimusic1489
    @randommidimusic1489 22 часа назад +5

    Hopefully I can run Doom: The Dark Ages fine on my GTX 1060; Eternal ran fine.

    • @thatlonzoguy
      @thatlonzoguy 21 час назад

      Not maxed but i bet it will run fine

    • @randommidimusic1489
      @randommidimusic1489 20 часов назад

      @@thatlonzoguy yeah i am not planning to play on maxed out settings

    • @crestofhonor2349
      @crestofhonor2349 14 часов назад +1

      Considering you can get 30fps in Jedi Survivor on a 1060 and the 780m in the ROG Ally, I can't imagine that the new Doom game would be any heavier than the technical mess that is Jedi Survivor

  • @SuperFriendBFG
    @SuperFriendBFG День назад +3

    I love FEAR, but yeah, id Tech 4 was a vastly more efficient engine. That said, neither was good at vast open spaces, but at least id Tech 4 could render a lot more geometry. I'd be curious how well your game would run on a Raspberry Pi 4/5.

    • @morphius2003
      @morphius2003 День назад

      Some youtuber did a video about Doom 3 on the Raspberry Pi 5. 720p medium settings is playable (occasional fps drops to the low tens and some stutters)

  • @MercurioMarco
    @MercurioMarco 20 часов назад +6

    hearing "python" and "natively" regarding the same thing makes me kind of uncomfortable 😅

  • @akademiacybersowa
    @akademiacybersowa 6 часов назад +1

    4:00 Immediate thought for me: adding more cores does objectively add more processing power, but at the cost of latency from workload preparation and synchronization, plus overhead (both computational and in production) from having to balance the workload across multiple cores. That's additionally hard because you can't even rely on a fixed number of cores.
    In a sense it's a similar issue to CRT vs LCD screens. Those aren't just "newer, better technology". It's a paradigm change, and it requires a different design approach to use the technology's strengths while avoiding its weaknesses.
    P.S.
    PSX games on a CRT TV look so unique from today's perspective. The PSX was the pinnacle of the gaming platforms available when I grew up. But in my teenage/young adult years I grew tired of CRTs and their issues (which are plenty!), so I was happy to switch to LCD. Now I'm happy I have both, so I can re-experience the tech of the past.
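    A minimal sketch of the overhead described above, with purely illustrative names: splitting one frame's work across worker threads adds partitioning and synchronization costs that a single-threaded loop never pays.

    ```cpp
    #include <chrono>
    #include <cstdio>
    #include <thread>
    #include <vector>

    // Stand-in for per-entity game work done on one slice of the data.
    static void process_chunk(std::vector<float>& data, size_t begin, size_t end) {
        for (size_t i = begin; i < end; ++i)
            data[i] = data[i] * 1.0001f + 0.5f;
    }

    int main() {
        std::vector<float> data(1000000, 1.0f);
        unsigned workers = std::thread::hardware_concurrency();
        if (workers == 0) workers = 4;  // hardware_concurrency() may report 0

        auto t0 = std::chrono::steady_clock::now();

        // Overhead source #1: the workload has to be partitioned every frame.
        std::vector<std::thread> pool;
        size_t chunk = data.size() / workers;
        for (unsigned w = 0; w < workers; ++w) {
            size_t begin = w * chunk;
            size_t end = (w + 1 == workers) ? data.size() : begin + chunk;
            pool.emplace_back([&data, begin, end] { process_chunk(data, begin, end); });
        }

        // Overhead source #2: synchronization. The frame can't continue until
        // the slowest worker finishes, so any imbalance becomes stall time.
        for (std::thread& t : pool) t.join();

        auto t1 = std::chrono::steady_clock::now();
        std::printf("%u workers took %.3f ms\n", workers,
                    std::chrono::duration<double, std::milli>(t1 - t0).count());
    }
    ```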

  • @chimpana
    @chimpana День назад +9

    How about the fact that we're starting to understand that photorealistic games are soulless ... stylised and even lofi retro graphics are typically much more interesting IMHO...

    • @tubeincompetence
      @tubeincompetence День назад +2

      Mostly just replying to say I agree. Personally I don't think I need any update in graphics for games I want to play

    • @cabir.bin.hayyan.800
      @cabir.bin.hayyan.800 День назад +1

      I was wondering if a game can be both photorealistic and stylised in its own way.

    • @SweetieSnowyCelestia
      @SweetieSnowyCelestia День назад +1

      First you get yourself a photorealistic render, then you stylize it. That's how it works.

    • @SweetieSnowyCelestia
      @SweetieSnowyCelestia День назад

      @@cabir.bin.hayyan.800 Look at Borderlands 3

    • @GTXDash
      @GTXDash  21 час назад

      @@cabir.bin.hayyan.800 Capcom seems to have a good grasp on that.

  • @StigDesign
    @StigDesign 11 часов назад +1

    What's also interesting is how many games actually used id Tech but renamed the engine, while others openly admitted to using it, and some built additions on top of it. Half-Life, Call of Duty and so many other games used the engine with little or a lot of modification :D

    • @Razumen
      @Razumen 7 часов назад

      What are you talking about? HL and CoD modified the engine greatly.

  • @crestofhonor2349
    @crestofhonor2349 13 часов назад +4

    10 times better hardware doesn't equate to 10 times better visuals. There are a lot of things that go into a game, and not all of them benefit the image presented to the viewer equally. A lot of smaller things are being fixed in terms of lighting and geometry that weren't quite there last gen. We've also tended to double the frame rate in a lot of titles, which is another contributing factor. There are certainly diminishing returns, but often we're trying to do things that were once restricted to offline renders or to baked lighting applied to the world. Those things can be very costly, especially real-time ray tracing, or adding physics, which isn't only computationally heavy but hard on the programming side too.
    Custom engines can be a double-edged sword too. Most developers don't have the time or resources to make one, so most of the time it's the bigger developers. Other times it can be a nightmare, like Square Enix with the Luminous Engine or EA with the Frostbite Engine. Most big companies do have their own custom engines, although they pick and choose which to use for each project and might even use commercial engines like Unity and Unreal Engine.
    Granted, there are sometimes smaller studios who do make their own engines, like Evening Star making not just their own engine for their first independent game, but also using an entirely new coding engine that hadn't been used in a game before. That game's story is fantastic, and I'm still wondering to this day why Sega didn't have them do Sonic Superstars when they did such a fantastic job on Sonic Mania.

    • @Razumen
      @Razumen 7 часов назад +1

      No, he's right: optimization has gone out the window in favor of just getting it done, which is why UE5 has so many performance problems and system requirements for so many games are ridiculously high, and even then they look WORSE than games from years ago.

    • @crestofhonor2349
      @crestofhonor2349 Час назад

      @@Razumen most games don’t look worse though. There are the ones that do but most don’t look worse at all

  • @JeydetaJosen
    @JeydetaJosen День назад +7

    Wow, what an interesting, well-made video. I love it!
    But one thing I have to say about feature-packed engines like UE5 with Nanite: I think it revolutionized how indie companies can work and pull off cool, big things that just weren't possible before if you didn't have the manpower and the budget.

    • @GTXDash
      @GTXDash  22 часа назад

      I have no problem with UE5, people just need to be vigilant as to how its features are implemented.

  • @samiiscool8181
    @samiiscool8181 2 дня назад +2

    Love your videos ❤
    I hope your game sells amazingly 💓

  • @sealsharp
    @sealsharp 20 часов назад +1

    I wish I could look into id Tech 6. From what I've read, it's made with multithreading in mind, and I'm really curious what that would look like.
    Unity does some smart things in the background on other threads, but fully using all threads ain't that simple. Though it looks like Unity 7 will improve that.

  • @Disthron
    @Disthron 18 часов назад +9

    Fun Fact: The Binary Space Partitioning system used in Doom was actually invented to fix the issues with Wolfenstein 3D on the SNES!

    • @oberonQA
      @oberonQA 15 часов назад +2

      Incorrect. The BSP algorithm was conceived in the 60s and then refined throughout the 80s and 90s.

    • @Razumen
      @Razumen 7 часов назад +1

      No, it came from Schumacker in 1969, who published a report that described how carefully positioned planes in a virtual environment could be used to accelerate polygon ordering.
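      A toy 2D sketch of the idea described above (the names and the 2D simplification are mine, not id's code): each node's splitting plane tells you which side the camera is on, so a recursive walk yields a strict back-to-front drawing order with no per-frame sorting.

      ```cpp
      #include <cstdio>
      #include <vector>

      struct Vec2 { float x, y; };

      struct Node {
          Vec2 normal;               // splitting plane normal (a 2D "wall")
          float d;                   // plane equation: dot(normal, p) == d on the plane
          std::vector<int> polygons; // polygons lying on this node's plane
          Node* front;               // subtree in front of the plane
          Node* back;                // subtree behind the plane
      };

      static float side_of(const Node& n, Vec2 p) {
          return n.normal.x * p.x + n.normal.y * p.y - n.d;
      }

      // Far side first, then this node's polygons, then the near side:
      // painter's-algorithm ordering falls out of the tree walk for free.
      static void draw_back_to_front(const Node* n, Vec2 camera) {
          if (!n) return;
          bool camera_in_front = side_of(*n, camera) >= 0.0f;
          draw_back_to_front(camera_in_front ? n->back : n->front, camera);
          for (int poly : n->polygons) std::printf("draw polygon %d\n", poly);
          draw_back_to_front(camera_in_front ? n->front : n->back, camera);
      }

      int main() {
          Node left  = {{0, 1}, 0.0f, {2}, nullptr, nullptr};
          Node right = {{0, 1}, 0.0f, {1}, nullptr, nullptr};
          Node root  = {{1, 0}, 0.0f, {0}, &right, &left};
          draw_back_to_front(&root, {2.0f, 1.0f});  // camera sits on the +x side of the root plane
      }
      ```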

  • @elchippe
    @elchippe 4 часа назад +1

    Other similar engines are Heaps, Raylib, Ogre3D, Love2D, and the Tesseract game engine.

  • @anothergol
    @anothergol День назад +5

    Not entirely true: these games were made for EGA cards, which already had scrolling capabilities. And the way it scrolls is similar to scrolling on older computers like the C64. It's built into the hardware, so... yeah, scrolling was definitely not a hack. CGA wasn't great for games, but EGA was way more suitable (and later VGA so much better).
    The reason for the required tile alignments had more to do with the memory format of EGA graphics modes.

    • @GTXDash
      @GTXDash  День назад +1

      Yes, they can scroll but only to a certain point. Tiles have to be refreshed outside of the frame that is visible. You probably already know this, but yeah, that's on me, I should've worded it better. As far as I know CGA (and I guess in turn, MCGA) didn't do hardware scrolling, but the ATR still allowed CGA to run just as fast as EGA.

    • @anothergol
      @anothergol День назад +2

      @@GTXDash Yeah, but it was still not really new or revolutionary (as much as I like Carmack, and Catacomb 3D/Wolf3D really was revolutionary); it was similar to scrolling on much older computers. In fact, it was even kind of dumb, because there was a better and easier method: letting the buffer wrap around the 64k segment, which is what Keen 4 and above did (and that was also pretty much the standard way to scroll in VGA modes). So I would say that Carmack only found out about the bad, and later the good, way of doing scrolling on his own, but he was still only doing what others had been doing. He only became a genius later, I would say :)
      So yeah, that's why Keen 4 looked much better with fewer repeating tiles. It's not that the hardware got better; it's that his trick wasn't that useful and there was a simpler way.
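      A conceptual sketch of the wrap-around scrolling described above. This is my own simplification with illustrative names, using flat byte-per-pixel memory rather than EGA's planar layout: the visible screen is just a start offset into a larger buffer, scrolling advances that offset modulo the buffer size, and only the newly exposed strip is drawn instead of copying the whole screen.

      ```cpp
      #include <cstdint>
      #include <cstdio>
      #include <vector>

      constexpr int kScreenW = 320, kScreenH = 200;
      constexpr int kBufferBytes = 64 * 1024;   // one 64K segment, as on EGA/VGA

      struct ScrollingView {
          std::vector<uint8_t> vram = std::vector<uint8_t>(kBufferBytes, 0);
          int start = 0;                        // the "display start address"

          void scroll_right_one_pixel(int world_x) {
              start = (start + 1) % kBufferBytes;   // hardware just latches the new start
              // Only the column that scrolled into view needs to be drawn.
              for (int y = 0; y < kScreenH; ++y) {
                  int addr = (start + y * kScreenW + (kScreenW - 1)) % kBufferBytes;
                  vram[addr] = static_cast<uint8_t>((world_x + y) & 0xFF);  // dummy tile fetch
              }
          }
      };

      int main() {
          ScrollingView view;
          for (int x = 0; x < 1000; ++x)
              view.scroll_right_one_pixel(x);
          std::printf("display start offset is now %d\n", view.start);
      }
      ```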

  • @ghost085
    @ghost085 День назад +6

    You may be wrong about Doom 3 performance. I remember being able to run Unreal Tournament 2004 on my geforce 4mx, while Doom 3 looked and ran terrible on that same machine.

    • @GTXDash
      @GTXDash  День назад +3

      @@ghost085 I have an MX 440. And Unreal ran badly on my system. Maybe I'm missing something?

    • @sealsharp
      @sealsharp 20 часов назад +2

      @@GTXDash @ghost085 There's a difference in shader capabilities between the MX420 and MX440. Doom 3 required DirectX 9.0b while UT2004 required 8.1. That MX4 generation was a strange one, because each of those "cheap" versions was crippled in different ways, and hardware manufacturers could also add their own flavour in how they chose the memory.
      While I can't make a diagnosis from your two statements, I would believe both of you, because that fits my memory of what people experienced with the MX4 series.

    • @ghost085
      @ghost085 19 часов назад +1

      @@sealsharp The big difference in that generation was between the GeForce 4 MX and the GeForce 4 Ti cards. I got a used GeForce 4 Ti 4200 from a friend back in the day, and it was a whole generation ahead of the MX.

    • @ghost085
      @ghost085 19 часов назад +1

      @@GTXDash I remember I could play Unreal Tournament 2004 at medium-high settings at 800x600 and at good speeds, while Doom 3 at low settings, 640x480 and with shadows disabled ran poorly. Perhaps your CPU wasn't so good; Unreal was very demanding on the CPU side. Even when magazines benchmarked Unreal Tournament, they would always show the results separately, labeled Flyby (just the camera flying around an empty map) and Botmatch (a full-fledged battle with bots), with the difference between these results being very high.

    • @GTXDash
      @GTXDash  8 часов назад

      @ghost085 Yeah, I think the tables turn when you use hardware from 2004. The GeForce 6600 was used to determine the speeds with every game set to max settings. UT2004 could have its settings set a lot lower than Doom 3, hence why in that case it could run better.

  • @Hybred
    @Hybred День назад +5

    Good video! I hope it goes viral

  • @eduardosanchezbarrios5810
    @eduardosanchezbarrios5810 День назад +4

    id Tech engines have a level editor, a simple tool that lets you build the game's environments and set up scripts in an easy way, and the maps are compiled in a way that optimizes them and generates occlusion culling... but modern engines don't do things that way. The focus on building environments only out of static meshes makes editing harder, because if you need to modify the map geometry, a 3D artist needs to make that change, whereas in id Tech games the level designer can do it quickly and easily because it was a brush-based tool.

  • @BlueMesaCable
    @BlueMesaCable 19 часов назад +1

    Game Maker is quite optimized for drawing sprites. The last couple of projects I've worked on run at a "real" FPS of over 1000 when compiled (though capped at 30), and include a small bit of 3D as well. Granted, I was working on them on my gaming PC, but they run fine on cheaper, older computers without dedicated graphics, in the hundreds of FPS. Game Maker has been around since '99... running on 90s / early-2000s hardware just fine (I remember playing Game Maker games as a kid, as well as the demos that came with it). Unless you're drawing many hundreds of dynamic objects, the FPS of a pixel-art side scroller shouldn't dip unless there is something seriously wrong.
    Maybe "non-code" Game Maker is slower, no idea about that, but I seriously doubt a single professional game has shipped using much "code-free" Game Maker slop.

    • @lloyd011721
      @lloyd011721 18 часов назад

      What would cause that game to have such FPS drops on the Switch then? Just poorly written/managed code?

    • @BlueMesaCable
      @BlueMesaCable 18 часов назад +2

      ​@@lloyd011721 Honestly no idea, the game doesn't appear to have shaders or lighting. It is possible to draw things offscreen without restriction - like drawing the entire level regardless of the viewport and not disabling / deleting things outside the viewport. That would be kinda an insane oversight, and the dips seem to appear only in combat. Maybe porting is terrible in GM, haven't heard that on the forums ever - but if it was terrible for one console it'd be the switch lol.
      Edit: I also just remembered some people do pixel-perfect scaling by literally tripling or quadrupling the size of the images, not by scaling. I have used actual scaling; that's another possibility, but it couldn't explain it completely.
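      A minimal sketch, with hypothetical names, of the viewport culling mentioned above: test each object's bounds against the camera rectangle and skip anything off-screen before it ever reaches a draw call.

      ```cpp
      #include <cstdio>
      #include <vector>

      struct Rect { float x, y, w, h; };

      // Axis-aligned overlap test between an object's bounds and the camera view.
      static bool overlaps(const Rect& a, const Rect& b) {
          return a.x < b.x + b.w && b.x < a.x + a.w &&
                 a.y < b.y + b.h && b.y < a.y + a.h;
      }

      int main() {
          Rect viewport{0, 0, 640, 360};
          std::vector<Rect> objects{{10, 10, 32, 32}, {2000, 50, 32, 32}, {600, 340, 64, 64}};

          int drawn = 0;
          for (const Rect& obj : objects) {
              if (!overlaps(obj, viewport)) continue;  // skip everything off-screen
              ++drawn;                                 // a real game would submit a draw call here
          }
          std::printf("drew %d of %zu objects\n", drawn, objects.size());
      }
      ```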

  • @kalei91
    @kalei91 2 дня назад +1

    Excellent video! Very informative and very well explained, keep it up!

  • @ivucica
    @ivucica 20 часов назад +1

    10:49 “as heavy as Godot” - …I’m surprised a bit, when did Godot become heavy? I never had the impression it’s bloated or slow, especially if “held correctly”.
    (Last time I was working on a full game, we wrote a custom game loop and an OpenGL ES renderer, no engine as such; but if I went back into gamedev today, and wasn’t just experimenting, Godot would probably be the first thing I’d consider performance-wise.)

  • @HereIsZane
    @HereIsZane День назад +1

    I wish one of the latest versions were leaked. Fingers crossed.

  • @dont-be-hasty
    @dont-be-hasty День назад +1

    Amazing essay on a very important (and cherished) piece of gaming history: id Software. Thank you!
    I'm curious what porting to consoles looks like for your game, which has non-C/C++ code in it. Last I checked (which was never; I've only heard rumors on Reddit and Discord), Sony and Nintendo do not allow games _not_ written in C/C++ on their platforms.

    • @sleepy_femboy0
      @sleepy_femboy0 День назад +1

      Well, Unity used Mono to run C# on consoles. As for Python, homebrew devs have already gotten Python working. But in theory, as long as the user can't hack their hardware or run their own code using the game, I don't see why they wouldn't allow it.

    • @GTXDash
      @GTXDash  21 час назад

      If we can make a living on selling games, my co-partner will be able to quit his job and work on it full time. We would probably rewrite the logic in C if we can't get the Python code to work.

    • @sealsharp
      @sealsharp 20 часов назад

      Some companies (like Apple) do not allow just-in-time (JIT) compilation for security reasons, so C#/.NET for Apple platforms is provided ahead-of-time (AOT) compiled. Unity has its IL2CPP mechanism, which converts the compiled C# bytecode into C++ source code, which then gets compiled into a binary.

  • @P1XeLIsNotALittleSquare
    @P1XeLIsNotALittleSquare 16 часов назад +18

    6:50 Technical debt doesn't appear out of nowhere by itself; it is a direct consequence of bad coding culture.
    7:58 It's pretty easy to bring any modern CPU to its knees. I mean, put in 3 or 4 nested for-loops and even a 14900K will suffer.
    9:00 If you don't use a feature, that code in the engine doesn't execute, so it won't hurt performance in any way. A few kilobytes of RAM for a feature is not a big deal nowadays.
    26:38 I highly doubt that _all_ 3D engines use something from id Tech...
    34:10 Don't be sloppy; it's not marketing, it's software development.

    • @jvne_
      @jvne_ 16 часов назад

      Or just not being ready to modify an existing open-source engine, i.e., as you mentioned, a skill issue.

    • @sourkefir
      @sourkefir 11 часов назад

      UE is opensource for free, you just have to sign an agreement.

    • @skaruts
      @skaruts 8 часов назад +2

      *_"If you don't use a feature, the code in engine doesn't execute, so it won't hurt the performance in any way. Few kilobytes of ram for a feature is not a big deal nowadays."_*
      That's not quite true: the engine has to be built around that feature or with it in mind, which may dictate how other systems interact, and it will have to perform checks in many places to see if that feature is being used. Plus, that feature may still be exported with the game, adding to the file size.
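      A tiny illustration, with made-up names, of the kind of check being described: even a disabled feature leaves its branch (and the code behind it) in the shipped binary, and that branch still gets consulted every frame.

      ```cpp
      #include <cstdio>

      struct EngineConfig {
          bool volumetric_fog = false;  // feature shipped with the engine but unused by this game
      };

      static void render_frame(const EngineConfig& cfg) {
          // Even with the feature off, this check (and the code path behind it)
          // still exists in the binary and runs every frame.
          if (cfg.volumetric_fog) {
              std::puts("render volumetric fog pass");
          }
          std::puts("render opaque geometry");
      }

      int main() {
          EngineConfig cfg;  // fog left disabled
          for (int frame = 0; frame < 3; ++frame)
              render_frame(cfg);
      }
      ```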

    • @P1XeLIsNotALittleSquare
      @P1XeLIsNotALittleSquare 8 часов назад

      ​@@skaruts well... yes and no. Some features are integral for the engine, at this point its about choosing right tool for the job. But many features are optional. And runtime checks isn't big deal thanks for branch predictions and other fun stuff)
      The Cities: Skylines (the first one) devs rewrote the entire renderer and the game runs just fine.

    • @azazelleblack
      @azazelleblack 7 часов назад +2

      Right. It's a poor craftsman that blames his tools.

  • @Bunnunoox
    @Bunnunoox День назад +3

    This is the most informative ad I've ever watched, haha. Genuinely a great vid though.

  • @Novastar.SaberCombat
    @Novastar.SaberCombat День назад +5

    A lot.

  • @JuanxDlol
    @JuanxDlol День назад +2

    29:13 - Not really, it's id Tech 3.5 actually. They couldn't cut enough features from 4 to make the port to the 2001 Xbox, but they could port the main stuff from id Tech 4 back into 3, thus creating id Tech 3.5. That's also a testament to how well made id Tech is.

    • @GTXDash
      @GTXDash  День назад +2

      If that's true, then that's incredible, because... yeah, the sharp per-pixel lighting is still there.

  • @jc_dogen
    @jc_dogen 2 дня назад +1

    Crytek wasn't just ignorantly expecting single-core speeds to improve. IMO that's kind of a thoughtless assumption to make. Those guys weren't stupid; even to the average gamer in 2005-2006 it was obvious how CPUs were changing. The reality is that multi-core programming is a difficult problem (even more so at the time), and some of the optimizations they made to improve CPU performance simply reduced the workload of multithreaded tasks.

    • @GTXDash
      @GTXDash  День назад +3

      I'm not saying Crytek are a bunch of hacks. Building an engine for release years after the code is set in stone is a risk that all engine developers faced, especially in the 90s and 2000s when technology was changing so rapidly.
      Crytek didn't start working on the engine in 2005, more like 2003 (depending on who you ask). It took a good while to build it.
      CryEngine 2 did receive treatment to run on dual cores since they were aware (during the early stages of the engines development) of the upcoming Athlon X2 and Pentium D, but 4 or 8? No difference.
      Again. It's not their fault. How does one build a game engine for technology that doesn't exist yet?

  • @rafinery576
    @rafinery576 44 секунды назад

    Yup, overheads today are ridiculous, and I'm a bit fed up with the latest tech working well only in demos.
    Meanwhile, there are no dedicated commercial solutions for RTS, MOBA, Larian-like RPGs, fighting games... and so on.

  • @Mautar55
    @Mautar55 День назад

    34:47 - I would add tools and features as extensions so they can be unplugged if not needed. As for visual style, I would rather call it a visual fidelity target range.

  • @ikcikor3670
    @ikcikor3670 День назад +5

    Nah, I'd tech

  • @forasago
    @forasago 4 часа назад +2

    27:37 "no game handled dynamic lighting the way id tech 4 did... until FEAR"
    NOPE!
    1. Shrek had shadow volumes on Xbox on November 15, 2001. It used them for directional shadows.
    2. For comparable visuals (shadow volumes used on many local light sources) Doom 3 wasn't first either! Starbreeze's Riddick: Escape from Butcher Bay had virtually the same aesthetics and actually ran even better than Doom 3. And it first launched on Xbox on June 1, 2004. Doom 3's port only released a year later (April 4, 2005). And on PC Doom 3 came out on August 3, 2004 and Riddick followed on December 3rd. Nobody waited for FEAR. By the time FEAR came out the entire industry had already written off shadow volumes as a gimmick due to how poorly they scale.
    P.S. This is nothing against you personally but it's one of the great injustices in gaming history that Carmack constantly gets credited for shadow volumes (and some other things like the fast inverse square root) when he didn't do them first or best.

    • @GTXDash
      @GTXDash  2 часа назад +1

      Shadow volumes have been around since id Tech 3 (and 2, depending on who you ask). That's not what made id Tech 4 unique: it did it at a global scale, and none of it was static. Freaking Shrek didn't do that; only select objects cast volume shadows.
      Riddick looked great on Xbox, though not as dynamic as id Tech 4, with simpler environments and shadows. But still impressive.

    • @rodneyabrett
      @rodneyabrett 9 минут назад

      Riddick was the first game I remember that used normal maps, and the devs even put out a video explaining the tech. I remember thinking that normal maps would not stick around as a technology because of how time-consuming it was to make the model a second time, just to steal its light information. lol.. I was wrong about that. 😂
      ruclips.net/video/SQrHkKnSBcA/видео.html&ab_channel=MarianoMartinez

  • @Zombiefruit
    @Zombiefruit 20 часов назад +8

    Please don't quote the scammer that is ThreatInteractive. He's not wrong, but he wants $900,000 to hire devs to fix Unreal. He's a pure grifter. And let me sum up the video for you: the law of diminishing returns (plus some rose-colored glasses). Just look at the Silent Hill 2 remake: the in-engine cutscenes look far better than the pre-rendered cutscenes from the original.
    Blaming game engines and cherry-picking examples makes for a poor argument.

    • @ZumoDePapaya
      @ZumoDePapaya 20 часов назад +3

      The funniest part is that ThreatInteractive is supposed to be his studio's name, and he pretends it's a real company when it's extremely obvious it's just him.

    • @JackWse
      @JackWse 19 часов назад +3

      Say what you will about Threat Interactive and the wishes they wish for.. they're not bad things, and they do actually know what they're talking about (assuming it's actually a "they"). If you're judging just on visual quality, that's really missing the point lol.. and if you're comparing it to a PS2 release whose rendering was a miserable labor of love that barely worked, partially because it didn't.. I've got nothing on that one, it should be kind of self-evident.
      Unreal is a stock engine, designed for an audience with increasingly little awareness of actual technical capabilities, trying to be one-size-fits-all for people who won't even check a box, let alone allow the user to check the box lol.. and this is consistent time and time again.. and there's a reason for that: engine techs are not common anymore. You can get paid a lot of money to do things other than games lol, and people with those kinds of skills usually want to eat.
      I think Threat has presented a fairly reasonable case for what they would like, and as to whether or not it's doable.. that's a.. well, who knows, I don't.. Carmack probably would, as long as it doesn't have anything to do with sound.
      But.. from your very specific example lol.. I have a sneaking suspicion you don't either.

    • @JackWse
      @JackWse 19 часов назад +1

      Also, you're not wrong about the diminishing returns thing.. but that's kind of where I think you might have missed Threat Interactive's point.. The point is that it's diminishing returns... so don't do it... and certainly don't make it the standard. Or, you know.. maybe this wasn't as emphasized, but I'll throw in my own: maybe try to educate devs on which boxes to check instead of trying to encourage them not to.
      There are a lot of GDC talks that Epic has given, and a lot of them make me want to throw up in my mouth when it comes to the nitty-gritty.. mostly because I know that what they say is taken as gold for what people are going to do with the engine.

    • @JackWse
      @JackWse 19 часов назад +1

      That said, this video already doesn't really know what it's talking about either.. He's making a lot of assumptions about things you could actually just find out from interviews, and a lot of generalizations that I think are damaging, just in terms of public perception and inevitable groupthink.

    • @GTXDash
      @GTXDash  8 часов назад

      @Zombiefruit he gives hard data and his methods can be recreated so you don't have to take his word for it. However, I just want to point out, when I make a reference to something, that doesn't necessarily mean I agree with everything. There are many youtubers I reference that ironically disagree with each other on certain issues.

  • @alfosisepic
    @alfosisepic День назад +2

    I love your profile picture.

  • @RedSntDK
    @RedSntDK День назад +3

    26:33 You mean "undocumented features", right?

    • @kphuts815
      @kphuts815 День назад

      lol, it took me a bit to understand this

    • @GTXDash
      @GTXDash  21 час назад +1

      Of course.