PLEASE READ FOR UPDATES & RESPONSES(and discord link): 1. Watch this video in 4K (as in streaming settings) as any lower compression will hide the visual details discussed. 2. We would really appreciate it if viewers could share on giant game related subreddits. These prevent self promotion but these can really push our channel's success by the thousands. 3. TAA commands though we suggest FXAA since Lumen really doesn't work well with it. r.TemporalAA.Quality 2 r.TemporalAACurrentFrameWeight .6 (with vsync) r.TemporalAASamples 2 r.TemporalAAFilterSize 0.09 r.TemporalAA.Upsampling 0 r.TemporalAA.R11G11B10History 1 r.TemporalAA.HistoryScreenPercentage 100 r.TemporalAACatmullRom 0 5. We would like to invite you to the official Threat Interactive discord: discord.gg/7ZdvFxFTba This will be the main discord for engine development, news, and Q&A support for developers. 6. Some people seem to have issues with the comment made at 4:25 | Yes, SH2R uses one of the first versions of UE5. This is still disgusting quality to have been included in a official release of the engine and these issues are still present in the newest preview of UE5.5. 7. We do not care about ignorant statements auguring about our decision to zero-in on UE5 exclusively. It doesn't matter if Unity has a higher market share. The people complaining about a "monopoly" are failing to understand it's already a monopoly! And the monopoly includes the major games that are trying to push better visuals. The list of UE5 games using these poor designs have grown too much! We intend to push graphics as well and show the example of visual quality we expect from an overly supported engine such as UE5. We already know improvements to UE via forks impact the industry as seen with The Finals. That was a performance enhancement, the industry also needs a visual enhancement and our videos prove it will not detrimental to performance on 9th gen console hardware. If you are paying funds to engines like Godot, don't expect anything to get fixed in the games you'll actually be playing. 8. Regarding the comments made about the hair in 4:45. What should been have said the hair looks better in comparison with the traumatic dithered variant. In reality it could look much better, without too much performance cost and most importantly being TAA independent as shown by frostbite developers here: ruclips.net/video/ool2E8SQPGU/видео.htmlsi=_yjztNp5PBIeZkA2 9. Stay Tuned*
I'm so glad you are making these videos. As a game dev it was so frustrating to see influencers talking out of their ass. I was looking like "old man yelling at cloud" when I talk about things like tessellation and taa
The irony of not using the fog to their advantage to hide low quality LOD for performance purposes, when that’s literally why it existed in the original game.
Irony or not. Two completely different mechanics. In the original game it wasn't "fog" just floating textures. This is volumetric, millions of particles rendering at the same time.
@@Darkcranio Fog doesn't use that much resources, there's literally a performance breakdown in the video. Just sloppy devs being sloppy devs. I couldn't believe that the game with a static environment like that and the fog advantage still stutters horribly and has UE load stutter.
@@Darkcranio Volumetric fog doesn't/shouldn't take millions of particles to render. Volumetric fog is generally raymarched in post, with particles or drawn objects being used to populate a 3D texture or some other data structure with information about how dense the fog is at a particular location in the world, as well as maybe the material properties of that fog.
@@disseminate4it has been that way since UE4. the goal is to make people with less technical knowledge about game engine can create their own game more easily.
@@disseminate4 It's Nvidia's greedy cooperate influence they wanna sell GPU's they wanna become just like Apple and sell GPU's every year and if games were optimized they wouldn't sell new GPU's. Nvidia has alooooot of money, they can pay off developers and even game companies.
crazy how games used to look ok on lowest settings, and bumping them up added fidelity. now max settings are baseline, and by reducing them you are subjected to pixel vomit.
From what I know you can't straightforward bake lumen lighting, it's just meant to be dynamic. But you can use static lights instead of moveable when you can to reduce the real time calculations
Baked reflections never hold up as well as Lumen. Lightmaps can also only be so high resolution. Per pixel lighting on paper would provide more precise GI than baked lighting, though obviously it's vastly more expensive and in this case a lot more noisy.
Ey, that’s me. Thanks for all the work yall do. It’s always got me scratching my head why games have been so demanding. The ghosting too with reliance on TAA in particular bothers me too. I think we are in an awkward middle ground for graphics and optimization that we will end up getting past, but not yet. Actually hopeful we will- idk if UE5 is in with Nvidia and AMD to push their upscaling tech and raytracing. Another game that could be fascinating to take a look at is Dragons Dogma 2, the game doesn’t seem to do anything ground breaking (maybe except for a dynamic day night cycle), but it is SO hard to run. Especially at native resolution, the game defaults to progressive upscaling. I also notice tons of dithering in hair and foliage. Sucks with that one too because DD2 is a fun game that shouldn’t be having these problems.
We are glad to see you here and find you in support of the topic we're covering. You're more in the category of "overlooking" rather than "refuse to address" but that's just because there isn't enough information about what we cover so no hard feelings! We are getting support from viewers because other channels we showed are making shameful comments regarding the quality of upscalers and we hope our content helps you others spread more correct information about what's really going on. We 100% encourage you to view all our videos since I think they would answer a lot of your questions and help your takes on some things in the future. Thank you for the kind support and title suggestion!
@@ThreatInteractivedw abt it man, I didn’t take any offense. I’ve actually been watchin the content since yall started the channel. I am usually pretty positive about DLSS, but mostly compared to FSR, XeSS, and TAA, because DLSS is just the best at what it does. However, I’ve found it weird the absolute reliance on upscaling to get anything done now. Again thanks for all yall do and gl on the game. All this will just make games run better and I find it really interesting. I might go ahead and shoot yall an email and we could talk sometime. I think it would be fun
14:05 I get the pragmatic reason to focus on overhauling Unreal, but the FOSS ideologue in me feels compelled to point out the effort in Godot and other FOSS projects isn't wasted just because they're not used in industry. I think it's important to have a free/libre floor at least to keep pressure on proprietary, privately owned projects to not screw over developers who trust and depend on them (see Unity's "charge per install" attempt or Microsoft's AI/privacy abuse with Windows).
yeah its very narrow minded to dismiss smaller engines which are what the indie and ever shrinking AA scene thrives on, i feel any amount of competition is good and we should just give unreal a monopoly in games cuz "its the best we have"
I think it would be great, if Threat iterated more on that point, because I feel like their stance can easily be misunderstood. I assume they're specifically talking about AAA productions, where Unreal is simply ubiquitous. And it shows by the sheer volume of bigger titles released on that engine. I'm by no means an expert and other engines are certainly very capable, but there must be more reasons than just documentation and support, why almost no team uses engines like Unity or Godot to develop large scale games in the vein of a Battlefield or open world games. Unity game releases like Escape from Tarkov (BSG really struggles with this one), BattleBit or Cities Skylines are few and far between. And for such huge productions, I've mostly seen devs either using Unreal or their own proprietary engine such as Frostbite. And visually I start to dislike UE more and more.
@@CyboTyaS My understanding is that if you're making a game today with high fidelity graphical targets and don't want to make your own engine (or do comparable engine work), Unreal is basically your only option. Unity can look just as good, but not out of the box like Unreal. Godot is functional for 2D games, but sub-par for 3D. There's also the matter that if everyone is using Unreal, companies can easily find employees already familiar with it, saving on training time/costs. Good documentation and community support also shouldn't be underestimated. The behind the scenes reports of bad releases often include tons of loss due to engine and interface issues (Anthem and Mass Effect: Andromeda come to mind with Frostbite).
@@stormburn1 The _actual, core_ reason for using Unreal is simply that large publishers prefer having one single engine that everyone is trained in - so that when they fire 20% of their developers to 'reduce costs', they won't have to spend months training the replacement developers they hired for half the money straight out of school. A highly experienced dev team could have avoided some, most, or all of the issues discussed in this video even if they were forced to use Unreal. But experienced developers cost _money,_ and therefore get _fired._ Such an experienced dev team could have even made this game in Godot if they wanted, modifying it for their needs as required (that's the real best thing about Godot btw, it's extremely easy to work with) but with Unreal you already get so many features out of the box that you actually have to turn many of them _off_ first. This is why a fresh full Unreal Engine install is _60GB_ and a fresh full Godot Engine install is _55MB_ (yes with an M).
@@stormburn1 "but sub-par for 3D" Don't agree, on the hands of good developers, Godot is a very capable 3D engine, proven by the fact that, The Road To Vostok maker changed from Unity to Godot 4x and was able to still keep most of the visual fidelity. And don't assume that is only Godot 4 making it possible, there's a guy making a visually impressive 3D horror/stalker/Fear inspired shooter in Godot 3x as well, called Leaden Sky. A engine is only as good as the person using it. And a good experienced developer can create impressive games in tools many people think are bad.
It's baffling to see bad practices like that used in a AAA game. Using fog to hide imperfect rendering is something old games were doing from the near start. I mean, the first Silent Hill was doing it ! It was the perfect occasion to use it. Also, geometry LOD is a basic optimisation, combining both would have been perfect for this game. Looks like Wirth's law is becoming a threat rather than an inconvenience. And it's also baffling to see TAA to be used for nearly everything. I wonder if it's related to the fact UE5 use defferred rendering by default . Because, if i'm not wrong, it's the only form of light AA working with it, with SSAA (MSAA don't work with deffered rendering). I'm pretty sure you won't have that kind of problems with the next DOOM Game, which use a forward rendering engine.
The abusive use of TAA is far from using "deferred". We have deferred games that use MSAA, SMAA, SRAA, reasonable TAA, and more. It's just that: 1. Most developers do not care about the image quality issues or simply like the blurry upscaler look(we've seen the praise). 2. Investors/high level employers do not care to set standards or pay to development of better approaches.
@@ThreatInteractive Do I understand correctly that there is difference between UE 5 TAA and "reasonable TAA"? Where would you place modern demanding TAA methods like DLSS/FSR/XESS (they take a lot more frametime, but seem to look good). In BG3 default TAA and upscaling seem to look a lot better than SMAA.
Your right! The default hair was so traumatizing is made the less dithered hair look better. The best hair shown in real-time rendering is actually done independently of TAA! ruclips.net/video/ool2E8SQPGU/видео.htmlsi=fV3W6LuI0eCTq3cz (frostbite)
24 дня назад+31
And this is why I think people are really underestimating art style. Sometimes, simpler render engine pipelines with a better combination of textures and models is enough to make those visions pop without burning your house down.
I already hated UE5 being adopted by developers. Forcing people to buy video cards ever-increasing in price for graphical improvements that get overshadowed by a smeary picture and poor performance. Now I can articulate it even better!
You can't. This is all in the short term, the hardware will become better and these games will look much better in the future, because they are not pooping in and out optimised geometry.
Its parallels the movie industry where they now focus on big budget movies full of CGI that fail because they just use money to attempt to substitute creativity.
It's like everyone sees those 4k 4090 tech demos and thinks that's what unreal engine is but that is with a 4090 at 30 fps going through RUclips compression. But in reality it's a lot flashy features like Luman, Nanite, and ray tracing that get it a lot of attention and praise but then you look into it and it sucks and most other engines can probably do the exact same quality without the crutch of Nanite and other unreal engine stuff.
I simply do not buy UE games unless I know the devs basically reverse engineered the whole bloody thing and managed to make a game that is properly optimised (Lies of P for example)
It's not the fault of the engine itself (the tech is actually really good) it's the fact that devs are choosing to ignore optimization because "ehh Nanite and lumen is enough", especially since older games that run on UE3 like the first three Arkham games actually perform really well while looking really good. The "auto optimization" that UE5 has just makes them think that they don't need to optimise
@@lollingrock Are you serious?! The devs test the games on different hardware. There is no way they couldn't have seen and felt first hand that the "auto optimization" does fuck all and the games run like partially frozen molasses.
It’s bizarre isn’t it? feels like finally a professional is voicing and demonstrating what us gamers have been complaining about for years, with absolutely zero alterior motive other than exposing and educating. This guy makes Digital Foundry look like shill amateurs, I’m not sure if that’s why they really are, I might be wrong, but the difference between the two approaches is gigantic, one feels like truth, the other feels more like an advert.
@@George-um2vc digital foundry is pretty honest from what I've seen. They often don't always have nice things to say because so many games have issues like the ones listed here.
@@SoulbentAnime that’s true, they are more critical than many give them credit for, but I prefer this style of analysis, although it somewhat depressed me (since all future games are essentially doomed for the foreseeable future) and I will likely be playing Witcher 3 and RDR2 till things change.
@@Gustaviustwinkelberry they said jedi survivor was the worst pc port of the year. I don't think that's downplaying it. This dude is definitely more in depth obviously.
This game would be perfect for baked lighting. Much better performance and since there's no time of day or anything like that I dont see the point of GI. Baked lighting could potentially look better and have more control over the atmosphere
If you want everything to look cohesive and firmly planted into the game world, you actually don't want to do baked lighting. Dynamic objects and dynamic lights (Jame's flash light, certain lights changing color or intensity in later areas) don't blend well with pre-baked lighting. Also working with Lumen is so much faster for artists, they can make adjustments in real time and testing dozens of iterations a day, rather than needing to wait for the lights to bake (which can take anywhere from 5 to 45 minutes per iteration). This also leaves the door open for modders to do cool things with the lights.
@@03chrisvThe point about iteration speed is a very good one, but using the full pathtracer could be used for development, and baked lighting be calculated every once in a while to make sure that the final results are still what you’re expecting. Won’t change the issue of flashlights not really playing nice with that ofc, but even that could be worked around by fading pre-baked lighting out near the camera, and only using lumen for closeup stuff I’d imagine.
@@03chrisv Bounce lighting for flashlights is unironically the easiest thing to do. You just sample a few colours at where the beam is hitting and create phantom point lights there. The Last of Us - the original one, on the PS3 - did this.
@@AlphaGargthere's also not using GI for realtime lights, like Unity with directional lightmaps, which blend realtime direct lighting with baked indirect.
GPU companies make money from telling you that your graphics performance is bad and you need to upgrade. They're financially invested in not pushing for optimizations by developers. That's pretty much it.
@@arenzricodexd4409have you played DOOM and DOOM eternal ? I see no one complaining about those games. They look nice, and perform well. As they should, since the gameplay is fast paced, and the environment not so open-world. But these days, you can't take that for granted
@@etienne1062 people praise doom for it's graphic but in reality they did not really push it's graphic that hard (already play both). So it is faster to run. Plus it is one of the popular franchise out there sometimes they got a pass like in the graphic department. But others? Just look with Space Marine 2. I see the graphic is just as good as Doom if not better in certain aspect but people complaining because despite it's good performance developer are not using things like high resolution texture. Same with black myth wukong. Dev did a good quite a good job where their low setting really mean low setting since you can really see the difference but then people still complaining about low res asset.
@@arenzricodexd4409 With good reason.. Look at CS2, Valve hasn't even properly optimized this easy looking game .. these large companies have become complacent
dude that first minute in your video was impressively good and exactly what i want to hear when i see a thumbnail like yours thank you so much for what you do stranger!
I have next to non of this on a ps5 in quality mode besides a little ghosting on fog and obviously 30fps. Perhaps pc is suffering more with this particular game?
It's just like a well shooken Polaroid... That was out of focus... Of a moving object... And then you tripped while you were taking it! Photo realistic!
Ok but AA in games is still garbage like TAA, nothing like the good old days, instead of distance changing the effect of AA it all is blurry/waterpaint and the end result is like looking at 2D. Modern problems self created, and worse is TAA is baked into games
@@neilranson4185 it has absolutely 0 to do with platform. PS5 is basically a pc. PS5 is not a console like PS1 or 2 where they had their own tech for FX or post-processing. Whatever issue you have on PC, you will have in a PS5. And if you don't, then it's because the devs adapted the game to the fixed hardware that a console is.
It's so weird to point out the problems with UE and then discount alternatives, why put effort into improving someone else's commercial product you're literally doing free work for a multi-billion dollar company that doesn't care about you at all.
To me it seemed more like a "We know how the AAA industry is shifting towards UE5 so we'll focus on UE5". These will be the games that will "force" people to get new GPUs. The smaller AA or indie market was never and it's probably not gonna be the one showing the issues present in this video. But I totally get your point since I'm also an advocate for FOSS
Absolutely. If Threat Interactive went and put their efforts into improving Godot, they would be helping to build a Godot into a powerful alternative to Epic's monopoly. Godot isn't reliant on temporal solutions like Unreal is. It's easier to build on what's already there, rather than spending months fixing Epic's mistakes. A powerful Godot would democratize game development and shatter the hold that large corporations and proprietary engines have over the space. But if Epic decides Threat Interactive's efforts on a custom version of Unreal are detracting from all the money and advertising they've poured into Nanite and DLSS, they could very well shut the project down and ensure their work never again sees the light of day. None of these solutions that Threat Interactive talks about are going to help Epic make more money, so at best they're inessential, and at worst, an tangible threat to Epic's "innovations".
@@jwhi419 Lumen, Nanite, etc. are all on by default. And a lot of indie devs just leave them on, not not knowing what they are or what they're intended for, leading to games that are horribly-optimized for no reason. It's easy to say it's the devs' fault, but that ignores Unreal's responsibility in teaching its tools to devs in an intuitive way...
I understand the issues with TAA, specifically with the most basic and bad implementations, because there have been improvements made (by Guerilla Interactive, for instance). But honestly I've never seen SMAA handle specular well, and that includes Frostbite. Specular in NFS 2015 just doesn't look nice, for instance. It shimmers and shifts. While subjective, I'd still go for TAA, but ONLY at very high framerates and rez because of the flaws it brings and its inherent temporal and rez-bound nature. TAA at 30fps or 1080p is just a thing I never want to see, personally. As for TAA being an excuse for poor implementation of things like AO, transparency and translucency - this here I can get behind 100%. Don't like the trend of everything being built around that. A side note for Unreal and its overall lighting implementation... You know, it's funny. I've worked with UE since it's version 2 days and it's always been like this with Epic. First, they insisted on Lightmaps for years (for obvious reasons at first), offering very poor alternatives or stuff that is just not enough or never leaves beta implementation (Hello, LPVs!) . So, for the longest time, it was Lightmpaps or bust. It's not that they are bad, per se, but it's just not optimal for all scenarios either. Then comes around UE5 and now we see the exact opposite. Away with the lightmaps, welcome to Lumen. No other less resource-intensive solution. I just don't get it. Why is Epic always like that when it comes to lighting solutions. They have such amazing tools for so many things but their lighting pipeline has been so damn stringent for decades. Forward vs deferred - another story altogether. It's like a hypetrain... We have Doom Eternal's ID Tech working on forward rendering and doing miracles. But no, "outdated technology". And this brings with it some many pains too. I'm still waiting for UE to implement custom-mesh GBuffer decals (not to be confused with simple masked custom-mesh decals or decals with a mask-dither pattern+AA solution, bleh...). It only supports GBuffer decal actors with this functionality, but not custom meshes. So, pipelines for materials like those used in Alien Isolation, Doom Eternal, etc. are just not available to you in the same capacity. UE is often being lauded as being so flexible. And quite often it's deserved. But in other cases - not at all.
SMAA should only be used for basic staircased aliasing/input(for spatial upscaling, which is what FSR1 lacks). For specular aliasing there are many solutions such as filtered mipmaps such as the ones offered in cryengine, extremely cheap specular AA as seen in Half Life Alyx and CS2(or the Square Enix version), and a whole new approach we'll be covering soon. And of course, we're not against sub pixel sampling via more intelligently implemented MSAA or Temporal Subpixel Sampling like 9:20 but without "Standard MSAAx2" sampling and infinite frame holding accumulation buffer. Lots of aliasing issues are *specific* and have their own solution. The problem right now is getting all those solutions into one engine the industry uses, so that they can be combined. We would like to see you in the discord if you don't mind listing more issues you've personally noticed with UE. Thank you for your comment.
Please check Gears 5. It is a Ue4 game but still one of the most optimized game of all time for me. While we know most of the Coalition's staff are ex-Ue developers, what did they do to the Ue4 pipeline to get good optimizations that the industry still can not achieve.
@@progste does not explain anything. UE4 is famed for its various types of stutters. Jedi survivor is a UE4 game and it absolutely runs like crap. No nanite, no lumen, just crap.
@@etienne1062 stuttering on UE4 wasn't universal, especially in the late games. I haven't played Jedi survivor but it probably runs like crap because they did a bad job of it.
@@progste ue4 is not more optimized, the engines are essentially the same thing, in fact as of 5.4 ue5 is more optimized. Nanite is just an optional feature that you can choose to use in ue5
I agree with you, GEARS 5 is one of the best optimized games out there, it has excellent graphics and rarely stutters. The problem with developers is that they often use scalers and frame generators, why optimize when I can use upscalers? Plus they have very little time to develop their games because of their bosses.
After seeing every gaming type of media praising this game on release it's actually refreshing to see someone making valid criticism to the game, now that the dust has settled i can say that i didn't really like what was delivered, and i think the original still gives you a better story and narrative elements than the remake.
typical of people to get overhyped like fools and were mainly praising it's graphical enhancements. always about graphics... it looks flawed anyway. the original wasn't about combat either and they made it a sole purpose in the remake. a try-hard dark souls esque rip off
It feels like many were never around in the day we had to hold the face-count down, we could not "cheat" the performance via external means or say "just get a better computer bruh". I did a bunch of map making in the late 90's for my LAN party crew and learned a lot from the optimization we could do back then, the Unreal engine blew my mind with how it worked and performed compared to Id's BSP engine games. DLSS getting in via "deal-making" makes a lot of sense with how quick easy and free FSR is but you never see it, while games have every single envyDiaf thing in existence jammed in there. Ick. Then: I'm making a hand-rail, I know the player will see it in third person view. I will make it on a cylinder with 16 sides o ensure it never looks off for anybody but the most sharp-eyed. Now: I'm making a hand-rail, my nose is up to the screen to ensure it looks photo-realistic, even the booger-welds will be be smooth, the game engine will automatically make the LOD models, surely.
"how quick easy and free FSR is" FSR looks like complete shit, end of story. it's completely pointless. at least DLSS has a valid use case. the fact that nvidia "bribe" developers to implement it isn't as incriminating as one might think either. I highly doubt any developer intentionally makes games perform worse in order to justify DLSS. remember, the consoles have AMD graphics and are always the target platform for anything multi-platform, so if the games run badly on AMD hardware that's just a case of incompetent developers. no way nvidia pays them enough that they would jeopardize how the game runs or looks on console.
@@forasago Correction* FSR (yes 1.0, this is even the worst case scenario too) looks like bad at any native resolution below 1440p set lower than quality on fsr. Its not total shit like you claim but it hardly holds a candle to DLSS 3.0 or newer.
Not a Dev, but in general just a curious person, so I appreciate this channel and the awareness it provides. I learn a thing or two while watching these videos tho. Thanks for the insights :)
After replaying the OG just recently, it's even more embarrassing seeing how the PS2 original's fog seems more dynamic and "living" than the 2024 version. The original Team Silent also stressed that they wanted the fog to truly "surround" the player, not just be a wall in front of them. Yet the SH2R clearly has an empty "bubble" around the player, that blocks the mist from getting close to James (best seen in the static camera angles mod demonstration videos).
Yeah, i'm finding way too many UE5-based games to perform really bad on a 4080 Super. DLSS Quality is almost always a must, not to mention removing the over used post effects (piss filter) and blur etc. When DLAA is realistic, it's always nice to see, as it's usually ok'ish. But the issues you highlight make it more clear why this is the case. I game at 1600p UW.
@@spyrochrisgaming yes, 144Hz. Running it at 120Hz 10-bit mode for compatibility and smooth gsync functionality though. Some games don't like *odd ratios for frame delivery. Usually 60/80/90/120 work the best from experience.
@@wille84fin what’s the difference between choosing to use it and having to use it if the end result in IQ and fps is the same? It’s clear spending rendering budgets in pushing more native pixels and higher frame rates is a waste of the budget if the end result can look as good without needing to be entirely native
In all fairness, it seems like most modern devs don't know there is a graphics pipeline.. or maybe they know it's there they just don't they don't know how it works or what it does and it's best to just you know never check a box that isn't already checked by default. It's how we lost the developers console, and most of the graphical options that used to be standards sadly. Honestly the worst thing about unreal is the way epic touts its best practices and unfortunately a lot of its customers don't challenge that the way they should. And honestly they might even be in agreement.. I've noticed a suspicious lack of good eyesight in the development community.
@@JackWse people know at least the basic, the problem is that unreal engine goal is to create real life graphics no matters what even if they fail in the road to that, so that engine is not for common games right now because it's visuals are fked up, try other options like unity, cryengine, o3de, Godot, those are still following the old path which is to create fun games, the visuals are important but they're not the most important thing, what matters is the gameplay, see among us they made millions and it is a stupid 2d
I appreciate when someone actually takes the time to figure out why something is performance heavy rather than making assumptions based solely on what they see. Now with that said, I do also want to remind everyone that when a question is raised on "why did the developers use this non-performant system when there's an obvious better choice here," the answer DOES tend to be "because the publishers and/or directors told them to." I'll never forget a back and forth I had with one of the people who worked on the opening to Skyrim. They were talking about how much trouble they had with the realtime cart sequence at the start, and when I asked them "couldn't you guys have just baked an animation instead of making it realtime and having to fix it every time the geometry was changed even a little," they responded with something like "Yeah that would have been so much easier and would have avoided so many headaches and wasted time." With that response alone, they told me everything I needed to know.
I can absolutely understand your decision to stay with Unreal Engine, but justifying it by claiming that all other engines are just a wasted effort is ridiculous. That's like saying you don't need free market because your state already gives you everything you need. And if it doesn't give you something - that means you don't need it. Sure. Investing into an engine like Godot that is so much smaller and doesn't focus on AAA games might seem like a wasted effort, but with Unreal Engine last time I checked the licensing it said you're not allowed to make your own version without a special agreement with Epic. The fact that Unreal Engine is source-available doesn't mean you can legally sell games running on your own modified buids of the engine. With Godot you can do that, because it's open-source. Unreal Engine is proprietary and source-available. I hope you're aware of the legal bindings that are present around here.
Exactly. Not everyone wants to create photorealistic AAA quality games. Yeah, you wont get the same graphical fidelity in Godot than you will in UE5, but Godot isn't even that old. As you said, the licensing is a major reason for some people. Also think about storage. UE5 games take up several GB, and the engine itself is several GB. The entire Godot engine is at most 200 MB and you can still get decent 3D graphics. I pretty much agreed with everything in the video, but saying working on game engines other than UE is wasted effort is just silly. People need alternatives. I love Godot because its super lightweight and opens in a couple of seconds.
@@How2Bboss I use Godot myself - I have a game project called Liblast (open-source). I professionally work in game development as a tech/3D/enviro artist with Unity though. Godot has all the basic tools to make decent 3D graphics. I am working on a asset workflow and pipeline to achieve quite good visuals with much reduced artist workload due to handling common effects like edge wear, beveling etc at a shader level with using of special custom textures. I see people pushing Godot to it's limits and it can get things done. Surely it's not as fast at rendering as UE, but it's still way more lightweight. I'd rather see smart people investing in open technologies like Godot, Bevy and the like, rather than funneling all their energy into something proprietary that they can never own or have true control over.
@@wallacesousuke1433 That is a gross mischaracterization. Godot has come a long way in the field of 3D. There just aren't many high-profile examples of 3D games made with it (or games in general, honestly). But even if it isn't on par with Unreal, it is at least on par with Unity.
I haven't seen this level of in depth detail about going into how individual games designed how their game utilizes the graphics card & all the different graphical settings. Amazing work
Great video. This is my first video of yours I’ve seen. I appreciate the use of realistic dialog when describing this stuff. During the first couple minutes I thought “what is bro yapping about”, and by the end I thought “damn bro, you ate frfr”
Thank you so much for providing an actual explanation as to whats going on and why there are issues. Way too many armchair generals on the internet keep making vague general complaints about why UE sucks and "is fail" or whatever but no one ever actually explains why. I appreciate your work here.
if you guys wanna really play the game smoothly now and cant wait for patches to come, I strongly recommend the DXVK GPLAsync Mod. It runs the game in Vulkan, eliminating stutters nearly 80 %. But you lose DLSS and HDR.
I love that this is your passion. I don't have the brain power to have any kind of deep intuition of graphics rendering like you apparently do. I think it's great you put pressure on game developers and game engine programmers to do better.
This is exactly what we need more of on RUclips and in the gaming scene in general. I may not fully understand around 30% of the language, as I'm not in the industry, but I can see that your heart is in the right place. Protect the consumer and the keep the integrity of your profession alive. Please never stop!
I wasn't expecting this type of video and I'm pleasantly surprised. I'm looking forward to seeing more explanations of these topics when it comes to videogame development, particularly UE5.
Im on a 7800x3d and 4080 and only a mod from Ultra+ maker Lucia managed to rein in the stutter. She also fixed several other ue5 games for me. So clearly SOME people like that modder and Threat know how to handle it
@@mirukuteea oh s**t I didn’t know what, due to how many stuff is still wrong on the gpu side lol. Do you have more info on why traversal stutter is in the cpu side?
Thanks for continuing to make such well put together content, being effectively a layman it's helped a lot in my beginning to dive deeper into these topics. Kudos
Yes, or any kind of volume-based solution would've been nice too. While it wouldn't bring smaller-scale GI to the same level, the resultant performance or image quality and stability would've been so nice.
my take: it's being pushed by either NVIDIA or Epic themselves so they are forced to use better hardware or make use of unnecessary stuff like nanite or lumen (which are cool technologies but need a lot of work). This isn't about incompetence, but most likely about business.
Baked lighting causes more issues. Extensive foliage can create issues, and dynamic objects are also a problem, and the lights that cast dynamic shadows won’t influence the baked lighting. Also, the game has many dynamic lighting events, and even the flashlight uses GI.
@@justinrunner9384 would you rather play a game that requires you the highest end hardware to play, or sacrifice some realism to have the smoothest experience?
I remember seeing that there are many conferences made by big studios such CDPR that advocate devs to modify Unreal's rendering system and not to use Unreal's default. Wish these would benefit the devs to make something better
Hardware tesselation is bad. (Efficiency even when tesselation is not happening, culling, not able to use high tesselation factors etc.) If one really needs non nanite tesselation, compute variant should be made.
@@pottuvoi2 Assuming your referring to Demon Souls Compute Tessellation, we would have no problem with that if it wasn't 90%+(at least) slower than properly managed HW tessellation like Nanite. Despite the recent industry noise calling HW Tessellation "slow", we'll be putting this to the test since these claims do not align with our experience.
Wow, you mentioned my comment. 🙂 Thanks for addressing my question. Your time and effort, so I understand your decision. Engines like Godot need to increase their market share. But few people want to use them because of... their small market share. A vicious circle.
>Engines like Godot need to increase their market share. Oh no, you want not only performance-troubled games, but also crappy too. >But few people want to use them because of... their small market share. Because quality of engine and feature-richness is important too. Nobody wants to use half-baked solutions. If they want, they use libraries and write their own engine.
@@widrolo I'm not going to claim to understand any of it technically, all I know is that loads of people who use Unreal (regular people, not just big devs) have argued that he ignored Nanite's real uses (and criticizes it for something it isn't supposed to solve), that it starts to make sense if you're truly cranking your geometric detail and don't want any obvious pop-in. It makes that level of visual quality feasible, while if you were trying traditional LOD the cost would increase exponentially until Nanite is, in fact, faster.
I enjoyed this video very much, and I hope Blooper Team takes notice of this video to help with performance improvements on the game. In one update they posted "Steam Deck performance improvements". While this did not change too much in optimization and I don't think sh2 remake will get steam deck verified, I really hope Blooper Team continues these improvements and maybe get us at least half way there.
Would it be possible for you to check some of the 2024 games we usually consider well optimized? Like the Horizon Forbidden West or God of War Ragnarok PC ports.
@@gavinderulo12 yeah I kinda got that wrong, I was thinking about what's going on in this video from DF at around 5:30 ruclips.net/video/cITEbCFGutc/видео.html
@@gavinderulo12 The PS5 version does have ray traced reflections. Both Digital Foundry and NX Gamer have showcased and talked to the developers about it/ They aren't the traditional ray traced reflections and combine that data with cube maps
What would be the better choice when choosing an engine today? I am convinced Unreal is not going to listen to critique and will remain focused on their own goals.
If, like the vast majority of people asking this question, you are an amateur hobbyist, then just download Godot or Unity and start your game dev journey. You've got a lot of learning to do before it's even time to start learning the ins and outs of a rendering pipeline. If you're a professional then surely you can think of better resources than a youtube comment section to get your advice from.
If you are just starting out and want to learn, I highly recommend Godot. It's easy to understand, very well documented, and the community is usually very helpful. Also it runs on a potato, and does decently well in both 3D and 2D games. If you need something more serious, you can certainly accomplish it in Godot as well, albeit you'd have to put in more of your own work. Unity in that sense could be better in some cases.
@@GlowingOrangeOoze nah, UE is the way to go, you dont even need to waste time learning programming, just Blueprints, so the time saved can be invested in more important, artistic skills
@@wallacesousuke1433 Learning programming is not a waste of time, and using Blueprints to the exclusion of all else is a trap. Blueprints run roughly ten times slower than C++. If you don't use any C++, your game will be needlessly slow and inefficient merely for the purpose of saving developer time, which is exactly the kind of corner-cutting approach this video and the entire channel argue against. If using Blueprints only is defensible, then so is every unoptimized misstep showcased in this video, given the devs were likely under intense pressure.
Yes it is a massive waste of time if you're a solo dev, since learning C++ is a useless skill (little demand and the version used by UE isn't even the vanilla C++) that is massively time-consuming, time that you could/should better spend honing your ARTISTIC skills, cuz that's what matters at the end of the day. And BP's being slower than C++ doesn't mean anything these days, there are games made entirely with BP that perform better than AAA games with entire teams of programmers working on lol (Bright Memory and Mortal Shell for instance).
re: the initial ~1.3ms frame dispatches... this could be debug and stat capture, the game had a lot I've disabled. re: LODs. this game doesn't use LODs _at all_. even force disabling Nanite the UE LOD system groups are ignored and LOD0 is drawn everywhere. I believe this is to do with the hard culling at 60~80 meters (disable the fog in town to see it). perhaps the devs reasoned that only LOD0 would be needed due to this. however because of how UE builds HLODs this causes sudden HLOD builds (and then shaders/textures/shadows/decals on top of them) where there was no previous LOD to hide it... UEs HLOD system is already problematic (I believe it has a fence/race issue), and hard culling beyond the player's view makes this problem far worse. I've found no way yet to force proper LOD use or fix the HLOD race/hitch issue. HLODs are one of the major causes of stutters in Survivor and SH2 fog: another issue is in vanilla fog is drawn to (about) 170 metres. yet the player can only see 40~60 metres. reducing fog draw increases FPS by around 20% reflections: while reflection roughness is clamped low (0.4), reflections add to diffuse indirect lighting, qnd raising this clamp dramatically improves bounce lighting in some scenes (while it is expensive, a good balance is 0.55) AO: I have no idea why the devs chose SSAO for this game even with RT. GTAO or RTAO are very similar performance cost and look much better, and RTAO combined with lumen adds throw lighting to some scenes, improving lighting quality at no cost.
This is the most demanding game I've yet to play on PC, it really brought my 3070 to it's knees and forced me to make huge compromises in terms of performance or image quality. I had a STRONG feeling that there were missed opportunities to improve performance and cut back the insane GPU costs. & don't get me started on the stuttering, which is frankly horrific and the worst example of UE Stuttering yet
Great video. I would recommend having the script closer to the camera, so it doesn't look like you are looking down while reading. Other than that, thanks for the info :)
I really like your videos and love your critiques on how games don't run so well even though our GPU is still in good condition. Can you check out non-Unreal Engine games today like Helldivers 2 because I have a 5 5700x3d and Rx 6800xt but somehow lose fps on higher difficulties even at lower Optimization settings even it lacks of Fsr and Dlss. Hope you check how ArrowHead run game runs.
Wow, i haven't expected the speech at the end of the video. And i can't say i agree with your sentiment. While Unreal Engine becoming the industry standard for high graphic fidelity games, i don't think this is right. Unreal is a for-profit organization, which prioritizes their own benefits first, and the issues you listed here are caused exactly by that fact. And this is not the only reason behind the development of the alternative solutions. For example, the aforementioned Godot is far more lightweight, and streamlines exactly that, and a relative simplicity for the newcomers. You probably can argue that Unreal Engine can be even more simple, but since your knowledge in CG and game optimization are quite advanced, i bet you already see the issue here: UE produces terribly-optimized results when the devs use the simplicity it offers. The other reason to develop any alternative to UE is that C++ has a lot of its own issues, and also, the exploration of new possibilities is the only way to evolve what we currently have. That's why i'm sympathetic to Bevy. Their and Rust's nihilistic approach towards the established standards, which haven't changed much, is almost guaranteed to yield something revolutionary. And lastly i want to mention the problem of legacy code. I doubt i need to elaborate on that.
"Unreal is a for-profit organization, which prioritizes their own benefits first, and the issues you listed here are caused exactly by that fact." Yes, this. They're undermining their own argument by dismissing competitors so broadly. It actually legitimizes Epic's efforts to monopolize the industry. This would remove any incentive for Unreal to improve, or heed anything they say.
@@gamerelaxbr2969 Well first, there are 3D games made on Godot. The closest to AAA is probably Sonic Colors: Ultimate, which, according to wikipedia, somehow uses it under the hood. Then there's Buckshot Roulette, which, i guess, has some popularity and is quite known among indie adepts. Even if we ignore that, how does what you said contradicts my point? Not to mention that Godot is quite young compared to Unity and Unreal, and you can clearly see the correlation between how long an engine is on the market and how popular it is among bigger game dev. Godot came out when there already was Unity, which already had a free plan, while Unity itself became popular because of it's comparatively liberal licensing, which allowed smaller devs to use it either for free or by paying a small fee. My point is that there not much games made on Godot because the niche has been already taken, not because Godot is bad or does not allow you to make a high fidelity game. Moreover, it's always hard to make a realistic game with modern graphics. It requires competent graphics engineers to do so, no matter what the engine is. There are games that tried to pursue realism but due to lack of experience, the resulting product was either lacking graphics fidelity, style persistence or optimization. For an example you can look at projects like Mordhau or Escape from Tarkov. One made on Unreal Engine, and the other one on Unity. IMO, they are lacking at least two out of three aforementioned criteria. As I already stated, industry standards are quite inert. If you want to have a job in game dev, you most certainly should learn either Unity or UE. Majority of jobs out there will require one of them. But if you're a solo, or you're a bunch of enthusiasts, why would you? Only because there were games made on those engines, and you liked them? Because they looked cool? Those games were made by more competent people, that had far more resources than you do. No one should ever choose a technology for their product only because someone some day made something cool using it, and refrain from using the alternative because no one ever used it before. This approach is shallow and dumb. This approach deems any potential revolution impossible and dooms you to mediocrity.
@@gamerelaxbr2969 Sadly true, but that's more due to a lack of adoption than anything to do with its innate capabilities. Though that hasn't always been the case.
@@gamerelaxbr2969 Road To Vostok is looking pretty decent. Eventually I can see it bringing much more fame to Godot (or it's forks, since Godot decided to go woke).
After taking a look back at the UE3 games again, I am shocked how well most of them (even the late era ones like A Hat in Time, MK11 and Rocket League) were actually very well optimized! Could you take a look at some of them sometime and perhaps reflect on how the 3rd version of UE was way better of an era? Thanks, keep up the good content!
Are you running them on a system with period correct hardware? Otherwise of course beastly futuristic tech like what we have now can run those old games exceedingly well.. I liked UE3 generally speaking, however it would stutter a LOT and in almost every title (even with modern hardware, btw). - Now fortunately that stuttering could usually be fixed with a couple of config edits but you know only a fringe few of us bothered to do that. IT was so bad that anytime I got a new game I would look up the engine, specifically because if it was UE3/UDK I already knew what config entries I would make..
this channel has been great learning experience of what to avoid. wich is amazing for me as a brand new dev starting with UE5 .Already avoid Raytracing because the performance cost is not worth what could be done with good baked lighting and felt lumen was similar case. Saw to many vids say nanite is great but so glad this steered me clear from it.
The thing I love about Threat Interactive is the fact that i'm actually learning something new without feeling like the content is being dumbed down like with other tech youtubers.
really what the aaa games industry feels like to me now is "WOW SHINY NEW TECH AND TECHNICALLY SUPERIOR TO BEFORE PUT IT IN JUST PUT IT IN WHO CARES ABOUT ANYTHING." and then we get the game and its like "why is this running at 45 fps on my 7800xt?"
This is some great stuff to learn about, as our tech gets stronger it seemed like our care about truly being optimized went down the toilet, it's good to see that people do still care and are taking action. good luck!
Would love to see a video on CPU optimization. The original kingdom come deliverence will be a very interesting subject. It still brings modern cpus to their KNEES despite being from 2018
Funnily enough I can run KCD on my laptop specifically because of that. It's an officel laptop, so it has a really good CPU, but a lacking GPU - MX110 :((( I like the funny česká hra na hrdiny
Yeah I was shocked when I loaded up UE5 for the first time and realized it was missing. Their solution was to use nanite which is crazy imo because if you just enable nanite on an asset you get a performance hit that is pretty drastic. I'm really hoping they don't do away with the ability to do bakes in favor of an all lumen pipeline next.
Tbf I dont think this is too bad. Not talking about hardware tesselation in particular but removing old features is important. I've worked a ton on different frameworks (not game related though) and dragging along and maintaining old features often severely worsens the quality of new ones.
Just use an older version of UE that has this feature? Obviously the engine devs will want to focus on their newest advertised features instead of something less important. It's like asking why does UE5 not support directx7/9/10, since they're less resource heavy! The answer to me is quite obvious and is the same to your question, it's not necessary in the feature set of the newest engine version..
I suggest Remnant 2 for UE5 analysis. From my understanding devs fixed some issues like culling with patches and overall it's decently performing UE5 title without issues like shader/traversal stutter.
I am curious what he'd say about it now. I know the video creator once called the Remnant II devs "trash" on the Unreal forums because of its poor optimization on release and heavy reliance on Nanite.
First time viewer here! I really enjoyed this video, I’ve never watched a deep dive into a games engine and settings. I’d love to see a review of Space Marine 2. It’s a great game, but it definitely needs optimizations for rendering and It’s also very CPU heavy.
PSSR using terminology from XESS might be because XESS is open like FSR, but FSR doesn't yet have an AI version, so Sony studied XESS as a baseline on how to develop their scaler. Moores law is dead has also said that his sources say AMD did not help sony make PSSR, but sony was willing to share knowledge about PSSR after it was made, which AMD is most likely to use to make FSR 4.0, which will be an AI upscaler and only supported on 7000 and newer radeon GPU's with FSR 3 getting small updates but being the last (and best depending on your view) spatial non-AI upscaler.
man just found out this channel. Incredible goo stuff. would love to see analysis of other games engines like MH wilds and such but iunderstand focusing on UE5 might be the most important for now.
I'm no graphics developer, but surely having so much fog could had been used to decrease polycounts. As an artist, it makes no sense in rendering the foliage to the most minute detail, only to slap the dense fog on top of it
PLEASE READ FOR UPDATES & RESPONSES(and discord link):
1. Watch this video in 4K (as in streaming settings) as any lower compression will hide the visual details discussed.
2. We would really appreciate it if viewers could share on giant game related subreddits. These prevent self promotion but these can really push our channel's success by the thousands.
3. TAA commands though we suggest FXAA since Lumen really doesn't work well with it.
r.TemporalAA.Quality 2
r.TemporalAACurrentFrameWeight .6 (with vsync)
r.TemporalAASamples 2
r.TemporalAAFilterSize 0.09
r.TemporalAA.Upsampling 0
r.TemporalAA.R11G11B10History 1
r.TemporalAA.HistoryScreenPercentage 100
r.TemporalAACatmullRom 0
5. We would like to invite you to the official Threat Interactive discord: discord.gg/7ZdvFxFTba
This will be the main discord for engine development, news, and Q&A support for developers.
6. Some people seem to have issues with the comment made at 4:25 | Yes, SH2R uses one of the first versions of UE5.
This is still disgusting quality to have been included in a official release of the engine and these issues are still present in the newest preview of UE5.5.
7. We do not care about ignorant statements auguring about our decision to zero-in on UE5 exclusively. It doesn't matter if Unity has a higher market share. The people complaining about a "monopoly" are failing to understand it's already a monopoly! And the monopoly includes the major games that are trying to push better visuals. The list of UE5 games using these poor designs have grown too much! We intend to push graphics as well and show the example of visual quality we expect from an overly supported engine such as UE5. We already know improvements to UE via forks impact the industry as seen with The Finals. That was a performance enhancement, the industry also needs a visual enhancement and our videos prove it will not detrimental to performance on 9th gen console hardware.
If you are paying funds to engines like Godot, don't expect anything to get fixed in the games you'll actually be playing.
8. Regarding the comments made about the hair in 4:45. What should been have said the hair looks better in comparison with the traumatic dithered variant. In reality it could look much better, without too much performance cost and most importantly being TAA independent as shown by frostbite developers here: ruclips.net/video/ool2E8SQPGU/видео.htmlsi=_yjztNp5PBIeZkA2
9. Stay Tuned*
what is those hairs in front of your chest.....
@@BastyTHz The mic windscreen.
sloptimization
@@BastyTHz Thats his insane macho chesthair from his chad energy and detailed breakdown of graphics
I'm so glad you are making these videos. As a game dev it was so frustrating to see influencers talking out of their ass. I was looking like "old man yelling at cloud" when I talk about things like tessellation and taa
The irony of not using the fog to their advantage to hide low quality LOD for performance purposes, when that’s literally why it existed in the original game.
Irony or not. Two completely different mechanics. In the original game it wasn't "fog" just floating textures. This is volumetric, millions of particles rendering at the same time.
@@Darkcranio Fog doesn't use that much resources, there's literally a performance breakdown in the video. Just sloppy devs being sloppy devs. I couldn't believe that the game with a static environment like that and the fog advantage still stutters horribly and has UE load stutter.
What happened to "table fog" from nvidia ? At one point it was a solution to the fog problem😅
With how bad the traversal stutter is in open areas I'm genuinely surprising they didn't think to use the fog as a bandaid.
@@Darkcranio Volumetric fog doesn't/shouldn't take millions of particles to render. Volumetric fog is generally raymarched in post, with particles or drawn objects being used to populate a 3D texture or some other data structure with information about how dense the fog is at a particular location in the world, as well as maybe the material properties of that fog.
Full triangle meshes for distant trees completely obscured by fog is some real 2024 level problems
I don't even know what they were thinking when doing stuff like these, maybe it's the lack of time and rushed projects...
@@k_gy_b UE5 rewards laziness with faster development cycles
@@disseminate4it has been that way since UE4. the goal is to make people with less technical knowledge about game engine can create their own game more easily.
@@disseminate4 that's true
@@disseminate4 It's Nvidia's greedy cooperate influence they wanna sell GPU's they wanna become just like Apple and sell GPU's every year and if games were optimized they wouldn't sell new GPU's. Nvidia has alooooot of money, they can pay off developers and even game companies.
crazy how games used to look ok on lowest settings, and bumping them up added fidelity. now max settings are baseline, and by reducing them you are subjected to pixel vomit.
We are back to 2005, where games had good graphics settings, and mashed potato slop settings.
max settings looks bad too imo.
Anyone remember TES Oblivion's lowest setting?
@solitudesf8111 "pixel vomit" is an adequate description.
Even the max settings look like garbage
You think performance is bad in these outdoor scenes? Wait until you make it to the apartments. The main lobby area is a performance KILLER.
yep wtf staring in to the wall kills the fps ,in the town i was getting like 70+fps on the apartments things changed
I thought it was just me!!!😂
@@theunwantedson Wow!
That's just unhinged, terrible optimization.
The stairwell in the hospital made my pc crumble
That swimming pool outside of the hotel too. Holy shit
Wait, they use real time GI for a game with no day/night cycle?
But.... Why
Less time-consuming than baking it to meet the deadline. Either that or sheer incompetence.
This baffles me too. There's a reason Assassin's Creed Unity still looks so good
@@dingickso4098 Maybe it's both? Lack of time because team isn't experienced enough for the task?
From what I know you can't straightforward bake lumen lighting, it's just meant to be dynamic. But you can use static lights instead of moveable when you can to reduce the real time calculations
Baked reflections never hold up as well as Lumen. Lightmaps can also only be so high resolution. Per pixel lighting on paper would provide more precise GI than baked lighting, though obviously it's vastly more expensive and in this case a lot more noisy.
Ey, that’s me.
Thanks for all the work yall do. It’s always got me scratching my head why games have been so demanding. The ghosting too with reliance on TAA in particular bothers me too. I think we are in an awkward middle ground for graphics and optimization that we will end up getting past, but not yet. Actually hopeful we will- idk if UE5 is in with Nvidia and AMD to push their upscaling tech and raytracing.
Another game that could be fascinating to take a look at is Dragons Dogma 2, the game doesn’t seem to do anything ground breaking (maybe except for a dynamic day night cycle), but it is SO hard to run. Especially at native resolution, the game defaults to progressive upscaling. I also notice tons of dithering in hair and foliage. Sucks with that one too because DD2 is a fun game that shouldn’t be having these problems.
We are glad to see you here and find you in support of the topic we're covering.
You're more in the category of "overlooking" rather than "refuse to address" but that's just because there isn't enough information about what we cover so no hard feelings!
We are getting support from viewers because other channels we showed are making shameful comments regarding the quality of upscalers and we hope our content helps you others spread more correct information about what's really going on.
We 100% encourage you to view all our videos since I think they would answer a lot of your questions and help your takes on some things in the future.
Thank you for the kind support and title suggestion!
You should make a video on TAA
Excellent title suggestion vex, i second that!
@@Gustaviustwinkelberryyeah they’ve definitely improved it with CPU performance, but it is not impressive
@@ThreatInteractivedw abt it man, I didn’t take any offense. I’ve actually been watchin the content since yall started the channel.
I am usually pretty positive about DLSS, but mostly compared to FSR, XeSS, and TAA, because DLSS is just the best at what it does. However, I’ve found it weird the absolute reliance on upscaling to get anything done now.
Again thanks for all yall do and gl on the game. All this will just make games run better and I find it really interesting. I might go ahead and shoot yall an email and we could talk sometime. I think it would be fun
14:05 I get the pragmatic reason to focus on overhauling Unreal, but the FOSS ideologue in me feels compelled to point out the effort in Godot and other FOSS projects isn't wasted just because they're not used in industry. I think it's important to have a free/libre floor at least to keep pressure on proprietary, privately owned projects to not screw over developers who trust and depend on them (see Unity's "charge per install" attempt or Microsoft's AI/privacy abuse with Windows).
yeah its very narrow minded to dismiss smaller engines which are what the indie and ever shrinking AA scene thrives on, i feel any amount of competition is good and we should just give unreal a monopoly in games cuz "its the best we have"
I think it would be great, if Threat iterated more on that point, because I feel like their stance can easily be misunderstood. I assume they're specifically talking about AAA productions, where Unreal is simply ubiquitous. And it shows by the sheer volume of bigger titles released on that engine.
I'm by no means an expert and other engines are certainly very capable, but there must be more reasons than just documentation and support, why almost no team uses engines like Unity or Godot to develop large scale games in the vein of a Battlefield or open world games. Unity game releases like Escape from Tarkov (BSG really struggles with this one), BattleBit or Cities Skylines are few and far between. And for such huge productions, I've mostly seen devs either using Unreal or their own proprietary engine such as Frostbite.
And visually I start to dislike UE more and more.
@@CyboTyaS My understanding is that if you're making a game today with high fidelity graphical targets and don't want to make your own engine (or do comparable engine work), Unreal is basically your only option. Unity can look just as good, but not out of the box like Unreal. Godot is functional for 2D games, but sub-par for 3D.
There's also the matter that if everyone is using Unreal, companies can easily find employees already familiar with it, saving on training time/costs. Good documentation and community support also shouldn't be underestimated. The behind the scenes reports of bad releases often include tons of loss due to engine and interface issues (Anthem and Mass Effect: Andromeda come to mind with Frostbite).
@@stormburn1 The _actual, core_ reason for using Unreal is simply that large publishers prefer having one single engine that everyone is trained in - so that when they fire 20% of their developers to 'reduce costs', they won't have to spend months training the replacement developers they hired for half the money straight out of school.
A highly experienced dev team could have avoided some, most, or all of the issues discussed in this video even if they were forced to use Unreal. But experienced developers cost _money,_ and therefore get _fired._ Such an experienced dev team could have even made this game in Godot if they wanted, modifying it for their needs as required (that's the real best thing about Godot btw, it's extremely easy to work with) but with Unreal you already get so many features out of the box that you actually have to turn many of them _off_ first. This is why a fresh full Unreal Engine install is _60GB_ and a fresh full Godot Engine install is _55MB_ (yes with an M).
@@stormburn1 "but sub-par for 3D"
Don't agree, on the hands of good developers, Godot is a very capable 3D engine, proven by the fact that, The Road To Vostok maker changed from Unity to Godot 4x and was able to still keep most of the visual fidelity.
And don't assume that is only Godot 4 making it possible, there's a guy making a visually impressive 3D horror/stalker/Fear inspired shooter in Godot 3x as well, called Leaden Sky.
A engine is only as good as the person using it. And a good experienced developer can create impressive games in tools many people think are bad.
It's baffling to see bad practices like that used in a AAA game. Using fog to hide imperfect rendering is something old games were doing from the near start. I mean, the first Silent Hill was doing it ! It was the perfect occasion to use it. Also, geometry LOD is a basic optimisation, combining both would have been perfect for this game. Looks like Wirth's law is becoming a threat rather than an inconvenience.
And it's also baffling to see TAA to be used for nearly everything. I wonder if it's related to the fact UE5 use defferred rendering by default . Because, if i'm not wrong, it's the only form of light AA working with it, with SSAA (MSAA don't work with deffered rendering). I'm pretty sure you won't have that kind of problems with the next DOOM Game, which use a forward rendering engine.
The abusive use of TAA is far from using "deferred". We have deferred games that use MSAA, SMAA, SRAA, reasonable TAA, and more. It's just that:
1. Most developers do not care about the image quality issues or simply like the blurry upscaler look(we've seen the praise).
2. Investors/high level employers do not care to set standards or pay to development of better approaches.
He says in the video that it's because Lumen requires TAA to smooth out the noisy lights.
@@MyAmazingUsername Watch our first video. Far more simple things needlessly require smeary TAA in UE4+.
@@ThreatInteractive Do I understand correctly that there is difference between UE 5 TAA and "reasonable TAA"? Where would you place modern demanding TAA methods like DLSS/FSR/XESS (they take a lot more frametime, but seem to look good).
In BG3 default TAA and upscaling seem to look a lot better than SMAA.
@@ninele7 DLSS/FSR3/XeSS are upscalers, not TAA.
"The hair looks fine" *shows horribly aliased hair* xD
Your right! The default hair was so traumatizing is made the less dithered hair look better.
The best hair shown in real-time rendering is actually done independently of TAA!
ruclips.net/video/ool2E8SQPGU/видео.htmlsi=fV3W6LuI0eCTq3cz (frostbite)
And this is why I think people are really underestimating art style.
Sometimes, simpler render engine pipelines with a better combination of textures and models is enough to make those visions pop without burning your house down.
And without abusing your eyes with all this temporal slop
Seeing the first uncharted game on ps3 made me question graphical horsepower was overrated.
I already hated UE5 being adopted by developers. Forcing people to buy video cards ever-increasing in price for graphical improvements that get overshadowed by a smeary picture and poor performance. Now I can articulate it even better!
You can't.
This is all in the short term, the hardware will become better and these games will look much better in the future, because they are not pooping in and out optimised geometry.
You don’t think silent hill 2 in quality mode on consoles looks impressive? Or hellblade 2, nobody wants to die, jusant, and robocop? Genuine question
@@TheReferrer72 "pooping in and out" forever and ever?
@@raumfahreturschutze You got me, should not be commenting on mobile!
@@TheReferrer72"Hardware will become better"
Intel: "Soo, our new CPUs are now slower. Enjoy!"
AMD: Zen 5%
I hate Unreal Engine with a passion as it entices devs into constructing a suboptimal design.
Its parallels the movie industry where they now focus on big budget movies full of CGI that fail because they just use money to attempt to substitute creativity.
It's like everyone sees those 4k 4090 tech demos and thinks that's what unreal engine is but that is with a 4090 at 30 fps going through RUclips compression. But in reality it's a lot flashy features like Luman, Nanite, and ray tracing that get it a lot of attention and praise but then you look into it and it sucks and most other engines can probably do the exact same quality without the crutch of Nanite and other unreal engine stuff.
I simply do not buy UE games unless I know the devs basically reverse engineered the whole bloody thing and managed to make a game that is properly optimised (Lies of P for example)
It's not the fault of the engine itself (the tech is actually really good) it's the fact that devs are choosing to ignore optimization because "ehh Nanite and lumen is enough", especially since older games that run on UE3 like the first three Arkham games actually perform really well while looking really good. The "auto optimization" that UE5 has just makes them think that they don't need to optimise
@@lollingrock Are you serious?! The devs test the games on different hardware. There is no way they couldn't have seen and felt first hand that the "auto optimization" does fuck all and the games run like partially frozen molasses.
I dont think Godot and other engines/frameworks are a waste - having a variety of choices is NEVER a bad thing.
"It needs to stop!" made me laugh.
what is this channel? what am i listening to? its so refreshing! omg no bs!
It’s bizarre isn’t it?
feels like finally a professional is voicing and demonstrating what us gamers have been complaining about for years, with absolutely zero alterior motive other than exposing and educating.
This guy makes Digital Foundry look like shill amateurs, I’m not sure if that’s why they really are, I might be wrong, but the difference between the two approaches is gigantic, one feels like truth, the other feels more like an advert.
@@George-um2vc digital foundry is pretty honest from what I've seen. They often don't always have nice things to say because so many games have issues like the ones listed here.
@@SoulbentAnime that’s true, they are more critical than many give them credit for, but I prefer this style of analysis, although it somewhat depressed me (since all future games are essentially doomed for the foreseeable future) and I will likely be playing Witcher 3 and RDR2 till things change.
@@SoulbentAnime df play nice and downplay issues though, this dude is direct and straight to the point and demonstrates real practical solutions
@@Gustaviustwinkelberry they said jedi survivor was the worst pc port of the year. I don't think that's downplaying it. This dude is definitely more in depth obviously.
This game would be perfect for baked lighting. Much better performance and since there's no time of day or anything like that I dont see the point of GI. Baked lighting could potentially look better and have more control over the atmosphere
If you want everything to look cohesive and firmly planted into the game world, you actually don't want to do baked lighting. Dynamic objects and dynamic lights (Jame's flash light, certain lights changing color or intensity in later areas) don't blend well with pre-baked lighting. Also working with Lumen is so much faster for artists, they can make adjustments in real time and testing dozens of iterations a day, rather than needing to wait for the lights to bake (which can take anywhere from 5 to 45 minutes per iteration). This also leaves the door open for modders to do cool things with the lights.
@@03chrisvThe point about iteration speed is a very good one, but using the full pathtracer could be used for development, and baked lighting be calculated every once in a while to make sure that the final results are still what you’re expecting. Won’t change the issue of flashlights not really playing nice with that ofc, but even that could be worked around by fading pre-baked lighting out near the camera, and only using lumen for closeup stuff I’d imagine.
@@03chrisv Bounce lighting for flashlights is unironically the easiest thing to do. You just sample a few colours at where the beam is hitting and create phantom point lights there. The Last of Us - the original one, on the PS3 - did this.
@@AlphaGargthere's also not using GI for realtime lights, like Unity with directional lightmaps, which blend realtime direct lighting with baked indirect.
Not sure about the time of day thing. There's different lighting as the fog gets heavier, weaker, etc.
when threat interactive posts you dont just watch, you sit your ass down and listen
Real
Real
Real
Real
Real
GPU companies make money from telling you that your graphics performance is bad and you need to upgrade. They're financially invested in not pushing for optimizations by developers. That's pretty much it.
Make optimized game gamer will still complained about it.
@@arenzricodexd4409have you played DOOM and DOOM eternal ?
I see no one complaining about those games. They look nice, and perform well. As they should, since the gameplay is fast paced, and the environment not so open-world. But these days, you can't take that for granted
@@etienne1062 people praise doom for it's graphic but in reality they did not really push it's graphic that hard (already play both). So it is faster to run. Plus it is one of the popular franchise out there sometimes they got a pass like in the graphic department. But others? Just look with Space Marine 2. I see the graphic is just as good as Doom if not better in certain aspect but people complaining because despite it's good performance developer are not using things like high resolution texture. Same with black myth wukong. Dev did a good quite a good job where their low setting really mean low setting since you can really see the difference but then people still complaining about low res asset.
@@arenzricodexd4409As a performance over graphic gamer, this is very funny.
@@arenzricodexd4409 With good reason.. Look at CS2, Valve hasn't even properly optimized this easy looking game .. these large companies have become complacent
dude that first minute in your video was impressively good and exactly what i want to hear when i see a thumbnail like yours thank you so much for what you do stranger!
Ah the wonders of modern gaming. Pixelated, aliasing, flickering, shimmering, ghosting, blurry, messy visuals, 1080p, sub 60 fps...
I have next to non of this on a ps5 in quality mode besides a little ghosting on fog and obviously 30fps. Perhaps pc is suffering more with this particular game?
It's just like a well shooken Polaroid... That was out of focus... Of a moving object... And then you tripped while you were taking it! Photo realistic!
Ok but AA in games is still garbage like TAA, nothing like the good old days, instead of distance changing the effect of AA it all is blurry/waterpaint and the end result is like looking at 2D. Modern problems self created, and worse is TAA is baked into games
@@neilranson4185nah, you just suck at noticing.
@@neilranson4185 it has absolutely 0 to do with platform. PS5 is basically a pc. PS5 is not a console like PS1 or 2 where they had their own tech for FX or post-processing. Whatever issue you have on PC, you will have in a PS5. And if you don't, then it's because the devs adapted the game to the fixed hardware that a console is.
It's so weird to point out the problems with UE and then discount alternatives, why put effort into improving someone else's commercial product you're literally doing free work for a multi-billion dollar company that doesn't care about you at all.
To me it seemed more like a "We know how the AAA industry is shifting towards UE5 so we'll focus on UE5". These will be the games that will "force" people to get new GPUs. The smaller AA or indie market was never and it's probably not gonna be the one showing the issues present in this video. But I totally get your point since I'm also an advocate for FOSS
@@faultyaxiom hm thing is. indies who do choose UE will not be able to know what they are really doing. Lack of documentation etc
Absolutely. If Threat Interactive went and put their efforts into improving Godot, they would be helping to build a Godot into a powerful alternative to Epic's monopoly. Godot isn't reliant on temporal solutions like Unreal is. It's easier to build on what's already there, rather than spending months fixing Epic's mistakes. A powerful Godot would democratize game development and shatter the hold that large corporations and proprietary engines have over the space.
But if Epic decides Threat Interactive's efforts on a custom version of Unreal are detracting from all the money and advertising they've poured into Nanite and DLSS, they could very well shut the project down and ensure their work never again sees the light of day. None of these solutions that Threat Interactive talks about are going to help Epic make more money, so at best they're inessential, and at worst, an tangible threat to Epic's "innovations".
@@jwhi419 Lumen, Nanite, etc. are all on by default. And a lot of indie devs just leave them on, not not knowing what they are or what they're intended for, leading to games that are horribly-optimized for no reason. It's easy to say it's the devs' fault, but that ignores Unreal's responsibility in teaching its tools to devs in an intuitive way...
I understand the issues with TAA, specifically with the most basic and bad implementations, because there have been improvements made (by Guerilla Interactive, for instance). But honestly I've never seen SMAA handle specular well, and that includes Frostbite. Specular in NFS 2015 just doesn't look nice, for instance. It shimmers and shifts. While subjective, I'd still go for TAA, but ONLY at very high framerates and rez because of the flaws it brings and its inherent temporal and rez-bound nature. TAA at 30fps or 1080p is just a thing I never want to see, personally. As for TAA being an excuse for poor implementation of things like AO, transparency and translucency - this here I can get behind 100%. Don't like the trend of everything being built around that.
A side note for Unreal and its overall lighting implementation... You know, it's funny. I've worked with UE since it's version 2 days and it's always been like this with Epic. First, they insisted on Lightmaps for years (for obvious reasons at first), offering very poor alternatives or stuff that is just not enough or never leaves beta implementation (Hello, LPVs!) . So, for the longest time, it was Lightmpaps or bust. It's not that they are bad, per se, but it's just not optimal for all scenarios either.
Then comes around UE5 and now we see the exact opposite. Away with the lightmaps, welcome to Lumen. No other less resource-intensive solution. I just don't get it. Why is Epic always like that when it comes to lighting solutions. They have such amazing tools for so many things but their lighting pipeline has been so damn stringent for decades.
Forward vs deferred - another story altogether. It's like a hypetrain... We have Doom Eternal's ID Tech working on forward rendering and doing miracles. But no, "outdated technology". And this brings with it some many pains too. I'm still waiting for UE to implement custom-mesh GBuffer decals (not to be confused with simple masked custom-mesh decals or decals with a mask-dither pattern+AA solution, bleh...). It only supports GBuffer decal actors with this functionality, but not custom meshes. So, pipelines for materials like those used in Alien Isolation, Doom Eternal, etc. are just not available to you in the same capacity.
UE is often being lauded as being so flexible. And quite often it's deserved. But in other cases - not at all.
SMAA should only be used for basic staircased aliasing/input(for spatial upscaling, which is what FSR1 lacks). For specular aliasing there are many solutions such as filtered mipmaps such as the ones offered in cryengine, extremely cheap specular AA as seen in Half Life Alyx and CS2(or the Square Enix version), and a whole new approach we'll be covering soon. And of course, we're not against sub pixel sampling via more intelligently implemented MSAA or Temporal Subpixel Sampling like 9:20 but without "Standard MSAAx2" sampling and infinite frame holding accumulation buffer.
Lots of aliasing issues are *specific* and have their own solution. The problem right now is getting all those solutions into one engine the industry uses, so that they can be combined.
We would like to see you in the discord if you don't mind listing more issues you've personally noticed with UE.
Thank you for your comment.
@@ThreatInteractiveis there any research documents on specular aa for implementing in another engine?
Edit nevermind, I found the paper
Dude, imagine the performance had it used baked lighting, lumen/RT in this game doesn't even look good anyway with it being flickery and shimmering.
Everything's gotta be "dynamic" these days... including how they exploit players into paying more money.
We really don’t need dynamic lighting, given all the sceneries in Silent Hill 2 are static
@ Correct.
We need good performance.
Please check Gears 5. It is a Ue4 game but still one of the most optimized game of all time for me. While we know most of the Coalition's staff are ex-Ue developers, what did they do to the Ue4 pipeline to get good optimizations that the industry still can not achieve.
it's UE4 so it's more optimized just by not being UE5 (no nanite for example).
@@progste does not explain anything. UE4 is famed for its various types of stutters.
Jedi survivor is a UE4 game and it absolutely runs like crap. No nanite, no lumen, just crap.
@@etienne1062 stuttering on UE4 wasn't universal, especially in the late games.
I haven't played Jedi survivor but it probably runs like crap because they did a bad job of it.
@@progste ue4 is not more optimized, the engines are essentially the same thing, in fact as of 5.4 ue5 is more optimized. Nanite is just an optional feature that you can choose to use in ue5
I agree with you, GEARS 5 is one of the best optimized games out there, it has excellent graphics and rarely stutters.
The problem with developers is that they often use scalers and frame generators, why optimize when I can use upscalers? Plus they have very little time to develop their games because of their bosses.
I would pay this guy to optimize my game if i had one
@@jcdentonunatco have u ever shipped a game in ur life?
@@jcdentonunatco can i see ur portfolio?
also can u point out what misinfo he spread in this vid in terms of optimization?
@@heisenberg3206 I'd like it if you could elaborate on what said "misinfo" is.
After seeing every gaming type of media praising this game on release it's actually refreshing to see someone making valid criticism to the game, now that the dust has settled i can say that i didn't really like what was delivered, and i think the original still gives you a better story and narrative elements than the remake.
typical of people to get overhyped like fools and were mainly praising it's graphical enhancements. always about graphics... it looks flawed anyway. the original wasn't about combat either and they made it a sole purpose in the remake. a try-hard dark souls esque rip off
You are really smart man, happy to have you around!
It feels like many were never around in the day we had to hold the face-count down, we could not "cheat" the performance via external means or say "just get a better computer bruh". I did a bunch of map making in the late 90's for my LAN party crew and learned a lot from the optimization we could do back then, the Unreal engine blew my mind with how it worked and performed compared to Id's BSP engine games. DLSS getting in via "deal-making" makes a lot of sense with how quick easy and free FSR is but you never see it, while games have every single envyDiaf thing in existence jammed in there. Ick.
Then: I'm making a hand-rail, I know the player will see it in third person view. I will make it on a cylinder with 16 sides o ensure it never looks off for anybody but the most sharp-eyed.
Now: I'm making a hand-rail, my nose is up to the screen to ensure it looks photo-realistic, even the booger-welds will be be smooth, the game engine will automatically make the LOD models, surely.
"how quick easy and free FSR is"
FSR looks like complete shit, end of story. it's completely pointless. at least DLSS has a valid use case. the fact that nvidia "bribe" developers to implement it isn't as incriminating as one might think either. I highly doubt any developer intentionally makes games perform worse in order to justify DLSS. remember, the consoles have AMD graphics and are always the target platform for anything multi-platform, so if the games run badly on AMD hardware that's just a case of incompetent developers. no way nvidia pays them enough that they would jeopardize how the game runs or looks on console.
@@forasago Correction* FSR (yes 1.0, this is even the worst case scenario too) looks like bad at any native resolution below 1440p set lower than quality on fsr. Its not total shit like you claim but it hardly holds a candle to DLSS 3.0 or newer.
16 sides? id use 6, if not only 4 but keep the face normals as smooth as possible to give it the impression that it's smooth.
Not a Dev, but in general just a curious person, so I appreciate this channel and the awareness it provides.
I learn a thing or two while watching these videos tho.
Thanks for the insights :)
To think that in beginning of Silent Hill devs used fog to hide stuff and ease performance load, and now basically right opposite happens
After replaying the OG just recently, it's even more embarrassing seeing how the PS2 original's fog seems more dynamic and "living" than the 2024 version.
The original Team Silent also stressed that they wanted the fog to truly "surround" the player, not just be a wall in front of them. Yet the SH2R clearly has an empty "bubble" around the player, that blocks the mist from getting close to James (best seen in the static camera angles mod demonstration videos).
this is one of the most noble channels i’ve come across.
oooh a channel about hating the modern vaseline blur and temporal noise artifacts AND wanting to do something about it
You are ahead of the game brother. Your final message hits home. We shouldn't be reinventing wheel but innovate upon it instead.
Yeah, i'm finding way too many UE5-based games to perform really bad on a 4080 Super. DLSS Quality is almost always a must, not to mention removing the over used post effects (piss filter) and blur etc. When DLAA is realistic, it's always nice to see, as it's usually ok'ish. But the issues you highlight make it more clear why this is the case. I game at 1600p UW.
Fellow 1600p UW (hopefully 144Hz or more) man of culture I see.
@@spyrochrisgaming yes, 144Hz. Running it at 120Hz 10-bit mode for compatibility and smooth gsync functionality though. Some games don't like *odd ratios for frame delivery. Usually 60/80/90/120 work the best from experience.
What’s wrong with using dlss to achieve your desired IQ and fps?
@@neilranson4185Using it? Nothing. Having to use it almost by default, a lot. The video highlights the why part very well imo.
@@wille84fin what’s the difference between choosing to use it and having to use it if the end result in IQ and fps is the same? It’s clear spending rendering budgets in pushing more native pixels and higher frame rates is a waste of the budget if the end result can look as good without needing to be entirely native
Its crazy how I've been building and optimizing computers since probably before you were born, yet you spit knowledge on my nerdbrain.
Subbed.
never saw a so explanatory critic about a game's game engine graphics pipeline
In all fairness, it seems like most modern devs don't know there is a graphics pipeline.. or maybe they know it's there they just don't they don't know how it works or what it does and it's best to just you know never check a box that isn't already checked by default. It's how we lost the developers console, and most of the graphical options that used to be standards sadly. Honestly the worst thing about unreal is the way epic touts its best practices and unfortunately a lot of its customers don't challenge that the way they should. And honestly they might even be in agreement.. I've noticed a suspicious lack of good eyesight in the development community.
@@JackWse people know at least the basic, the problem is that unreal engine goal is to create real life graphics no matters what even if they fail in the road to that, so that engine is not for common games right now because it's visuals are fked up, try other options like unity, cryengine, o3de, Godot, those are still following the old path which is to create fun games, the visuals are important but they're not the most important thing, what matters is the gameplay, see among us they made millions and it is a stupid 2d
I appreciate when someone actually takes the time to figure out why something is performance heavy rather than making assumptions based solely on what they see.
Now with that said, I do also want to remind everyone that when a question is raised on "why did the developers use this non-performant system when there's an obvious better choice here," the answer DOES tend to be "because the publishers and/or directors told them to."
I'll never forget a back and forth I had with one of the people who worked on the opening to Skyrim. They were talking about how much trouble they had with the realtime cart sequence at the start, and when I asked them "couldn't you guys have just baked an animation instead of making it realtime and having to fix it every time the geometry was changed even a little," they responded with something like "Yeah that would have been so much easier and would have avoided so many headaches and wasted time." With that response alone, they told me everything I needed to know.
I hope videos like this will make engine better
I can absolutely understand your decision to stay with Unreal Engine, but justifying it by claiming that all other engines are just a wasted effort is ridiculous.
That's like saying you don't need free market because your state already gives you everything you need. And if it doesn't give you something - that means you don't need it.
Sure. Investing into an engine like Godot that is so much smaller and doesn't focus on AAA games might seem like a wasted effort, but with Unreal Engine last time I checked the licensing it said you're not allowed to make your own version without a special agreement with Epic. The fact that Unreal Engine is source-available doesn't mean you can legally sell games running on your own modified buids of the engine. With Godot you can do that, because it's open-source. Unreal Engine is proprietary and source-available.
I hope you're aware of the legal bindings that are present around here.
Exactly. Not everyone wants to create photorealistic AAA quality games. Yeah, you wont get the same graphical fidelity in Godot than you will in UE5, but Godot isn't even that old. As you said, the licensing is a major reason for some people. Also think about storage. UE5 games take up several GB, and the engine itself is several GB. The entire Godot engine is at most 200 MB and you can still get decent 3D graphics.
I pretty much agreed with everything in the video, but saying working on game engines other than UE is wasted effort is just silly. People need alternatives. I love Godot because its super lightweight and opens in a couple of seconds.
@@How2Bboss I use Godot myself - I have a game project called Liblast (open-source). I professionally work in game development as a tech/3D/enviro artist with Unity though.
Godot has all the basic tools to make decent 3D graphics. I am working on a asset workflow and pipeline to achieve quite good visuals with much reduced artist workload due to handling common effects like edge wear, beveling etc at a shader level with using of special custom textures.
I see people pushing Godot to it's limits and it can get things done. Surely it's not as fast at rendering as UE, but it's still way more lightweight.
I'd rather see smart people investing in open technologies like Godot, Bevy and the like, rather than funneling all their energy into something proprietary that they can never own or have true control over.
@@How2Bboss who the hell wants to play games made on that garbage engine though? Thing can't even deliver PS2 level quality
@@wallacesousuke1433 look at the game “Road to Vostok”. Looks a bit better than a PS2 imo.
@@wallacesousuke1433 That is a gross mischaracterization. Godot has come a long way in the field of 3D. There just aren't many high-profile examples of 3D games made with it (or games in general, honestly). But even if it isn't on par with Unreal, it is at least on par with Unity.
Great initiative, all the support!
I haven't seen this level of in depth detail about going into how individual games designed how their game utilizes the graphics card & all the different graphical settings. Amazing work
Thanks for sharing your deep knowledge and showing the industry how it should be done. Thank you again.
This deep knowledge at thie young age is really commendable .
Great video. This is my first video of yours I’ve seen. I appreciate the use of realistic dialog when describing this stuff. During the first couple minutes I thought “what is bro yapping about”, and by the end I thought “damn bro, you ate frfr”
Thank you so much for providing an actual explanation as to whats going on and why there are issues.
Way too many armchair generals on the internet keep making vague general complaints about why UE sucks and "is fail" or whatever but no one ever actually explains why. I appreciate your work here.
It is very interesting seeing every single thing happening in the pipeline
30 seconds into the video and i see the TAA doing it's "work" on the hair
i really love your dedication, work ethic and your persistence to make sure your message is heard. keep it up man
if you guys wanna really play the game smoothly now and cant wait for patches to come, I strongly recommend the DXVK GPLAsync Mod. It runs the game in Vulkan, eliminating stutters nearly 80 %. But you lose DLSS and HDR.
I love that this is your passion. I don't have the brain power to have any kind of deep intuition of graphics rendering like you apparently do. I think it's great you put pressure on game developers and game engine programmers to do better.
This is exactly what we need more of on RUclips and in the gaming scene in general. I may not fully understand around 30% of the language, as I'm not in the industry, but I can see that your heart is in the right place. Protect the consumer and the keep the integrity of your profession alive. Please never stop!
Thank you so much for your support!
You are doing a very good and useful thing. I work with Unreal 5 on the other side of the world and I am very grateful to you. Success to the channel.
Its gonna be so funny if it turns out sony's "custom uscaler" is just xess
I wasn't expecting this type of video and I'm pleasantly surprised. I'm looking forward to seeing more explanations of these topics when it comes to videogame development, particularly UE5.
PLEASE TELL US ABOUT TRAVERSAL SUTTER AND WHAT DEVS CAN DO TO STOP IT
What tools they have, and so on.
This is my favourite new channel.
It's stutter from cpu side not gpu
@@mirukuteeaDoes it matter if not even a 7800x3d can’t run it properly? It is UE5 fault.
@@fawneight7108 what i meant is that this channel might not be the best fit to analyze the stuttering since they are focusing on GPU/shading
Im on a 7800x3d and 4080 and only a mod from Ultra+ maker Lucia managed to rein in the stutter. She also fixed several other ue5 games for me. So clearly SOME people like that modder and Threat know how to handle it
@@mirukuteea oh s**t I didn’t know what, due to how many stuff is still wrong on the gpu side lol. Do you have more info on why traversal stutter is in the cpu side?
praying for all of your success!!!
If you told me in 2012 that games would look this SH*T in 2024, i wouldnt have believed you
But the game still looks good
Thanks for continuing to make such well put together content, being effectively a layman it's helped a lot in my beginning to dive deeper into these topics. Kudos
not baking the GI in this game is such an absolute joke ...
Yes, or any kind of volume-based solution would've been nice too. While it wouldn't bring smaller-scale GI to the same level, the resultant performance or image quality and stability would've been so nice.
my take: it's being pushed by either NVIDIA or Epic themselves so they are forced to use better hardware or make use of unnecessary stuff like nanite or lumen (which are cool technologies but need a lot of work). This isn't about incompetence, but most likely about business.
"ohh shiny!"
Baked lighting causes more issues. Extensive foliage can create issues, and dynamic objects are also a problem, and the lights that cast dynamic shadows won’t influence the baked lighting. Also, the game has many dynamic lighting events, and even the flashlight uses GI.
@@justinrunner9384 would you rather play a game that requires you the highest end hardware to play, or sacrifice some realism to have the smoothest experience?
I remember seeing that there are many conferences made by big studios such CDPR that advocate devs to modify Unreal's rendering system and not to use Unreal's default. Wish these would benefit the devs to make something better
what? why would they remove support for hardware tesselation? if they have a better approach fine make that default but why remove it entirely?
Tessellation has been back since 5.3... Not sure why he chose not mention that fact.
HW Tessellation IS NOT back since 5.3, they added NANITE tessellation which is far slower and completely different.
Hardware tesselation is bad. (Efficiency even when tesselation is not happening, culling, not able to use high tesselation factors etc.)
If one really needs non nanite tesselation, compute variant should be made.
@@pottuvoi2 Assuming your referring to Demon Souls Compute Tessellation, we would have no problem with that if it wasn't 90%+(at least) slower than properly managed HW tessellation like Nanite. Despite the recent industry noise calling HW Tessellation "slow", we'll be putting this to the test since these claims do not align with our experience.
@@ThreatInteractive And similar, yes.
Quite many games use compute pass for it now a days.
Wow, you mentioned my comment. 🙂
Thanks for addressing my question.
Your time and effort, so I understand your decision.
Engines like Godot need to increase their market share.
But few people want to use them because of... their small market share.
A vicious circle.
>Engines like Godot need to increase their market share.
Oh no, you want not only performance-troubled games, but also crappy too.
>But few people want to use them because of... their small market share.
Because quality of engine and feature-richness is important too. Nobody wants to use half-baked solutions. If they want, they use libraries and write their own engine.
babe wake up new Threat Interactive
so silent hill 2 is badly optimized because of Fortnite?
Hope this channel blows up. Not enough people pushing back on temporal AA for it to ever change
Optimization god posted again, knowledge increased.
Got the Nanite stuff completely wrong, not an "optimization god" just because he seems like he knows his stuff.
@@colbyboucher6391 then go ahead elaborate, how we are supposed to know if he is saying truth if you can't prove it
@@colbyboucher6391 got what wrong exactly?
@@widrolo I'm not going to claim to understand any of it technically, all I know is that loads of people who use Unreal (regular people, not just big devs) have argued that he ignored Nanite's real uses (and criticizes it for something it isn't supposed to solve), that it starts to make sense if you're truly cranking your geometric detail and don't want any obvious pop-in. It makes that level of visual quality feasible, while if you were trying traditional LOD the cost would increase exponentially until Nanite is, in fact, faster.
@@colbyboucher6391 If you don't personally "understand any of it technically" then why don't you STFU?!
very well explained and detailed breakdown 👏👏
I enjoyed this video very much, and I hope Blooper Team takes notice of this video to help with performance improvements on the game. In one update they posted "Steam Deck performance improvements". While this did not change too much in optimization and I don't think sh2 remake will get steam deck verified, I really hope Blooper Team continues these improvements and maybe get us at least half way there.
Would it be possible for you to check some of the 2024 games we usually consider well optimized?
Like the Horizon Forbidden West or God of War Ragnarok PC ports.
Ragnarok isn't that good of a PC port, it still lacks graphical features that the PS5 has (like raytraced reflections)
@@Nate_M_PCMRthe ps5 version doesn't have raytraced reflections
@@gavinderulo12 yeah I kinda got that wrong, I was thinking about what's going on in this video from DF at around 5:30 ruclips.net/video/cITEbCFGutc/видео.html
@@gavinderulo12 The PS5 version does have ray traced reflections. Both Digital Foundry and NX Gamer have showcased and talked to the developers about it/ They aren't the traditional ray traced reflections and combine that data with cube maps
@@crestofhonor2349 which can totally be done in PC but they decided not to.
Omg I was looking foward to this!
What would be the better choice when choosing an engine today? I am convinced Unreal is not going to listen to critique and will remain focused on their own goals.
If, like the vast majority of people asking this question, you are an amateur hobbyist, then just download Godot or Unity and start your game dev journey. You've got a lot of learning to do before it's even time to start learning the ins and outs of a rendering pipeline.
If you're a professional then surely you can think of better resources than a youtube comment section to get your advice from.
If you are just starting out and want to learn, I highly recommend Godot. It's easy to understand, very well documented, and the community is usually very helpful. Also it runs on a potato, and does decently well in both 3D and 2D games.
If you need something more serious, you can certainly accomplish it in Godot as well, albeit you'd have to put in more of your own work. Unity in that sense could be better in some cases.
@@GlowingOrangeOoze nah, UE is the way to go, you dont even need to waste time learning programming, just Blueprints, so the time saved can be invested in more important, artistic skills
@@wallacesousuke1433 Learning programming is not a waste of time, and using Blueprints to the exclusion of all else is a trap. Blueprints run roughly ten times slower than C++. If you don't use any C++, your game will be needlessly slow and inefficient merely for the purpose of saving developer time, which is exactly the kind of corner-cutting approach this video and the entire channel argue against. If using Blueprints only is defensible, then so is every unoptimized misstep showcased in this video, given the devs were likely under intense pressure.
Yes it is a massive waste of time if you're a solo dev, since learning C++ is a useless skill (little demand and the version used by UE isn't even the vanilla C++) that is massively time-consuming, time that you could/should better spend honing your ARTISTIC skills, cuz that's what matters at the end of the day.
And BP's being slower than C++ doesn't mean anything these days, there are games made entirely with BP that perform better than AAA games with entire teams of programmers working on lol (Bright Memory and Mortal Shell for instance).
Thank you for this video! I'd love to see a similar one for Stalker 2 when it comes out, as that game will likely be a big gerUE5 game than SH2
re: the initial ~1.3ms frame dispatches... this could be debug and stat capture, the game had a lot I've disabled.
re: LODs. this game doesn't use LODs _at all_. even force disabling Nanite the UE LOD system groups are ignored and LOD0 is drawn everywhere. I believe this is to do with the hard culling at 60~80 meters (disable the fog in town to see it). perhaps the devs reasoned that only LOD0 would be needed due to this. however because of how UE builds HLODs this causes sudden HLOD builds (and then shaders/textures/shadows/decals on top of them) where there was no previous LOD to hide it... UEs HLOD system is already problematic (I believe it has a fence/race issue), and hard culling beyond the player's view makes this problem far worse. I've found no way yet to force proper LOD use or fix the HLOD race/hitch issue. HLODs are one of the major causes of stutters in Survivor and SH2
fog: another issue is in vanilla fog is drawn to (about) 170 metres. yet the player can only see 40~60 metres. reducing fog draw increases FPS by around 20%
reflections: while reflection roughness is clamped low (0.4), reflections add to diffuse indirect lighting, qnd raising this clamp dramatically improves bounce lighting in some scenes (while it is expensive, a good balance is 0.55)
AO: I have no idea why the devs chose SSAO for this game even with RT. GTAO or RTAO are very similar performance cost and look much better, and RTAO combined with lumen adds throw lighting to some scenes, improving lighting quality at no cost.
I knew that you would be doing this video once the reviews on performance started rolling, let my grab some snacks, already liked the video
This is the most demanding game I've yet to play on PC, it really brought my 3070 to it's knees and forced me to make huge compromises in terms of performance or image quality. I had a STRONG feeling that there were missed opportunities to improve performance and cut back the insane GPU costs. & don't get me started on the stuttering, which is frankly horrific and the worst example of UE Stuttering yet
"Boo hoo I can't play muh game on ultra settings, urr durr"
@@wallacesousuke1433 You're a fucking retard. I said nothing about Ultra settings, this game runs like shit and stutters on ANY settings
@@b1thearchitect401 I'm on 3080 and it runs fine with dlss and everything maxed out
Great video. I would recommend having the script closer to the camera, so it doesn't look like you are looking down while reading. Other than that, thanks for the info :)
I really like your videos and love your critiques on how games don't run so well even though our GPU is still in good condition. Can you check out non-Unreal Engine games today like Helldivers 2 because I have a 5 5700x3d and Rx 6800xt but somehow lose fps on higher difficulties even at lower Optimization settings even it lacks of Fsr and Dlss. Hope you check how ArrowHead run game runs.
Try DX11 in that game.
@@Funnky the problem DX11 is far stuttery and worser than DX12, thats why I rely on Lossless scaling to gain the Smoothness
Is helldivers 2 not unreal engine 4?
@@Gustaviustwinkelberry no
@@Funnky i just looked it up and it doesn't use unreal engine 4, it definetly has that unreal engine 4 look to it tho...
I'm working on a game in Unreal 5 and I'd love to hear more on this!
Thanks for the analysis!
Me: Having a histology exam tomorrow.
Also me: Hmmm, this is interesting.
Awesome video. I'm making my own game in UE5.
i have no idea who you are but im subbing right away, i love this kind of raw data
Wow, i haven't expected the speech at the end of the video. And i can't say i agree with your sentiment. While Unreal Engine becoming the industry standard for high graphic fidelity games, i don't think this is right. Unreal is a for-profit organization, which prioritizes their own benefits first, and the issues you listed here are caused exactly by that fact.
And this is not the only reason behind the development of the alternative solutions. For example, the aforementioned Godot is far more lightweight, and streamlines exactly that, and a relative simplicity for the newcomers. You probably can argue that Unreal Engine can be even more simple, but since your knowledge in CG and game optimization are quite advanced, i bet you already see the issue here: UE produces terribly-optimized results when the devs use the simplicity it offers.
The other reason to develop any alternative to UE is that C++ has a lot of its own issues, and also, the exploration of new possibilities is the only way to evolve what we currently have. That's why i'm sympathetic to Bevy. Their and Rust's nihilistic approach towards the established standards, which haven't changed much, is almost guaranteed to yield something revolutionary.
And lastly i want to mention the problem of legacy code. I doubt i need to elaborate on that.
"Unreal is a for-profit organization, which prioritizes their own benefits first, and the issues you listed here are caused exactly by that fact." Yes, this. They're undermining their own argument by dismissing competitors so broadly. It actually legitimizes Epic's efforts to monopolize the industry. This would remove any incentive for Unreal to improve, or heed anything they say.
Godot does not have a well-known 3D game -- and I'm not talking about realistic games. In fact, godot's biggest game is brotato.
@@gamerelaxbr2969 Well first, there are 3D games made on Godot. The closest to AAA is probably Sonic Colors: Ultimate, which, according to wikipedia, somehow uses it under the hood. Then there's Buckshot Roulette, which, i guess, has some popularity and is quite known among indie adepts.
Even if we ignore that, how does what you said contradicts my point? Not to mention that Godot is quite young compared to Unity and Unreal, and you can clearly see the correlation between how long an engine is on the market and how popular it is among bigger game dev.
Godot came out when there already was Unity, which already had a free plan, while Unity itself became popular because of it's comparatively liberal licensing, which allowed smaller devs to use it either for free or by paying a small fee. My point is that there not much games made on Godot because the niche has been already taken, not because Godot is bad or does not allow you to make a high fidelity game.
Moreover, it's always hard to make a realistic game with modern graphics. It requires competent graphics engineers to do so, no matter what the engine is. There are games that tried to pursue realism but due to lack of experience, the resulting product was either lacking graphics fidelity, style persistence or optimization. For an example you can look at projects like Mordhau or Escape from Tarkov. One made on Unreal Engine, and the other one on Unity. IMO, they are lacking at least two out of three aforementioned criteria.
As I already stated, industry standards are quite inert. If you want to have a job in game dev, you most certainly should learn either Unity or UE. Majority of jobs out there will require one of them. But if you're a solo, or you're a bunch of enthusiasts, why would you? Only because there were games made on those engines, and you liked them? Because they looked cool? Those games were made by more competent people, that had far more resources than you do. No one should ever choose a technology for their product only because someone some day made something cool using it, and refrain from using the alternative because no one ever used it before. This approach is shallow and dumb. This approach deems any potential revolution impossible and dooms you to mediocrity.
@@gamerelaxbr2969 Sadly true, but that's more due to a lack of adoption than anything to do with its innate capabilities. Though that hasn't always been the case.
@@gamerelaxbr2969 Road To Vostok is looking pretty decent. Eventually I can see it bringing much more fame to Godot (or it's forks, since Godot decided to go woke).
Thanks a lot for your work showing these frame analyses, I wouldn't know about it otherwise
After taking a look back at the UE3 games again, I am shocked how well most of them (even the late era ones like A Hat in Time, MK11 and Rocket League) were actually very well optimized! Could you take a look at some of them sometime and perhaps reflect on how the 3rd version of UE was way better of an era? Thanks, keep up the good content!
Are you running them on a system with period correct hardware? Otherwise of course beastly futuristic tech like what we have now can run those old games exceedingly well..
I liked UE3 generally speaking, however it would stutter a LOT and in almost every title (even with modern hardware, btw). - Now fortunately that stuttering could usually be fixed with a couple of config edits but you know only a fringe few of us bothered to do that. IT was so bad that anytime I got a new game I would look up the engine, specifically because if it was UE3/UDK I already knew what config entries I would make..
@@OGPatriot03 I never have stuttering issues in almost every UE3 game I play, even with no config tweaks.
this channel has been great learning experience of what to avoid. wich is amazing for me as a brand new dev starting with UE5 .Already avoid Raytracing because the performance cost is not worth what could be done with good baked lighting and felt lumen was similar case. Saw to many vids say nanite is great but so glad this steered me clear from it.
The thing I love about Threat Interactive is the fact that i'm actually learning something new without feeling like the content is being dumbed down like with other tech youtubers.
Why does nobody else talk about these things in games? In such great technological detail too. Thank you Threat Interactive
really what the aaa games industry feels like to me now is "WOW SHINY NEW TECH AND TECHNICALLY SUPERIOR TO BEFORE PUT IT IN JUST PUT IT IN WHO CARES ABOUT ANYTHING."
and then we get the game and its like "why is this running at 45 fps on my 7800xt?"
excellent video! S2
How can i integrate all these optimizations in my own game development?
This is some great stuff to learn about, as our tech gets stronger it seemed like our care about truly being optimized went down the toilet, it's good to see that people do still care and are taking action. good luck!
Would love to see a video on CPU optimization. The original kingdom come deliverence will be a very interesting subject. It still brings modern cpus to their KNEES despite being from 2018
that's because it's a pile of shite
Funnily enough I can run KCD on my laptop specifically because of that. It's an officel laptop, so it has a really good CPU, but a lacking GPU - MX110 :(((
I like the funny česká hra na hrdiny
@@thecardboardsword nah it's a great game
For something that has tons of unique NPCs on screen with extremely complex schedules it runs surprisingly well
Because it's the CryEngine of the time (crysis remasters), so very cpu limited from the get go (and main thread limited) :/
Immaculate coverage
Epic removed what? they removed HARDWARE TESSELATION SUPPORT? WHAT?! 13:29
Yeah I was shocked when I loaded up UE5 for the first time and realized it was missing. Their solution was to use nanite which is crazy imo because if you just enable nanite on an asset you get a performance hit that is pretty drastic. I'm really hoping they don't do away with the ability to do bakes in favor of an all lumen pipeline next.
Tbf I dont think this is too bad. Not talking about hardware tesselation in particular but removing old features is important. I've worked a ton on different frameworks (not game related though) and dragging along and maintaining old features often severely worsens the quality of new ones.
yeah it is stupid. i wanted to dynamically tesselate my ground based on camera distance and learnt it is not possible lmao.
@@pygmalion8952 it is possible, do your research.
Just use an older version of UE that has this feature? Obviously the engine devs will want to focus on their newest advertised features instead of something less important.
It's like asking why does UE5 not support directx7/9/10, since they're less resource heavy! The answer to me is quite obvious and is the same to your question, it's not necessary in the feature set of the newest engine version..
I suggest Remnant 2 for UE5 analysis. From my understanding devs fixed some issues like culling with patches and overall it's decently performing UE5 title without issues like shader/traversal stutter.
I am curious what he'd say about it now. I know the video creator once called the Remnant II devs "trash" on the Unreal forums because of its poor optimization on release and heavy reliance on Nanite.
First time viewer here! I really enjoyed this video, I’ve never watched a deep dive into a games engine and settings.
I’d love to see a review of Space Marine 2. It’s a great game, but it definitely needs optimizations for rendering and It’s also very CPU heavy.
PSSR using terminology from XESS might be because XESS is open like FSR, but FSR doesn't yet have an AI version, so Sony studied XESS as a baseline on how to develop their scaler. Moores law is dead has also said that his sources say AMD did not help sony make PSSR, but sony was willing to share knowledge about PSSR after it was made, which AMD is most likely to use to make FSR 4.0, which will be an AI upscaler and only supported on 7000 and newer radeon GPU's with FSR 3 getting small updates but being the last (and best depending on your view) spatial non-AI upscaler.
If what you are saying is true then man...
Amd is slacking
XeSS ain't open xd, not even open source
FSR hasn't been a spatial upscaler since 1.0. FSR 2.0 and onwards are all temporal upscalers.
man just found out this channel. Incredible goo stuff. would love to see analysis of other games engines like MH wilds and such but iunderstand focusing on UE5 might be the most important for now.
Regarding MH wilds: ruclips.net/user/postUgkxW_p7sIjNUYK1L4jhHJDDR6IN8_YH9VFs
Watching this video as an Unreal Engine 5 dev and discovering new ways to optimize games is kinda dope
I'm no graphics developer, but surely having so much fog could had been used to decrease polycounts. As an artist, it makes no sense in rendering the foliage to the most minute detail, only to slap the dense fog on top of it