Immortals of Aveum PC Tech Review: High-End Visuals Demand High-End Hardware
- Published: 30 Jul 2024
- Don Allen takes a look at the PC version of Immortals of Aveum. Just like the console versions, all Unreal Engine 5 features are deployed and they're suitably taxing on hardware - the difference this time is that superior upscaling options are available and in line with its demanding recommended hardware specs, it is possible to power your way to a good experience.
Subscribe for more Digital Foundry: bit.ly/DFSubscribe
Join the DF Patreon to support the team more directly and to get access to everything we do via pristine quality downloads: bit.ly/3jEGjvx
Want some DF-branded tee-shirts, mugs, hoodies or pullovers? Check out our store: bit.ly/2BqRTt0
For commercial enquiries, please contact business@digitalfoundry.net
Traversal stutters have been a persistent thorn in recent UE4 games. It's worrisome to see this issue occur in a UE5 game.
It's the most annoying type of stutter because you can't do anything about it; only the devs can try to fix it.
@@Z3t487 I hate stutters... a little stutter here and there isn't that bad as long as it's a small microstutter, but in general games should run smooth as butter, otherwise I won't play until it's patched...
In some games that takes 2 years... release no longer means "ready to play with good performance on your system" :(
Unreal Engine Is Garbage.
Despite its marketing, UE5 is just UE4 with some new features. Most of UE4's systems are still there, and many of its problems carry forward.
Many games, even on different engines, suffer from it though. Apparently it's rather difficult to design a streaming system that doesn't bottleneck every device in existence when huge amounts of data need to be loaded all at once.
I certainly would've hoped someone figured that one out & standardised solutions to it by now, but apparently that's only something very few developers actually care about.
These aren't high end visuals.
Lol wut, what's high end for you then
If this is high end for you then i won't judge, i completely respect your rock bottom standards.
This would look high end back in, like, 2014 compared to killzone on the ps4 😂
@@mohamad-abdolmao Yep. I played Dead Island 2 recently (UE4, baked lighting) and it looks just gorgeous. Hell, God of War 2018 looks way more impressive, and it ran great on older video cards (2080 Super). So I don't know where that title came from; I guess if it's UE5 it automatically gets a pass.
The visuals really aren't deserving of such a need for power. There's barely anything on display. That's the kind of game that should run close to DOOM Eternal at 200+ fps on those high-end specs and DLSS.
Exactly. High paced - high adrenalin shooter like Doom. There is nothing special about it. Give us frames!
Doom Eternal looks better than this at a glance.
The difference is every asset in Doom was hand crafted and optimized for gameplay over all else. You can tell when teams do it right and when they do it wrong. With Doom there was a different mentality going into the project. Immortals was interested in having the most "next gen" graphics while doom was, gameplay first all the way through. They just had awesome artists who knew how to optimize games.
@@XSR1K It absolutely reads better at a glance. Doing the blur-your-eyes trick is a good way to tell if level design is working properly: you should still be able to see enemies and routes for the player. Immortals would just be a blur.
Doom Eternal's Engine is far more efficient in comparison to UE5. UE5 is very demanding.
Nothing says high-end visuals like 436p/720p on current-gen consoles. ❤🔥❤🔥❤🔥
Visuals != Resolution
If Digital Foundry hadn't told us, we wouldn't have known, and we'd have said the game looks so effing next gen!!!!
And that says a lot about us, gamers.
In other words, because of that, for many, Digital Foundry is ruining gaming.
@CrashBashL Naah. You can pretty much tell that from the sharpness of the image. I'm cool with 1080p, but anything lower than that is noticeably blurry or muddy.
The game has great particle effects, even then i'd refuse regarding it as "next gen"
@@CrashBashL Though I can only judge from personal experience, I heavily disagree with you.
Simply because every time I see a game upscaled, I question its purpose: it only ever looks sharp when the game is standing still, at which point you might as well just stare at screenshots.
In motion, when upscaled from lower resolutions, it's utter dogsh*t and only serves lazy devs who don't want to tailor their development pipeline for specific hardware.
If you see a game upscaled from 480p, there's absolutely no missing the issue on a console that can output 2K or 1080p.
@@CrashBashL Then check your eyes, because most people could see when either this or Final Fantasy 16 was an unplayable blurry mess due to the resolution drops. Some people just decide to ignore it because they are itching to play the game for some reason (guess they still find it so entertaining that they are willing to look past the myriad of performance and graphical issues).
Here's the thing though. It may be technically impressive but it looks no more impressive than countless other games... Diminishing returns have hit hard.
I’d say it’s impressive, smokes anything last gen in visuals
The Order: 1886 looks better @@oozly9291
doing tons of technically impressive things that don't have most users going wow is a waste of time and resources - this is not Crysis or Half Life 2, not even close.
bruh, CP and RDR2 look 1100000000 times better than this, what are you on lol @@oozly9291
Devs said they were doing things "the unpractical way", which wasn't smart.
Can't wait to buy a PS5 pro and finally be able to play games at 720p 60 fps .
Lol
lol it's not a lie. On consoles they are upscaling to 4K from an internal resolution of 720p. Haha
Native 720p, without FSR upscaling from 360p
And pay extra money for it also lol.
When the 2080 Ti is the minimum spec for a playable experience, you know this game was going to be very demanding. The game doesn't even look impressive; older games look on par or even better than this one, and they performed better too.
What exactly is "high end" about these visuals? They definitely don't warrant the abysmal performance.
Hmm those red, green and blue colors are... something.
Literally 2000 games have way better visuals than this... Maybe not better resolution... Still
The high-end visuals mostly refer to Lumen and Nanite, which are both very taxing. However, I think people look past that since the art direction isn't very good imo. It looks a bit like Godfall in that respect. Goes to show it's not about the technology, it's how you use it.
@@jai2628 Looks pedestrian, not impressive at all.
@@majormononoke8958 this is a joke right?
I don't get how games like Doom Eternal can look so amazing and run extremely well on a lot of different low-end and mid-range cards, while we now have games that don't look all that much better than Doom but require a beast of a GPU.
All about the engine
Because ID Software knows what they are doing.
Because ID Software employs skilled software engineers and empowers them to make the technical decisions.
I don't have insider knowledge of either of those two companies specifically, but that's pretty much always what it boils down to.
It's all about having an engine built specifically for what you need and focusing on performance from day 1.
@@r3dtext The best tools in the world still need skilled people to use them effectively. It's why there are so many better looking games on older, less advanced engines.
Compare the overall visual presentation of games like this to last-gen games running on a fraction of the specs.
Games like Uncharted 4, Ryse: Son of Rome, The Order 1886, Battlefield 1, Doom Eternal, the new Battlefront 1 & 2, etc. all look on par or considerably better than new UE5 games like this overall despite using more traditional rendering techniques.
Add to that Horizon Forbidden West running on PS4, Killzone Shadowfall, RDR2, Forza Horizon, Senua's Sacrifice, and more. Heck, even Killzone 3 on PS3 looks comparable as an overall visual package.
Facts. Seems like DF has a crappy short-term memory cause they can’t seem to put things into perspective at all. Nothing about this sh*t screams “next gen” because the overall presentation doesn’t even stack up to last gen titles from almost a decade ago, lumen, nanite included.
The closest we ever got to some tangible “leap” was with that matrix demo.. and it’s just a demo made by the engine architects and aided by Coalition. It’s a pretty piss poor showing from a console generation that’s already several years old.
Forget these "recent releases". Go all the way back to 2013 and look at Crysis 3. Or 2016 and Battlefield 1. Or even look at Forza Horizon 3. They look quite realistic.
Most comparable title is Doom Eternal, it just looks and runs awesome compared to this generic tech-demo fantasy world
@@flyingplantwhale545 seems to be a case of missing the forest for the trees. Newer games have impressive technical features such as nanite, lumen, ray tracing, but these have steep performance penalties and rarely have dramatic improvements over former techniques. There has also been a huge loss in using impressive tools like cloth and hair physics and scene interactivity despite implementing lumen (which is the whole point of dynamic lighting vs baked)
Why do developers think that AF set to 16x affects performance in any way? Even to this day, when graphical presets are available in the graphics options, they usually also scale AF to 4x, 8x, or 16x depending on which preset you select. The lower settings hurt texture quality immensely while saving next to nothing in performance. I feel like they're stuck in 2003.
Yeah, 16x AF has cost something like 1 fps since forever.
Even on the newer consoles, AF is often low. I could maybe understand the original PS4 and base Xbox One having lower AF, but the fact that even the Xbox Series X has low AF is stupid.
Maybe they base the performance impact on the consoles.
DF has talked about how AF has a much bigger performance impact on consoles than it does on PC.
Was just about to say this, I have no idea why AF is even an option anymore in 2023 and I don't understand when developers choose to go with low filtering on modern consoles.
It must be some legacy optimization setting that's completely irrelevant by now.
@@vintatsh Because AF is much more expensive on consoles because of the shared memory. On PC the cost is negligible unless it's 16x, but on consoles even 8x already costs a lot of bandwidth.
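For context on the PC side of this thread: Unreal Engine exposes its maximum anisotropy through the `r.MaxAnisotropy` console variable, and on many UE titles users can override it in their local `Engine.ini`. A hedged sketch only; the config path varies per game, `<GameName>` is a placeholder, and some titles (possibly this one) ignore user overrides:

```ini
; e.g. %LOCALAPPDATA%\<GameName>\Saved\Config\...\Engine.ini (path varies per game)
[SystemSettings]
r.MaxAnisotropy=16
```

Since the runtime cost of 16x AF on modern PC GPUs is typically a frame or less, this is one of the cheapest image-quality wins available when a game ships with a low default.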
no optimization = no sales
Most people have less than an RTX 2080.
all time peak was 751 players
@@tyre1337 lol are you serious or just joking?
@@TerminalHeatSink serious, on steam database at least
@@tyre1337 jeez that's insanely low for a triple a game
I have an RTX 3080 Ti and the performance is horrible even on low settings at 1440p.
I'm looking forward to seeing Alex's tech breakdown, because I honestly struggle to see what's so impressive about this game to justify these performance numbers.
I mean, at the end of the day the game doesn't look very impressive to my eyeballs so it won't really matter, I guess, but I'm still curious.
There's no amount of fancy rendering that can save you from mediocre visual design (though fancy rendering can absolutely *enhance* great visual design).
Exactly.
The scale of a lot of objects in the world is off, most noticeably in the stone. The old Unreal 3 wet-specular-plastic look is in force in this game as well. @@AndreInfanteInc
It's about being physically correct with light and object LOD.
1. Developers have way less work now setting up lights and pre-baking things; it's all live...
2. No need to make 3 or 4 versions of the same object for different polygon counts...
So yeah, "faking it" can look great, but that is coming to an end.
Reminds me of the time when we had pictures as backgrounds in games in the 2000s: when we made the change to real 3D backgrounds it actually looked worse and was way heavier, but the rest is history and we know how it looks today.
If it has to be explained then these devs have failed. Nobody is upgrading their computer for this result.
"High-End Visuals Demand High-End Hardware" — the question is whether these are indeed high-end visuals. I didn't see anything remotely impressive in any image of this game that should require such high computing power. UE was really the big disappointment of the year, with so many poorly optimized games. I hope studios will either manage this engine better in the future or just drop it, given the lack of good releases using it.
UE is a disappointment every time. Any time I hear a game is made on their engine, I immediately expect poor quality
Unreal 5 is a good engine for game developers to use for games that will be on consoles as it seems to play better with AMD hardware. I don’t believe it’s going to go anywhere.
To be honest, I'm getting more and more confident that we've hit a solid wall graphically in games; there's nothing left to improve that would make a big difference. You can add more polygons, full dynamic lighting, better models, but we've had very good graphics for the last 3 years now, and it's questionable whether the current improvements are worth the performance hit, as they're very minor.
@@zq7246 Yeah, and PS5 and XSX run it at 720p and XSS at 480p. The resolution quality is so awful that it makes all the UE5 improvements irrelevant, and they were very minor in this game anyway compared to games released in the last 3 years.
@@Extreme96PL that’s upscaled but those base resolution figures and end results will surely improve once FSR 3 is released. All three GPU manufacturers are leaning into upscaling technology and as always everything will improve as technologies move forward.
Unreal 5 is going to make a lot of people need to upgrade their GPU, which is great business for Nvidia, AMD and Intel.
Those "high-end visuals", are they in the room with us now? ;)
The visuals are only "high end" because it's stuffing in Nanite and Lumen to cater to visuals that look barely better than a last gen game with baked lighting. The Lumen On/Off comparison near the beginning is completely disingenuous, you could absolutely get the game to look like Lumen On without Lumen on such a static game, and probably running a ton better.
But it costs money to implement baked stuff like in the old days! Now they want everything to run in real time so they don't have to bake anything, to save time/money.
I agree. They could have baked in the global illumination and everything would run a lot smoother and the requirements wouldn't be so ridiculous
@@Z3t487 And them "saving money" (which I don't buy, considering the stories we hear about budgets) makes consumers need to SPEND money on stronger hardware to actually run this bs.
3080 is not even good enough for their "medium" preset lmao jokes
@@Z3t487 Don't ask questions, just consume product and then get excited for next product.
Is there even a day/night cycle in this game?
To be honest, instead of super realistic and sterile graphics, i would rather see destructible walls for example.
That way the real-time GI would actually make sense too. If the world is static and there's no time of day either, you might as well pre-bake the whole damn thing and have it run well on machines with half the specs.
Sterile - that's exactly the word I've been looking for whenever I see any Unreal Engine game. Pretty in a shallow sense, but generic and... Yeah sterile
It doesn't even look that realistic. I've seen modded skyrim se look and run better than this.
I think we don't have much destruction in videogames because the excuse is that we have limited CPU resources. But some games in the past had good destruction (Red Faction Guerilla) and some in Crysis too for example.
What are you doing on this channel then?
This is for graphics lovers.
It's unacceptable for modern games to look and perform badly on all platforms, no matter which console or GPU you have nothing will deliver a good gaming experience. I've skipped so many games and decided to wait for patches or new more capable hardware because of it. This has to stop.
stop whining
@@gozutheDJ You are very much the problem of modern gaming
@@gozutheDJ Grow a spine.
@@gozutheDJ okay Arnie.
Unfortunately hardware is approaching the end of Moore's law and the point of diminishing returns. We are at a stage where hardware is stagnating unless a whole different approach is taken to electronics as a whole.
I think the poor DLSS 3 implementation, shadow glitchiness on some npcs, and crashes on console reveal that this game just isn’t as polished as it could have been and this probably extends to overall performance. Idk if it’s just me, but grass looks like it’s just floating on the ground, with little to no shadowing. Is AO/VSM broken?
The game does not seem to use VSM (Virtual Shadow Maps) but PCSS, a different shadow filter that has been in UE4 for years but is barely used by devs (especially on last-gen consoles). Low-res shadows (below 4096) have a lot of shimmering but look very good at high res (8192). Spotlight shadows have no filter, so you get unfiltered shadows, like a game from the PS3 era.
Hey, just change the DLSSG DLL from version 3.1 to 1.07 and the artifacts and ghosting are gone.
DOOM Eternal looks way better than this.
Yep, something tells me the devs were "in a rush", to say it as politely as I can! Why else would a game that is so colourful/effect-heavy in its core gameplay not support HDR!? It is THE game to show off HDR imo, but for some reason, nada!
This is what happens when Nvidia wants to be all involved and cram in whatever new stuff they're working on, to the ruin of a tech demo... I mean, video game.
This game honestly looks like it was made 5/6 years ago. The only thing that actually looks next gen is the trees. That is pretty much it. The Triple A Gaming industry are taking us all for fools.
The only game that truly looks next gen is Alan Wake 2
I think it's even worse than that. A lot of the new Unreal features are high-cost to maintain and don't deal well with a dynamic environment.
It's the same at the game studio I work for, where we've now switched to Unreal 5: the games look "prettier" as long as you don't move, and so much of our time and resources are invested in making the game look pretty that gameplay and the dynamics of the world really suffer.
It's an EA game. Never expect much from EA.
A lot of the quality gets lost in YouTube compression
It looks very unimpressive. Where are the benefits of owning a $3000 PC? I don't see them in this game.
The more 'current gen' games come out the more I really start to appreciate just how good PS4 era visuals really were.
Yeah, what's really killing me is that ever-present trend where we've sacrificed clarity, tangible visual fidelity, and legitimate interactivity for the sake of micro detail that the compromises often obscure anyway. What's the point of all that fancy meshing if the image isn't stable in motion, or the resolution is dynamically shifting at every moment the game should be at its prettiest, so it's presented at its worst?
Half-Life 2 wasn't the best overall game as far as gameplay went back in the day, but the way they integrated physics gave you the impression that this was the new standard. Regardless of how expansive Oblivion was, they made a point of integrating the Havok physics system properly (granted, it had some of the worst RPG storytelling and characters even by the low standards of the time, but that was a lack of skill, not a lack of trying, LOL). Then we started to see things like terrain deformation and other environmental realism and interaction with Far Cry, which would culminate with Crysis.
And now the mere idea that you can move objects around is treated like some sort of novelty, and that's intentional: often you can't interact with the environment at all, and you find so many static objects. You see this a lot with mods, and it's really awkward, especially in Source engine, because Source trains you to interact with objects. But here's a professional developer with all the modern engine tech and tools, pouring all the rendering performance into stuff that only brings it slightly closer, if at all, to some visual fidelity marker, while at every turn pulling back from how good it looks in motion, whether through rendering instability or through how simple and short-sighted the fidelity is in the game's primary form of interaction, you know, the shooting part. Why did you spend so much time on these environments when you don't do anything in them and they don't do anything back?
I think this is also a problem with the art direction for this game and a few other sub-AAA games. An art director and lead designer needed to dial it all back to make gameplay reads super clear. That is a big reason AAA Sony games look great and play great. Micro detail is cool, but not when it is everywhere. If you blur your vision, you should be able to clearly see enemies, routes, etc. That's a huge part of making games look good. It's not just raw graphics power. @@jakedill1304
Also, any natural environment you see in Unreal 5, at least the photorealistic ones, is extremely easy to create. There are online libraries of every rock and plant imaginable; well, not every one, but thousands. So those micro-detailed natural aspects are actually shortcuts rather than time-consuming. Google Megascans and you will see what I mean. @@jakedill1304
I replayed Shadow of Mordor this past winter maxed out on PC. Looks like it came out now
Shoot, some PlayStation 3 games still look pretty decent!
man this comment section really doesn't understand REAL innovation. I, for one LOVE playing a 7th gen looking game at a native 476p resolution locked at a cinematic 24 frames per second. This is what PEAK next gen gaming looks like whether you like it or not.
The magic gun arm looking like watching a YouTube video in 2006 on your family's PC, it's so immersive! Why did he bother mentioning ultrawide support? I'm throwing my monitor out and switching back to a 2000s LCD for the true experience this game deserves.
Lol
Another absolute banger courtesy of Unreal Engine. I love when slightly better than mediocre graphics for a mediocre game brings my RTX 3090 to its knees.
Nice proper use of “its” by not spelling it “it’s”.
This game looks terrible for how it runs on my 7900xtx
Where are those "high end visuals" I don't see them anywhere at all.
How you call that "high-end" visuals is beyond me
Every Unreal Engine game is a redflag for me.
Kudos for the companies that still have in-house engines.
Unreal Engine 5*
Even Bethesda's Creation Engine looks amazing and next gen with Starfield.
The next game from CD Projekt will be on Unreal Engine; they don't want to use their own anymore.
@@Swisshost Hopefully CD Projekt Red will optimize The Witcher 4, 5 and 6 on UE5...
Edit: and The Witcher 1 Remake as well.
@@KneppaH Not amazing or next gen. It looks OK, nothing impressive, and subpar to BF1. Decent.
This game really looks mid all around. Like the developers just chose "we're gonna make the most 2023 game ever"
And that's before concerns with writing, voice acting and other issues are discussed
Whole game from top to bottom looks like it was generated by an AI
My initial thought was Forspoken, but even more generic.
literally the only thing thats good about it is that it uses lumen and nanite.
@@TheDuzx The leads on this were dating the leads on Forspoken and felt bad, so now Forspoken looks better. Mission accomplished.
No matter how many new technologies are used and how impressive they sound, if the game does not even look good enough to justify those demands, it's a fail.
This game looks absolutely shite in comparison to the graphical masterpieces of the PS4 era such as Arkham Knight, The Order 1886, Uncharted 4 etc. Art direction is ultimately what makes the game's graphics stand the test of time, not technology.
This game is a magic FPS with a lot of lights produced by EA and that alone makes it a fail in my book. Outside of Mass Effect and NFS I make it a point to stay away from EA.
When I'm looking at this game and compare it to Doom Eternal on Ultra Nightmare settings... I can't see what is so demanding about this game. And it's running way worse than Doom with RT enabled.
@@michaelcarson8375 The game is not produced by EA; it was made by Ascendant Studios.
It was published under the "EA Originals" program, just like Unravel 1 & 2, Lost in Random, Fe, Sea of Solitude, A Way Out, It Takes Two, and Wild Hearts.
So that's a big L for you; stop spreading fake info.
@@calmarfps "EA Originals" are published by EA.
As expected, UE5 is shaping up to be this big, expensive tool that every kid on the block can code for. But it's no substitute for quality art design and the raw power it demands from the hardware is most likely rarely gonna be justified by the visuals.
this game is basically unoptimized tech demo...great games in UE5 are yet to be made
@@matka5130 Robocop. The Finals
Dlss and fsr should be a nice bonus for fps gain, not a necessity. Game devs need to scale back their stuff, like these visuals dont warrant a 2k dollars PC.
Try 3K.
I do sort of wish there was some sort of discussion about what performance is like *without* upscaling. I know it's assumed that anyone playing this game would be using some form of it, but I think it's important to discuss what it's like without it, especially since FSR seems not up to snuff visually for this game (specifically with the spell change animation).
Because of how Nanite and Lumen work, there isn't much performance to be gained by just lowering graphical settings; since there are no low-res models, upscaling is the best way to claw back performance. That's kind of my speculation. Having tried to mess with the settings, you don't really gain much performance.
I've seen a video of someone maxing it out at 4K Native on an RTX 4090: 29-50+ FPS.
FSR is like that in basically all other games too... It's a vastly inferior upscaler to DLSS and absolutely sucks in movement, as you saw. However, everyone can use it, not just RTX 20-40 cards, so I guess it evens out lol
@@alenko4763 Hopefully XeSS will become standard instead of FSR. XeSS is better in every way, and every RX 6000+ card can use it.
@@TerraWare With Nanite there aren't lower-res models or LODs, but models do get rendered with fewer polygons at lower resolutions.
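The point above is why upscaling helps Nanite games: Nanite targets a roughly constant on-screen triangle density, often described as about one triangle per pixel, so the geometry workload tracks the internal resolution rather than fixed LOD meshes. A deliberately toy illustration of that scaling; the one-triangle-per-pixel ratio is a simplification, not Epic's exact heuristic:

```python
def triangle_budget(width: int, height: int, tris_per_pixel: float = 1.0) -> int:
    """Rough triangle budget if detail targets ~1 triangle per rendered pixel."""
    return int(width * height * tris_per_pixel)

# Dropping from native 4K to a 1080p internal resolution (half on each axis)
# cuts the geometry budget roughly 4x:
print(triangle_budget(3840, 2160))  # 8294400 (~8.3M triangles)
print(triangle_budget(1920, 1080))  # 2073600 (~2.1M triangles)
```

With traditional discrete LODs, lowering resolution barely reduces triangle counts, which is why the classic "drop the settings" advice gains less here than upscaling does.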
A 3080 Ti for 1440p/60/medium with upscaling really is all you need to know about the optimization efforts here. The visuals are not good enough to justify the performance.
Also, I'm not sure if they were sponsored, but three videos for a game that nobody really bought or cares about seems like a waste of time with all the other games out or on the horizon.
It's good to get these videos out of the way, because when Starfield hits on PC it'll take a while to properly review that game.
There's no conspiracy. It's the first AAA game on UE5 after years of waiting. It's obviously a game DF is going to want to closely analyze for that reason alone.
With that card and settings you can run the 100% path traced version of Cyberpunk 2077 with more fps.
Of course they were sponsored. So were a ton of streamers. And this has resulted in a sub 200 playerbase on Steam right now. So... lol
Perhaps as a community we actually are better at seeing through the shilling than I thought.
@@IHeart16Bit Soooo many people don't know how sponsorship works LOL. If it's sponsored, THEY NEED TO DISCLOSE IT or they could face fines or worse... I know it's the internet, but it's not an outlaw area.
I know Alex is a big supporter of new tech, as should everyone here, but does this game really warrant the performance we are getting here? Will he be critical of all the things Allen pointed out and more or will he say UE5 is the future and we should embrace it. I certainly won't embrace something like this.
He most likely won't take it well.
The biggest issue is Lumen. Lumen allows devs to create a level and light it naturally in real time. That's super expensive. Most games use pre-baked lighting calculations all over, which save tremendously on performance but require devs to set everything up manually by hand. Nanite is similar in concept, but for geometry. I think they're both the future of game development, but right now they don't look much better than what we already have, and at very bad performance.
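The trade-off described here, pay the lighting cost once at build time versus pay it every frame, can be sketched in a deliberately toy way. `expensive_gi` below is a stand-in for a real global-illumination solve, not anything from UE5, and real Lumen is far more sophisticated than recomputing from scratch:

```python
def expensive_gi(sample_point: int) -> int:
    # Stand-in for a costly global-illumination solve at one sample point.
    return sum((sample_point * i) % 7 for i in range(50_000))

# "Baked" lighting: run the solve once at build time, store the results.
lightmap = {p: expensive_gi(p) for p in range(64)}

def shade_baked(p: int) -> int:
    return lightmap[p]        # per-frame cost: a cheap table lookup

def shade_realtime(p: int) -> int:
    return expensive_gi(p)    # per-frame cost: the full solve, every frame
```

For a fully static scene both paths produce identical lighting; the real-time path only earns its cost when lights or geometry actually change, which is the commenters' complaint about using Lumen in a static world.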
Way I see it, we kind of need games that attempt to push boundaries, if for no other reason than as a trial run for what can currently be achieved with cutting-edge tech, and how to build upon that in the future. With regards that future being Unreal, it's not just opinion at the moment. Considering how many studios are moving away from custom tech and embracing UE instead, at least the near future will indeed be all about UE5, for better or for worse, which all but legitimizes the concerns (and even some of the outrage) people have with regards to the first wave of UE5 games' system requirements, performance and visuals and whether or not the engine is actually going to be properly scalable.
That being said, after how quickly EA dropped support for the Dead Space remake and how poorly Jedi Survivor still runs, I definitely won't be buying this...
He will simp for UE5; DF has to protect their access to devs... money money money
@@fvallo What makes you believe that? He's been making a lot of noise about UE performance issues previously.
My problem with this game is that it doesn't have the kind of production values that demand technology like Lumen or Nanite. Lots of last-gen games look better and run four times as fast. I'm worried about indie and AA devs using Lumen and Nanite as a shortcut around spending time on baking lightmaps or tuning LODs, even when the game would look just as good with those.
Don't worry too much; a lot of indie devs know their audience doesn't have demanding hardware. I can't even think of an indie game that is hard to run, so I really think they will keep it that way.
Might just be me, but I would like to see more AMD GPUs in your tech/game reviews. A game that's stupidly demanding would be a good chance to compare a 7900 XTX vs a 4080 and such at the high end.
Just watch Hardware Unboxed videos for that ;-)
I mean, they'll be pretty comparable in performance. The main point was to measure performance across several PC tiers, not necessarily to compare GPU-to-GPU performance.
its just you
nothing personal , just business. if you know what I mean 😜
I really don't see the point in UE pushing new tech if that new tech does NOTHING to solve performance issues. Having these cool techs like Nanite and Lumen would be nice if they intended to push the envelope while keeping the performance hit at a minimum, IMO that would mean it's indeed new revolutionary rendering tech.
Instead, it seems like UE's new rendering tech is meant to make the development process easier while offering "higher-quality" visuals, but at a linearly or even exponentially higher performance cost. To me that's pretty worthless tbh.
UE makes development easier at the cost of performance. That's like it's thing. Terrible quality devs can push out 'good' looking games at awful performance because they just use all the automatic features offered by the engine. Instead of putting in time to optimize their game and build their own solutions.
@@42crazyguy Yeah, and to make matters worse we get grainy and blurry visuals for most UE titles as well... they have this "poorly denoised" look to them
I liked this game better when it was called Lichdom: Battlemage and didn't demand system requirements that would melt my PC and cost a small fortune just to play at a reasonable frame rate. 🤣
To be fair Lichdom was a very demanding game for the time
Was just about to type this
@@cluclap While that statement is very true, my counterpoint is that at the very least Lichdom did not require mandatory upscaling for stability.
Cyberpunk 2077 looks a million times more beautiful, and it's an open-world game, while running much better than this. These UE5 games need more time in the oven for the optimization process, or Unreal Engine 5 is just not as great an engine as we've been led to believe. Hopefully it's not the latter...
This game runs like the path traced version of cyberpunk but doesn't even come close to the graphic quality.
@@KneppaH yep, exactly
I wonder how many copies they expect to sell with those minimum specs.
1000
It has already bombed on PC. Peak concurrent players was less than 800 from what I saw. You price people out of your game and this is what happens
@@CaptToilet Shocking, right? 😱
😂
Yeah, I don't know what they were thinking making these decisions.
Considering it's the same company that expected Dead Space 3 to sell more than Dead Space 1 & 2 combined to not be a "commercial failure", they probably expected to out-sell the upcoming Modern Warfare 3
2015 Visuals Demand 2025 Hardware.
The art style made me think of Warframe (lots of golden highlights on old massive fantasy structures) and Warframe basically runs on toasters while looking more or less as good.
Talk about complete ass optimization. I don't think these devs know how to properly use UE5. People need to stop blaming the engine!!!!!!! As mid as it gets with these overrated AF CPU bound "AAA" mid games
@@nathane5287 Which is why I love it. Looks like 2030 visuals to me.
😂😂😂
Unnecessary particle effects make this game demanding 😅 The whole next gen is basically particle effects... Same for Remnant 2...
EA's DICE was making better-looking games in the 2010s 😂
It's all about the right art style and lighting... Look how good DICE's Battlefront/Battlefront 2 looks, then compare it with the new Jedi: Survivor and tell me how a game that looks so much worse needs more hardware...
I'm sorry, but I find the visuals simply disappointing when looking at the gear required to run it somewhat properly. Hell, on consoles I'd even argue it looks worse than plenty of cross-gen titles, due to the fact that the low-res upscaling basically hides most (if not all) of the 'extras' UE5 is supposed to bring. Not liking this at all. Might just be me though.
On top of it, it’s a highly controlled linear environment. What’s on display is dogsh*t even compared to some of the tech employed in last gen titles with older engines (and way more ambitious games design + physics). It’s just incompetent devs letting an advanced engine pick up the slack.. which ends up in a barely serviceable modern title.. but most definitely not “Next gen”
Yeah, if this game had the same breakthrough graphics that Crysis had in 2007, that could have been an excuse. But Immortals of Aveum just looks mediocre.
2080 super minimum for 1080p/60fps lmao, good luck getting more than 500 people to play your game
oh wait the game had an all-time peak of 482, what a coincidence
They know how to sell a game
2080 super minimum for 720p upscaled actually!
1080p, but with DLSS haha
In all games, Anisotropic Filtering should be set to 16X.
The game itself shouldn't have this setting so that players can't mess up their texture filtering.
Agree.
Getting +1 FPS (at max!) doesn't justify losing texture quality at distance.
With Nanite, there are way fewer textures where it matters, though. That's also why the game is surprisingly light on VRAM usage, AFAIK. A lot of the detail comes from geometry with a simple colour, instead of larger triangles with textures.
Still, anisotropic filtering is basically free on PC, so crank it.
I don't know if I trust modern devs to set AF to 16x considering how this game's devs seemed to think it would cause a massive performance hit.
That's why it's the one setting I have forced in the drivers.
@@42crazyguy Lol, I bet some intern thought the perf cost scaled with the multiplier (4x, 8x, 16x) or something like that. It still blows my mind that you could be a game dev and not know how that works, or at least what the perf cost is.
Can't wait to play it at 480p@30 fps on my GTX 1080.
I just want optimized games without the need for constant upscaling... If a 4080 needs upscaling to get 4K 60FPS+ that just seems like a lazy excuse on the devs part not to optimize properly in the first place cause "Hey just use DLSS or FSR"...
Yep. These new consoles can't even do native 4k at raw performance without upscaling. And it's a damn shame that even on a high end rig, games struggle to stay smooth.
Seeing this video, what comes to mind is "Just because you CAN, doesn't mean you SHOULD."
Yeah EA will definitely fix traversal stutters quickly.
Just like they did in Dead Space😂😂😂
Hey, just change the DLSSG DLL from the 3.1 to the 1.07 version and the artifacts and ghosting are gone, and for performance improvements, just force-enable Resizable BAR with Nvidia Profile Inspector.
UE5 is a slow bloated engine?
More like UE5 devs should not make their whole presentation depend on Nanite and especially Lumen, and should rather build the game normally, with Lumen and Nanite as a special treat. For a statically lit game like this, Lumen is pretty much useless; a forward+ renderer would have been more than enough.
This game doesn't even look that much better than what was available last generation, and yet it performs this badly.
Unreal Engine 5 just hasn't proven itself to be a performant, reliable engine yet. There are too many problems.
The game's art direction is bad; that's why, even though it is so demanding, we don't feel it's justified (another example being Forspoken).
Making a UE5 game myself, I can say it is a beast, but it is tailored towards beasty games. The whole package of effects that runs with it makes it so even small game projects with barely anything at all ask for a lot of power.
In other words there's barely anything you can do to optimize small games except reducing shadows and such. About anything coming out of UE5 will kinda run like that, be it a small game with almost nothing on screen like Aveum, or the incredibly detailed Matrix Demo.
So honestly I don't think making a game like that in UE5 is a good move. UE4 would have been just fine, proof is how Godfall or Atomic Heart run much better than this.
UE5 is more suited for realtime rendering than games at this point. UE4 games like Gears 5, Mortal Kombat 1 look much better.
@@RoyalGuardGeine Even UE4 games are a stutter fest, just look. Sooooooo much stuttering, sometimes even after the shader load. What I take from this is that devs should ditch UE5 and make their own engines, and only work with UE5 if they can fully utilise it. OR wait a couple more years until GPUs are powerful enough to handle UE5.
High-end visuals?? Where??
UE3 Era: Most mid end GPU can play game smoothly
UE4 Era: Some mid end GPU can play game smoothly with stutter
UE5 Era: Some high end GPU can play game smoothly with stutter
When I hear UE5 game, I think of something that looks great like the matrix demo. What I see is something more akin to looking like a last gen game while demanding current gen game specifications.
The RTX 3060 is the most popular GPU on Steam, and that is around an RTX 2070/2070 Super. The RTX 2080 Super is a good 20 to 25% stronger; it should not really be a 1080p GPU.
Hope this game allows us to use FSR 3 and DLSS in conjunction, because pretty much every GPU needs it. 1440p is already getting popular, and still, to get 3080 Ti levels of performance, one would need to spend at least $700 on the GPU alone. That's in the US; outside, expect closer to $900 to $1000. Simply not an option. Even a 3060 Ti/6700 XT would be closer to $300 to $400 in the US and $400 to $500 outside the US. Considerable money to spend for a mere 1080p experience.
I'm fine with a "mere" 1080p experience because i can always use super resolution if the game is not sharp enough for my taste, if my GPU still allows it for still a reasonable FPS.
Where did you see the 3060 being the most popular on Steam? I did the Steam hardware survey and looked at the results 2 weeks ago, and the 1650 is still the most popular card.
@@Cruxis_Angel The 1650 includes ALL the versions of the 1650: 1650 GDDR5/6 desktop and 1650 GDDR5/6 laptop.
If you look closely, the RTX 3060 is separated in two: the 3060 desktop and the 3060 laptop. Combine the two (since the 1650 is doing it) and the 3060 ends up being the most popular. And the 3060 mobile generally comes with a 100 to 130W TDP and will be either on par with the desktop one or within 10% of it, since it has more CUDA cores but sadly half the VRAM.
@@Z3t487 The GTX 1650 is capable of pushing 50-60 fps at 1080p medium settings during combat in Armored Core 6.
The most popular GPU on Steam, the RTX 3060, would likely do about the same FPS and settings but at 4K, as it already does 35 to 45 FPS at 4K high settings.
Now imagine it had upscaling. A budget midrange 3060 could do 4K high settings at 60 fps, no issues.
@@siyzerix Ok, my point was that 1080p can still look good, especially if you play on a small 1080p monitor, and at worst I can use DLDSR in the Nvidia Control Panel to play at a higher perceived resolution. I mean, it's not as good as real 1440p or 4K, but I'm satisfied with the image quality, so I don't feel the need for a higher-resolution native screen.
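The back-of-the-envelope scaling in this thread can be sketched like so. Note these are the commenters' own assumed figures (a GTX 1650 at ~55 fps, an RTX 3060 roughly 2.2x faster), not benchmarks, and the model only holds in a purely GPU-bound case:

```python
# Hypothetical estimate using the thread's assumed numbers, not measured data.
def scaled_fps(base_fps, relative_speed, pixel_ratio):
    # GPU-bound case: fps scales up with GPU speed and
    # down with the number of pixels rendered.
    return base_fps * relative_speed / pixel_ratio

pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160

# 3060 at 4K: same settings, 4x the pixels, ~2.2x the assumed horsepower
print(scaled_fps(55, 2.2, pixels_4k / pixels_1080p))  # ~30 fps
```

Which lands right on the 35-45 fps ballpark quoted above, given the rough inputs.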
I would really want to know the multicore utilization on CPU cores. You should have actually included that in this.
We already know UE4 still has a problem with scaling performance across multiple CPU cores.
That's Alex's specialty, I'm sure he'll touch on it.
Game code in general scales poorly on multiple cores. It's very hard & very time-consuming to do things in parallel when it comes to game code since a lot of calculations require the results from a calculation that went before it, putting those on different cores while possible can also result in desyncing which could lead to bad results (glitches or straight up crashes) & testing every single thing is a huge undertaking. That's why generally you see a main thread for the bulk of the game code & a few separate threads that're capable of handling a few mostly unrelated things & now with the rather bad console ports also offloading decompressing assets to the rest of the CPU instead of using more optimal PC-centric asset formats.
@@MLWJ1993 I think its more of a Unreal engine issue. UE has a history of such titles.
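The threading pattern described above can be sketched as a toy example (illustrative only, not any real engine's code): serially dependent game-logic steps must stay on one thread because each reads the previous step's results, while only genuinely independent work, like asset decompression, is safe to hand to a worker.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_physics(state):
    # Produces results the next step needs, so it cannot be parallelised away.
    return {**state, "positions": [p + 1 for p in state["positions"]]}

def update_animation(state):
    # Reads the physics output computed this frame: a serial dependency.
    return {**state, "bones": [p * 2 for p in state["positions"]]}

def decompress_asset(name):
    # Independent of this frame's game state: safe on a worker thread.
    return f"{name}:decompressed"

state = {"positions": [0, 1, 2]}
with ThreadPoolExecutor(max_workers=2) as pool:
    pending = pool.submit(decompress_asset, "level_chunk_7")  # off-thread
    state = simulate_physics(state)    # main thread, step 1
    state = update_animation(state)    # main thread, depends on step 1
    asset = pending.result()           # join the independent work
```

Splitting `simulate_physics` and `update_animation` across threads would require synchronisation every frame anyway, which is why the "one fat main thread plus a few helpers" shape keeps showing up.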
What frustrates me more is how they're not looking at driver overhead, or even consider it a possibility. All they're testing with is Nvidia cards.
7900XTX is just as fast as the 4090 in this game. That should tell you something.
But apparently to DF, only Nvidia cards exist, and they're spinning their wheels in the mud trying to explain the poor performance.
When we all know Nvidia's driver overhead in CPU demanding games is a big problem.
@@The_Noticer. It vastly depends on the CPU in question. With enough physical cores driver overhead isn't an issue, when GPU limited it also doesn't really matter.
I doubt a 7900XTX is any faster than a 4080 in UE5 games.
DF exposing themselves again! Where’s Alex Boogaga?! Too ashamed to show his face after shilling for these developers and their implementation of UE5 😂 worst game “analysis” of the year right here
Savage but fair comment
There will be another video with Alex. That was stated in the video, actually
@@igorthelight So what's the point of this video? Why do we need it? It didn't seem that good at all.
@@Z3t487 It's a courtesy to allow more time for patches and optimizations, which is disingenuous at best or complete favoritism at worst.
But muh german nanite yaaaaaa
Lighting looks so flat in some indoor areas with forced Lumen on, which only calculates a less precise form of ray-traced global illumination and reflections. How is this "high-end" visuals? A last-gen game can look and run better at the same time.
Why do we never see AMD high end cards tested? Like ever?
Nvidia shills theory
Imagine buying a 4K monitor for the sharpness and having to play all blurry, like rescaled 1080p, because of shitty optimization 😂😂
The art direction on this game was a real misstep.
It's great to see all the 5.1 bells and whistles but the game never feels like it does any of it justice.
If you go back and play Dead Island 2 which is an unreal 4 game, it both runs and looks significantly better than Immortals.
Yep, Dead Island 2 looks gorgeous, partially because of the great baked lighting and color bounce in the environments. Immortals of Aveum doesn't have a real-time day-night cycle, so I don't know why they needed UE5 and Lumen.
"AAA title"
Brother, DLSS Performance vs FSR Performance, the difference is RIDICULOUS. What is up with that motion, holy shit.
FSR has always been quite bad at any setting below 4K Quality, and it struggles hard with motion. Games like this, with stuff moving fast close to the screen, really show it off. GOW was another bad one.
So many Steam players:
198 - players right now
304 - 24-hour peak
751 - all-time peak 7 days ago
The comparison at 5:18 makes it look like FSR2 was not interfaced correctly and motion vectors are missing, wholly or at least partially
Yeah, looks like the hand is not masked for the FSR2 motion vector pass at all.
But it is a lot of disocclusion, which has always been FSR2's weak point. Especially on the consoles, where the internal res can go REALLY low, you'll see it big time. Where things move and it has to draw something that wasn't there before, FSR2 has always looked fizzy. Also in God of War, also in Spider-Man, etc. I don't think this is a wrong implementation; it's just a lot of movement in a very visible place 😔
@@jorismak Nah, it's a poor implementation. If you've played/used/modded FSR2 for a while, you get a handle on the tells.
Surely I'm not the only one who is consistently disappointed with modern games' visuals to performance ratio.
I may be going mad, but games like Red Dead 2, Star Wars Battlefront 2, and Final Fantasy XV are older games now and they look better than a lot of these so-called groundbreaking visuals in 2023 games.
This game peaked at 751 users on Steam. Deserved
The fact that he did NOT include CPU tests... is VERY very very telling!
Alex wasn't doing his job when he spent SEVERAL days reading the comments online before responding, and NEVER said anything about whether he reached out to the devs on whether they would do even the bare minimum to improve perf, to the standards we've seen in 2077 from over a YEAR AGO, or in CPU multicore performance in BG3!
Holy moly. That FSR is atrocious in motion.
Stop hiding performance behind DLSS. 4K DLSS Quality is 1440p; just put 1440p in your graphic instead of 4K DLSS Quality.
Holy shit that breakup of the amulet using FSR even in quality mode, I thought that was only happening in the consoles because they were abusing FSR to do 720p->4k, if you're gonna optimize your game with upscaling in mind maybe make sure the implementation is good?
It's shocking how dismissive this video seems of the horrible optimization in this game. It's a footnote instead of the main card. Many games look and run better on simpler hardware. This is a broken mess. FSR2 does not at all excuse how low the resolution needs to be on the PS5/series X either.
But it's not "horrible optimization", at least not on the developer's part. In fact the devs wrote a bunch of custom code to make up for some of UE5's shortcomings which is why there's no shader compilation stutter. The fact of the matter is that UE5's feature set is very CPU and GPU heavy. But just because something pushes hardware limitations doesn't automatically mean that it's "poorly optimized"
We don't know yet what the overhead of the UE5 engine is, and the developers haven't had a lot of time to figure out UE5 either.
UE has a history of being crap for ages after a new release version, UE4 still has compilation stutter and traversal issues
Surely we can only say that for sure when more UE5 lumen+nanite games come out to compare to?
@@jimmyvau Because there are better-looking games that run far better than this? Especially for such a statically lit game.
@@jimmyvau "But it's not 'horrible optimization', at least not on the developer's part."
They chose the engine, arguably after diving deep into its performance characteristics and understanding the tradeoffs.
It's very much their part, 100 percent of it to be precise.
This game doesn't look "chugs on a 2080 at fake 1080p" good no matter what you tell me.
Definitely an interesting review, but I'm curious why the game was only benchmarked with upscaling. Should viewers assume that the game is simply unplayable when rendering the game at native resolutions?
Yes
Unreal engine. Need we say more?
Ngl, the graphics aren’t particularly beautiful.
Agreed
@@JokerX350 To be fair, The Order 1886 is literally one of the most impressive-looking games out there
I think youtube compression is really killing this one on the technical side. It's the art style that's kinda "meh"
Well it’s UE5 for you. UE is one of the worst game engines out there. The engine just wants super high end systems to run no matter what. So any game that uses UE5 will run like 💩
@@cluclap Cyberpunk came out nearly 3 years ago, and this looks very meh on the technical side compared to even rasterized Cyberpunk.
I'm absolutely not impressed at all.
I think the devs brought out the shiniest assets they could, did no optimizations on them, slapped nanite and lumen over it, threw it into the wilderness and shouted "that pc is cool but can it run *our game*?" in hopes of being the next crysis-like hyped-up benchmark.
Compared to red dead redemption 2, this game looks like a joke I'm not even exaggerating.
I would have liked a better breakdown of exactly what the CPU bottleneck is. I have a 3900 CPU, which has the core count required but isn't nearly as fast as the recommended CPUs.
It's Unreal Engine, so many cores do not help with performance.
Games are largely single-threaded by nature; Zen 2 is on par with (or worse than) the 6-year-old Coffee Lake architecture in games.
@@tyre1337 The nature of VFX is kind of the opposite, so more cores should make a difference.
Core count isn't really important in video games. Since the consoles have 8 cores, all game engines will be optimized for 6 or 8.
AMD's first 3 Ryzen gens were... not impressive for gaming.
@@tyre1337 No, games are not single-threaded. They haven't been for over a decade. The last "kind of single-threaded" game that you can play right now is Final Fantasy 14, and that engine is from 2006.
This much is clear: This game didn't need to be on Unreal Engine 5
It only uses it because it can. It didn't need it, and it was NOT optimized for it at all.
The best use of UE5 so far have been indie demos and Fortnite custom content.
This is why we need Game engine competition, everyone is putting all their eggs in one basket.
Never thought I would hear the words "runs", "720p" and "PS5" in one sentence.
More people watched this video in the first 5 minutes than the number of people who played the game over the last week.
Ofc, it's an EA game.
While I'm not denying this game graphically looks better than others of the same type, it sounds like the devs were too lazy to optimize the game to run without an AI upscaler and are therefore just relying on DLSS and FSR to clean up their performance issues.
I mean, ofc! I have seen the game, on a video, running even at a minimum of 29 FPS at 4K Native on a freaking RTX 4090.
Then magically at around 80 FPS using DLSS Quality.
They decided to switch to UE5 for workflow reasons, and UE5 with Nanite just doesn't work without upscaling. It's not something they could optimise or fix; it's just a decision 😔
I'm sick and tired of games trying to use increasingly demanding hardware acceleration to offer things that are not even that impressive. What happened to the whole mindset of working within hardware limitations; hardware that most people have? I guess everyone's more driven by this artificial demand for increasingly expensive GPUs.
The game is horribly optimized
It looks nowhere NEAR good enough to justify how horribly it performs
720p on modern consoles is INSANE
You can't make a horribly optimized game, slap DLSS on top and call it a day
This is unacceptable
To me this game looks like a slightly better looking version of dragon age inquisition, a nine year old game, but in first person. Neither are bad looking games, for sure, but it just shows that the performance level of this game does not pay off visually.
Not only can they, this game is proof that they will do it again.
It’s not the game. It’s the game engine. UE5 is insanely demanding. Every game coming out on UE5 will need insane hardware to stay above 60. No fixing it unless Epic fixes the engine itself to be more optimal.
@@Jrfeimst2 this is simply not true, don't know what else to say
You can absolutely make a well performing game in ue5
@@Jrfeimst2 Not really; Layers of Fear is a UE5 game, and it runs perfectly on mid-range hardware.
Hopefully the handful of people who bought the game can actually run it.
Anyway... Armored Core next pls?
This is what happens when you rely 100% on automated technologies instead of clever optimizations and behind the scenes tricks to ensure a game runs smoothly whilst giving the illusion of a living world.
A lot of the foliage just looks like stock UE5 assets we saw in tech demos where they copy paste trees
I'm surprised how irritating that curved pointer thing hanging off our characters hand looks rn
Anisotropic filtering might not necessarily affect frame rate much in practice, but it should affect memory footprint.
So those point totals might be an indicator not just of performance tax, but also of VRAM and maybe even RAM footprint.
TBH they should have added a separate VRAM usage counter like most other games have. It would've made the GPU and CPU point system more valid.
Is this an IGN review?
Hey Don! Thanks for the Tech Review :3 I'm very much looking forward to you and Alex nerding over PC and rendering tech. Your videos are a fine addition to the DF collection :D
Also, I can't state this enough: if you need upscaling to hit the recommended settings on the recommended hardware, those are NOT "accurate" system requirements. We expect better quality of content from you, Digital Foundry.
Why is the lighting all real-time when the environments are mostly static?
"High-end visuals" in heavy quotes, because this game doesn't look better than the cross-gen God of War Ragnarok or even the other cross-gen title, Horizon Forbidden West.
I think what doesn't help is the art direction of this game
@@j.humphries8893 Horizon's art direction is awful and yet the game looks better than this one.
@@dmywololowol I mean, Horizon's art direction in its world is generic, but it's not awful. The lighting and foliage in that game are fantastic (I don't like the game very much overall, but I digress).
Also Horizon and GOW are $200+ million mega games that take dev studios of 350-400+ employees to make (plus contract workers). This game is from a brand new studio with like 100 employees. Its still AAA, but a ton of games per year release with this level of financial and staff support.
If this is what we can expect as the bare minimum, visuals-wise, going forward on UE5, I'm not upset about it.
I don't understand... The game isn't even a graphical marvel for the system requirements it demands
UE5 + lazy devs ;-)
I'm good with 720p at 24 fps, the 90s cinematic experience.
where are the high end visuals?
so where are the high end visuals at
This game is the greatest joke ever told: a 2080 Super minimum recommendation, lol. I wonder how it will sell.
Under 20K Steam sales, according to Steamspy, lol.
"As Tom pointed out in his video, the PS5 version runs at an internal resolution of *720p* while targeting 60fps"
"And we see the same results at *1080p using DLSS quality*
What is the internal resolution of DLSS quality at 1080p and why wasn't it mentioned?
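To the question above: DLSS Quality renders at roughly two-thirds of the output resolution per axis, so 1080p Quality is about 720p internal, the same ballpark as the PS5 figure. A quick sketch (assuming NVIDIA's commonly documented DLSS 2 scale factors; individual games can override them):

```python
# Internal render resolution per upscaler quality mode, derived from the
# per-axis scale factors NVIDIA documents for DLSS 2 (per-game overrides possible).
SCALE = {
    "Quality": 2 / 3,            # ~0.667x per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(width, height, mode):
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(1920, 1080, "Quality"))      # (1280, 720)
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
```

So "1080p using DLSS Quality" and "720p internal on PS5" are effectively the same rendering load.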
I hope this isn't the beginning of a trend... DLSS and FSR shouldn't be the default requirement for acceptable framerates; they should be tools to push for 120+ fps, not requirements to reach 60. Poor effort. Battlemage 2 indeed.
Something isn't right, its not bad looking but it is not particularly good looking either, and that's fine on its own, but the performance is unjustified.
Compare this to Doom Eternal, and there are few areas of out-and-out visual wins if you view the graphics as a cohesive whole. Eternal also runs waaaay faster.
Eternal flies on my old-ass RX 570 4GB... of course with compromises, many coming from VRAM, but I can easily go medium, with some settings on high, and still get 60+ fps constantly. And yeah, Eternal looks really nice, and Immortals? Well, nothing special. It has some nice graphics here and there, but the system requirements are absolutely not justified. Using UE5 won't make a game automatically pretty and optimized.
Nah, Eternal doesn't have the same geometry/polygon level.
@Metaltildeath-mg4tv that is true, but going from the 'cohesive' perspective, does this look nicer? Doom eternal in full flow has particles, lots of enemies with high geo and layered geo too. Its an interesting point of comparison
@@MetalDeathHead And it still looks great, because of the art direction and great use of resources.
@@richardtucker5938 No, Doom Eternal looks nicer. Imagine Eternal with Nanite and Lumen; Immortals really doesn't look good because of its art design.
It's been a while since my PC was below the minimum spec for a new game. I was looking forward to this one, as I like magic FPS games like Hexen and Amid Evil and wanted to support such a game. But in this situation, I'll wait.
Try Lichdom: Battlemage ;-)
Hey, just change the DLSSG DLL from the 3.1 to the 1.07 version and the artifacts and ghosting are gone, and for performance improvements, just force-enable Resizable BAR with Nvidia Profile Inspector.
@@ytytyt2265 1.07? I know that 2.5.1 is pretty good ;-)
It doesn't even look that impressive; older games may have objectively worse graphics but possess better overall presentation.
Epic and UE devs are honestly being stupid. If your game requires more than a 100 TF GPU to run at native 4K, you're doing something very wrong. Upscaling was not supposed to be a requirement. This doesn't even look better than Doom Eternal >.> If this is the cost of full real-time lighting, I don't want it.
Using the phrase "High-end visuals" loosely I see.
It's unacceptable to almost require up-scaling to have your game run smoothly by default. That's embarrassing.