I was really excited to try it as the first game in the franchise was kind of a hidden gem, but after trying this on my 6700xt @1080p nd couldn't hit even 60, I promptly returned the game, do not support such bullshit levels of incompetent optimization.
@@ManuSaraswatno kidding? I have a 6700xt and it smashes the first game. Have yet to grab this but it seems like I'll be waiting while they just let the public bug test it.
It was bound to happen. I knew some devs would start doing this when Control came out. The real problem with REmnant 2 is the system requirements claiming you can play with a gtx 1650 and the "recommended" as a rtx 2060. This is straight up bs to lure people into buying the game.
@@teddyholiday8038dlss was only the beta of shitty game optimization, frame generation will be the opus magnum of shitty game optimization. Devs will skip any proper multithreading and optimization and slap dlss on it. Man I hate this undustry.
Makes me appreciate DOOM 2016 and DOOM Eternal that much more when I see how poorly optimized most new titles are. The guys over at ID sofware just seem to know how to create incredible graphics engines that are designed to run well on a wide range of hardware.
They have their quirks too, for example if you have First gen Ryzen Cpu, Doom eternal will crash at settings higher than Low, plus their games are super fps dependant, yoi can softlock at 30 fps in certain parts + countless bugs
@@wallacesousuke1433 Play Doom Eternal at 4k resolution, max graphics settings, HDR and ray tracing enabled. Then come back here and try to say it looks "meh".
That's exactly what happened with me unfortunately, bought a Radeon RX 6700XT 12GB and playing in 30fps , and even with that my PC crashes after about 30 minutes. Fuck gunfiregames
@@chadjr2004this game. the 6700xt is a great 1080p card and can run 1440p games quite well, too, when they're not unoptimized garbage like this game.
Well, not really u can use DLSS and have over 60 fps and also if u buy 40 series u can use DLSS3 FG it works well only with minor latency hit and it also negates any CPU bottleneck u can get away with weaker CPU but of course it sucks. Also everyone is talking like a 8GB GPU won't work for future UE5 games and this game runs perfectly on 8GB VRAM the VRAM is not an issue.
If this game used Lumen in conjuction with hardware RT, this kind of GPU and CPU performance could be justified. But, man, in it's current state Remnant II is probably the worst game in terms of performance/graphics quality ratio. It surpassed even horribly optimized TLoUP1 in this regard.
@@stangamer1151 well, if you want to be amazed, check the steam ratings..it looks like it is very well received(at the time of writing this)..so i guess people don't care for low performance of using upscalers?..it really makes me question existance haha
I was kind of surprised at that also. Compared to all the photo realistic tech demos and the hype, I fear Unreal 5 isn't gonna actually mean much for a while in terms of better looking graphics.
I barely hit 75 (my monitor's refresh rate) at 2560x1080 21:9 on 13600k and rtx 3090 all ultra no dlss. Was very surprised with this "performance". The previous game was working flawlessly on my gtx 1060 back then. Today's gaming is disgusting, devs basically say "just buy 4090 bro, what's the problem?"
If you see unoptimized and buggy game, just don't buy one and wait for it to get fixed. People who buy broken games, are the main problem because of which devs can comfortably launch broken, buggy game without any consequences.
Yep, companies are gonna get lazy and lean on upscaling and frame generation vs just making the game properly. The game looks good but not enough to require upscaling
The requirements to play these games are increasing exponentially but very little advancements in actual graphical quality, the pc gaming market is in a dire need of another revolution which I don't will be happening anytime soon
The problem is most improvements this gen is in the background of modern titles. Devs try to use the new hardware to create systems to make their jobs easier at the cost of performance. Considering how many companies in many industries try to cut cost now I'm not surprised devs try to use the most convenient and cost effective method of manufacturing games.
RT,Upscalers, and frame gen are those "revolutions" lol. Although we BADLY need an RPG like Fallout with smart AI that can actually interact with player's speech.
@@syncmonism Then put the resources, money and time into it. It's not normal that such a tiny fraction of the budget goes to optimization when this is one of the most crucial aspect of games. You can make the best game in the world, if it runs like shit, it's not gonna be worth playing. It's a pure waste.
I think what they said is true. DLSS works better in this game than any other game i have seen. Its flawless to the point i cant even tell if its on or off. I believe its likely they put a ton of work to get the game to run this way. The end result is odd, in that it only looks a bit better than the original did and runs much worse objectively.
@@Alex.Holland if you look at the video it looks like it's stuttering no matter what. Comparing it to other gameplay videos on this channel it still looks like it's not running that well. Even if a lot of work went in to this I would bet money this was chosen as easier/cost or time saving option. It's lazy and it shows. If you create a game that runs like shit on top spec equipment it's shit no matter how you dress it up in excuses.
The funny thing is, this game has no ray tracing. Usually upscalling is kind of required when you kick in a couple RT solutions. Imagine what it would be like if they had some RT implemented. Crazy.
@@MechAdv I also expected something like this. But I had to upgrade, my GTX 1080 + 6700K just wasn't enough for me anymore. I upgraded to a RTX 4090 + 7950X3D. And I changed my monitor to that new Asus 240 refresh 1440P OLED monitor. That way at least it will stay relevant for some time.
Lumen would absolutely destroy performance. It's such a shame devs aren't going for a normal approach on optimization and just slap upscaling to smooth that out without caring about players.
Me: I'm going to buy a 4090 so I can play games on ultra 1440p without ever having to lower settings or use upscaling! Remant 2: I'm going to ruin this man's whole career
It goes hand in hand then, devs becoming more reliant on upscaling tech, while gpu maker selling said proprietary upscaling tech with little to no effort on actually increasing performance..hmmm
I'd say that's partly true. For some game devs the ability to have some DYNAMIC render resolution setup is low hanging fruit. Spend months of optimization at different parts of the game to find those areas you're dropping down to 20FPS or lower at times or just let the game drop as low as it needs to, even 360p causing massive blurriness. Hopefully people vote with their wallets and make it clear this is unacceptable. I certainly wanted the new Jedi game but didn't buy it.
Game looks good, but not good to require any amount of upscaling. Plague tale requiem ran better at native on my system than this game with FSR quality, INSANE. And we all know that game has one of the most beatiful graphics we have today. Plus when you change quality settings, there is no clear visual change. Basically this game is trash
It's crazy cos I thought even Plague's Tale should be running better when it came out since even tho it looks gorgeous, it's a really linear, straight forward game. This game doesnt even look anywhere near as good, yet, runs worse. Gaming is in the mud.
I can’t believe I switched from console to PC because console games were starting to be very low res upscaled by FSR, and here I am with my new PC about to experience the same thing 😂 fml
You dont have to play those few unoptimized games on PC. Luckily you can play thousands other games on PC, compared to like 50-100 on any console. And most popular games on PC run perfectly smooth even on low-end systems.
A small AA studio was able to include all three upscaling technologies, and frame generation? Where did they find the resources to do that? It must have taken so much time and effort, they probably had to starve themselves. Thank heavens big AAA studios are not going for such sacrifices.
Definitely would like to see the same tests on the AMD range of GPUs. Currently have a 6750XT. Don’t care how long the video is, very interesting stuff. Well done sir 👍🏻.
Hope to see a follow up with some AMD GPUs like the 6700XT and 7900XTX and the Intel A750. Also other CPU configs like a 12400, 12600K, 7600, 5600X, 5800X and the 5800X3D, how strong does the CPU need to be to stop the stutters?
With a 6800XT and a 5800X3D I get occasional stutters, but nothing too crazy. Going into a new area can cause stutters for a few moments but then it goes away, with an average frame rate of 70-100 fps, typically in the 80s and 90s. This is on mostly high settings with shadows and effects on medium, with FSR Quality. I was honestly pretty sad about the performance. It's playable, but I figured I'd get better numbers. I feel for anyone with a weaker PC, it must be miserable in some cases. Edit: forgot to mention this is at 1440p. Haven't tested 1080p as it screws with my second monitor
Just another garbage UE engine game,stutter non stop and just low FPS all around,cant get stable performance with an rx6800 r5 5600x 32gb ram and nvme SSD,im done with unreal engine games and not playing another one any time soon...
Nanite automatically culls everything not visually-perceptable in real time so the argument doesn't make sense. That's why LOD isn't relevant in these games, they simply cull CG-quality assets in real time. It will however try to load your GPU with geometries as long as there is headroom. So I think what happens here is Nanite can't gauge how much headroom your GPU has - a common theme in PC space. It seems to work very well on consoles.
I am 100% certain that CULLING *IS* being done. Do you know how I know? I know because the shadows in this game are a screen space effect, and as soon and the source of the shadow is out of the frame, the shadow itself disappears!!
I have a modest computer, Ryzen 3 with RX 6600, can run this game at high 1080p. But there are stutters from time to time, using FSR on quality makes it work really well at 60fps. But i agree this is not a solution, the game should be playable as is, not to mention people with older cards that don't benefit from upscaling. Not to mention the game isn't exceptionally demanding in itself, like some Capcom games that have facial rendering where you can't even tell if the characters are 3D models or real people. PS: It's even weirder, the fact that this game looks so similar to the first Remnant game in terms of graphics, yet they managed to make it require so much more resources. The first game ran even on a potato.
Ya, that's rarely how it works. Games are complicated, especially if you're learning new software. USUALLY the issue is upper management setting deadlines that are just not possible. I wish people would stop blaming the coders.
@@photonboy999 Id usually agree with this stance, sadly doesn't work in this case as the devs outright stated the game was designed with DLSS/FSR in mind.
Yeah i should not need to have on DLSS to gain above 60fps at 1080p on a 3070 but here we are, finished the game yesterday. Really hoping the devs can solve the performance issues.
I was talking to my friend a while ago about my hate for fsr and dlss. They are not bad technologies and have their use, but this exact situation was why I despise it. Dlss frame generation and the way they marketed it was the giveaway for me. The writing was on the wall, just can believe it wasn't limited to the gpu companies, and their want to sell underpowered hardware for higher prices.
@hastesoldat nah, not that deep. With the technology that is available, game devs don't have to fully optimize their games anymore. Essentially, "Who cares if the game doesn't have a fps over 60, dlss frame generation makes up for it.". It's Oversimplified, but that's the point. It boils down to laziness or cutting corners.
@@cosmickirby580 Still don't see how gpu makers and engineers are at fault there. It's the game devs that are incompetent and lazy as usual and are using the technologies wrong.
@hastesoldat again, gpu makers and engineers are not conspiring with game devs. it's not that deep. What I'm trying to tell you is that the technology, regardless of its intended purpose, allows for corners to be cut both on the manufacturing side and the actual data being processed. In oversimplified words, gpus don't need to be as powerful because supersampling makes up for it. Games don't need to be as optimized because supersampling makes up for it. This does not mean that they have to be conspiring with each other. In reality, they are working towards the same goal, making more money with less time invested, and this technology simply allows it to be done.
It could be that developers are starting to become lazy and instead of writing better code they rely on upscaling or they simply say just buy a better GPU. It's insane that even the 4090 can't keep up.
Thats the atitude alot of people ive run into in the PC games community have too. You're just expected to upgrade or stop complaining, which is horrible with the economy the way it is.
I want to play this but I'll be holding off buying this for now until they get around to finishing it. I miss the days where an optimized game was implied by the term "release version". Thanks for the heads up Daniel!
the game runs fine if you arent pushing over 60fps. i pre ordered and played the game fine enough. small stutters here and there but nothing bad until final boss. less than 30 fps on med settings
@@iPeanutJelly "If you aren't pushing over 60fps" "I pre ordered" Pfft okay. Really talking to the masses with this. The latter makes it sound to me like you're justifying the waste of money after the fact.
Yet you keep slapping that pre-order button! And "few years" is a bit of a stretch. Try almost 10 years! Since Witcher III - basically. That was the first big test the water "FUCK YOU" from developers. Putting NDA's on reviewing the game, because of the EPIC downgrade in visuals and the blatent "drop port" the PC version was of the console versions, with CDPR protecting their pre-order millions by threatening that german website with legal action for leaking the truth days before the release - in a desperate attempt to expose CDPR and allow people to get their money back in time by cancelling pre-orders based on FALSE ADVERTISING. Now the devs don't even try to hide it anymore - cos ya'll KEEP ON SLAPPING that pre-order button. Oh and you keep paying Nvidia $400 for entry level GPU's too. And $300 for their media centre options. GG. WP.
@@BladeCrew Just buy games that are already out - have been reviewed - and those reviewers said "Great game! Well optimized!". SURE - take my money! Not pre-ordering nonsense 2 years in advance and then getting trash like this - and having to spend $500 on DLC while you wait for the trash to get patched! XD Gamers all lost their minds! XD
DigitalFoundry's video about this game on the consoles says that the resolution is around ~1200p upscaled to 1440p for 30 fps. And 792p for balanced and 720p for performance. It's unoptimized and they're using upscaling as a crutch for playable framerates.
*sigh* same story with Forspoken, Jedi Survivor, and now this. I guess we should be getting use to this as the norm for most of us with mid ranged PCs, though I do plan on building a 7900xt build in future.
@@Stardomplay turn the setting to medium and most of the problem will be solved (unless it is game engine issue like those shader compilation stutter stuff). the problem is i see people take ultra setting as a measure to see if the game is well optimized or not when ultra have always been build to punish even the fastest hardware available at the time.
My brothers and sisters with last gen mid range cards this is a blessing the backlog has been waiting for us for years. Maybe in 2025 we can play this game at 1440p 100+ fps as God intended Pc games be played.
Tell that to the people who are still waiting for a GPU worthy enough to get a stable 60fps on Attila: Total War. The game came out 8 years ago, and the developers claimed it was "made with future GPUs in mind" when they received backlash for the performance.
@@beri4138 Indeed it does. Still runs like shit even with the latest cpus. Newer and older games perform much better. Very bad optimization regardless.
Awesome work! The one thing that's missing is checking it with a fast/modern Intel processor (or a fast/modern non-x3d AMD) to see if it's just pure raw power that's needed or 3D V-cache related, and then a 5800X3D just to cross check. Also why no AMD cards checked?
when dlss first released i loved the idea of it giving us a boost but sadly its now being used as a lazy way to optimize. years ago we managed to run games fine without dlss but now its like a mandatory thing. its so lazy. we could have insane graphics combined with insane fps but studios dont wanna optimize their games anymore. thats why i appreciate the last two doom games and also atomic heart for running so smoothly despite looking really good.
I bought this game a few days ago and tried it this morning. First time I've ever refunded a game on Steam. The gameplay was nice but man, the performance is so bad. I have a 2080Super and 9900k in this system and I wasn't hitting 60fps with DLSS on, and huge frametime spikes/stutters. I want to like this game and I'm sure I will if they ever fix it, but right now it's a mess.
@@od1sseas663 just say your a shill and cannot afford a 7800x3D or 5800x3D which wipes the floor with a 13900K in benchmarks, keep drinking that koolaid bro, let me how it feels when the copium wears off
@waitwhat1320 I got it for almost 500$ when it released. My GPU alone costs more than your whole pc you poor kid 😂😂 130fps in Last of Us. Yeah, such a "low end" pc 😂😂😂
I am afraid of the UE5 future... It feels like developers get a lot of convenience out of the engine, like offering unlimited detail with nanite, ridicolous amount of layers for physically based materials, cinematic lighting and camera effects. Yet we simulated all of that stuff, for a fraction of the cost 10 years ago. I think what will happen is that development studios will just cut corners in terms of staff, and publishers will profit more. The only AAA game I have been impressed me A LOT these last couple of years has been Half Life Alyx (and Cyberpunk, which also has it's own in-house engine). Which looks pre-rendered in VR. They use highly optimized old-school techniques, and it's impressive that they reach that level of quality while rendering two scenes simultaneously for each eye at insane resolutions and high refresh rates. Heck. I play at 5400x2700 at a solid 90fps with a RX6800. The game renders wonderfully at 4320x2160 on a RX6600XT. Yes, they use a combination of baked and dynamic lighting, yes they use parallax corrected cube maps for reflections, but honestly. Why not!? when they look that good? I'm looking forward to Counter Strike 2, just to see how good the Source2 will look and how performant the game will be. :)
I really like this style of video. Where you alternating step your way up in performance from the GPU and CPU. Really lets you get a fell for how the game performs on most hardware. Would love to see it with another CPU option although in know that's more work than just slotting in another graphics card or would require another system.
add another one to the pile I guess…regardless of what the devs claim, this is poorly optimized and shouldn’t have released until it was actually finished. Sadly, people will use this info in their GPU wars, instead of holding the devs accountable.
I think running Remnant II on a GTX 1650 (as per their listedn "minimum requirements", was incredibly optimistic of the devs. Remnant 1 (UE4) was very playable with a GTX 1080 Ti in 1440p native. But it didn't exactly flat-line against the monitor refresh cap (mostly 60-90 fps). And for reference, a 1080 Ti makes a 1650 look like a bad joke. I seriously doubt a 1650 could even come close to running Remnant 1 in 1440p. The footage of Remnant II (at 540p upscaled to 1080p) looks pretty bad. Lots of stutter, and the eye candy isn't even as pretty as Remnant 1 on UE4. Remnant one was _terrible_ at pre-compiling content as you approach new terrain. It would crash somewhat frequently because of not pre-loading early, and then freaking out when you move forward and suddenly need new render content. And that was with 11GB on a 1080 ti. Excellent testing by the way. Clearly the cpu/mobo/ram is critical for this game. There's a LOT of gaming pc's out there with a 9600K or less. That's gonna hurt Remnant II a lot. Adding "eye candy" (UE5) while relying on huge amounts of upscaling to make it work, is counter-productive. Actually looks worse than if they just made the game on UE4, running in native rez. Also, wow, a beastly system with a 4060 @ 1080p/medium can't 60fps? That's horrendously unoptimized. And a 4090 @ 1440p ultra only hits 70 fps, with no ray tracing in the game????? That's just broken. This may be the "hardest-to-run" game in existence. Generally cyberpunk with max RT is the lowest fps you can get on a game. But this beats cyberpunk for low FPS. You really NEED very high fps to play Remnant too. It's got a lot of content which requires 'immediate' reflexes. It's basically a gun-toting version of Dark Souls with much better multiplayer. I have to say this is not a very good pc port. Which is a pity, because Remnant is an amazing game.
This game doesn't have RT, but it makes heavy use of UE5 Nanite. Its likely that having the option to turn this off would dramatically improve performance. You touched on this in your Fortnite video when they moved to UE5 ruclips.net/video/WcCUL3dR_V0/видео.html
@@arenzricodexd4409 Yeah but in that case you can't just have an option to turn it on or off. Otherwise they would have to implement both ways, which would defeat the purpose of using nanite in the first place.
@@WeaselPaw implementing both old and new method is common when everyone still on transition phase. that's why we see when new DX being released majority of game did not use new API exclusively. rather they release on both old and newer API. most often the older API serve as a way for people with older hardware or slower hardware to play the game. but as usual for some gamer especially those with newer hardware they will accuse developer being lazy because developer did not use newer API exclusively and fully optimized the game with newer API feature when they did this. i think something similar also happen here.
Remember back when game devs used to brag about how real a world looked, how luscious the plant life, the shadows, the water, etc? These days Devs would rather you turn down all those settings so it looks like crap.
Your testing is also very generous; fill the screen with particle effects and more characters running their Ai Behavior and timers, and you'll definitely see the frame drops.
Something tells me the ultra setting just adds unnecessary stuff that devs left there for the players to get the best experience possible from the game if their system can support it but not needed for most people. Still the game is too demanding and needs some serious revision, but for now would like to see some testing at medium and high settings. Either way, it’s DOA imo
Those spikes most likely is the game building new shaders. The areas you already visited before has no stutter as the shaders are already loaded. It's common Unreal engine issue and devs 99% of the time don't optimize shaders
@@BlackParade01 Unreal Engine 4 had already a solution for this. Just build the shaders before the game starts, but this is just another developer studio that doesn't care about performance at all. They admitted in their own post they had done no optimization and relied fully on temporal up scaling.
How come everyone made games 5-10 years ago, that graphically look exactly the same as new games, and ran perfectly fine on old hardware like a quad core i5 with GTX 700/AMD HD 7000 at 1080 60fps? Now you need an RTX 3060/RX 6600 for basically same performance without upscaling, for same looking games
I think that it is ABSOLUTELY DISGUSTING that the 4090 required upscaling at any resolution with no ray-tracing to give good performance! The dev's on this game definitely DID NOT do their jobs and just relied on upscaling as a crutch for getting good frame rates. I think that this game is going to get a hard nope from me!
I remember only a couple of years ago, the 3080 was seen as THE card to have in terms of price to performance for 1440 and 4K gaming. Now it's been relegated to a 1080p card by this game. I have a 4090 and 7950x, and I can barely hold 45-50FPS at native 4K, even when it's overclocked to 3Ghz and pulling over 500w. Just insanity. I could understand it a little more if the game looked ridiculously good, but it doesn't.
In a low-cpu system locking the FPS to say at 60, 50, or 40 FPS will take out the strain from the cpu for a smoother gameplay. Do not unclock the FPS in the game settings! Also as grazy at it sounds in a low-cpu system DLSS balanced or even quality setting can give you better performance than the performance setting since DLSS takes strain out of the GPU but increases the strain on the cpu.
What's the point in all of this? Why even pretend we should use UE5 if this is the performance expectation when running 720p internally on current consoles... and when even a 4090 system can have issues running the game, then clearly it's far too soon for the engine to be utilized. It's not worth it.
They didn't make the game with upscaling in mind , they are hiding their incompetence with upscaling, this is ridiculous , the visuals are nice but nothing to write home about and games should be made for current hardware, it's like they didn't look at the steam hardware survey.
It literally crashes my system and I just got $400 AMD 16GB GPU for this game. I have to stay at 30fps with lowest setting. At this point Mad Max looks better on my computer than this game.
Remnants, the first one. 7800x3d and 7900xtx and 64gb ddr5 6000 cl30 max settings 3440x1440, i get 180-300fps. Looks good. Remnants 2, same 3440x1440, same ultra settings (no FSR) and I get 56fps AVERAGE. Turn on FSR QUALITY and I get 110-120fps but game looks like literal shit. The game isnt that much prettier than the first game.
The original choppiness you expirenced when using the i5 9600k, its because that cpu is one of the worst modern cpus for gaming. It only has 6 cores and 6 threads which makes it fequently perform worse than lower powered 4 core 8 thread chips. You wouldve likely had a better expirence using one of the minimum requirements cpus instead honestly. I expected a bad expirence the moment you said "9600k" i suspected this would happen and am glad you tested it with other cpus.
So on my PC with a 13700k 4090 in the city by the crystal at 4k high preset dlss off frame gen off I get around 50 fps. I play it with dlss set to balanced and get around 100 on average about 30 hours played so far. I'm still loving the game but it runs horribly.
This seems like Nvidia's Hairworks 2.0 is now playing ball with performance. This is my suspicion, cause a game running native in 1080p with so many stutters is highly not normal. Let see Nvidia put out a driver fix for this game in the near future and fix this very odd poor performance of this game.
As I've been stating for months now that the RTX 4090 makes more sense paired with a 1440p / 240hz monitor.(for a wide variety of games: Single Player RPG's Max settings / Ray tracing, etc. @ ~90FPS+ FPS@ up to 240FPS etc.) This upsets some as 4K monitors have become much more popular ( many more have them and want to justify the purchase obviously) but it really shouldn't come as a big surprise given that even older games such as Cyberpunk 2077, Control, Halo Infinite, etc. are plenty demanding at 1440p with a corresponding optimal ~ 90+FPS experience running on an RTX 4090 PC. So why anybody would be surprised that newer UE5 games wouldn't somehow just farther make this point clear that the RTX 4090 is more ideally suited for 1440p is just bewildering.. 1080p / 1440p /4K is NOT what it use to mean, an issue many seem not to grasp.. Really though having both a 4K/ 120Hz / VRR/ OLED and 1440p / 240hz / G-Sync display I've preferred gaming on my 1440p monitor overall with my RTX 4090 PC. The largest leap in visuals is going from 1080p to 1440p, 1440p to 4K isn't nearly as mind blowing (diminished returns after 1440p) when it comes to gaming. For example, if the option is 4K/ 60FPS or 1440p/ 90FPS I'd choose 1440p EVERY time. The consoles have more VRAM (~ 13.5GB "usable" for games) than they literally know what to do with and this is the reason for so many of the terrible "PC Ports" IMO which is a shame. The GPU's in these consoles (~ RTX 2070 at best) aren't nearly powerful enough to justify gaming at 4K and thus the reason for the upscaling / dynamic resolution gimmicks and even still the performance "FPS" is quite abysmal ~ 30FPS. Regardless of this terrible error via pandering to ignorance and marketing "4K" (4K more better...)with a $500 budget, game developers should NOT let this be a lame excuse to release terrible PC Ports that run like crap due to absurd VRAM requirements so that the consoles can run high textures at ~30FPS.. Pretty visuals don't mean much when your "in motion" performance is sub-par. As far as CPU requirements, again no real surprise that the new platform (AM5 / DDR5) will begin to start showing its advantages more over the older (DDR4) platform as time passes. People love to complain (a favorite pastime of many / Nvidia, AMD, Intel is out to get me, etc..) about the price of admission for AM5 / DDR5 but it's not like there would be no benefit to spending more to enjoy the benefits well into the future. It pays not to be short sighted. The RTX 4060 is not nearly a terrible as many have made it out to be based on many reviews / benchmarks I've seen. The problem by in large is that most reviewers aren't providing the proper context but instead pander to the outrage mob whom demand outrage porn. Very EASY to do considering the horrendous state of the economy in general - Just look at the comments with most "likes" and this point becomes crystal clear. People very often don't like hearing the reality (often involves more $$$) of things but it is what it is regardless of that the mob wants to hear. As far as having to use "upscaling" to play the game optimally I find it difficult to become remotely bothered by this really. I mean if the game still looks and plays well is it really that big of a deal? I mean who would have thought that over time games would become more demanding to run… Again comparing 1080p /1440p/4K in the past to now is like comparing apples to oranges. Another thing people seem not to understand is that viewing videos on RUclips (Decompression, 60FPS cap, etc.) is NOT the same as what the game looks like when playing on a high refresh monitor natively at home.. So the comments about how the game doesn't look good, etc. are pretty irrelevant / farther demonstrates the appetite for outrage porn - people are broke / life sucks and why did I purchase this 4K monitor again?
Good comment, but the consoles don't actually have as much VRAM as you think. Very few games on Xbox X or PS5 actually use more than 8GB, 10 or 11GB absolute max. The ps5 reserves a huge chunk (over 3GB) of the shared RAM for the OS, and obviously not ALL of the remaining RAM is used for video!
Nanite can be really heavy and is directly tied to number of pixels being pushed, its inherent to how the tech works. It basically tries to cull the number of polys to the number of pixels for a per pixel effect, thats why upscaling works so well here. Now imagine when we get games with Lumen as well. Hopefully the tech will mature nicely tho and we will see big performance jumps.
While i was disappointed with the mandatory dlss, I was blown away at how well the dlss worked in this game compared to any otehr implementation i had seen before. Other than the standard parallel line distortion effect, I legit cannot tell when dlss is on or off, or what setting its on. 3070 and I tweaked settings to get a rock solid 72 fps at 1440p and the game is extremely playable.
SAPPHIRE 7900XTX NITRO+ - 5800X3D = THE GAME OPTIMIZATION IS CRAP -------------->> DONT EVEN BOTHER TO PLAY THIS GAME --->> LIKE THE DEVELOPERS DON'T BOTHER TO OPTIMIZE THE GAME
I was expecting a similar system requirement as the previous game, graphically the game looks pretty much identical to the first one, I don't see any benefits from using UE5 other than bumping new GPUs sales of course.
I have a Ryzen 7 5800x, RX 6600 XT, 32gb DDR4 at 2800mhz, and the game is on a gen 4 NVMe SSD. Smart Access Memory (SAM) is on, and at 3440x1440 (ultrawide) I can choke out anywhere between 12-63 fps WITHOUT UPSCALING. Like what the actual hell. I can run something like Metro Exodus with RAY TRACING at 60fps minimum. Yet, this game makes me belive I'm still using a Vega 8 iGPU to play it. Cranking down my resolution to 720p of all things lets me run the game at a stable 50-70 fps. The game is fun, until you play it in open areas or against enemies that have moves with special effects. The devs should be ashamed of themselves for releasing a Crysis in 2023, and this makes me dislike/hate upscaling even more. It was meant to be a nice bonus to increase performance to hit those sweet high frame rates (120-144+), not be a requirement for playing a game...
DLSS was a crutch for lower-end systems, now it's a crutch for people who have no business making games to call themselves "devs". If it didn't exist, Remnant 2 would not exist. Anyone can make a POS unoptimized mess, not touching any engine code, only using BPs, not even knowing basic CS concepts, and not optimizing any of their environments. The hardware is not "behind", it's the software that is lagging behind the hardware insanely. The software isn't utilizing the hardware properly. UE4/5, apart from physics/anims is primarily single-threaded. You pay $300 for your fancy 5800X3D only for the game to use like 2 cores for important things like drawing and the game thread.
Given the recommended specs, I don't see why higher end systems would need upscaling. I'm running a Ryzen 7 3800X/6900XT, which is well above the "recommended", especially on the GPU side. This is a real problem, I've been trying to give developers the benefit of the doubt when it comes to PS5/Series ports, but it doesn't seem like Remnant 2 has that excuse. Eventually, this is going to become a problem for developers, because people are quickly getting sick of the idea of release now/optimize later.
If we go by what devs said regarding the use of upscaling tech then those requirements are for Dlss performance at 1080p in the case of 2060 for 60 fps and FSR performance at 1080p in the case of 1650 for 30 fps makes perfect sense.
Quando comprei minha RTX3060Ti fiquei super feliz, capaz de rodar todos os games atuais e ainda por cima em QHD (alguns com ajuda do DLSS para manter o frametime estável), agora ver que pouco tempo se passou e me sinto usando minha velha GTX970 de novo (olha que ela durou viu), essas otimizações estão porcas de mais.
i mean if they cant do their job i simply wont buy their game. a game requiring upscaling is a shit game and doesnt need to be supported w my money. im blown away the 4090 only gets 70 fps in 1440p in this game. i wish you would have done high settings along w the ultra. most of the time you cant discern between the 2. at least i know i cant.
And now perpare that this BS is going to be the future for the next couple of years. UE5 is just broken at its core. I've developed quite a few Projects in UE4 and tried many of them in UE5, generally performance is cut in half even when not using Nanite, Lumen and Virtual Shadow Maps. 2 completely identical Projects (down to the Editor Settings), in UE4 i'm able to get around ~200fps while in UE5 the same scene with the exact same settings cuts the fps down to ~95fps. Power consumption and usage of the individual parts is still higher even while pushing way less fps (and while having all new features disabled). Another good showcase that this Engine in it's current form is just broken is Epic's beloved CashCow Fortnite. Before the big switch to UE5 i very rarely dropped below 170fps on Epic Settings (120ish with RayTracing enabled) however since the switch to UE5 i barely manage to scratch 100fps on MEDIUM to HIGH settings (again disabling the advanced features does not seem to help very much) in a game that still looks like its from 2010 and on top of all that the stuttering has become unbearable. Also keep in mind that this was tested on a quite powerful system with an RTX 3080TI.
I have a feeling Nanite is responsible for the dramatic improvement made by the upscaling compared to other games. Nanite likely tries to put more detail in an object the more pixels it takes up on screen (provided there is more detail to be had), thus decreasing the resolution, either manually, or via upscaling has the side effect that it changes both the amount of work the gpu/cpu needs to do for the scene in general on top of the fewer pixels it needs to fill. Not sure if nanite has a multiplier for how much detail it works to put on screen, and the remnant devs set it too high, or maybe nanite doesn't provide that knob (or make it easy to change) so a resolution change is unfortunately the "easiest" way to trade off detail for performance.
For your first point you are correct, but from my understanding of Nanite you are incorrect on the second. For Nanite to work as intended you need a target framerate, if FPS < Target FPS : Nanite lowers details if FPS > target : Nanite Increases details Uncapped FPS = Max Details at all times
Oh oh, but mister Owen, my PC is ready, clearly you missed the part where i didnt buy a 1440p or 4k monitor, so my new pc feels stacked in any game at 1080p.
This is the future of games. Unoptimized, TAA-forced, DLSS-forced BLURRYNESS FESTIVAL. Whats the point of Nanite, for example, if the image quality is so crap that you can't even appreciate it?. It's really beyond me how we've got to this point.
This is one of the first UE5 titles. UE4 released in 2014 and was relevant for the Xbox One / PS4 generation. For an engine that is going to be a cornerstone for the next decade, it SHOULD tax current hardware. I'm not saying the game is optimized. This also isn't a AAA release. The implementation of Nanite and the geometry is impressive coming from a team of this size. Without implementing next generation lighting, sure it may look a little flat and less flattering. That being said, I would be more concerned if early UE5 titles released didnt push the envelope or ran flawlessly on a CPU or budget GPU (X060) released five years ago.
People are saying ''the game doesn't even look good'' but graphically it does, the art direction is just VERY specific and liking it is subjective. You have a valid argument about the performance, no need to bash something for no reason.
the graphics look average/decent to me but the graphics are nowhere near good enough to warrant those system requirements. There's a level in black ops 3 zombies called "Zetsubou No Shima" which has a similar art style to what's shown in this video, but it runs perfectly on my gtx 970 and the graphical quality is pretty much the same. and that game released 8 years ago
@@j-swag7438 I took a look at a few RUclips videos of Zetsubou No Shima and no, that isn't ''pretty much the same''. The details and overall quality are obviously far better in Remnant 2. Like I said, the argument about performance is valid but saying it looks 'bad' is subjective.
Thank you SOOOO much for this review, saved me from buying it and being disappointed yet again with a new AAA title on PC. There is no way I am paying money now for something that will be fixed later. I didn't see this issue raised in any of the reviews I watched. Again, thanks!
i use a 4080 at 1080p and with a 13600k and this is 1 of the games that actually pushed my gpu to the max and without dlss/framegen i was having around 100 fps average AT 1080p and in some areas it would actually drop to as low as 74 frames
While this game doesn’t appear to be breathtaking visually, remember that UE5 demos WITHOUT Nanite or Lumen from 2 years ago were crushing the 3090. Just because the devs didn’t do a good job artistically doesn’t make the engine any less challenging to push frames through. UE5 is the reason that I’m waiting until next gen to upgrade my GPU. There isn’t enough perf on offer with this generation to get 3-4 years out of a card from this generation.
DIGITAL Foundry just released a performance analyze of this game on consoles and its running internal res of 720p @ 60fps on balanced mode with drops to 50 upscaled to 1440p. quality node @ 1200p @ 30 fps consoles also performed very badly its not just a pc issues its how gaming has become now days.
No game should require upscaling to be playable with modern hardware. A 4090 struggling at 1440p without upscaling is a joke and sign of incompetence
I was really excited to try it as the first game in the franchise was kind of a hidden gem, but after trying this on my 6700xt @1080p nd couldn't hit even 60, I promptly returned the game, do not support such bullshit levels of incompetent optimization.
@@ManuSaraswatno kidding? I have a 6700xt and it smashes the first game. Have yet to grab this but it seems like I'll be waiting while they just let the public bug test it.
You are not educated enough to know that not everything is possible without upscaling these days yet.
@@KeepAnOpenMindyou shouldn't need upscaling with a 3080 at 1080p that is just bad.
and the fact that the game doesn't even look good is basically salt on the injury. i mean com'mon, rdr 2 wipes the floor with this game in visuals.
That's just outright foul. Upscaling was supposed to help lower-mid tier cards gain playable framerates, not serve as a crutch for poor optimization.
It was bound to happen. I knew some devs would start doing this when Control came out.
The real problem with REmnant 2 is the system requirements claiming you can play with a gtx 1650 and the "recommended" as a rtx 2060. This is straight up bs to lure people into buying the game.
Al least dlss looks really good in Remnant 2. The game looks crisp even on performance mode
That my good sir, is why dlss and fsr were created. For poorly unoptimized games!
That's so true.
At least it doesn't stutter as long as you are not using an ancient CPU
The fact that a game which looks as average as this NEEDS up-scaling is just ridiculous.
I have a feeling this is only gonna get worse
I would say shadow of war looks better LMAO
@@edzymodsI just watched a video from kevduit playing it. And I was like “damn this game looks good as hell still”
@@teddyholiday8038dlss was only the beta of shitty game optimization, frame generation will be the opus magnum of shitty game optimization. Devs will skip any proper multithreading and optimization and slap dlss on it. Man I hate this undustry.
@@edzymods I would say that torchlight 2 looks better than this.
Makes me appreciate DOOM 2016 and DOOM Eternal that much more when I see how poorly optimized most new titles are. The guys over at ID sofware just seem to know how to create incredible graphics engines that are designed to run well on a wide range of hardware.
They have their quirks too, for example if you have First gen Ryzen Cpu, Doom eternal will crash at settings higher than Low, plus their games are super fps dependant, yoi can softlock at 30 fps in certain parts + countless bugs
They look meh though
@@wallacesousuke1433 Play Doom Eternal at 4k resolution, max graphics settings, HDR and ray tracing enabled. Then come back here and try to say it looks "meh".
@@DeadPixel1105 Doom eternal looks awesome, but Id is known for optimizing their engines well. Thank Carmack I guess.
@@DeadPixel1105 the art style is meh, no amount of graphical fidelity can save these games :P and I hate FPS's so I'll glad decline your suggestion
Ah, the magical process of seeing my 3080 turn into 1080p card with each new release.
Unreal Engine 5 game that looks like a Unreal Engine 4 game and has the performance of an Unreal Engine 5 . Amazing!
Haha yes!!! I didn't even realize it was an unreal engine 5 game. Until today. Had me scratching my head
It actually has the performance of a game made 5 years in the future. Mind-blowing 😂😂😂
@@AntonioLexanTeh 😂😂 that is the truth.
Science.
Even ue5 runs like shit and should not run like that.
Imagine paying $400-ish for GPU and being able to play new games at 1080p 30fps 💀
That's exactly what happened with me unfortunately, bought a Radeon RX 6700XT 12GB and playing in 30fps , and even with that my PC crashes after about 30 minutes. Fuck gunfiregames
@@FloVelo941 playing this game? Or in general?
@@chadjr2004this game. the 6700xt is a great 1080p card and can run 1440p games quite well, too, when they're not unoptimized garbage like this game.
@@denks7849yeah the 6700 xt is a beast budget card, would recommend to anyone
Well, not really u can use DLSS and have over 60 fps and also if u buy 40 series u can use DLSS3 FG it works well only with minor latency hit and it also negates any CPU bottleneck u can get away with weaker CPU but of course it sucks. Also everyone is talking like a 8GB GPU won't work for future UE5 games and this game runs perfectly on 8GB VRAM the VRAM is not an issue.
A new game coming out barely looking better than some decade old games while running like absolute garbage?
Now that's a certified modern dev moment
@@Noob._gamer on ps5 the game runs at 720p 60 fps
@@japanesesamurai4945It upscales to 1440P though and at least it's much cheaper. As a PC gamer I don't like being shortchanged.
If this game used Lumen in conjuction with hardware RT, this kind of GPU and CPU performance could be justified. But, man, in it's current state Remnant II is probably the worst game in terms of performance/graphics quality ratio. It surpassed even horribly optimized TLoUP1 in this regard.
@@stangamer1151 well, if you want to be amazed, check the steam ratings..it looks like it is very well received(at the time of writing this)..so i guess people don't care for low performance of using upscalers?..it really makes me question existance haha
I was kind of surprised at that also. Compared to all the photo realistic tech demos and the hype, I fear Unreal 5 isn't gonna actually mean much for a while in terms of better looking graphics.
I barely hit 75 (my monitor's refresh rate) at 2560x1080 21:9 on 13600k and rtx 3090 all ultra no dlss. Was very surprised with this "performance". The previous game was working flawlessly on my gtx 1060 back then. Today's gaming is disgusting, devs basically say "just buy 4090 bro, what's the problem?"
Yeah just buy 4090 and turn on dlss balanced aswell
Locking to 1440p native with this level of hardware is insane, bet there will be optimization guided for good frame rates
Yeah Remnant from the Ashes looked amazing on an i7-4790k and GTX 1070. Don't know why they wanted to use this engine on their small playerbase
If you see unoptimized and buggy game, just don't buy one and wait for it to get fixed. People who buy broken games, are the main problem because of which devs can comfortably launch broken, buggy game without any consequences.
Yep, companies are gonna get lazy and lean on upscaling and frame generation vs just making the game properly. The game looks good but not enough to require upscaling
No wonder when Nvidia is working their butts to promote DLSS3 as something everybody must use. Damn greedy bastards.
I keep telling noobs who have been partying with DLSS and FSR like it's the greatest thing ever. It's native or go home for me.
The requirements to play these games are increasing exponentially but very little advancements in actual graphical quality, the pc gaming market is in a dire need of another revolution which I don't will be happening anytime soon
The problem is most improvements this gen is in the background of modern titles. Devs try to use the new hardware to create systems to make their jobs easier at the cost of performance.
Considering how many companies in many industries try to cut cost now I'm not surprised devs try to use the most convenient and cost effective method of manufacturing games.
Funny that next revolution is in the AI which ironically goes back to upscaling technologies.
DLSS and framegen is that revolution
@@GewelRealNo it's fucking not
RT,Upscalers, and frame gen are those "revolutions" lol.
Although we BADLY need an RPG like Fallout with smart AI that can actually interact with player's speech.
Peope warned us about DLSS being used as a crutch. Looks like we’ve reached that point.
Since they invented java, nobody cares about optimization. They just add more cores.
how do you max out your 4k monitor without using dlss?
@@Xcepter i dont know what you are talking about. why cant a 4090 get 60fps? what is CP
@@bigturkey1cyberpunk
@@Xcepter good
"designed with upscaling in mind" translates to "did not optimise". It's pretty much that simple and this video proves it.
It is NOT that simple. It's way harder than you think it is.
@@syncmonism Then put the resources, money and time into it.
It's not normal that such a tiny fraction of the budget goes to optimization when this is one of the most crucial aspect of games.
You can make the best game in the world, if it runs like shit, it's not gonna be worth playing. It's a pure waste.
what game have you played in the last year that maxed out a 4k 120hz panel without using dlss?
I think what they said is true. DLSS works better in this game than any other game i have seen. Its flawless to the point i cant even tell if its on or off. I believe its likely they put a ton of work to get the game to run this way. The end result is odd, in that it only looks a bit better than the original did and runs much worse objectively.
@@Alex.Holland if you look at the video it looks like it's stuttering no matter what. Comparing it to other gameplay videos on this channel it still looks like it's not running that well. Even if a lot of work went in to this I would bet money this was chosen as easier/cost or time saving option. It's lazy and it shows. If you create a game that runs like shit on top spec equipment it's shit no matter how you dress it up in excuses.
It's embarrassing how little of a graphical upgrade there is with this much of a performance hit
The game doesn't even look as good to justify the requirements and poor optimization, even the scaling between graphical options is bad
No game looks good enough to justify poor optimization. Some games do however look good enough to justify sub par performance.
same, i was like what is it even rendering? everything looks like ass
@@Aleph-Noll85% disagree lol glad ure in the minority
The funny thing is, this game has no ray tracing. Usually upscalling is kind of required when you kick in a couple RT solutions. Imagine what it would be like if they had some RT implemented. Crazy.
Start putting money aside for the RTX 5090 :)
UE5 is the reason I skipped this graphics generation. Everything below 4080 and XTX is gonna be obsolete by next year if you play AAA titles.
@@MechAdv I also expected something like this. But I had to upgrade, my GTX 1080 + 6700K just wasn't enough for me anymore. I upgraded to a RTX 4090 + 7950X3D. And I changed my monitor to that new Asus 240 refresh 1440P OLED monitor. That way at least it will stay relevant for some time.
Lumen would absolutely destroy performance. It's such a shame devs aren't going for a normal approach on optimization and just slap upscaling to smooth that out without caring about players.
@@lombredeshakuras1481 85% disagree I guess lol
Since they wont optimize the game to run on current hardware. Just refuse to buy the shitty game until 3 years later when its 80% discounted.
720p gaming is back with a new name! Not only does it not look special at native res but you can't see any detail at such aggressive upscaling.
Me: I'm going to buy a 4090 so I can play games on ultra 1440p without ever having to lower settings or use upscaling!
Remant 2: I'm going to ruin this man's whole career
It goes hand in hand then, devs becoming more reliant on upscaling tech, while gpu maker selling said proprietary upscaling tech with little to no effort on actually increasing performance..hmmm
I'd say that's partly true. For some game devs the ability to have some DYNAMIC render resolution setup is low hanging fruit. Spend months of optimization at different parts of the game to find those areas you're dropping down to 20FPS or lower at times or just let the game drop as low as it needs to, even 360p causing massive blurriness.
Hopefully people vote with their wallets and make it clear this is unacceptable. I certainly wanted the new Jedi game but didn't buy it.
Game looks good, but not good to require any amount of upscaling. Plague tale requiem ran better at native on my system than this game with FSR quality, INSANE. And we all know that game has one of the most beatiful graphics we have today. Plus when you change quality settings, there is no clear visual change. Basically this game is trash
it doesnt have AA. upscaling looks great with some sharpening
have you played the game?
game looks like a PS4 game not PS5 let alone UE5
@@nannnanaa "upscaling looks great" and you look like a corporate shill. Off course upscaling looks great when your native resolution looks trash.
It's crazy cos I thought even Plague's Tale should be running better when it came out since even tho it looks gorgeous, it's a really linear, straight forward game. This game doesnt even look anywhere near as good, yet, runs worse. Gaming is in the mud.
I can’t believe I switched from console to PC because console games were starting to be very low res upscaled by FSR, and here I am with my new PC about to experience the same thing 😂 fml
You dont have to play those few unoptimized games on PC. Luckily you can play thousands other games on PC, compared to like 50-100 on any console. And most popular games on PC run perfectly smooth even on low-end systems.
Plus you end up spending more and the devs advice for optimising there games is throwing money at your pc.
@@garrusvakarian8709 yep that's exactly why I sold mine. I think Jedi Survivor is like 800p internally on xbox and ps5 😂💀
@@yeahbuddy300lbs 648p
@@garrusvakarian8709 Like Forsbroken which comes from that same square Enix 🤣
A small AA studio was able to include all three upscaling technologies, and frame generation? Where did they find the resources to do that? It must have taken so much time and effort, they probably had to starve themselves. Thank heavens big AAA studios are not going for such sacrifices.
It's on geforce now at launch
Definitely would like to see the same tests on the AMD range of GPUs. Currently have a 6750XT. Don’t care how long the video is, very interesting stuff. Well done sir 👍🏻.
Hope to see a follow up with some AMD GPUs like the 6700XT and 7900XTX and the Intel A750.
Also other CPU configs like a 12400, 12600K, 7600, 5600X, 5800X and the 5800X3D, how strong does the CPU need to be to stop the stutters?
With a 6800XT and a 5800X3D I get occasional stutters, but nothing too crazy. Going into a new area can cause stutters for a few moments but then it goes away, with an average frame rate of 70-100 fps, typically in the 80s and 90s. This is on mostly high settings with shadows and effects on medium, with FSR Quality. I was honestly pretty sad about the performance. It's playable, but I figured I'd get better numbers. I feel for anyone with a weaker PC, it must be miserable in some cases.
Edit: forgot to mention this is at 1440p. Haven't tested 1080p as it screws with my second monitor
@@Rexxxed Use XeSS, it looks better and gives you more performance in this game.
@@MaxIronsThird I tried using it, and while it does look better it actually decreased performance by a few frames in my case
r7-5800x3d/i5-13600k is minimum to remove bottolneck
Just another garbage UE engine game,stutter non stop and just low FPS all around,cant get stable performance with an rx6800 r5 5600x 32gb ram and nvme SSD,im done with unreal engine games and not playing another one any time soon...
I'm convinced that CULLING is not done on certain meshes in certain dungeons/areas, worst offender being dungeons in N'Erud world.
What's culling?
@@wallacesousuke1433 objects not in your field of view not rendering increasing performance
Nanite automatically culls everything not visually-perceptable in real time so the argument doesn't make sense. That's why LOD isn't relevant in these games, they simply cull CG-quality assets in real time. It will however try to load your GPU with geometries as long as there is headroom. So I think what happens here is Nanite can't gauge how much headroom your GPU has - a common theme in PC space. It seems to work very well on consoles.
I am 100% certain that CULLING *IS* being done. Do you know how I know? I know because the shadows in this game are a screen space effect, and as soon and the source of the shadow is out of the frame, the shadow itself disappears!!
@@mimimimeow Don't forget that consoles also heavily rely on upscaling, and the game has a performance and quality mode there too.
It’s crashed over 15 times for me already.. extremely frustrating when I never buy games at full price 🙄 great game, but crashes are a huge turn off
I have a modest computer, Ryzen 3 with RX 6600, can run this game at high 1080p. But there are stutters from time to time, using FSR on quality makes it work really well at 60fps. But i agree this is not a solution, the game should be playable as is, not to mention people with older cards that don't benefit from upscaling. Not to mention the game isn't exceptionally demanding in itself, like some Capcom games that have facial rendering where you can't even tell if the characters are 3D models or real people.
PS: It's even weirder, the fact that this game looks so similar to the first Remnant game in terms of graphics, yet they managed to make it require so much more resources. The first game ran even on a potato.
To see how CPU affects GPU performance is eye opening
Those faces of them devs should be slapped inside out to teach them how to code and optimize so that it still looks great and runs butter.
Ya, that's rarely how it works. Games are complicated, especially if you're learning new software. USUALLY the issue is upper management setting deadlines that are just not possible. I wish people would stop blaming the coders.
@@photonboy999 Id usually agree with this stance, sadly doesn't work in this case as the devs outright stated the game was designed with DLSS/FSR in mind.
what game have you played in the last year that maxed out a 4k 120hz panel without using dlss?
Yeah i should not need to have on DLSS to gain above 60fps at 1080p on a 3070 but here we are, finished the game yesterday. Really hoping the devs can solve the performance issues.
worth it ? how many hours mate ?
I was talking to my friend a while ago about my hate for fsr and dlss.
They are not bad technologies and have their use, but this exact situation was why I despise it.
Dlss frame generation and the way they marketed it was the giveaway for me. The writing was on the wall, just can believe it wasn't limited to the gpu companies, and their want to sell underpowered hardware for higher prices.
So you think gpu makers are secretly bribing game developers to sabotage their games optimization? oO
@hastesoldat nah, not that deep. With the technology that is available, game devs don't have to fully optimize their games anymore.
Essentially, "Who cares if the game doesn't have a fps over 60, dlss frame generation makes up for it.".
It's Oversimplified, but that's the point. It boils down to laziness or cutting corners.
@@cosmickirby580 Still don't see how gpu makers and engineers are at fault there. It's the game devs that are incompetent and lazy as usual and are using the technologies wrong.
@hastesoldat again, gpu makers and engineers are not conspiring with game devs. it's not that deep.
What I'm trying to tell you is that the technology, regardless of its intended purpose, allows for corners to be cut both on the manufacturing side and the actual data being processed.
In oversimplified words,
gpus don't need to be as powerful because supersampling makes up for it.
Games don't need to be as optimized because supersampling makes up for it.
This does not mean that they have to be conspiring with each other. In reality, they are working towards the same goal, making more money with less time invested, and this technology simply allows it to be done.
It could be that developers are starting to become lazy and instead of writing better code they rely on upscaling or they simply say just buy a better GPU. It's insane that even the 4090 can't keep up.
Thats the atitude alot of people ive run into in the PC games community have too. You're just expected to upgrade or stop complaining, which is horrible with the economy the way it is.
Great work Daniel, love these type of vids
Edit: The floating torso to point to a part of the screen is always great
I want to play this but I'll be holding off buying this for now until they get around to finishing it. I miss the days where an optimized game was implied by the term "release version". Thanks for the heads up Daniel!
the game runs fine if you arent pushing over 60fps. i pre ordered and played the game fine enough. small stutters here and there but nothing bad until final boss. less than 30 fps on med settings
That's going to take a while.
@@iPeanutJelly "If you aren't pushing over 60fps"
"I pre ordered"
Pfft okay. Really talking to the masses with this. The latter makes it sound to me like you're justifying the waste of money after the fact.
@@MaxIronsThird If the past few years have taught me anything it's patience.
@@iPeanutJelly Go to consoles if you don't want more than 60FPS. And STOP advising others on wasting their money like you did.
game devs being lazy? NO WAY! the actual state of gaming the last few years have been unacceptable
Yet you keep slapping that pre-order button!
And "few years" is a bit of a stretch. Try almost 10 years! Since Witcher III - basically. That was the first big test the water "FUCK YOU" from developers. Putting NDA's on reviewing the game, because of the EPIC downgrade in visuals and the blatent "drop port" the PC version was of the console versions, with CDPR protecting their pre-order millions by threatening that german website with legal action for leaking the truth days before the release - in a desperate attempt to expose CDPR and allow people to get their money back in time by cancelling pre-orders based on FALSE ADVERTISING.
Now the devs don't even try to hide it anymore - cos ya'll KEEP ON SLAPPING that pre-order button.
Oh and you keep paying Nvidia $400 for entry level GPU's too. And $300 for their media centre options.
GG. WP.
hes right devs are lazy, you can see whats wrong with the optimizations within the stats.
have you played the game?
Buy indie games, those devs make sure you can run your game even on a potato or toaster.
@@BladeCrew Just buy games that are already out - have been reviewed - and those reviewers said "Great game! Well optimized!". SURE - take my money!
Not pre-ordering nonsense 2 years in advance and then getting trash like this - and having to spend $500 on DLC while you wait for the trash to get patched! XD
Gamers all lost their minds! XD
DigitalFoundry's video about this game on the consoles says that the resolution is around ~1200p upscaled to 1440p for 30 fps. And 792p for balanced and 720p for performance. It's unoptimized and they're using upscaling as a crutch for playable framerates.
*sigh* same story with Forspoken, Jedi Survivor, and now this. I guess we should be getting use to this as the norm for most of us with mid ranged PCs, though I do plan on building a 7900xt build in future.
@@Stardomplay turn the setting to medium and most of the problem will be solved (unless it is game engine issue like those shader compilation stutter stuff). the problem is i see people take ultra setting as a measure to see if the game is well optimized or not when ultra have always been build to punish even the fastest hardware available at the time.
Can you see if XESS and FSR has the same effect on the frame rate stability?
XeSS is significantly faster in this game for some reason.
@@sergtrav have to try it out then, thanks.
My brothers and sisters with last gen mid range cards this is a blessing the backlog has been waiting for us for years. Maybe in 2025 we can play this game at 1440p 100+ fps as God intended Pc games be played.
Tell that to the people who are still waiting for a GPU worthy enough to get a stable 60fps on Attila: Total War. The game came out 8 years ago, and the developers claimed it was "made with future GPUs in mind" when they received backlash for the performance.
@@RGInquisitor That game runs almost exclusively on your CPU.
GPU is irrelevant. I played it on a Gtx 260 back in the day.
@@beri4138 Indeed it does. Still runs like shit even with the latest cpus. Newer and older games perform much better. Very bad optimization regardless.
I play at 1440p and get average of 120fps on a 3080 and 10900k. just need to upgrade my guy
@@escape808 No, I do not. You also do not get an average of 120fps on Atilla on any hardware.
Awesome work! The one thing that's missing is checking it with a fast/modern Intel processor (or a fast/modern non-x3d AMD) to see if it's just pure raw power that's needed or 3D V-cache related, and then a 5800X3D just to cross check. Also why no AMD cards checked?
Tell me you did not watch the whole video without telling lol
when dlss first released i loved the idea of it giving us a boost but sadly its now being used as a lazy way to optimize. years ago we managed to run games fine without dlss but now its like a mandatory thing. its so lazy. we could have insane graphics combined with insane fps but studios dont wanna optimize their games anymore. thats why i appreciate the last two doom games and also atomic heart for running so smoothly despite looking really good.
I bought this game a few days ago and tried it this morning. First time I've ever refunded a game on Steam. The gameplay was nice but man, the performance is so bad. I have a 2080Super and 9900k in this system and I wasn't hitting 60fps with DLSS on, and huge frametime spikes/stutters. I want to like this game and I'm sure I will if they ever fix it, but right now it's a mess.
All UE games have heavy CPU bottlenecks, add raytracing on top of that and its murder.
just say you're poor and cannot afford a 13900KS
@@od1sseas663If you're raising a family with bills to pay your first priority isn't an expensive cpu. Many of us are on a tight budget.
@@od1sseas663 just say your a shill and cannot afford a 7800x3D or 5800x3D which wipes the floor with a 13900K in benchmarks, keep drinking that koolaid bro, let me how it feels when the copium wears off
@waitwhat1320 I got it for almost 500$ when it released. My GPU alone costs more than your whole pc you poor kid 😂😂
130fps in Last of Us. Yeah, such a "low end" pc 😂😂😂
@waitwhat1320 Upload a video with your high end specs then
I am afraid of the UE5 future... It feels like developers get a lot of convenience out of the engine, like offering unlimited detail with nanite, ridicolous amount of layers for physically based materials, cinematic lighting and camera effects. Yet we simulated all of that stuff, for a fraction of the cost 10 years ago. I think what will happen is that development studios will just cut corners in terms of staff, and publishers will profit more. The only AAA game I have been impressed me A LOT these last couple of years has been Half Life Alyx (and Cyberpunk, which also has it's own in-house engine). Which looks pre-rendered in VR. They use highly optimized old-school techniques, and it's impressive that they reach that level of quality while rendering two scenes simultaneously for each eye at insane resolutions and high refresh rates. Heck. I play at 5400x2700 at a solid 90fps with a RX6800. The game renders wonderfully at 4320x2160 on a RX6600XT. Yes, they use a combination of baked and dynamic lighting, yes they use parallax corrected cube maps for reflections, but honestly. Why not!? when they look that good?
I'm looking forward to Counter Strike 2, just to see how good the Source2 will look and how performant the game will be. :)
How does the game perform on AMD gpus ? Any chance you will make another video ?
horrible
@5:30 we later determined that it was CPU bottlenecked. But why does the CPU utilization not show 100% usage on any of the cores?
The game is an ugly mess, looks like a game from 5 year ago. It only shows that UE 5 is an unoptimized bloatware.
I really like this style of video. Where you alternating step your way up in performance from the GPU and CPU. Really lets you get a fell for how the game performs on most hardware. Would love to see it with another CPU option although in know that's more work than just slotting in another graphics card or would require another system.
PC Gaming is turning into a rat race
It always was lmao
Yep it's just for the rich now
add another one to the pile I guess…regardless of what the devs claim, this is poorly optimized and shouldn’t have released until it was actually finished. Sadly, people will use this info in their GPU wars, instead of holding the devs accountable.
More like Unreal isn’t ready. I miss when games were actually optimized 😅
I think running Remnant II on a GTX 1650 (as per their listedn "minimum requirements", was incredibly optimistic of the devs.
Remnant 1 (UE4) was very playable with a GTX 1080 Ti in 1440p native. But it didn't exactly flat-line against the monitor refresh cap (mostly 60-90 fps).
And for reference, a 1080 Ti makes a 1650 look like a bad joke. I seriously doubt a 1650 could even come close to running Remnant 1 in 1440p.
The footage of Remnant II (at 540p upscaled to 1080p) looks pretty bad. Lots of stutter, and the eye candy isn't even as pretty as Remnant 1 on UE4.
Remnant one was _terrible_ at pre-compiling content as you approach new terrain. It would crash somewhat frequently because of not pre-loading early, and then freaking out when you move forward and suddenly need new render content. And that was with 11GB on a 1080 ti.
Excellent testing by the way. Clearly the cpu/mobo/ram is critical for this game. There's a LOT of gaming pc's out there with a 9600K or less. That's gonna hurt Remnant II a lot.
Adding "eye candy" (UE5) while relying on huge amounts of upscaling to make it work, is counter-productive. Actually looks worse than if they just made the game on UE4, running in native rez.
Also, wow, a beastly system with a 4060 @ 1080p/medium can't 60fps? That's horrendously unoptimized.
And a 4090 @ 1440p ultra only hits 70 fps, with no ray tracing in the game????? That's just broken.
This may be the "hardest-to-run" game in existence. Generally cyberpunk with max RT is the lowest fps you can get on a game. But this beats cyberpunk for low FPS.
You really NEED very high fps to play Remnant too. It's got a lot of content which requires 'immediate' reflexes. It's basically a gun-toting version of Dark Souls with much better multiplayer.
I have to say this is not a very good pc port. Which is a pity, because Remnant is an amazing game.
I find capping framerate usually fixes terrible frametimes in most games, usually have to do it with rivatuner but sometimes ingame works.
Ah yea cap it at 30 fps 😂😂😂
You did watch the video? If you did, you'll know the problem is with LOW frame rates. Just how much lower would you like it capped? ;)
One of your best videos yet, a fantastic realworld look at performance across various options.
Second. Great video
This game doesn't have RT, but it makes heavy use of UE5 Nanite. Its likely that having the option to turn this off would dramatically improve performance. You touched on this in your Fortnite video when they moved to UE5 ruclips.net/video/WcCUL3dR_V0/видео.html
You can't just turn nanite off lol
With nanite "off" it would probably just run at 4 fps as it would have more stuff to render.
@@WeaselPaw depends on game implementation. they can go back to older way of doing things which will have more performance but less detail.
@@arenzricodexd4409 Yeah but in that case you can't just have an option to turn it on or off. Otherwise they would have to implement both ways, which would defeat the purpose of using nanite in the first place.
@@WeaselPaw implementing both old and new method is common when everyone still on transition phase. that's why we see when new DX being released majority of game did not use new API exclusively. rather they release on both old and newer API. most often the older API serve as a way for people with older hardware or slower hardware to play the game. but as usual for some gamer especially those with newer hardware they will accuse developer being lazy because developer did not use newer API exclusively and fully optimized the game with newer API feature when they did this. i think something similar also happen here.
And graphically the game doesn't look worthy of it's requirements
Remember back when game devs used to brag about how real a world looked, how luscious the plant life, the shadows, the water, etc? These days Devs would rather you turn down all those settings so it looks like crap.
Your testing is also very generous; fill the screen with particle effects and more characters running their Ai Behavior and timers, and you'll definitely see the frame drops.
I hope Digital Foundry cover this game.
Already did
@@cosmosofinfinity for PC specifically?
Human nature dictates taking the path of least resistance
An Unreal game that is poorly optimized? How novel 😄
The worst part is that the majority of people will just enable it and play the game, and this will become the norm for lack of backlash.
And they are doing it. See it anywhere, whether steam or reddit, you will see people hating on the ones that highlight the poor performance.
Something tells me the ultra setting just adds unnecessary stuff that devs left there for the players to get the best experience possible from the game if their system can support it but not needed for most people.
Still the game is too demanding and needs some serious revision, but for now would like to see some testing at medium and high settings.
Either way, it’s DOA imo
Ultra setting crashes Remnant 2 on the spot for me and with this new GPU I got, I can play Forza 5 at near ultra settings without hiccups.
Super interested in seeing your AMD benchmarks. Other channels have shown the 7900XTX matching or even beating the 4090 in this game.
Probably was since it wasn't optimized properly, it just inherintly works better on amd cos of consoles
Those spikes most likely is the game building new shaders. The areas you already visited before has no stutter as the shaders are already loaded. It's common Unreal engine issue and devs 99% of the time don't optimize shaders
Unreal Engine 5.2 is supposed to have a built-in solution for this, and Remnant 2 is built on 5.2.
@@BlackParade01 Unreal Engine 4 had already a solution for this. Just build the shaders before the game starts, but this is just another developer studio that doesn't care about performance at all. They admitted in their own post they had done no optimization and relied fully on temporal up scaling.
How come everyone made games 5-10 years ago, that graphically look exactly the same as new games, and ran perfectly fine on old hardware like a quad core i5 with GTX 700/AMD HD 7000 at 1080 60fps? Now you need an RTX 3060/RX 6600 for basically same performance without upscaling, for same looking games
I think that it is ABSOLUTELY DISGUSTING that the 4090 required upscaling at any resolution with no ray-tracing to give good performance! The dev's on this game definitely DID NOT do their jobs and just relied on upscaling as a crutch for getting good frame rates. I think that this game is going to get a hard nope from me!
In other words, the devs were lazy and tried to cut costs by having hardware brute force it rather than spending time optimizing.
what game have you played in the last year that maxed out a 4k 120hz panel without using dlss?
I remember only a couple of years ago, the 3080 was seen as THE card to have in terms of price to performance for 1440 and 4K gaming. Now it's been relegated to a 1080p card by this game. I have a 4090 and 7950x, and I can barely hold 45-50FPS at native 4K, even when it's overclocked to 3Ghz and pulling over 500w. Just insanity. I could understand it a little more if the game looked ridiculously good, but it doesn't.
Yes true and Red Dead Redemption 2 has better graphics as well as star war jedi survivor
@@latambox Both of those games are demanding as fuck
@@beri4138 nowhere near this game. i can run RDR2 at max everything at native 5120x2160 at ~110fps with my 4090. not very demanding
Game doesn't look very good either
In a low-cpu system locking the FPS to say at 60, 50, or 40 FPS will take out the strain from the cpu for a smoother gameplay. Do not unclock the FPS in the game settings! Also as grazy at it sounds in a low-cpu system DLSS balanced or even quality setting can give you better performance than the performance setting since DLSS takes strain out of the GPU but increases the strain on the cpu.
What's the point in all of this?
Why even pretend we should use UE5 if this is the performance expectation when running 720p internally on current consoles... and when even a 4090 system can have issues running the game, then clearly it's far too soon for the engine to be utilized. It's not worth it.
They didn't make the game with upscaling in mind , they are hiding their incompetence with upscaling, this is ridiculous , the visuals are nice but nothing to write home about and games should be made for current hardware, it's like they didn't look at the steam hardware survey.
Its crazy how this game barely looks any better than the first game but crushes that system... !?!?!? WHAT
It literally crashes my system and I just got $400 AMD 16GB GPU for this game. I have to stay at 30fps with lowest setting. At this point Mad Max looks better on my computer than this game.
Remnants, the first one. 7800x3d and 7900xtx and 64gb ddr5 6000 cl30 max settings 3440x1440, i get 180-300fps. Looks good.
Remnants 2, same 3440x1440, same ultra settings (no FSR) and I get 56fps AVERAGE. Turn on FSR QUALITY and I get 110-120fps but game looks like literal shit. The game isnt that much prettier than the first game.
The original choppiness you expirenced when using the i5 9600k, its because that cpu is one of the worst modern cpus for gaming. It only has 6 cores and 6 threads which makes it fequently perform worse than lower powered 4 core 8 thread chips. You wouldve likely had a better expirence using one of the minimum requirements cpus instead honestly. I expected a bad expirence the moment you said "9600k" i suspected this would happen and am glad you tested it with other cpus.
So on my PC with a 13700k 4090 in the city by the crystal at 4k high preset dlss off frame gen off I get around 50 fps. I play it with dlss set to balanced and get around 100 on average about 30 hours played so far. I'm still loving the game but it runs horribly.
This seems like Nvidia's Hairworks 2.0 is now playing ball with performance.
This is my suspicion, cause a game running native in 1080p with so many stutters is highly not normal.
Let see Nvidia put out a driver fix for this game in the near future and fix this very odd poor performance of this game.
As I've been stating for months now that the RTX 4090 makes more sense paired with a 1440p / 240hz monitor.(for a wide variety of games: Single Player RPG's Max settings / Ray tracing, etc. @ ~90FPS+ FPS@ up to 240FPS etc.) This upsets some as 4K monitors have become much more popular ( many more have them and want to justify the purchase obviously) but it really shouldn't come as a big surprise given that even older games such as Cyberpunk 2077, Control, Halo Infinite, etc. are plenty demanding at 1440p with a corresponding optimal ~ 90+FPS experience running on an RTX 4090 PC.
So why anybody would be surprised that newer UE5 games wouldn't somehow just farther make this point clear that the RTX 4090 is more ideally suited for 1440p is just bewildering.. 1080p / 1440p /4K is NOT what it use to mean, an issue many seem not to grasp..
Really though having both a 4K/ 120Hz / VRR/ OLED and 1440p / 240hz / G-Sync display I've preferred gaming on my 1440p monitor overall with my RTX 4090 PC. The largest leap in visuals is going from 1080p to 1440p, 1440p to 4K isn't nearly as mind blowing (diminished returns after 1440p) when it comes to gaming. For example, if the option is 4K/ 60FPS or 1440p/ 90FPS I'd choose 1440p EVERY time.
The consoles have more VRAM (~ 13.5GB "usable" for games) than they literally know what to do with and this is the reason for so many of the terrible "PC Ports" IMO which is a shame. The GPU's in these consoles (~ RTX 2070 at best) aren't nearly powerful enough to justify gaming at 4K and thus the reason for the upscaling / dynamic resolution gimmicks and even still the performance "FPS" is quite abysmal ~ 30FPS.
Regardless of this terrible error via pandering to ignorance and marketing "4K" (4K more better...)with a $500 budget, game developers should NOT let this be a lame excuse to release terrible PC Ports that run like crap due to absurd VRAM requirements so that the consoles can run high textures at ~30FPS.. Pretty visuals don't mean much when your "in motion" performance is sub-par.
As far as CPU requirements, again no real surprise that the new platform (AM5 / DDR5) will begin to start showing its advantages more over the older (DDR4) platform as time passes. People love to complain (a favorite pastime of many / Nvidia, AMD, Intel is out to get me, etc..) about the price of admission for AM5 / DDR5 but it's not like there would be no benefit to spending more to enjoy the benefits well into the future. It pays not to be short sighted.
The RTX 4060 is not nearly a terrible as many have made it out to be based on many reviews / benchmarks I've seen. The problem by in large is that most reviewers aren't providing the proper context but instead pander to the outrage mob whom demand outrage porn. Very EASY to do considering the horrendous state of the economy in general - Just look at the comments with most "likes" and this point becomes crystal clear.
People very often don't like hearing the reality (often involves more $$$) of things but it is what it is regardless of that the mob wants to hear.
As far as having to use "upscaling" to play the game optimally I find it difficult to become remotely bothered by this really. I mean if the game still looks and plays well is it really that big of a deal? I mean who would have thought that over time games would become more demanding to run… Again comparing 1080p /1440p/4K in the past to now is like comparing apples to oranges.
Another thing people seem not to understand is that viewing videos on RUclips (Decompression, 60FPS cap, etc.) is NOT the same as what the game looks like when playing on a high refresh monitor natively at home.. So the comments about how the game doesn't look good, etc. are pretty irrelevant / farther demonstrates the appetite for outrage porn - people are broke / life sucks and why did I purchase this 4K monitor again?
Good comment, but the consoles don't actually have as much VRAM as you think. Very few games on Xbox X or PS5 actually use more than 8GB, 10 or 11GB absolute max. The ps5 reserves a huge chunk (over 3GB) of the shared RAM for the OS, and obviously not ALL of the remaining RAM is used for video!
Holy wall of text...
@@wallacesousuke1433 Yeah but I space it out to be easier to read..
just for everyone else: 2070 is a 6600
6600 = 2070 =< 3060 < 6600xt < 6650xt
Nanite can be really heavy and is directly tied to number of pixels being pushed, its inherent to how the tech works. It basically tries to cull the number of polys to the number of pixels for a per pixel effect, thats why upscaling works so well here. Now imagine when we get games with Lumen as well. Hopefully the tech will mature nicely tho and we will see big performance jumps.
This is a good explanation, thank you.
While i was disappointed with the mandatory dlss, I was blown away at how well the dlss worked in this game compared to any otehr implementation i had seen before. Other than the standard parallel line distortion effect, I legit cannot tell when dlss is on or off, or what setting its on. 3070 and I tweaked settings to get a rock solid 72 fps at 1440p and the game is extremely playable.
SAPPHIRE 7900XTX NITRO+ - 5800X3D = THE GAME OPTIMIZATION IS CRAP -------------->> DONT EVEN BOTHER TO PLAY THIS GAME --->> LIKE THE DEVELOPERS DON'T BOTHER TO OPTIMIZE THE GAME
I was expecting a similar system requirement as the previous game, graphically the game looks pretty much identical to the first one, I don't see any benefits from using UE5 other than bumping new GPUs sales of course.
I have a Ryzen 7 5800x, RX 6600 XT, 32gb DDR4 at 2800mhz, and the game is on a gen 4 NVMe SSD. Smart Access Memory (SAM) is on, and at 3440x1440 (ultrawide) I can choke out anywhere between 12-63 fps WITHOUT UPSCALING. Like what the actual hell. I can run something like Metro Exodus with RAY TRACING at 60fps minimum. Yet, this game makes me belive I'm still using a Vega 8 iGPU to play it. Cranking down my resolution to 720p of all things lets me run the game at a stable 50-70 fps. The game is fun, until you play it in open areas or against enemies that have moves with special effects. The devs should be ashamed of themselves for releasing a Crysis in 2023, and this makes me dislike/hate upscaling even more. It was meant to be a nice bonus to increase performance to hit those sweet high frame rates (120-144+), not be a requirement for playing a game...
DLSS was a crutch for lower-end systems, now it's a crutch for people who have no business making games to call themselves "devs". If it didn't exist, Remnant 2 would not exist.
Anyone can make a POS unoptimized mess, not touching any engine code, only using BPs, not even knowing basic CS concepts, and not optimizing any of their environments.
The hardware is not "behind", it's the software that is lagging behind the hardware insanely. The software isn't utilizing the hardware properly.
UE4/5, apart from physics/anims is primarily single-threaded.
You pay $300 for your fancy 5800X3D only for the game to use like 2 cores for important things like drawing and the game thread.
Given the recommended specs, I don't see why higher end systems would need upscaling. I'm running a Ryzen 7 3800X/6900XT, which is well above the "recommended", especially on the GPU side. This is a real problem, I've been trying to give developers the benefit of the doubt when it comes to PS5/Series ports, but it doesn't seem like Remnant 2 has that excuse. Eventually, this is going to become a problem for developers, because people are quickly getting sick of the idea of release now/optimize later.
If we go by what devs said regarding the use of upscaling tech then those requirements are for Dlss performance at 1080p in the case of 2060 for 60 fps and FSR performance at 1080p in the case of 1650 for 30 fps makes perfect sense.
Very poorly optimised. Almost like they haven't got round to it yet. Or they don't intend to.
is not all games shit when they launch?
@@sudd3660 You were shit when you launched
@@sudd3660 They never used to be.
@@mrpositronia it is common enough now that is almost standard.
you have to go back before updates when they actually finished games before release.
Quando comprei minha RTX3060Ti fiquei super feliz, capaz de rodar todos os games atuais e ainda por cima em QHD (alguns com ajuda do DLSS para manter o frametime estável), agora ver que pouco tempo se passou e me sinto usando minha velha GTX970 de novo (olha que ela durou viu), essas otimizações estão porcas de mais.
Still have a 970 ☠️
@@KING0SISQO in GPU world 970, 980, 1070 and 1080 are kings... but otimizations and thats sh*s upscalers and fg kill all good raw power GPU.
i mean if they cant do their job i simply wont buy their game. a game requiring upscaling is a shit game and doesnt need to be supported w my money. im blown away the 4090 only gets 70 fps in 1440p in this game. i wish you would have done high settings along w the ultra. most of the time you cant discern between the 2. at least i know i cant.
And now perpare that this BS is going to be the future for the next couple of years. UE5 is just broken at its core. I've developed quite a few Projects in UE4 and tried many of them in UE5, generally performance is cut in half even when not using Nanite, Lumen and Virtual Shadow Maps. 2 completely identical Projects (down to the Editor Settings), in UE4 i'm able to get around ~200fps while in UE5 the same scene with the exact same settings cuts the fps down to ~95fps. Power consumption and usage of the individual parts is still higher even while pushing way less fps (and while having all new features disabled).
Another good showcase that this Engine in it's current form is just broken is Epic's beloved CashCow Fortnite. Before the big switch to UE5 i very rarely dropped below 170fps on Epic Settings (120ish with RayTracing enabled) however since the switch to UE5 i barely manage to scratch 100fps on MEDIUM to HIGH settings (again disabling the advanced features does not seem to help very much) in a game that still looks like its from 2010 and on top of all that the stuttering has become unbearable.
Also keep in mind that this was tested on a quite powerful system with an RTX 3080TI.
Thanks for putting in the work to get this out. Great overview. 100% not supporting this type of development.
I have a feeling Nanite is responsible for the dramatic improvement made by the upscaling compared to other games.
Nanite likely tries to put more detail in an object the more pixels it takes up on screen (provided there is more detail to be had), thus decreasing the resolution, either manually, or via upscaling has the side effect that it changes both the amount of work the gpu/cpu needs to do for the scene in general on top of the fewer pixels it needs to fill.
Not sure if nanite has a multiplier for how much detail it works to put on screen, and the remnant devs set it too high, or maybe nanite doesn't provide that knob (or make it easy to change) so a resolution change is unfortunately the "easiest" way to trade off detail for performance.
this is 100% it.
For your first point you are correct, but from my understanding of Nanite you are incorrect on the second.
For Nanite to work as intended you need a target framerate,
if FPS < Target FPS : Nanite lowers details
if FPS > target : Nanite Increases details
Uncapped FPS = Max Details at all times
Oh oh, but mister Owen, my PC is ready, clearly you missed the part where i didnt buy a 1440p or 4k monitor, so my new pc feels stacked in any game at 1080p.
This is the future of games. Unoptimized, TAA-forced, DLSS-forced BLURRYNESS FESTIVAL. Whats the point of Nanite, for example, if the image quality is so crap that you can't even appreciate it?. It's really beyond me how we've got to this point.
This is one of the first UE5 titles. UE4 released in 2014 and was relevant for the Xbox One / PS4 generation.
For an engine that is going to be a cornerstone for the next decade, it SHOULD tax current hardware.
I'm not saying the game is optimized. This also isn't a AAA release. The implementation of Nanite and the geometry is impressive coming from a team of this size. Without implementing next generation lighting, sure it may look a little flat and less flattering.
That being said, I would be more concerned if early UE5 titles released didnt push the envelope or ran flawlessly on a CPU or budget GPU (X060) released five years ago.
People are saying ''the game doesn't even look good'' but graphically it does, the art direction is just VERY specific and liking it is subjective. You have a valid argument about the performance, no need to bash something for no reason.
the graphics look average/decent to me but the graphics are nowhere near good enough to warrant those system requirements. There's a level in black ops 3 zombies called "Zetsubou No Shima" which has a similar art style to what's shown in this video, but it runs perfectly on my gtx 970 and the graphical quality is pretty much the same. and that game released 8 years ago
@@j-swag7438 I took a look at a few RUclips videos of Zetsubou No Shima and no, that isn't ''pretty much the same''. The details and overall quality are obviously far better in Remnant 2. Like I said, the argument about performance is valid but saying it looks 'bad' is subjective.
Thank you SOOOO much for this review, saved me from buying it and being disappointed yet again with a new AAA title on PC. There is no way I am paying money now for something that will be fixed later. I didn't see this issue raised in any of the reviews I watched. Again, thanks!
i use a 4080 at 1080p and with a 13600k and this is 1 of the games that actually pushed my gpu to the max and without dlss/framegen i was having around 100 fps average AT 1080p and in some areas it would actually drop to as low as 74 frames
should game at 4k with 4080
@@bigturkey1 im gonna game at whatever Resolution i want its my pc xdd
@@nebsi4202 thats dumb
@@bigturkey1 my money my GPU i can dp what i want with it Simple
While this game doesn’t appear to be breathtaking visually, remember that UE5 demos WITHOUT Nanite or Lumen from 2 years ago were crushing the 3090. Just because the devs didn’t do a good job artistically doesn’t make the engine any less challenging to push frames through. UE5 is the reason that I’m waiting until next gen to upgrade my GPU. There isn’t enough perf on offer with this generation to get 3-4 years out of a card from this generation.
DIGITAL Foundry just released a performance analyze of this game on consoles and its running internal res of 720p @ 60fps on balanced mode with drops to 50 upscaled to 1440p.
quality node @ 1200p @ 30 fps
consoles also performed very badly its not just a pc issues its how gaming has become now days.