Your PC isn't ready for this Unreal Engine 5 game! Remnant 2 PC Performance

  • Published: 2 Oct 2024

Comments • 2K

  • @A.Froster · A year ago · +1992

    No game should require upscaling to be playable on modern hardware. A 4090 struggling at 1440p without upscaling is a joke and a sign of incompetence.

    • @ManuSaraswat · A year ago · +193

      I was really excited to try it, as the first game in the franchise was kind of a hidden gem, but after trying this on my 6700 XT @1080p and not being able to hit even 60, I promptly returned the game. Do not support such bullshit levels of incompetent optimization.

    • @julianmalarz5227 · A year ago · +18

      @ManuSaraswat no kidding? I have a 6700 XT and it smashes the first game. I have yet to grab this one, but it seems like I'll be waiting while they let the public bug-test it.

    • @KeepAnOpenMind · A year ago · +23

      You're not educated enough to know that not everything is possible without upscaling these days.

    • @moichamoy2274 · A year ago · +160

      @KeepAnOpenMind you shouldn't need upscaling with a 3080 at 1080p; that is just bad.

    • @oogwaythesussyturtle · A year ago · +85

      And the fact that the game doesn't even look good is salt in the wound. I mean, c'mon, RDR2 wipes the floor with this game in visuals.

  • @KurisuKun · A year ago · +880

    That's just outright foul. Upscaling was supposed to help lower-mid tier cards gain playable framerates, not serve as a crutch for poor optimization.

    • @J3f3R20n · A year ago · +117

      It was bound to happen. I knew some devs would start doing this when Control came out.
      The real problem with Remnant 2 is the system requirements claiming you can play with a GTX 1650, with the "recommended" card being an RTX 2060. That's straight-up BS to lure people into buying the game.

    • @Biaccovich · A year ago · +6

      At least DLSS looks really good in Remnant 2. The game looks crisp even in performance mode.

    • @west1329 · A year ago · +7

      That, my good sir, is why DLSS and FSR were created: for poorly optimized games!

    • @Steelninja77 · A year ago · +1

      That's so true.

    • @GewelReal · A year ago · +2

      At least it doesn't stutter, as long as you're not using an ancient CPU.

  • @andrewwowk9385 · A year ago · +530

    The fact that a game which looks as average as this NEEDS up-scaling is just ridiculous.

    • @teddyholiday8038 · A year ago · +51

      I have a feeling this is only gonna get worse

    • @edzymods · A year ago · +26

      I would say Shadow of War looks better LMAO

    • @L_e_t_h_e_r_e_a_l · A year ago · +4

      @edzymods I just watched a video of Kevduit playing it, and I was like "damn, this game still looks good as hell".

    • @enricod.7198 · A year ago · +37

      @teddyholiday8038 DLSS was only the beta of shitty game optimization; frame generation will be the magnum opus of shitty game optimization. Devs will skip any proper multithreading and optimization and slap DLSS on it. Man, I hate this industry.

    • @BladeCrew · A year ago · +9

      @edzymods I would say that Torchlight 2 looks better than this.

  • @techhd541 · A year ago · +326

    Makes me appreciate DOOM (2016) and DOOM Eternal that much more when I see how poorly optimized most new titles are. The folks at id Software just seem to know how to create incredible graphics engines designed to run well on a wide range of hardware.

    • @antont4974 · A year ago · +10

      They have their quirks too. For example, if you have a first-gen Ryzen CPU, DOOM Eternal will crash at settings higher than Low. Plus their games are super FPS-dependent; you can get softlocked at 30 fps in certain parts, plus countless bugs.

    • @wallacesousuke1433 · A year ago · +1

      They look meh though

    • @DeadPixel1105 · A year ago · +71

      @wallacesousuke1433 Play Doom Eternal at 4K resolution, max graphics settings, HDR and ray tracing enabled. Then come back here and try to say it looks "meh".

    • @squirrelsinjacket1804 · A year ago · +34

      @DeadPixel1105 Doom Eternal looks awesome, but id is known for optimizing their engines well. Thank Carmack, I guess.

    • @wallacesousuke1433 · A year ago · +2

      @DeadPixel1105 the art style is meh; no amount of graphical fidelity can save these games :P And I hate FPSs, so I'll gladly decline your suggestion.

  • @Zecuto · A year ago · +47

    Ah, the magical process of seeing my 3080 turn into a 1080p card with each new release.

  • @theartofgaming905 · A year ago · +632

    An Unreal Engine 5 game that looks like an Unreal Engine 4 game and has the performance of an Unreal Engine 5 game. Amazing!

    • @RichLifeStories · A year ago · +50

      Haha, yes!!! I didn't even realize it was an Unreal Engine 5 game until today. Had me scratching my head.

    • @AntonioLexanTeh · A year ago · +54

      It actually has the performance of a game made 5 years in the future. Mind-blowing 😂😂😂

    • @RichLifeStories · A year ago · +6

      @@AntonioLexanTeh 😂😂 that is the truth.

    • @teddyholiday8038 · A year ago · +4

      Science.

    • @enricod.7198 · A year ago · +2

      Even for UE5 it runs like shit; it should not run like that.

  • @ferrino4145 · A year ago · +417

    Imagine paying $400-ish for a GPU and being able to play new games at 1080p 30fps 💀

    • @FloVelo941 · A year ago · +46

      That's exactly what happened to me, unfortunately. I bought a Radeon RX 6700 XT 12GB and I'm playing at 30fps, and even then my PC crashes after about 30 minutes. Fuck Gunfire Games.

    • @chadjr2004 · A year ago · +1

      @FloVelo941 Playing this game, or in general?

    • @denks7849 · A year ago · +57

      @chadjr2004 this game. The 6700 XT is a great 1080p card and can run 1440p games quite well too, when they're not unoptimized garbage like this game.

    • @PeeNutButHer · A year ago · +30

      @denks7849 yeah, the 6700 XT is a beast budget card; would recommend it to anyone.

    • @parker4447 · A year ago · +5

      Well, not really. You can use DLSS and get over 60 fps, and if you buy a 40-series card you can use DLSS 3 frame generation; it works well with only a minor latency hit, and it also negates any CPU bottleneck, so you can get away with a weaker CPU. But of course it sucks. Also, everyone talks like an 8GB GPU won't work for future UE5 games, yet this game runs perfectly fine on 8GB of VRAM; VRAM is not the issue.

  • @aftereight9143 · A year ago · +594

    A new game coming out barely looking better than some decade old games while running like absolute garbage?
    Now that's a certified modern dev moment

    • @japanesesamurai4945 · A year ago · +27

      @Noob._gamer On PS5 the game runs at 720p 60 fps.

    • @playcloudpluspc · A year ago · +18

      @japanesesamurai4945 It upscales to 1440p though, and at least it's much cheaper. As a PC gamer, I don't like being shortchanged.

    • @stangamer1151 · A year ago · +25

      If this game used Lumen in conjunction with hardware RT, this kind of GPU and CPU performance could be justified. But, man, in its current state Remnant II is probably the worst game in terms of performance-to-graphics-quality ratio. It surpasses even the horribly optimized TLoU Part I in this regard.

    • @mihai2174 · A year ago · +24

      @stangamer1151 Well, if you want to be amazed, check the Steam ratings. It looks like it is very well received (at the time of writing), so I guess people don't care about the low performance or having to use upscalers? It really makes me question existence, haha.

    • @RickSFfan · A year ago · +8

      I was kind of surprised at that too. Compared to all the photorealistic tech demos and the hype, I fear Unreal 5 isn't actually going to mean much for a while in terms of better-looking graphics.

  • @plesinsky · A year ago · +56

    I barely hit 75 (my monitor's refresh rate) at 2560x1080 (21:9) on a 13600K and RTX 3090, all ultra, no DLSS. I was very surprised by this "performance". The previous game worked flawlessly on my GTX 1060 back then. Today's gaming is disgusting; devs basically say "just buy a 4090 bro, what's the problem?"

    • @GeneralS1mba · A year ago · +9

      Yeah, just buy a 4090 and turn on DLSS Balanced as well.

    • @GeneralS1mba · A year ago · +2

      Locking to 1440p native with this level of hardware is insane. Bet there will be optimization guides for good frame rates.

    • @lucidbarrier · A year ago · +1

      Yeah, Remnant: From the Ashes looked amazing on an i7-4790K and GTX 1070. I don't know why they wanted to use this engine for their small playerbase.

  • @cyclonous6240 · A year ago · +62

    If you see an unoptimized, buggy game, just don't buy it; wait for it to get fixed. People who buy broken games are the main reason devs can comfortably launch broken, buggy games without any consequences.

  • @SeventhCircle77 · A year ago · +143

    Yep, companies are going to get lazy and lean on upscaling and frame generation instead of just making the game properly. The game looks good, but not good enough to require upscaling.

    • @Hombremaniac · A year ago

      No wonder, when Nvidia is working their butts off to promote DLSS 3 as something everybody must use. Damn greedy bastards.

    • @JABelms · A year ago · +4

      I keep telling noobs who have been partying with DLSS and FSR like it's the greatest thing ever: it's native or go home for me.

  • @gamerforlife7903 · A year ago · +144

    The requirements to play these games are increasing exponentially with very little advancement in actual graphical quality. The PC gaming market is in dire need of another revolution, which I don't think will be happening anytime soon.

    • @penumbrum3135 · A year ago · +15

      The problem is most improvements this gen are in the background of modern titles. Devs try to use the new hardware to build systems that make their jobs easier, at the cost of performance.
      Considering how many companies across industries are trying to cut costs now, I'm not surprised devs use the most convenient and cost-effective method of manufacturing games.

    • @UniverseGd · A year ago · +4

      Funny that the next revolution is in AI, which ironically goes back to upscaling technologies.

    • @GewelReal · A year ago · +1

      DLSS and frame gen are that revolution.

    • @megamanx1291 · A year ago · +47

      @GewelReal No it's fucking not.

    • @100500daniel · A year ago · +4

      RT, upscalers, and frame gen are those "revolutions" lol.
      Although we BADLY need an RPG like Fallout with smart AI that can actually react to the player's speech.

  • @teddyholiday8038 · A year ago · +101

    People warned us about DLSS being used as a crutch. Looks like we've reached that point.

    • @kanta32100 · A year ago · +3

      Since they invented Java, nobody cares about optimization. They just add more cores.

    • @bigturkey1 · A year ago

      How do you max out your 4K monitor without using DLSS?

    • @bigturkey1 · A year ago

      @Xcepter I don't know what you're talking about. Why can't a 4090 get 60fps? What is CP?

    • @luisuribe5432 · A year ago

      @bigturkey1 Cyberpunk.

    • @bigturkey1 · A year ago

      @@Xcepter good

  • @hYpYz · A year ago · +36

    "Designed with upscaling in mind" translates to "did not optimise". It's pretty much that simple, and this video proves it.

    • @syncmonism · A year ago

      It is NOT that simple. It's way harder than you think it is.

    • @hastesoldat · A year ago · +5

      @syncmonism Then put the resources, money, and time into it.
      It's not normal that such a tiny fraction of the budget goes to optimization when it is one of the most crucial aspects of a game.
      You can make the best game in the world, but if it runs like shit, it's not going to be worth playing. It's a pure waste.

    • @bigturkey1 · A year ago

      What game have you played in the last year that maxed out a 4K 120Hz panel without using DLSS?

    • @Alex.Holland · A year ago

      I think what they said is true. DLSS works better in this game than in any other game I've seen; it's flawless to the point I can't even tell if it's on or off. I believe they likely put a ton of work into getting the game to run this way. The end result is odd, in that it only looks a bit better than the original did and objectively runs much worse.

    • @hYpYz · A year ago

      @Alex.Holland If you look at the video, it looks like it's stuttering no matter what. Compared to other gameplay videos on this channel, it still looks like it's not running that well. Even if a lot of work went into this, I would bet money this was chosen as the easier, cost- or time-saving option. It's lazy and it shows. If you create a game that runs like shit on top-spec equipment, it's shit no matter how you dress it up in excuses.

  • @astoraan6071 · A year ago · +9

    It's embarrassing how little of a graphical upgrade there is for this much of a performance hit.

  • @stylie473joker5 · A year ago · +49

    The game doesn't even look good enough to justify the requirements and poor optimization; even the scaling between graphical options is bad.

    • @DragonOfTheMortalKombat · A year ago · +6

      No game looks good enough to justify poor optimization. Some games do however look good enough to justify sub par performance.

    • @Aleph-Noll · A year ago · +4

      Same. I was like, what is it even rendering? Everything looks like ass.

    • @zaczeen1121 · A year ago

      @Aleph-Noll 85% disagree lol, glad you're in the minority.

  • @ericjr2601 · A year ago · +143

    The funny thing is, this game has no ray tracing. Usually upscaling is only really required when you turn on a couple of RT effects. Imagine what it would be like if they had some RT implemented. Crazy.

    • @nossy232323 · A year ago · +13

      Start putting money aside for the RTX 5090 :)

    • @MechAdv · A year ago · +8

      UE5 is the reason I skipped this GPU generation. Everything below a 4080 or XTX is going to be obsolete by next year if you play AAA titles.

    • @nossy232323 · A year ago · +4

      @MechAdv I also expected something like this, but I had to upgrade; my GTX 1080 + 6700K just wasn't enough for me anymore. I upgraded to an RTX 4090 + 7950X3D and changed my monitor to that new Asus 240Hz 1440p OLED. That way at least it will stay relevant for some time.

    • @lombredeshakuras1481 · A year ago · +1

      Lumen would absolutely destroy performance. It's such a shame devs aren't taking a normal approach to optimization and just slap upscaling on to smooth that over without caring about players.

    • @zaczeen1121 · A year ago

      @lombredeshakuras1481 85% disagree, I guess lol.

  • @HoldinContempt · A year ago · +7

    Since they won't optimize the game to run on current hardware, just refuse to buy the shitty game until three years later when it's 80% discounted.

  • @nekov4ego · A year ago · +14

    720p gaming is back with a new name! Not only does the game not look special at native res, but you can't see any detail with such aggressive upscaling.

  • @VegetarianZombie · A year ago · +9

    Me: I'm going to buy a 4090 so I can play games on ultra at 1440p without ever having to lower settings or use upscaling!
    Remnant 2: I'm going to ruin this man's whole career.

  • @tomthomas3499 · A year ago · +16

    It goes hand in hand, then: devs become more reliant on upscaling tech, while GPU makers sell said proprietary upscaling tech with little to no effort on actually increasing performance... hmmm.

    • @photonboy999 · A year ago · +6

      I'd say that's partly true. For some game devs, a dynamic render resolution setup is low-hanging fruit: either spend months optimizing different parts of the game to find the areas that drop to 20FPS or lower, or just let the game drop as low as it needs to, even 360p, causing massive blurriness.
      Hopefully people vote with their wallets and make it clear this is unacceptable. I certainly wanted the new Jedi game, but didn't buy it.

  • @AntonioLexanTeh · A year ago · +56

    The game looks good, but not good enough to require any amount of upscaling. A Plague Tale: Requiem ran better at native on my system than this game does with FSR Quality. INSANE. And we all know that game has some of the most beautiful graphics we have today. Plus, when you change quality settings, there is no clear visual change. Basically, this game is trash.

    • @nannnanaa · A year ago

      It doesn't have AA; upscaling looks great with some sharpening.

    • @nannnanaa · A year ago

      Have you played the game?

    • @christonchev9762 · A year ago · +13

      The game looks like a PS4 game, not PS5, let alone UE5.

    • @DragonOfTheMortalKombat · A year ago · +1

      @nannnanaa "Upscaling looks great," and you look like a corporate shill. Of course upscaling looks great when your native resolution looks like trash.

    • @spitfire1789 · A year ago · +3

      It's crazy, because I thought even A Plague Tale should have run better when it came out, since even though it looks gorgeous, it's a really linear, straightforward game. This game doesn't look anywhere near as good, yet runs worse. Gaming is in the mud.

  • @yeahbuddy300lbs · A year ago · +48

    I can't believe I switched from console to PC because console games were starting to be very low-res, upscaled by FSR, and here I am with my new PC about to experience the same thing 😂 fml

    • @Boris-Vasiliev · A year ago · +11

      You don't have to play those few unoptimized games on PC. Luckily, you can play thousands of other games on PC, compared to like 50-100 on any console. And the most popular games on PC run perfectly smoothly even on low-end systems.

    • @papichuckle · A year ago · +2

      Plus you end up spending more, and the devs' advice for optimizing their games is to throw money at your PC.

    • @yeahbuddy300lbs · A year ago · +1

      @garrusvakarian8709 yep, that's exactly why I sold mine. I think Jedi: Survivor is like 800p internally on Xbox and PS5 😂💀

    • @DragonOfTheMortalKombat · A year ago · +1

      @@yeahbuddy300lbs 648p

    • @DragonOfTheMortalKombat · A year ago · +1

      @garrusvakarian8709 Like Forsbroken, which comes from that same Square Enix 🤣

  • @THU31 · A year ago · +8

    A small AA studio was able to include all three upscaling technologies, and frame generation? Where did they find the resources to do that? It must have taken so much time and effort they probably had to starve themselves. Thank heavens the big AAA studios aren't making such sacrifices.

  • @iaindyer1629 · A year ago · +9

    I'd definitely like to see the same tests on the AMD range of GPUs. I currently have a 6750 XT. Don't care how long the video is; very interesting stuff. Well done, sir 👍🏻.

  • @MaxIronsThird · A year ago · +21

    Hope to see a follow-up with some AMD GPUs like the 6700 XT and 7900 XTX, and the Intel A750.
    Also other CPU configs like a 12400, 12600K, 7600, 5600X, 5800X, and the 5800X3D: how strong does the CPU need to be to stop the stutters?

    • @Rexxxed · A year ago · +1

      With a 6800 XT and a 5800X3D I get occasional stutters, but nothing too crazy. Going into a new area can cause stutters for a few moments, but then it goes away, with an average frame rate of 70-100 fps, typically in the 80s and 90s. This is on mostly high settings with shadows and effects on medium, with FSR Quality. I was honestly pretty sad about the performance; it's playable, but I figured I'd get better numbers. I feel for anyone with a weaker PC; it must be miserable in some cases.
      Edit: forgot to mention this is at 1440p. Haven't tested 1080p, as it screws with my second monitor.

    • @MaxIronsThird · A year ago · +1

      @Rexxxed Use XeSS; it looks better and gives you more performance in this game.

    • @Rexxxed · A year ago · +1

      @MaxIronsThird I tried using it, and while it does look better, it actually decreased performance by a few frames in my case.

    • @Kage0No0Tenshi · A year ago

      An R7 5800X3D or i5-13600K is the minimum to remove the bottleneck.

  • @Inpulsiveproductions · A year ago · +5

    Just another garbage UE game: non-stop stutter and low FPS all around. I can't get stable performance with an RX 6800, R5 5600X, 32GB of RAM, and an NVMe SSD. I'm done with Unreal Engine games and won't be playing another one any time soon...

  • @sunlightheaven · A year ago · +27

    I'm convinced that CULLING is not being done on certain meshes in certain dungeons/areas, the worst offender being the dungeons in the N'Erud world.

    • @wallacesousuke1433 · A year ago

      What's culling?

    • @sunlightheaven · A year ago · +7

      @wallacesousuke1433 Objects not in your field of view aren't rendered, which increases performance.

    • @mimimimeow · A year ago · +3

      Nanite automatically culls everything not visually perceptible in real time, so the argument doesn't make sense. That's why LOD isn't relevant in these games; they simply cull CG-quality assets in real time. It will, however, try to load your GPU with geometry as long as there is headroom. So I think what happens here is that Nanite can't gauge how much headroom your GPU has, a common theme in PC space. It seems to work very well on consoles.

    • @captainthunderbolt7541 · A year ago

      I am 100% certain that culling *IS* being done. Do you know how I know? Because the shadows in this game are a screen-space effect, and as soon as the source of a shadow is out of the frame, the shadow itself disappears!!

    • @Franku40keks · A year ago

      @mimimimeow Don't forget that consoles also heavily rely on upscaling; the game has a performance and a quality mode there too.
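      For readers who, like the reply above, are wondering what culling means in practice: here is a minimal, illustrative sketch of view-frustum culling, deliberately simplified to a cone test. All names and numbers are made up; real engines (UE5's Nanite included) test bounding volumes against the six frustum planes and layer occlusion culling on top.

      ```python
      import math

      def in_view_frustum(cam_pos, cam_dir, fov_deg, obj_pos, obj_radius):
          """Keep an object if its bounding sphere overlaps the camera's
          view cone (a simplified stand-in for a true 6-plane frustum).
          cam_dir is assumed to be a unit vector."""
          to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
          dist = math.sqrt(sum(v * v for v in to_obj))
          if dist <= obj_radius:                 # camera is inside the sphere
              return True
          # angle between the camera's forward vector and the object direction
          cos_a = sum(d * v for d, v in zip(cam_dir, to_obj)) / dist
          angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
          # widen the cone by the bounding sphere's angular radius
          pad = math.degrees(math.asin(min(1.0, obj_radius / dist)))
          return angle <= fov_deg / 2 + pad

      # an object straight ahead is drawn; one behind the camera is culled
      print(in_view_frustum((0, 0, 0), (0, 0, 1), 90, (0, 0, 10), 1.0))   # True
      print(in_view_frustum((0, 0, 0), (0, 0, 1), 90, (0, 0, -10), 1.0))  # False
      ```

      Everything that fails the test is skipped before rasterization, which is why missing or broken culling shows up as GPU time spent on geometry you never see.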

  • @LaikenBailey · A year ago · +2

    It's crashed over 15 times for me already... extremely frustrating, and I never buy games at full price 🙄 Great game, but crashes are a huge turn-off.

  • @draxrdax7321 · A year ago · +2

    I have a modest computer, a Ryzen 3 with an RX 6600, and can run this game at high 1080p, but there are stutters from time to time; using FSR on Quality makes it work really well at 60fps. But I agree this is not a solution. The game should be playable as-is, not to mention that people with older cards don't benefit from upscaling. And the game isn't exceptionally demanding in itself, unlike some Capcom games with facial rendering where you can't even tell if the characters are 3D models or real people.
    PS: It's even weirder that this game looks so similar to the first Remnant in terms of graphics, yet they managed to make it require so many more resources. The first game ran even on a potato.

  • @JohnDoe-cw9oo · A year ago · +18

    Seeing how the CPU affects GPU performance is eye-opening.

  • @sherlockholmes7630 · A year ago · +23

    Those devs should be slapped inside out to teach them how to code and optimize so that the game still looks great and runs like butter.

    • @photonboy999 · A year ago · +3

      Yeah, that's rarely how it works. Games are complicated, especially if you're learning new software. Usually the issue is upper management setting deadlines that are just not possible. I wish people would stop blaming the coders.

    • @metroided34 · A year ago · +6

      @photonboy999 I'd usually agree with this stance; sadly it doesn't work in this case, as the devs outright stated the game was designed with DLSS/FSR in mind.

    • @bigturkey1 · A year ago

      What game have you played in the last year that maxed out a 4K 120Hz panel without using DLSS?

  • @R4in84 · A year ago · +17

    Yeah, I should not need DLSS on to get above 60fps at 1080p on a 3070, but here we are; finished the game yesterday. Really hoping the devs can solve the performance issues.

    • @SALTINBANK · A year ago

      Worth it? How many hours, mate?

  • @cosmickirby580 · A year ago · +7

    I was talking to a friend a while ago about my hatred for FSR and DLSS.
    They are not bad technologies and have their uses, but this exact situation is why I despise them.
    DLSS frame generation, and the way they marketed it, was the giveaway for me. The writing was on the wall; I just can't believe it wasn't limited to the GPU companies and their desire to sell underpowered hardware at higher prices.

    • @hastesoldat · A year ago

      So you think GPU makers are secretly bribing game developers to sabotage their games' optimization? oO

    • @cosmickirby580 · A year ago · +1

      @hastesoldat nah, not that deep. With the technology that's available, game devs don't have to fully optimize their games anymore.
      Essentially: "Who cares if the game doesn't hit 60 fps? DLSS frame generation makes up for it."
      It's oversimplified, but that's the point. It boils down to laziness, or cutting corners.

    • @hastesoldat · A year ago

      @cosmickirby580 I still don't see how GPU makers and engineers are at fault there. It's the game devs that are incompetent and lazy as usual and are using the technologies wrong.

    • @cosmickirby580 · A year ago

      @hastesoldat again, GPU makers and engineers are not conspiring with game devs; it's not that deep.
      What I'm trying to tell you is that the technology, regardless of its intended purpose, allows corners to be cut, both on the hardware side and in the actual rendering work.
      In oversimplified terms:
      GPUs don't need to be as powerful, because upscaling makes up for it.
      Games don't need to be as optimized, because upscaling makes up for it.
      This doesn't mean they are conspiring with each other. In reality, they are working toward the same goal, making more money with less time invested, and this technology simply allows it.

  • @cgerman · A year ago · +5

    It could be that developers are becoming lazy: instead of writing better code, they rely on upscaling, or they simply say "just buy a better GPU". It's insane that even the 4090 can't keep up.

    • @Leahi84 · A year ago · +3

      That's the attitude of a lot of people I've run into in the PC gaming community, too. You're just expected to upgrade or stop complaining, which is horrible with the economy the way it is.

  • @imo098765 · A year ago · +21

    Great work Daniel, love these types of vids.
    Edit: The floating torso pointing to a part of the screen is always great.

  • @ojhuk · A year ago · +19

    I want to play this, but I'll hold off on buying it until they get around to finishing it. I miss the days when an optimized game was implied by the term "release version". Thanks for the heads-up, Daniel!

    • @iPeanutJelly · A year ago

      The game runs fine if you aren't pushing over 60fps. I pre-ordered and played the game fine enough; small stutters here and there, but nothing bad until the final boss: less than 30 fps on medium settings.

    • @MaxIronsThird · A year ago · +7

      That's going to take a while.

    • @moldyshishkabob · A year ago · +28

      @iPeanutJelly "If you aren't pushing over 60fps"
      "I pre-ordered"
      Pfft, okay. Really talking to the masses with this. The latter makes it sound to me like you're justifying the waste of money after the fact.

    • @ojhuk · A year ago · +3

      @MaxIronsThird If the past few years have taught me anything, it's patience.

    • @DragonOfTheMortalKombat · A year ago · +6

      @iPeanutJelly Go to consoles if you don't want more than 60FPS. And STOP advising others to waste their money like you did.

  • @spg3331 · A year ago · +51

    Game devs being lazy? NO WAY! The actual state of gaming the last few years has been unacceptable.

    • @TheVanillatech · A year ago · +1

      Yet you keep slapping that pre-order button!
      And "few years" is a bit of a stretch. Try almost 10 years, since The Witcher 3, basically. That was the first big test-the-waters "FUCK YOU" from developers: putting NDAs on reviewing the game because of the EPIC downgrade in visuals and the blatant "drop port" the PC version was of the console versions, with CDPR protecting their pre-order millions by threatening that German website with legal action for leaking the truth days before release; the site was desperately trying to expose CDPR and let people get their money back in time by cancelling pre-orders based on FALSE ADVERTISING.
      Now the devs don't even try to hide it anymore, cos y'all KEEP ON SLAPPING that pre-order button.
      Oh, and you keep paying Nvidia $400 for entry-level GPUs too. And $300 for their media-centre options.
      GG. WP.

    • @Rythm1337 · A year ago · +2

      He's right, devs are lazy; you can see what's wrong with the optimization within the stats.

    • @nannnanaa · A year ago · +1

      Have you played the game?

    • @BladeCrew · A year ago · +5

      Buy indie games; those devs make sure you can run their game even on a potato or a toaster.

    • @TheVanillatech · A year ago · +2

      @BladeCrew Just buy games that are already out, have been reviewed, and where reviewers said "Great game! Well optimized!" SURE, take my money!
      Not pre-ordering nonsense two years in advance and then getting trash like this, and having to spend $500 on DLC while you wait for the trash to get patched! XD
      Gamers have all lost their minds! XD

  • @Legnalious · A year ago · +5

    Digital Foundry's video about this game on consoles says the resolution is around ~1200p upscaled to 1440p for 30 fps, with 792p for balanced and 720p for performance. It's unoptimized, and they're using upscaling as a crutch for playable framerates.

    • @Stardomplay · A year ago · +2

      *sigh* Same story as Forspoken, Jedi: Survivor, and now this. I guess we should get used to this as the norm for those of us with mid-range PCs, though I do plan on a 7900 XT build in the future.

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад +2

      @@Stardomplay Turn the settings to medium and most of the problem will be solved (unless it's a game engine issue like that shader compilation stutter stuff). The problem is I see people take ultra settings as a measure of whether a game is well optimized or not, when ultra has always been built to punish even the fastest hardware available at the time.

  • @einstien2409
    @einstien2409 Год назад +12

    Can you see if XeSS and FSR have the same effect on frame rate stability?

    • @sergtrav
      @sergtrav Год назад

      XeSS is significantly faster in this game for some reason.

    • @einstien2409
      @einstien2409 Год назад

      @@sergtrav have to try it out then, thanks.

  • @BPMa14n
    @BPMa14n Год назад +32

    My brothers and sisters with last gen mid range cards this is a blessing the backlog has been waiting for us for years. Maybe in 2025 we can play this game at 1440p 100+ fps as God intended Pc games be played.

    • @RGInquisitor
      @RGInquisitor Год назад +3

      Tell that to the people who are still waiting for a GPU worthy enough to get a stable 60fps on Attila: Total War. The game came out 8 years ago, and the developers claimed it was "made with future GPUs in mind" when they received backlash for the performance.

    • @beri4138
      @beri4138 Год назад +1

      @@RGInquisitor That game runs almost exclusively on your CPU.
      The GPU is irrelevant. I played it on a GTX 260 back in the day.

    • @RGInquisitor
      @RGInquisitor Год назад +1

      @@beri4138 Indeed it does. Still runs like shit even with the latest cpus. Newer and older games perform much better. Very bad optimization regardless.

    • @escape808
      @escape808 Год назад

      I play at 1440p and get an average of 120fps on a 3080 and 10900k. Just need to upgrade, my guy.

    • @RGInquisitor
      @RGInquisitor Год назад

      @@escape808 No, I do not. You also do not get an average of 120fps on Atilla on any hardware.

  • @Cygnus-Phi
    @Cygnus-Phi Год назад +13

    Awesome work! The one thing that's missing is checking it with a fast/modern Intel processor (or a fast/modern non-x3d AMD) to see if it's just pure raw power that's needed or 3D V-cache related, and then a 5800X3D just to cross check. Also why no AMD cards checked?

  • @ThatsMySkill
    @ThatsMySkill Год назад +2

    When DLSS first released I loved the idea of it giving us a boost, but sadly it's now being used as a lazy way to optimize. Years ago we managed to run games fine without DLSS, but now it's like a mandatory thing. It's so lazy. We could have insane graphics combined with insane fps, but studios don't wanna optimize their games anymore. That's why I appreciate the last two Doom games and also Atomic Heart for running so smoothly despite looking really good.

  • @refl3xes835
    @refl3xes835 Год назад +2

    I bought this game a few days ago and tried it this morning. First time I've ever refunded a game on Steam. The gameplay was nice but man, the performance is so bad. I have a 2080Super and 9900k in this system and I wasn't hitting 60fps with DLSS on, and huge frametime spikes/stutters. I want to like this game and I'm sure I will if they ever fix it, but right now it's a mess.

  • @Richard-tj1yh
    @Richard-tj1yh Год назад +5

    All UE games have heavy CPU bottlenecks, add raytracing on top of that and its murder.

    • @od1sseas663
      @od1sseas663 Год назад +2

      just say you're poor and cannot afford a 13900KS

    • @newyorktechworld6492
      @newyorktechworld6492 Год назад +4

      ​@@od1sseas663If you're raising a family with bills to pay your first priority isn't an expensive cpu. Many of us are on a tight budget.

    • @123TheCloop
      @123TheCloop Год назад

      @@od1sseas663 Just say you're a shill and cannot afford a 7800X3D or 5800X3D, which wipe the floor with a 13900K in benchmarks. Keep drinking that koolaid bro, let me know how it feels when the copium wears off.

    • @od1sseas663
      @od1sseas663 Год назад +1

      @waitwhat1320 I got it for almost 500$ when it released. My GPU alone costs more than your whole pc you poor kid 😂😂
      130fps in Last of Us. Yeah, such a "low end" pc 😂😂😂

    • @od1sseas663
      @od1sseas663 Год назад

      @waitwhat1320 Upload a video with your high end specs then

  • @PixelShade
    @PixelShade Год назад +3

    I am afraid of the UE5 future... It feels like developers get a lot of convenience out of the engine, like offering unlimited detail with Nanite, a ridiculous number of layers for physically based materials, cinematic lighting and camera effects. Yet we simulated all of that stuff for a fraction of the cost 10 years ago. I think what will happen is that development studios will just cut corners in terms of staff, and publishers will profit more. The only AAA games that have impressed me a lot these last couple of years are Half-Life: Alyx (and Cyberpunk, which also has its own in-house engine). Alyx looks pre-rendered in VR. They use highly optimized old-school techniques, and it's impressive that they reach that level of quality while rendering two scenes simultaneously, one per eye, at insane resolutions and high refresh rates. Heck, I play at 5400x2700 at a solid 90fps with an RX 6800. The game renders wonderfully at 4320x2160 on an RX 6600XT. Yes, they use a combination of baked and dynamic lighting; yes, they use parallax-corrected cube maps for reflections. But honestly, why not, when they look that good?
    I'm looking forward to Counter-Strike 2, just to see how good Source 2 will look and how performant the game will be. :)

  • @nomustacheguy5249
    @nomustacheguy5249 Год назад +5

    How does the game perform on AMD gpus ? Any chance you will make another video ?

  • @RubbingPotatoes
    @RubbingPotatoes Год назад +2

    @5:30 we later determined that it was CPU bottlenecked. But why does the CPU utilization not show 100% usage on any of the cores?

  • @Sineseol
    @Sineseol Год назад +2

    The game is an ugly mess; it looks like a game from 5 years ago. It only shows that UE5 is unoptimized bloatware.

  • @Star_Wars_Galaxy
    @Star_Wars_Galaxy Год назад +5

    I really like this style of video, where you alternately step your way up in performance through the GPU and CPU. It really lets you get a feel for how the game performs on most hardware. Would love to see it with another CPU option, although I know that's more work than just slotting in another graphics card, or would require another system.

  • @anchannel100
    @anchannel100 Год назад +16

    PC Gaming is turning into a rat race

  • @stirfrysensei
    @stirfrysensei Год назад +14

    add another one to the pile I guess…regardless of what the devs claim, this is poorly optimized and shouldn’t have released until it was actually finished. Sadly, people will use this info in their GPU wars, instead of holding the devs accountable.

  • @slothsarecool
    @slothsarecool Год назад +2

    More like Unreal isn’t ready. I miss when games were actually optimized 😅

  • @kathrynck
    @kathrynck Год назад +2

    I think running Remnant II on a GTX 1650 (as per their listedn "minimum requirements", was incredibly optimistic of the devs.
    Remnant 1 (UE4) was very playable with a GTX 1080 Ti in 1440p native. But it didn't exactly flat-line against the monitor refresh cap (mostly 60-90 fps).
    And for reference, a 1080 Ti makes a 1650 look like a bad joke. I seriously doubt a 1650 could even come close to running Remnant 1 in 1440p.
    The footage of Remnant II (at 540p upscaled to 1080p) looks pretty bad. Lots of stutter, and the eye candy isn't even as pretty as Remnant 1 on UE4.
    Remnant one was _terrible_ at pre-compiling content as you approach new terrain. It would crash somewhat frequently because of not pre-loading early, and then freaking out when you move forward and suddenly need new render content. And that was with 11GB on a 1080 ti.
    Excellent testing by the way. Clearly the cpu/mobo/ram is critical for this game. There's a LOT of gaming pc's out there with a 9600K or less. That's gonna hurt Remnant II a lot.
    Adding "eye candy" (UE5) while relying on huge amounts of upscaling to make it work, is counter-productive. Actually looks worse than if they just made the game on UE4, running in native rez.
    Also, wow, a beastly system with a 4060 @ 1080p/medium can't 60fps? That's horrendously unoptimized.
    And a 4090 @ 1440p ultra only hits 70 fps, with no ray tracing in the game????? That's just broken.
    This may be the "hardest-to-run" game in existence. Generally cyberpunk with max RT is the lowest fps you can get on a game. But this beats cyberpunk for low FPS.
    You really NEED very high fps to play Remnant too. It's got a lot of content which requires 'immediate' reflexes. It's basically a gun-toting version of Dark Souls with much better multiplayer.
    I have to say this is not a very good pc port. Which is a pity, because Remnant is an amazing game.

  • @Aaron_Jason
    @Aaron_Jason Год назад +6

    I find capping the framerate usually fixes terrible frametimes in most games. I usually have to do it with RivaTuner, but sometimes the in-game cap works.

    • @luisuribe5432
      @luisuribe5432 Год назад +4

      Ah yea cap it at 30 fps 😂😂😂

    • @ChrisM541
      @ChrisM541 Год назад +1

      Did you watch the video? If you did, you'll know the problem is with LOW frame rates. Just how much lower would you like it capped? ;)

  • @Steven-ex3ne
    @Steven-ex3ne Год назад +24

    One of your best videos yet, a fantastic realworld look at performance across various options.

  • @derekn615
    @derekn615 Год назад +5

    This game doesn't have RT, but it makes heavy use of UE5 Nanite. It's likely that having the option to turn this off would dramatically improve performance. You touched on this in your Fortnite video when they moved to UE5: ruclips.net/video/WcCUL3dR_V0/видео.html

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +1

      You can't just turn nanite off lol

    • @WeaselPaw
      @WeaselPaw Год назад +1

      With nanite "off" it would probably just run at 4 fps as it would have more stuff to render.

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      @@WeaselPaw depends on game implementation. they can go back to older way of doing things which will have more performance but less detail.

    • @WeaselPaw
      @WeaselPaw Год назад

      @@arenzricodexd4409 Yeah but in that case you can't just have an option to turn it on or off. Otherwise they would have to implement both ways, which would defeat the purpose of using nanite in the first place.

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      @@WeaselPaw implementing both old and new method is common when everyone still on transition phase. that's why we see when new DX being released majority of game did not use new API exclusively. rather they release on both old and newer API. most often the older API serve as a way for people with older hardware or slower hardware to play the game. but as usual for some gamer especially those with newer hardware they will accuse developer being lazy because developer did not use newer API exclusively and fully optimized the game with newer API feature when they did this. i think something similar also happen here.

  • @tindo
    @tindo Год назад +2

    And graphically the game doesn't look worthy of its requirements.

  • @mayssm
    @mayssm Год назад +1

    Remember back when game devs used to brag about how real a world looked, how luscious the plant life, the shadows, the water, etc? These days Devs would rather you turn down all those settings so it looks like crap.

  • @n1lknarf
    @n1lknarf Год назад +4

    Your testing is also very generous; fill the screen with particle effects and more characters running their Ai Behavior and timers, and you'll definitely see the frame drops.

  • @jeremyf1901
    @jeremyf1901 Год назад +6

    I hope Digital Foundry cover this game.

  • @pneumonoultramicroscopicsi4065
    @pneumonoultramicroscopicsi4065 Год назад +3

    Human nature dictates taking the path of least resistance

  • @iyaramonk
    @iyaramonk Год назад +2

    An Unreal game that is poorly optimized? How novel 😄

  • @kishimotodashoppee8632
    @kishimotodashoppee8632 Год назад +2

    The worst part is that the majority of people will just enable it and play the game, and this will become the norm for lack of backlash.

    • @jamesvenom9834
      @jamesvenom9834 Год назад

      And they are doing it. See it anywhere, whether steam or reddit, you will see people hating on the ones that highlight the poor performance.

  • @SkeeziOS
    @SkeeziOS Год назад +3

    Something tells me the ultra setting just adds unnecessary stuff the devs left there so players whose systems can support it get the best possible experience; it's not needed for most people.
    Still the game is too demanding and needs some serious revision, but for now would like to see some testing at medium and high settings.
    Either way, it’s DOA imo

    • @FloVelo941
      @FloVelo941 Год назад

      Ultra setting crashes Remnant 2 on the spot for me and with this new GPU I got, I can play Forza 5 at near ultra settings without hiccups.

  • @ThisAintPizzaHut445
    @ThisAintPizzaHut445 Год назад +4

    Super interested in seeing your AMD benchmarks. Other channels have shown the 7900XTX matching or even beating the 4090 in this game.

    • @GeneralS1mba
      @GeneralS1mba Год назад

      Probably was, since it wasn't optimized properly; it just inherently works better on AMD because of the consoles.

  • @wekkimeif7720
    @wekkimeif7720 Год назад +3

    Those spikes are most likely the game building new shaders. The areas you already visited have no stutter because the shaders are already loaded. It's a common Unreal Engine issue, and devs 99% of the time don't optimize shaders.

    • @BlackParade01
      @BlackParade01 Год назад +1

      Unreal Engine 5.2 is supposed to have a built-in solution for this, and Remnant 2 is built on 5.2.

    • @wekkimeif7720
      @wekkimeif7720 Год назад +3

      @@BlackParade01 Unreal Engine 4 already had a solution for this: just build the shaders before the game starts. But this is just another developer studio that doesn't care about performance at all. They admitted in their own post they had done no optimization and relied fully on temporal upscaling.
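The compile-on-first-use stutter described in this thread can be illustrated with a toy shader cache. All names and costs below are made up for illustration; this is a sketch of the idea, not actual engine code:

```python
# Toy model of shader caching: compiling a pipeline on first use causes a
# frame-time spike; pre-warming the cache at load time avoids it.
# All names and costs are invented for illustration, not real engine API.

COMPILE_COST_MS = 40.0  # assumed one-off compile cost per shader variant
BASE_FRAME_MS = 8.0     # assumed cost of a frame once shaders are cached

def render_frame(needed_shaders, cache):
    """Render one frame, compiling any shader seen for the first time."""
    frame_ms = BASE_FRAME_MS
    for shader in needed_shaders:
        if shader not in cache:
            cache.add(shader)           # compile on demand -> visible stutter
            frame_ms += COMPILE_COST_MS
    return frame_ms

def prewarm(all_shaders, cache):
    """Compile everything up front, e.g. behind a loading screen."""
    cache.update(all_shaders)

all_shaders = {"pbr_opaque", "foliage", "particle_add", "skin"}

cold = set()
spiky = [render_frame(["pbr_opaque", "particle_add"], cold),
         render_frame(["pbr_opaque"], cold)]

warm = set()
prewarm(all_shaders, warm)
smooth = [render_frame(["pbr_opaque", "particle_add"], warm),
          render_frame(["pbr_opaque"], warm)]

print(spiky)   # first frame pays two compile hits: [88.0, 8.0]
print(smooth)  # pre-warmed run stays flat: [8.0, 8.0]
```

The point of the toy model: the total work is the same either way, but paying it mid-gameplay shows up as a frametime spike, while paying it during a load screen is invisible.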

  • @LolJolk
    @LolJolk Год назад +1

    How come everyone made games 5-10 years ago, that graphically look exactly the same as new games, and ran perfectly fine on old hardware like a quad core i5 with GTX 700/AMD HD 7000 at 1080 60fps? Now you need an RTX 3060/RX 6600 for basically same performance without upscaling, for same looking games

  • @edwardhood6252
    @edwardhood6252 Год назад +1

    I think that it is ABSOLUTELY DISGUSTING that the 4090 required upscaling at any resolution with no ray-tracing to give good performance! The dev's on this game definitely DID NOT do their jobs and just relied on upscaling as a crutch for getting good frame rates. I think that this game is going to get a hard nope from me!

  • @mchonkler7225
    @mchonkler7225 Год назад +5

    In other words, the devs were lazy and tried to cut costs by having hardware brute force it rather than spending time optimizing.

    • @bigturkey1
      @bigturkey1 Год назад

      what game have you played in the last year that maxed out a 4k 120hz panel without using dlss?

  • @paulcoffield2102
    @paulcoffield2102 Год назад +8

    I remember only a couple of years ago, the 3080 was seen as THE card to have in terms of price to performance for 1440 and 4K gaming. Now it's been relegated to a 1080p card by this game. I have a 4090 and 7950x, and I can barely hold 45-50FPS at native 4K, even when it's overclocked to 3Ghz and pulling over 500w. Just insanity. I could understand it a little more if the game looked ridiculously good, but it doesn't.

    • @latambox
      @latambox Год назад +1

      Yes, true, and Red Dead Redemption 2 has better graphics, as does Star Wars Jedi: Survivor.

    • @beri4138
      @beri4138 Год назад

      @@latambox Both of those games are demanding as fuck

    • @pliat
      @pliat Год назад +2

      @@beri4138 nowhere near this game. i can run RDR2 at max everything at native 5120x2160 at ~110fps with my 4090. not very demanding

  • @CorporateShill66
    @CorporateShill66 Год назад +2

    Game doesn't look very good either

  • @Markoul11
    @Markoul11 Год назад +1

    In a CPU-limited system, locking the FPS to, say, 60, 50, or 40 will take strain off the CPU for smoother gameplay. Do not uncap the FPS in the game settings! Also, as crazy as it sounds, in a CPU-limited system the DLSS Balanced or even Quality setting can give you better performance than the Performance setting, since DLSS takes strain off the GPU but increases the strain on the CPU.
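The frame-cap advice above can be sketched as a minimal limiter loop. This is purely illustrative (real limiters like RivaTuner use higher-precision waits, and the "work" here is simulated with a sleep):

```python
import time

# Minimal frame limiter sketch: if a frame finishes under its time budget,
# sleep the remainder so frames are delivered at an even cadence instead of
# letting the CPU race ahead on easy frames and stall on hard ones.

def run_capped(frame_work_s, cap_fps):
    budget = 1.0 / cap_fps
    paced = []
    for work in frame_work_s:
        start = time.perf_counter()
        time.sleep(work)                  # stand-in for simulation + rendering
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)  # idle instead of running ahead
        paced.append(time.perf_counter() - start)
    return paced

# Frames that would run at ~250 fps uncapped get paced out to ~25 ms each
# (a 40 fps cap), flattening the frametime graph and freeing CPU headroom.
paced = run_capped([0.004, 0.004, 0.004], cap_fps=40)
print([round(t * 1000) for t in paced])  # roughly [25, 25, 25]
```

The idle time the limiter inserts is exactly the headroom a struggling CPU gets back, which is why capping often smooths frametimes even though it lowers the average FPS.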

  • @phant0mdummy
    @phant0mdummy Год назад +1

    What's the point in all of this?
    Why even pretend we should use UE5 if this is the performance expectation when running 720p internally on current consoles... and when even a 4090 system can have issues running the game, then clearly it's far too soon for the engine to be utilized. It's not worth it.

  • @KidGacy
    @KidGacy Год назад +5

    They didn't make the game with upscaling in mind , they are hiding their incompetence with upscaling, this is ridiculous , the visuals are nice but nothing to write home about and games should be made for current hardware, it's like they didn't look at the steam hardware survey.

  • @J0ttaD
    @J0ttaD Год назад +7

    Its crazy how this game barely looks any better than the first game but crushes that system... !?!?!? WHAT

    • @FloVelo941
      @FloVelo941 Год назад +1

      It literally crashes my system, and I just got a $400 AMD 16GB GPU for this game. I have to stay at 30fps with the lowest settings. At this point Mad Max looks better on my computer than this game.

    • @goblinphreak2132
      @goblinphreak2132 Год назад +2

      Remnant, the first one: 7800X3D, 7900XTX, 64GB DDR5 6000 CL30, max settings at 3440x1440, and I get 180-300fps. Looks good.
      Remnant 2, same 3440x1440, same ultra settings (no FSR), and I get 56fps AVERAGE. Turn on FSR Quality and I get 110-120fps, but the game looks like literal shit. The game isn't that much prettier than the first game.

  • @GhostAcez
    @GhostAcez Год назад +4

    The original choppiness you experienced when using the i5 9600K is because that CPU is one of the worst modern CPUs for gaming. It only has 6 cores and 6 threads, which makes it frequently perform worse than lower-powered 4-core/8-thread chips. You would likely have had a better experience using one of the minimum-requirement CPUs instead, honestly. I expected a bad experience the moment you said "9600K", suspected this would happen, and am glad you tested it with other CPUs.

  • @Premier024
    @Premier024 Год назад +1

    So on my PC with a 13700k 4090 in the city by the crystal at 4k high preset dlss off frame gen off I get around 50 fps. I play it with dlss set to balanced and get around 100 on average about 30 hours played so far. I'm still loving the game but it runs horribly.

  • @N0N0111
    @N0N0111 Год назад +1

    This seems like Nvidia's Hairworks 2.0 is now playing ball with performance.
    This is my suspicion, because a game running native at 1080p with so many stutters is highly abnormal.
    Let's see Nvidia put out a driver fix for this game in the near future and fix this very odd poor performance.

  • @erickelly4107
    @erickelly4107 Год назад +3

    As I've been stating for months now that the RTX 4090 makes more sense paired with a 1440p / 240hz monitor.(for a wide variety of games: Single Player RPG's Max settings / Ray tracing, etc. @ ~90FPS+ FPS@ up to 240FPS etc.) This upsets some as 4K monitors have become much more popular ( many more have them and want to justify the purchase obviously) but it really shouldn't come as a big surprise given that even older games such as Cyberpunk 2077, Control, Halo Infinite, etc. are plenty demanding at 1440p with a corresponding optimal ~ 90+FPS experience running on an RTX 4090 PC.
    So why anybody would be surprised that newer UE5 games would further make the point clear that the RTX 4090 is more ideally suited for 1440p is just bewildering. 1080p / 1440p / 4K is NOT what it used to mean, an issue many seem not to grasp.
    Really though having both a 4K/ 120Hz / VRR/ OLED and 1440p / 240hz / G-Sync display I've preferred gaming on my 1440p monitor overall with my RTX 4090 PC. The largest leap in visuals is going from 1080p to 1440p, 1440p to 4K isn't nearly as mind blowing (diminished returns after 1440p) when it comes to gaming. For example, if the option is 4K/ 60FPS or 1440p/ 90FPS I'd choose 1440p EVERY time.
    The consoles have more VRAM (~ 13.5GB "usable" for games) than they literally know what to do with and this is the reason for so many of the terrible "PC Ports" IMO which is a shame. The GPU's in these consoles (~ RTX 2070 at best) aren't nearly powerful enough to justify gaming at 4K and thus the reason for the upscaling / dynamic resolution gimmicks and even still the performance "FPS" is quite abysmal ~ 30FPS.
    Regardless of this terrible error via pandering to ignorance and marketing "4K" (4K more better...)with a $500 budget, game developers should NOT let this be a lame excuse to release terrible PC Ports that run like crap due to absurd VRAM requirements so that the consoles can run high textures at ~30FPS.. Pretty visuals don't mean much when your "in motion" performance is sub-par.
    As far as CPU requirements, again no real surprise that the new platform (AM5 / DDR5) will begin to start showing its advantages more over the older (DDR4) platform as time passes. People love to complain (a favorite pastime of many / Nvidia, AMD, Intel is out to get me, etc..) about the price of admission for AM5 / DDR5 but it's not like there would be no benefit to spending more to enjoy the benefits well into the future. It pays not to be short sighted.
    The RTX 4060 is not nearly as terrible as many have made it out to be, based on many reviews / benchmarks I've seen. The problem by and large is that most reviewers aren't providing the proper context but instead pander to the outrage mob who demand outrage porn. Very EASY to do considering the horrendous state of the economy in general - just look at the comments with the most "likes" and this point becomes crystal clear.
    People very often don't like hearing the reality (often involves more $$$) of things but it is what it is regardless of that the mob wants to hear.
    As far as having to use "upscaling" to play the game optimally I find it difficult to become remotely bothered by this really. I mean if the game still looks and plays well is it really that big of a deal? I mean who would have thought that over time games would become more demanding to run… Again comparing 1080p /1440p/4K in the past to now is like comparing apples to oranges.
    Another thing people seem not to understand is that viewing videos on RUclips (decompression, 60FPS cap, etc.) is NOT the same as what the game looks like when playing natively on a high refresh monitor at home. So the comments about how the game doesn't look good, etc. are pretty irrelevant / further demonstrate the appetite for outrage porn - people are broke / life sucks and why did I purchase this 4K monitor again?

    • @Wobbothe3rd
      @Wobbothe3rd Год назад +2

      Good comment, but the consoles don't actually have as much VRAM as you think. Very few games on Xbox X or PS5 actually use more than 8GB, 10 or 11GB absolute max. The ps5 reserves a huge chunk (over 3GB) of the shared RAM for the OS, and obviously not ALL of the remaining RAM is used for video!

    • @wallacesousuke1433
      @wallacesousuke1433 Год назад

      Holy wall of text...

    • @erickelly4107
      @erickelly4107 Год назад

      @@wallacesousuke1433 Yeah but I space it out to be easier to read..

    • @leoSaunders
      @leoSaunders Год назад +1

      just for everyone else: 2070 is a 6600
      6600 = 2070 =< 3060 < 6600xt < 6650xt

  • @spurio187
    @spurio187 Год назад +12

    Nanite can be really heavy and is directly tied to the number of pixels being pushed; it's inherent to how the tech works. It basically tries to cull the number of polys down to the number of pixels for a per-pixel effect; that's why upscaling works so well here. Now imagine when we get games with Lumen as well. Hopefully the tech will mature nicely though, and we'll see big performance jumps.

    • @Franku40keks
      @Franku40keks Год назад +2

      This is a good explanation, thank you.

    • @Alex.Holland
      @Alex.Holland Год назад +1

      While I was disappointed with the mandatory DLSS, I was blown away at how well the DLSS worked in this game compared to any other implementation I had seen before. Other than the standard parallel-line distortion effect, I legit cannot tell when DLSS is on or off, or what setting it's on. 3070, and I tweaked settings to get a rock-solid 72fps at 1440p; the game is extremely playable.
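The pixel-proportional geometry idea described in this thread can be turned into a back-of-the-envelope cost model. Assuming, purely for illustration, that with Nanite both shading and geometry cost scale with internal pixel count, while a traditional renderer pays a fixed geometry cost per scene (all constants below are invented):

```python
# Back-of-the-envelope cost model (illustrative numbers only): if geometry
# detail is culled to roughly match on-screen pixel count (the Nanite idea),
# lowering the internal resolution cuts BOTH shading and geometry work,
# while a fixed-LOD renderer only saves on shading.

def pixels(w, h):
    return w * h

def cost_pixel_proportional(w, h, shade=1.0, geo=0.6):
    # geometry cost scales with pixel count, like a per-pixel triangle target
    return pixels(w, h) * (shade + geo)

def cost_fixed_lod(w, h, shade=1.0, fixed_geo=2_500_000):
    # geometry cost is constant regardless of output resolution
    return pixels(w, h) * shade + fixed_geo

native = (2560, 1440)
internal = (1706, 960)  # ~2/3 scale per axis, i.e. a "quality" upscaling mode

nanite_speedup = cost_pixel_proportional(*native) / cost_pixel_proportional(*internal)
fixed_speedup = cost_fixed_lod(*native) / cost_fixed_lod(*internal)

print(round(nanite_speedup, 2))  # 2.25: the full pixel-count ratio
print(round(fixed_speedup, 2))   # smaller: the fixed geometry term remains
```

Under these toy assumptions, upscaling buys proportionally more in the pixel-proportional case, which would explain why lowering the internal resolution helps this game so dramatically.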

  • @neofitosmihail4272
    @neofitosmihail4272 Год назад +2

    SAPPHIRE 7900XTX NITRO+ - 5800X3D = THE GAME OPTIMIZATION IS CRAP -------------->> DONT EVEN BOTHER TO PLAY THIS GAME --->> LIKE THE DEVELOPERS DON'T BOTHER TO OPTIMIZE THE GAME

  • @zretil
    @zretil Год назад +3

    I was expecting a similar system requirement as the previous game, graphically the game looks pretty much identical to the first one, I don't see any benefits from using UE5 other than bumping new GPUs sales of course.

  • @torrxknight2064
    @torrxknight2064 Год назад +1

    I have a Ryzen 7 5800X, RX 6600 XT, 32GB DDR4 at 2800MHz, and the game is on a Gen 4 NVMe SSD. Smart Access Memory (SAM) is on, and at 3440x1440 (ultrawide) I can choke out anywhere between 12-63 fps WITHOUT UPSCALING. Like, what the actual hell. I can run something like Metro Exodus with RAY TRACING at 60fps minimum, yet this game makes me believe I'm still using a Vega 8 iGPU to play it. Cranking my resolution down to 720p of all things lets me run the game at a stable 50-70 fps. The game is fun, until you play it in open areas or against enemies that have moves with special effects. The devs should be ashamed of themselves for releasing a Crysis in 2023, and this makes me dislike/hate upscaling even more. It was meant to be a nice bonus to increase performance to hit those sweet high frame rates (120-144+), not a requirement for playing a game...

  • @yoshi7481
    @yoshi7481 Год назад +1

    DLSS was a crutch for lower-end systems, now it's a crutch for people who have no business making games to call themselves "devs". If it didn't exist, Remnant 2 would not exist.
    Anyone can make a POS unoptimized mess, not touching any engine code, only using BPs, not even knowing basic CS concepts, and not optimizing any of their environments.
    The hardware is not "behind", it's the software that is lagging behind the hardware insanely. The software isn't utilizing the hardware properly.
    UE4/5, apart from physics/anims is primarily single-threaded.
    You pay $300 for your fancy 5800X3D only for the game to use like 2 cores for important things like drawing and the game thread.

  • @stratuvarious8547
    @stratuvarious8547 Год назад +7

    Given the recommended specs, I don't see why higher end systems would need upscaling. I'm running a Ryzen 7 3800X/6900XT, which is well above the "recommended", especially on the GPU side. This is a real problem, I've been trying to give developers the benefit of the doubt when it comes to PS5/Series ports, but it doesn't seem like Remnant 2 has that excuse. Eventually, this is going to become a problem for developers, because people are quickly getting sick of the idea of release now/optimize later.

    • @rahulpandey9472
      @rahulpandey9472 Год назад +1

      If we go by what the devs said regarding the use of upscaling tech, then those requirements make perfect sense: DLSS Performance at 1080p for 60 fps in the case of the 2060, and FSR Performance at 1080p for 30 fps in the case of the 1650.

  • @mrpositronia
    @mrpositronia Год назад +4

    Very poorly optimised. Almost like they haven't got round to it yet. Or they don't intend to.

    • @sudd3660
      @sudd3660 Год назад +1

      is not all games shit when they launch?

    • @beri4138
      @beri4138 Год назад

      @@sudd3660 You were shit when you launched

    • @mrpositronia
      @mrpositronia Год назад +1

      @@sudd3660 They never used to be.

    • @sudd3660
      @sudd3660 Год назад +1

      @@mrpositronia it is common enough now that is almost standard.
      you have to go back before updates when they actually finished games before release.

  • @jacksonoliveira2813
    @jacksonoliveira2813 Год назад +4

    When I bought my RTX 3060 Ti I was super happy: it could run all current games, and at QHD no less (some with DLSS's help to keep frametimes stable). Now, so little time has passed and I feel like I'm using my old GTX 970 again (and look how long that one lasted). This optimization is beyond sloppy.

    • @KING0SISQO
      @KING0SISQO Год назад +1

      Still have a 970 ☠️

    • @jacksonoliveira2813
      @jacksonoliveira2813 Год назад

      @@KING0SISQO In the GPU world the 970, 980, 1070 and 1080 are kings... but bad optimization and those sh*t upscalers and FG kill all that good raw GPU power.

  • @parkersgarage4216
    @parkersgarage4216 Год назад +1

    I mean, if they can't do their job I simply won't buy their game. A game requiring upscaling is a shit game and doesn't need to be supported with my money. I'm blown away the 4090 only gets 70 fps at 1440p in this game. I wish you would have done high settings along with ultra; most of the time you can't discern between the two. At least I know I can't.

  • @CheesyX2
    @CheesyX2 Год назад +1

    And now perpare that this BS is going to be the future for the next couple of years. UE5 is just broken at its core. I've developed quite a few Projects in UE4 and tried many of them in UE5, generally performance is cut in half even when not using Nanite, Lumen and Virtual Shadow Maps. 2 completely identical Projects (down to the Editor Settings), in UE4 i'm able to get around ~200fps while in UE5 the same scene with the exact same settings cuts the fps down to ~95fps. Power consumption and usage of the individual parts is still higher even while pushing way less fps (and while having all new features disabled).
    Another good showcase that this engine in its current form is just broken is Epic's beloved cash cow Fortnite. Before the big switch to UE5 I very rarely dropped below 170fps on Epic settings (120ish with ray tracing enabled); however, since the switch to UE5 I barely manage to scratch 100fps on MEDIUM to HIGH settings (again, disabling the advanced features does not seem to help very much) in a game that still looks like it's from 2010, and on top of all that the stuttering has become unbearable.
    Also keep in mind that this was tested on a quite powerful system with an RTX 3080TI.

  • @callmekrautboy
    @callmekrautboy Год назад +5

    Thanks for putting in the work to get this out. Great overview. 100% not supporting this type of development.

  • @cookie_of_nine
    @cookie_of_nine Год назад +13

    I have a feeling Nanite is responsible for the dramatic improvement made by the upscaling compared to other games.
    Nanite likely tries to put more detail into an object the more pixels it takes up on screen (provided there is more detail to be had). Thus decreasing the resolution, either manually or via upscaling, reduces both the amount of geometry work the GPU/CPU needs to do for the scene and the number of pixels it needs to fill.
    Not sure if nanite has a multiplier for how much detail it works to put on screen, and the remnant devs set it too high, or maybe nanite doesn't provide that knob (or make it easy to change) so a resolution change is unfortunately the "easiest" way to trade off detail for performance.

    • @seanmaclean1341
      @seanmaclean1341 1 year ago +2

      this is 100% it.

    • @Larathu
      @Larathu 1 year ago +1

      On your first point you are correct, but from my understanding of Nanite you are incorrect on the second.
      For Nanite to work as intended you need a target framerate:
      if FPS < target FPS: Nanite lowers detail
      if FPS > target FPS: Nanite increases detail
      Uncapped FPS = max detail at all times
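The screen-coverage idea discussed in this thread can be sketched as a toy LOD selector: a system that aims for roughly one triangle per covered pixel will automatically cut its geometry budget when the internal resolution drops, which is one plausible reason upscaling helps this game so much. This is an illustrative sketch only, not Epic's actual cluster-selection algorithm; the function, the one-triangle-per-pixel target, and the halving-per-level assumption are all hypothetical.

```python
import math

def lod_level(object_screen_px: float, tris_at_lod0: int) -> int:
    """Pick a mip-like LOD so triangle count roughly matches the pixels
    the object covers (~1 triangle per covered pixel).  Each LOD level
    halves the triangle count of the previous one.  Illustrative only --
    not Nanite's real cluster selection."""
    target_tris = max(1.0, object_screen_px)   # ~1 tri per covered pixel
    if tris_at_lod0 <= target_tris:
        return 0                               # full detail already fits
    # number of halvings needed to get under the target
    return math.ceil(math.log2(tris_at_lod0 / target_tris))

# Rendering at a lower internal resolution shrinks the object's pixel
# coverage, so the selector picks a coarser level and the GPU both
# shades fewer pixels and rasterizes fewer triangles.
full_res = lod_level(100_000, 2_000_000)   # object covers 100k px at native
upscaled = lod_level(44_000, 2_000_000)    # same object at ~66% scale
```

Under these assumptions the lower-resolution pass selects a strictly coarser level, which would compound the usual fill-rate savings from upscaling.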

  • @__-fi6xg
    @__-fi6xg 1 year ago +4

    Oh oh, but Mister Owen, my PC is ready; clearly you missed the part where I didn't buy a 1440p or 4K monitor, so my new PC feels stacked in any game at 1080p.

  • @kitsunegamedev4526
    @kitsunegamedev4526 1 year ago +1

    This is the future of games: unoptimized, TAA-forced, DLSS-forced BLURRINESS FESTIVAL. What's the point of Nanite, for example, if the image quality is so crap that you can't even appreciate it? It's really beyond me how we've gotten to this point.

  • @tehama4321
    @tehama4321 1 year ago +1

    This is one of the first UE5 titles. UE4 released in 2014 and was relevant for the Xbox One / PS4 generation.
    For an engine that is going to be a cornerstone for the next decade, it SHOULD tax current hardware.
    I'm not saying the game is optimized. This also isn't a AAA release. The implementation of Nanite and the geometry is impressive coming from a team of this size. Without implementing next-generation lighting, sure, it may look a little flat and less flattering.
    That being said, I would be more concerned if early UE5 titles didn't push the envelope, or ran flawlessly on a CPU or budget GPU (x060) released five years ago.

  • @BUCCIMAIN
    @BUCCIMAIN 1 year ago +3

    People are saying "the game doesn't even look good," but graphically it does; the art direction is just VERY specific and liking it is subjective. You have a valid argument about the performance, but no need to bash something for no reason.

    • @j-swag7438
      @j-swag7438 1 year ago

      The graphics look average/decent to me, but they are nowhere near good enough to warrant those system requirements. There's a level in Black Ops 3 Zombies called "Zetsubou No Shima" which has a similar art style to what's shown in this video, but it runs perfectly on my GTX 970 and the graphical quality is pretty much the same. And that game released 8 years ago.

    • @BUCCIMAIN
      @BUCCIMAIN 1 year ago

      @@j-swag7438 I took a look at a few YouTube videos of Zetsubou No Shima and no, that isn't "pretty much the same." The details and overall quality are obviously far better in Remnant 2. Like I said, the argument about performance is valid, but saying it looks "bad" is subjective.

  • @Jakesmith943
    @Jakesmith943 1 year ago +3

    Thank you SOOOO much for this review, saved me from buying it and being disappointed yet again with a new AAA title on PC. There is no way I am paying money now for something that will be fixed later. I didn't see this issue raised in any of the reviews I watched. Again, thanks!

  • @nebsi4202
    @nebsi4202 1 year ago +4

    I use a 4080 at 1080p with a 13600K, and this is one of the games that actually pushed my GPU to the max. Without DLSS/frame gen I was averaging around 100 fps AT 1080p, and in some areas it would actually drop to as low as 74 frames.

    • @bigturkey1
      @bigturkey1 1 year ago

      You should game at 4K with a 4080.

    • @nebsi4202
      @nebsi4202 1 year ago

      @@bigturkey1 I'm gonna game at whatever resolution I want, it's my PC xdd

    • @bigturkey1
      @bigturkey1 1 year ago

      @@nebsi4202 That's dumb.

    • @nebsi4202
      @nebsi4202 1 year ago

      @@bigturkey1 My money, my GPU, I can do what I want with it. Simple.

  • @MechAdv
    @MechAdv 1 year ago +1

    While this game doesn’t appear to be breathtaking visually, remember that UE5 demos WITHOUT Nanite or Lumen from 2 years ago were crushing the 3090. Just because the devs didn’t do a good job artistically doesn’t make the engine any less challenging to push frames through. UE5 is the reason that I’m waiting until next gen to upgrade my GPU. There isn’t enough perf on offer with this generation to get 3-4 years out of a card from this generation.

  • @GemyGaming_LOL
    @GemyGaming_LOL 1 year ago +1

    Digital Foundry just released a performance analysis of this game on consoles, and it's running at an internal resolution of 720p @ 60 fps in balanced mode, with drops to 50, upscaled to 1440p.
    Quality mode: ~1200p @ 30 fps.
    The consoles also performed very badly; it's not just a PC issue, it's how gaming has become nowadays.
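The gap behind those console numbers is easy to quantify: fill-rate-bound work scales with the internal pixel count, and 720p is only a quarter of the pixels of the 1440p output, so the upscaler is inventing three of every four output pixels. A quick sketch of that arithmetic (the 2133-pixel width for the ~1200p quality mode is an assumption based on a 16:9 aspect ratio):

```python
def pixels(width: int, height: int) -> int:
    """Total pixel count for a given render resolution."""
    return width * height

internal = pixels(1280, 720)    # balanced mode's internal resolution
output   = pixels(2560, 1440)   # the 1440p target it upscales to
quality  = pixels(2133, 1200)   # assumed ~1200p quality mode at 16:9

# Ratio of output pixels to internally rendered pixels: the upscaler
# fills in everything above 1x.
ratio = output / internal
```

Under this arithmetic the balanced mode renders exactly a quarter of the output pixels, which matches the pattern seen on PC, where large internal-resolution cuts are the only way the game reaches its framerate targets.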