Steam Deck will deliver a flawless 267p experience 💪
at flawless 20fps
Just flawlessly upscale it to 360p, sorted.
cinematic VHS experience!
Not gonna hit 30fps on the poor Steam Deck, crap bullshit !!!!!!!
you should be grateful playing it lol
The right question is not "if the PC is ready", but if the "game is ready to launch" 🙂
lol
Low fps =/= unoptimized
It's high level gambling to buy a new game today.
@@ewitte12 Exactly, some games look bad and run even worse, but a game with these graphics is going to be demanding regardless. It's like Cyberpunk with path tracing, it's built for future hardware.
@@ewitte12 Can be, but most of the time low fps = unoptimized garbage that will be fixed in a few months after a dozen performance patches.
They're not listing the FPS because it's targeting 30fps lol. That's what I think.
I think they're doing what old system requirements used to do, stating the game targets 30 fps just because it can't hold 60, even if in reality it could run at 45 or 50 fps
my eyes cannot see the difference between 30 fps and 120 fps anyway
@@abadidibadou5476 You need a better eye doctor or a better monitor, or stop taking those copium pills. I bet you can't see the difference between 720p and 1440p either.
@@lynackhilou4865 Still not good, they should clearly tell the players what is required for 60FPS.
@@DragonOfTheMortalKombat Lil bro hasn't heard of sarcasm 💀
The major issue with these kinds of specs is the gap between the two parameters, resolution and preset... jumping from medium 1080p to high 1440p, when the majority of players on midrange cards will go with high 1080p or medium 1440p
Yeah, even Nixxes does that with their system requirements, which I find a bit weird
True that, medium 1440p is probably the most used setting for mid-tier GPUs. Dare I say upscaling is gonna be mandatory for this one.
@@leyterispap6775 It's basically mandatory for every UE5 game, you gain so much performance even on quality mode. It is idiotic not to use it.
No one believes these dev-recommended specs anymore
Idk seems like a pretty minor issue given that you can just tweak your settings to your system. I think the major issue is not listing a frame rate with the specs provided.
And this doesn't even factor in the usual UE5 type stutter that will inevitably occur.
Thanks for giving us priority over Darren. You're the best!
Hopefully it comes complete and not a broken mess. I actually liked the first one a bit.
If a game was made on UE5 that should tell you everything you need to know about the performance.
The finals was made on UE5 and runs amazing. Your pc is just ass
@@2MuchQuenton damn never knew the 4080 paired with the 7800x3D was ass, how could I have not realized this ahead of time
@@juniorgalindo1083 if you’re struggling to run UE5 games then it’s ass
@@2MuchQuenton buddy u clearly have no clue what ur talking about but i get it when you base PC aptitude off a game that could be run off a potato. Try running a game that actually has some textures to it instead of buildings made out of blocks and then come talk to me.
@@2MuchQuenton optimization you know?
Digital Foundry's initial analysis of the footage they released suggests it's 1080p and lower resolution on the Series X. This looks like 30fps settings
But I thought consoles are 4k or at least 1440p... Also series x on lower resolution than 1080p???
So series s is on 480p lol!
@@789uio6y nothing new here. Go look at how many games are releasing at 1080p or lower to hit an unstable 60 on consoles, then upscaling. A lot.
@@789uio6y What do you expect, they want to use the latest and greatest tech on middle of the road 2020 tech
This was bound to happen. The same things happened with last gen. If they tried to run RDR2 on Xbox One X settings on the base consoles, they would be sub 720p experiences
The series s was marketed as a 1440p 120fps machine btw LOL
Wtf. What's the point at this point, pardon the tautology?
So it's not PC Specifications, but PC Speculations; that's what SPECS stands for
That's not what specs stands for
@@deadpaul6587 you must be very funny at parties
@@iamhardwell2844 you shouldn't come to the party
I saw the requirements, then you came to my mind. I opened YouTube and found your analysis on my home page. Damn
Buy on sale after optimisation
Try on gamepass
@@tytanis interesting
Already bought it and gonna play it next Tuesday when it drops 😁
I always accidentally see your videos right when you post them.
Yes, the PC is ready for this interactive cutscene movie: a YouTube playthrough at 4K 60 fps.
So mad a game isn't a Ubisoft open world slogfest?
@@dynamichunter843 What?! You don't like Ubisoft open world slogfests? Everyone loves Ubisoft! What's wrong with you? Reported. /s
@@SirThanksalott Everyone loves Ubisoft my @ss lmao
Behold the great and only AAAA developer
🤣
@@DragonOfTheMortalKombat bro doesn’t understand sarcasm
@@DragonOfTheMortalKombat Bro is new to the internet. Glass bro.
Time for 30 FPS to die, 60 FPS should be the new bare minimum.
Just get a good PC lol. Also, if you don't have a good one, FSR 3, DLSS and frame generation exist. And if you're having stutters you can just set a frame rate cap. So many easy ways to get more fps bro
Fool @@2MuchQuenton
@@prafullrahangdale666 if you don't wanna listen to tips that will help you have a better experience, that's on you 🤷♂️
Yeah my PC is ready since it's a movie game that you can watch on yt
Or stream via game pass xbox cloud
They said the same thing about Starfield, turned out it was just poorly made and unoptimized, still is today
thanks Daniel, as always very in-depth
Very thorough look at the specs required. Thanks Daniel.
I think this will run 60fps or close to it. The 6800xt at 1440p using tsr should get 60fps. Maybe not all the time but close enough. I would be very surprised to see that card not hit 60fps at all.
My guess is that at native it will get 30+ fps but not quite 60, which it will reach or possibly exceed with quality upscaling
6:42, what was that expression on his face
Make a gif😂😂
Nice find xDDD
Going from 1440p to 4K does not cut your fps in half unless you run out of VRAM.
Just look at benchmarks for the 3080 Ti at 4K and 1440p.
Going from 4K to 1440p gives you around a ~70-80% fps boost depending on the game.
And that's when you are GPU limited. If you are CPU limited, the gap will be even smaller.
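A quick sketch of the arithmetic behind that comment. 4K has exactly 2.25x the pixels of 1440p, but only part of the frame time scales with pixel count; the 70% "resolution-bound" share below is an assumption for illustration, not a benchmark:

```python
# Rough model of why 1440p -> 4K rarely halves fps: only part of the
# frame time scales with pixel count; the rest (geometry, CPU submit,
# fixed passes) stays roughly constant.

PIXELS_1440P = 2560 * 1440   # ~3.69 MP
PIXELS_4K    = 3840 * 2160   # ~8.29 MP (2.25x the pixels)

def estimate_4k_fps(fps_1440p: float, res_bound_fraction: float = 0.7) -> float:
    """Estimate 4K fps from a 1440p result.

    res_bound_fraction is the assumed share of frame time that scales
    linearly with pixel count (0.7 is a guess, not a measured value).
    """
    frame_ms = 1000.0 / fps_1440p
    scaled = frame_ms * res_bound_fraction * (PIXELS_4K / PIXELS_1440P)
    fixed = frame_ms * (1.0 - res_bound_fraction)
    return 1000.0 / (scaled + fixed)

if __name__ == "__main__":
    fps = 100.0  # hypothetical 1440p result
    print(f"{fps:.0f} fps at 1440p -> ~{estimate_4k_fps(fps):.0f} fps at 4K")
    # ~53 fps: going the other way is a ~85-90% uplift, not a clean 2.25x
```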
*Turns on DLSS balanced mode* my PC is now ready for 60fps 😳
Not to mention it's gonna have frame generation too lol. Idk why so many people are complaining about "performance issues" when the damn game looks like real life
Y'all need glasses
Looks like the 3rd DX12 phase, just below Alan Wake 2, which may be the most demanding game out there.
I never use presets and always tweak settings to how I like them, usually with textures maxed out, shadows turned way down, and everything else on medium or high instead of ultra. Also, this game will definitely support DLSS, and probably frame gen. It's only console players that need to worry.
It would be crazy to see the XTX and 4080 not getting 60 fps without upscaling at 4K.
Those cards are 1440p cards
@@theelectricprince8231 look how the 3080 and 3090 went from 4K to 1440p too. I mean, maybe at the time of their launch they were beefy enough to handle 4K, but one year later games are so demanding that they turn out to be more fitting for 2K
@@laszlodajka5946 I'm not leaving 1440p for at least 6 years
It's not crazy if high settings use Lumen or some hardware RT; these cards can't get 60 fps with RT on in many titles, even at standard settings in the heaviest games.
If there's anything I learned this generation, it's don't play UE5 games at native res
@@lynackhilou4865 it is crazy as they started out as 4k cards.
Unreal engine 5 is clearly ahead of its time, no console nor GPU (except maybe 4090) can run it at 4k60 fps
And that 4K60 on a 4090 is really only achievable with DLSS quality and/or frame gen enabled when running at max settings
it isn't the engine itself, more so how it is being used
@@CaptToilet maybe, but DLSS quality at an already-4K resolution looks incredibly good; DLSS looks bad only at 1080p and in some games at 1440p, but 4K is always beautiful, maybe even with more aggressive DLSS settings
5090 to the rescue! 😂
It's not the engine, it's ray tracing (Lumen) and Nanite; if you skip those, UE5 can be as fast as UE4.
Most people want "real life" graphics now and want them to run on a 300€-or-less GPU, and that is just not possible.
If you want good graphics, you have to pay for it with faster, and therefore more expensive, hardware. But even a current 2000€ GPU struggles with ray tracing at native resolution, more so if you use path tracing. If it weren't for upscaling misleading people, they would see how demanding RT really is.
Thank you for the reasoning behind the graphics spec requirements! Intel's A770 is definitely out of place compared to Nvidia and AMD relative performance
My 3050 6gb is about to have a field day
I'll be alright with the new rig. Got an OLED for color pop. A 7900 XT, 5800X3D, a KC3000 2TB and 32GB of 3600MHz CL15 memory. Display's only 1440p too.
They say the game is anamorphic 2.39:1 with letterboxing on every 16:9 monitor. I will try it on a 48:9 ultrawide to see if I still get the letterbox
Aside from ray tracing, I'm so pleased with my GRE. The fact I can just use it, no underclocking, and not experience massive fan noise as I did with my Palit 3070 is also a big bonus. For the 2 years that I used that card, 98% of the time it was running at 65% power lol
that card requires undervolting with overclocking, free 15% performance bump
Well yeah, you got a lower-tier brand 3070. What did you expect??? My 3080 doesn't get loud, doesn't get hot and is fine 3+ years after its release. Because I got a model that isn't total shit
I mean a 3070 is basically a 1080 Ti so yeah that's a big upgrade.
@@Dexion845 What? 1080 Ti is nowhere near the 3070
@@PseudoPolish Alright ya got me there, that was a huge exaggeration.
This looks much more GPU limited than CPU limited to me. I don't see any indicator that the 4k with a 7900xtx/4080 is using DLSS/FSR/TSR.
If it gets 30 FPS on an RX 5700 xt at medium 1080p, to me that means it'll run 1080p on consoles internal, upscaled to 4k using FSR or TSR. And that the GPU on consoles is what's limiting it to 30 FPS, not the CPU.
If it can get 30 FPS on a Ryzen 2600, I don't see why it can't get 60 FPS on a Ryzen 5700x. It's only limited to 30 on a 7900xtx, because they are probably playing at native 4k. But it's still Unreal 5, so it likely won't be playable at 90 FPS+ on really any PC unless using frame generation.
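The reasoning in that comment follows the usual simple bottleneck model: a frame takes as long as the slower of the CPU and GPU, and only the GPU cost grows with resolution. A minimal sketch with made-up per-frame timings (all numbers are hypothetical, just to illustrate the logic):

```python
# Simplest bottleneck model: frame time = max(CPU time, GPU time).
# CPU cost is roughly resolution-independent; GPU cost scales with
# pixel count. The timings below are invented for illustration only.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = {"Ryzen 2600": 30.0, "Ryzen 5700X": 14.0}   # assumed per-frame CPU cost
GPU_MS = {"1080p medium": 12.0, "4K native": 33.0}   # assumed per-frame GPU cost

for cpu, cpu_ms in CPU_MS.items():
    for res, gpu_ms in GPU_MS.items():
        bound = "CPU" if cpu_ms > gpu_ms else "GPU"
        print(f"{cpu} @ {res}: ~{fps(cpu_ms, gpu_ms):.0f} fps ({bound}-bound)")
# A 7900 XTX at native 4K lands on the GPU-bound row (~30 fps) no matter
# how fast the CPU is, which is the comment's point.
```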
Looking at the specs, the 30 fps lock is due to CPU limitations. And no, even the pro consoles won't be able to hit 60 with CPU limitations. Only frame gen could help somewhat.
My workstation PC with an i9 13900 and RTX 4070 is ready to have a crack at it. 1440p 60 or 4K 30 on high settings should be doable if I use DLSS upscaling. DLSS frame generation would be a last resort to tack on top of DLSS upscaling if I really wanted to push for 60 fps+, but for that I'd want it to be on max settings.
If your 13900K is one of the good ones yes.
You gotta do all this to play a walking simulator? Insane to me. Gaming in 2024 is weird now
5:14 We're more important than Darren. Take that, Darren!
We're at a graphical performance junction with newer titles of late, with UE5 and other engines being quite demanding even for 1080p high-ish settings (not ultra). We'll need Nvidia and AMD to offer something like 4070 Ti (12GB) performance and VRAM in an RTX 5060-class GPU, and at least 50%+ more performance the generation after that, to keep up with gaming demands!
No chance, they're busy making that AI money.
@@Kitten_Stomper well, vote with your wallets and maybe, just maybe, Nvidia will have no choice but to meet the market with reasonable prices. Will that happen? I doubt it. People are lemmings and love to complain.
Developers should be targeting better image quality over graphical fidelity. Modern games look so blurry even at native 4K.
How do you sell the average person on image quality? Graphics are much easier to sell than telling someone who doesn't even know what framerate is that it runs at 120FPS.
@@penumbrum3135 I know, the average user is the problem.
@@L4veyan The average user is not counting frames, just like the average person watching a movie doesn't count how many times the movie is chopped up.
Well said
Yes Sir. Okey dokey. Understood. Alpha Roger. Solid copy.
i'll just wait for the rtx 5090
Bruh, soon we will all need 3 max level GPUs that haven't come out yet to run a game at 60fps that has you walking on a rainy rocky beach at night for an hour.
It may be that we are starting to see devs writing with the Arc cards in mind. Dragon's Dogma 2 runs much better on an A750 than on an AMD 6600, for example. It may also be the case with the A770 in this game. I would suggest a combination of better use of the hardware and the faster, increased memory (16GB up from 8GB) may help with the 1% lows, making it easier to maintain the 30 frames per second. It will be interesting to see if this is the case.
I don't know about you guys but I think 60fps should be the absolute bare minimum and the baseline, no matter the specs. Everything under that is literally unplayable
16GB system RAM is now the minimum for games, glad I got 32GB.
I can't live without 16GB even for casual computing.
@@Splarkszter Well 16 is the minimum now.
I have 32 gigs of RAM and this game still runs smooth
Did you get to play the Pax Dei alpha and test some cards on that?
I just wanna know if this game is using UE 5.4 with the improved PSO tech to reduce or eliminate the stutter struggle... Thus far, UE4/UE5 games still have awful optimization for shader-compilation stutter and traversal stutter - Immortals of Aveum plays like crap and still crashes with DLSS 3 enabled - hopefully that was just a rushed EA thing.
Then again, Ninja Theory's UE4 Hellblade has TERRIBLE stutter, from traversal to shader compile, even with the RT-enhanced patch etc...
Yeah though Ninja Theory wasn't really doing any bespoke partnership engine development with Epic at the time of Hellblade 1.
As for traversal stutter - that problem runs deep & it's down to the fact that consoles have dedicated hardware for asset streaming/decompression with a unified memory structure, & PCs don't. It's a fundamental hardware difference that can't be solved with brute force of powerful PC hardware. Direct Storage keeps being talked about, but it doesn't seem like a real solution.
@@InnuendoXP yes, it's really a game engine issue though. If the engine (UE5) could properly make use of and leverage all PC CPU cores without waste, DirectStorage and other fallback techniques for decompression wouldn't need to be relied upon.
To note, other game engines don't particularly have this issue. 4A Games (e.g. the Metro series on the 4A Engine) and CD Projekt Red (Cyberpunk and The Witcher series on the custom RED Engine) take various PC hardware into account, leveraging cores and I/O in incredible ways. (Yes, I cheated, I chose two companies who often design with PC-forward ideologies.) But I mention them because it's not impossible for engines to have this scalability coupled with high fidelity.
CD Projekt Red are running to UE5 for their next titles; if those run like a dream on UE5.9xxxx while every other UE5 game continues to run like ass on PC hardware, I'll try to start a riot... lol
@@nuruddinpeters9491 well it's an important difference when companies develop for PC first vs console first, when they're aiming to leverage the hardware-accelerated streaming hardware in consoles to hit their performance targets.
The fundamental difference in data flows & memory management is very non-trivial, it's giving your engine a wholly different set of instructions on when, how & where to pull data from, when to release it. In the case of PC to avoid eating up CPU time for those milliseconds it would mean having to effectively throttle SSD read speeds, expand system & vram memory requirements for caching & culling, all sorts of things they can just instantly just do on-the-fly on consoles that they can't on PC.
It's a built-in performance optimisation for consoles, the most profitable market by far (for publishers) that effectively drastically reduces memory requirements. Going after traversal stutter on PC means sacrificing the console experience unless you have more dev time & money than god, & borderline building half a new game engine for the sake of stopping some hitching & stuttering happen. You can see why lots of devs simply aren't budgeted to make that happen & why we've been seeing so many shoddy PC releases on the stutter front.
PC I/O is not 'incredible' when it comes to minimum latency overhead from drive reads, the architecture which does that was not designed for gaming-first, it's secondary at best. The only way this is stopping on PC's side without waiting months after launch for the fixes to be made is for AMD & Intel to add their own hardware acceleration & get motherboard manufacturers to play ball on it if the chip needs to be sat somewhere between the RAM & GPU PCI-E lanes.
@@InnuendoXP you make great points, many of which I'm aware of, however I must push back.
Developing for console and developing for PC isn't either/or. UE's tools, going by dev and studio accounts, have mostly been called trash: stuck on 2 or 4 cores, thread-limited, and horrible to compile with for bigger, higher-fidelity games. (UE4)
Speaking of UE3, this wasn't the issue. Games like Batman: Arkham City leveraged PC hardware extremely well, not to mention scaling well on the consoles' crappy Jaguar CPU cores. UE4 was never great for PC. Early UE5, again, not a great start either; Epic admitted to the issues and the lack of tools/techniques for decompression.
To your point about games being only good on either PC or console, this is crap. Again, The Witcher 3 was developed for consoles first and ported to PC without criminal traversal stutter. Custom engine, sure. Not UE (important.)
CD Projekt Red back in 2015 with The Witcher 3 were effectively indie devs taking both console and PC to task. This nonsense about it costing too much money is cope.
I think you have a lot of valid points, but yeah, some of it is flawed. The UE engine needs better tools, period. It's not an either/or situation for console and PC. It's that when so many devs run to Epic's engine, they struggle with trash tools.
Also, it's not like the PS5 and Xbox Series are immune to traversal stutter. Both consoles suffer traversal stutter in the UE5 Immortals of Aveum despite their magical I/O bandwidth. Same for Star Wars Jedi: Survivor (UE4). Hell, the RE Engine has traversal stutter in Dragon's Dogma, and it's an entirely different engine. We can safely assume Capcom developed the game for consoles first, PC second as a port. For all of the consoles' amazing bandwidth and magical memory management, some of these engines and tools are just hot trash. (Yes, this could be an issue of dev time for current modern games, but my point here is that consoles are not immune to traversal stutter in 2023/24.)
Now, I will say, custom (in-house) engines are inherently expensive; yes, they're a luxury in today's industry. But at the same time, they perform great feats of magic at wonderful fidelity (4A Games), for both console and PC, without triple-A budgets.
@@nuruddinpeters9491 The Witcher 3 was developed for console hardware that was weak when it was new and ran on slow HDDs while PCs were already moving to SSDs, & there was no dedicated hardware specifically for streaming data while taking the load off the CPU. So I really don't think the traversal stutter discussion applies in the same way: memory management would've been completely different (i.e. more comparable to PC), with no such fast reads & dumps in/out of memory to make more efficient use of a limited memory pool.
Also it was all kinds of jank on launch & it took a year or two of patching for the issues to be ironed out. Sound familiar?
One thing I can think of: If the Arc GPUs they listed genuinely do hold up at those resolutions/presets, at similar FPS to the AMD/NVidia GPUs in those requirement tiers, it could mean the game is unusually heavily reliant on VRAM-bandwidth. In proportion to their basic/1080p performance Arc GPUs seem to do unusually/disproportionately well in VRAM-bandwidth heavy situations like Ray Tracing and 4K.
(Not that the Arc GPUs have an especially strong baseline to start with, it's just that they often don't fall off as fast in these demanding circumstances, percentage-wise, compared to the other brands.)
I'd have to agree though... It's still very suspicious to me putting the A770 in with the 3080 and the 6800 XT. (Just looking at the 6800 XT alone, that is a significant jump over the regular 6800, and even then I'd find it more than a bit suspicious grouping ARC A770 with the base RX 6800!)
Maybe... the FPS you can expect is totally different, and only the resolution is the same! Lol!
With those CPUs/GPUs listed, and that advanced-looking graphical presentation from the trailer, and no performance/60FPS mode on console, as you are quite justified in pointing out, I also can't imagine they are trying to push massively high FPS. Easily at or below 30FPS at times, I'd guess. I find PC system requirements often to be a bit optimistic, as you can technically play the game, though people may be disappointed with framerates.
Anyway, enough from me, thanks for the video!
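One way to sanity-check that bandwidth hypothesis is to compare raw spec-sheet numbers. A small sketch using approximate published figures (ballpark values, and note RDNA2's 128MB Infinity Cache adds effective bandwidth the raw numbers don't capture):

```python
# Approximate spec-sheet figures: (boost FP32 TFLOPS, raw VRAM bandwidth
# in GB/s). Treat these as ballpark values, not exact measurements.
gpus = {
    "Arc A770 16GB": (19.7, 560),
    "RTX 3080 10GB": (29.8, 760),
    "RX 6800 XT":    (20.7, 512),  # + 128MB Infinity Cache
    "RX 6800":       (16.2, 512),  # + 128MB Infinity Cache
}

for name, (tflops, gb_s) in gpus.items():
    print(f"{name}: {tflops:5.1f} TFLOPS, {gb_s} GB/s "
          f"({gb_s / tflops:.0f} GB/s per TFLOP)")
# On raw bandwidth the A770 sits between the 6800 XT and the 3080 despite
# its much weaker effective tier, which is one way to rationalize the
# grouping if this game leans hard on memory bandwidth.
```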
Let's make the PC specs not confusing by making them confusing
nice ghost pfp
Maybe medium to high turns on the Lumen stuff; that's why there's a big jump in GPU
Doubt this game will render correctly without any sort of RT.
30 FPS in 2024 just seems criminal.
30 fps games should cost $30 max
Poor Darren, he just wanted to chat
I imagine you'll need to use DLSS/FSR/XeSS Balanced or Performance modes to achieve 60+ with this game... I just hope this UE5 game is not a horrendous stuttering mess like Lords of the Fallen still is (even after their final 1.5 patch). But I won't hold my breath. I really want to play this game because I love the first game a lot. But sadly I'm going to wait until performance reviews come out to see if it's a stuttering mess or what have you.
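For reference, here is what those upscaler modes mean for internal render resolution. A small sketch using the standard DLSS/FSR 2 per-axis scale factors (XeSS uses slightly different ratios):

```python
# Standard DLSS/FSR 2 per-axis render scales (XeSS's ratios differ
# slightly). The internal resolution is what the GPU actually shades
# before the upscaler reconstructs the output image.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

def internal_res(out_w: int, out_h: int) -> None:
    for mode, scale in MODES.items():
        w, h = round(out_w * scale), round(out_h * scale)
        print(f"{out_w}x{out_h} {mode}: renders ~{w}x{h} "
              f"({scale * scale:.0%} of the output pixels)")

internal_res(3840, 2160)  # 4K output: Performance mode shades a 1080p image
internal_res(2560, 1440)  # 1440p output: Performance mode shades ~720p
```

That pixel-count reduction is why Performance mode can roughly double fps in GPU-bound scenes.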
they might as well have listed a Voodoo FX card next to the A770, how unrealistic lol
I love these videos despite not being interested in the games
Is there a chance that the A770's relative performance is based off older benchmarks when drivers were worse? As far as I know it keeps improving.
I was able to run this game at 60fps 1080p with a mix between low and mid settings on my R5 3600, 16GB RAM at 2400MHz and an RX 6600 XT. I can't complain that much ~ the story is great, the gameplay is great and familiar, and some flaws of the first one were addressed.
Maybe I'm just fanboying the game but it was great to play
I'm excited to play this on my ultrawide, and XDefiant is out next week also. I'm currently playing through Kingdom Come with the occasional BFV multiplayer session
I know my PC is as ready as it could be but I still gotta make sure.
My 4080 Super is ready for all games in 4K, but some games have no optimization and the fps drops under 60. I hope this game won't be bad
we don't want impressive graphics at this cost. Just a finished game with fun mechanics.
The A770 is not placed correctly on the chart
After recent updates the A770 is up to the performance of a 4060
After COVID added to this game's development time, interrupting the cycle, something tells me the delay didn't result in more polish. For how long this game was in development (and the project's scope), I was expecting a product that was thoroughly optimized, not yet another UE5 title that will be heavily dependent on AI for the heavy lifting. I guess we'll find out once the reviews are released.
I'd think the Xbox version is going to be a dynamic upscaled 4K so probably closer to the 1440p line internally, at 30 fps. If it's anything like other games I've seen, the consoles seem to be generally capable of 40-ish FPS at the settings they want to use, but then they cap it at 30 for people without a new TV that can't do things like 40 fps VRR in 120 Hz 4K mode.
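The 30-vs-40 capping logic in that comment comes down to frame pacing: a cap only paces evenly when the refresh rate is a whole multiple of it, which is why 40 fps modes need a 120 Hz output (VRR relaxes this, since the panel's refresh follows the frame rate). A minimal sketch:

```python
# A 40 fps cap only paces evenly when the refresh rate divides by it
# (120 / 40 = 3 refreshes per frame). On a 60 Hz TV, 40 fps would
# alternate 1- and 2-refresh frames and judder.

def even_caps(refresh_hz: int, candidates=(30, 40, 48, 60, 120)) -> list[int]:
    return [fps for fps in candidates if refresh_hz % fps == 0]

for hz in (60, 120):
    print(f"{hz} Hz panel: even-paced caps -> {even_caps(hz)}")
# 60 Hz:  [30, 60]          <- why consoles fall back to 30
# 120 Hz: [30, 40, 60, 120] <- why 40 fps modes need a 120 Hz TV
```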
Xbox is under 1080p at 30fps, using dynamic resolution and upscaling! It's 2020 midrange hardware; even with games made for the hardware, more than that is impossible! And this was tested!
Ghost runs great, it's optimized, plus last-gen hardware ran it. Can confirm I'm struggling to get over 27 fps with a Ryzen and an RTX 3050
Thank god I upgraded my pc last month. I’m chillin
Be real, what is your framerate? I mean stable fps in the game, with no stuttering at all, running 1440p medium settings. If you're barely getting over 60fps without DLSS or some kind of upscaling, then you're golden.
Unreal is SO efficient with VRAM... each recommended GPU has more than the recommended VRAM except the 2070, which has exactly the recommended amount. This is great news across the board with a lot of stuff moving to UE
Worst engine in terms of VRAM management
What about CPU utilization...
@@flat_lander1 How so? seriously please elaborate
@@592Johno new games are getting ready to push 8 cores for sure.
Mouse Pointer Dan Strikes Back.
We need to take into account that modern games are developed with upscaling in mind. Much like how TAA/FXAA replaced MSAA (because MSAA was too demanding with deferred rendering), and how MSAA replaced SSAA (which is extremely demanding). Some current upscaling tech uses custom anti-aliasing and typically looks better than native TAA/FXAA. It's another tool/tech devs will use to add more fidelity to games.
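A rough worked comparison of the shading-cost ladder that comment describes, for a 1080p target. The MSAA overhead figure is an assumption for illustration (its real cost depends on how much geometry-edge coverage a scene has):

```python
# Rough shading-cost comparison for a 1080p target. MSAA's true cost is
# well below its sample count (extra samples mostly at geometry edges);
# the 25% overhead here is an assumed figure, not a measurement.
BASE = 1920 * 1080

costs = {
    "4x SSAA (shade every sample)": BASE * 4,
    "4x MSAA (~25% edge overhead)": BASE * 1.25,
    "TAA / FXAA (post-process)":    BASE * 1.0,
    "Upscaler, Quality (0.667x)":   BASE * 0.667 ** 2,
}

for name, shaded in costs.items():
    print(f"{name}: {shaded / BASE:.2f}x shaded pixels vs native")
# 4x SSAA at 1080p shades as many pixels as native 4K, which is why it
# died out; temporal upscalers go the other way and shade fewer.
```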
damn, that's a really good point
I'll be interested when it stops making games look like shit. Same with TAA.
I doubt TAA or FXAA will ever look good. Especially now that better alternatives are being used. DLAA is already on par with 8x MSAA, and DLSS 3.7 at 1440p and up, with quality mode and preset E, is nearly indistinguishable from native DLAA. And thankfully it only takes a few seconds to update the DLSS version and preset that's being used.
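For anyone curious what "updating the DLSS version" means in practice: DLSS games ship an `nvngx_dlss.dll` that can be swapped for a newer copy. A minimal sketch with hypothetical paths; some games keep the DLL in a subfolder, and forcing a specific preset (e.g. preset E) is usually done through an external tool such as DLSSTweaks rather than by hand:

```python
# Minimal sketch of the usual manual DLSS-version swap: back up the
# game's nvngx_dlss.dll and drop a newer copy in its place.
import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS DLL you provide

target = game_dir / "nvngx_dlss.dll"
backup = target.with_name(target.name + ".bak")

if target.exists() and not backup.exists():
    shutil.copy2(target, backup)  # keep the original in case of regressions
shutil.copy2(new_dll, target)
print(f"replaced {target} (backup at {backup})")
```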
6750 XT at $300 on NewEGG
When this game launches, Intel will launch a day 1 driver that also happens to be the fine wine driver, finally unlocking the A770's potential and placing it at 3080 levels of power.
Trust me, it came to me in a dream.
😂 ok buddy, but seriously, this might be the first game with an Intel GPU recommendation 😊
I am still waiting for Intel to fix gta V
Then you can finally justify buying that POS. Imagine thinking having a driver optimized for 1 game is a flex lol
@@nuffsaid7759 ruclips.net/video/uqUKXEiOx3g/видео.htmlsi=WBs-y7IbNk--sDkR = Intel is working on that
SURELY the A770 will beat the 4080 in UE5 Pepega
*Older games for the win, these new games suck*
Play within your means.
These games are just too demanding, and simply not even worth playing just to watch beaches and walk through wet rocky pathways for hours at 1080p 30fps, with stuttering at medium settings, on expensive hardware like a 3070. Be practical. Boot up the first game, run it at native 4K max settings, and the experience is way better. It's so jarring to play the second game because it feels good and bad at the same time.
@@JayUchiha17 it's not that heavy
I'm happy with a ~60fps average, as long as it doesn't go too low. 40 is the minimum for my eyeballs
I loved the first Hellblade. It's a story game, you don't need more than 60FPS
so true
The 1st game was pretty demanding! I remember running this on a GTX 960 and it was like playing on a flipbook lol. Even my 2060 had some challenges.
My PC is ready.. but is the game??
well..... my 7900 XT should be ready for high (recommended?) / very high on a 1440p ultrawide, but not too sure about my elderly i5-8600K... although with it OC'd to 5.1GHz all-core, I average around 120fps in Cyberpunk on high+ settings (no RT obv)
I think high fps shouldn't be a problem for you; if it is, that's not the graphics card's fault but poor optimization
My current 1060 is 8 years old, I wonder if it will keep gaming at 1080p much longer 😂 cause probably not
just maybe but at that point just buy an xbox and play it there
1080p with DLSS, FSR lol
I would rather sit through back to back episodes of the View and Drew Barrymore show than game at 30fps.
Nah, give me 30fps gaming...I'd rather die than be forced to watch that mind cancer...
In Jagged Alliance 3 at 1080p, the A770 matches the 6800XT and is 4fps behind the 3080 😳
There's FSR and DLSS, so I don't think these specs have much impact or reference value most of the time. I don't think most players use native graphics settings to play any game. The specs have much more reference value for lower-class GPUs; a middle-class GPU is capable of handling any resolution with FSR or DLSS
Okay, my system is fine, 7900 GRE/7600X, and I only play at 1440p Ultra. The issue is games that don't have any idea about their target audience and make these mad requirements. The target audience of PC gamers (if it's even ported right and not a damn mess at launch) is people playing on 1660 Super / RX 580 8GB and Ryzen 5 3600 systems. I am sure that excludes budget gamers using Nvidia right away due to VRAM, unless these gamers suddenly grab a 1080 Ti, and the AMD budget gamers will struggle on older, tired cards that already struggle.
I'm using a Ryzen 7 5700X and an RTX 4060. I'll be aiming to play at 60 FPS, 1080P on medium quality. Let's see.
Less than 60fps should never be a thing on PC. Basically, no game feels smooth to me on PC if it runs at less than 60fps.
Thank you. I've noticed we are in the sane few now. People spend so much money on hardware and are happy with a game bouncing between 30 and 40 fps when all you do in the game is walk and jog.
Let's see if ray tracing will be worth it in this title. Cause it better have RT in all settings to demand such high requirements.
*I JUST checked the recommended specs. 10700K and an RTX 3080.*
*I have a 14700K and an RTX 3080. Soon to be a 5080. But, at this rate, I will need a new PC every 2 years....* 😑
You know what's weird? The RTX 3060 (12GB) is one gen above the 2070, and yet according to those system requirements it's on the same level as the 2070. And the best GPUs after the 30 series are the 4070 and above, but those are for 4K 60fps. How can a 4080 achieve 60fps+ at 4K when an RTX 3060 12GB (or a 3080 in their case) won't achieve that without DLSS/FSR?
Seems pretty reasonable to me for a game this pretty at this point in 2024.
This game is more like an interactive movie, so 30fps actually makes more sense for "cinematic immersion".
It's also a good excuse for not optimizing UE5. I have never seen a UE5 game running smooth, no matter the hardware...
I can't stop thinking about Peter Pan now when you go small LOL.. oh man that was funny
Damn. It looks like my 6700 XT is a 1080p GPU now if I want roughly 100 fps.
For this game you are lucky if you get 60 fps, since it's locked to 30 fps on the XSX, which uses a similar GPU
@@Radek494 Yeah but if I play on 1080p then that tends to be CPU bound and I have a great CPU.
@@genericyoutubeaccount579 And then what CPU do you have?
@@genericyoutubeaccount579 XSX doesn't run much higher than 1080p either, but we'll see
@juanmassiosare9850 12700k which is equivalent to the 5800X3D in gaming.
Seems to be a misunderstanding in many of these videos. Consoles are a controlled ecosystem where you can have some assurance of FPS. PCs have many additional factors (hardware/software/power etc.), meaning a publisher would be guessing at FPS and would be absolutely slaughtered by users not getting the stated FPS. This has been seen before, so why do it?
Why are your YouTube videos kind of less saturated in color?
Just built a 6800 PC for 1440p 60 fps for $800. Thought it was good for a few years :*(
It will be good for a few years. Let go of the idea of cranking settings; use tweak guides & your eyes to judge how the game actually looks. The 6800 is head & shoulders above the consoles, has plenty of VRAM, & the consoles are targeting a 4K output. Devs won't be making games for PC that the consoles can't run, don't worry.
If you're concerned about framerate, put a modest lock in to get low latency, keep your FOV high, and play with a gamepad. There are games where it's worth worrying about high framerates, and games where it isn't.
@InnuendoXP good points, thanks. Most games I'm playing currently have run great. This just had me a little worried for the future.
6800 non xt is a good 1440p card, 6800xt is great at 1440p, if you want it to last why not buy the xt version instead?
@@tomthomas3499 strict budget
More like for a few months 😁 just buy a console every 4 years for $500 for the best gaming experience.
I tried playing the first game with my RX 6750 XT and Ryzen 7 5700X PC at 1440p, got like 30mins into the game but couldn't stomach the constant stuttering...
Not shocked by the GPU requirements, but an i5-12600K CPU for 4K? Quite bizarre, as this is a rather "visual" experience, not Cities Skylines 2. I will give it a shot with my 7900 XT and 13600K at 1440p, and let's hope FSR is not a must in this case.
My guess is the game is just GPU heavy and the CPU isn't being taxed much at all. It can happen
I was thinking of trying this on a 4GB 3050 before seeing this. Now I'm questioning whether I should even try buying it
After Forbidden West and now this, such a contrast between the two lol. But I'm somewhat not surprised, it's the usual UE5 game with absurdly high hardware requirements, and it's probably gonna have a stuttery launch.
Forbidden West is a PS4 game, Hellblade 2 isn't
@@Spr1ggan87 And yet for a PS4 title, the graphics hold up just fine. Ask any PS5 owner and they will agree that Forbidden West is one of the best-looking titles ever released, hence why we don't really need absurd hardware requirements to achieve a good-looking game if it's optimized properly.
@@tomthomas3499 You are completely missing the point, the engine and game were built around PS4 so there are concessions made to run on said ancient tech. HellBlade 2 is not made for Xbox One or PS4 so there are no concessions made. So saying Forbidden West runs better is just stating the obvious.
Also iirc HB2 has raytracing, Forbidden West doesn't.
@@Spr1ggan87 Don't waste your time. He's gonna argue that RT is a marketing scam. Been there seen that.
Oh wow those are high requirements
not even, it's a 2024 game. RTX 3080 was released 4 years ago
@mrbeencountin and the majority of gamers don't have a card that's as fast as a 3080
@@mrbeencountin I have a 4090, just makes me think about dragons dogma 2 performance
@@DemonDog17 true
@@mrbeencountin it released closer to 3 years ago than 4, but even if we say a 3080 is 4 years old, to get that performance in the 4000 series it's a $600 card.
We know most people go up to $300 for a card, and now you add inflation in the cost of living, rent price increases, home prices soaring. Not many people are lucky enough to spend that money.
I was fortunate enough to spend $800 last time around, but now with more responsibilities, I'm not spending $1000 to get a meaningful improvement on what I have
I might buy this just because it costs $25 in my 3rd-world country... over the last couple of years publishers stopped using regional prices, so most big games come at the full $60/70 price and it really hurts my wallet. But let's wait for reviews.