Another magnificent thumbnail :D
Calm down
bespoke
when's the actual movie?
These are all good ideas for Steve to dress up as for Halloween 😂
Nothing compares to the "Steve on Steroids" video... sorry, the "Ryzen on Steroids" video.
Super weird how Steve's face fits all these.
Almost like Mr. Potato head...
Especially wukong😂😂😂😂
@@Gustaviustwinkelberry Steve always monkeying about
it's a testament to balin's photoshop prowess, and steve's face.
I think Steve probably has mixed ancestry; his face does seem to have a bit of a Mediterranean or even Turkish look to it, and of course most of his background is going to be Anglo. Mixed people are just able to blend in in a lot of places.
I knew Steve was hiding something....he's been the God of War this whole time
No, Makarenkov is THE God of War. You'll need to see him.
Why does it run so badly on AMD GPUs, especially considering there's no ray tracing of any kind in this game? It was literally developed for RDNA2 hardware originally, and it's running really badly on RDNA2 GPUs, a tier below where they usually sit relative to their NVIDIA counterparts.
Developers seem to prefer optimizing for Nvidia over AMD, unfortunately.
@@Orly74 weird because the PS5's hardware is all AMD lmao
This was bred on PS4, and that's GCN, not RDNA. It is weird, though, that it runs so badly on AMD cards.
Sad truth, AMD GPUs are garbage.
@@alistermunro7090 How old are you?
You can save around 20 GB of SSD space by deleting unused language files (rough sketch for spotting them at the end of this thread).
Good tip 👍 But still annoying as they probably reinstall after every game update...
@@noer0205 I'll have to try, but it would be nice to be able to only download the language we want to use! It would definitely save some space...
Wow, you were spot on! it is ~20GB. Thanks 👍👍
Remember when installs used to ask you which language you wanted? Pepperidge Farms remembers.
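For anyone wanting to see where that ~20 GB hides before deleting anything, here's a minimal sketch in Python of the kind of scan you could run. The folder layout and the locale-tag-in-filename naming are assumptions for illustration, not the game's actual structure, so check your own install first.

```python
# Hypothetical helper: tally per-language pack sizes in a game install so you
# can see what deleting unused locales would actually reclaim.
# NOTE: the locale tags and 'audio_fr.pak'-style naming are assumptions.
from pathlib import Path

LOCALES = ["en", "fr", "de", "es", "it", "pt", "ru", "pl", "ja", "ko", "zh", "ar"]

def language_pack_sizes(install_dir: str) -> dict[str, int]:
    """Sum file sizes per locale for files whose name embeds a locale tag."""
    sizes: dict[str, int] = {}
    for path in Path(install_dir).rglob("*"):
        if not path.is_file():
            continue
        name = path.name.lower()
        for loc in LOCALES:
            # Match names like 'audio_fr.pak' or 'voices-de.bin' (assumed).
            if f"_{loc}." in name or f"-{loc}." in name:
                sizes[loc] = sizes.get(loc, 0) + path.stat().st_size
                break
    return sizes

if __name__ == "__main__":
    # Path is an example only -- point it at your own install.
    for loc, size in sorted(language_pack_sizes(r"C:\Games\GoWR").items(),
                            key=lambda kv: -kv[1]):
        print(f"{loc}: {size / 2**30:.1f} GiB")
```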
RDNA 2 seems to underperform way too much in this game. No way a 6700 XT at 1080p in a last-gen game barely hits 50 fps, and a 3060 Ti being faster than a 6800 does not make any sense.
AMD in general is underperforming; a 3070 outperforming the 7800 XT and performing like a 6900 XT at 1080p says it all.
Yeah, something is very fishy and they'd better patch this. I had well over 100 fps in GOW 2018 at 1440p high with the 6900 XT; here it can't even crack 70, while the 3090 is still nearly at 100.
@@valentinvas6454 yeah, doesn't make sense at all
will be fixed with an update and new drivers
@@valentinvas6454 - there's currently a bug with the tessellation setting. Put it on Medium.
😂 god tier thumbnail
Of war*
pun intended xd
Literally!
I didn't even notice till u pointed it out so I had to look again. It truly is god tier
That is because Steve is the God of Thumbnails.
Steve is standing and... not upset? This is madness.
As you get older standing helps your back, a lot!
@@raptor1672 Poor diet, not age.
Pretty sure it's been hemorrhoids the whole time 😂😂😂😂
He is a calm and reasonable person.
30% performance diff between 3080 and 6800XT with only rasterization is crazy.
was impressed by that as well
God of hardwar
e
Gigachad Steve thumbnail
More like Gigagod
@@I-am-MasterChief Gigadad
The thumbnails have been on point these past few GPU Benchmarks XD
Thumbnail shows Steve's face after watching 50 hours of cut scenes 😂
truly, a magnificent "game" rofl
God forbid a game has an actually good story. Or would you rather learn all the story and lore by reading hundreds of pages of letters and documents scattered throughout the world?
@@battlekingad8291 what good story? The worst thing about this game is the story 😂
@@battlekingad8291 I cant lie the games story aint all that.
@@MaxUmbra Game got 10/10 reviews buddy. Go back to bed.
They really knocked this out of the park. I installed it on a system with a 2060 connected to a 4K TV, and upscaling from 1080p to 4K on the medium preset it gets close enough to 60 fps, which I think is just insane. The only thing I noticed off was that water splashes were jumpy, but it was totally playable and still looked fantastic (sitting a few feet away on the sofa). Hope to see more of this type of game. If it were not for VRAM limitations I think it would run even better. There are bits in the game, like crawling through a narrow gap or wall climbing, that feel like every other game out these days though. Makes it feel like the same engine was used, but who knows, perhaps they just borrowed bits, though I'm not sure why.
Thumbnails getting cooler every time
Whoever does these amazing thumbnails deserves a huge raise!
Whoever makes these thumbnails deserves a raise!
Watching the testing series over an extended period of time is really good for forming a picture of what tier of gpu I should plan for in my next build, even if it's probably going to be the next generation.
Quality thumbnail. Does Steve frequently shout "Boy!" at Tim? 😁
Is it me, or does the game feel poorly optimized for AMD GPUs? I mean, just look at the difference between the RX 7800 XT and RTX 4070S at 1080p and 1440p. Even the GRE, and sometimes the 7900 XT, falls short.
It's not that surprising. The 4070S is ~10% faster than the 7800 XT and only a couple of percentage points off the GRE, and both GOW games run better on Nvidia. In the first one the 3080 beat every AMD card.
It's even weirder when you consider the PS5 and PS4 use custom variants of AMD GPUs.
@@hopoff9968 The numbers don't line up with other videos on the 7800 XT.
I was kinda shocked at how slow the 6600 and 6700 are considering it's a PS4 title; on medium/high settings the 6700 XT is only just around 60-70 fps.
Weird, because this is a Sony game, which should be better optimized for AMD GPUs given the PS5 and PS5 Pro both contain RDNA.
For a game designed to run on AMD hardware, the Radeon GPUs perform quite badly here.
Nothing new really. 90% of PC gamers use Nvidia, and that is what developers optimize for. Just because consoles use AMD does not mean AMD hardware is superior on PC; Nvidia is often simply faster at executing the same code. I've worked with tons of game engines, and you don't optimize specifically for a GPU architecture in most cases. You write the code and test how it performs.
@@Dr.WhetFarts Welcome to the comment section, where 9 out of 10 are staunch AMD fanboys who blame everyone but AMD themselves for poor performance. It's always the developers of games and software who are somehow at fault. Praise the holy AMD, brother!
@@someperson1829 I mean, it's normal to be disappointed when a 7900xt is usually on par with a 4070ti super in raster
You lot are talking like all the other games before these last 2-3 titles don't exist, lmao
@@Dr.WhetFarts Then explain why the 7900 XT is usually on par with a 4070 Ti Super, except in the latest titles like Wukong and Star Wars, which just happen to be Nvidia-sponsored 🤔
@@thunderarch5951 Yes, yes, and the poor performance of Zen 5 on Windows was Microsoft's fault. Everyone was screaming and crying about it in the comment section... until AMD themselves sent a fix to Microsoft to include in the next update, because it was always on AMD to fix it. lol.
You guys definitely do the best benchmarks, insane amount of data.
One thing I'd like to see, which may be in a CPU benchmark video I've missed, is games benchmarked at 4K ultra with a 4090 so the GPU is the bottleneck. You once told me via Twitter that my Call of Duty Modern Warfare 2 fps were low, then deleted your comment after realizing you'd made a mistake. I want to see whether a CPU like a 9700K feeds the 4090 frames as well as the latest and best CPU does. I still believe a CPU like the 9700K is almost identical when fully GPU bottlenecked.
Something is very wrong with these results; a 7700 XT breaks 100 fps at 1080p ultra, no sweat. Something is wrong with your setup. The thumbnail is epic, though.
My 7800 XT gets 110 at 1080p native TAA ultra too; wrong readings from his setup.
I really appreciate these kind of videos, as they're helpful to gauge a buying target or to set expectations. The charts showing native render vs various upscaling techs are great. A (heartfelt) Thanks, Steve!
I have a 6700 XT and I average 80+ fps at 1080p ultra native settings... so why a 51 average in this video??
Scene tested maybe?
That's an insane amount of testing Steve. I hope you take good care of yourself. Thank you and cheers!
I see it's now a trend with most new games: "VRAM LIMIT HAS BEEN EXCEEDED", even at 1080p.
No, that is just how most game engines work. Allocation and actual requirement are two very different numbers. Nvidia easily wins in most new games while having less VRAM on average; Nvidia also has better memory compression and utilizes its cache better. (A quick way to watch actual VRAM use from outside the game is sketched at the end of this thread.)
Looks like they removed it with a patch.
Yeah, I was getting VRAM warnings on a 4080S at 4K with DLSS Performance at ultra settings (so basically 1080p ultra internally).
It's just the game asking for more memory than it actually needs; it's no biggie.
they only use like 70% of the VRAM they're given, and then for some reason decide to hog the DRAM as well
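If you want to sanity-check those warnings yourself, here's a minimal sketch using the pynvml bindings (pip install nvidia-ml-py; NVIDIA GPUs only) to poll actual VRAM in use while the game runs. Caveat: this reports device-wide usage across all processes, and it still measures allocation rather than what the game strictly needs, so treat it as an upper bound.

```python
# Poll device-wide VRAM usage every couple of seconds while a game runs,
# to compare against what the in-game settings menu claims it needs.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)  # .total/.used/.free in bytes
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(2.0)
except KeyboardInterrupt:
    nvmlShutdown()
```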
Well now we need a thumbnail with Steve as Kratos and Tim as Atreus
1) Awesome thumbnail!
2) Man, the Radeon GPUs took a hit with this game...I'm sad to see the game so unfairly optimized, benefiting team green :/
Great work. As someone who played this uncapped on PS5, I won't be double dipping. The quality of the game and the fps on console are surprisingly high, with frames hovering between 80-100 fps and looking great while doing it. Enjoy, my fellow PC brothers and sisters; it's a gem of a game.
If you didn't already own it, which would you buy? I have a PS5 and a 4090 PC. Obviously the PC will run better, but when a game runs well enough on PS5 the convenience of console is nice. Does it make use of any DualSense features?
@@Malus1531 probably the 4090 PC if you have it hooked up to a TV or your monitor does HDR properly
How is the RX 7800 XT only a bit faster than a 3070? It should be a lot faster; that doesn't make sense.
Same way COD plays much better on AMD GPUs than Nvidia. Some games play better on Nvidia and some play better on AMD.
@@19alive Sure, but the 7800 XT is two tiers above the 3070; it has no business being close to it. Something's wrong.
@@simon89oi AMD cards are just garbage.
@@linkphan761 Except the 6500 XT, absolutely not.
My king 7900 XTX isn't as powerful, but damn, I saved so much returning my 4090.
I didn't test the section you were in, but I did test the area after the intro and got very good numbers on the RX 6800 XT. Shocking to see performance drop so much later on according to your testing. Great job as always, Steve! 👏🏼👏🏼👏🏼👏🏼👏🏼
You have some serious issues in your benchmark; I get 120 fps at 1440p ultra and 140 fps at 1440p high, TAA, no FG, no upscaling, on an overclocked 6900 XT on 24.7.1, W10 22H2 without this month's update, 5950X overclocked with CO+PBO.
I haven't gotten to the "most demanding" part of the game yet, but up to the point where you reach the dwarves' home I've been getting around 95 fps at 1440p ultra TAA native res on an overclocked 6950 XT with an old i7-6950X clocked at 4.3 GHz. I'll try to report back once I get to this section of the game, but so far I've been pleased with the performance considering the age of the X99 platform.
Update: So I've gotten to the area tested. It's definitely more demanding, but the lowest fps I've seen was a dip to 63 in combat, and it usually stays between 70-75 fps.
GODlike thumbnail LMAO
I'm playing on an RTX 3070 at 1440p with ultra settings and DLSS Quality and getting around 90-100 fps. It runs even better than the 2018 GOW; solid port.
I have a 3070 Ti (16 GB system RAM) and it feels like I get a video memory leak after a while. 1440p, mix of high and ultra settings, tried both DLAA and DLSS Quality, and when VRAM reaches near 100% usage changing the texture level does nothing. Restart the game and I get 110-120 fps again; it's so strange.
Have you gotten past the starting snowy area? It only gets worse from there
@@mariochi1499 same experience. I thought at first it was thermal issues. 😢
@@mariochi1499 There is a guy with a video on how to clear the Nvidia shader cache files; most feedback says it really smooths things out.
I actually laughed out loud at the sight of that amazing thumbnail. Awesome job you guys. Steve is in so many gaming universes that it’s ridiculous. I don’t think anyone can challenge his power.
A game looking good without ray tracing?? How is that even possible?
/s
It looks good, but outdated when it comes to lighting. Objects "float" in midair due to improper rasterized shading; there is no realistic umbra and penumbra, just approximations of them, and there's light bleed literally everywhere: in every building, even under the player character's armor (it's unshaded and too bright). Just check out the eyelids of every character; there's a ton of light bleed, they're too bright and not shadowed properly. This game would look stunning with nice RTGI, or at least RTAO/RT shadows if path tracing is out of the question. The developers confirmed to DF that they plan to add some RT effects in future patches (at least RT reflections are confirmed).
@@Chasm9 I don't think most players actually use photo mode in games
@@iamspencerx I think that would be because there's a gambillion games and only a relative handful of them have photo modes. I've been quite thankful for games that have them, since many games have excellent worlds and environments that lend themselves to a wallpaper (I was kinda peeved that Elden Ring doesn't have one, I would have done something mildly unscrupulous in IRL life to get a good screenshot of the Cerulean Coast).
Also, I don't think people really get what they mean when they call stylised lighting and rendering "outdated". "Technically inferior", maybe, but you don't need superior rendering technology to make striking visuals. The entire idea of the God Ray, or fully solid shadows on characters, amongst all manner of developer s*light* of hand tricks almost entirely depend on being unrealistic to achieve Being Cool-istic.
Came here for the thumbnail. The editor made Steve a hardware unboxed official model for thumbnails
Thank you Steve and Tim for the awesome benchmark! 😉
Wow intel has some work to do on the arc cards with this game.
I now demand GPU Benchmarks Sponge Bob, Stray and Gollum.
This game is the first I've seen with an FSR 3.1 NATIVE rendering mode.
First FSR title I have used that I didn't immediately turn off. Actually does pretty well with the visuals, surprised me because I tend to be really picky.
Almost all PS games ported to PC by Nixxes have the FSR native rendering method available.
That thumbnail is magnificent! 😂 Please continue with these amazing thumbnails for all your game benchmark videos.
How is your 7800xt performance so low compared to other youtubers also using 7800xt?
Are you testing the same section? I know you're not, but let me know anyway.
@@Hardwareunboxed They're all talking about the opening in the snow, bro; they haven't played it for more than an hour. That's why they're all asking you this.
@@saiibox3174 The performance drops like crazy after the Fimbulwinter section.
Brother? 71 fps at 1080p ultra with a 7800 XT? I own one of these cards, and with those settings it does not give such low fps; it's around 120 fps. Something must be wrong in your build, because these charts are not showing 100% reality.
Are you looking at performance in the section tested? As noted in the video, we tested the most demanding section of the game we could find.
@@Hardwareunboxed So in the most demanding section of the game the 3070 is beating the 7800XT?
@@SemperValor UNFORTUNATELY, YES. If the 3070/Ti had 12 GB of VRAM it would be a very good GPU.
@@MrSeb-S What? 😂 The 7800 XT has 16 GB of VRAM, wth are you talking about? Also, as the OP said, this benchmark is fishy. I have a 7800 XT too and I'm getting an average of 100 fps. There is no way it performs similar to a 3070.
The tests should be redone. After the 1.2 patch my 6700 XT now draws full power instead of around 160 W and is easily doing over 80 fps (not in the beginning of the game) at 1440p, all ultra besides tessellation.
How am I getting more frames at 1440p with my 6700xt than you are getting at 1080p? And I was testing it in the same place as you.
Fsr?
Because your settings are not exactly the same as his
@@fightnight14 I have an RX 6800 and I get way higher fps as well. Same for my friend on his 7800 XT. No one is adding FSR or frame gen by mistake, and ultra is ultra.
Showing EVGA cards is a real kick in the nuts, Steve.
I’ll pass this one just because SBI got hands on it, it’s time to get the anti woke movement going gamers
The RTX 3070 just seems to be waking up with some of the last game benchmarks HUB has done.
I’m only subscribed for the thumbnails.
Appreciate the quality benchmarking so soon after release, but I'll be honest I'm liking and commenting purely because of the thumbnail art.. they have been consistently magnificent and whoever is responsible deserves an immediate raise
Another GOATED Thumbnail 😅, I swear y’all are getting way too good at this
The thumbnail on this one is just Epic. Too bad Tim didn't feel like tagging along and letting Steve call him "boy" :P
Are the drivers for amd cards released?
The version they tested with added support for GoWR, but it's not the stable driver branch. From my experience it's better to stick to the stable branch and just wait for patches to come out later. Running beta drivers on a daily basis leads to headaches a non-techy user really doesn't want to deal with; they're for people who want to test and report bugs, not for people who want a trouble-free experience.
They better make that a playable skin.
Keep up the awesome work!
I'm a very happy owner of RTX 3080. It's crazy that 4070 is slower than 3080 and 4070 Super isn't much faster. I bought mine for 350 Euro with a warranty. What a deal!
Real. I was just about to say the 3080 is aging like fine wine. Own one myself
Bought at release, my 3080 is the best GPU investment I've made since the 980.
@@RavTokomi Me too
I think there's an issue with the game. I've seen other channels get over 80 fps with the 7800 XT at 1440p/ultra, while Steve gets just 63. There's a memory leak and a tessellation bug in the game too, which could perhaps explain the discrepancies.
Where did you get these numbers? I got a lot more FPS on my 6800 than what you listed here.
Is Frame Generation perhaps on?
The thumbnail made me spit my water out when I noticed 😂😂
Fantastic point about the advantage of just making the install very large to circumvent some of the headaches of extreme in-game decompression. At this point I couldn't care less if a game were 500 GB on my PC if it improved performance even a little over being 50 GB. It'd be nice if the user could just choose between two levels of compression, though (toy sketch of the trade-off below).
it does seem strange to install literally every audio language instead of dropping ~20 GB off by picking one
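A toy illustration of the trade-off being described, using only the Python standard library: tighter compression means a smaller install but more CPU burned unpacking at load time. Real games use specialized codecs (Oodle, GPU-friendly texture formats) rather than zlib/lzma, and the data here is synthetic, so only the shape of the trade-off carries over.

```python
# Compare on-disk size vs decompression throughput for a light codec (zlib),
# a tight one (lzma), and no compression at all. Synthetic 32 MiB "asset":
# part incompressible (random), part highly compressible (zeros).
import os
import time
import zlib
import lzma

asset = os.urandom(4 * 2**20) + bytes(28 * 2**20)

def bench(name, compress, decompress, runs=5):
    blob = compress(asset)
    t0 = time.perf_counter()
    for _ in range(runs):
        decompress(blob)
    dt = (time.perf_counter() - t0) / runs
    print(f"{name:>16}: {len(blob) / len(asset):6.1%} of original size, "
          f"{len(asset) / 2**20 / dt:8.1f} MiB/s to unpack")

bench("zlib (lighter)", zlib.compress, zlib.decompress)
bench("lzma (tighter)", lzma.compress, lzma.decompress)
bench("none (raw)", lambda b: b, bytes)
```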
Excellent video Steve!
AMD GPUs are clearly underperforming...
(no way a 4060 is beating a 6700 XT lol)
Need an update from the developers, or for AMD to push game-ready drivers.
right? i thought the same thing
Hilarious considering this game was optimised for playstation 😅
AMD already has the driver. It's the one tested in the video. He even mentions it. Try and pay attention.
@@mikelay5360 I don't know where you got that. The team that did the PC port optimized it for Nvidia because 85% of gamers use Nvidia.
@@kazuviking The game was first optimized for AMD hardware (PlayStation). You would think it should be flawless for team red, but alas 💀
My 6600 is getting 40fps on Ultra with those lows for a game that looks as good as Ragnarok does? Man, I'd call that a result for a lower end 8GB card. Just appreciate that you've at least included it. Own it on PS5 already, but it's nice to know for sales in the future if I fancy it on Steam.
Steve is just like agent 47, any outfit or disguise would fit on him
Before anyone brings up the same tired optimization argument, this game was designed for the PS4, which is why it runs so well.
Yeah people are using the word optimization excessively.
If it wasn't clear for anyone, NVIDIA is light years ahead of AMD when it comes to game ready drivers.
What's up with AMD cards lacking in these last few releases?
The same historical reason why AMD cards underperform in most sponsored titles.
@@Pleasiotic1 is this game even sponsored by nvidia?
Honestly, those thumbnails need to be hung up in a museum somewhere 😂 just damn good work
Steve of War: Zeus!!!!!!! You can no longer hides framerates behind the skirts of Athena!!!!!!!!
Left shift + middle mouse button = Australian Rage
Petition to use this as Thumbnail template for all the videos going forward.
Optimization looks poor for this game, especially for Radeon cards.
AMD users were always the 2nd class citizens. LMAO
And this is nonsense when the console hardware is literally what this game is supposed to be optimized for, and that is AMD @@AdiiS
Yup and upscaling and frame generation seemed to be the band aid solution (glad it’s there at least)
@@mruczyslaw50 It does not work like this. PC versions of games are built using a different software stack, so it does not necessarily follow that if consoles use AMD hardware, a game should run better on PCs with AMD hardware.
In terms of CPU performance it's even less relevant, since Intel CPUs often perform better in console ports.
Cry harder. There will always be the negative crowd no matter the condition of the game at release.
The thumbnails are getting ridiculously good
Okay, so how are the Arc A770 and RTX 2060 Super part of the recommended specs when the A770 only gets 34 fps at 1080p medium with TAA on, and the RTX 2060 Super isn't even in your test suite? I have an RTX 2060, so I thought I was in the clear. Can we get a video testing Ragnarok with the Steam-recommended hardware (Ryzen 3600, Arc A770, RTX 2060 Super, and RX 5700) to see what settings it would take to get them running at 1080p 30-60 fps? I feel like these benchmark videos are incomplete when you leave off the actual recommended hardware; even if you decide it's unplayable, some of us would like to see how it performs.
It's pretty easy to extrapolate (rough sketch below).
@@joeykeilholz925 Not really. I've watched other reviews since then of the base RTX 2060 6GB, and the game gets much better numbers even with FG off. Still, as one of the leaders in the tech space, I would like channels like Hardware Unboxed to include the actual recommended hardware. I also build and repair PCs myself and have for over a decade, so I understand what goes into benchmarking and how different hardware and settings can change test results.
I got the answers I needed; I only hate that Hardware Unboxed chose not to test important hardware and settings. PC gaming isn't just for enthusiasts. I build PCs for all levels of gaming.
@@notatechguy1209 Reviewers can no longer give us that much information. Nowadays a number of factors come into play just to publish a reasonable benchmark test. Everything costs time and money, and behind each test there's the next one waiting to be done. Please keep that in mind with this profession.
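For what it's worth, here's the kind of back-of-the-envelope extrapolation that reply gestures at, sketched in Python. The relative-performance index values are made-up placeholders, not measurements, and Arc in particular scales unpredictably from title to title (as this very video shows), so treat any output as a rough guess at best.

```python
# Crude first-order fps estimate for an untested GPU, by linear scaling from
# a tested one via a relative-performance index (e.g. from a meta-review).
# These index values are PLACEHOLDERS for illustration, not real data.
RELATIVE_INDEX = {
    "RTX 2060": 100,
    "RTX 2060 Super": 113,
    "Arc A770": 125,
}

def estimate_fps(tested_gpu: str, tested_fps: float, target_gpu: str) -> float:
    """Assumes fps scales linearly with the index -- often wrong for Arc."""
    return tested_fps * RELATIVE_INDEX[target_gpu] / RELATIVE_INDEX[tested_gpu]

# If the A770 measured 34 fps (the number quoted above), a 2060 Super might land near:
print(f"~{estimate_fps('Arc A770', 34.0, 'RTX 2060 Super'):.0f} fps")
```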
I'm getting around 80-118 fps with a Pulse RX 7900 XTX and a 5800X, 64 GB (2x32) 3200 MHz RAM, native AA at 4K, all max. People are still taking these benchmarks so seriously and I don't know why; they shouldn't be a reference. The game works so well, pretty smooth and stutter-free.
They literally said they were benching the most demanding section of the game. The first section is much less demanding and you get more FPS there.
@@linkphan761 Generally speaking the game works pretty well; it's very well optimized, and that's what we want. Thanks to Sony, as usual, unlike GAYPUNK 2077.
These AAA games are complete garbage, but you guys still put the work in doing benchmarks and optimization guides, gotta respect that, you guys are great.
Also the thumbnails are absolutely fantastic.
Game got 10/10 reviews buddy. Go back to bed.
I think most people that played it really liked it.. What's with this dumb hate on every single big name game. Just try them out.
@@BladeRunner-2211 Doesn't mean it's good.
Reviews are just marketing at this point; all the major reviewers get paid to shill shitty games, so it's not something you would ever consider.
Matter of fact, if it's 10/10 in reviews, it's a 6-7 at best.
How is this game garbage though?... like, this comment just seems out of place and unnecessary.
@@gregorB92 you realize that even user reviews are wrong most of the time because of bots that inflate the scores to make it look good???!
Anyway glad it's on pc now so you and anyone who liked it can play it
I think the data is mostly correct for RDNA3. I have an XTX myself, and it does appear that internal latency becomes the bottleneck at lower resolutions in games that mostly rely on classic raster techniques. That said, the RDNA2 performance is completely broken. The Navi 21 derived products should rip and tear, especially as the resolution goes down, since the Infinity Cache becomes more effective at lower resolutions.
Optimized my ass it runs 20% slower on AMD cards
And? Radeon is a tiny bit of the market so it makes sense.
It was the opposite situation with Ghost of Tsushima, so it is what it is. Different game engines prefer different hardware.
@@christophermullins7163 Ahem, the game was developed for consoles, which have... surprise, an AMD APU. This makes no sense.
@@retrosimon9843 Which does not use the same driver stack or graphics APIs that dedicated GPUs use. They couldn't be further from each other despite having similar hardware.
This is why I bought nvidia lol but not for sony games
0:45 Great looking game without ray-tracing: music to my ears
Here we can see what happens when AMD doesn't update their drivers in time for a big game release. As a 6700 XT owner, they are ALWAYS so slow when it comes to driver updates and bugfixes. I've been having a strange FreeSync brightness flicker bug for a year; I never had it before with the same GPU (and same monitor), and I didn't even have it when I had Nvidia (same monitor, also G-Sync compatible). It was introduced by a driver update and they never fixed it (I tried everything I could).
the thumbnail = win.
The 3060Ti beating out the 6800 and the 7700XT is wild. And as an owner of the 3060Ti I’m very happy 😊
2022 game, needs a 4070 for 1440p 60 fps, ok
a PS4 game yeah 😂
The PC version is a 2024 game, not 2022.
2013 game
You do understand that consoles aren’t running ultra settings right? Turn them down to medium and yall get your precious 120 fps.
@@jimmyramos1989 ps5 is ultra in quality mode and ps4 is low settings
Ohh, I can't remember the last time a video had this good of a thumbnail.
AMD getting beat by Nvidia in new AAA games. Wukong, Outlaws, Space Marine 2 and now God of War ragnarok.
Steve be making us worried by standing at the beginning of the video.
7600 CPU and 7900 XT GPU here. Getting 120 fps at 4K ultra with FSR 3.1 and frame gen. Like butter. Just saying so people know it's not all that shit.
😂 we hear you.
Who looked at this game, and the benchmarks, and called it shit?
Again, I'm here to not watch the benchmark, but to say the thumbnail really nailed it.
What happened to RDNA in this title? The 6800 XT is supposed to match the 3080 😢
Runs great on my 5700 XT and my 6800 (W10).
I have everything maxed on my 3440x1440 screen, FSR 3.1 native, and I'm getting 70 fps, which is enough for this kind of game.
The only thing Sony has to learn is how to make a proper audio configuration.
I have a great sound card, but as with most Sony PC games I'm only getting stereo sound whatever I do.
They screwed AMD in this one. 😂
A game designed for AMD hardware still runs 50% faster on the 4090 than the 7900 XTX lmao
Did they really screw AMD? The 4090 is in a whole other tier than the XTX. This performance delta is the expected norm.
@@RicochetForce 4080?
@@phm04 4080 and XTX trade blows and swap places in most benchmarks
Thumbnails have been top tier recently.
AMD's 6000 series are underperforming in so many new games.
RIP finewine tech.
Careful, you'll get bashed by the diehard AMD fanbots.
This test is wrong; idk where he got these numbers. Just check out any other benchmark on YouTube.
And look at Intel Arc... yikes
Just came here to comment on how good the thumbnail is. Gj editor.
People are accepting this nicely but... geez! A freaking 3070 or a 6800 XT to play a PS4 game at 1080p? WTF
I highly doubt that the PS4 ran the game at native 1080p.
Most likely it ran at a much lower resolution and FPS.
720p@40FPS at low graphics possibly.
720p upscaled and everything on low for 30fps on ps4, it's simple lol
Bro, what? The 3060 and 6700 XT are around 60 fps at ULTRA settings, 1080p. That's better than the PS5 plays it.
@@zeubiflex6230 According to Digital Foundry, GOWR on the base PS4 runs at full 1080p with no upscaling or dynamic resolution, and the visuals are nearly identical to the PS5, except for minor details and framerate differences. With that said, you really shouldn't need a $600 GPU to max out a cross-gen console game with barely any graphical improvements at the same resolution.
@@Knoah321 According to Digital Foundry, GOWR on the base PS4 runs at full 1080p with no upscaling or dynamic resolution, and the visuals are nearly identical to the PS5 except for minor details. With that said, I think we really shouldn't need a $600 GPU to max out a last-gen game at the same resolution with almost no graphical improvements.
Don't know how your benchmarks always differ from the ones where I see people actually playing lmao
Most of those are fake.
Another L for Radeon owners, lower than expected performance.
It's odd given PS5 has Radeon
@@Radek494 Maybe the FineWine™driver team is sleeping?
@@KimBoKastekniv47 Let's wait for 24.9.1 and see.
The first God of War (2018) on PC is the first game that gave me a "Whoa!" with my 4090 being a LOT faster than my previous card, the 3090.
AMD showing how shit they are lately; worst purchase of my life. How to become an Nvidia fanboy: buy AMD.