In Poland, a new 7900 XTX is half the price of a used 4090; new 4090s can be 3x more expensive. It's hard to believe. It's sad that AMD is giving up on high-performance GPUs in the next generation.
Bro, really? The RX 8800 XT will be amazing, with RTX 4080 performance while being nearly 2x cheaper. That's enough for a one-generation break from the high end. RX 9000 will have high end again. Also, RX 8000 GPUs will get FSR4.
@@MacTavish9619 What do you mean 4080 performance bro? 4080 is on par with 7900XTX across the games. Why not say that it will be on par with the current AMD best card instead?
@@someperson1829
I guess it's possible that it will be pretty much equal to the 4080 on all fronts, including ray tracing and path tracing, where the 4080 usually if not always absolutely crushes the 7900 XTX.
Highly unlikely, but possible.
@@MacTavish9619 But a 5080 will beat a rx 8800xt. So he is right AMD has no answer for the 5080 and 5090
Wow bro
This is proof of a truly optimized game 👌 very nice performance on both
Indeed
Price difference is insanely huge. I'm about to buy an RX 7900 XTX for €850; it's even cheaper than any 4080 while doing quite similar if not better
Not in Raytracing if that's your thing.
@@GameslordXY its not
@@GameslordXY Soon there will be no option to turn it off.
@@arenzricodexd4409 With consoles using more ray tracing, especially the PS5 Pro, you've got a good point
@@arenzricodexd4409 Yeah, yeah. They're saying that for about 5 years now xD Also 8800 XT will have far better RT support than RDNA 3 and 2, so stop yapping.
i'm glad to see this game uses FSR 3.1
Yes and it's properly implemented unlike Cyberpunk 2077.
@@TerraWare
Cyberpunk is Nvidia's playground so no wonder.
So it could be deliberate.
Incompetence?
I guess it could be a little bit of both.
@@GameslordXY incompetence.
@@TerraWare You're wrong, Cyberpunk doesn't have the FSR 3.1 upscaler; it actually uses FSR 2.2 with FG
you are the kid who had the xbox 360 and the PS3
I was more of a 360 person but yeah eventually did get a PS3 too lol.
Vsync is disabled with FG.
The solution is to activate Vsync at the driver level. Then the game is super smooth with FG.
Doesn't vsync add more input lag?
@@eriuz it could reduce the lag instead
I love the RX 7900 XTX because it shows more depth and richer color quality in the picture. I know Radeon uses AV1 too
@@SoulSignalExperience 🤣🤣🤣
What are you talking about here? 😂
This is some next level crazy :D
Intel, Nvidia and 7000 series AMD GPUs all have AV1 support. This is next level copium.
@@fallenlegion1828 He's referring to the deeper color contrast compared to the 4090. It is slight, but there is a difference. AMD has always had slightly better image quality, not exactly sure how, but I have noticed it over the years. Might have to do with their pipeline layout, though I'm just speculating at this point.
I was going to do a video checking out the RX 7900XTX and figured may as well make it a VS video out of curiosity.
What psu u have, atx 3.0?
@@SoulSignalExperience Am currently using an EVGA 1000GT
Frame gen is worth using in single player games for lower temps. Just need to cap the fps to 4K 120Hz. No reason to cook your silicon and make the Air-conditioner come on but that's just my opinion as an avid AFMF2 enjoyer.
@@quintrapnell3605 I've just been using DLSS Quality at 4K with 120hz vsync on the OLED TV. GPU pulls around 260 watts CPU 60, total system power is around 350ish
9:45 Did the 4090's DLSS shimmer worse than the 7900 XTX's FSR?
I don't know, but unless you're getting CPU bottlenecked, or unless the frame-rate is super high even with no upscaling, you should pretty much always turn upscaling on when running at 4k.
Hard to say, they both shimmer when moving the camera. FSR is a bit blurrier so it probably works to its advantage in this particular case. You can see in the conclusion around 10:50 if you pause and look at the work artistry in the office, DLSS has a bit higher detail but FSR does pretty well too.
No, it's more detailed with DLSS, while in the FSR frame the reflection is all shimmery
@@OmnianMIU I had a hard time trying to see any differences. I'm pretty sure there are, as DLSS is better, but even on a 4K 55-inch screen I only noticed the water shimmering. Shimmering is usually the easiest difference to spot in comparisons; the rest of the detail is harder, at least for me.
@@OmnianMIU do you know DLSS is on the right side? hehe
7900xtx vs 4080super 🎉
Don't have one unfortunately but 4080 Super is around 20% weaker than 4090 so should be pretty close to the XTX in practice.
@@TerraWare And they are the same price.
Thanks for this comparison @TerraWare. Can you keep the placement of names (in the title), thumbnails and on-screen footage consistent in the future? Right now you have the 4090 on the left in the title and thumbnail but its footage is on the right. This can be confusing
Yeah that bothered me too, I created the video first as always. Had the 4090 on the right initially in the thumbnail but didn't like how the RTX and XTX in the names lined up so I swapped them. Will have to plan it out a bit better next time. I value symmetry if I can when making thumbnails.
Hey bro, wanted to know your opinion on how FSR fares vs DLSS in this game?
Also in general is FSR close enough to DLSS in most games? I am choosing between Nitro 7900XTX and TUF 4080 Super and the XTX seems a bit faster in general but not sure about giving up DLSS.
Also how is the performance with the new AMD preview drivers for Ragnarok?
FSR does a pretty good job here but DLSS is noticeably better in majority of cases.
Very good video! I hope they will fix Nvidia frame gen
Yeah me too
It works fine on my 4090, tho 🤷♂
@@Chasm9
Does it?
I've seen another benchmarker who was livestreaming this game on a 4090 and nothing happened.
His name is Santiago Santiago.
He even tests it at 8K.
Bought my RX 6800 last weekend, downloaded Ragnarok yesterday and have been playing from this morning until now... definitely nooooooo complaints. What a game
Hi my friend!! Very complete video, a good one, thanks, we learn everything about each option!! I have a 7900 XTX, I'm sure I will enjoy the game!! Is it a CPU-heavy game, or will it run well on my 5800X3D? Have a great weekend, always glad for you
5800X 3D should be good for 120+ fps no problem.
@@TerraWare Thanks my friend for the information
7900XTX = 4080S, not 4090!!!
$1000 extra dollars for less than 20fps lmao
@qwerty-dm8gr I paid $1000 AUD for my 7900 XTX; a 4090 is $3700 AUD...
the difference in the amount of used vram is insane between nvidia and amd in this game, the 4090 barely wants to use its vram while amd goes full throttle, not like the 4090 doesnt have enought
What a beautiful game, this is how you make a truly great game.
Your benchmarks are very professional. You should have more subs
Hopefully in due time. Thank you.
Great performance on both, though from looking at the frame time graph, I saw that the 4090 seems to have a rather large frame time spike.
Forget frame generation and dlss, just put it on TAA and quality, which is a resolution scale of 80. I struggle to find any visual difference on my 32 inch oled
So, an ADDITIONAL 1200$ CDN gets you +20 FPS. Man Nvidia's pricing is beyond ridiculous.
It's more of a 4080 Super competitor anyway which should be pretty similar performance to the XTX here for similar price.
It would be great to compare 7900 XTX vs 4080 S.
4080s is slightly better for the same price
@@urkot79 I'd still like to see the comparison. I'm not someone who cares a lot about ray tracing, I just want to see a comparison of the raw power of both cards.
7900XTX was draining like 450 watts in the first test??
fyi: the game does not use rt, it uses planar reflections, duplicating the geom below the water with diffuse, compared to rt reflections of the same calibre its about 8x faster
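For readers curious how that works: a planar reflection mirrors the scene geometry across the water plane and renders the mirrored copy, with no rays involved. A minimal sketch of the mirroring math in Python (the plane and point values are made-up illustration numbers, not from the game):

```python
def reflect_across_plane(point, normal, d):
    """Mirror a 3D point across the plane normal·x = d.

    For a planar water reflection, every vertex of the geometry is
    transformed this way, then the mirrored copy is rendered (often
    with a simplified diffuse material) to fake the reflection.
    `normal` must be unit length.
    """
    # Signed distance from the point to the plane
    dist = sum(p * n for p, n in zip(point, normal)) - d
    # Move the point to the opposite side of the plane, same distance away
    return [p - 2.0 * dist * n for p, n in zip(point, normal)]

# Water surface at height y = 1.0, normal pointing up
water_normal = [0.0, 1.0, 0.0]
water_height = 1.0

# A vertex 2 units above the water mirrors to 2 units below it
print(reflect_across_plane([4.0, 3.0, -2.0], water_normal, water_height))
# → [4.0, -1.0, -2.0]
```

Because the mirrored pass re-renders real geometry, its cost scales with scene complexity rather than ray count, which is the kind of saving the "~8x faster" figure in the comment refers to.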
Uuuuum what?? How did this fly over my head lol, had no idea Ragnarok was coming to pc XD. I'm a huge fan of the older games and I liked the storytelling approach on the new one tbh, hope Ragnarok is fun as well cause I'm getting this puppy.
Fsr does look amazing here btw, CDPR take notes smh. Thanks for showing us man, looking good!!
It's pretty good, although I felt the story could've been better. In the sequel the gameplay does expand, as do the many areas you visit, and maybe you'll like the story more than I did.
@@TerraWare Yea I heard that the story is kinda mediocre at best which really sucks... Just finished installing it and if it's as bad as they say I'll just refund it after a couple hours. Steam ftw lol.
I'll be honest though I wasn't THAT impressed by the first one either in terms of storytelling. Had some really annoying moments (looking at you Atreus) but I just found the new take refreshing.
I don't expect much from games such as these, just a chill time with great visuals and acceptable storytelling and plenty of hot chocolate on the side to enjoy them with.
You know, a cinematic experience.
If I want deep lore and combat there are games which are way more proficient at offering both at the same time but you know, gotta have myself some cinematic chill from time to time :P Hopefully I won't have to refund it
@@constantinesoldatos Story's not that bad lol
One can burn down your house but is faster, vs the other that won't... Hmm, which one should I choose... ;)
DLSS frame gen works fine on my end. Not sure if it's a widespread issue as you hinted, then.
Also, have you had a chance to try the HDR in this game?? It's absolutely fantastic!! Supports HDR system level calibration, has NO black level raise, and supports wide color gamut BT2020. It's otherworldly
DLSS FG not working is pretty widespread. I made a short about it and lots of comments say the same; my buddy over at Mostly Positive Reviews couldn't get it to work in his review on a 4070 Super either.
I did test HDR and mentioned it in my previous video. I'm not equipped to test HDR with proper tools, but I said it looked fantastic, so I'm not surprised.
The 4090 consumes almost 75 watts less for 25% more performance. No wonder AMD does not want to compete with the 5090; it would need 200 watts more power, making it a 700 to 800 watt card, and would still lose in RT, CUDA, upscaling quality etc.
I also feel like the CPU is a bit of a bottleneck when DLSS is enabled.
For the price difference the 7900 XTX is a way better buy! But sure, the 4090 is a better card, since it's 2x the price of a 7900 XTX (for 25% more frames)
Playing native 5120x2160 ultra on my XTX, it's a blast
fascinating with the frame gen.
DLSS3 works for this game, you just need to reboot it.
Pay attention to the small details: the axe, the gear, the wrinkles on the back of Kratos's head, the bridge textures. They look better on the Radeon!
You should calibrate your frametime graph better; the frametime fluctuates quite a lot but it barely shows on your graph.
20% fps boost for 100% price 😊
Maybe check out the memory junction temperature with HWinfo or GPU-Z when the XTX is at 100% utilization. That's where it throttles.
Btw theres something odd with FSR FG with nvidia GPUs here... I noticed he gained only 30fps on the 4090 and wanted to test it out myself with my 4080
4k DLAA medium preset FG OFF - 110fps
With FG ON - 140fps
4k DLAA ultra preset FG OFF - 95fps
With FG ON - 125fps
It IS always +30fps😂😂
@joselejos You're right, I've been playing around with DLSS too and it is +30 FPS. I tested this game on my 6800 XT, CPU-bound with a Ryzen 3700X at 70 fps; turning on FSR FG goes up as high as 140
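One way to read that constant +30 fps: frame generation interleaves one generated frame between every two rendered ones, so only half the displayed frames are actually rendered. Here is a quick sketch of that arithmetic, using only the figures quoted in these comments; it suggests the rendered frame rate actually drops with FG on, which points at overhead or a CPU limit rather than FG "adding" a fixed 30 fps:

```python
# (fps with FG off, fps with FG on) as reported in the comments above
runs = {
    "4080 4K DLAA medium": (110, 140),
    "4080 4K DLAA ultra":  (95, 125),
}

for name, (fg_off, fg_on) in runs.items():
    rendered = fg_on / 2               # half of displayed frames are generated
    gain = (fg_on / fg_off - 1) * 100  # displayed-fps gain from enabling FG
    print(f"{name}: +{gain:.0f}% displayed, but only "
          f"{rendered:g} rendered fps vs {fg_off} without FG")
```

In both runs the FG-on rendered rate (70 and 62.5 fps) is well below the FG-off rate, so FG is costing real rendering throughput here.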
So you mention the power efficiency and performance difference but not the price difference, which is almost 50% between AMD and Ngreedia.
Kind of a disservice towards us gamers.
Also, even AMD themselves have said it many times: the 7900 XTX competes with the 4080, but who cares, right?
@grtitann7425 I said all this in my video, the price and everything. It is a 4080 Super competitor and it would perform very similar to a 4080 Super in this game at a similar price, with the 4080 being more power efficient, with better upscaling, RT and all that.
DLSS frame gen worked fine for me until I got to the first realm you travel to, then one area simply turned it off... Also, Reflex and HDR keep turning off every restart of the game...
I've seen some comments saying DLSS FG works for them, but the overwhelming majority claim it doesn't, including Mostly Positive Reviews' video on the game.
Haven't had any issues with HDR personally. It looks fantastic with HDR.
7900xtx FTW !!!
Really sucks that AMD isn’t making high end cards anymore. Not saying it would compete with the 5090 but it would’ve been the best price to performance card like the 7900xtx is.
Change that from "anymore" to "the next generation". We will see RDNA 5, which will have a high-end GPU.
Agreed. Wish they'd compete in the high end next gen but they could still make good improvements with RDNA 4
Their best will still probably beat the 4090 without problems in raster.
And if priced right, with massively improved RT & PT performance, it could be quite popular among well-informed enthusiasts and semi-enthusiasts with relatively limited budgets.
I doubt 4090s will drop in price significantly when the 5000 series is finally unleashed.
Bro this is your life?
Hope AMD could release fsr4 earlier
Why didn't you compare the upscaling quality?
Maybe I should've, but based on what I've seen, and what you can see in this video if you pause, DLSS reconstructs fine detail a bit better, but FSR looks great too. Better than in other games imo.
Please do not overclock your RTX 4090. If your RTX 4090 is an OC version with a 2565 MHz stock core clock (meaning +0 MHz), it will typically boost to 2760 or 2775 MHz, while my non-OC RTX 4090 has a 2520 MHz core (+0 MHz) and typically boosts to 2715 or 2730 MHz. 😉
@@ricarnuninho80 But his 4090 runs fine at 2940 MHz, so why not?
@@i3l4ckskillzz79 OK, I don't begrudge anyone who wants to overclock their GPU. But if you keep a permanent OC on it, your GPU may not last as many years.
It depends on the card and temps. My 4090 will boost to 2955mhz and during the winter it will boost to 3050mhz. This is w/ power limits at 90% no undervolt or OC.
It doesn't make much of a difference anyway; OC or no OC, it's around a 5 to 10% advantage at best. The 7900 XTX can see some nice gains, up to 15% in some cases I have found.
Nvidia and AMD work differently when it comes to clocks. You need to measure power draw and utilization to estimate whether the GPU is at full utilization. It's weird, but it's why the 4090 can sometimes sit below 300W while still boosting at maximum clocks.
AMD’s motto seems to be 'the past is the future.' Anyone buying AMD GPUs really needs to sharpen their critical thinking skills. With their low-quality GPUs, AMD will never get my money, no matter how much exposure I have to their products.
Got a 7800 XT and there are no issues. So what's this poor quality you talk about? Whisper quiet and everything runs nice and smooth.
@@hornantuutti5157 I'm not referring to poor quality in terms of weak construction or performance, but in recent years, two key features have been introduced that are truly game-changing: ray tracing and AI rendering. Ray tracing is almost as revolutionary as the shift from 2D games to 3D polygons, and AI rendering is what makes it all possible. However, AMD's marketing often pushes the narrative that traditional rasterization techniques are still the best, all while they quietly work to close the gap on these new technologies. Worse, they do this behind the scenes, using paid YouTubers to spread misinformation, which doesn't seem fair. Intel, for example, doesn't pay 'reviewers' to do this kind of thing.
In my opinion, it's time to properly test AMD's RX 7000 series against NVIDIA's cards, especially since they now have dedicated hardware for ray tracing and AI. There's no reason to avoid direct comparisons of how both companies handle these newer technologies.
This video does make a fair comparison, though.
4090 ❤
In the FSR 3 frame generation benchmarks the 4090 is CPU bottlenecked
I look at the watts of both cards and start to cry when I see the AMD 7900 XTX number... it's around 60-100 watts more. That is so bad. I would buy a 4070 Ti Super or a 4080 instead of the 4090. With that poor power management, never AMD.
I do have both cards manually overclocked, and the RX 7000 high end can consume a bit more power then, but yeah, RTX 4000 is quite power efficient.
Oh come on dude.
Your bills won't be massively higher.
@@GameslordXY they will be in some places, not massively but enough to care
holy crap that power efficiency on the 7900XTX is crazy!
460W @ 88 FPS vs 400W @ 108 FPS.
5.2W/fps vs 3.7W/fps. That's a whopping 40% more power efficient!!
He said it's overclocked...
It's overclocked; that adds about 50 to 60 watts
@@TerraWare both cards are overclocked, yes? Then the power efficiency comparison I made stands. Overclocked, the 4090 is 40% more power efficient than the 7900 XTX.
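For anyone double-checking the watts-per-frame claim in this sub-thread, here is the arithmetic on the figures quoted above (460W @ 88 fps for the XTX vs 400W @ 108 fps for the 4090, both overclocked):

```python
# Figures quoted in this thread for the two overclocked cards
xtx_watts, xtx_fps = 460.0, 88.0   # RX 7900 XTX
rtx_watts, rtx_fps = 400.0, 108.0  # RTX 4090

# Energy cost per displayed frame (lower is better)
xtx_wpf = xtx_watts / xtx_fps  # ≈ 5.23 W per frame
rtx_wpf = rtx_watts / rtx_fps  # ≈ 3.70 W per frame

# The XTX spends about 41% more power per frame in this scene
extra_pct = (xtx_wpf / rtx_wpf - 1.0) * 100.0
print(f"{xtx_wpf:.2f} vs {rtx_wpf:.2f} W/frame, XTX uses +{extra_pct:.0f}% per frame")
```

So the "40% more power efficient" figure for the 4090 is roughly right for these two overclocked configs; stock cards would land somewhere else.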
Can I run this game on a GT 710?
People buy amd??
@@wayne0320 Yes, because many don't want to spend an extra $1000 so FPS goes from 90 -> 110
We have brains
My 7900 xtx was constantly crashing due to "driver timeout" but once I uninstalled AMD adrenaline software I havent had one crash. I cant believe AMDs own software was causing this... I regret not getting Nvidia.
Not only does the RTX 4090 provide a much higher framerate than the RX 7900 XTX, it also consumes less power and runs cooler. Just brutal. RIP AMD🎉
LoL, the 7900 XTX is getting destroyed here, a lot fewer fps while its power consumption is 12-30% higher than the 4090's
Lol! You fanboys grasp at straws just to make your point. Pathetic.
of course the 4090 is better, but it is also much more expensive.
Imagine spending 2x more for a 4090 to get 20% more fps xD
Also the XTX in this video is weird af. Even a 7900 XTX Sapphire Nitro+ won't use more than 408-420W and his is using up to 480W. HOW? This seems impossible.
@@MacTavish9619 This is what people ignore. I would literally have to spend $1000 more for a 4090 for an extra 20%; that is not good value for money. My XFX Magnetic Air 7900 XTX uses about 390 watts.
@MacTavish9619 Imagine spending 1k$ and getting FStRash, sh1t performance in RT, crap performance in workloads and tons of driver issues. Please
7900xtx = 460+ watts? 😅
RTX 4090 = 98% utilization (98% for a 4090 means much more headroom). I have one.
7900xtx = 100% no headroom.
Frame Gen. Meh.