@Thejacketof-huang Your entire youtube channel and every single comment is based around fanboying an Nvidia card. This is giving off some real strong "i saved up my allowance money to buy 4060 so i gotta seek validation" energy
I have been criticizing AMD for dissapointing us and not competing with Nvidia for a long time but thank you for reviewing this card like this. 7900 gre definitely seems an attractive option from the red team.
I still think it's too little too late, here in the UK you can get 4070 supers at the same price as the 7900 GRE, and although the 7900gre does win in some raster titles, I think the overall features package makes the 4070 Super the better choice
@@frankwainwright7826Yes, I think ray tracing is the future and there will be more and more RT only games like Avatar (which is sponsored by AMD by the way)
@@JimmyJr630 I honestly don't think 12 gigs of VRAM won't be enough for 1440p for a few years still. It's at 4k that textures are getting to the point where anything below 16gb is getting unusable. I'd rather get all the extra features and the efficiency Nvidia offers than just rasterization performance from AMD. Just different priorities/pov from me, I guess.
@@viniqf Alan Wake becomes vram limited at 1440p at max settings, soon more games will follow suit. The features that Nvidia has are not really worth it. DLSS just increases latency and frame gen just closes the gap that AMDs increased performance creates. And ray tracing i shown to be good for the 7900 GRE so it’s not like they have much of a lead on ray tracing and are beaten half the time.
@@lifemocker85 very much so. Nvidia needs to be having people buy cards every single generation. Even Intel's cheapest 16GB with a 256 bit bus isn't even close to being $500.
This was really insightful! Based on the 4070 vs 4070Super video, it seems like the 7900GRE might even be trading blows with the Super in rasterized performance. Super excited to see the next video!
My initial impressions for this card was really bad because hardware unboxed reviewed a china only reference model. Seems the partner models clock higher for a significant fps boost that actually makes this compelling.
Excellent work, Daniel. I agree that the 7900 GRE is the better card at $550, but the Powercolor Hellhound version you used is $580, which is the same price as your 6950 XT Merc 319. It seems that the 6950 XT and the 7900 GRE deliver nearly the same performance, the same VRAM, and the same memory bandwidth. Even the power draw is almost the same! The 4070 is out of my considerations for the reasons you listed, and now I'm trying to decide between the 6950 XT and the 7900 GRE. Would you be willing to compare them? No one on the internet has done this, and I don't think anyone ever will unless you do it. If you could, please display the fan speeds so we can estimate noise levels. Thanks a million!
I was looking at 4070 vs 7800xt vs 6950xt back in 2023 at september and buy 6950xt at August, was bad one I got as 21000 graphics score time spy then overclock and undervolt got stable 365W usage and score of 23300. 23990 highest score I got as overclock unstable. 7900gre get 21200 score as avg is really good, it's basically same as 6950xt, go for 7900gre for longer support. They are good as 4070ti almost.
I personally found your 7900GRE more informative and showing better variances from the 7800XT than HUB with the difference being 2-3% 🤷♂🤷♀🤦♂🤦♀. The 7900GRE is, imo a decent option despite being an Nvidia GPU user.
HUB just retested with Sapphire GRE and got noticeably better results than the reference AMD model they tested before. Seems like raised power limits and better cooling help AIB models quite substantially.
@@TheWoWBane I have both the GRE and 4070 Super. I can use RTX in Cyberpunk 2077 with RT-ON. But with RX 7900 GRE, the compromise is not worth enabling RT. I can turn ON shadows, etc. Both cards are from AiB partners and both are OC out of the box.
One number that caught my attention was the 280W, which is around the same power draw as the 4070 Ti Super, It'd be interesting to see a comparison from the perspective of "which company makes better use of those 280W
It's Nvidia, easily. I however find that it doesn't matter too much in large tower desktops. Now for smaller form factor PC's the lower profile versions of Nvidia GPU's are probably more desirable overall.
I think 4070 Super vs 7900 GRE, either one you pick you'll be happy. I wonder if this is gonna be the strategy from now on, up the CUs to match raytracing on Nvidia gpus next generation.
@@lifemocker85 not a gpu reviewer but I have 2 systems. 3080 12gb, and 7800xt. I play on 4k high settings with upscaling performance-quality (80 fps my target). I’ll experience first hand the short comings of VRAM. So far, I prefer playing on the 3080. It’s faster, performs better on 4k, and for some reason, Overwatch feels more responsive with reflex as opposed to anti lag.
@@Slambear faster I would say on par, depends on the game anyway. Difference is that 3080 has already lived long enough so it's kinda ok for the VRAM issue to be slowly appearing. On a 4070/super it's less ok. It was great for those people with 3070/ti as well...VRAM bottlenecking is already happening on selective title with 12gb on 1440p. But yeah overwatch and games like that it doesn't really matter. Maybe anti lag+ which was removed will help on AMD side.
with the 10gb model the vram is a problem but he has the 12gb model so yet he is to find problems because for now unless 4k there are no problems. the 3080 12 gb is also a bit faster aswell comapred to the 3080 10gb so it is around the same to slighty superior
I notice in Ratchet and Clank you show forty something FPS on the 4070 at 1440 with RT. Previously you benchmarked the 4070 with about 64 FPS at 1440 very high with RT. (RX 7800XT vs RTX 4070 vs RX 6950 XT- In the newest games!!) Techpowerup also shows about 70 FPS at 1440 highest with RT (custom scene). So 60-70 in other tests. Your current numbers are significantly lower. What's up with the discrepancy?
😱 you caught em red handed huh! Seems like an interrogation and not a question! If you want specific numbers and want to call people out for them, then your critique numbers shouldn't say you mentioned 40 something on other reviews! Let's be real here.
Maybe I'm old but all I ever want is smooth 60fps with good timing. People nowadays wanting 80+ fps is overkill in my opinion. Yeah I acknowledge it can help in a professional competitive environment, but c'mon, 99% aren't a pro. lol
@@etchieSketchie80fps+ isn't overkill, going over 120fps+ is IMO(for single player definitely). Then again 120fps is the max I would want but I'm fine with just 60-100fps for pretty much any game. Plus it'll be easier to run as well.
Most certainly depends on what you're playing and how fast your monitors response time is. Even back in the CRT days (where response time wasn't an issue and the internal processing time of the monitor was essentially 0ms) I preferred 75-85FPS in action packed games. Sure, "slow burn" investigation/puzzler/adventure games and RTS games often benefit from higher visuals at the expense of framerate, but racing sims, shooters and combat games are often so fast paced that you don't even notice the visual quality hit from dropping from Ultra to High, but you sure as hell notice the boost in responsiveness.
I was hyped for this card, however a few things turned me iff from it 1. The wild difference in performance from different manufacturers based on clock 2. Reports of thermal throttling in some cards 3. The price on some cards puts it on par to 4070 super 4. Ill wait since its a newish product
@@ciscovj4939 4070 Ti Super has 15% higher raster performance than 7900 GRE while consuming 40% more power than the 4070, I don't think it'll be enough to fulfill the conditions here.
maybe im fool but i went and buyed 4070 as it was discounted by 100 euros so total it was like 150 cheaper then 7900GRE. For 7900 i would of had to buy new power supply also. I have used AMD since rx 580 and all my AMD cards have cooling problems so i can see if nvidia is any better xD
Kind of in the same boat. Currently have a 2060S and the only worthwhile upgrade for me would be the 4070, since I wouldn't have to change my PSU or anything else. I'm still waiting tho since the price in CAD is just ridiculously high
Don't go for 12gb of VRAM guys... Not at this point in the ps5/xbox lifespan and with some games already asking for more, in 2 to 3 years you will definitely go through some games that will disappoint you.
I'm interested in the 4070 super vs 7900 GRE. As for ratchet and clank, in one of your older videos you showed that the 4060ti-16gb beats the 4070. It's very vram hungry, i wonder if that is because of direct storage. I think the 7900 GRE is a good deal.
Buyers remorse will always happen. You will wait for the nvidia and amd 5000 or 8000 series, but decide to wait for better and newer models and by the time that comes around the 6000 and 9000 series gets announced so you wait for that and the cycle repeats. Just be happy with your purchase
you been enjoying your cards for months shouldnt worry tbh. techs gets """"""old"""""" very quickly sadly. However as long as its doing what you want it to do or more is fine.
I say, wait a bit. Maybe look for the 4070 Super sale. Seems those "Super" models cooled down in the past two weeks. I was into RX 7900GRE/7900XT. I even purchased one but resigned the same day. Next week, I found a 4070 Super (open box) with a nice discount. I was not sure for a while, but I picked it. But, the card was already sold at their regular store. An hour later, the guy called me and asked if I still wanted one. I got a new one, and he also added a Kingston USB memory for "inconvenience". It was worth waiting for so long. For all the new AMD 7800XT, then the "Super" models, and even a bit longer. Good luck! Ps. Prices in Europe are far from those you have in the US. Especially during the Amazon Sale. We don't have any actual "Sale".
its crazy to me that a $800 card cant even do 60 fps in 1440p in modern games that look no different then games 5 years ago when a 2080 would push 100fps+ on 1440p easily
What about using XeSS instead of FSR. I know it isn't as efficient, but at 1440p or 4k? Does XeSS Balanced beat FSR Quality on AMD? Honestly if the performance and Vram differences continue like this, then XeSS might be good enough to have me move away from DLSS.
Gotta also remember that FSR isn't the only upscaling option you have on AMD cards, it's usually not that hard to replace it either natively or using mods.. and even with a higher performance hit you'd still be ahead most of the time, especially when also doing RT.
The RX7700xt has also dropped its MSRP. In Canada Prices are looking like (scores from gpu.benchmark using average score): card cost score $/score rx7700xt $560 100 5.60 rx7800xt $680 118 5.76 rx7900gre $750 127 5.91 Which is a bit of a surprise. Looks like the rx7700xt's price drop makes it more interesting. Comments?
The 7700 XT is not that interesting as an investment at even $400 US. We are still not in the last phase of the PS5 era, we are at the 2nd phase since games are more demanding and most Tech RUclipsrs do not test the latest UE5 titles. I am saving and waiting for RDNA4.
as a person who mostly plays older AAA titles and has no interest in newer titles, i think a 7700xt will be fine for my new build. I have a 1440p monitor so mmax id go is still a 7800xt. I still think 7800xt is the sweet spot for most people
@@kt311. Agreed. I've not seen much said about the new MSRP for the rx7700xt. Figured that it was interesting in that in makes the card a better value for some use cases which was not the case before. As it stands now, I'd love to get a rx7900gre. Aside from games I also play with locally run AI and the extra compute units in the gre (and the 16G) will help.
The Ngreedia Fanboi Army has been deployed here. Get ready for "Dlss is better" and "RT perfomance" spammed like hell even though Daniel proves both arguments wrong in the this matchup.
Yup. And then they go "I wish AMD was more competitive" in every video about GPU prices, because they want cheap AMD GPUs to push the prices of Nvidia GPUs down. And Nvidia knows all this perfectly well. The sales numbers don't lie. Unless they're willing to say "Screw you Nvidia. I'm going AMD for this generation, and I'll only come back if things improve" nothing will change.
My way to play cyberpunk on 7900gre is 4k/high intel xess ultra quality and Fsr3mod and i will get 110-120fps , I play with the same settings on my 7800xt and i have 100fps its very good to play cybepunk like this especially on 4k120hz 50inch tv
7700 moved to 419. Should have moved to 399. 7800xt to 469 And 7900 gre to like 539 just to be slightly cheaper than 4070 and would have been more of a home run for AMD. But hey any company MOVING PRICE TO PERFORMANCE DOWN rather than up. Like some others…it’s always a win for consumers.
@@lifemocker85 we all wish the 4090 was 1000 but that’s not feasible. has to continue to move price to performance. Rdna 4 hopefully brings that level of performance down to more consumers.
Thank you for doing the 1080p tests, im one of those people who intend to buy the GRE for 1080p i also use Vsync, This is mainly because im still rocking a good 60Hz 1080p LED TV (46 inch) and im content with it, Plus i can max all my settings in all my games, Sure its not as sharpe as 1440p or 4k but in the Radeon software i can turn on Radeon image sharpening which is pretty good (10% sharpening is what i use and like)
This is a super cursed comment bro. 1080p monitors are fine especially for high hz competitive fps players but why on earth would you want a 7900 gre for 60hz 1080p? That's like buying a helicopter solely to use the rotar blades to chop your food. Vsync too ahhh, lord have mercy
You can undervolt 4070 and slightly overclock memory => same performance with ~140-150W consume, lol. So tech wise ADA is killing these AMD radiators lol
Take note that the Hellhound is NOT $550, but $580, and you can get a 4070 super for $590. If you use the reference model GRE it looks notably worse than the overclocked models.
@@HanSolo__ Willfully introducing overclocked models into testing creates a minefield of problems, because no 2 cards of even the same model overclock the same. Not to mention then you need to overclock the Nvidia card as well. What if you get an amazing OC on the 4070 super and a terrible OC on the hellhound? Using overclocked models as representative of an entire class of GPU is just a bad idea.
The GRE seems to consistently run 8-12 degrees cooler even though the GRE is 270 watts and the 4070 is only 200 watts. That's a credit to the PowerColor. Maybe they should rename themselves PowerCooler. ;)
Every other week I go I will get a 7800xt this month and then another contender pops up making me go, maybe I will wait a month, hold strong my little 2070S, still, I might be holding out hope since out market tends on the side of, "do you want to be gouged, or out of stock".
That's a never ending cycle as a GPU buyer. I was just about to buy a red devil 6950xt tomorrow (really good prices on the used market btw) after thinking it through for a week. Just found out it doesn't support anti lag+ for fps shooters, now im set back a week lol
As a user with a GTX 1070 MSI Armor video card, i am sadly in need to get a new one. As i was looking around (from the start of the year) in my Country, i have decided to get a GPU for my b-day in September, as the last GPU i purchased was back in 2017. So the cards i can afford are: GIGABYTE GB R79GREGAMING OC-16GD (the one im learning towards) and a GIGABYTE GB N407SEAGLEOC ICE-12GD So i was like hmmmmm I don`t buy GPUs that often, RT is meh to me - i like it, but i don`t need it ..... And i am worried that 12gb of vRAM will NOT be enough in the next 3-5 years, before i can afford my next GPU :) (4070 Super will cost me: 740$ / and the 7900GRE will cost me: 660$) Considering how close the 2 are id say its better to simply save the dough and not go for Nvidia as its just more expensive...... The thing about the 7900GRE is that its main issue is its held back by its bandwidth due to its memory. So i asked my cousin who works at a PC repair shop, if he could potentially change my GPU`s memory in 1-2 years down the line and he Said YES, that can be done. I even asked him how much it would cost and he gave me a reasonable price. I`m thinking of going down the AMD route this time around as i simply HATE the idea of having a 12VHPR cable in my PC case...... My question is - what would be a good CPU/ RAM speed to pair with the RX7900GRE? Maybe a R7 7800X3D and a DDR5, 6000 mHz kit with tightest timings i can find for a decent price?
LG is to dim. Samsung QN90 in Game Mode is way better with 2000+ nits brightness and no burn in and better blacks than OLED. Sony has set the new standard of 4000 nits which means OLED TV is dead. Play on QN90 and you will see the difference is mind blowing and you never go back to dim OLED. OLED is only good for phones and tablets with Samsung AMOLED.
@FunFunFun8888 I think lg is fine if you where to get an oled tho it would be the smamsung qd oled 1000 nits and also oled I have the lg c3 it's only 861 nits but still good enough depends what you want tho more brightness or less perfect blacks
AMD recently confirmed the OC limits on this card are being removed. The Sapphire Pulse model at 550 with a stable aggressive OC will be an incredibly desirable midrange product.
💥 Sharpness and definition are really bad in NVIDIA, serrated? I have an notebook Lenovo Legion 5i i7 12700h RTX 3060 and the image IN THAT THINGS are worst than in my Acer Aspire 5 Ryzen 7 5700u Integrated Graphics in the same conditions! NVIDIA ugly, AMD beautiful! And the filters of Experience increaaes the sharpness, but don't turn to beautiful, like AMD!
The thing about the showing the way the games would really be played - AMD users usually don’t use FSR because of the visual quality hit so it’s really often DLSS vs native
Depend on resolution. Fsr at 4K or even 1440p is good. Maybe if I put my 4070 and 7800xt systems next to each other and compared, I could see a difference but honestly I can’t tell at quality settings. Dlss does better at the balanced setting for sure though
FSR 2+ on Quality when selecting 1440p or higher in the menu actually looks pretty good. I personally don't ever use below quality on any upscale. TSR which is an unreal engine 5 upscale can look better than even DLSS sometimes if you compare quality mode vs quality mode. They all have pros and cons, nvidia is never perfect either.
@@meesironmanIts B980 supposed flagship around 4070ti/4080 performance at half the price. Also a lower tier card just need to wait on what is released q2/3
! Correct me if I'm wrong, trying to clear up what I've seen: ! As I understand it, someone needs to release an unlocked VBIOS to overcome arbitrary limitations on the GPU & Memory clocks? That the GDDR chips on the GRE are the same as on the 7800XT and are rated for 20Gbps (GRE set to 18, 7800XT set to 19.5), which would be a board power issue at reference, but AIBs are providing a power boost as well. So if the VBIOS wasn't limiting the memory clock, it would be within spec (not even pushing them) to get 11% more memory bandwidth. Pair that with an undervolt for the GPU and there should be sufficient power to gain some ground on the GPU clock as well? As it stands, the GRE looks like an OK offering. With an unlocked VBIOS it would be the best value, even paying $30 extra for an AIB with increased board power and dual VBIOS.
maybe on Nitro plus where is not so much power limited, Hellhound also takes bit more juice than reference. Vram speeds holds this gpu and its not good for OC, maybe some models can go higher than stock on vram speed.
Did anyone else notice that 7900 GRE uses almost 100W or more than 4070 to be equal or win in some games by a small margin ! Thats great from fps to price ration perspective sure but when 7900 goes even above 300W for that to be achieved when 4700 almost never get more than 200, theres a huge gap there needs to be mentioned.
Its clear he wants to do a clickbait video for money and that narrative doesn’t get him more money. From the benchmarks you can figure out the super is just going to be better outright here
@@luzhang2982Wtf are you talking about? Official MSRP for the 4070 and 7900 GRE is $549, the 4070 Super is $599. What clickbait and narrative is he creating?
@@luzhang2982 No its not clear. They cost around $550 in the US. What is this? Anytime someone does a comparison of products they dont like or agree with it must be bias or theyre in it just for the money. All youtubers make videos for the clicks and money so its like saying the sky is blue. You just want to see Nvidia win and other clowns who want to see AMD win accuse him of being an Nvidia shill. Its just a comparison between two products that cost the same in his country of residence and where most of his audience resides. Its easy to understand. I can walk in at my local store and buy a 4070 for $550 or a 7900 GRE for the same price. This video makes perfect sense
Just one thing, ratchet and clank is not optimized for Nvidia graphics cards, it's more VRAM hungry than usual, because it was built for PS5 in mind (amd architecture, more VRAM dependent) that game doesn't benefit from the huge amount of L2 cache that Nvidia cards has
It's kind of a weird title. The level of RT it runs on the consoles is plainly not available in the PC port settings. So in raster it favours AMD a good deal but with RT it completely tanks on AMD like Cyberpunk.... and lo and behold, the PC port is an "Nvidia sponsored title".
@@javierarroyo6600Obviously it's not a coincidence. Consoles have 16 gigs of shared Vram they can dynamically gimp as needed. If you're behind the consoles, you start running into issues, always been like that every console gen.
The 4070 needs to go back to $520 or even lower as it’s meant to compete with the 7800xt while the 7900gre is similar to the 4070s in raster. At just a $50 gap, I might opt for the 4070s for the feature set. Wish they kept the gap at $100 to make it as attentive as the 7800xt was when the 4070 was still at $600
I hate how nobody does RT comparisons most of the time. Thnx for this. But tbh right from the beginning of the video the 7900GRE is using 100 more watts than the 4070... for a mere 5 fps that kinda adds up
Im starring at the watt usage there... 290watt + n the 7900 gre is really really ouch. Im not entirely done with the video but I believe theres no undervolting involved there? cause almost 100watt more power draw is insane.
Please add RT with 1080p testing If TSR looks better than FSR, you should put TSR against DLSS instead of FSR since I would choose the one that makes my game look better and thus TSR is the more realistic use scenario for AMD
you can always turn on FG in the AMD settings menu even if the game doesn't have a toggle for it. So you can have FG in games like Cyberpunk on AMD hardware.
Damn 7900 GRE actually SLAMS the 4070 unless u like really high RT to the point where the 4070 is getting 30fps, the rasterization increases is like a whole teir leap for AMD and priced the same that’s some good comp from AMD, nvidia kinda has to lower 4070 price now
The Unique problem that I SEE in LATAM Even 7900 xt cost almost the doble price That the original 4070, I hope this helps to nvidia to improve their future gpus because amd now is having a good performance with raytracing 👀
Seeing the 4070 already running out of Vram for Ratchet and Clank at 1440p is a huge worry. I have a feeling that 12gb is not going to age well at all in the next year or so. Nvidia has to make 16gb the minimum from here otherwise consumers aren't going to have a bar of it. The greed needs to stop.
Thank you for the benchmarks Daniel, but man oh man! the game industry must be in really bad state if you still keep testing some of the already irrelevant or forgotten titles.
4070 and the 7900 GRE are somewhat what I can consider the actual minimum as seen from this video as playable for RT, even not relying on upscaling. I have been wondering how long we should be waiting for RT to be playable without much performance penalty. Considering that is if RT is an option instead as a requirement (full RT rendering). Maybe 2 generations later from both AMD and NVIDIA that I would consider acceptable for RT performance judging from this video, if we consider 20-30% RT performance uplift per generation.
As someone with a 7900XTX, those AMD drivers and weak features just mar your experience. Driver crashes, FSR2 artifacts, weak support for frame generation, etc. AMD needs to work on its software this generation.
i still think that the 4070 super its the real competition of the 7900 gre, sure its more expensive but not by that much the performance difference would be less but all the rtx and dlss improvement would be way more than the 7900 gre
Hello, Daniel! IMHO it is not right to show on benchmarks results with DLSS FG and not at least mention that the AMD FG can be already used for any DLSS game that have FG. I understand that it is not official feature but considering the fact that FG from AMD became opensource, in the benchmarks it should be mentioned that FG from AMD is available on games like Cyberpunk. If you read that, I really appreciate, thank you for your great job!
Nice vid. I was surprised at the RT performance. If there was something like this that was comparable to a 4080 with RT, I would have strongly considered buying that over an Nvidia GPU.
There has to be something remedy can do to improve AMD gpu performance in Alan Wake since they definitely spent the time optimizing Nvidia performance, the way starfield was initially AMD leaning but then Nvidia performance improved a few weeks later
Hi Owen, thats what I have been saying for some time now, AMD has to target Nvidias RT performance on their GPUs. As AMD match on RT, and we know their performances is lower in RT, the result will be a RX card with about 30% above Nvidias in native resolution. Then, buyers will turn their heads other than only think in Nvidia for their graphic cards. As RT is becoming more and more a reality, other than just a show off, like in the begining, I feel that that is why, Nvidia cards, blew away the competition.
Better frame generation. Better upscaling. Better / cleaner ray tracing with ray reconstruction. The GRE is compelling but I would personally push the upscaling another notch and stick with the 4070. Realistically I would probably just look to spend the extra 50 to get the super though.
i just WANT DLSS. i hope FSR gets exactly as good as DLSS in every regards, then i would consider an AMD GPU. upscaling is a godsend, and will be a godsend in the longevity of any GPU
but its cool that an AMD card of the same price beating some 4070 benchmarks which costs the same. however i would say that the 4070 is bad for 550. everyone should buy the 4070 super for 600 instead, which will beat every GRE result, while still using less power. BUT 12gb VRAM feels bad. thats why i would save up some more money and buy at least a 4070ti super.
the GRE is using 50% more power for the same or 10% more performance. very bad efficiency, thats a fact. but most ppl wont give two fks about that so its just theoretical. it is a sign of superior nvidia chip maturity tho. i just dont lile the 4070 at 550. it should be 500.
"I just WANT DLSS" "[sic] the 4070 is bad for 550" That's your problem. You're so infatuated with Nvidia's software tricks that you'll complain about Nvidia's prices... and then go buy them anyway. Which tells Nvidia that they're doing the right thing. Because your whining means jack shit when your wallet doesn't back it up.
@@andersjjensenwhat the fck are you talking about? i wouldnt buy a 4070. i wouldnt buy a 4070S. if you get a 4070ti super for 800.- or a 7900XT for 720-730.-, i would buy the 4070TIS bc i want DLSS and 16gb VRAM is enough for me. for 1000.-i i would buy the 4080S instead of a 900-1000.- 7900XTX bc 16GB VRAM is enough for me and i want DLSS. also the better efficiency is just crazy. im in EU, so its relevant for me how much power my stuff uses. i would def buy a X3D AMD CPU bc they are better and more efficient for gaming than Intels offerings. But in terms of GPUs, Nvidia can ask for more money than AMD bc Nvidia GPUs are better in every way in my opinion. except for the price.
Got the green light from the wife to get a gpu upgrade (so she can use my old one haha), we set a budget for 600 bucks and this one caught my eye. Would it be a logical step to upgrade from a RTX3070 to the RX 7900 GRE? Or would you recommend a RTX4070 Super?
The next video is definitely for me. Im using a 1080ti currently and want to upgrade. Im considering between 3080, 3080ti, 4070, and 4070 super. The game im playing most I use lots of mods in and it really kills the 1080ti. Not sure if the higher memory bandwidth of the 3080's might be better or not.
It strikes me that there's unusually big upgrade available to AM4 people if they were to use 5800X3D and an RX 7900 GRE...let's say you're on a 2700X with a 2060RTX at the moment, it's kinda crazy how huge the performance gains would be just for swapping out that CPU, and getting this midrange GPU...it would launch that aging PC that's only just pushing 1080p low/med quality... right into the 120fps/ultra quality/1440p domain, at a cost that would be double or more a couple of years ago...and even then, you'd need to change other components potentially the whole system! I just can't decide if it's worth upgrading from a 5800X+3070 or wait for NVidia 5XXX gpus and let AM5 hardware fall in price a bit more.
It's a great option but definitely keep in mind it is consistently using about 50% more power which is ~100W more. If you game several hours a day that cost could add up to a few extra dollars per month. Not enough for me personally to care but something to keep in mind.
Love the review but the PowerCooler Hellhound is retailing much higher than all the other 7900 GRE's ... In the US it is retailing at or around 579.99. Which is almost the price of a 4070 Super. :P :|
Mofos in chat think their electricity rates are similar to current era Europe. Check your electricity rates peeps, they are largely negligible for a device like a PC in most countries.
i received two dead rx6800 from bestbuy so forced myself to buy a 4070 even though i love amd, and now im questioning getting the GRE as its the same price
the power draw for AMD is like 33% higher constantly.. scares me a bit. just ordered the Nitro+ model anyhow ;D 3440x1440 on RTX 3060Ti wasn't great all the time. Hoping for some serious power now :p
@@lifemocker85 No. It's a 150W GPU undervolted. No wait, it's a 100W GPU if you just undervolt it enough. Seriously, stop being a bonehead. Every GPU is better if you tweak it... unless you lost the silicon lottery and got a chip that just barely made the cut. Then you can tweak and twerk all you want, and all you're going to get is a few percent.
@@lifemocker85 I was actually sticking up for the GRE with the comments of ‘how inefficient it is’. IMO 1440 high settings is where both of the cards belong. Doubtful either card has issues running those setting for 3+ years
Some AIB models are going for 550$, Power Color Hellhound is going for 580$. Which is still better than what some people said (600+ for all AIB models).
All AMD has to do on pricing is line it up based on RT performance. From there, raster will be much higher per $ and always have more vram than nvidia (for the parts with more than 8gb anyway, but 8gb is a joke on both sides so..). The GRE is almost a no-brainer based on that alignment today with the 4070, unless you really want lower power draw of the 4070 or 4070S
I wonder if the issue is purely vram or if it's the memory bandwidth. I've noticed plenty of issues with my 4070 super in Ratchet & Clank and Far Cry 6 at 4K. And I'm seeing that the vram is getting nearly maxed out. At least when it comes to allocation. It would be interesting to see the 4070 super compare to the 3080 TI. Since the 4070 super should be a little bit more powerful, but the 3080 TI has a higher memory bandwidth and they're both 12 GB cards.
Thank you AMD for releasing the GRE!! Gives me flashbacks to the R9 290X a little bit. We love wide GPUs with lots of cores and memory!!! Granted, we're losing some power efficiency, but AMD has improved their power efficiency a lot since the 290x obviously.
The little Daniel moving around the screen is hilarious
He's floating around like a little fairy😆
It's one of my favorite parts of this channel lol
@@thicclink agreed, pretty original in the tech creator space
when he points out stuff in articles, it's the best. it's both funny and functional lol
legit only reviewer that does that. love these videos
I’m glad you’re keeping these up to date with all the benchmarks and real-life gameplay tests. Great work keep it up 👍
Can you make a meme of walter white crying in the basement (he is the gpu industry) and skylar is GTA 6? Love your vids btw
Dude, stop commenting and make memes!
It's over AMBug users 💀
@Thejacketof-huang Your entire youtube channel and every single comment is based around fanboying an Nvidia card. This is giving off some real strong "i saved up my allowance money to buy 4060 so i gotta seek validation" energy
When Daniel shrinks down to a tiny size (aaa!), you know that things are about to get serious.
😂😂😂
I have been criticizing AMD for dissapointing us and not competing with Nvidia for a long time but thank you for reviewing this card like this. 7900 gre definitely seems an attractive option from the red team.
I still think it's too little too late, here in the UK you can get 4070 supers at the same price as the 7900 GRE, and although the 7900gre does win in some raster titles, I think the overall features package makes the 4070 Super the better choice
@@frankwainwright7826Yes, I think ray tracing is the future and there will be more and more RT only games like Avatar (which is sponsored by AMD by the way)
@@frankwainwright7826Vram is more important. The 7900 GRE will last 5 years but 4070 super owners will become vram bottlenecked in new AAA games
@@JimmyJr630 I honestly don't think 12 gigs of VRAM won't be enough for 1440p for a few years still. It's at 4k that textures are getting to the point where anything below 16gb is getting unusable. I'd rather get all the extra features and the efficiency Nvidia offers than just rasterization performance from AMD.
Just different priorities/pov from me, I guess.
@@viniqf Alan Wake becomes vram limited at 1440p at max settings, soon more games will follow suit. The features that Nvidia has are not really worth it. DLSS just increases latency and frame gen just closes the gap that AMDs increased performance creates. And ray tracing i shown to be good for the 7900 GRE so it’s not like they have much of a lead on ray tracing and are beaten half the time.
Nice review Danial. It clearly shows 12gb vram limit when using higher resolutions and raytracing.
For what it costs the 4070 super and all of the 12gb variants are pretty much doa.
These 12 gb cards are great, only while they last tho
This seems to be a problem with 70 class from the last couple of generations. Being very limited by the VRAM.
@@KevDawg1992 Planned obsolescence
@@lifemocker85 very much so. Nvidia needs to be having people buy cards every single generation. Even Intel's cheapest 16GB with a 256 bit bus isn't even close to being $500.
This was really insightful! Based on the 4070 vs 4070Super video, it seems like the 7900GRE might even be trading blows with the Super in rasterized performance. Super excited to see the next video!
The 4070 Super will crush the 7900 GRE
@lp2fz not really. lol But I'm sure Daniel will also test this in a day or two.
nah take a seat fanboy with your 12gb vram@@Lukas-lp2fz
@@kosmosyche😂 I think he's gonna cry himself to sleep real soon.
35:38 @@Lukas-lp2fz rekt
My initial impressions for this card was really bad because hardware unboxed reviewed a china only reference model. Seems the partner models clock higher for a significant fps boost that actually makes this compelling.
HU have had shady testing for years. They finally got caught out by the look of it.
Excellent work, Daniel. I agree that the 7900 GRE is the better card at $550, but the Powercolor Hellhound version you used is $580, which is the same price as your 6950 XT Merc 319. It seems that the 6950 XT and the 7900 GRE deliver nearly the same performance, the same VRAM, and the same memory bandwidth. Even the power draw is almost the same! The 4070 is out of my considerations for the reasons you listed, and now I'm trying to decide between the 6950 XT and the 7900 GRE. Would you be willing to compare them? No one on the internet has done this, and I don't think anyone ever will unless you do it. If you could, please display the fan speeds so we can estimate noise levels. Thanks a million!
Get the 7900 for the newer architecture and better RT implementation
@@JVCFever0 If I recall correctly, Gamers Nexus showed the 6950 being slightly faster at ray tracing than the GRE is.
So, it's the about the same price at the 4070 Super, which I've seen at 589?
I would just jump up to the Sapphire Pulse AMD Radeon RX 7900 XT amazon and newegg $699
I was looking at 4070 vs 7800xt vs 6950xt back in 2023 at september and buy 6950xt at August, was bad one I got as 21000 graphics score time spy then overclock and undervolt got stable 365W usage and score of 23300.
23990 highest score I got as overclock unstable.
7900gre get 21200 score as avg is really good, it's basically same as 6950xt, go for 7900gre for longer support.
They are good as 4070ti almost.
I personally found your 7900GRE more informative and showing better variances from the 7800XT than HUB with the difference being 2-3% 🤷♂🤷♀🤦♂🤦♀. The 7900GRE is, imo a decent option despite being an Nvidia GPU user.
Nvidia user enjoying castrated cards
HUB just retested with Sapphire GRE and got noticeably better results than the reference AMD model they tested before. Seems like raised power limits and better cooling help AIB models quite substantially.
They re reviewed it
This!
@@GewelRealnVidia cards are good, just overpriced.
AMD doesn't get enough credit for just how well their cards do Ray tracing more.
You mean they aren't being bashed at on how TERRIBLE they are
Anything below a 4080 super is going to be mediocre at raytracing. Not a big difference between 4070 super and 7900 gre
@@TheWoWBane I have both the GRE and 4070 Super. I can use RTX in Cyberpunk 2077 with RT-ON. But with RX 7900 GRE, the compromise is not worth enabling RT. I can turn ON shadows, etc.
Both cards are from AiB partners and both are OC out of the box.
Yeah most ppl choose Nvidia for nothing
They still have a ways to go but they seem to be finally catching up on the rt front.
NVIDIA fans when AMD is good at raytracing: "TRASH, ABSOLUTE TRASH!!!! BUY NVIDIA"
One number that caught my attention was the 280W, which is around the same power draw as the 4070 Ti Super, It'd be interesting to see a comparison from the perspective of "which company makes better use of those 280W
It's Nvidia, easily. I however find that it doesn't matter too much in large tower desktops.
Now for smaller form factor PC's the lower profile versions of Nvidia GPU's are probably more desirable overall.
Undervolt
Nvidia, but not by that much tbh, 4070 Ti Super is only about 15% faster in raster on average according to Techpowerup.
eh power usage doesnt really matter when your thinking on buying a 550 dollar/600 euro card lol
Just look at the die process for your answer. N5+N6 chiplets vs 4N monolithic
I think 4070 Super vs 7900 GRE, either one you pick you'll be happy. I wonder if this is gonna be the strategy from now on, up the CUs to match raytracing on Nvidia gpus next generation.
You wont be happy long with only 12gb
@@lifemocker85 not a gpu reviewer but I have 2 systems. 3080 12gb, and 7800xt. I play on 4k high settings with upscaling performance-quality (80 fps my target). I’ll experience first hand the short comings of VRAM. So far, I prefer playing on the 3080. It’s faster, performs better on 4k, and for some reason, Overwatch feels more responsive with reflex as opposed to anti lag.
@@Slambear faster I would say on par, depends on the game anyway. Difference is that 3080 has already lived long enough so it's kinda ok for the VRAM issue to be slowly appearing. On a 4070/super it's less ok. It was great for those people with 3070/ti as well...VRAM bottlenecking is already happening on selective title with 12gb on 1440p. But yeah overwatch and games like that it doesn't really matter.
Maybe anti lag+ which was removed will help on AMD side.
with the 10gb model the vram is a problem but he has the 12gb model so yet he is to find problems because for now unless 4k there are no problems. the 3080 12 gb is also a bit faster aswell comapred to the 3080 10gb so it is around the same to slighty superior
@@Slambear those are not 4k gpus
I wished this came out before i bought my 7800xt lol. Can't cry over spoiled milk. And I'm enjoying my 7800xt
I bought my daughter a 7800 xt last month but if this was out I would have gone with it instead. She won’t notice the difference 😂🤣😂🤣
Same. I like my 7800XT but it would have been nice if AMD would have just launched the 7900 GRE everywhere at the same time. It would have sold well.
4070 super is still the best mid tier option so it doesnt matter
@@iequalsnoob LMAO 🤣😂🤣😂🤣😂🤣😂🤣😂😂🤣
@@txmetalcobrait would have been more expensive
I notice in Ratchet and Clank you show forty something FPS on the 4070 at 1440 with RT. Previously you benchmarked the 4070 with about 64 FPS at 1440 very high with RT. (RX 7800XT vs RTX 4070 vs RX 6950 XT- In the newest games!!) Techpowerup also shows about 70 FPS at 1440 highest with RT (custom scene). So 60-70 in other tests. Your current numbers are significantly lower. What's up with the discrepancy?
😱 you caught em red handed huh! Seems like an interrogation and not a question! If you want specific numbers and want to call people out for them, then your critique numbers shouldn't say you mentioned 40 something on other reviews!
Let's be real here.
Maybe I'm old but all I ever want is smooth 60fps with good timing. People nowadays wanting 80+ fps is overkill in my opinion. Yeah I acknowledge it can help in a professional competitive environment, but c'mon, 99% aren't a pro. lol
Fast internet and monitor response are just as important as high fps. If either one of those things are lacking more frames will just be more shit.
@@BlackJesus8463 yup
@@etchieSketchie80fps+ isn't overkill, going over 120fps+ is IMO(for single player definitely). Then again 120fps is the max I would want but I'm fine with just 60-100fps for pretty much any game. Plus it'll be easier to run as well.
Most certainly depends on what you're playing and how fast your monitors response time is. Even back in the CRT days (where response time wasn't an issue and the internal processing time of the monitor was essentially 0ms) I preferred 75-85FPS in action packed games. Sure, "slow burn" investigation/puzzler/adventure games and RTS games often benefit from higher visuals at the expense of framerate, but racing sims, shooters and combat games are often so fast paced that you don't even notice the visual quality hit from dropping from Ultra to High, but you sure as hell notice the boost in responsiveness.
120 is generally what people use. Personally I upgraded to 100 and I can't go back lol
The Best comparison I've ever seen. you focused on usage and not just numbers
I was hyped for this card, however a few things turned me iff from it 1. The wild difference in performance from different manufacturers based on clock 2. Reports of thermal throttling in some cards 3. The price on some cards puts it on par to 4070 super 4. Ill wait since its a newish product
I would rather have a GPU with 7900 GRE raster performance and VRAM and 4070 power consumption and RT/DLSS
On your oldass lcd display?
Just buy a 4080 and downclock it, problem solved /s
@@HunterTrackslol. Why would you jump to a card that cost double 😂🤣😂🤣. Wouldn’t it make more sense to get a 4070 TI super before jumping to a 4080?
@@ciscovj4939 4070 Ti Super has 15% higher raster performance than 7900 GRE while consuming 40% more power than the 4070, I don't think it'll be enough to fulfill the conditions here.
@@HunterTracks but the 4080 uses more power than the 4070 TI super. Wouldn’t it be easier to under volt the 4070ti super than the 4080?
maybe im fool but i went and buyed 4070 as it was discounted by 100 euros so total it was like 150 cheaper then 7900GRE. For 7900 i would of had to buy new power supply also. I have used AMD since rx 580 and all my AMD cards have cooling problems so i can see if nvidia is any better xD
Kind of in the same boat. Currently have a 2060S and the only worthwhile upgrade for me would be the 4070, since I wouldn't have to change my PSU or anything else. I'm still waiting tho since the price in CAD is just ridiculously high
Bruh your RX580 didn't have cooling problems. lol
Got my GRE for 2 weeks now and ou boy i am very happy! I play at 1440p and have no issues.
Don't go for 12gb of VRAM guys... Not at this point in the ps5/xbox lifespan and with some games already asking for more, in 2 to 3 years you will definitely go through some games that will disappoint you.
I'm interested in the 4070 super vs 7900 GRE. As for ratchet and clank, in one of your older videos you showed that the 4060ti-16gb beats the 4070. It's very vram hungry, i wonder if that is because of direct storage. I think the 7900 GRE is a good deal.
you getting the 7900 GRE?
When i got my pc 4070 was the only gpu in this price range (New) now seeing these cards coming out makes me wish i waited.
Buyers remorse will always happen. You will wait for the nvidia and amd 5000 or 8000 series, but decide to wait for better and newer models and by the time that comes around the 6000 and 9000 series gets announced so you wait for that and the cycle repeats. Just be happy with your purchase
you been enjoying your cards for months shouldnt worry tbh. techs gets """"""old"""""" very quickly sadly. However as long as its doing what you want it to do or more is fine.
Do we think the 7800xt will be reduced? I’m thinking of making my first pc but unsure if I should wait a bit
I say, wait a bit. Maybe look for the 4070 Super sale. Seems those "Super" models cooled down in the past two weeks.
I was into RX 7900GRE/7900XT. I even purchased one but resigned the same day. Next week, I found a 4070 Super (open box) with a nice discount. I was not sure for a while, but I picked it. But, the card was already sold at their regular store. An hour later, the guy called me and asked if I still wanted one. I got a new one, and he also added a Kingston USB memory for "inconvenience". It was worth waiting for so long. For all the new AMD 7800XT, then the "Super" models, and even a bit longer.
Good luck!
Ps. Prices in Europe are far from those you have in the US.
Especially during the Amazon Sale. We don't have any actual "Sale".
its crazy to me that a $800 card cant even do 60 fps in 1440p in modern games that look no different then games 5 years ago when a 2080 would push 100fps+ on 1440p easily
What about using XeSS instead of FSR. I know it isn't as efficient, but at 1440p or 4k? Does XeSS Balanced beat FSR Quality on AMD? Honestly if the performance and Vram differences continue like this, then XeSS might be good enough to have me move away from DLSS.
Gotta also remember that FSR isn't the only upscaling option you have on AMD cards, it's usually not that hard to replace it either natively or using mods.. and even with a higher performance hit you'd still be ahead most of the time, especially when also doing RT.
The RX7700xt has also dropped its MSRP. In Canada Prices are looking like (scores from gpu.benchmark using average score):
card cost score $/score
rx7700xt $560 100 5.60
rx7800xt $680 118 5.76
rx7900gre $750 127 5.91
Which is a bit of a surprise. Looks like the rx7700xt's price drop makes it more interesting. Comments?
The 7700 XT is not that interesting as an investment at even $400 US. We are still not in the last phase of the PS5 era, we are at the 2nd phase since games are more demanding and most Tech RUclipsrs do not test the latest UE5 titles.
I am saving and waiting for RDNA4.
as a person who mostly plays older AAA titles and has no interest in newer titles, i think a 7700xt will be fine for my new build. I have a 1440p monitor so mmax id go is still a 7800xt. I still think 7800xt is the sweet spot for most people
What on earth is gpu.benchmark score? What about actual game results?
@@kt311. Agreed. I've not seen much said about the new MSRP for the rx7700xt. Figured that it was interesting in that in makes the card a better value for some use cases which was not the case before. As it stands now, I'd love to get a rx7900gre. Aside from games I also play with locally run AI and the extra compute units in the gre (and the 16G) will help.
@@takehirolol5962Yeah I'm also waiting on RDNA 4 to see what it offers for price to performance as well.
Found a new 4070 on amazon at $385. Wouldn't spend 540 for its original price, but I could definitely say it was the best 400$ gpu out at that time.
The Ngreedia Fanboi Army has been deployed here. Get ready for "Dlss is better" and "RT perfomance" spammed like hell even though Daniel proves both arguments wrong in the this matchup.
Yup. And then they go "I wish AMD was more competitive" in every video about GPU prices, because they want cheap AMD GPUs to push the prices of Nvidia GPUs down. And Nvidia knows all this perfectly well. The sales numbers don't lie. Unless they're willing to say "Screw you Nvidia. I'm going AMD for this generation, and I'll only come back if things improve" nothing will change.
My way to play cyberpunk on 7900gre is 4k/high intel xess ultra quality and Fsr3mod and i will get 110-120fps , I play with the same settings on my 7800xt and i have 100fps its very good to play cybepunk like this especially on 4k120hz 50inch tv
7700 moved to 419. Should have moved to 399. 7800xt to 469 And 7900 gre to like 539 just to be slightly cheaper than 4070 and would have been more of a home run for AMD. But hey any company MOVING PRICE TO PERFORMANCE DOWN rather than up. Like some others…it’s always a win for consumers.
OLED is better than raytracing. ✌
78xt should be 400
@@lifemocker85How about RX 7900 GRE?
@@lifemocker85 we all wish the 4090 was 1000 but that’s not feasible. has to continue to move price to performance. Rdna 4 hopefully brings that level of performance down to more consumers.
@@masterlee1988 450 after taxes max
Thank you for doing the 1080p tests, im one of those people who intend to buy the GRE for 1080p i also use Vsync, This is mainly because im still rocking a good 60Hz 1080p LED TV (46 inch) and im content with it, Plus i can max all my settings in all my games, Sure its not as sharpe as 1440p or 4k but in the Radeon software i can turn on Radeon image sharpening which is pretty good (10% sharpening is what i use and like)
This is a super cursed comment bro. 1080p monitors are fine especially for high hz competitive fps players but why on earth would you want a 7900 gre for 60hz 1080p? That's like buying a helicopter solely to use the rotar blades to chop your food. Vsync too ahhh, lord have mercy
You can undervolt 4070 and slightly overclock memory => same performance with ~140-150W consume, lol.
So tech wise ADA is killing these AMD radiators lol
12gb is planned obsolescence
@@lifemocker85 Just buy 4090 or 7900 xtx
@@dsfdsgsd644 ridiculously overpriced
Take note that the Hellhound is NOT $550, but $580, and you can get a 4070 super for $590. If you use the reference model GRE it looks notably worse than the overclocked models.
overclock the reference model GRE
@@HanSolo__ Willfully introducing overclocked models into testing creates a minefield of problems, because no 2 cards of even the same model overclock the same. Not to mention then you need to overclock the Nvidia card as well. What if you get an amazing OC on the 4070 super and a terrible OC on the hellhound? Using overclocked models as representative of an entire class of GPU is just a bad idea.
The GRE seems to consistently run 8-12 degrees cooler even though the GRE is 270 watts and the 4070 is only 200 watts. That's a credit to the PowerColor. Maybe they should rename themselves PowerCooler. ;)
Every other week I go I will get a 7800xt this month and then another contender pops up making me go, maybe I will wait a month, hold strong my little 2070S, still, I might be holding out hope since out market tends on the side of, "do you want to be gouged, or out of stock".
That's a never ending cycle as a GPU buyer. I was just about to buy a red devil 6950xt tomorrow (really good prices on the used market btw) after thinking it through for a week. Just found out it doesn't support anti lag+ for fps shooters, now im set back a week lol
As a user with a GTX 1070 MSI Armor video card, i am sadly in need to get a new one. As i was looking around (from the start of the year) in my Country, i have decided to get a GPU for my b-day in September, as the last GPU i purchased was back in 2017. So the cards i can afford are: GIGABYTE GB R79GREGAMING OC-16GD (the one im learning towards) and a GIGABYTE GB N407SEAGLEOC ICE-12GD So i was like hmmmmm I don`t buy GPUs that often, RT is meh to me - i like it, but i don`t need it ..... And i am worried that 12gb of vRAM will NOT be enough in the next 3-5 years, before i can afford my next GPU :)
(4070 Super will cost me: 740$ / and the 7900GRE will cost me: 660$) Considering how close the 2 are id say its better to simply save the dough and not go for Nvidia as its just more expensive......
The thing about the 7900GRE is that its main issue is its held back by its bandwidth due to its memory. So i asked my cousin who works at a PC repair shop, if he could potentially change my GPU`s memory in 1-2 years down the line and he Said YES, that can be done. I even asked him how much it would cost and he gave me a reasonable price. I`m thinking of going down the AMD route this time around as i simply HATE the idea of having a 12VHPR cable in my PC case......
My question is - what would be a good CPU/ RAM speed to pair with the RX7900GRE?
Maybe a R7 7800X3D and a DDR5, 6000 mHz kit with tightest timings i can find for a decent price?
Hey Daniel , another great vid , can I ask you ...are you still loving and using the LG C2? or do you now use a different monitor. Cheers.
LG is to dim. Samsung QN90 in Game Mode is way better with 2000+ nits brightness and no burn in and better blacks than OLED. Sony has set the new standard of 4000 nits which means OLED TV is dead. Play on QN90 and you will see the difference is mind blowing and you never go back to dim OLED. OLED is only good for phones and tablets with Samsung AMOLED.
@FunFunFun8888 I think lg is fine if you where to get an oled tho it would be the smamsung qd oled 1000 nits and also oled I have the lg c3 it's only 861 nits but still good enough depends what you want tho more brightness or less perfect blacks
AMD recently confirmed the OC limits on this card are being removed. The Sapphire Pulse model at 550 with a stable aggressive OC will be an incredibly desirable midrange product.
💥 Sharpness and definition are really bad in NVIDIA, serrated? I have an notebook Lenovo Legion 5i i7 12700h RTX 3060 and the image IN THAT THINGS are worst than in my Acer Aspire 5 Ryzen 7 5700u Integrated Graphics in the same conditions! NVIDIA ugly, AMD beautiful! And the filters of Experience increaaes the sharpness, but don't turn to beautiful, like AMD!
The thing about the showing the way the games would really be played - AMD users usually don’t use FSR because of the visual quality hit so it’s really often DLSS vs native
Depend on resolution. Fsr at 4K or even 1440p is good. Maybe if I put my 4070 and 7800xt systems next to each other and compared, I could see a difference but honestly I can’t tell at quality settings. Dlss does better at the balanced setting for sure though
FSR 2+ on Quality when selecting 1440p or higher in the menu actually looks pretty good. I personally don't ever use below quality on any upscale. TSR which is an unreal engine 5 upscale can look better than even DLSS sometimes if you compare quality mode vs quality mode. They all have pros and cons, nvidia is never perfect either.
AMD users dont need FSR because of all the frames.
@@stebo5562 look at the video you’re on they aren’t 4k cards
@@BlackJesus8463 incredibly dumb take. The results of DLSS quality vs AMD native is literally in the video you’re commenting on
I would have went with AMD for my gpu but dlss is way better than fsr and the rt performance is also better with nvidia.
For me the efficiency is an important factor so the ~50% higher power consumption of the GRE is a pretty huge downside and basically, a deal-breaker.
Excellent video intersting in upgrading but will wait on Battlemage offerings when they arrive!!
What are Battlemage offerings if you dont mind my asking?
@@meesironmanIts B980 supposed flagship around 4070ti/4080 performance at half the price. Also a lower tier card just need to wait on what is released q2/3
They said same for intel a770 16gb but performance is now between 3060ti/3070, it was supposed to be a 3070/3080 tier inte could not keep up.
@@Kage0No0TenshiIntel being new to the discreet gpu market had alot catching up to do hopefully this time it much better release.
! Correct me if I'm wrong, trying to clear up what I've seen: !
As I understand it, someone needs to release an unlocked VBIOS to overcome arbitrary limitations on the GPU & Memory clocks?
That the GDDR chips on the GRE are the same as on the 7800XT and are rated for 20Gbps (GRE set to 18, 7800XT set to 19.5), which would be a board power issue at reference, but AIBs are providing a power boost as well. So if the VBIOS wasn't limiting the memory clock, it would be within spec (not even pushing them) to get 11% more memory bandwidth. Pair that with an undervolt for the GPU and there should be sufficient power to gain some ground on the GPU clock as well?
As it stands, the GRE looks like an OK offering. With an unlocked VBIOS it would be the best value, even paying $30 extra for an AIB with increased board power and dual VBIOS.
Getting the GRE 16gb vram is what I want
I wanna see someone OC the 7900 GRE because I suspect its an OC monster
but look at the power consumption
@@ryan-we1mlRDNA 3 cards go by the entire board for their power consumption metrics.
It doesnt is fully capped by amd.
maybe on Nitro plus where is not so much power limited, Hellhound also takes bit more juice than reference. Vram speeds holds this gpu and its not good for OC, maybe some models can go higher than stock on vram speed.
@infiniteblaz3416 got it, thanks for the info!
Did anyone else notice that 7900 GRE uses almost 100W or more than 4070 to be equal or win in some games by a small margin ! Thats great from fps to price ration perspective sure but when 7900 goes even above 300W for that to be achieved when 4700 almost never get more than 200, theres a huge gap there needs to be mentioned.
It because of nvdias smaller note it makes since
What not be compared to the super version? Regular 4070 is also $80 cheaper and consumes much less power.
Its clear he wants to do a clickbait video for money and that narrative doesn’t get him more money. From the benchmarks you can figure out the super is just going to be better outright here
@@luzhang2982Wtf are you talking about? Official MSRP for the 4070 and 7900 GRE is $549, the 4070 Super is $599. What clickbait and narrative is he creating?
@@luzhang2982 No its not clear. They cost around $550 in the US. What is this? Anytime someone does a comparison of products they dont like or agree with it must be bias or theyre in it just for the money. All youtubers make videos for the clicks and money so its like saying the sky is blue. You just want to see Nvidia win and other clowns who want to see AMD win accuse him of being an Nvidia shill.
Its just a comparison between two products that cost the same in his country of residence and where most of his audience resides. Its easy to understand. I can walk in at my local store and buy a 4070 for $550 or a 7900 GRE for the same price. This video makes perfect sense
In Germany the cheapest 4070 cost € 588,- (inkl. VAT) and the cheapest 7900 GRE cost € 588,- (inkl. VAT). I think this is the same price ;-)
Just one thing, ratchet and clank is not optimized for Nvidia graphics cards, it's more VRAM hungry than usual, because it was built for PS5 in mind (amd architecture, more VRAM dependent) that game doesn't benefit from the huge amount of L2 cache that Nvidia cards has
And this happens a lot on games that are ported from Xbox or PS5 to pc, not a coincidence
its not that game isn't optimized its that NVidia doesn't give enough vram
It's kind of a weird title. The level of RT it runs on the consoles is plainly not available in the PC port settings. So in raster it favours AMD a good deal but with RT it completely tanks on AMD like Cyberpunk.... and lo and behold, the PC port is an "Nvidia sponsored title".
@@javierarroyo6600Obviously it's not a coincidence. Consoles have 16 gigs of shared Vram they can dynamically gimp as needed. If you're behind the consoles, you start running into issues, always been like that every console gen.
The 4070 needs to go back to $520 or even lower as it’s meant to compete with the 7800xt while the 7900gre is similar to the 4070s in raster. At just a $50 gap, I might opt for the 4070s for the feature set. Wish they kept the gap at $100 to make it as attentive as the 7800xt was when the 4070 was still at $600
I hate how nobody does RT comparisons most of the time. Thnx for this. But tbh right from the beginning of the video the 7900GRE is using 100 more watts than the 4070... for a mere 5 fps that kinda adds up
undervolt
Im starring at the watt usage there... 290watt + n the 7900 gre is really really ouch. Im not entirely done with the video but I believe theres no undervolting involved there? cause almost 100watt more power draw is insane.
I generally don't use FG. I feel FG only works if the image is already smooth. If it is then why are you even bothering to use it at that point?
24:00 interesting, so going from 12 to 16GB VRAM might be significant for RT regardless of AMD vs Nvidia
Please add RT with 1080p testing
If TSR looks better than FSR, you should put TSR against DLSS instead of FSR since I would choose the one that makes my game look better and thus TSR is the more realistic use scenario for AMD
It would be interesting to see RX 6900 XT vs RX 7900 GRE, especially in Ray Tracing.
Raytracing doesn't even look that good and when you account for the performance hit its just nmot worth it. ¯\_(ツ)_/¯
@@BlackJesus8463 light rt the performance is the same check GN review
@@BlackJesus8463 I don't like cube maps and SSR.
you can always turn on FG in the AMD settings menu even if the game doesn't have a toggle for it. So you can have FG in games like Cyberpunk on AMD hardware.
Damn 7900 GRE actually SLAMS the 4070 unless u like really high RT to the point where the 4070 is getting 30fps, the rasterization increases is like a whole teir leap for AMD and priced the same that’s some good comp from AMD, nvidia kinda has to lower 4070 price now
The Unique problem that I SEE in LATAM Even 7900 xt cost almost the doble price That the original 4070, I hope this helps to nvidia to improve their future gpus because amd now is having a good performance with raytracing 👀
Seeing the 4070 already running out of Vram for Ratchet and Clank at 1440p is a huge worry. I have a feeling that 12gb is not going to age well at all in the next year or so. Nvidia has to make 16gb the minimum from here otherwise consumers aren't going to have a bar of it. The greed needs to stop.
Thank you for the benchmarks Daniel, but man oh man! the game industry must be in really bad state if you still keep testing some of the already irrelevant or forgotten titles.
4070 and the 7900 GRE are somewhat what I can consider the actual minimum as seen from this video as playable for RT, even not relying on upscaling.
I have been wondering how long we should be waiting for RT to be playable without much performance penalty. Considering that is if RT is an option instead as a requirement (full RT rendering).
Maybe 2 generations later from both AMD and NVIDIA that I would consider acceptable for RT performance judging from this video, if we consider 20-30% RT performance uplift per generation.
I said consider many times…
As someone with a 7900XTX, those AMD drivers and weak features just mar your experience. Driver crashes, FSR2 artifacts, weak support for frame generation, etc.
AMD needs to work on its software this generation.
Love the videos, keep up the great work..
I farted baby
i still think that the 4070 super its the real competition of the 7900 gre, sure its more expensive but not by that much the performance difference would be less but all the rtx and dlss improvement would be way more than the 7900 gre
Hello, Daniel! IMHO it is not right to show on benchmarks results with DLSS FG and not at least mention that the AMD FG can be already used for any DLSS game that have FG.
I understand that it is not official feature but considering the fact that FG from AMD became opensource, in the benchmarks it should be mentioned that FG from AMD is available on games like Cyberpunk. If you read that, I really appreciate, thank you for your great job!
Nice vid. I was surprised at the RT performance. If there was something like this that was comparable to a 4080 with RT, I would have strongly considered buying that over an Nvidia GPU.
Would be interesting to see a 4070 Super vs the 7900 GRE in a head to head. In Europe and the UK both are quite comparable in pricing.
Daniel man, you're quickly becoming the most thorough tester in the space. Cheers.
There has to be something remedy can do to improve AMD gpu performance in Alan Wake since they definitely spent the time optimizing Nvidia performance, the way starfield was initially AMD leaning but then Nvidia performance improved a few weeks later
Hi Owen, thats what I have been saying for some time now, AMD has to target Nvidias RT performance on their GPUs. As AMD match on RT, and we know their performances is lower in RT, the result will be a RX card with about 30% above Nvidias in native resolution. Then, buyers will turn their heads other than only think in Nvidia for their graphic cards. As RT is becoming more and more a reality, other than just a show off, like in the begining, I feel that that is why, Nvidia cards, blew away the competition.
Thanks for the video! Loving the battle for the midrange cards now... Finally getting interresting and affordable-ish for PC gamers.
Better frame generation. Better upscaling. Better / cleaner ray tracing with ray reconstruction. The GRE is compelling but I would personally push the upscaling another notch and stick with the 4070. Realistically I would probably just look to spend the extra 50 to get the super though.
Software tricks aint performance
@@lifemocker85 If the frame looks better, then I don't care what the render resolution is. Also, I didn't say it was.
@@lifemocker85But Software is nowadays much more important than hardware
@@acadiachair I dont care what people say - they all look worse than native
@@HotFreshBox wrong. native is always the best without any added hoax
i just WANT DLSS. i hope FSR gets exactly as good as DLSS in every regards, then i would consider an AMD GPU.
upscaling is a godsend, and will be a godsend in the longevity of any GPU
but its cool that an AMD card of the same price beating some 4070 benchmarks which costs the same. however i would say that the 4070 is bad for 550. everyone should buy the 4070 super for 600 instead, which will beat every GRE result, while still using less power.
BUT 12gb VRAM feels bad. thats why i would save up some more money and buy at least a 4070ti super.
the GRE is using 50% more power for the same or 10% more performance. very bad efficiency, thats a fact. but most ppl wont give two fks about that so its just theoretical. it is a sign of superior nvidia chip maturity tho. i just dont lile the 4070 at 550. it should be 500.
"I just WANT DLSS"
"[sic] the 4070 is bad for 550"
That's your problem. You're so infatuated with Nvidia's software tricks that you'll complain about Nvidia's prices... and then go buy them anyway. Which tells Nvidia that they're doing the right thing. Because your whining means jack shit when your wallet doesn't back it up.
@@andersjjensenwhat the fck are you talking about? i wouldnt buy a 4070. i wouldnt buy a 4070S. if you get a 4070ti super for 800.- or a 7900XT for 720-730.-, i would buy the 4070TIS bc i want DLSS and 16gb VRAM is enough for me. for 1000.-i i would buy the 4080S instead of a 900-1000.- 7900XTX bc 16GB VRAM is enough for me and i want DLSS.
also the better efficiency is just crazy. im in EU, so its relevant for me how much power my stuff uses. i would def buy a X3D AMD CPU bc they are better and more efficient for gaming than Intels offerings. But in terms of GPUs, Nvidia can ask for more money than AMD bc Nvidia GPUs are better in every way in my opinion. except for the price.
Alan Wake 2 is probably an edge case tbh. It's such an insanely heavy game to run (even on Nvidia hardware)
Got the green light from the wife to get a gpu upgrade (so she can use my old one haha), we set a budget for 600 bucks and this one caught my eye.
Would it be a logical step to upgrade from a RTX3070 to the RX 7900 GRE?
Or would you recommend a RTX4070 Super?
Why is the 4070 so expensive? The 7700XT is almost 100$ cheaper btw
In Sweden 7900 gre and rtx 4070super priced same both good vs 7800xt/4070
The next video is definitely for me. Im using a 1080ti currently and want to upgrade. Im considering between 3080, 3080ti, 4070, and 4070 super. The game im playing most I use lots of mods in and it really kills the 1080ti. Not sure if the higher memory bandwidth of the 3080's might be better or not.
12gb gpus are just stupid buy in 2020s
It strikes me that there's unusually big upgrade available to AM4 people if they were to use 5800X3D and an RX 7900 GRE...let's say you're on a 2700X with a 2060RTX at the moment, it's kinda crazy how huge the performance gains would be just for swapping out that CPU, and getting this midrange GPU...it would launch that aging PC that's only just pushing 1080p low/med quality... right into the 120fps/ultra quality/1440p domain, at a cost that would be double or more a couple of years ago...and even then, you'd need to change other components potentially the whole system! I just can't decide if it's worth upgrading from a 5800X+3070 or wait for NVidia 5XXX gpus and let AM5 hardware fall in price a bit more.
Time is money at the end of the day, can't wait forever. Next gen could be as disappointing as this gen for all we know (probably will, let's be real)
It's a great option but definitely keep in mind it is consistently using about 50% more power which is ~100W more. If you game several hours a day that cost could add up to a few extra dollars per month. Not enough for me personally to care but something to keep in mind.
undervolt
@@lifemocker85true it will probably still be faster than the 4070
Love the review but the PowerCooler Hellhound is retailing much higher than all the other 7900 GRE's ... In the US it is retailing at or around 579.99. Which is almost the price of a 4070 Super. :P :|
Mofos in chat think their electricity rates are similar to current era Europe. Check your electricity rates peeps, they are largely negligible for a device like a PC in most countries.
being an ordinary person in Europe, 100W less power consumption as a big deal for me.
@@vvhitevvizard_ Being an ordinary person in Europe, I play like 2 hours a day.
@@vvhitevvizard_ Sucks to be you mate, hope it gets better in the future.
@@shk0014 Personal preference. I prefer power efficiency.
i received two dead rx6800 from bestbuy so forced myself to buy a 4070 even though i love amd, and now im questioning getting the GRE as its the same price
the power draw for AMD is like 33% higher constantly.. scares me a bit. just ordered the Nitro+ model anyhow ;D
3440x1440 on RTX 3060Ti wasn't great all the time. Hoping for some serious power now :p
but you got an 850w power supply though right? lol
@@BlackJesus8463That's actually what I have(850w)!
@@BlackJesus8463 checked. But cable mgmt is a mess, I do think I got an corsair rx 850.
For example 68xt is 200w gpu undervolted
@@lifemocker85 No. It's a 150W GPU undervolted. No wait, it's a 100W GPU if you just undervolt it enough. Seriously, stop being a bonehead. Every GPU is better if you tweak it... unless you lost the silicon lottery and got a chip that just barely made the cut. Then you can tweak and twerk all you want, and all you're going to get is a few percent.
I find it interesting to look at the power consumption numbers. I'm surprised how much wattage this cut down card pulls compared to a 4070/Super.
Alan Wake 2, seeing 195 vs 299 watts at some points its crazy how power inefficient the 7900GRE is
For example 68xt is 200w gpu undervolted
Power to performance is what you want to look at. 4070 super leads all GPUs in this category. 7900gre is middle of the pack.
@@JADC1111 12gb gpus you dont want to look at.
@@lifemocker85 I was actually sticking up for the GRE with the comments of ‘how inefficient it is’. IMO 1440 high settings is where both of the cards belong. Doubtful either card has issues running those setting for 3+ years
Going to wait for the next gpu comparision. Btw wich 4070 super do you have?. Its a founder?.
I can't wait to see the GRE with the memory overclocked! Should be very close to the 7900xt.
It would be very cool to see Helldivers 2 in your tests. Lots of people playing it, so I think it would be very interesting to see!
Some AIB models are going for 550$, Power Color Hellhound is going for 580$. Which is still better than what some people said (600+ for all AIB models).
All AMD has to do on pricing is line it up based on RT performance. From there, raster will be much higher per $ and always have more vram than nvidia (for the parts with more than 8gb anyway, but 8gb is a joke on both sides so..). The GRE is almost a no-brainer based on that alignment today with the 4070, unless you really want lower power draw of the 4070 or 4070S
Great video Daniel and think this is a good upgrade path for the 6800xt finally
FSR3 is out in Remnant 2 and IMO it looks pretty good. I'd bet your next video is gonna be about that.
Great reviews as always. Could you include MS Flight Simulator in your game test in the future? That would be awesome.
I like to imagine Daniel cornering his work colleagues in the school break room and reciting this info verbatim.
I wonder if the issue is purely vram or if it's the memory bandwidth. I've noticed plenty of issues with my 4070 super in Ratchet & Clank and Far Cry 6 at 4K. And I'm seeing that the vram is getting nearly maxed out. At least when it comes to allocation. It would be interesting to see the 4070 super compare to the 3080 TI. Since the 4070 super should be a little bit more powerful, but the 3080 TI has a higher memory bandwidth and they're both 12 GB cards.
Thank you AMD for releasing the GRE!! Gives me flashbacks to the R9 290X a little bit. We love wide GPUs with lots of cores and memory!!! Granted, we're losing some power efficiency, but AMD has improved their power efficiency a lot since the 290x obviously.