GeForce RTX 4070 Super vs. Radeon RX 7900 GRE, GeForce Premium Worth It?

  • Published: 22 Aug 2024

Comments • 1.4K

  • @Hardwareunboxed
    @Hardwareunboxed  4 месяца назад +133

    Update: I found a settings error in Helldivers 2, the results are incorrect, though the margins are much the same at 1440p and 4K between these two GPUs. The correct data can now be found in the TechSpot article here, apologies for the mistake: www.techspot.com/review/2826-geforce-rtx-4070-super-vs-radeon-7900-gre/

  • @markolumovic2750
    @markolumovic2750 4 месяца назад +696

    Steve's definition of vacation is weird, to say the least.

    • @AlexHusTech
      @AlexHusTech 4 месяца назад +27

      That's how you get to 1M subs, someone who loves his job/hobby, I need this level of dedication!

    • @Pedone_Rosso
      @Pedone_Rosso 4 месяца назад +11

      To be fair, he didn't say how much chocolate was consumed during benchmarking...
      (LOL!)

    • @darcrequiem
      @darcrequiem 4 месяца назад +7

      No weirder than his definition of a shed 😄 The man built a house single-handedly and shrugs like it was no big deal. Steve is just built differently than the rest of us.

    • @BrandensOutdoorChannel
      @BrandensOutdoorChannel 4 месяца назад +1

      Holiday*

    • @Lodinn
      @Lodinn 4 месяца назад

      Is it so uncommon though?.. Most people I know work through their vacations/holidays, maybe at like half capacity but still

  • @SaviorInTheSun
    @SaviorInTheSun 4 месяца назад +29

    2 GPUs over 58 games?!?!?! You're a madman, Steve, and I appreciate you and your team so much!

  • @mc_sim
    @mc_sim 4 месяца назад +493

    Got a GRE a few days ago. Pretty happy with it. +50% gain over my 3060 Ti.

    • @PlainsSights
      @PlainsSights 4 месяца назад +38

      Ordered it yesterday, also to replace my 3060 Ti; that 50% sounds reassuring.

    • @walter1824
      @walter1824 4 месяца назад +59

      @@Scott99259 is DLSS really a 50% boost tho?

    • @mc_sim
      @mc_sim 4 месяца назад

      @@Scott99259 I am not a marketing victim and I don't need fake performance from DLSS or FSR or any other 'frame generation'. I want pure rasterization power and true native-resolution quality. And don't tell me that DLSS looks the same or doesn't add lag; that's physically impossible.

    • @Hito343
      @Hito343 4 месяца назад +35

      Got a GRE too, the Hellhound. Amazing card, probably the quietest and coolest-running card I've had so far. I don't use any upscalers, they look like shit, though acceptable at 1440p.

    • @Scott99259
      @Scott99259 4 месяца назад +3

      @@walter1824 Check Hardware Unboxed's DLSS vs native comparison.

  • @oktusprime3637
    @oktusprime3637 4 месяца назад +580

    It's important to remember that the 7900 GRE's memory is kind of underclocked out of the box. You can gain massive performance improvements with a memory OC, over 10% in a lot of cases.

    • @ChrispyNut
      @ChrispyNut 4 месяца назад +6

      Indeed, though I'm not sure that was fully available during testing, was it? 🤔

    • @NeverSettleForMediocrity
      @NeverSettleForMediocrity 4 месяца назад +59

      You are delusional; the gains are minimal, which doesn't change anything about this comparison.

    • @keerthan7558
      @keerthan7558 4 месяца назад +34

      10% isn't massive lol

    • @markgutierez9922
      @markgutierez9922 4 месяца назад +272

      ​@@NeverSettleForMediocrity Nvidia Shill detected.

    • @Axisoflords
      @Axisoflords 4 месяца назад +196

      @@NeverSettleForMediocrity Given that the difference between the two averaged just 1%, getting 5-10% free performance actually does change things ever so slightly. And it's still the cheaper product.
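
A quick back-of-the-envelope check of the memory-overclock claim in this thread. The stock figures below (256-bit bus, 18 Gbps GDDR6) match the GRE's published configuration; the 20 Gbps value is only an assumed overclock, and real frame-rate gains are usually smaller than the raw bandwidth gain.

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: number of pins * per-pin data rate / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

stock = bandwidth_gb_s(256, 18.0)  # RX 7900 GRE at stock: ~576 GB/s
oc = bandwidth_gb_s(256, 20.0)     # assumed ~20 Gbps memory OC: ~640 GB/s
print(f"stock {stock:.0f} GB/s, OC {oc:.0f} GB/s, +{oc / stock - 1:.0%}")  # about +11%
```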

  • @Anukinihun
    @Anukinihun 4 месяца назад +284

    The 7900 GRE, in my opinion, is the best high-end midrange card this generation. You get last-gen top-end performance at just $550, with a lot of overclocking potential and 16GB of VRAM.

    • @zzavatski
      @zzavatski 4 месяца назад +6

      Too many fell for Super branding. Regular 4070 also has 12GB and is less overpriced.

    • @hansolo631
      @hansolo631 4 месяца назад +39

      I have a 4080 and despite the cost, I've been happy with it the last few years. But I think AMD straight up made better cards this gen. Maybe the last one too tbh. The 6600xt, 6800, and 6800xt were monsters for their price.
      The 7900 XTX is like 80% of the 4090's performance for what, 50% of the cost? IDK. If you guys can run ray tracing at 280 Hz, then fair enough. But come the incoming gen, I am going to have a very, very good look at whatever AMD comes out with.

    • @sstier48
      @sstier48 4 месяца назад +11

      @@zzavatski No, the Super is a better value ($ per fps).

    • @zzavatski
      @zzavatski 4 месяца назад +1

      @@sstier48 When a game requires more than 12GB, both have the same (unplayable) performance. That's why the cheaper one is the better deal.

    • @moc5661
      @moc5661 4 месяца назад +1

      ​@@hansolo631 the 4080 hasn't even been released for a "few years" lol unless you're talking about the 3080..

  • @raute2687
    @raute2687 4 месяца назад +72

    these deep dives are always super interesting to watch. Good job and thank you!

    • @raifuzkan3971
      @raifuzkan3971 Месяц назад

      So you recommend a 7900 GRE over a RX 570? :) ... I need to change my GPU

  • @steviebhoy25
    @steviebhoy25 4 месяца назад +315

    As a 3070 owner who fell for the ray tracing waffle, I've enabled it in 3 games and then turned it off. Ray tracing seems to be for the spectator, as I've never really noticed it while playing; what I did notice was the massive hit to fps, and that's why the feature stays off in my case. Also, 8GB of VRAM is not enough now, the same way 12GB won't be enough in 2-3 years.

    • @dantae666
      @dantae666 4 месяца назад +31

      Exactly, I'm on a 2070 and can see it. I'm going XTX for the VRAM amount.

    • @Flowave-cc5ul
      @Flowave-cc5ul 4 месяца назад +7

      Bro, same, I have a 3070. I just played Guardians of the Galaxy at 4K ultra plus DLSS Quality, getting 75-89 fps.
      And the graphics are incredible! Very tiny frame drops don't hurt my 8GB of VRAM or my enjoyment of the game. Highly recommended, by the way.

    • @lawyerlawyer1215
      @lawyerlawyer1215 4 месяца назад +29

      You guys have GPUs that can't handle RT properly, so you make massive performance sacrifices to get it and of course don't see the win.
      Try games with good RT implementations on a 4090 and then come back and tell me it's not a night-and-day difference.

    • @han1508
      @han1508 4 месяца назад +12

      RT not being as big as it should be today is not the graphics card's fault. Blame the developers, who would rather spend resources and time optimizing monetization and marketing.

    • @bal7ha2ar
      @bal7ha2ar 4 месяца назад +87

      @@lawyerlawyer1215 So I should buy a 4090 to see that the ray tracing quality in most games is just as shit as with my old card, but now with double the fps? Uh, no thanks, I'll just leave RT off and be happy.

  • @aeropb
    @aeropb 4 месяца назад +101

    This seemed like a lot of work. Thanks for all you do.

    • @hassosigbjoernson5738
      @hassosigbjoernson5738 4 месяца назад +1

      Nah ... you cannot talk of work when the GPU's do all of it by themselves! =)
      [Kappa]
      Sure he had plenty of time searching for easter eggs ...

  • @siddhantkamble9502
    @siddhantkamble9502 4 месяца назад +28

    That was a lot of testing. Thank you for your work Steve.

  • @brucethen
    @brucethen 4 месяца назад +64

    Steve's idea of relaxing sounds like hard work. Over the Easter break I was at a friend's 60th birthday party, and a great time was had by all.

  • @mercurio822
    @mercurio822 4 месяца назад +630

    No, the Nvidia premium is not worth it when you only get 12GB of VRAM.

    • @The_Noticer.
      @The_Noticer. 4 месяца назад +52

      Nvidia's "premium" is more mindshare than anything else.

    • @joelconolly5574
      @joelconolly5574 4 месяца назад +102

      It's enough for 1440p gaming

    • @gunturbayu6779
      @gunturbayu6779 4 месяца назад

      ​@@joelconolly5574 same mindset as people who bought 8gb 3070/ti , look at them now lol

    • @gcz4457
      @gcz4457 4 месяца назад

      ​@@joelconolly5574 Maybe now, but in the future?

    • @CrazO9111
      @CrazO9111 4 месяца назад +70

      AMD needs 1-2GB more VRAM in-game as well, don't forget that. The Nvidia card is fine for 1080p/1440p for the next 3 years.

  • @t5kcannon1
    @t5kcannon1 4 месяца назад +67

    Great content. Thanks Hardware Unboxed!

  • @sergioav7278
    @sergioav7278 4 месяца назад +76

    I think it's important to point out whether the 7900 GRE was memory overclocked or not, since the uncap update gave this GPU a significant performance increase.

    • @tommysimonsen4043
      @tommysimonsen4043 4 месяца назад +14

      He was on vacation before the new driver came out with the VRAM OC, so this was done with no VRAM OC.

    • @wowzer4919
      @wowzer4919 4 месяца назад +1

      It did not give it a significant performance update for everybody.
      Only if you have the hynix memory.
      A lot of us can't even reach 2400 stable.
      So we're still pretty much 2316 fast timings. Nothing has really changed.
      And this is actually much more common than you think.

    • @aaronbulmahn7715
      @aaronbulmahn7715 6 дней назад

      @@wowzer4919 Why does it only increase performance with a hynix memory?

  • @lucidnonsense942
    @lucidnonsense942 4 месяца назад +255

    12GB vram for 600$? Yeah, hard pass...

    • @ScrayaZ
      @ScrayaZ 4 месяца назад +27

      perfectly fine for wqhd

    • @PSXman9
      @PSXman9 4 месяца назад +74

      I'd rather take 12GB on a functional GPU with basically flawless drivers than the absolute abomination that RDNA3 is... (and I own a 7800 XT and 7900 XTX)

    • @nexii1479
      @nexii1479 4 месяца назад +2

      How about ultrawide?​@@ScrayaZ

    • @enragedbacon470
      @enragedbacon470 4 месяца назад +40

      kekw you probably aren't buying either of them, you're just here to complain and validate your salt

    • @isaacx593
      @isaacx593 4 месяца назад +14

      Agreed, should have been 16gb

  • @brad4690
    @brad4690 4 месяца назад +27

    I went with the 7900 gre. I saved $50, got more vram for longevity, and I don't have to worry about that Nvidia power connector. If Nvidia had put that other 4gb of vram in there, they would have had my money.

    • @asdf_asdf948
      @asdf_asdf948 4 месяца назад +5

      Now overclock the memory for a free 10% performance

    • @SmellsLikeNirvanna
      @SmellsLikeNirvanna 3 месяца назад +3

      There’s basically 0 cases of connector issues below 4090, a 4070 super with like 200 watt power consumption will never have an issue

  • @ThePurplePassage
    @ThePurplePassage 4 месяца назад +62

    given how well the GRE can respond to overclocking, I think the benchmarks ought to have included this (although I get there is a separate video on this already)

    • @NeverSettleForMediocrity
      @NeverSettleForMediocrity 4 месяца назад +2

      It responds badly to overclocking due to the silicon lottery; don't count on more than 5%, plus instability in different titles.

    • @markgutierez9922
      @markgutierez9922 4 месяца назад +22

      @@NeverSettleForMediocrity lol, regardless of OC performance, the ray tracing and DLSS premiums aren't worth it if the GPU has only 12GB. The only GPU worth buying is the 4090, for, you know, $1500.

    • @MassivePaper
      @MassivePaper 4 месяца назад +1

      My GRE memory crashes on anything over 2350MHz, so yeah, basically stock. Got the core to work at 2600-2700 but it eats like 330 watts at max load.

    • @colbyboucher6391
      @colbyboucher6391 4 месяца назад +5

      @@markgutierez9922 They said literally nothing about Nvidia, just stated a fact about the cards. The GRE is cheaper because it's using lower-quality bins, so the lottery is real. I get the temptation to shit on the big monopoly at every opportunity, but this comment kinda just adds nothing.

    • @markgutierez9922
      @markgutierez9922 4 месяца назад +1

      @@colbyboucher6391 We watched a video comparing two GPUs; all comments are made with that context in mind. His comment is basically saying there is no point in overclocking the 7900 GRE because you're not going to see any noticeable performance gain, and I point out that Nvidia is still overpriced. So we both point out negative aspects of one of the GPUs presented.

  • @alti6942
    @alti6942 4 месяца назад +97

    The 7900 GRE with a memory OC hits the same figures as a 7900 XT (~10-15% increase on average) for $550 USD, with 4 extra gigs of VRAM. I think it's a shame that it wasn't taken into consideration more heavily for your final thoughts, especially since the only technical drawback of the GRE itself is its memory throughput compared to its much beefier CUs, and it's also relatively easy to do in AMD's own software.

    • @hassosigbjoernson5738
      @hassosigbjoernson5738 4 месяца назад +3

      The 7900 GRE having 4 gigs less VRAM than the 7900 XT is indeed another technical drawback.

    • @mihirojha4475
      @mihirojha4475 4 месяца назад

      ​@@hassosigbjoernson5738 I mean 16 GB is more than enough even for 4k gaming on most titles.

    • @Birdman._.
      @Birdman._. 4 месяца назад +2

      You could overclock the RTX 4070 Super too, and it still matches an RX 7900 GRE (in Hardware Unboxed's video).

    • @tommysimonsen4043
      @tommysimonsen4043 4 месяца назад +1

      @@Birdman._. I don't know which video you watched, but in the newest video the 7900 GRE is faster than the 4070 Super when both cards are OC'ed: ruclips.net/video/q5tbCbm1IYM/видео.html

    • @ulquiorrasciffer04
      @ulquiorrasciffer04 4 месяца назад +3

      Yeah, but no amount of OC can get close to Nvidia's rendering speed. That 50 bucks difference is damn well worth it, especially if you do both work and games.
      And Nvidia is more worth it in my country, since they are priced the same as AMD cards.

  • @coyotegoot
    @coyotegoot 12 дней назад +2

    great info on both. ended up going w the asus 4070 super dual. BTW your video quality is frkn crisssssp!

  • @SilverforceX
    @SilverforceX 4 месяца назад +50

    I don't give a rat's about ray tracing. It's still way too heavy on performance for the minor visual gains, and in many titles the shadows aren't even better with RT, just different.

    • @Frozoken
      @Frozoken 4 месяца назад +6

      Yep, they need to learn that the most exciting features are the ones with little to no downsides. This is why barely anyone actually cares about or uses features like ray tracing and frame gen; there's way too much of a trade-off with both. That's the same reason I love Reflex so much: while the benefit is mild to moderate, there's literally zero downside, so I can always enable it no matter what.

    • @noahflare6825
      @noahflare6825 4 месяца назад +3

      💯

    • @pelpiBro
      @pelpiBro 4 месяца назад +3

      Someone has AMD and can't play games with ray tracing, so mad.

  • @no_ideaman
    @no_ideaman 4 месяца назад +11

    Thank you, sir, for apparently being the only media outlet I know of that includes comprehensive DLSS details. I was deciding between exactly these 2 cards for a 3-4 year lifespan, and DLSS was exactly the hard-to-answer yet all-important question I needed answered.

  • @vvhitevvizard_
    @vvhitevvizard_ 4 месяца назад +65

    The 12GB on the 4070S is a big no-no.
    Nvidia could have made the RTX 4070 Super much better if they had given it a 256-bit bus width and 16GB of VRAM.
    With the 4000 series, Nvidia lost touch with reality. The 4060, 4060 Ti 8/16GB and 4070 (Ti) just feel like a rip-off.
    And the RX 7900 GRE would be more compelling at $500.

    • @exception6651
      @exception6651 4 месяца назад +12

      I'm sorry, but most people that buy these cards won't be playing in 4K or using max settings, and even if they are, they'll be using DLSS anyway. In new games 12GB will not be an issue; it's very rare that I use over 12GB on my 4090 at 4K 🤷‍♂️

    • @vvhitevvizard_
      @vvhitevvizard_ 4 месяца назад +10

      @@exception6651 12GB is not enough for frame generation, RT, etc., even at 1080p very high settings. Also, more VRAM is required for AI-related stuff.
      TODAY. And it's definitely NOT futureproof.

    • @itouchgr4ss
      @itouchgr4ss 4 месяца назад +10

      @@vvhitevvizard_ May I know what game requires more than 12GB of VRAM at 1080p? I own a 4070 Ti. So far, the games I've played have never exceeded 12GB of VRAM at 1440p (I don't have a 4K monitor). Even if memory usage exceeded 12GB, the 4070 Ti isn't capable of running most games at 4K anyway. The same applies to the 7900 GRE: even if the VRAM is sufficient, it can't do 4K either.

    • @angeltzepesh1
      @angeltzepesh1 4 месяца назад +2

      The Nvidia rip-off started way before the 4000 series, just look how bad the RTX 2000 series was, it was basically the 1000 series with RT cores while increasing the price. 3000 and 4000 series are good in a vacuum, but they suffer from gimped memory configurations and overprice, most of the lineup at least.

    • @kraithaywire
      @kraithaywire 4 месяца назад +3

      @@vvhitevvizard_ then how the hell is 4070 Super outperforming the 7900GRE at 1440p and 4k with ray tracing????????????

  • @selohcin
    @selohcin 4 месяца назад +49

    I've watched a thousand Hardware Unboxed videos over the years, but never until today have I watched one during a week when I was actually planning to buy a GPU. I'm honored that Steve just happened to put in such an enormous amount of work on the one day I really needed it. I'm going AMD. Thanks, Steve.

    • @lawyerlawyer1215
      @lawyerlawyer1215 4 месяца назад +3

      You paid attention and still made the wrong choice going AMD? Yikes.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 4 месяца назад +3

      Great choice. Personally I would have gotten Nvidia if they didn't ask shit prices for cards with specs that just aren't good. I like AMD's software a lot, more user friendly.

    • @josejuanandrade4439
      @josejuanandrade4439 4 месяца назад +2

      So you didn't watch the video then? The GRE loses at everything but 4K, and in lots of titles the performance at 4K isn't good... just better than the 4070 Super.
      Then you also lose at upscaling, ray tracing, and power consumption.
      AMD trained their fanboys well LOL

    • @dare2liv_nlove
      @dare2liv_nlove 4 месяца назад +8

      @@josejuanandrade4439 You mean, Nvidia trained their fanboys well (judging by your comment). As Steve clearly said at the end of the video, "you could go either way" depending on what you are prioritizing in your purchase.

    • @pluschy7886
      @pluschy7886 4 месяца назад

      @@dare2liv_nlove Listen, I like both sides, AMD and NVIDIA, but in this matchup I probably would've gotten the 4070 Super. If it was the 4070 Ti Super vs the 7900 XT, then I would've gotten the 7900 XT. Both sides are good, with pros and cons.

  • @pkerch00b1
    @pkerch00b1 4 месяца назад +64

    Managed to snag a 7900gre for 510 euro. Seems like a good deal.

    • @izidor
      @izidor 4 месяца назад +11

      good deal mate. Well done. Bought mine a week ago for 560€, no regrets tho. The card is a beast, specially in OC.

    • @PillowOfEvil
      @PillowOfEvil 4 месяца назад +1

      Sounds like mindfactory prices. I can't find anything in my country under 670€

    • @Sirpesari
      @Sirpesari 4 месяца назад

      Would love one for that price, in Finland they're unfortunately starting at 649

    • @PillowOfEvil
      @PillowOfEvil 4 месяца назад

      Sucks. I could find some for around 630€, but immediately after the oc unlock, prices jumped up.

    • @urgay1992
      @urgay1992 4 месяца назад

      Yeah both the cheapest 4070 super and 7900 GRE are 650€ here.

  • @HeirofCarthage
    @HeirofCarthage 4 месяца назад +8

    These benchmark videos are awesome. The hard work put in definitely shows in the result. Also so helpful when it comes to choosing parts. Helps someone see what is right in their scenario.

  • @chrisfanning5842
    @chrisfanning5842 4 месяца назад +7

    I think your wider analysis of all the existing RT-enabled games is going to be really important. I've owned an RTX card since the very start, and I've found only two games where there was even a noticeable visual improvement for the (considerable) framerate hit. In the 75-ish games Steam claims I've played since buying a first-gen RTX card, I didn't bother playing with RT effects enabled; I merely experimented with settings and decided that the massive hit to performance detracted from the experience far more than the slightly better visuals added to it.

  • @Extreme96PL
    @Extreme96PL 4 месяца назад +13

    The DLDSR + DLSS combo alone is worth it; it's the only way to make a game with bad TAA not look like a blurry mess at 1080p. The difference in games like RDR2 or Jedi Survivor is like night and day, and on top of that DLDSR also helps when a game is aliased af.

    • @someasianguy8493
      @someasianguy8493 4 месяца назад +2

      Yep, people don't realise this. If you don't have DLDSR + DLSS, you are forced to increase the render scale and increase the sharpening in games like RDR2 if you want to reduce the blurriness of the awful TAA implementation.

    • @Extreme96PL
      @Extreme96PL 4 месяца назад

      @@someasianguy8493 Increasing the render scale increases the resolution, which is roughly what DLDSR does, but without the additional improvements that DLDSR and DLSS make to ensure image stability through AI. Sharpening isn't really the solution, as there's still blurring that gets worse as you start moving; plus, it's possible that sharpening will reveal what TAA is trying to hide. So far, the only way I've found to fix the blur is to increase the resolution, which makes the image more stable and less blurry. DLDSR + DLSS is best because both are intended to improve image stability and reduce TAA blur, but if you can't use it, the resolution scale will also improve the quality; without DLSS, though, the performance hit will be higher.
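
For anyone trying to follow the combo described in this thread, here is a minimal sketch of the resolution pipeline. The 2.25x DLDSR factor and the roughly 0.667 per-axis DLSS Quality scale are the commonly cited values and are treated here as assumptions.

```python
import math

def dldsr_plus_dlss(native=(1920, 1080), dldsr_area_factor=2.25, dlss_axis_scale=2 / 3):
    # DLDSR's factor multiplies the *area*, so each axis scales by sqrt(factor).
    axis = math.sqrt(dldsr_area_factor)
    target = (round(native[0] * axis), round(native[1] * axis))                        # what the game outputs
    internal = (round(target[0] * dlss_axis_scale), round(target[1] * dlss_axis_scale))  # what it renders
    return target, internal

print(dldsr_plus_dlss())  # ((2880, 1620), (1920, 1080))
```

In other words, the game renders at roughly native 1080p but is reconstructed to 1620p and then downsampled to the display, which is why the TAA blur improves.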

  • @timothywilliamz007
    @timothywilliamz007 3 месяца назад +6

    Love the information. Maybe a dark grey background behind the graphs would be more appealing and easier on the eyes? Just a suggestion!

  • @anthonyc2159
    @anthonyc2159 4 месяца назад +18

    7900 GRE: 16GB VRAM on a 256-bit bus vs 4070 Super: 12GB VRAM on a 192-bit bus. Which one do you think will age better? And even Steve admits ray tracing is still a joke in most games. It is NOT worth losing 40% of your FPS for some 'enhanced reflections' that I can barely notice during gameplay. If Nvidia had a bit of shame, they would be saving RT for Blackwell, since full path tracing brings even the 4090 to its knees. 😮‍💨

    • @w04h
      @w04h 4 месяца назад +7

      What will age better? A GPU that uses hardware upscaling, from a company that consistently updates it and introduces new features every few months, compared to a company that let their sh1t software upscaling rot with no major updates for over a year, so badly that Intel, a competitor that just entered the GPU market, already has a better-looking hardware and software solution? Tell me.
      Even as a programmer who likes using AMD's Xilinx products, I have to say they have been stagnating pretty badly on the software and feature side.

    • @ranjitmandal1612
      @ranjitmandal1612 4 месяца назад

      😮

  • @sparkplugbarrens
    @sparkplugbarrens 4 месяца назад +75

    It's great that you clarify how ray tracing drops the frame rate to a subjectively unplayable experience in most titles at 1440p, even with the 4070 Super.

    • @AndyViant
      @AndyViant 4 месяца назад +24

      I remember buying a 2070 Super over a 2060 Super in the hope of playing ray traced games with it.
      Oh, how I laugh about that now.

    • @oktusprime3637
      @oktusprime3637 4 месяца назад +15

      That's complete bullshit. Most RT titles run great at 1440p on the 4070S. They fall apart at 4K, but the 4070S is not a 4K+RT card anyway.

    • @SimplCup
      @SimplCup 4 месяца назад +13

      "in most titles" which were like 3, one of them unoptimized as shit (ratchet n clank), another is Cyberpunk which literally has real life lighting option basically and the most insane graphics in gaming history, and hogwarts legacy. And they all can be run with dlss/fsr and fg at at least 60-100 fps. while in reality most titles run smoothly at 1440p ultra ray tracing setting at 70-150 fps even without dlss and fg (tho i still turn them on, cus TSR and TAA AA solutions look horrendous compared to dlss quality mode, and fg is just free fps basically). Both rtx 4070 super and 7900 gre showed great results.

    • @ALLInOne-vq5em
      @ALLInOne-vq5em 4 месяца назад +9

      Nope, they're not. Take Nvidia titles like Alan Wake 2; the 4070 Super won't hit 60 fps at RT high on 1440P. Same deal with Cyberpunk 2077. Other games don't hit GPUs as hard, though, for both AMD and Nvidia.

    • @SimplCup
      @SimplCup 4 месяца назад +6

      @@ALLInOne-vq5em yes, but is it "most titles"? no, it's just a couple very demanding games, or unoptimized. 99% of games will run perfectly fine on 4070 super and 7900gre, some even at 4k.

  • @ZarathosDaimaoh
    @ZarathosDaimaoh 4 месяца назад +3

    Nice, but sadly the deciding factor for me seems to be, as usual with AMD and the Radeons, the driver issues. To be fair, however, they've made a lot of progress, and AMD's companion software is more in-depth and useful than Nvidia's tools. And some issues with a few titles were mostly about shady deals with Nvidia.

  • @richardnpaul_mob
    @richardnpaul_mob 4 месяца назад +10

    Yeah, the non-overclocked GRE isn't great in this match-up; pushing the memory interface, though, really lets it accelerate away from the 4070S in the benchmarks I've seen (and that was OC vs OC). I've tried RT and was not blown away by it given the free marketing everyone has been giving it, so I'll be interested in watching the future videos.

    • @chriswright8074
      @chriswright8074 4 месяца назад

      He needs to test more games that don't favor one or the other.

    • @richardnpaul_mob
      @richardnpaul_mob 4 месяца назад +7

      @@chriswright8074 By saying that, you're not appreciating the amount of work that goes into testing the 25-odd games (over 50 game tests when including FSR/DLSS and RT) that he already tests.

  • @m4d.d0x
    @m4d.d0x 4 месяца назад +7

    Thank you for the great info. I just really think there should have been some OC numbers for the 7900 GRE, since I think most will be running it with some degree of OC, even more so with the new drivers and the memory unlock allowing way more than we were able to before. Other than being a bit more power hungry, about 100W more than the 4070 Super, the difference is quite noticeable and there are a lot of frames to be gained from what I've seen. Then again, thank you for all the work and the info on so many tests for these cards!

  • @dazzlerweb
    @dazzlerweb 4 месяца назад +9

    We appreciate the work involved.

  • @poppinbubblez702
    @poppinbubblez702 4 месяца назад +19

    Thank you guys for all your hard work!

  • @richardwinstanley8219
    @richardwinstanley8219 4 месяца назад +26

    £600 for abysmal ray tracing performance is just not worth it. Paying a premium for a compromised experience is such a poor decision.
    I would rather get the 7900 GRE and not enable Ray Tracing, instead of paying more just to say "I play with RT" at a low FPS.

    • @itouchgr4ss
      @itouchgr4ss 4 месяца назад +3

      I don't care about ray tracing, but for upscaling DLSS is way better than FSR. If AMD can somehow match Nvidia with their RX 8000 series, then I'm switching to AMD.

    • @mick7727
      @mick7727 4 месяца назад +4

      One important other thing is local Stable Diffusion. AMD just can't do it at an acceptable level.

    • @Nick-nf1kd
      @Nick-nf1kd 4 месяца назад +3

      I play RT on a 3060ti. Man I love 30 fps and stuttering from running out of vram, so fun!

    • @vladimirdosen6677
      @vladimirdosen6677 4 месяца назад +6

      @@itouchgr4ss People pretending that Nvidia is selling mist never cease to amaze me. They have proven time and again that their software is superior. Just because their rasterized performance is sometimes slower (by a small margin) than the AMD counterparts doesn't mean they don't have an overall well-balanced product.

    • @Superdazzu2
      @Superdazzu2 4 месяца назад +5

      ehm 1440p dlss quality 60+ fps with everything maxed including ray tracing is a compromised experience? LOL

  • @seamon9732
    @seamon9732 4 месяца назад +3

    Thrilled that you added Helldivers 2, the game is taking the gaming world by storm and is played so much!

  • @mikesunboxing
    @mikesunboxing 4 месяца назад +11

    Just swapped my 7800xt for the msi Rtx 4070 super slim and don’t regret it

    • @silvio351
      @silvio351 4 месяца назад +5

      You'll regret it, just not today.

    • @lucaschenJC
      @lucaschenJC 4 месяца назад +5

      Congrats 4070 super is great

    • @mikesunboxing
      @mikesunboxing 4 месяца назад +3

      @@lucaschenJC yeah for me it was a better choice for my work as rasterisation in games is less important than a smooth timeline in premiere

    • @Danny-ex3tj
      @Danny-ex3tj 4 месяца назад

      @@silvio351 I got a Radeon 7800XT previously and had serious driver issues: it timed out twice in an hour with a black screen.
      DDU in safe mode, reinstalled the latest drivers (tried older and newer ones), RMA'd and received another dud, and checked that all my connections were plugged in properly.
      Do you know how to solve this problem? It's driving me crazy; in fact, I'm not the only one with this issue (you can easily Google it).
      I previously had a 6800XT (XFX, RMA'd twice)
      and a Zotac 1080Ti (artifacting, so can't RMA since the warranty expired).
      My current specs:
      7800x3d
      2x16GB 6000MHz (EXPO enabled) Team Group RAM
      TUF GAMING B650-PLUS WIFI
      Corsair RM850e
      2TB 980 Pro (Game drive)
      In my country it cost converted ~USD$50 more to get RTX 4070 super, so was thinking about it.
      I already regret my 7800XT many months ago. Was thinking about 7900XTX but even people reported driver timeout issue...

  • @SaviorInTheSun
    @SaviorInTheSun 4 месяца назад +1

    If ANYONE gets upset by Steve and Tim saying "It depends" when someone asks them what GPU to get, please refer them to this video. It truly matters what games someone is going to play and what resolution you're looking to display. This video clearly demonstrates that unless you're willing to pay a huge premium, there really isn't a one-size-fits-all approach to GPUs.

  • @Neonagi
    @Neonagi 4 месяца назад +11

    I ended up going with the 4070 super because it has less of a power draw, and for very small case builds this is important to keep heat and noise levels down.

    • @alexchavez3244
      @alexchavez3244 20 дней назад

      I hope your card can handle games, because 12GB is becoming obsolete, like 16GB of RAM.

    • @Neonagi
      @Neonagi 20 дней назад

      @@alexchavez3244 Hey there, no problems with any of the latest big games. I play on 1440p 165Hz though so I don't need large ram cache for 4K textures. My next big upgrade will be to 4K 240Hz one day when a single GPU can handle it.

  • @Shannon-ul5re
    @Shannon-ul5re 4 месяца назад +6

    Most players are using 1080p or 1440p displays, isn't 12GB of VRAM enough for that resolution?

    • @wizardfire555
      @wizardfire555 2 месяца назад +1

      Yes

    • @anhiirr
      @anhiirr 24 дня назад

      Yep, and most broke bois like to parrot 'talking points' as if they've ever owned, tuned or run flagship-range products, NGL. People who parrot BS talking points and whataboutism about played-out scenarios they'll never even encounter themselves are just QQ'ing to make an excuse for not upgrading to what they wish they could run. Meanwhile, I'm the guy looking at 38" ultrawides, direct-drive racing wheels, the whole shebang, on account of gaming interests. Played CP2077 and RDR2 on PC at 21:9 on day 1; I've had a 2080 Ti, 5700 XT, 3070, 3080 and 4070 Ti as far as recent faster GPUs go, among the budget-range cards and laptops I've owned, tested and sold, and I owned a 7990 before that. Nothing comes across as more obvious to me: people gripe about minuscule BS that in reality never crosses the mind of people with these actual builds and nice rigs.

  • @Nightss10
    @Nightss10 4 месяца назад +6

    I'm really surprised there was no overclocked comparison at the end of this, even just on 3 titles. It seems to make quite a big difference, actually. Great work as always Steve! Been here since before covid!

    • @jakeny7399
      @jakeny7399 4 месяца назад +1

      Not surprised here. Overclocking capability varies widely depending on the silicon the cards shipped with. You can have two 7900 GREs / 4070 Supers from the same manufacturer with very different overclocking results.

  • @emelyanychcool3715
    @emelyanychcool3715 4 месяца назад +2

    The 4070S consumes about 200W, the GRE about 300W for a few FPS more; not worth it.
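
Taking the wattage gap claimed above at face value (those are the commenter's figures, not measured data), a rough running-cost sketch with assumed usage and electricity price:

```python
delta_watts = 300 - 200   # claimed gap between the two cards under load (assumption)
hours_per_week = 20       # assumed gaming time
price_per_kwh = 0.30      # assumed electricity price

kwh_per_year = delta_watts / 1000 * hours_per_week * 52
print(f"{kwh_per_year:.0f} kWh/year, roughly {kwh_per_year * price_per_kwh:.0f} per year at these assumptions")
```

Whether that, plus the extra heat and noise, outweighs any raster lead is a personal call.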

  • @mick7727
    @mick7727 4 месяца назад +1

    What Easter break? I work in a 24-hour service lmao. Steve will hear me some day when I'm screaming STABLE DIFFUSION.

  • @Uthleber
    @Uthleber 4 месяца назад +13

    You should only buy a 7900 GRE if you are willing to (memory) OC it. otherwise just take the 7800 xt for the AMD side.

    • @jimmybobby9400
      @jimmybobby9400 4 месяца назад +10

      You just move a slider and click a button.

    • @Uthleber
      @Uthleber 4 месяца назад

      @@jimmybobby9400 and even this, could be a challenge for some folks 🙂

    • @andersjjensen
      @andersjjensen 4 месяца назад

      It generally takes a pretty good core bump too. I haven't heard of silicon so bad that people can't get 2700MHz with a 1010mv baseline and a +15% TDP.

    • @pauloazuela8488
      @pauloazuela8488 4 месяца назад

      @@andersjjensen So far the GRE's bad overclocking wasn't silicon related but down to driver bugs; AMD released a driver that should be the fix, and I've seen videos showing that it fixes it. But I do not know how stable it is.

    • @andersjjensen
      @andersjjensen 4 месяца назад

      @@pauloazuela8488 The numbers above are post "fix" numbers. You could do it before, but you had to use PowerTables via the registry.

  • @norbertnagy4468
    @norbertnagy4468 4 месяца назад +27

    Finally Helldivers 2!!!!!
    Thanks Steve

  • @tibordavid6716
    @tibordavid6716 4 месяца назад +2

    I managed to get a 4070 super for $569 a few days ago. I am very satisfied with it. Both cards are suitable, but because of DLSS and RT, it feels like a good decision for 1440P.

  • @ktvx.94
    @ktvx.94 4 месяца назад +2

    If the new FSR really gets 80-90% of the way there compared to DLSS, the GRE is a no-brainer. You'll only be really upscaling to 4K where the better samples mitigate its issues, maybe to 1440p just to save energy. If it doesn't I'm slightly inclined towards the GRE.

  • @user-qt6qe2vd1q
    @user-qt6qe2vd1q 4 месяца назад +28

    In my opinion you should add Witcher 3 in RT tests, difference there is huge in visuals.

    • @kraithaywire
      @kraithaywire 4 месяца назад

      True! And 4070 Super should be a lot faster in that game with RT.

    • @band0lero
      @band0lero 4 месяца назад +2

      @@pravnav Only on Novigrad, where it is heavily CPU limited. On more graphics intensive scenario it can give a pretty good indication of RTX performance. It is pretty heavy, but so is every Ray Tracing implementation that makes a difference.

    • @evanhooper1
      @evanhooper1 4 месяца назад +6

      @@band0lero The thing about the TW3 next-gen update is that now, even with RT turned off, the performance is much worse than the 1.32 'last gen' version. The DirectX 12 implementation was horribly done.

    • @kren4449
      @kren4449 4 месяца назад

      @@evanhooper1 With RT off, FSR Quality and pretty high settings (a mix of high/ultra/ultra+), the new DirectX 12 version works decently enough on my Radeon 6600 at 1440p while looking better than the old version. From what I've read online, it appears that people with AMD cards and RT off generally don't have many problems getting decent performance in Witcher 3 NG.

    • @band0lero
      @band0lero 4 месяца назад

      @@evanhooper1 Yup, it is definitely worse overall, but it shouldn't matter very much if we're looking for performance deltas between GPUs rather than analyzing the game itself. Other than CPU limitations, of course.

  • @Joeyratatouille
    @Joeyratatouille 4 месяца назад +10

    I love my GRE. Got it a few weeks ago finally after using the 1070 for over 6 years. It took me a while to figure out the amd tweaks and settings but once I got it all set it is working great.

    • @juliocardona3378
      @juliocardona3378 4 месяца назад

      what did you end up doing to the settings?

    • @Joeyratatouille
      @Joeyratatouille 4 месяца назад +1

      @@juliocardona3378 memory OC and just general game-to-game optimization.

    • @farhanrafid5907
      @farhanrafid5907 29 дней назад

      What about productivity mate? Is it good for both?​@@Joeyratatouille

    • @4bodeyt646
      @4bodeyt646 26 дней назад

      What u doing for warzone to be stable and no stutters?

  • @spoots1234
    @spoots1234 4 месяца назад +7

    After seeing the RTX remix for NFS underground, its hard to ignore RT completely anymore.

    • @mihairomulus2488
      @mihairomulus2488 16 дней назад

      I've been saying stuff like this for 4 years and they still have to enter my library

  • @lifeisinserthere
    @lifeisinserthere 4 месяца назад +1

    After this video, my Ray Tracing FOMO has finally been cured. I no longer have concerns about picking Team Red over Green, which is my default bias anyway. That being said, VRAM and pricing still has me in a SERIOUS PICKLE--as I'm still undecided between 7900 GRE, XT, or XTX and seem to change my mind everyday. Ughhhh. If only pricing was the way it used to be, I would shell out for the XTX no questions asked!! Thanks for the vid as always!!

  • @TerraWare
    @TerraWare 4 месяца назад +14

    Thanks Steve.

  • @user-nn9gr8bw4n
    @user-nn9gr8bw4n 4 месяца назад +3

    Power Consumption begins at 25:50

  • @garytrawinski1843
    @garytrawinski1843 4 месяца назад +1

    I think the GRE needs to be at least $25.00 less to start to hit the "compelling" tier.

  • @wiseass2149
    @wiseass2149 4 месяца назад +40

    If you only have $550, then go with Radeon. Seriously, NVIDIA is ripping you off.

    • @_fiftyseven_
      @_fiftyseven_ 4 месяца назад +5

      What about video editing and graphics design? How do those tasks perform on Radeon?

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 4 месяца назад +2

      ​@@_fiftyseven_ not good, if you want those go for nvidia. Also this isn't the correct channel to answer that.

    • @rygelxvi3324
      @rygelxvi3324 4 месяца назад

      @@_fiftyseven_ Nvidia

    • @BOPBOPBOPBOPBOPBOPOB
      @BOPBOPBOPBOPBOPBOPOB 4 месяца назад +1

      @@_fiftyseven_ that’s a valid point but not a strong one. I’m pretty sure over 90% of non-crypto miners would buy a graphics card for gaming only.

    • @ondrejdobrota7344
      @ondrejdobrota7344 4 месяца назад

      No, stop dreaming, because the RX 7900 GRE's price has now increased to 625 EUR. What are you going to do with 550 USD?

  • @Hjorth87
    @Hjorth87 4 месяца назад +4

    Great video. I love the amount of data 😊

  • @freedom_fighta
    @freedom_fighta 3 месяца назад +1

    In my country the RTX 4070 Super is about $200 more expensive than the RX 7900 GRE..

  • @dutyofcall7659
    @dutyofcall7659 4 месяца назад +1

    swapped my rtx 3060 for a rtx 4070 super OC and it's a huge improvement.

  • @darrannorman6185
    @darrannorman6185 4 месяца назад +3

    Interesting and great analysis. I wonder what happened in MW3. GRE was well ahead of 4070 Super in your GRE Retest Video, now reversed. Different quality settings though.

  • @Anukinihun
    @Anukinihun 4 месяца назад +8

    I still don't get why people buy Nvidia over AMD to this day, when it has been shown that Nvidia offers less VRAM than AMD and VRAM is the most important thing moving forward. You can say how much DLSS means to you, but Nvidia locks their latest tech to the latest gen, so DLSS 4.0 isn't coming to Lovelace just like 3.0 didn't come to Ampere.

    • @timotmon
      @timotmon 4 месяца назад +1

      Many reasons.. some good and some not so good reasons. For me it's that I don't just game with my GPU's.. I use it for motion graphics and Octane rendering. Also a lot of us make money with our GPU's so point in fact I picked up a 4080 super on launch day for $999 and it's already paid for itself. Simply put, I have no use for an AMD card.

    • @puregaming7255
      @puregaming7255 4 месяца назад

      They played the AMD lottery and lost. I can say this as a current 6700 XT owner having nothing but issues.

    • @Anukinihun
      @Anukinihun 4 месяца назад

      @@timotmon Ohhh yeah, you have a reason to have Nvidia. I do 3D rendering too, that's why I switched from my 6900XT to a 4090, but a lot of people are just gamers or streamers. People don't know that AMD's streaming software is the best on the market, even better than OBS; ReLive does everything you need for streaming and recording.

    • @Anukinihun
      @Anukinihun 4 месяца назад

      @@puregaming7255 what are your issues?

    • @puregaming7255
      @puregaming7255 4 месяца назад +1

      @@Anukinihun Constant black screens. Tried a new DP cable, an HDMI cable, a fresh OS install, DDU several times, went from the AM4 platform to an Intel DDR5 one, still the same issue; even replaced the PSU and still can't stop the black screens.

  • @0Tuxedo
    @0Tuxedo Месяц назад +1

    I have a question beyond fps, power consumption, price, etc.: which of the two, Nvidia or AMD, is more RELIABLE in terms of not having failures? I come from years of bad experience with AMD: its Adrenalin software kept getting misconfigured, problems with its updates, and problems with some games that did not work correctly where I had to wait for them to fix it with patches... Should I give AMD a chance, or is Nvidia better optimized? I have never been an Nvidia user.

  • @mikaylaadams4264
    @mikaylaadams4264 4 месяца назад +2

    I'm an AMD fan generally because better value but Nvidia is the obvious choice here for me.

  • @TearyBoi
    @TearyBoi 4 месяца назад +16

    Can we get a massive GPU benchmark for helldivers 2? 😮

    • @fracturedlife1393
      @fracturedlife1393 4 месяца назад +2

      Maybe some CPU 🤞

    • @magikarp2063
      @magikarp2063 4 месяца назад

      Well, you basically got everything you need to know from this video: a massive Nvidia advantage.
      But it would be nice to see, since it's a very popular one.

    • @hak116
      @hak116 4 месяца назад +2

      Heres to hoping that amd users get driver updates to bring the gre up to par with nvidia cards, so everyone can spread democracy with smoooth framerates!

    • @ranjitmandal1612
      @ranjitmandal1612 4 месяца назад

      🔥

    • @toad7395
      @toad7395 4 месяца назад

      Nvidia currently has the advantage (I assume in all 4000 vs 7000 series gpus). Probably because the developers tested using nvidia GPUs and not AMD gpus. Hopefully AMD updates their drivers to combat it, I doubt helldivers 2 will release an update to increase AMD performance.

  • @MikaelKKarlsson
    @MikaelKKarlsson 4 месяца назад +2

    Help Steve take a load off by running the summary segment at 0.5x speed. He's been sneaking in work during his time off and can use our help. 😊

  • @Metetron
    @Metetron 4 месяца назад +2

    Dragon's Dogma 2 has surprisingly nice RT for how relatively cheap it is, though without mods it's locked at just 0.4x resolution. I still found it worthwhile to use for about half of my playtime on a 6800 XT, it made the forests gorgeous. I opted for 0.75x resolution as a compromise.

    • @colossusrageblack
      @colossusrageblack 4 месяца назад +1

      It's because it's global illumination which makes a much bigger visual difference than shadows which is what a lot of games that claim to have RT use.

  • @kraithaywire
    @kraithaywire 4 месяца назад +15

    Hi Steve you probably hear this a lot but i love you

    • @Hardwareunboxed
      @Hardwareunboxed  4 месяца назад +12

      Not enough for my liking, so thank you haha

    • @kraithaywire
      @kraithaywire 4 месяца назад +5

      @@Hardwareunboxed

  • @Yellowswift3
    @Yellowswift3 4 месяца назад +4

    A very detailed and thorough dive into the benchmarks there, thanks for the hard work, Steve. The ray tracing deep dive you mentioned would also be an interesting piece of content should you decide to undertake it.

  • @mraltoid19
    @mraltoid19 4 месяца назад +2

    This is an excellent video. I really like my 7900GRE, I just wish that overclocking was included. It looks like you can make a 7900GRE perform like a 7900XT, with skill and luck, while I don't think the 4070-Super is capable of overclocking to the performance of the 4070-TI Super. Would've been interesting to include, but I get why due to variance in models...thanks for the excellent video.

  • @jisjis87
    @jisjis87 4 месяца назад +1

    I'd love to see a behind the scenes video about these benchmarks. Like how you store the games and everything else you need to do in order to get the videos done.

  • @KimBoKastekniv47
    @KimBoKastekniv47 4 месяца назад +27

    I can't see myself spending $600+ on a 192-bit bus card, this memory configuration used to belong on 60 class cards.

    • @vladimirdosen6677
      @vladimirdosen6677 4 месяца назад +1

      If you can spend 600+, you can spend 800, and you can spend 1000 too. Some people smoke away a 4090 or two in a year and whinge about how these products are too expensive.

    • @helldogbe4077
      @helldogbe4077 4 месяца назад +5

      Even spending 500$ for an AMD product doesn't seem worth it when we're not getting a generational (30%) uplift over the 6800XT

    • @DeFroOO
      @DeFroOO 4 месяца назад +4

      I can't see myself spending $600+ on a card that, aside from raster gaming, is totally useless.

    • @cyberpunk59
      @cyberpunk59 4 месяца назад

      @@DeFroOO Honestly, gaming aside from "raster gaming" is pointless.

    • @VoldoronGaming
      @VoldoronGaming 4 месяца назад +1

      And the GTX 760 had a 256-bit bus... So Nvidia cut down the bus on the GTX 960, which is only 9% faster than the GTX 760; with a 256-bit bus, however, it would have been around 15-20% faster.
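
For context on the bus-width discussion in this thread: bandwidth depends on the data rate as well as the bus width. Using the commonly listed specs (192-bit at 21 Gbps GDDR6X for the 4070 Super, 256-bit at 18 Gbps GDDR6 for the GRE; worth double-checking against a current spec sheet), here is a quick comparison that ignores the large on-die caches both GPUs lean on:

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # GB/s

print("RTX 4070 Super:", bandwidth_gb_s(192, 21.0), "GB/s")  # 504.0
print("RX 7900 GRE:   ", bandwidth_gb_s(256, 18.0), "GB/s")  # 576.0
```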

  • @Alexruja3227
    @Alexruja3227 4 месяца назад +5

    In my country, funny enough, the 4070 Super is cheaper than the 7900 GRE, about 10-20$ cheaper, so I went with that in the end.

  • @Krisz94HU
    @Krisz94HU 2 месяца назад +1

    I chose the 4070 Super. I had a 6700 XT, and the biggest upgrade for me is going from FSR to DLSS; the flickering and shimmering is so annoying considering you pay $500+ for a card. In raster the GRE is better by 5-7%, but when you enable upscaling or RT, Nvidia takes the win. I adore the 7900 GRE though, such a good card for its price.

  • @darkfeeels
    @darkfeeels 4 месяца назад +1

    I cannot believe the amount of effort you put into testing these products consistently. Thank you so much for your sincere efforts for this community - you guys are the GOAT.

  • @metabolic2057
    @metabolic2057 4 месяца назад +7

    Im upgrading from a gtx 1070 to 7900 gre oc 😂

    • @tomaattiX
      @tomaattiX 23 дня назад

      Doing The same but from a 1050 lmao

  • @Tainted79
    @Tainted79 4 месяца назад +7

    The problem with benchmarking the 7900 GRE is that it's bandwidth starved out of the box. It needs a memory OC to show its full potential, and then you'll get into endless arguments saying it's not a fair match-up.

    • @sgredsch
      @sgredsch 4 месяца назад +3

      Yeah, well, without manual tuning it's basically a 7800 XT. The 7900 GRE is really only interesting as a tinkerer's GPU that needs manual tuning; that's where the value of the GPU is.

    • @The_Noticer.
      @The_Noticer. 4 месяца назад +2

      Then they should have released it that way. Which perfectly illustrates steve's comment that AMD can't afford to leave performance on the table at low market-share.

    • @markgutierez9922
      @markgutierez9922 4 месяца назад +3

      @@sgredsch it's been a while since I had the chance to mess with a GPU and get decent results. And with the radeon software per game profiles I get to set individual oc profiles for each game for maximum performance. Indeed I have been having more fun playing with the GPU settings than the actual games 😂.

    • @stangamer1151
      @stangamer1151 4 месяца назад

      What do you mean by "needs"? Every card can be overclocked, including the 4070 Super, and it is even more constrained by its memory bandwidth than the GRE.

    • @josejuanandrade4439
      @josejuanandrade4439 4 месяца назад +1

      If a product NEEDS to be tinkered with to work as it should... THEN IT IS A BAD PRODUCT.
      Then again, all cards can benefit from OC'ing, not just your precious AMD cards. So please, lower your fanboyism a notch, would you?

  • @EthanLeitch
    @EthanLeitch 4 месяца назад +1

    Thanks for the great video! I've just ordered a 4070 SUPER here in Australia for $950. It was a difficult choice between the 4060 Ti 8gb/16gb, 7800 XT, and 7900 XT.
    In the end I compared $ per frame on all of them: 4060 Ti 8GB was good, but the 16GB was a terrible value even at 4K. The 7800 XT was fantastic value but I really just wanted the NVIDIA features, and the 4070 SUPER actually had better price to performance than the 4070 with current pricing.

    • @hendersonseena
      @hendersonseena 4 месяца назад

      I believe you paid too much money. I got the RX 7900 GRE for $520.00, close to the performance of your card for hundreds cheaper. All the extra bells and whistles of the Nvidia card are not worth the price difference. But it's your choice.

    • @EthanLeitch
      @EthanLeitch 4 месяца назад

      @@hendersonseena That's Australian dollars; here the cheapest new 7900 GRE is $929 and the 4070 SUPER is $949.
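
A minimal version of the "$ per frame" comparison described in this thread. The prices are the AUD figures quoted above; the average-FPS values are placeholders to replace with the averages from a review at your target resolution.

```python
cards = {
    "RTX 4070 SUPER": {"price_aud": 949, "avg_fps": 100.0},  # placeholder FPS
    "RX 7900 GRE":    {"price_aud": 929, "avg_fps": 100.0},  # placeholder FPS
}

# Sort by cost per average frame, cheapest first.
for name, c in sorted(cards.items(), key=lambda kv: kv[1]["price_aud"] / kv[1]["avg_fps"]):
    print(f"{name}: {c['price_aud'] / c['avg_fps']:.2f} AUD per average frame")
```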

  • @BruceWaynge
    @BruceWaynge 4 месяца назад +1

    If the price exceeds $400 it should automatically become a 16GB VRAM card. Even Intel isn't dumb enough to release low VRAM cards, they have $299 cards with 16GB.

  • @X2yt
    @X2yt 4 месяца назад +3

    Nvidia is not worth it, unless you need CUDA support. So basically anything even remotely AI related, Nvidia is a must. Also for productivity. Then it's worth it. For pure 100% gaming, AMD is almost always better for the money.

  • @TheGrizz485
    @TheGrizz485 4 месяца назад +7

    Not considering the 7900 GRE's huge ~15% overclocking headroom is a big oversight in the analysis. It's the only reason many people are buying it. I think that this and its 16GB memory buffer make it a direct competitor to the 4070 Ti Super.

    • @fVNzO
      @fVNzO 4 месяца назад +2

      It's not guaranteed performance, both cards can be overclocked. And no, the vast majority of people do not overclock or undervolt. You should be able to easily make that decision yourself armed with your own knowledge about OC'ing if that's what you're intending to do. There's no "oversight", the video is not a big maths equation it just relays data. It's up to you calculate your own weighted sum. It's not that deep.

    • @TheGrizz485
      @TheGrizz485 4 месяца назад

      But most cards have only 4-5% OC headroom. It's a waste not to capitalize on the GRE's OC headroom if it's being considered.

    • @Eleganttf2
      @Eleganttf2 4 месяца назад

      lol these amd fanboys are never satisfied

    • @AndyViant
      @AndyViant 4 месяца назад

      @@TheGrizz485 yes, but you might get a 7900GRE that only lets you memory O/C to 2300. Or you might buy a brand of 7900 GRE with a really bad power limit. The same goes the other way, the Nvidia reference Super boards were crap for power limits but some AIB partners fixed that, and that allows more like 7-8% O/C.
      Some people have had minimal memory overclocking success with the GRE and stability issues. Applying the fast memory timings option on the card seem to be almost as good as higher memory overclocks anyway.
      The only fair way to do it is to benchmark what you get with factory settings, naming the specific products, and if you've played with overclocking and found one seems pretty good (or heard it through industry) then mention it might be a plus but YMMV.
      Which is exactly what Steve did.

  • @mbp1646
    @mbp1646 4 месяца назад +1

    I feel judged. I skipped to Final Thoughts and Steve called me out on it.

  • @passablegamer2536
    @passablegamer2536 4 месяца назад

    I picture Steve at a beach as he benchmarks a few dozen games as one of his happy places.

  • @michahojwa8132
    @michahojwa8132 4 месяца назад +5

    Fabio from Ancient Gameplays did a piece on this, but with OC/UV - he basically redefined the Way it's done - like raising core voltage for better result. With the new way GRE becomes OC monster. About your video - I really like how you put raster and rt results in one place.

  • @fracturedlife1393
    @fracturedlife1393 4 месяца назад +8

    Thanks for democratically benching Helldivers 2. Must try this dx11 switch 😅

  • @twithnell
    @twithnell 4 месяца назад +1

    I think overall the 7900 XTX looks to be the most compelling for me. I am coming from an EVGA RTX 3080 FTW3 Ultra 12GB version. I have the updated BIOS installed that increases the power limit to 450 watts, so my card basically performs like a 4070, but with the faster memory bandwidth on my card it sometimes performs better than what the 4070 would do. The cut memory bus and bandwidth on those 4000 series cards make me not want them, because it will probably make them age out faster with newer titles. They really want to push that DLSS. I know that FSR is an option on AMD, but they aren't handicapping their hardware to push it; it's more a product of them building console hardware and porting the technology to PC. I feel more confident buying an AMD card nowadays and getting a longer life out of it.

  • @xavalintm
    @xavalintm 4 месяца назад +1

    I didn't know Helldivers 2 is a pro-Nvidia title. Thanks Steve!

    • @Cplcash87
      @Cplcash87 4 месяца назад +1

      Which is odd when you think about it; it's a PS5 title as well as a PC one, and what's in the PS5? AMD lol

    • @roki977
      @roki977 4 месяца назад +2

      That one is big win for Nvidia.. Maybe AMD fix it with drivers..

  • @lloydaran
    @lloydaran 4 месяца назад +10

    It ultimately depends on what games you like to play. Personally, for just 50 bucks more at this price point for 1440p gaming, I'd go for the Nvidia GPU here: overall superior upscaling, frame generation and ray tracing, Reflex > Anti-Lag, and lower power consumption. On the AMD side, there are 4GB more of VRAM for "future-proofing" and not much else really. The GRE needs to be no more than 500 bucks, in my opinion.

    • @jutube821
      @jutube821 4 месяца назад

      And it's much better in Blender, local LLMs, or any AI compute in general. The 4070 Super is absolutely worth the 50 bucks more; whoever says otherwise is being dishonest. I say this as someone who has had more AMD cards than Nvidia in the past.

  • @TouzeenHussain
    @TouzeenHussain 4 месяца назад +4

    Rocking a 5600X and a Sapphire Pure 7900GRE

    • @dariosucevac7623
      @dariosucevac7623 4 месяца назад

      I am curious whether you get bottlenecked by your CPU? How are the 1% lows at 1440p? I'm planning on getting the same GPU but I have a Ryzen 5600. Thanks in advance.

    • @TouzeenHussain
      @TouzeenHussain 4 месяца назад +1

      @@dariosucevac7623 I don't care about bottlenecks, but I will get a 5800X3D if I find a good used deal in the future. Right now all my games are 90-180 fps at 1440p. I use a 144Hz monitor so I cap it at 144 anyway. Really happy with the card. And F RT.

  • @rgstroud
    @rgstroud 4 месяца назад +1

    You didn't take advantage of the new drivers and boost the OC on the GRE rasterization by 20%. The Super OC options are limited. Hopefully, FSR 3 will fix the bugs and improve performance in upscaling. We will have to wait and see on that one. The 4070 Super only really wins in Heavy Ray Tracing for the $50 extra.

  • @luismantilla4914
    @luismantilla4914 4 месяца назад +1

    Bought my RTX 4070 Super on release day (January 17) and have been playing many games at 1440p (Cyberpunk 2077, A Plague Tale: Requiem, Baldur's Gate 3, Horizon Forbidden West), everything maxed out. Cyberpunk, 80 hours played with frame generation + RT Overdrive: jesus, a solid 90-100 fps. Couldn't be happier; came from a 2060 6 GB, massive upgrade. 2000 MHz VRAM overclock, +80 core clock, no crashes, same performance as a 4070 Ti (non-Super).

    • @ALLInOne-vq5em
      @ALLInOne-vq5em 4 месяца назад +1

      It's got serious potential, even for 4K, but its VRAM limits it to 1440p. Give it 16 GB of VRAM and it could crush the 7900 GRE easily. Although an overclocked 7900 GRE might edge out an overclocked 4070 Super a bit...

    • @NamTran-xc2ip
      @NamTran-xc2ip 4 месяца назад +1

      @@ALLInOne-vq5em It does 4K perfectly fine, without RT. It does suck ass, though, since it's clearly powerful enough.

    • @luismantilla4914
      @luismantilla4914 4 месяца назад

      @@ALLInOne-vq5em tbf in MY experience 12 gb is more than enough and will be enough for the next few years tho, i dont agree with nvidia about paying 600$ for a 12 gb GPU in 2k24, but i value more ray tracing, and better techs, for me its imposible to use fsr, blurry image, and the fact that nvidia GPUS use less vram + DLSS always on use i would be happy if amd brings a tech that actually can compete with dlss in terms of Quality for a less money card win win, but in the meantime it is what it is

    • @ALLInOne-vq5em
      @ALLInOne-vq5em 4 месяца назад

      @@luismantilla4914 FSR is not actually so bad if implemented well. Take Cyberpunk 2077 for example; although FSR 2.1 implementation falls short compared to unofficial mods and lags behind DLSS or even XeSS due to lacking AI or hardware support, it's still a viable option when done properly. There were titles where I could utilize FSR 2.2 on a 3440x1440 resolution without noticing any significant visual drawbacks.
      However, I have my doubts. What we consider today as 4K with ray tracing could easily transition to tomorrow's 1440p with ray tracing. Even at 1440p in games like Ratchet and Clank, utilizing ray tracing on hardware like the RTX 4070S might face VRAM limitations, which could become more prevalent as next-gen games continue to emerge. Unless there's innovation to reduce VRAM usage, this limitation will persist. For instance, Microsoft is exploring ray tracing implementations that leverage SSDs when VRAM falls short.

  • @pw1169
    @pw1169 4 месяца назад +11

    Can we have a Linux vs Windows benchmark with these cards and CPU?

    • @user-xe6sm4jv8f
      @user-xe6sm4jv8f 4 месяца назад +3

      Oh man, that would be such a pain to benchmark! I can 100% guarantee that the moment Linux is mentioned, there will be comments like "yOu UsEd wRoNg dIsTrO" and "ummm ackshually thats the wrong version of dxvk it has 1.3% worse performance". Considering the sheer number of variables across the number of games tested, it probably isn't worth it...

    • @pw1169
      @pw1169 4 месяца назад +1

      @@user-xe6sm4jv8f I don't think it would be too bad. Just go for an 'out of the box' distro config and start benchmarking; at worst the benchmarks have to be run twice. It would be so interesting though! Don't tell me you wouldn't be interested :D
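      For whoever ends up logging frametimes on either OS, the usual summary numbers are easy to reproduce. A minimal sketch, assuming a plain text log with one frametime in milliseconds per line ("frametimes.csv" is just a hypothetical file name), and using "average FPS over the slowest 1% of frames" as the 1% low definition:

        # Minimal sketch: average FPS and 1% lows from a frametime log.
        # Assumes one frametime in milliseconds per line; the file name is hypothetical.
        def summarize(path: str) -> tuple[float, float]:
            with open(path) as f:
                frametimes_ms = [float(line) for line in f if line.strip()]
            avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
            # 1% low here = average FPS over the slowest 1% of frames
            # (some tools report the 1st-percentile framerate instead).
            slowest = sorted(frametimes_ms, reverse=True)
            worst_1pct = slowest[:max(1, len(slowest) // 100)]
            low_1pct_fps = 1000 * len(worst_1pct) / sum(worst_1pct)
            return avg_fps, low_1pct_fps

        avg, low = summarize("frametimes.csv")
        print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")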

    • @fugitive6549
      @fugitive6549 Месяц назад

      No one cares about Linux in gaming.

    • @pw1169
      @pw1169 Месяц назад

      @@fugitive6549 enjoy gaming on your spyware-infested OS 😂

  • @mikesunboxing
    @mikesunboxing 4 месяца назад +4

    I might have missed it in the video, but which models of the cards were used?

    • @fateunleashed9680
      @fateunleashed9680 4 месяца назад +4

      Both are Gigabyte cards. It's shown on the thumbnail and mentioned at 1:20, which you can always find in the chapters of the video.

    • @ahairylobster8524
      @ahairylobster8524 2 месяца назад

      @@fateunleashed9680 GROSS

  • @Lazarosaliths
    @Lazarosaliths 4 месяца назад +1

    Their price difference in Europe is 25-30€, which is negligible at that price range. The RTX model in your review seems the better buy, but the limited VRAM and memory bandwidth of the RTX card make the purchase unreasonable.
    I think the OC results of the GRE make the Radeon card a better buy, along with its memory advantage in size and bandwidth.
    Thanks Steve for the amazing review, and for all the hard work putting this information together!!!

  • @cnieuweboer
    @cnieuweboer 4 месяца назад +2

    When upscaling was added, it was "it doesn't belong in reviews", until reviewers were forced into it anyway after a few years. Now with frame generation it's "it doesn't belong in reviews" again; we'll see.

    • @Frozoken
      @Frozoken 4 месяца назад +1

      Except that isn't an actual quote from him, and frame gen specifically gives an essentially false value for what is being benchmarked: frame rate, which is also how frequently your inputs are registered, a big reason people want higher fps in the first place.
      The initial refusal to use upscaling was literally because it was just too sh*t to justify lmfao; frame gen, on the other hand, inherently has the aforementioned issue.

  • @The_Noticer.
    @The_Noticer. 4 месяца назад +5

    Hey Steve, a much fairer assessment in the final thoughts than the previous "AMD has to be 20% cheaper" argument. Fair on VRAM, ray tracing and the pros and cons of both.
    As for the rest, in my opinion: I still say the Adrenalin software suite is a bit better than GFE (or whatever it's called now), but DLSS is better than FSR. However, since both make your visual quality noticeably softer and blurrier, I'd never use them either way. And FG is just terrible from both vendors. And as you said, ray tracing only really matters with the 4080/4090; this class of GPUs just isn't fast enough without upscaling. So you get slightly more accurate shadows, but a blurrier image. Not really a selling point to me.

    • @tomgreene5388
      @tomgreene5388 4 месяца назад +3

      How is this a much fairer assessment when all you said was a bunch of opinions and things you prefer lol. Not everyone will want or prioritize the same things as you.

    • @The_Noticer.
      @The_Noticer. 4 месяца назад +3

      @@tomgreene5388 The first part of the comment is a remark to Steve, then a new line starts, and it literally says: "IN MY OPINION". I did that specifically to prevent comments like these. But alas, bell curve and everything. There is no way you can provide enough caveats for some room-temp IQs that failed basic reading skills.

  • @Steve30x
    @Steve30x 4 месяца назад +3

    I bought a 7900 GRE two weeks ago and I have a feeling my 5600X is bottlenecking it, even though my RAM is 3733 MHz. Also, I have the CPU set to boost at 4.85 GHz with the PBO2 curve optimizer at negative 20. I'm going to upgrade my CPU to either a 5700X3D or a 5800X3D.

    • @roki977
      @roki977 4 месяца назад +1

      It does for sure. A 5800X3D is the minimum for that GPU. Skip the 5700X3D if you can.

    • @Steve30x
      @Steve30x 4 месяца назад

      @@roki977 There's only about a €30 difference between the two anyway, but it will have to wait a few weeks because my cooker died just a week after I bought the 7900 GRE.

    • @roki977
      @roki977 4 месяца назад +1

      @@Steve30x I had a 5700X at 4.85 GHz and got frame drops in every CPU-demanding scenario, like CP2077 with lots of NPCs or similar situations in Warzone, all with an RX 6800. With the 5800X3D all those drops are almost gone; I have a 7800 XT now. Just saying, that's my experience.

    • @Pand0rasAct0r_
      @Pand0rasAct0r_ 4 месяца назад +1

      Yes, just buy a 5800X3D and enjoy the motherboard for a long time still. Hell, maybe even until AM6.

  • @andrewbroadfort6856
    @andrewbroadfort6856 4 месяца назад +2

    Bought the 7900 GRE for iRacing in VR. Paired with a 5800X3D AM4 setup that had started life with an 1800X. No RT in iRacing.

    • @jimmybobby9400
      @jimmybobby9400 4 месяца назад

      It does really well in iRacing, especially after an overclock.

  • @telepop123
    @telepop123 4 месяца назад +2

    I play Cyberpunk at 1440p with RT Ultra lighting and RT reflections, with balanced XeSS, on a 6800 XT and get around 55-60 fps. Looks fine to me, so it is possible to play games with RT on an AMD card.

    • @nkozi
      @nkozi 4 месяца назад +1

      Yup, same.

  • @atirta777
    @atirta777 4 месяца назад +12

    I personally think the Nvidia premium is worth it. DLSS Balanced looks similar to FSR Quality, so realistically the performance is better if you're willing to upscale at 1440p or 4K. If you're playing competitive stuff or older games though, I'd go AMD.
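    For context on how much smaller the DLSS Balanced input is, here's a quick sketch using the commonly published per-axis scale factors (~0.667 for Quality, ~0.58 for Balanced, 0.5 for Performance); treat them as approximations, since exact values can vary by game and upscaler version.

      # Approximate internal render resolutions at 1440p for common upscaler presets.
      # Scale factors are the widely quoted per-axis values; they are approximations.
      OUTPUT_W, OUTPUT_H = 2560, 1440
      PRESETS = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.50}

      for name, scale in PRESETS.items():
          w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
          print(f"{name:12s}: {w}x{h} (~{scale * scale:.0%} of native pixels)")

    So Balanced renders roughly a third of the native pixels versus Quality's ~44%, which is where the extra headroom comes from if the image quality really is comparable.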

  • @Ferdinand208
    @Ferdinand208 4 месяца назад +15

    I would love a showcase for RT. So far I am not impressed. I still remember going from Half-Life to Far Cry with beautiful DX9 water, then to Crysis with beautiful shader effects, great dynamic shadows and foliage, and huge levels. Those games were worth the asking price of a faster GPU so you could play on high.
    These days I play Control on a 1080 Ti and a 3080 Ti and... after a while I just turn off RT and run it at 4K, where the 1080 Ti could do 1080p. Tried Cyberpunk at different quality levels and different RT levels. After a while I also turned those off, because after playing for a bit I don't even know if RT is on or off; I have to check the settings or find an area where RT would make a difference.

    • @tonymorris4335
      @tonymorris4335 4 месяца назад +7

      You can't tell the reflection difference on the ground with RTX on and off in Cyberpunk? I agree with every example you gave (another old gamer here), but RTX in Cyberpunk, ESPECIALLY in the rain, is mind-blowingly good looking and realistic.

    • @wakingfromslumber9555
      @wakingfromslumber9555 4 месяца назад +4

      @@tonymorris4335 Yeah, Cyberpunk is the only game where you can see a difference, since the game is literally just neon lights and puddles. 😅

    • @Ferdinand208
      @Ferdinand208 4 месяца назад +2

      @@tonymorris4335 When I'm sightseeing to see what RT can do, I can see it. But ask me after 30 minutes of gaming whether I've been playing with RT on or off... I would have to search my memory for whether I saw better reflections in puddles.
      Go play Far Cry in DX8. Go play Crysis on low. Then you know you have been gaming on a potato GPU.
      A nice example would be the DX10 mode of Crysis. That gave smoke that didn't clip through the floor. Looks nice, but it wasn't worth the performance hit, so you turn it off and forget whether the smoke looks normal or fancy.

    • @antonvlasov6628
      @antonvlasov6628 4 месяца назад

      I watched a Russian video about RT a while ago. The gist of it was that lighting in most modern raster games is built by first implementing RT on a scene and then having a person manually set shadows and light sources (which is a pain in the ass sometimes). That's also a reason we don't have good physics in games: there won't be a correct shadow representation on certain objects, or it would take more GPU power than ray tracing.
      The reason for implementing real-time RT is to cut production time and costs and to get more correct lighting. So I think it'll be better in general when all GPUs can use RT at playable frame times.

    • @Ferdinand208
      @Ferdinand208 4 месяца назад +2

      @@antonvlasov6628 Yes. Most of those features are in Unreal engine 5 and run on non-RT hardware. But it is going to be great for gamemakers in 2030 when every GPU can do RT with performance.