Just How GREat is the Radeon 7900 GRE ... Golden Rabbit Edition?

  • Published: Feb 25, 2024
  • Wendell puts not one, not two, but three of the new Radeon 7900 GRE cards to the test!
    **********************************
    Check us out online at the following places!
    linktr.ee/level1techs
    IMPORTANT: Any email lacking “level1techs.com” should be ignored and immediately reported to Queries@level1techs.com.
    -------------------------------------------------------------------------------------------------------------
    Intro and Outro Music By: Kevin MacLeod (incompetech.com)
    Licensed under Creative Commons: By Attribution 3.0 License
    creativecommons.org/licenses/b...
  • Science

Comments • 158

  • @seanunderscorepry
    @seanunderscorepry 5 months ago +62

    I love the long-running Danny Devito as an AI benchmark bit.

  • @winstonsmith2391
    @winstonsmith2391 5 months ago +69

    This really showcases WHY Nvidia limits the RAM on their lower-tier cards. One of the many reasons I no longer give them my $$. Instead of giving you enough RAM to get the most out of your card, they purposely limit it to hinder you at higher resolutions.

    • @danieloberhofer9035
      @danieloberhofer9035 5 months ago +25

      They tell you that it's not a 4K card and to get a 4080, instead.
      One of the biggest laughs I've had this gen was that the 4060ti was originally marketed as a 1080p card. Sure thing, with only 8GB of VRAM. But why on earth should a $400 card in 2023 be meant for 1080p? That's the same story they told me about the 1060 years ago.
      Be that as it may, AMD aren't saints, either. They've got their flawed products (or should I say flawed prices), as well. The 7700XT is useless at $449, the 7800XT should never have been called XT, the 7600 should've launched as the 7600XT at $299 and the 7900XT may be really good now at ~ $750, but it was a complete joke at $900.
      Personally, I don't care that much (7900XTX), but they could've done better, much better!

    • @camotech1314
      @camotech1314 5 months ago +1

      They force you to spend more money on a higher tier card, since clearly you are rich enough to have a 4K monitor.

    • @ElJewPacabrah
      @ElJewPacabrah 5 months ago

      So you can go up the product stack

    • @JahonCross
      @JahonCross 5 months ago +3

      @@ElJewPacabrah
      It's the Apple approach: spend more to get more.

  • @guyelvy7317
    @guyelvy7317 5 months ago +69

    The ultrawide testing would be greatly appreciated. Love your content regardless, always hard to not catch your enthusiasm.

  • @katzicael
    @katzicael 5 months ago +8

    Wendell! You had the perfect opportunity to throw in a Sesame Street Count "ah ah aauuuh!" after the 3 in the intro.

  • @Pieteros21
    @Pieteros21 5 months ago +20

    I've had this card for nearly a month now ... Performance is great and no stupid 12VHPWR connector ;7

  • @2000jago
    @2000jago 5 months ago +33

    I've been on team green all my life but for the first time I'm contemplating my next upgrade to be a team switch...

    • @MrEdioss
      @MrEdioss 5 months ago +9

      Switch to intel...
      IGPU 😂

    • @camotech1314
      @camotech1314 5 months ago

      Don't think too hard

    • @anthonyyoung9810
      @anthonyyoung9810 5 months ago

      Do it..... You won't regret it!!

    • @churblefurbles
      @churblefurbles 5 months ago

      To what, save a trivial amount of money for poor RT performance and a gimped feature set?

    • @XT64
      @XT64 5 months ago +12

      @@churblefurbles Less money, more VRAM, usually more raw performance. Not everything is about RT and DLSS... Both sides have their advantages.

  • @lisashepard2078
    @lisashepard2078 5 months ago +11

    I'm not brand loyal, I am just ANTI NVIDIA! 🤣

  • @RezaFaet
    @RezaFaet 5 months ago +2

    Awesome review, thanks for all your hard work!! Really love your channel 💝

  • @Nicc93
    @Nicc93 5 months ago +7

    Would be cool to see an ultrawide testing video at different field-of-view settings, to see the performance impact of widening the view

  • @Nicc93
    @Nicc93 5 months ago +15

    Those Sapphire cards are monsters, I own a Nitro+ XTX and a Pure White 7800 XT

    • @NeverSettleForMediocrity
      @NeverSettleForMediocrity 4 months ago

      That's just a 6800XT on steroids, how is that a monster? 1080p? Definitely not over 1440p

    • @Nicc93
      @Nicc93 4 months ago +1

      @NeverSettleForMediocrity just my experience. I have the 7900 xtx nitro plus like I said and also the pure white 7800 xt. Both can do high fps at 1440p and both can do 4k with many games.

    • @Nicc93
      @Nicc93 4 months ago +1

      @NeverSettleForMediocrity ultrawide 1440p is no problem for the 7800 xt

    • @Nicc93
      @Nicc93 4 months ago +1

      @NeverSettleForMediocrity benchmark numbers mean nothing if you don't play those games bro :) charts and graphs cannot show you the experience

    • @leviathon98
      @leviathon98 1 month ago

      You sounded jealous in your post @@NeverSettleForMediocrity

  • @WMRamadan
    @WMRamadan 1 month ago

    Love what you said about "I don't think at this point in time in 2024 anyone should be brand loyal"

  • @NemusDark
    @NemusDark 5 months ago +3

    Thanks for the review

  • @quintit
    @quintit 5 months ago +10

    Having to log into their software was what kept me from buying another Nvidia card lol

    • @marcogenovesi8570
      @marcogenovesi8570 5 months ago +1

      Newer versions don't need a login

    • @masterTigress96
      @masterTigress96 4 months ago

      @@marcogenovesi8570 Newer versions of NVIDIA/Geforce Experience or whatever it is called? I know that you could download the actual drivers themselves manually without logging in, but since when did the shareholders allow this change of plan? I don't really have a Windows + Nvidia combo anymore so this is the first time I'm hearing that you no longer need to log in.

  • @SAFFY7411
    @SAFFY7411 5 months ago

    Great review, Wendell. It'd be interesting to see different resolutions and how performance varies from the widely accepted standard that all other reviewers test at.

  • @OLDIRTYPRIEST
    @OLDIRTYPRIEST 5 months ago +5

    Good review thank you

  • @whitemoses7913
    @whitemoses7913 5 months ago +5

    Surprising, since Sapphire used to have top-quality cards... but Steve from HUB also showed high temps on his Nitro+ 7900 GRE, so... I don't think it's just one odd example here.

  • @XiaOmegaX
    @XiaOmegaX 5 months ago +6

    Every card with a limited bus width is worth testing at 3200x1800. (When you use driver-level upscaling to 4K it looks like 4K for the most part; I've never been able to pixel-peep a difference on a 65" TV.)
    The reason is there's often a massive performance delta between 1800p and 2160p just on account of framebuffer bottlenecking, often to the point that 1800p numbers are closer to 1440p fps than 4K fps.
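The pixel arithmetic behind the comment above is easy to verify; a quick sketch (the 16:9 resolution pairs are the standard ones, the dictionary naming is mine):

```python
# Compare total pixel counts of common 16:9 render resolutions to see
# why 3200x1800 is so much cheaper to render than native 4K.
resolutions = {
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 1800p renders roughly 69% of the pixels of 2160p, i.e. about a 31%
# reduction in framebuffer work before any upscaling to the 4K output.
ratio = pixels["1800p"] / pixels["2160p"]
print(f"1800p/2160p pixel ratio: {ratio:.2f}")  # ~0.69
```

That ~31% cut in shaded pixels is why 1800p frame rates can land much closer to 1440p than to native 4K when the framebuffer is the bottleneck.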

  • @marktackman2886
    @marktackman2886 5 months ago +5

    This feels like a 3090 Ti in regards to timing.

  • @YouTuber-jz5nd
    @YouTuber-jz5nd 5 months ago +3

    I have a Powercolor Red Devil 7900XTX and the cooler is impressive, to say the least. Right now with the three 140mm fans in the bottom of my Fractal Torrent blowing cool air on it, the fans on my GPU are at 0 rpm.

    • @lukasschmidt9555
      @lukasschmidt9555 5 months ago +2

      PowerColor has been doing a fantastic job these last few gens.

  • @progenitor_amborella
    @progenitor_amborella 5 months ago

    12:28 "settle in" as the fan's logo settles in.. nice. Also lmao nice m.2 stick.

  • @JamieStuff
    @JamieStuff 5 months ago +4

    Competition is wonderful.

  • @jannegrey593
    @jannegrey593 5 months ago +10

    Power Color Hellhound GRE is very good. I hope the price won't be horrible.

    • @limpa756
      @limpa756 5 months ago +1

      It is, almost 800 pounds.

    • @jannegrey593
      @jannegrey593 5 months ago

      @@limpa756 :(

    • @lukasschmidt9555
      @lukasschmidt9555 5 months ago +1

      @@limpa756Where?

    • @lukasschmidt9555
      @lukasschmidt9555 5 months ago +2

      It's a great card, it beats even the Sapphire Nitro+!

    • @patrickobrienstr
      @patrickobrienstr 5 months ago

      @@lukasschmidt9555 Where did you find this info?

  • @paulomarf628
    @paulomarf628 5 months ago +1

    Great design on the Steel Legend, got to love an all-Steel-Legend PC gaming build.

  • @frogboyx1Gaming
    @frogboyx1Gaming 5 months ago +3

    Best review I have seen yet.

  • @PC_Gaming_Tech
    @PC_Gaming_Tech 5 months ago +4

    Best tech channel and it's not even close

  • @bart_fox_hero2863
    @bart_fox_hero2863 5 months ago +2

    I’ve been waiting about a year now for a good value proposition in this price range. 7800xt was looking like the one, but perhaps it’s the 7900 GRE. This is the most compelling so far

    • @patrickobrienstr
      @patrickobrienstr 5 months ago

      I’m in the same boat as you. I think I’d spend a little bit more money at this point. I’m curious on the different models and how they compare now.

  • @WeAreMovieMakers
    @WeAreMovieMakers 5 months ago +2

    Would like to hear more about the physical cards themselves and why you would pick one brand over the other.

  • @benjaminoechsli1941
    @benjaminoechsli1941 5 months ago +1

    Now I need Level1Techs to make a Frosted Flakes knockoff cereal.

  • @lobyapatty
    @lobyapatty 2 months ago +1

    Would love to see more ML with this card

  • @GiSWiG
    @GiSWiG 5 months ago +1

    I was hoping the 4070 Super was going to have 16GB, but NVIDIA is greedy. I always wanted to get a Red Devil, and the 7900 GRE is looking good.

  • @amigatommy7
    @amigatommy7 4 months ago

    About to buy my first graphics card in years. Good timing.

  • @LA-MJ
    @LA-MJ 5 months ago +3

    It would be nice to see more content on ROCm and ZLUDA. I will be looking into whether my 6800 is of any use at all on Linux or whether I need an upgrade (to a GRE maybe?). The AMD docs looked somewhat pessimistic last time

  • @VainaBorges
    @VainaBorges 3 months ago

    The ultrawide benchmark would be awesome! I have one and it's really hard to find anything about it

  • @ThePirateParrot
    @ThePirateParrot 5 months ago +8

    Any chance of running Proton benchmarks? I know all dozen of us gaming on Linux would appreciate it.

    • @nathanl2966
      @nathanl2966 5 months ago +3

      Probably over on the Linux channel, like he said.

  • @Jimmy-uk8df
    @Jimmy-uk8df 5 months ago +2

    Would love to see an in-depth dive into the Adrenaline Overclocking settings on these cards. Was thinking about getting a 6800XT and selling my reference 6800, but if I do that, I would definitely consider the GRE instead. Just would love to see how close one can get it to the 7900XT

    • @johndelabretonne2373
      @johndelabretonne2373 5 months ago

      I 2nd this! Very much interested in overclocking results with this card...

  • @ObakuZenCenter
    @ObakuZenCenter 5 months ago +3

    Timestamps are a thing in the 2020s I hear.

  • @KiraDenys
    @KiraDenys 5 months ago +3

    Don’t forget about 3840x1600! ;)

  • @SpudCommando
    @SpudCommando 5 months ago +1

    It's basically an RDNA3 version of a 6900XT. Same 5120 cores and even the same 256-bit bus.

  • @ElJewPacabrah
    @ElJewPacabrah 5 months ago

    Powercolor coolers are incredibly impressive for how plain they look. I like the simplistic aesthetic though.

  • @lukasschmidt9555
    @lukasschmidt9555 5 months ago +1

    Pretty impressive result PowerColor got there with the Hellhound, beating the Nitro+ by a wide margin... great cooler.

  • @AndersHass
    @AndersHass 5 months ago

    AMD also has frame generation that needs to be built into the engine (part of FSR 3), whereas Fluid Motion Frames is driver-based and can thereby work in various games, though not as well as the built-in, in-game frame generation.

  • @chaddwick25
    @chaddwick25 5 months ago +2

    After watching this video, I kinda want a RX 7900XTX now [weird]

  • @edwardshepherd9684
    @edwardshepherd9684 5 months ago

    I have the 7800 XT Nitro+ and I can confirm my hotspots on certain titles do go above a 25°C delta. While the card was still in my PC case, I very carefully tightened the screws on the retention plate and there was a very, very small amount of movement in them; I would think the thermal pads need to be heated up a few times to bed in a little better? After doing this, my hotspot delta never goes over 25°C. I also experienced this on my CPU after repasting it: a very small tweak of the screws gave a little movement. Also, the Nitro+ card's fan curve, even on the performance BIOS, could be a little better; there is plenty of power in those fans to keep temps below 60°C with the hotspot no higher than 80°C in the most demanding games. The 7900 GRE just needs a few tweaks to get a lot more from it. Great cards from AMD, very powerful.

  • @user_23165
    @user_23165 5 months ago +1

    I wonder if now, with the benefit of AI, we could have a control panel similar to "GeForce Experience" on both sides, red and green, where you just input the desired fps and it automatically adjusts your graphics settings to hit that target. Not sure it will happen though ...
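The control panel imagined in the comment above is essentially a feedback loop over quality settings; a toy sketch of the idea (every name here, including the `fps_at()` measurement stand-in, is hypothetical and not any vendor's API):

```python
# Toy feedback loop: step a quality preset down until the measured fps
# lands within a band around the user's target. A real driver would
# measure live frame times; fps_at() is a hypothetical stand-in model.
PRESETS = ["ultra", "high", "medium", "low"]

def fps_at(preset: str) -> float:
    # Hypothetical per-preset measurements, for illustration only.
    return {"ultra": 48.0, "high": 61.0, "medium": 78.0, "low": 95.0}[preset]

def pick_preset(target_fps: float, tolerance: float = 5.0) -> str:
    level = 0  # start at the highest quality preset
    # Drop quality while the measured fps is below the target band.
    while level < len(PRESETS) - 1 and fps_at(PRESETS[level]) < target_fps - tolerance:
        level += 1
    return PRESETS[level]

print(pick_preset(60.0))  # "high"
```

A shipping version would re-run this loop continuously per scene, but the core logic is the same: measure, compare to target, nudge settings.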

  • @dalaransadoringfan5267
    @dalaransadoringfan5267 4 months ago

    The 7900 GRE overclocked VERY well.
    You can catch up to that stock 4080 in many titles.

  • @johnvandeven2188
    @johnvandeven2188 5 months ago

    Thanks Wendell for this review. Steve at HUB used the 4070 Super with his own comparison and his results levelled the GRE with the Super or very damn close. I will now sit through Steve at Gamers Nexus to seek out his opinion.

    • @ObakuZenCenter
      @ObakuZenCenter 5 months ago +1

      HUB used a reference China-only model and that was really sloppy. No surprise, given their history of shady testing, but not excusable, especially as their obvious bias has now been proven to be so laughably inaccurate and just plain wrong.

    • @johnvandeven2188
      @johnvandeven2188 5 months ago

      @@ObakuZenCenter I have observed watching HUB that Tim is totally biased towards Nvidia while Steve will vote either way. It fascinates me no end watching Tim compare FSR against DLSS using one still frame zoomed ten-fold in order to support his opinion that Nvidia has better tech. I find this just plain ridiculous, as games are not played this way, let alone that any differences are unlikely to be noticed.

  • @Aetherbound
    @Aetherbound 5 months ago +1

    Would love to see 3440x1440 testing, to see if I should get a 7900 GRE or a 7900 XT.

  • @pinhead1317
    @pinhead1317 5 months ago +2

    Yas gimme ultrawide benchmark

  • @davidharris5365
    @davidharris5365 5 months ago

    "Ahh it seems a lil sus" lol

  • @BachNguyen-dh4im
    @BachNguyen-dh4im 3 months ago

    Just got a Nitro+, these cards are beasts

  • @camotech1314
    @camotech1314 5 months ago +3

    AMD software is miles ahead of what NVIDIA gives.

  • @Akkbar21
    @Akkbar21 5 months ago +1

    Did I tell anyone that my 3080 12GB Strix does 390W out of the box? Glad it's winter is all I'm saying. 😂 Prob gonna re-up in the 6000 gen.

  • @gab882
    @gab882 5 months ago

    Any comment about the developer who released the open-source ROCm 5 ZLUDA? Could you make a Windows tutorial for it? Thanks.

  • @galacticusX
    @galacticusX 5 months ago +3

    It's 750 euros here in Europe, so 200 more than 4070. Sux.

    • @Pieteros21
      @Pieteros21 5 months ago +1

      Here in Poland it's 2500 PLN, so about 620 euro. Not good, not bad

    • @galacticusX
      @galacticusX 5 months ago

      @@Pieteros21 Nice! I'm in Greece.

    • @zygis337
      @zygis337 5 months ago

      An XFX model was available here in Lithuania for 570€ a month ago.

  • @Pawcio2115
    @Pawcio2115 2 months ago

    Helpful video but please add timestamps next time

  • @AvroBellow
    @AvroBellow 4 months ago

    According to TechPowerUp's GPU Database, the RX 7900 GRE is actually 2% faster, on average, than the RTX 4070 Super. I don't consider that significant but it shows that price-wise, the RX 7900 GRE competes with the RTX 4070 but performance-wise, it competes with the RTX 4070 Super.
    This is one of the reasons that I won't use an AIO. An X3D CPU doesn't warrant one and it just gets in the way.

  • @dnakatomiuk
    @dnakatomiuk 5 months ago +1

    I have the PowerColor RX 6600 Fighter, and yes, a lower-end card that only uses about 100W, but when gaming, boy is it incredibly energy efficient. As great as the RX 580 was back in its time, if you had that card like I did, the RX 6600 is its replacement, at even lower power usage

  • @a36538
    @a36538 4 months ago

    Still waiting for your 4070 super review

  • @Axcellaful
    @Axcellaful 5 months ago

    420k subs!

  • @keyboard_g
    @keyboard_g 5 months ago +1

    So now almost all of AMD's RDNA 3 cards are a flavor of Radeon 7900.

  • @sylvania4558
    @sylvania4558 5 months ago

    Nitro+ cards tend to be more power-unlocked and faster than the rest, hence the higher temps

    • @ElJewPacabrah
      @ElJewPacabrah 5 months ago

      Better cooling allows for a higher manual OC though. Also, board power limits play a big role as well; I don't know the board power limits of either card. That's another big factor.

  • @Abighairyspider121
    @Abighairyspider121 5 months ago +1

    Wow, that 4070 12GB performs worse than a 3080 10GB. I'm thinking there's more than "not enough VRAM" happening here.

  • @markdaga1711
    @markdaga1711 5 months ago

    If AMD would put out a card with local AI/ML workloads in mind that was roughly a 7600 XT with 32GB VRAM for like $600, they wouldn't be able to manufacture them fast enough no matter how many they made, and it would spur a bunch of devs to focus on AMD/ROCM support for LLM adjacent workloads.

  • @michaelmcconnell7302
    @michaelmcconnell7302 5 months ago +2

    Of the 37 reviews out this morning, this is the only one I'm watching. Embargoes are so lame.

  • @pat727321
    @pat727321 5 months ago +1

    3440x1440 user here

  • @johndelabretonne2373
    @johndelabretonne2373 5 months ago

    This is a marginally decent launch (better than the 7600 and 7600 XT launches), but it doesn't seem like AMD really wants to sell a lot of these, or a lot of 7800 XTs at this point. In order for an AMD launch to even be interesting, they have to be at least 20% overall better than the competition at a minimum. They're launching this now because they are worried about increasing sales of 4070 & 4070 Supers; shooting for something in between is their best bet. But if you compare this to the 4070 Super, it's about the same raster perf, but only at a 7% discount. If you compare it to the 4070, it's roughly 12-15% better, but it costs 6% more than the cheapest 4070 at $520. They need to squeeze out more performance or cut the price a little more... The real saving grace of this GPU is if the overclocking uplift is really as good as the rumors suggest!

  • @philmarsden9594
    @philmarsden9594 5 months ago

    16:9 is widescreen, Wendell, I expect better from you, son

  • @tqrules01
    @tqrules01 5 months ago +1

    Throw the memory way up and I bet it's not half bad. Just posting before watching the video 😎

    • @tqrules01
      @tqrules01 5 months ago

      P.S. He didn't touch the clocks, but for the price it's still not bad; too bad it's sold out everywhere in the part of Europe I'm in .... P.P.S. The memory speed is apparently what you can adjust to get a lot more performance out of the card: undervolting the core and upping the memory speed. Hope he gets a chance to show us what kind of gains you can get.

  • @bigfishoutofwater3135
    @bigfishoutofwater3135 5 months ago

    He thinks we know people that can afford houses and GPUs so we can go try it.

  • @ctjmaughs
    @ctjmaughs 5 months ago

    GeForce Experience was garbage, requiring logins (Facebook, if I remember)

  • @streetguru9350
    @streetguru9350 5 months ago +1

    This will probably be my next GPU, unless I spot like a $400 7800 XT in the near future. Or I just wait for the 8000 series because the ray tracing will probably be a lot better on top of everything else...But then that'll probably be like $600-800...There's just no winning is there?

  • @aakkhhii
    @aakkhhii 16 days ago

    Add timestamps

  • @Alex_whatever
    @Alex_whatever 5 months ago +1

    Wendell!
    When we gonna see some Linux benchmarks?!

  • @Benny-tb3ci
    @Benny-tb3ci 5 months ago

    Cool video and all but maybe people should know that AMD might abruptly completely ignore massive issues with some *cough WoW cough* games and shove them under the rug.

  • @jakobw135
    @jakobw135 3 months ago

    Is DLSS TECHNOLOGY proprietary to Nvidia?
    If not, why doesn't AMD use it instead of FSR?
    When is AMD going to realize that users want a COMBINATION of action and graphics performance?

  • @dakoderii4221
    @dakoderii4221 5 months ago +1

    I thought that GPU was a PS5 for a second

  • @USAF_Medic
    @USAF_Medic 5 months ago

    Are people still buying GPUs?

  • @GroundGame.
    @GroundGame. 5 months ago

    It's GRE-ate 🤣

  • @MrPunkassfuck
    @MrPunkassfuck 5 months ago +2

    On the ROCm and Linux side of things, I would be interested in videos on what you can do with them. "Stable Diffusion" tells me nothing. What about a source image and then showing a stable-diffused image? What kind of fun things can be done with ROCm and a 6000/7000-series AMD card? I want to play with it but I don't know what is out there, what is possible, and what would interest me. I am not a photo editor or graphics designer, what could a dummy like me do? =)

  • @SilkMilkJilk
    @SilkMilkJilk 5 months ago +3

    Probably hard to find a 14900k with a memory controller that can't do at least 7200. Cmon now, you of all should be able to do it.

    • @hydroponicgard
      @hydroponicgard 5 months ago +1

      RAM speed barely, if at all, affects the performance, so it's not really a big deal.

    • @SilkMilkJilk
      @SilkMilkJilk 5 months ago +1

      @@hydroponicgard haha, you only think that because you watch those xmp tech tubes mate^^

  • @Boorock70
    @Boorock70 5 months ago +4

    $550 is still too MUCH for the GRE!
    Make it $500 (and the 7800 XT $450ish) and the cards will fly off the shelves... 👍

    • @ElJewPacabrah
      @ElJewPacabrah 5 months ago

      I agree. If people aren't buying the GRE, they will.

  • @ChengsHardware
    @ChengsHardware 5 months ago +2

    Sadly, it's the Year of the Dragon now 🤣

  • @julfy_god
    @julfy_god 5 months ago +2

    With locked-down overclocking it's basically a non-starter. Just a slightly faster 7800 XT with better RT performance, that's it. And if you really want that RT perf, why not just spend 50 bucks more for a 4070 Super?

    • @danieloberhofer9035
      @danieloberhofer9035 5 months ago +3

      What "locked down overclocking" are you talking about? The 7900GRE has the best OC potential of all cards this gen.

    • @highpraise-highcritic
      @highpraise-highcritic 5 months ago +2

      @@danieloberhofer9035 They have no clue

    • @julfy_god
      @julfy_god 5 months ago

      @@danieloberhofer9035 have you ever tried to overclock an AMD 7000-series card lol

  • @a36538
    @a36538 4 months ago

    AMD cares about gaming? LOL. When was the last time AMD was first to market with VRR, RT, upscaling, or Reflex?

  • @zodwraith5745
    @zodwraith5745 5 months ago +2

    This is what the 7800xt should have been instead of barely matching the 6800xt. It's unfortunate we don't see decent cards until the last quarters of a generation before they're all replaced. More unfortunate that it's a _worse_ value than the existing 7800xt according to others' reviews. Somehow Wendell must have a golden sample because he's seeing far better numbers than other reviewers for the Gimped Rabbit Edition.

    • @doctorspook4414
      @doctorspook4414 5 months ago

      Milking as much as possible out of the consumers that have a FOMO mindset.
      Same with the CPU side: why didn't AMD put the X3D cache on all of their 5000-range CPUs?
      They literally released them at the tail end, as fewer and fewer mobo manufacturers are producing AM4 boards.

    • @zodwraith5745
      @zodwraith5745 5 months ago +1

      @@doctorspook4414 Yeah, going over other reviews today Wendell's numbers look questionable as everyone else is seeing FAR smaller gains on the GRE over the XT. He either has a golden sample GRE or the world's worst Nvidia cards. Everyone else shows it doesn't even cover the price increase over the XT, making the GRE a _worse_ value.
      Between this, the 5700 non-x, and the 5700X3D AMD is literally selling everything they can dig out of the garbage cans.

    • @danieloberhofer9035
      @danieloberhofer9035 5 months ago +1

      ​@@doctorspook4414Let's stay reasonable. The GRE is debatable and pricing needs to adjust after launch, I certainly give you that.
      But your comments on Vermeer-X are just not accurate. When Vermeer (aka Zen3 or Ryzen 5000) launched, they didn't have 3D-vCache ready, they had just started bring-up of the first samples themselves. Also remember, that was even before Alder Lake was out. Intel's competition at the time was 10th gen and that abomination 11th gen.
      Moreover, 3D-vCache was originally intended as a datacenter product for MilanX, and the later success as a gaming focused part started out as a skunkworks project, because they had some leftover chiplets around and someone thought "Why not on desktop? Let's give it a try and see what happens." Originally, they didn't even intend to launch it to consumers, but when they realized they had a golden opportunity at hand, of course they did.
      And just as today (and certainly with Zen5, as well), there's no reason to equip the whole lineup with 3D-vCache, because not every customer values maximum gaming performance over everything else, and the stacked cache comes with drawbacks other than costing more to make. So there's a very good reason to differentiate and have CPUs without it.

    • @foxs49er
      @foxs49er 5 months ago +1

      @@zodwraith5745 Or a little of both. Maybe it's not a golden sample but a good sample vs a bad sample: +3% here, -2% there vs their average siblings, instead of a golden sample.

    • @zodwraith5745
      @zodwraith5745 5 months ago

      @@foxs49er Still doesn't explain why his numbers are SO much higher than the 4070 compared to everyone else. He must have the world's worst 4070.

  • @gcbification
    @gcbification 5 months ago

    No money in China to be able to sell the things =)

  • @neo-vj4zq
    @neo-vj4zq 4 months ago

    The way YouTubers are hawking this shit, AMD must have sold fuck all in China, given how hard they are pushing them on the rest of the world

  • @recoilman24
    @recoilman24 5 months ago

    I am not going to buy something called the Golden Rabbit Edition, ever. But I still watched the whole video and might even consider researching AMD to replace the 3070. Been on Nvidia only forever. A 2-trillion-dollar company with a Windows 98-looking control panel is a source of much chuckling in my group.

  • @johnmajewski4192
    @johnmajewski4192 5 months ago

    4070 super all day no question

    • @highpraise-highcritic
      @highpraise-highcritic 5 months ago +4

      Terrible mindset. You should probably ask more questions, and not just about GPUs.

    • @ElJewPacabrah
      @ElJewPacabrah 5 months ago

      @@highpraise-highcritic Big time

  • @la3692
    @la3692 5 months ago

    Why did AMD not put out a card that can compete with the 4090???

    • @ObakuZenCenter
      @ObakuZenCenter 5 months ago +2

      Why should they? Almost nobody buys the 4090.

  • @czbrat
    @czbrat 5 months ago

    12:05 LOL
    AMD cards cost more than their Nvidia counterparts in my country of India. Nobody buys AMD here, it's practically a non-option.

  • @abavariannormiepleb9470
    @abavariannormiepleb9470 5 months ago

    Still sour about the 8700G launch spec switcheroo shenanigans. AMD has to do better, not introduce useless market segmentation.

    • @bosstowndynamics5488
      @bosstowndynamics5488 5 months ago +3

      The only launch shenanigans I can find for the 8700G are about a random spec-sheet reference to ECC support that got removed, with the consensus seeming to be that it was an accidental listing rather than a deliberate spec downgrade. I agree that they need to do better, but it seems pretty mild all things considered (AMD in general have been pretty good about consumer ECC support, with their desktop APUs seeming to be a consistent exception rather than only just being excluded this generation).

  • @blender_wiki
    @blender_wiki 5 months ago +1

    "The most interesting thing about AMD's new GPU is NVIDIA" 😂😂💀💀💀💀
    A question: why are most tech YouTubers unable to understand that GPUs aren't only a gamer's toy but mostly a tool for professionals? 🤔🤔🤔

  • @ogChaaka
    @ogChaaka 5 months ago

    Golden rat edition

    • @Terran.Marine.2
      @Terran.Marine.2 5 months ago

      Perhaps

    • @Alfenium
      @Alfenium 5 months ago +2

      Golden BNUUY you can't just MISCLASSIFY THEM IT IS *VERY OFFENSIVE!!*

  • @xXRenaxChanXx
    @xXRenaxChanXx 4 months ago +1

    Ah yes, the issue is totally vram being "too small" and not the fact modern games are a bloated mess.

  • @TrueThanny
    @TrueThanny 5 months ago

    Look how long it took you to distinguish between two resolutions just because you're using the absurd term "1440p". Just state the width and height, so it's no longer mysterious what you're talking about.
    Resolutions are not some number followed by a "p". I don't know why that caught on, but it's extremely annoying. It's not even correct for 1080p, which is an HDTV signalling standard, not a resolution.

    • @Skelethin
      @Skelethin 5 months ago

      Resolutions for TVs have been a number followed by a "p" or "i" for as long as there have been any form of HD screens.
      480 was the standard TV resolution, then 720i and 720p showed up, followed by 1080i, then 1080p.
      "i" is for interlaced and "p" is for progressive, referring to how the TV updates its pixels.
      You should learn the history of why technology has the labels it does before running your mouth like an idiot, because 1440p is, and has been, a standard resolution for over a decade.

    • @TrueThanny
      @TrueThanny 5 months ago

      @@Skelethin The HDTV standards include two signal specifications that end in "p": 1080p and 720p. The signal specifications include the resolution and refresh mode. They are not themselves resolutions.
      I know full well the history of these things, as I was there and paying attention when they were created. You're talking nonsense.
      The habit of throwing "p" at the end of a line count to replace resolutions has never been correct, and serves only to sow confusion, which is evident whenever someone tries to refer to historically prevalent resolutions with that absurd naming scheme.
      The fact that so many have taken up an incorrect terminology doesn't make it any better.

    • @Skelethin
      @Skelethin 5 months ago

      @@TrueThanny Except that they are still *standardized resolutions* that are *commonly and universally accepted*.
      "Throwing p at the end" specifies that the resolution uses progressive scan for its refresh, with the number giving the count of vertical pixels. And with 16:9 being the standard screen ratio, knowing one of the dimensions means you *also know* the other.
      Your bitching about people *properly using p to represent a screen resolution* is pedantic, stupid gatekeeping nonsense, whining that not everyone agrees with your personal, specific preference to have *the entire specifics* of a screen size spelled out all the time.
      If you think anyone would commonly use 1920x1080 or 2560x1440 *every single time* they referenced 1080p or 1440p, especially for something where it is used as repetitively as in a GPU review, you are an idiot. It's needlessly repetitive, very annoying, a waste of time, and listeners would get annoyed and sick of it really quickly.
      Also, if you have this much hate for the use of 720p, 1080p, and 1440p, you must really hate how effectively useless descriptors like "4K" or "8K" are, as they don't give any actual numbers the way 2160p (3840x2160) would. Except 4K is a marketing gimmick term that is too embedded to just drop for 2160p.
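The "knowing one dimension means you also know the other" point in the exchange above is simple arithmetic; a quick check under the 16:9 assumption the comment states (`width_for` is a name I made up for illustration):

```python
# For a 16:9 display, width = height * 16 / 9, so the "p" number
# (vertical lines, progressive scan) pins down the full resolution.
def width_for(height: int, aspect=(16, 9)) -> int:
    w_ratio, h_ratio = aspect
    width, rem = divmod(height * w_ratio, h_ratio)
    assert rem == 0, "height does not divide evenly for this aspect ratio"
    return width

for h in (720, 1080, 1440, 2160):
    print(f"{h}p -> {width_for(h)}x{h}")  # 1280x720, 1920x1080, 2560x1440, 3840x2160
```

The assumption only holds for 16:9 panels, of course; for an ultrawide like 3440x1440 the "1440p" label alone no longer determines the width.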