AMD kept their best GPU a secret - 6950XT Review

  • Published: Dec 20, 2024

Comments • 1.9K

  • @Google_Does_Evil_Now
    @Google_Does_Evil_Now 2 years ago +2827

    GPU's are going to need a "summer mode" so they don't cook us while we game. Very useful for winter though, it's getting cold so I better game for an hour. Maybe we will even see new case designs that can direct the warm air towards us or outside via ducting.

    • @spagootest2185
      @spagootest2185 2 years ago +101

      you can underclock slightly to solve the first issue :)

    • @PNWAffliction
      @PNWAffliction 2 years ago +81

      100% agree idk why they haven't done this yet tbh. my 590 was a great heater in the long winter months.

    • @altus1226
      @altus1226 2 years ago +39

      "Quiet" mode or just turning down the max power usage will do it. I rarely run my 300Watt capable GPU over 220watts, and it's most often closer to 140watts- it idles around 8Watts.

    • @JorgetePanete
      @JorgetePanete 2 years ago +2

      GPUs*

    • @wisllayvitrio
      @wisllayvitrio 2 years ago +40

      Given how expensive gas has become in Europe that's a nice feature to have.
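The "gaming as heating" joke above is grounded in simple physics: essentially every watt a GPU draws ends up as heat in the room, so wattage converts directly to heater output. A minimal sketch (the 440 W and 750 W figures are illustrative assumptions, not numbers from the video):

```python
# Essentially all electrical power a GPU draws is dissipated as heat,
# so a card's wattage compares directly to a small space heater.

def watts_to_btu_per_hour(watts: float) -> float:
    """Convert heat output in watts to BTU/hr (1 W = 3.412 BTU/hr)."""
    return watts * 3.412

# Illustrative: a ~440 W card under load vs. a 1500 W space heater
# running on its low (750 W) setting.
gpu_heat = watts_to_btu_per_hour(440)
heater_low = watts_to_btu_per_hour(750)

print(f"GPU at 440 W:    {gpu_heat:.0f} BTU/hr")    # ~1501 BTU/hr
print(f"Heater at 750 W: {heater_low:.0f} BTU/hr")  # ~2559 BTU/hr
```

By this arithmetic, a high-end card under sustained load really is more than half of a space heater's low setting, which is why the "summer mode" joke lands.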

  • @Kraaketaer
    @Kraaketaer 2 years ago +648

    It's worth pointing out that the power consumption seems to be unique to this specific model - competitors from Gigabyte and Sapphire run closer to 380-390W with performance being very close. Still ludicrous, but far less than this.

    • @paulmeyer1001
      @paulmeyer1001 2 years ago +9

      "competitors from MSI"...it's an MSI gpu

    • @Kraaketaer
      @Kraaketaer 2 years ago +59

      @@paulmeyer1001 Lol, I meant Gigabyte. TPU has three 6950XT reviews, guess I got two of them mixed up.

    • @ItsJustVV
      @ItsJustVV 2 years ago +13

      Actually Gamers Nexus review of the Sapphire Nitro+ Pure 6950 XT shows the exact same crazy power consumption as this MSI version (or more). So it looks like the more OC-ed versions actually run this high in power and thermals too. Watch their video.

    • @zaidlacksalastname4905
      @zaidlacksalastname4905 2 years ago +4

      You're not gonna like Lovelace then lol

    • @Kraaketaer
      @Kraaketaer 2 years ago +11

      @@zaidlacksalastname4905 If the rumors are accurate, I doubt anyone not owning their own power plant will like that much. Though I guess a lot of people can use a space heater in winter.

  • @qwertyuiop.
    @qwertyuiop. 2 years ago +1013

    Missed opportunity there. They should've called it the 6969 XT.

    • @Boringpenguin
      @Boringpenguin 2 years ago +39

      0:14 We were on the verge of greatness, we were this close.

    • @DvineCupcake
      @DvineCupcake 2 years ago +55

      6969 XD

    • @3ormorecharactersmaybe5
      @3ormorecharactersmaybe5 2 years ago +7

      Nah, the performance of it doesn't really justify the "6969 XT" moniker.
      However, AMD isn't like Nvidia, where their "Ti" cards only mean "tie" now, phonetically and literally in performance, so I guess that's a good thing.

    • @suba1030
      @suba1030 2 years ago +1

      💀

    • @Mystic-Voyager
      @Mystic-Voyager 2 years ago +16

      6969 XXX

  • @IssacSir
    @IssacSir 2 years ago +326

    Maybe an idea for a video: with current energy prices going up, what would be a decent rig to build with as much power efficiency as possible in mind, while still running current-gen games competently? =) The most obvious builds would likely end up in the affordable range, while a maxed-out build could be expensive but interesting, as in which parts can actually deliver great performance while heavily stunted in terms of power allowance, etc. I feel like lowering power consumption should be a market advantage, but it is not covered enough.

    • @tilapiadave3234
      @tilapiadave3234 2 years ago +29

      I would love to see that also. MUST include IDLE power consumption as well. It AMAZES me the stupidity of some AMD fans blabbering on about an AMD cpu using 10 watts less while at the same time they run their entire system on a CHEAP low-grade PSU that wastes MUCH more power.

    • @louisvaught2495
      @louisvaught2495 2 years ago +4

      This question is already sort-of answered. Because of how binning goes, the most efficient hardware is usually high-end stuff that's been undervolted/downclocked/etc.
      And generally the best way to get power efficiency is to downclock, at least on desktop hardware which tends to be tuned a little bit past the point of diminishing returns.

    • @reijhinru1474
      @reijhinru1474 2 years ago +2

      5800x3d + rtx 3070 Founders Edition. Undervolt both and you are at probably 250 W at 100% load. Super strong and efficient.

    • @williampaabreeves
      @williampaabreeves 2 years ago +13

      @@tilapiadave3234 using a lower quality PSU wouldn't be anything to do with AMD, that would be down to the person choosing that PSU

    • @tilapiadave3234
      @tilapiadave3234 2 years ago

      @@williampaabreeves WHAT a RIDICULOUS thing to say...... The person choosing the PSU, yes, and as I stated those people are choosing CPUs just because an AMD is a few watts lower in power usage.
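On the cheap-PSU point above: wall draw is DC load divided by PSU efficiency, so an efficiency gap can dwarf a 10 W CPU saving. A rough sketch with assumed 80 Plus-style efficiency figures (illustrative, not measured):

```python
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the wall for a given DC load and PSU efficiency."""
    return dc_load_w / efficiency

load = 300.0  # illustrative whole-system DC load in watts

cheap = wall_draw(load, 0.80)  # low-grade unit, assumed ~80% efficient
good = wall_draw(load, 0.94)   # high-end unit, assumed ~94% efficient

print(f"cheap PSU: {cheap:.0f} W at the wall")     # 375 W
print(f"good PSU:  {good:.0f} W at the wall")      # 319 W
print(f"wasted difference: {cheap - good:.0f} W")  # ~56 W, far more than 10 W
```

Under these assumed numbers, the PSU choice matters several times more than a 10 W CPU difference, which is the commenter's point.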

  • @robertsay4374
    @robertsay4374 2 years ago +1272

    Wow that GPU is almost as big as my whole PC! Nice to have a fan heater which can also play games though, love a good multitasking appliance 🤪

    • @jankratochvil9779
      @jankratochvil9779 2 years ago +54

      Yeah, a GPU is basically another PC inside your PC: its own RAM, graphics processor, board and cooling. GPUs nowadays show it to us proudly with their length and height, and even their weight is quite impressive.

    • @geerstyresoil3136
      @geerstyresoil3136 2 years ago +11

      Equivalent Nvidia runs hotter and uses much more power actually.

    • @MyrKnof
      @MyrKnof 2 years ago +7

      nvidia users have had that for the whole 3000 generation :'D

    • @JustIn-sr1xe
      @JustIn-sr1xe 2 years ago

      Chuck two of them on top of each other, fan to backplate & you have the size of my PC.

    • @theeuropeanlegacy5075
      @theeuropeanlegacy5075 2 years ago +1

      @@geerstyresoil3136 and much better in all around performance

  • @Raress96
    @Raress96 2 years ago +616

    Didn't expect this to beat the RTX 3090 ti in gaming, especially costing so much less.

    • @Gamingtechgg
      @Gamingtechgg 2 years ago +18

      don't talk shit ;D they will find you and it won't lead to good stuff, you know

    • @helloukw
      @helloukw 2 years ago +10

      Yes, but can you brag you have a 3090 Ti?

    • @maaax1173
      @maaax1173 2 years ago +216

      @@helloukw Well, I could brag I have a 6950XT just as well. IMO, it's even cooler because I didn't waste 1000 bucks

    • @mohammeded-dahbi7603
      @mohammeded-dahbi7603 2 years ago +8

      It's basically an overclocked 6900xt; an overclocked 3090ti still beats it

    • @Fancy405
      @Fancy405 2 years ago +153

      @@mohammeded-dahbi7603 then an overclocked 6950xt beats it

  • @Michael-jb6oc
    @Michael-jb6oc 2 years ago +282

    As someone who has an Nvidia Mx130 I can confirm this gpu is good.

    • @yasaldesilva
      @yasaldesilva 2 years ago +6

      Lmao same. mx130 gang

    • @Michael-jb6oc
      @Michael-jb6oc 2 years ago +10

      @@yasaldesilva I get like 40-60 fps on Valorant lol. If I plug in my laptop cooler I get 70+

    • @zero-ej6rt
      @zero-ej6rt 2 years ago +4

      The quadro m620 gang where? (cut down gtx 950m that still performs the same somehow)

    • @hexoson
      @hexoson 2 years ago +1

      @@zero-ej6rt Sorry, I'm Quadro P520 gang...
      ...and RX 6600XT gang.

    • @twizz420
      @twizz420 2 years ago +4

      Intel UHD 730 iGPU represent!

  • @jlm2648
    @jlm2648 2 years ago +66

    I still cannot comprehend how Nvidia thought it was okay to release their 3090 with those kinds of heat issues. Seeing it sit that high above the rest of the cards on that graph is just depressing

    • @spagootest2185
      @spagootest2185 2 years ago +25

      having the fastest card makes shareholders happy, no matter the consequence

    • @tilapiadave3234
      @tilapiadave3234 2 years ago +6

      @@spagootest2185 And the STUPIDITY of that is amazing, the VAST majority of sales are in the mid performance level

    • @iprfenix
      @iprfenix 2 years ago +7

      They really fucked up the cooling solution for the 3090. Having memory on both sides of the PCB and no real cooling for that backside is terrible. It's common for my 3090 to be at 80C core but have memory pegged at 100-110C. It's absolutely insane. Nvidia made water cooled back plates go from 'pointless gimmick' to actual necessity.

    • @wa2368
      @wa2368 2 years ago +4

      Nvidia sucks man.

    • @vmafarah9473
      @vmafarah9473 2 years ago

      I think these overheating cards may die sooner than mid-range cards.

  • @TheKazragore
    @TheKazragore 2 years ago +469

    From the very beginning of the RDNA2 generation it was clear AMD wasn't focusing on productivity, but gaming. Which makes sense because they split their architecture between RDNA and CDNA. Nvidia doesn't do this which is why their cards tend to perform better in productivity; that's what they're designed for, but aren't marketed for.

    • @ThaexakaMavro
      @ThaexakaMavro 2 years ago +43

      because of cuda, optix and all the proprietary bs

    • @evertchin
      @evertchin 2 years ago +29

      @@ThaexakaMavro only the losing side would blame the proprietary stuff; if being an open standard is so great, why hasn't amd caught up yet...?

    • @kkon5ti
      @kkon5ti 2 years ago +5

      they‘re

    • @Niosus
      @Niosus 2 years ago +89

      @@evertchin It's a bit of both. CUDA is more mature and older, so a lot of software is better optimized for CUDA. Nvidia has put a lot of work into CUDA, but they've also kept it proprietary. The open standards aren't as mature or well supported/documented, so a lot of software doesn't support that or the support is of much lower quality. AMD is trying to close that gap, but it's nearly impossible to close Nvidia's lead while Nvidia is still investing so heavily into the platform.
      So yes, these issues could be caused by the software makers not putting enough time/effort into the open standards to make them as performant, but no that is not an excuse. While the AMD hardware may be competitive or even superior, for a lot of workloads it is simply the inferior platform.

    • @jojivlogs_4255
      @jojivlogs_4255 2 years ago +74

      @@evertchin imagine being so smooth brained you break it down into winning and losing sides

  • @danielberglv259
    @danielberglv259 2 years ago +52

    If these cards keep growing in size like this, we will end up plugging the motherboard onto the GFX PCB

    • @fuchu92
      @fuchu92 2 years ago +2

      Look at the rtx 4090, humongous 😮

    • @pieterrossouw8596
      @pieterrossouw8596 1 year ago

      I have the XFX 6800XT on an ITX motherboard - it absolutely looks as silly as you'd expect. That said, it runs like a champ.

  • @nanolog522
    @nanolog522 2 years ago +438

    I believe that by splitting their architectures in two, RDNA and CDNA, AMD hindered their image.
    It helped them increase traditional gaming performance relative to die size and TDP, but when people look at benchmarks, they see "AMD is a tiny bit faster than Nvidia when gaming, but sucks at rendering", even though most don't render anything.

    • @DanKaschel
      @DanKaschel 2 years ago +91

      Maybe. I think if AMD can convincingly trounce Nvidia in gaming, productivity scores won't hurt their sales much. Right now it's things like ray-tracing, driver compatibility, and DLSS that make me still favor Nvidia, but boy am I ready to jump ship.

    • @jamesyu5212
      @jamesyu5212 2 years ago +85

      Absolutely false. AMD's one-size-fits-all strategy pre-RDNA made them a jack of all trades, master of none.

    • @s8wc3
      @s8wc3 2 years ago +1

      Is the RDNA hardware actually crap at it, though, or is it just garbage... or dare I say purposefully nerfed... drivers? The world may never know

    • @lasbrujazz
      @lasbrujazz 2 years ago +10

      Makes sense. I mean, AMD provides the ability to customize their products to their customers' needs, such as a console that doesn't need the capability to run SPECviewperf. Splitting the capabilities and needs would provide more flexibility for AMD.

    • @theunknown2923
      @theunknown2923 2 years ago +11

      I don't know if it helps, but I would actually switch Radeon Software from Graphics to Compute to mine sometimes (in Radeon Software, click the gear at the top right, then Graphics, then Advanced settings; "GPU Workload" is at the bottom). It might be enough to sweep NVIDIA in some productivity/compute tasks, like SPECviewperf?

  • @ravenclawgamer6367
    @ravenclawgamer6367 2 years ago +60

    AMD card at 335W trades blows with Nvidia card at 465W. RDNA 3 will definitely become the new best in the GPU industry.

    • @chem1kal
      @chem1kal 2 years ago +21

      the only reason amd is losing is because of nvidia fan boys buying a double-price 6900xt for barely any gain

    • @ravenclawgamer6367
      @ravenclawgamer6367 2 years ago

      @@chem1kal please elaborate

    • @denis1428-m1i
      @denis1428-m1i 2 years ago +4

      they showed later in the video it was drawing about the same power as the 3090Ti

    • @gambino883
      @gambino883 2 years ago +11

      @@ravenclawgamer6367 There are still a LOT of fanboys of nGREEDIA. Just look at the market share for GPUs. NGREEDIA still has over 60% of the market.
      Yet AMD has proven it's better in price/performance with the 6000 series. BUT BUT MUH 2k$ 3090. It's sad really.

    • @chem1kal
      @chem1kal 2 years ago +3

      @@ravenclawgamer6367 what the guy above me said.
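Taking the wattages claimed at the top of this thread at face value (a reply above notes the measured draw was actually similar), performance per watt is simply the ratio of performance to power. A quick sketch:

```python
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Performance-per-watt ratio for a normalized performance score."""
    return relative_perf / watts

# Both cards "trade blows", so normalize performance to 1.0 each;
# 335 W and 465 W are the thread's claimed figures, not measurements.
amd = perf_per_watt(1.0, 335)
nvidia = perf_per_watt(1.0, 465)

print(f"efficiency advantage: {amd / nvidia:.2f}x")  # ~1.39x
```

With equal performance, the efficiency ratio reduces to the inverse ratio of the wattages, which is why the claimed 335 W vs 465 W would be such a large gap if accurate.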

  • @EdwinPohan
    @EdwinPohan 2 years ago +66

    Price-wise, AMD GPUs are the way to go. Until NVIDIA gets their pricing as “normal” as AMD, it seems Radeon GPUs will be the king of value this time around.

    • @ancientflames
      @ancientflames 2 years ago +5

      They always have been the king of value. 480 and 580 sold like hot cakes for this exact reason.

    • @cythascruseo6696
      @cythascruseo6696 2 years ago +1

      Haven't they always?

    • @OugaBoogaShockwave
      @OugaBoogaShockwave 2 years ago

      that's like me saying chickens lay eggs 🤣🤣

    • @ILoveTinfoilHats
      @ILoveTinfoilHats 2 years ago

      @@ancientflames because of miners, not gamers. The 480 and 580 were flops in the eyes of gamers

    • @cythascruseo6696
      @cythascruseo6696 2 years ago

      @@ILoveTinfoilHats HAHAHAHAHAHAHAHAAHAHHA

  • @shadow105720
    @shadow105720 2 years ago +95

    The "ultra overclocked" super-cooler versions wouldn't be MSRP even in a perfect world; they would be at least $100 more.

  • @CrispymexicanDUCK
    @CrispymexicanDUCK 2 years ago +89

    I gladly took a 3080 at just over MSRP about a week ago. I'm sure the next generation cards launching later this year will be awesome, but it's just going to be a repeat of the 30 series launch in my eyes.

    • @Hybris51129
      @Hybris51129 2 years ago +4

      I am debating getting a 3090 TI for just that reason. I could wait to see if I can get a 4000 series card or bite the bullet and buy the current gen.

    • @galgrunfeld9954
      @galgrunfeld9954 2 years ago +2

      @@Hybris51129 I'd buy it if I were you after the new cards drop - I think the interest in them would lower its price and potentially increase its availability.

    • @toddblankenship7164
      @toddblankenship7164 2 years ago +1

      value is best if you buy the cards at release. I got my 3080 a few months after release for 1200 Canadian and that was just before the prices went mental.

    • @Hybris51129
      @Hybris51129 2 years ago +2

      @@galgrunfeld9954 That is something I am seriously considering as well. Since I buy new GPUs only every 5-7 years, or 10 years for my current one, I have to try and get the absolute best I can.

    • @cseversin
      @cseversin 2 years ago

      Congrats! I camped out at Best Buy last summer for a 3080 FE at MSRP and I don't regret it one bit, haha. Enjoy!

  • @vmystikilv
    @vmystikilv 2 years ago +14

    Picked one up on release day. Been waiting 2 years for card prices to normalize. Had 3dfx and nvidia for over 20 years. Even as an Nvidia fanboy, I am beyond impressed with this card, and luckily I got it on release day at MSRP pricing. For the price and power, it's what I've been waiting for. If Nvidia ever gets their stuff together I'll probably go back, but until then I am happy with this card.

    • @thatoutsider
      @thatoutsider 1 year ago

      Any update after 10 months? How is it performing? Deciding on this or 4070ti

    • @vmystikilv
      @vmystikilv 1 year ago +8

      @@thatoutsider still have the card and no regrets. Handles everything beyond well, and I have no desire to upgrade to any 4000 series or 7000 series card. This card has been solid

    • @thatoutsider
      @thatoutsider 1 year ago

      Thanks for sharing!

    • @xredoxi
      @xredoxi 1 year ago

      @@vmystikilv That's nice to hear. I just ordered the 6950xt today because my rtx 2070 doesn't have enough vram for games like Resident Evil 4, unfortunately. It will be delivered in 2-3 days; I'm so excited for that one. I also ordered a new 240hz WQHD screen that will come in 2 days.
      I'm excited like a kid rn and can't wait anymore :D

    • @chesesuz3784
      @chesesuz3784 1 year ago

      @@xredoxi what cpu do you have?

  • @LatinLegacyNY
    @LatinLegacyNY 2 years ago +322

    Too little, too late. Although it will be chaotic trying to get a next-gen GPU when they release later in the year, they'll be worth waiting for. Refreshes like this should have dropped towards the end of last year, but it's understandable why they didn't with everything that was going on with the shortages and a pandemic.

    • @petervansan1054
      @petervansan1054 2 years ago +10

      nah

    • @gondolagripes1674
      @gondolagripes1674 2 years ago +17

      They probably wanted to wait until they had enough units to ship.

    • @thunderarch5951
      @thunderarch5951 2 years ago +19

      Better than Nvidia releasing 200 high-end GPUs you couldn't buy, lmao

    • @sharpsnowflake8721
      @sharpsnowflake8721 2 years ago +1

      Just wait: if China decides to storm Taiwan, inspired by the Russians in Ukraine, we will be home-soldering 486s to play Tetris, and not be these vague kids arguing about when it's best to buy a 600W-eating toy to play games.

    • @girlsdrinkfeck
      @girlsdrinkfeck 2 years ago +3

      Technology prices increasing on a steeper curve than the performance gained is due to stagnation; until a new type of processing tech is created, we will always be paying an extreme price for little performance gain. The same applies to the smartphone market; petrol-based cars had this issue for many decades, and it's just started to happen to the silicon market.

  • @RaysGuide
    @RaysGuide 2 years ago +43

    With these cards drawing more and more power and consumers not really seeming to notice, maybe it is time to have the same 'annual energy consumption' labels we see on home appliances. The energy draw of this card puts it square in the middle of the range of a full-size refrigerator.

    • @shadowreaver752
      @shadowreaver752 2 years ago +8

      I'd like to see them perform with power restrictions

    • @luminatrixfanfiction
      @luminatrixfanfiction 2 years ago +2

      This is going to be funny to see with California state banning gaming pcs because they draw too much power. The politicians in that state are going to have an aneurysm when the 4080ti cards draw more than 600 watts.

    • @ram89572
      @ram89572 2 years ago +2

      You make a lot of assumptions about power draw. Different games and activities hit these cards at different levels of power draw. People game in drastically different amounts. All you would be doing by throwing a regulation that they be required to come up with a number based on completely arbitrary input values is add to the cost of the cards. That makes literally 0 sense. With a refrigerator you have a device that is running constantly. It will still vary person to person but at the very least you can give a good general idea of annual usage. You cannot do that with a gpu. Government regulation is not the answer here and that is ultimately what you are asking for.

    • @ram89572
      @ram89572 2 years ago +1

      I would also add that no I don't give that much of a damn about the energy consumed by the card. My a/c consumes far more in a year and is far more impactful to my bill considering I live in a hot ass climate. The amount of time my card is pulling a lot of energy is limited and is a minor blip in annual cost

    • @canadajones9635
      @canadajones9635 2 years ago +5

      @@luminatrixfanfiction No, they did not ban gaming PCs. They put higher minimums on efficiency and power draw at idle for pre-built computers. Dell and some other prebuilt manufacturers were just shipping such dogshit PSUs that they had very little that met the requirements.
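An "annual energy consumption" label like the one proposed above reduces to watts × hours × days; as the replies point out, the usage hours are the arbitrary input. A sketch with assumed usage profiles (all numbers illustrative, not from the video):

```python
def annual_kwh(load_watts: float, hours_per_day: float, days: int = 365) -> float:
    """Estimated yearly energy use in kWh for a given average load."""
    return load_watts * hours_per_day * days / 1000

# Assumed profiles -- gaming hours vary wildly from person to person,
# which is exactly what makes such a label hard to standardize.
light = annual_kwh(440, 1)  # one hour of gaming per day at ~440 W
heavy = annual_kwh(440, 4)  # four hours per day

price = 0.30  # illustrative electricity price in $/kWh
print(f"light use: {light:.0f} kWh/yr, ~${light * price:.0f}")
print(f"heavy use: {heavy:.0f} kWh/yr, ~${heavy * price:.0f}")
```

Even the heavy profile here lands in the low hundreds of kWh per year, which is why an appliance-style label would need to disclose the assumed hours to mean anything.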

  • @DuncDog
    @DuncDog 2 years ago +73

    As a fun sciencey follow-up, RDNA has a terrible habit of not being able to hold lower FPS at Freesync and/or V-SYNC levels. I play more than a dozen games that my 6900XT can push well above the 170 FPS at 1440p of my display, but by capping at that setting, I save nearly 100W off the GPU draw. Worth it!
    However, this is a double-edged sword, as the AMD driver/vbios does its best to minimize clock speeds if the core isn't at 100% load, and in some cases even if it is at 100% load. Minecraft, Deep Rock Galactic, and at least a dozen others all can't maintain their FPS stability, or in some cases even refuse to clock up at all (particularly Minecraft!), resulting in some cripplingly low performance from something that should be running at 400fps.
    My old 1060 Max-Q Dell laptop runs Minecraft at 2-3x the FPS of my Strix G15 6800M OR my desktop's 5600X+6900XT... tell me what doesn't look right here lol.

    • @thevaultsup
      @thevaultsup 2 years ago +22

      It's just that RDNA in general runs poorly on outdated OpenGL titles; that's why it runs terribly in Minecraft. I think the Sodium mod or Lunar Client can somewhat fix that

    • @luminatrixfanfiction
      @luminatrixfanfiction 2 years ago +10

      @@thevaultsup No, he is right. RDNA underclocks when not at 100% load, causing spikes in frame times, which is what contributes to micro stutters in some titles. It's a known issue, one I hope AMD will fix. Theoretically, it should be a simple fix by running a script that encourages the gpu to remain at 100% load while gaming, but there's a reason why they designed it this way: for the longevity of the card and less power draw.
      A bit pointless if you ask me, since most people buy new cards every 4 years.

    • @OGPatriot03
      @OGPatriot03 2 years ago +2

      Sodium for Fabric MC is where it's at.

    • @tofuguru941
      @tofuguru941 2 years ago +17

      I don't have this problem with my 6900xt. I run a high "minimum frequency" setting in the AMD software and that solves it for me.

    • @PaNDoRaKiNGg
      @PaNDoRaKiNGg 2 years ago +4

      You can fix this with MorePowerTool by turning the power-saving feature off. freethy has a video about this; I've been running a 6900xt for months with no problems at 2600/2100mhz.
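On the frame-cap savings described in this thread: capping below what the GPU could render leaves it idle for part of every frame, which is where the power saving comes from. A sketch of the frame-time arithmetic (the 240 FPS uncapped figure is an assumption for illustration; the 170 Hz cap is the commenter's display):

```python
def frame_time_ms(fps: float) -> float:
    """Frame-time budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

# Assume the GPU could render ~240 FPS uncapped but is capped at the
# display's 170 Hz: it then sits idle for part of every frame.
capped = frame_time_ms(170)    # ~5.88 ms available per frame
uncapped = frame_time_ms(240)  # ~4.17 ms of actual work needed

idle_fraction = 1 - uncapped / capped
print(f"GPU idle for ~{idle_fraction:.0%} of each frame")  # ~29%
```

The idle fraction roughly tracks the power saved, which is consistent in spirit with the ~100 W saving the commenter reports, though real draw does not scale perfectly linearly with load.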

  • @T4gProd
    @T4gProd 2 years ago +13

    As someone living in the Nordics, where we have no cooling air conditioning since it's not normally needed, I'm starting to look at the wattage these new cards demand. I rock a 3080 and my room gets pretty toasty if I'm playing a graphically intensive game. My PC draws about 500 watts at max load, and it really heats up the room. I really didn't care about power draw before, but now we're reaching levels where it actually matters.

    • @vikingro
      @vikingro 2 years ago +1

      GTX 1070 here - the whole PSU provides around 150-200 W at full load - Intel 10700, nothing overclocked. I want to be able to play and watch movies in 4k, but man, when you say 500 W, you are ripping my world apart. I don't want to sell my future for electricity. What next, 1000 W?!

  • @snowbeddow
    @snowbeddow 2 years ago +39

    I would like to see a look into the power vs performance increases over the years. It seems like all they are doing is pushing up the power to get more performance; has performance per watt actually increased much in the last few years?

    • @emissarygw2264
      @emissarygw2264 2 years ago +5

      Agreed, we're getting to power levels comparable to that of an actual microwave... 10 minutes of gaming to cook your dinner, an hour or two to set your house on fire lol

    • @matejnovosad9152
      @matejnovosad9152 2 years ago +5

      In CPUs it has. In GPUs, not really

    • @goblinslayer5404
      @goblinslayer5404 2 years ago

      it has, but so has the upper limit of power consumption

    • @tilapiadave3234
      @tilapiadave3234 2 years ago

      @@goblinslayer5404 There is NO upper limit... the RX 9990xt SUPER shipped with its own small nuclear power system

    • @dakyno
      @dakyno 2 years ago

      Power efficiency has improved, but the truth is if you can afford a 6900xt or a 3090, you can probably afford a 1000W PSU and electricity bills.

  • @dominickyeo
    @dominickyeo 2 years ago +8

    My first GPU was an AMD Radeon 7970. Looks like it is going to come full circle with their naming conventions. I'll definitely buy an RX 7970 XT if it comes out, for nostalgia's sake

  • @bjet1371
    @bjet1371 2 years ago +15

    Oh boy, you had to mention the GPU shortage getting better 😂

  • @TheTardis157
    @TheTardis157 2 years ago +2

    With how hot these cards run I can see the return of side panel fans in our near future. It's still one of my favorite parts of my old HAF 932.

  • @william132465
    @william132465 2 years ago +3

    Got my 6950xt off AMD's site today, thank god… I would have been fine with the 6800xt, but the price difference in Canada made the 6950xt worth getting

  • @bigoof9170
    @bigoof9170 2 years ago +6

    3:59 Holy crap is that a GD reference

  • @TheCallMeCrazy
    @TheCallMeCrazy 2 years ago +4

    There was more going on here than just binning - GDDR6 yields have come a long way, and this allows them to make use of the higher quality chips without downclocking them to match 6900XT specs.

  • @chrisdavis273
    @chrisdavis273 1 year ago +2

    Just bought an ASRock 6950xt for 630. Can't wait!

    • @TheWorstGamerr
      @TheWorstGamerr 1 year ago

      What is the maximum temperature on your 6950 XT?

  • @wileymonair
    @wileymonair 2 years ago +4

    It's really good to see AMD finally catching up to Nvidia again; I hope they come out ahead soon to keep competition strong.

  • @Karvan420
    @Karvan420 1 year ago +2

    Took a while, but I decided to buy the amd 6950xt. It was on sale for $649, so that's about half the price it was less than a year ago. Feels like a steal, especially with the 4070 not being better or cheaper

  • @mrmidnight32
    @mrmidnight32 1 year ago +3

    As a heavy team-green user, I just bought one today for $580 ($620 with a Last of Us game code).
    I'm really hoping my 750W PSU can supply it, and that it performs like the 3080 Ti shown (currently $1,300 today).
    I've always had bad luck with AMD drivers bricking apps until they fix them. Nvidia's bad drivers at least open the apps and run terribly. This is red's last chance; I hope I'm happily disappointed.

  • @omarblanco1015
    @omarblanco1015 2 years ago +1

    Nice to see Alex doing some review videos!

  • @OGPatriot03
    @OGPatriot03 2 years ago +23

    It's great to see AMD excel in gaming performance (and honestly better RT performance than my 2080 ti), but I used to love AMD for their superior productivity performance (particularly in Blender), and that trend has completely inverted in the past few years.
    I hope AMD makes big strides with Blender in the coming years; OPTIX on Nvidia is just epic.

    • @Kazya1988
      @Kazya1988 2 years ago

      Let's hope they have something to offer with RDNA 3

    • @morpheus_9
      @morpheus_9 2 years ago

      @@Kazya1988 man, RDNA1 was so much better than Vega and RDNA2 was insanely more impressive than RDNA1 while using the SAME node!!! Rdna2 is a freaking beast and RDNA3 is supposed to be 65-100% faster!

    • @amrishpatel3501
      @amrishpatel3501 2 years ago

      @@morpheus_9 I might wait for the RDNA3 cards. Currently got an AMD Sapphire Pulse RX 5700 XT, which uses RDNA1. It's still doing a great job! :)

  • @45eno
    @45eno 8 months ago

    Just bought one of these used for $360 locally. Came with the retail box and appears to be in perfect condition. Haven't had an issue yet in the last 24hrs. Pushing it with a Seasonic Prime GX850 Gold and a 7800X3D. Running a dual-1440p monitor (5120x1440 @ 240hz) and it is doing great. I can't justify new $500+ midrange cards, or $700 or higher for the upper end.

  • @HumbertoHernandez
    @HumbertoHernandez 2 years ago +59

    Good to see Alex having his own review.
    PS: There was a time that I thought Alex and Riley were the same person... oh god.

    • @why_tho_
      @why_tho_ 2 years ago +2

      Oh God 😂

    • @oskrm
      @oskrm 2 years ago +13

      I have never seen Alex and Riley in the same room

    • @bagasfabianmaulana
      @bagasfabianmaulana 2 years ago +3

      @@oskrm watch "Can We Make DIY Thermal Paste". They're both in the same room.

    • @ninadganore
      @ninadganore 2 years ago +1

      Same here. I used to get confused all the time.

    • @MethmalDhananjaya
      @MethmalDhananjaya 2 years ago +2

      Riley is basically Alex in Sport Mode.

  • @iuhere
    @iuhere 2 years ago

    Suggestions, ignore if irrelevant. Editor's note: please use an appropriate background (I suppose templates matching the video release/posting schedule might have been used), as it makes the graphs illegible even when paused, or maybe I have some rare eye disorder, pick one. Besides, the legend samples are also small, like a hyphen/dash, which might not be enough area to represent the colour chosen, for example @05:31 and in some other graphs. Hope this helps. Good luck with the next one.

  • @WarriorsPhoto
    @WarriorsPhoto 2 years ago +6

    I haven't watched an LTT video in a long time. I am glad I saw this one. AMD makes great GPUs and I am glad they are trading blows at the high end. (:

  • @cnipon
    @cnipon 2 years ago +1

    Well... I have an AMD 7950! It has been good to me for about 10 years... still going strong... a fan from fin

  • @justlo0k33
    @justlo0k33 2 years ago +4

    Finally put my finger on it! LTT has become what CNET should and could have been! And I thank them for it ❤️

  • @lolilolplix
    @lolilolplix 2 years ago +1

    Thanks for having MSFS as one of your benchmarks!

  • @JagoTFC
    @JagoTFC 2 years ago +12

    Ah, I remember my 6950 that then became a CrossFire pair. Never in my digital life have I made a worse mistake, maybe second only to a RAID 10 of 64GB C300s. Then I went 770 Phantom (usually I switched ATi and Nvidia every 2-3 gens). But then G-Sync happened, and I've been married to adaptive sync ever since.

    • @CakePrincessCelestia
      @CakePrincessCelestia 2 years ago +2

      Had a single one and was rather happy with it, with just one exception: their driver policies began to suck around that time and the performance in DCS was utter rubbish. Had a crossfire setup with X1800XTs before, and a passively cooled 3870 as well.

    • @aninditabasak7694
      @aninditabasak7694 2 years ago +1

      Are you talking about HD 6950?

  • @Diavire
    @Diavire 2 years ago

    His face & voice when he says "but I can believe I am going to tell you about our sponsor" is channelling some serious "By Grabthar's Hammer......... what a savings...." (Galaxy Quest) :D

  • @Endonae
    @Endonae 2 years ago +16

    Alex's performance on camera has vastly improved in recent videos. The enthusiasm makes him far more engaging to watch. Great job!

  • @hirokjyotideka5571
    @hirokjyotideka5571 2 years ago +2

    0:05 I like your optimism.

  • @Nick85
    @Nick85 8 months ago +5

    From the future comment here:
    Just bought a red devil 6950xt for $500 retail.

    • @JahonCross
      @JahonCross 3 months ago

      Why? It's a fucking bad card with terrible power consumption...better off getting the 6800xt or a power efficient 7800xt

    • @dragoonxgamer
      @dragoonxgamer 2 months ago

      I have been using it for years

    • @gogogadgetGlock
      @gogogadgetGlock 1 month ago

      @@JahonCross definitely not a bad card for $500 lol.

  • @miledeep3810
    @miledeep3810 2 years ago

    I run my PC off a 20A battery backup inverter, so I'm good as long as spikes aren't over 4k watts. One way to bypass that dedicated circuit would be a good UPS or solar generator. Plus, when the power goes out... you don't. You usually pair up with 2000-3000 watt-hour batteries. With a 3080ti and 5950x pulling 500-600 watts to game, I can go hours without grid power.
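The runtime math in the comment above can be sketched roughly; all numbers below are illustrative assumptions (battery capacity, inverter efficiency, load), not measurements from the video:

```python
def runtime_hours(battery_wh: float, load_w: float,
                  inverter_efficiency: float = 0.85) -> float:
    """Estimate how long a battery-backed inverter can sustain a load.

    battery_wh: usable battery capacity in watt-hours (assumed)
    load_w: steady wall draw of the PC in watts
    inverter_efficiency: fraction of stored energy delivered as AC (assumed)
    """
    if load_w <= 0:
        raise ValueError("load must be positive")
    return battery_wh * inverter_efficiency / load_w

# A hypothetical 2000 Wh battery feeding a ~550 W gaming load
# (3080 Ti + 5950X, per the comment) lasts a few hours:
print(round(runtime_hours(2000, 550), 1))  # 3.1
```

So a 2000-3000 Wh unit sustaining a 500-600 W gaming rig for "hours" checks out, ignoring inverter idle draw.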

  • @-Baikal-
    @-Baikal- 2 years ago +7

    I still have my old HD 6950 in a box, now we've gone a full circle

    • @Fractal_32
      @Fractal_32 2 years ago +2

      Nice! Personally I’m waiting for a RX 7990 or RX 7970 next generation, I think it would be an amazing throwback.

    • @DigitalJedi
      @DigitalJedi 2 years ago +2

      @@Fractal_32 There are rumors of AMD overclocking the balls off of some top-bin chips to compete with the 4090ti. That would either be a 7950, 7970, or 7990 depending on the naming they go for. I'd love to see another 7990 personally

    • @Fractal_32
      @Fractal_32 2 years ago

      @@DigitalJedi Personally I've heard RDNA3 is going to be MCM, but that's according to leakers (particularly Moore's Law Is Dead); take it with a grain of salt since it is not out yet.

    • @DigitalJedi
      @DigitalJedi 2 years ago +1

      @@Fractal_32 My understanding is that it's MCM at the top end and only uses a single unit for the lower-end stuff.

    • @Fractal_32
      @Fractal_32 2 years ago

      @@DigitalJedi That seems understandable. I'm hoping these leaks are correct, because it would be cool to have an MCM GPU, especially if that allows modularity in the future: imagine AMD putting dedicated chiplets on GPUs to speed up specific tasks, or building massive dies that would not be feasible with a monolithic approach.

  • @luckylukeskywalker
    @luckylukeskywalker 2 years ago +1

    I like that you show many more productivity benchmarks nowadays.
    The only thing to optimize would maybe be a general performance comparison graphic up front instead of so many single benchmarks, but you are still getting better (perfecting, you should say). Informative as always.

  • @MatWilson2612
    @MatWilson2612 2 years ago +3

    The real question is:
    Clock for clock, are the 6900XT and 6950XT the same (or are AMD's claims about Infinity Cache true?), and can an average 6900XT match or come close to a stock 6950XT?

    • @Fractal_32
      @Fractal_32 2 years ago +1

      Do you mean Infinity Cache? Infinity Fabric is on AMD's CPUs and is what Intel criticized as "gluing" the cores together.

    • @MatWilson2612
      @MatWilson2612 2 years ago +1

      @@Fractal_32 Yes, that's what I meant :)

  • @ManWatchingtheStars
    @ManWatchingtheStars 2 years ago

    As someone who uses Siemens NX with an NVIDIA RTX A4000, my jaw kinda dropped to the floor @ 3:59.

  • @MeriaDuck
    @MeriaDuck 2 years ago +7

    That power increase to performance gain is so crazy. I almost always run my GPU at fifty to sixty percent power, with a ten to twenty percent performance penalty. I hardly ever run above 70 percent, because otherwise it gets warm and loud.
    Triple fans at 4k RPM is datacenter level... and that's a 200W AMD GPU.

    • @Wrublos212
      @Wrublos212 1 year ago

      True, the power/performance curve is insane, especially if you also adjust GPU voltage (undervolting). Hot spot temps on the 6950XT are high; it would be nice to do that.
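The tradeoff this thread describes (large power cuts for small performance penalties) reduces to simple arithmetic; the 60% power / 85% performance figures below are just the commenter's ballpark, not measured data:

```python
def perf_per_watt_gain(power_fraction: float, perf_fraction: float) -> float:
    """Relative efficiency versus stock settings (1.0 = no change).

    power_fraction: power limit as a fraction of stock (e.g. 0.60)
    perf_fraction: resulting performance as a fraction of stock (e.g. 0.85)
    """
    return perf_fraction / power_fraction

# Running at 60% power with a 15% performance penalty is ~42% more
# efficient per watt than stock:
print(round(perf_per_watt_gain(0.60, 0.85), 2))  # 1.42
```

This is why power-limited or undervolted cards land so far ahead on efficiency: GPUs are shipped well past the knee of their voltage/frequency curve.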

  • @tjchoe5824
    @tjchoe5824 2 years ago

    0:42 that subtitle. Whoever made that deserves a raise

  • @MegaAgamon
    @MegaAgamon 2 years ago +62

    I will never forgive AMD for missing this chance to name it the 6942XT

  • @seancooley345
    @seancooley345 2 years ago +1

    In case people were curious, the difference in clock speeds isn't necessarily due to corporate greed. Common practice across all kinds of silicon is to produce a run and benchmark the chips' clock speeds. The company will then create a new SKU for a batch if it tests significantly differently from the rest, listing a new clock speed for that silicon. From a consumer POV, it looks like they made a new product that is identical, just with a different clock speed.
    You could call this corporate greed, but it goes both ways: companies will slightly increase prices for slightly faster clock speeds and slightly decrease prices for slower ones.
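The binning practice described above can be put into a toy sketch. The 2310 MHz cutoff is purely illustrative (it happens to be the 6950 XT's advertised boost clock), not AMD's actual binning criterion, and the tested clocks are made-up numbers:

```python
def bin_chips(tested_clocks_mhz: list[int],
              cutoff_mhz: int = 2310) -> tuple[list[int], list[int]]:
    """Split a production run into two hypothetical SKUs by validated clock."""
    top_sku = [c for c in tested_clocks_mhz if c >= cutoff_mhz]
    base_sku = [c for c in tested_clocks_mhz if c < cutoff_mhz]
    return top_sku, base_sku

# Chips that validate at or above the cutoff become the faster SKU:
top, base = bin_chips([2250, 2340, 2310, 2290, 2400])
print(top, base)  # [2340, 2310, 2400] [2250, 2290]
```

Same silicon, same design; the SKU split falls out of where each chip lands on the test bench.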

  • @murdoch9106
    @murdoch9106 2 years ago +7

    The MSFS 2020 results seem a bit odd to me; it should run just fine. My cousin has a 5700XT and I have a 2060 Super, and he beats me despite having a much older system vs my brand new 5900X, or maybe we tie... still, it should run fine on AMD. Sure, these are RDNA 2 cards, but it's probably just a bug or temporary issue. Sadly, that's all it takes to put a stain on one brand or the other, since it's rare that the tests are redone, and if they are, no one will see them or care at that point. MSFS 2020 is probably more to blame than the GPU, tbh. As much as I love it, it can be great one week and a wreck the next, depending on what happened, what updates have dropped, and so on.
    I went from CTDs for 6 months to running just fine with the SU9 update a few weeks ago...

    • @OneDumbOrangeCat
      @OneDumbOrangeCat 2 years ago +5

      5700 XT is just a better overall card.

    • @luminatrixfanfiction
      @luminatrixfanfiction 2 years ago +2

      By getting a 2060, you wasted the opportunity to take advantage of Smart Access Memory (SAM), which allows the CPU to access all of the GPU memory; AMD GPUs pair off very well with AMD CPUs. You get about a 10-15% performance boost from SAM by itself.

    • @canadajones9635
      @canadajones9635 2 years ago +1

      @@luminatrixfanfiction You can still enable Resizable BAR in the BIOS to get much the same effect.

  • @sgtmaggi
    @sgtmaggi 2 years ago +1

    Dude, peak power consumption is close to my whole power supply. I have an i5 9400 (would've preferred an R5 3600) and an RTX 2060 powered by a 600W PSU.
    It's insane how much these new cards draw

    • @DigitalJedi
      @DigitalJedi 2 years ago

      I have a gaming laptop with a 9750HK and a desktop class 1660ti (same TDP). I peak at 220W from the wall outlet when docked.

  • @jeo228
    @jeo228 2 years ago +16

    You can easily get a 6900xt for around 1100-1200, and for that performance you're beating 3090s and 3090 Tis in raster and even competing with 3080s in ray tracing, all for the price of an MSRP 3080ti. The 6900xt has been the sleeper card this entire shortage.

    • @watchacc7109
      @watchacc7109 2 years ago +4

      Tried buying a 3080 for 2 months on GPU drops and failed. Then, looking at the market, people were selling it for 3x, wtf, and I saw that the 6900xt was selling for 1.1x-1.2x, so I got one.

    • @TheObsesedAnimeFreaks
      @TheObsesedAnimeFreaks 2 years ago

      crys in 2k for a water cooled 6900xt

    • @untitled795
      @untitled795 2 years ago

      ONLY for gaming.

    • @jeo228
      @jeo228 2 years ago

      @@TheObsesedAnimeFreaks just buy a custom block separately and ur good.

    • @TheObsesedAnimeFreaks
      @TheObsesedAnimeFreaks 2 years ago +1

      @@jeo228 crys in bought the watercooled gpu back in October or so.

  • @octoman_games
    @octoman_games 1 year ago +1

    I just picked up a 6950XT new for $700 with 2 games on Newegg!

  • @colewelden
    @colewelden 2 years ago +10

    I swear, with all these oddly numbered cards we are now seeing a lot of overlap between brands. I mean, Intel already made a CPU called the 6950X, and this is the 6950XT as a GPU from AMD. It has definitely been long enough that it won't lead to any confusion, but it just seems odd. Especially back with the whole X299 / X399 thing.

    • @CakePrincessCelestia
      @CakePrincessCelestia 2 years ago +1

      There's been a Radeon 6950 in 2011 already, got one of them in an older rig. Was a pretty decent card for its time, used it until the 970 was released and I got myself one. Both cards were around 300 bucks.

    • @colewelden
      @colewelden 2 years ago +2

      There was also the RX series. The same names as older GTX cards, just with an RX instead. Ryzen as well. R7 2700 vs i7 2700. I know there are gaps in time usually to help with confusion. But do they really not have any way of just.. making a uniquely named product? It was particularly bad with Intel's workstation X299 chipset. AMD came along and took the X399 chipset for THEIR workstation chip. Or just products with the exact same name that aren't remotely the same thing. Like the Titan X.. and the Titan X. And then when people found a way to distinguish the two by calling the Pascal version the "Titan Xp", what did Nvidia do? They made a new Titan called.. The Titan XP. It's just astounding to me some of the naming decisions that all three of them make. Sometimes it seems they are just being petty to one another at the expense of the consumer.

    • @colewelden
      @colewelden 2 years ago

      And I guess with the Titan X thing it was just Nvidia being stupid as hell.

  • @zombl337og
    @zombl337og 2 years ago +2

    There will ALWAYS be something better right around the corner. Just buy what you can afford / makes sense to you, use it for all it's worth, and THEN upgrade.
    Anyway, good video, and I always love seeing new tech

  • @SurgStriker
    @SurgStriker 2 years ago +10

    They will need to offer special venting systems that allow you to direct the heat from your GPU, like during winter you can aim it to blow over your keyboard and mouse, keeping your hands toasty when the room gets cold. or pipe it out through a ceiling vent during summers so you don't end up getting heatstroke just trying to game.

  • @EhEhEhEINSTEIN
    @EhEhEhEINSTEIN 2 years ago

    Still have both of my XFX Radeon HD 6950 2GB "Double Ds" from back in the z77/2600k with x-fire days lol. One in my mom's Mahjongg machine and one spare to swap out occasionally for a clean/repaste.

  • @Xfade81
    @Xfade81 2 years ago +6

    Hopefully some day we won't need GPUs the size of a placemat. I thought my 2080ti was huge.

  • @alproxxy
    @alproxxy 2 years ago +2

    Honestly, I'm excited for the next generation of GPUs, since AMD's RDNA3 vs RTX is gonna be an actual battle, and hopefully gives Nvidia a run for their money. It'd be a nice battle to see

  • @AMDRyzenEnthusiastGroup
    @AMDRyzenEnthusiastGroup 2 years ago +3

    All things considered, in a world where ppl were paying more than this for a 6700 XT/RTX 3070, a couple months ago, this doesn't feel like a bad deal at all, TBH. And IF you're the type of person to drop over 1K on a GPU, it's really hard to justify buying a 3090/3090 Ti instead of one of RX 6950 XT, unless you just have a major boner for raytracing. I mean, frankly, if playing at 4K or trying to max out 1440p FPS, this thing looks like a great option, if the prices stay real. (Side-note, as of now, these are in stock for $1099 on Newegg)

    • @shre6619
      @shre6619 2 years ago

      But for productivity and ML/AI type stuff,
      Nvidia is better (with their proprietary CUDA cores and CUDA-accelerated workflows)

    • @AMDRyzenEnthusiastGroup
      @AMDRyzenEnthusiastGroup 2 years ago +1

      @@shre6619 For productivity you shouldn't be looking at an RDNA card anyway, you should be choosing between CDNA and Nvidia. This is a gaming card. Radeon segmented the two use-cases into different products.

  • @Demorthus
    @Demorthus 2 years ago

    5:49 As an apartment dweller, that pisses me off lol... Just a laser printer in my room makes the lights dim & flicker when it's turned on, so it's realistically not far off for someone with a power-hungry GPU to actually need an additional breaker

  • @ShadowFace-fh5zc
    @ShadowFace-fh5zc 2 years ago +4

    Nice! I hope you can buy it and don't have to pay double the price on eBay...

  • @The_Absolute_Dog
    @The_Absolute_Dog 2 years ago

    @7:48
    Alex: But I can believe, that I'm going to tell you about our sponsor.
    James: Isn't Alex.

  • @mees05
    @mees05 2 years ago +15

    It's in stock for €1200 while the 3090 costs €2000 (European prices are still trash). Nvidia really has to drop prices now (though I know people will still pay 800 too much because they know Nvidia works for them)

    • @Hybris51129
      @Hybris51129 2 years ago +7

      I think that is the key factor here. AMD's GPUs have been so mediocre for so long that people acknowledge that they exist and are better than nothing, but when they are ready to spend their money it's on an Nvidia. I remember walking through my local computer store back in December or January, and the Nvidia shelves were completely empty while across the aisle the AMD section was full and actually had dust forming on the boxes.
      The product was there but it wasn't moving.

    • @pieter1234569
      @pieter1234569 2 years ago +3

      It's still not necessarily better. They lack real RTX support and DLSS, so any RTX card may still be the best option. Although no one buys a 3090 for gaming.

    • @mees05
      @mees05 2 years ago +6

      @@pieter1234569 FSR 2 will be added to games soon (the first one will be this Thursday). From what we currently know it seems like it can be as good or better than DLSS, so I feel like that's not really a reason to get an RTX card.
      The 3090 is better at RT, but would you really pay €800 (67%) extra just for some nicer reflections in games?
      In this case the 3090 only makes sense when you need it for work

    • @Fractal_32
      @Fractal_32 2 years ago +2

      @@pieter1234569 As someone who plays older games/games without ray tracing, I really don't get why people are amazed or disappointed based on a card's ray tracing performance. I see ray tracing as a gimmick and will continue to focus on rasterization performance until the games I play do ray tracing.

    • @quackatit
      @quackatit 2 years ago +1

      @@pieter1234569 DLSS is not much better than FSR, and FSR works with every card and every game. But yeah, AMD RT is not as good as Nvidia's

  • @dracolnyte
    @dracolnyte 2 years ago

    0:19 "and thats not the only thing that is weird" - i thought you were going to introduce your sponsor like that haha

  • @waverleyjournalise5757
    @waverleyjournalise5757 2 years ago +6

    Since every piece of today's productivity software has been optimised for Nvidia hardware from Day 1, those benchmarks will get better the older the card gets. But the gaming results are incredible.

  • @bobbleczar
    @bobbleczar 2 years ago

    I got my 6800XT when it launched for MSRP, and it's great. I'm a full-time editor and part-time gamer.

  • @RikkSpencer
    @RikkSpencer 2 years ago +20

    At some point gamers and enthusiasts are going to have to stand on their principles. At some point we need to all say “enough” and make a mass commitment to stop buying Nvidia and AMD desktop GPUs for like six months. These prices have gotten out of hand - and it is not simply a product of inflationary pressures. AMD, Nvidia and their AIBs have gotten too comfortable with soaking their consumer base. Either we need to do something in the market, or the FTC needs to target both companies for their anti-consumer practices through anti-trust legislation.

    • @PruTroom
      @PruTroom 2 years ago +4

      It would happen if we weren't so into gaming and other heavy workloads

    • @Fractal_32
      @Fractal_32 2 years ago +2

      Why buy a top-tier card now if you can buy a mid-tier card later? It will perform exactly the same at a lower price and power consumption.

    • @evilcab
      @evilcab 2 years ago

      I agree the prices have been rising, but nobody is forcing you to buy the top-of-the-line card; you can always buy the low-end one.

    • @patrickballou1
      @patrickballou1 2 years ago +4

      In general I agree, but this card at $1100 massively undercuts the current competition, so I’m actually happy with it. If supply goes up and scalpers stop, then the secondhand market will allow for good deals which would help too

    • @teapouter
      @teapouter 2 years ago +2

      But you could just buy a 3080 or a 6800XT. Nothing other than your own ego is making you buy these super high-end cards. It's an established fact that they have diminishing returns.

  • @imadecoy.
    @imadecoy. 2 years ago

    It's called a reference card. "Founder's Edition" is an Nvidia marketing term introduced with the GTX 10 series.

  • @S1LV3R03
    @S1LV3R03 2 years ago +4

    My reference 6900xt is overclocked to 2500MHz, memory at 2080MHz, and with a serious undervolt. All of that with a max power consumption of 270 watts, so waaaaay better than the 6950xt tested here. This card is completely useless if you know basic tweaking.

    • @jasonmajere2165
      @jasonmajere2165 2 years ago

      Seems like you hit the silicon lottery also.

  • @keisaboru1155
    @keisaboru1155 2 years ago +1

    Germany's stock shifted. The 3070 Ti is the same, but the 3080 Ti & 3090 Ti increased by 45% for no reason.
    I saw a 3090 for 1200. Now it's gone, and the 3080s now cost 1200. 😁

  • @schwarzmakromedia3951
    @schwarzmakromedia3951 2 years ago +1

    If it's really the competitor to the 3080 Ti like you said at the start, why don't you just compare overall scores to it in the end? Why compare it only to the 6900 XT at 6:16?

    • @roybrown6058
      @roybrown6058 2 years ago

      At that timestamp he's comparing the fact that it costs 10% more for almost 10% better performance. Literally every other chart has the 3080ti in it.

    • @schwarzmakromedia3951
      @schwarzmakromedia3951 2 years ago

      @@roybrown6058 OK... I'm just going to say again that I would have loved to see all the 3080 Ti scores combined. I don't know why your argument is that there are many charts with the 3080 Ti in it.
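The 10%-more-money-for-10%-more-performance point in this thread reduces to perf-per-dollar arithmetic. The numbers below are illustrative round figures, not the review's exact results:

```python
def value_ratio(price_new: float, price_old: float,
                perf_new: float, perf_old: float) -> float:
    """Perf-per-dollar of the new card relative to the old one (1.0 = equal value)."""
    return (perf_new / price_new) / (perf_old / price_old)

# ~10% higher price for ~9% higher performance leaves value nearly unchanged:
print(round(value_ratio(1100, 1000, 1.09, 1.00), 2))  # 0.99
```

In other words, the refresh neither improves nor worsens the value proposition much; it just moves the price/performance point up the same line.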

  • @SasukeKunGaming
    @SasukeKunGaming 1 year ago +2

    It's crazy how this card now costs around $600

    • @pieterrossouw8596
      @pieterrossouw8596 1 year ago +1

      Yes, this or even the 6800XT are among the best deals everywhere.
      I'm pretty sure that's why AMD priced their 7000 series so much higher... AMD gets put down for having no marketshare in e.g. Steam hardware surveys, but if I sort by popularity on my local retailer's website it's 6800XT, 3060 Ti, 6900XT, 4070, 7900XT. These 6000 series cards are shipping in bulk right now.

    • @SasukeKunGaming
      @SasukeKunGaming 1 year ago

      @@pieterrossouw8596 But sadly I live in India, so it's still like $700-800 here

    • @pieterrossouw8596
      @pieterrossouw8596 1 year ago

      @@SasukeKunGaming yikes that's rough

  • @_Azariah
    @_Azariah 2 years ago

    I walked into a Best Buy and bought an MSI Mech 2X 6600 xt, then returned it the following day. Not because anything was wrong with it, but because now that I had one in my hands, I felt like I didn't need it. So I owned a 6600 xt for like 10 hours.

  • @talhahtaco2035
    @talhahtaco2035 2 years ago +3

    The 6950, a card that rivals Nvidia's 3090 and 3090ti while being hundreds of dollars cheaper in MSRP, although the power draw is kinda insane. GPUs are so power-hungry these days; what's next, a GPU that sucks down 1000 watts sustained?

    • @trippybruh1592
      @trippybruh1592 2 years ago

      Hell, the way my downstairs is set up, if I run the microwave and the space heater at the same time I trip a breaker, and that's with a mini fridge on the same circuit.
      Once my PC starts tripping the breaker I'll just microwave popcorn upstairs.

  • @Juice-chan
    @Juice-chan 2 years ago +1

    The Nvidia 4000 series will probably pull off the same shenanigans as last time. They really found a way to double prices in one generation and got people to roll with it.

  • @CeroSantos
    @CeroSantos 1 year ago +3

    Got one on sale for $630, can't wait 😁

  • @millar876
    @millar876 2 years ago

    Not strictly related, but w.r.t. the dedicated circuit comment, I'm glad I'm not in a 110V territory. Here in the UK our standard outlets can supply 3kW, and those circuits are usually wired on a ring main with a 16A breaker giving around 3.6kW, while the cables themselves are rated to 3.9kW
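The figures in the comment above follow directly from P = V × I; a quick check using nominal voltages (230V for the UK, 120V for North America, both assumptions for the sake of illustration):

```python
def circuit_watts(volts: float, amps: float) -> float:
    """Continuous power a circuit can supply at a given voltage and breaker rating."""
    return volts * amps

print(circuit_watts(230, 16))  # 16A at UK mains voltage: 3680 W (~3.6 kW)
print(circuit_watts(120, 15))  # typical North American 15A circuit: 1800 W
```

That gap is why a 600W+ GPU build is a non-issue on a UK circuit but can crowd a shared 110-120V one.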

  • @jochar3216
    @jochar3216 1 year ago +4

    11 months later and I got one for $610

    • @jochar3216
      @jochar3216 1 year ago

      @@JayzBeerz I didn’t get my tlou code cuz I ordered it on Amazon and I’m seriously considering returning it and ordering it from Newegg at the same price just so I can get the game lol

    • @TheWorstGamerr
      @TheWorstGamerr 1 year ago

      ​@@jochar3216What is the maximum temperature on your 6950 XT?

  • @nemtudom5074
    @nemtudom5074 2 years ago

    6:19 That is absolutely NOT FINE
    It's not a FKIN improvement if the price goes up too (and so does the power consumption...)
    The goal is to perform the same at a lower price
    Or
    Perform better at the same price
    It's not a 'gain' if both the price and the performance go up!

  • @rajder656
    @rajder656 2 years ago +11

    Imo this is a good deal. You get mostly 3090ti performance for almost half the price

    • @chillhour6155
      @chillhour6155 2 years ago

      It's going to be 3 grand at least

    • @rajder656
      @rajder656 2 years ago

      @@chillhour6155 The MSRP is still 1100, compared to at least 2k for the 3090ti

    • @Killswitch1411
      @Killswitch1411 2 years ago

      I'd like to get one... but I run a Varjo Aero and it's not compatible with Radeon cards, and that MSFS performance is strange.

  • @deansmith4752
    @deansmith4752 2 years ago

    No doubt these are the result of testing the cards in production: those that met the higher clock rate were kept until there was enough stock (and the right timing) to make a SKU, with the Nvidia 4 series just a few months away

  • @RealKacho
    @RealKacho 2 years ago +6

    I'd like to see these tests 6 months or so down the line;
    AMD's software engineers tend to drop the ball on keeping performance good through updates.
    Then again, with the issues I've had with Nvidia's control panel lately, it might still be neck and neck

    • @just_tom00
      @just_tom00 1 year ago

      Just grabbed one a month ago for $750. Came with 2 new games, and I haven't gone below 100fps at ultra settings in 1440p in any game I play

  • @deeeezel
    @deeeezel 2 years ago +1

    Technically this GPU has been out for a while, but it was intended to be sold only by OEMs, and those only came water-cooled: the AMD 6900XT LC

  • @russellsylvia
    @russellsylvia 2 years ago +5

    I hope so, I need a new GPU for my gaming rig

  • @andrewmaughan1205
    @andrewmaughan1205 2 years ago

    When the sale began this morning, I managed to purchase an RX 6750 XT from AMD directly ($549.99) to replace the Strix RX 480 8GB OC in my GF's Ryzen 3600 PC build. I have no doubt that she will love the upgrade where her gaming experience is concerned.

  • @aaronrio4271
    @aaronrio4271 2 years ago +3

    Welp, now that these are available for like $700ish, they're a killer deal lol

    • @Fheezi
      @Fheezi 2 years ago

      Would you say it’s worth the buy?

    • @aaronrio4271
      @aaronrio4271 2 years ago

      @@Fheezi definitely

    • @atvkid0805
      @atvkid0805 2 years ago +1

      yup, just got one for $730 heck of a deal

    • @Buhtbeard
      @Buhtbeard 2 years ago

      @@atvkid0805 Where did you get yours? The cheapest I can find is $775

    • @GandalfGreyhame
      @GandalfGreyhame 1 year ago +2

      @@Buhtbeard The MSI version is at $700 on Amazon. Honestly insane when comparing to other cards, and the fact that it launched with a price of $1100

  • @chandlergloyd4230
    @chandlergloyd4230 2 years ago

    Yeah, it doesn't really make sense to say they contributed to the shortage for money, when they held onto chips from the peak-price period to sell once the shortage was over, and the amount of computational power produced per dollar spent is the same.

  • @radomiami
    @radomiami 2 years ago +13

    Honestly, this launch seems like AMD's laughing at NVIDIA for their pricing woes, while AMD's pricing basically goes back to normal.

  • @peterpersson1967
    @peterpersson1967 2 years ago

    Thanks for including SolidWorks in your test suite

  • @clark85
    @clark85 2 years ago

    I've had the RX 6800 for a year now and I've been pretty happy. Got it just before prices went crazy, 890 CAD for it, and this new release sounds like a good one, especially now that FSR 2.0 and RSR are already available.

  • @blackwizard5109
    @blackwizard5109 2 years ago +5

    What we need is a new budget graphics card that is actually budget-priced

    • @blackwizard5109
      @blackwizard5109 2 years ago

      @@theplayerofus319 I agree, but not everyone can afford it

  • @DSWEIG
    @DSWEIG 2 years ago +1

    With driver updates, have the thermals and power consumption been improved at all?

  • @wetplant1748
    @wetplant1748 2 years ago +3

    It's impressive that in 5 years, AMD went from a crappy CPU and GPU company used only by poor people to a company that can rival Intel and Nvidia