I swear, with all these oddly numbered cards we are now seeing a lot of overlap between brands. I mean, Intel already made a CPU called the 6950X... and this is the 6950XT, a GPU from AMD. It has definitely been long enough that it won't lead to any confusion, but it just seems odd. Especially back with the whole X299 / X399 thing.
There was already a Radeon 6950 back in 2011; got one of them in an older rig. Was a pretty decent card for its time, used it until the 970 was released and I got myself one. Both cards were around 300 bucks.
There was also the RX series - the same names as older GTX cards, just with an RX instead. Ryzen as well: R7 2700 vs i7 2700. I know there are usually gaps in time to help with confusion, but do they really not have any way of just... making a uniquely named product? It was particularly bad with Intel's workstation X299 chipset; AMD came along and took X399 for THEIR workstation chipset. Or just products with the exact same name that aren't remotely the same thing, like the Titan X... and the Titan X. And then, when people found a way to distinguish the two by calling the Pascal version the "Titan XP", what did Nvidia do? They made a new Titan called... the Titan Xp. It's just astounding to me, some of the naming decisions all three of them make. Sometimes it seems they are just being petty to one another at the expense of the consumer.
there will ALWAYS be something better, right around the corner. Just buy what you can afford/makes sense to you, use it for all its worth, and THEN upgrade. Anyways, good video, and always love seeing new tech
They will need to offer special venting systems that allow you to direct the heat from your GPU, like during winter you can aim it to blow over your keyboard and mouse, keeping your hands toasty when the room gets cold. or pipe it out through a ceiling vent during summers so you don't end up getting heatstroke just trying to game.
Still have both of my XFX Radeon HD 6950 2GB "Double Ds" from back in the z77/2600k with x-fire days lol. One in my mom's Mahjongg machine and one spare to swap out occasionally for a clean/repaste.
Honestly, I'm excited for the next generation of GPUs, since AMD's RDNA 3 vs RTX is gonna be an actual battle, and hopefully gives Nvidia a run for their money. It'd be a nice battle to see
All things considered, in a world where people were paying more than this for a 6700 XT/RTX 3070 a couple months ago, this doesn't feel like a bad deal at all, TBH. And IF you're the type of person to drop over 1K on a GPU, it's really hard to justify buying a 3090/3090 Ti instead of an RX 6950 XT, unless you just have a major boner for raytracing. I mean, frankly, if playing at 4K or trying to max out 1440p FPS, this thing looks like a great option, if the prices stay real. (Side-note: as of now, these are in stock for $1099 on Newegg)
@@shre6619 For productivity you shouldn't be looking at an RDNA card anyway, you should be choosing between CDNA and Nvidia. This is a gaming card. Radeon segmented the two use-cases into different products.
5:49 as an apartment dweller, that pisses me off lol... Just a laser printer in my room makes the lights dim & flicker when it's turned on, so it's realistically not far off for someone else with a power-hungry GPU to actually need an additional breaker
It's in stock for €1200 while the 3090 costs €2000 (Europe prices are still trash), Nvidia really has to drop prices now (though I know people will still pay 800 too much because they know Nvidia works for them)
I think that is the key factor here. AMD's GPUs have been so mediocre for so long that people acknowledge they exist and are better than nothing, but when they are ready to spend their money, it's on an Nvidia. I remember walking through my local computer store back in December or January: the Nvidia shelves were completely empty, while across the aisle the AMD section was full and actually had dust forming on the boxes. The product was there but it wasn't moving.
It's still not necessarily better. They lack real RTX support and DLSS, so an RTX card may still be the best option. Although no one buys a 3090 for gaming.
@@pieter1234569 FSR 2 will be added to games soon (the first one will be this Thursday). From what we currently know it seems like it can be as good or better than DLSS, so I feel like that's not really a reason to get an RTX card. The 3090 is better at RT, but would you really pay €800 (67%) extra just for some nicer reflections in games? In this case the 3090 only makes sense when you need it for work
@@pieter1234569 as someone who plays older games/games without Raytracing I really don’t get why people are amazed or disappointed based off a cards Raytracing performance. I see Raytracing as a gimmick and will continue to focus on Rasterization performance until games I play do Raytracing.
Since every piece of today's productivity software has been optimised for Nvidia hardware from Day 1, those benchmarks will get better the older the card gets. But the gaming results are incredible.
At some point gamers and enthusiasts are going to have to stand on their principles. At some point we need to all say “enough” and make a mass commitment to stop buying Nvidia and AMD desktop GPUs for like six months. These prices have gotten out of hand - and it is not simply a product of inflationary pressures. AMD, Nvidia and their AIBs have gotten too comfortable with soaking their consumer base. Either we need to do something in the market, or the FTC needs to target both companies for their anti-consumer practices through anti-trust legislation.
In general I agree, but this card at $1100 massively undercuts the current competition, so I’m actually happy with it. If supply goes up and scalpers stop, then the secondhand market will allow for good deals which would help too
But you could just buy a 3080 or a 6800XT. Nothing other than your own ego is making you buy these super high-end cards. It's an established fact that they have diminishing returns.
My reference 6900xt is overclocked at 2500 MHz, memory at 2080 MHz, and with a serious undervolt. All of that with a max power consumption of 270 watts, so waaaaay better than the 6950xt tested here. This card is completely useless if you know basic tweaking.
Germany's stock shifted. The 3070 Ti is the same, but the 3080 Ti & 3090 Ti increased by 45% for no reason. I saw a 3090 for 1200; now it's gone and the 3080s cost 1200. 😁
if it's really the competitor to the 3080 Ti like you said at the start, why don't you just compare overall scores to it in the end??? why compare it only to the 6900 XT at 6:16???
@@roybrown6058 ok... I'm just gonna say again that I would have loved to see all the 3080 Ti scores combined. I don't know why your argument is that there are many charts with the 3080 Ti in them.
Yes, this or even the 6800XT are among the best deals everywhere. Pretty sure that's why AMD priced their 7000 series so much higher... AMD gets put down for having no market share in e.g. Steam hardware surveys, but if I sort by popularity on my local retailer's website it's 6800XT, 3060 Ti, 6900XT, 4070, 7900XT. These 6000 series cards are shipping in bulk right now.
I walked into a Best Buy and bought an MSI Mech 2X 6600 XT, then returned it the following day. Not because anything was wrong with it, but because now that I had one in my hands, I felt like I didn't need it. So I owned a 6600 XT for like 10 hours.
The 6950, a card that rivals Nvidia's 3090 and 3090 Ti while being hundreds of dollars cheaper in MSRP, although the power draw is kinda insane. GPUs are so power-hungry these days; what's next, a GPU that sucks down 1000 watts sustained?
Hell, the way my downstairs is set up, if I run the microwave and the space heater at the same time I trip a breaker, and that's with a mini fridge on the same circuit. Once my PC starts tripping the breaker I'll just microwave popcorn upstairs.
The 4000 series from Nvidia will probably pull off the same shenanigans as last time. They really found a way to double the prices in one generation and made people roll with it.
Not strictly related, but wrt the dedicated circuit comment, I'm glad I'm not in a 110 V territory. Here in the UK our standard outlets can supply 3 kW, and those circuits are usually wired on a ring main with a 16 A breaker giving around 3.6 kW, whilst the cables themselves are rated to 3.9 kW
@@JayzBeerz I didn’t get my tlou code cuz I ordered it on Amazon and I’m seriously considering returning it and ordering it from Newegg at the same price just so I can get the game lol
6:19 That is absolutely NOT FINE. It's not a FKIN improvement if the price goes up too (and so does the power consumption...). The goal is to perform the same at a lower price, or perform better at the same price. It's not a 'gain' if both the price and the performance go up!
No doubt these are the result of testing cards in production and keeping those that met the higher clock rate until there was enough stock (and the right timing) to make a SKU, with Nvidia's 40 series just a few months away
I'd like to see these tests 6 months or so down the line; AMD's software engineers tend to drop the ball on keeping performance good through updates. Then again, with the issues I've had with Nvidia's control panel crap lately, it might still be neck and neck
When the sale began this morning, I managed to purchase an RX 6750 XT from AMD directly ($549.99) to replace the Strix RX 480 8GB OC in my GF's Ryzen 3600 PC build. I have no doubt that she will love the upgrade where her gaming experience is concerned.
Yeah, it doesn't really make sense to say they contributed to the shortage for money, when they held onto chips from the peak-price period to sell once the shortage was over, and the amount of computational power produced per dollar spent stayed the same.
I've had the RX 6800 for a year now and I've been pretty happy. Got it just before prices went crazy (890 CAD for it), and this new release sounds like a good one, especially now that FSR 2.0 and RSR are already available.
GPU's are going to need a "summer mode" so they don't cook us while we game. Very useful for winter though, it's getting cold so I better game for an hour. Maybe we will even see new case designs that can direct the warm air towards us or outside via ducting.
you can underclock slightly to solve the first issue :)
100% agree idk why they haven't done this yet tbh. my 590 was a great heater in the long winter months.
"Quiet" mode or just turning down the max power usage will do it. I rarely run my 300Watt capable GPU over 220watts, and it's most often closer to 140watts- it idles around 8Watts.
GPUs*
Given how expensive gas has become in Europe that's a nice feature to have.
It's worth pointing out that the power consumption seems to be unique to this specific model - competitors from Gigabyte and Sapphire run closer to 380-390W with performance being very close. Still ludicrous, but far less than this.
"competitors from MSI"...it's an MSI gpu
@@paulmeyer1001 Lol, I meant Gigabyte. TPU has three 6950XT reviews, guess I got two of them mixed up.
Actually Gamers Nexus review of the Sapphire Nitro+ Pure 6950 XT shows the exact same crazy power consumption as this MSI version (or more). So it looks like the more OC-ed versions actually run this high in power and thermals too. Watch their video.
You're not gonna like Lovelace then lol
@@zaidlacksalastname4905 If the rumors are accurate, I doubt anyone not owning their own power plant will like that much. Though I guess a lot of people can use a space heater in winter.
Missed opportunity there. They should've called it the 6969 XT.
0:14 We were on the verge of greatness, we were this close.
6969 XD
Nah, the performance of it doesn't really justify the "6969 XT" moniker.
However, AMD isn't like Nvidia, where their "Ti" cards only mean "Tie" now, phonetically and literally in performance, so I guess that's a good thing.
💀
6969 XXX
Maybe an idea for a video: with the current prices for energy going up, what would be a decent rig to build with as much power efficiency as possible in mind, while running current-gen games competently? = ) The most obvious builds would likely end up being in the affordable range, while a maxed-out build could be expensive but interesting, as in: what parts can actually deliver great performance while heavily stunted in terms of power allowance, etc... I feel like lowering power consumption should be a market advantage, but it is not covered enough.
I would love to see that also. It MUST include IDLE power consumption as well. It AMAZES me, the stupidity of some AMD fans blabbering on about an AMD CPU using 10 watts less while at the same time running their entire system on a CHEAP low-grade PSU that wastes MUCH more power.
This question is already sort-of answered. Because of how binning goes, the most efficient hardware is usually high-end stuff that's been undervolted/downclocked/etc.
And generally the best way to get power efficiency is to downclock, at least on desktop hardware which tends to be tuned a little bit past the point of diminishing returns.
5800X3D + RTX 3070 Founders Edition. Undervolt both and you're probably at 250 watts at 100% load. Super strong and efficient.
@@tilapiadave3234 using a lower quality PSU wouldn't be anything to do with AMD, that would be down to the person choosing that PSU
@@williampaabreeves WHAT a RIDICULOUS thing to say... The person choosing the PSU, yes, and as I stated, those people are choosing CPUs just because an AMD is a few watts lower in power usage.
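For what the PSU side of that argument actually amounts to, here's a rough sketch with invented but plausible numbers (a 150 W average DC load; real efficiency curves vary with load and model):

```python
# Rough comparison of PSU conversion losses at a fixed DC load.
# Efficiency figures are illustrative stand-ins for a cheap unit vs a Gold-rated one.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    return dc_load_w / efficiency  # AC watts pulled from the outlet

load = 150.0  # assumed average DC load in watts
for label, eff in (("cheap PSU", 0.82), ("Gold-ish PSU", 0.92)):
    wall = wall_draw_w(load, eff)
    print(f"{label}: {wall:.0f} W at the wall, {wall - load:.0f} W lost as heat")
```

At these assumed numbers the gap is roughly 20 W of waste, i.e. about twice the "10 watts less" CPU difference being argued over.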
Wow that GPU is almost as big as my whole PC! Nice to have a fan heater which can also play games though, love a good multitasking appliance 🤪
Yeah, a GPU is basically another PC inside your PC: its own RAM, graphics processor, motherboard, and cooling. GPUs nowadays show it to us proudly with their length and height, and even their weight is quite impressive.
Equivalent Nvidia runs hotter and uses much more power actually.
nvidia users have had that for the whole 3000 generation :'D
Chuck two of them on top of each other, fan to backplate & you have the size of my PC.
@@geerstyresoil3136 and much better in all around performance
Didn't expect this to beat the RTX 3090 ti in gaming, especially costing so much less.
dont talk shit ;D they will find you and it wont lead to good stuff you know
Yes, but can you brag you have a 3090 Ti?
@@helloukw Well I could brag I have a 6950XT just as well. IMO, it's even cooler because I didn't waste 1000 bucks
It's basically an overclocked 6900xt; an overclocked 3090ti still beats it
@@mohammeded-dahbi7603 then an overclocked 6950xt beats it
As someone who has an Nvidia Mx130 I can confirm this gpu is good.
Lmao same. mx130 gang
@@yasaldesilva I get like 40-60 fps on Valorant lol, If I plug in my laptop cooler i get 70+
The quadro m620 gang where? (cut down gtx 950m that still performs the same somehow)
@@zero-ej6rt Sorry, I'm Quadro P520 gang...
...and RX 6600XT gang.
Intel UHD 730 iGPU represent!
I still cannot comprehend how Nvidia thought it was okay to release their 3090 with those kind of heat issues, seeing it sit that high above the rest of the cards on that graph is just depressing
having the fastest card makes shareholders happy, no matter the consequence
@@spagootest2185 And the STUPIDITY of that is amazing, since the VAST majority of sales are at the mid performance level
They really fucked up the cooling solution for the 3090. Having memory on both sides of the PCB and no real cooling for that backside is terrible. It's common for my 3090 to be at 80C core but have memory pegged at 100-110C. It's absolutely insane. Nvidia made water cooled back plates go from 'pointless gimmick' to actual necessity.
Nvidia sucks man.
I think these overheating cards may die sooner than mid-range cards
From the very beginning of the RDNA2 generation it was clear AMD wasn't focusing on productivity, but gaming. Which makes sense because they split their architecture between RDNA and CDNA. Nvidia doesn't do this which is why their cards tend to perform better in productivity; that's what they're designed for, but aren't marketed for.
because cuda optix and all proprietary bs
@@ThaexakaMavro only the losing side would blame the proprietary stuff; if being an open standard is so great, why hasn't AMD caught up yet...?
they‘re
@@evertchin It's a bit of both. CUDA is more mature and older, so a lot of software is better optimized for CUDA. Nvidia has put a lot of work into CUDA, but they've also kept it proprietary. The open standards aren't as mature or well supported/documented, so a lot of software doesn't support that or the support is of much lower quality. AMD is trying to close that gap, but it's nearly impossible to close Nvidia's lead while Nvidia is still investing so heavily into the platform.
So yes, these issues could be caused by the software makers not putting enough time/effort into the open standards to make them as performant, but no that is not an excuse. While the AMD hardware may be competitive or even superior, for a lot of workloads it is simply the inferior platform.
@@evertchin imagine being so smooth brained you break it down into winning and losing sides
If these cards keep growing in size like this, we will end up plugging the motherboard onto the GFX PCB
Look at the RTX 4090, humongous 😮
I have the XFX 6800XT on an ITX motherboard - it absolutely looks as silly as you'd expect. That said, it runs like a champ.
I believe that splitting their architectures in two, RDNA and CDNA, AMD hindered their image.
It helped them to increase traditional gaming performance relative to size of the die and TDP, but when people look at benchmarks, they see „AMD is a tiny bit faster than Nvidia when gaming, but sucks at rendering“, even though most don’t render anything.
Maybe. I think if AMD can convincingly trounce Nvidia in gaming, productivity scores won't hurt their sales much. Right now it's things like ray-tracing, driver compatibility, and DLSS that make me still favor Nvidia, but boy am I ready to jump ship.
Absolutely false. AMD's one-size-fits-all strategy pre-RDNA made them a jack of all trades, master of none.
Is the RDNA hardware actually crap at it, though, or is it just garbage (or dare I say purposefully nerfed) drivers? The world may never know
Makes sense. I mean, AMD provides the ability to customize their products to their customers' needs, such as a console that doesn't need the capability to run SPECviewperf. Splitting by capabilities and needs provides more flexibility for AMD.
I don't know if it helps, but I would actually switch Radeon Software from Graphics to Compute when mining sometimes (in Radeon Software, click the gear at the top right, Graphics, Advanced settings, and it's at the bottom: "GPU Workload"). It might be enough to sweep NVIDIA in some productivity/compute tasks, like SPECviewperf?
AMD card at 335W trades blows with Nvidia card at 465W. RDNA 3 will definitely become the new best in the GPU industry.
the only reason AMD is losing is Nvidia fanboys buying a card at double the price of a 6900xt for barely any gain
@@chem1kal please elaborate
they showed later in the video it was drawing about the same power as the 3090Ti
@@ravenclawgamer6367 There are still a LOT of fanboys of nGREEDIA. Just look at the market share for GPUs. NGREEDIA still has over 60% of the market.
Yet AMD has proven it's better in price/performance with the 6000 series. BUT BUT MUH 2k$ 3090. It's sad really.
@@ravenclawgamer6367 what the guy above me said.
Price-wise, AMD GPUs are the way to go. Until NVIDIA gets their pricing as “normal” as AMD, it seems Radeon GPUs will be the king of value this time around.
They always have been the king of value. 480 and 580 sold like hot cakes for this exact reason.
Haven't they always?
that's like me saying chickens lay eggs 🤣🤣
@@ancientflames because of miners, not gamers. The 480 and 580 were flops in the eyes of gamers
@@ILoveTinfoilHats HAHAHAHAHAHAHAHAAHAHHA
The "ultra overclocked" super cooler versions wouldn't be MSRP even in a perfect world they would be at least $100 more.
I gladly took a 3080 at just over MSRP about a week ago. I'm sure the next generation cards launching later this year will be awesome, but it's just going to be a repeat of the 30 series launch in my eyes.
I am debating getting a 3090 TI for just that reason. I could wait to see if I can get a 4000 series card or bite the bullet and buy the current gen.
@@Hybris51129 I'd buy it if I were you after the new cards drop - I think the interest in them would lower its price and potentially increase its availability.
value is best if you buy the cards at release. I got my 3080 a few months after release for 1200 Canadian and that was just before the prices went mental.
@@galgrunfeld9954 That is something I am seriously considering as well. Since I buy new GPUs only every 5-7 years (or 10 years for my current one), I have to try and get the absolute best I can.
Congrats! I camped out at Best Buy last summer for a 3080 FE at MSRP and I don't regret it one bit, haha. Enjoy!
Picked it up on release day. Been waiting 2 years for card prices to normalize. Had 3dfx and Nvidia for over 20 years. Even as an Nvidia fanboy, I am beyond impressed with this card, and luckily I got it on release day at MSRP pricing. For the price and power, it's what I've been waiting for. If Nvidia ever gets their stuff together I'll probably go back, but until then, I am happy with this card.
Any update after 10 months? How is it performing? Deciding on this or 4070ti
@@thatoutsider Still have the card and no regrets. Handles everything beyond well, and I have no desire to upgrade to any 4000 series or 7000 series card. This card has been solid
Thanks for sharing!
@@vmystikilv That's nice to hear. I just ordered the 6950xt today because my RTX 2070 doesn't have enough VRAM for games like Resident Evil 4, unfortunately. It will be delivered in 2-3 days; I'm so excited for that one. I also ordered a new 240 Hz WQHD screen that will come in 2 days.
I'm excited like a kid rn and can't wait anymore :D
@@xredoxiwhat cpu do you have?
Too little, too late. Although it will be chaotic trying to get a next-gen GPU when they release later in the year, they'll be worth waiting for. Refreshes like this should have dropped towards the end of last year, but it's understandable why they didn't with everything that was going on with the shortages & a pandemic.
nah
They probably wanted to wait until they had enough units to ship.
Better than Nvidia releasing 200 high-end GPUs you couldn't buy, lmao
Just wait: if China decides to storm Taiwan, inspired by the Russians in Ukraine, we will be home-soldering 486s to play Tetris, not arguing like these vague kids over when it's best to buy a 600 W-eating toy to play games.
Technology prices increasing on a steeper curve than the performance delivered is due to stagnation; until a new type of processing tech is created, we will always be paying an extreme price for little performance gain. The same applies to the smartphone market; petrol-based cars had this issue for many decades, and it's just started to happen to the silicon market
With these cards drawing more and more power and consumers not really seeming to notice, maybe it is time to have the same 'annual energy consumption' labels we see on home appliances. The energy draw of this card puts it square in the middle of the range of a full-size refrigerator.
I'd like to see them perform with power restrictions
This is going to be funny to see with California state banning gaming pcs because they draw too much power. The politicians in that state are going to have an aneurysm when the 4080ti cards draw more than 600 watts.
You make a lot of assumptions about power draw. Different games and activities hit these cards at different levels of power draw, and people game in drastically different amounts. All you would be doing by throwing in a regulation requiring them to come up with a number based on completely arbitrary input values is adding to the cost of the cards. That makes literally zero sense. With a refrigerator you have a device that is running constantly; it will still vary person to person, but at the very least you can give a good general idea of annual usage. You cannot do that with a GPU. Government regulation is not the answer here, and that is ultimately what you are asking for.
I would also add that no I don't give that much of a damn about the energy consumed by the card. My a/c consumes far more in a year and is far more impactful to my bill considering I live in a hot ass climate. The amount of time my card is pulling a lot of energy is limited and is a minor blip in annual cost
@@luminatrixfanfiction No, they did not ban gaming PCs. They put higher minimums on efficiency and power draw at idle for pre-built computers. Dell and some other prebuilt manufacturers were just shipping such dogshit PSUs that they had very little that met the requirements.
As a fun sciencey follow-up, RDNA has a terrible habit of not being able to hold lower FPS at Freesync and/or V-SYNC levels. I play more than a dozen games that my 6900XT can push well above the 170 FPS at 1440p of my display, but by capping at that setting, I save nearly 100W off the GPU draw. Worth it!
However, this is a double-edged sword, as the AMD driver/vbios does its best to minimize clock speeds if the core isn't at 100% load, and in some cases even if it is at 100% load. Minecraft, Deep Rock Galactic, and at least a dozen others all can't maintain their FPS stability, or in some cases even refuse to clock up at all (particularly Minecraft!), resulting in some cripplingly low performance from something that should be running at 400 FPS.
My old 1060 Max-Q Dell laptop runs Minecraft at 2-3x the FPS of my Strix G15 6800M OR my desktop's 5600X+6900XT... tell me what doesn't look right here lol.
It's just that RDNA in general runs poorly in outdated OpenGL titles; that's why it runs terribly in Minecraft. I think the Sodium mod or Lunar Client can somewhat fix that
@@thevaultsup No, he is right. RDNA underclocks when not at 100%, causing spikes in frame times, which is what contributes to micro-stutters in some titles. It's a known issue, one I hope AMD will fix. Theoretically it should be a simple fix, running a script that encourages the GPU to remain at 100% load while gaming, but there's a reason they designed it this way: longevity of the card and less power draw.
Bit pointless if you ask me since most people buy new cards every 4 years.
Sodium for Fabric MC is where it's at.
I don't have this problem with my 6900xt. I run a high "minimum frequency" setting in the AMD software and that solves it for me.
You can fix this with MorePowerTool by turning the power-saving feature off. Freethy has a video about this; I've been running my 6900xt for months with no probs at 2600/2100 MHz.
As someone living in the Nordics, where we have no cooling air conditioning since it's not normally needed, I'm starting to look at the wattage these new cards demand. I rock a 3080 and my room gets pretty toasty if I'm playing a graphically intensive game. My PC draws about 500 watts at max load, and it really heats up the room. I really didn't care about power draw before, but now we're reaching levels where it actually matters.
GTX 1070 here - the whole PSU provides around 150-200 W at full load - Intel 10700, nothing overclocked. I want to be able to play and watch movies in 4K, but man, when you say 500 W, you are ripping my world apart. I don't want to sell my future for electricity. What next, 1000 W?!
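Small unit nitpick that makes the math easier: watts are already a rate, so there's no "1000 W per hour"; the bill charges for energy, watts times hours, in kWh. A back-of-envelope sketch, with the 500 W draw from the comment above and invented gaming hours and tariff:

```python
# Watts measure power (a rate); the electricity bill charges for energy (kWh).
# The 500 W draw matches the comment above; hours and tariff are invented examples.
def yearly_kwh(avg_watts: float, hours_per_day: float) -> float:
    return avg_watts / 1000 * hours_per_day * 365

draw_w, hours, price_per_kwh = 500, 2, 0.30
kwh = yearly_kwh(draw_w, hours)
print(f"{kwh:.0f} kWh/year -> about {kwh * price_per_kwh:.0f} per year at {price_per_kwh}/kWh")
```

At these assumed numbers that's 365 kWh and roughly 110 a year: noticeable, but a far cry from selling your future.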
I would like to see a look into the power vs performance increases over the years. It seems like all they are doing is pushing up the power to get more performance; has the performance per watt actually increased much in the last few years?
Agreed, we're getting to power levels comparable to that of an actual microwave... 10 minutes of gaming to cook your dinner, an hour or two to set your house on fire lol
In CPUs it has. In GPUs, not really
it has, but so has the upper limit of power consumption
@@goblinslayer5404 There is NO upper limit... the RX 9990XT SUPER shipped with its own small nuclear power system
Power efficiency has improved, but the truth is if you can afford a 6900xt or a 3090, you can probably afford a 1000W PSU and electricity bills.
My first GPU was an AMD Radeon 7970. Looks like they're going to come full circle with their naming conventions. I'll definitely buy an RX 7970 XT if it comes out, for nostalgia's sake
Oh boy, you had to mention the GPU shortage getting better 😂
With how hot these cards run I can see the return of side panel fans in our near future. It's still one of my favorite parts of my old HAF 932.
Got my 6950xt off AMD's site today, thank god… I would have been fine with the 6800xt, but the price difference in Canada just made the 6950xt worth getting
3:59 Holy crap is that a GD reference
There was more going on here than just binning - GDDR6 yields have come a long way, and this allows them to make use of the higher quality chips without downclocking them to match 6900XT specs.
Just bought an ASRock 6950xt for 630. Can't wait!
What is the maximum temperature on your 6950 XT?
It's really good to see AMD finally catching up to Nvidia again. I hope they come out ahead soon to keep competition strong.
Took a while, but I decided to buy the AMD 6950xt. It was on sale for $649, so that's about half the price it was less than a year ago. Feels like a steal, especially with the 4070 not being better or cheaper
As a heavy team-green user, I just bought one today for $580 ($620 with The Last of Us game code).
I'm really hoping my 750 W PSU can supply it and that it performs like the 3080 Ti shown (currently $1,300 today).
I've always had bad luck with AMD drivers bricking apps until they fix them. Nvidia's bad drivers at least open the apps and run terribly. This is red's last chance; I hope I'm happily disappointed.
Got an update for us Buddy?
And now?
Nice to see Alex doing some review videos!
It's great to see AMD excel in gaming performance (and honestly better RT performance than my 2080 Ti), but I used to love AMD for their superior productivity performance (particularly in Blender); that trend has completely inverted in the past few years.
I hope AMD makes big strides with Blender in the coming years; OptiX on Nvidia is just epic.
Let's hope they have something to offer in RDNA 3
@@Kazya1988 man, RDNA1 was so much better than Vega, and RDNA2 was insanely more impressive than RDNA1 while using the SAME node!!! RDNA2 is a freaking beast, and RDNA3 is supposed to be 65-100% faster!
@@morpheus_9 I might wait for the RDNA3 cards. Currently got an AMD Sapphire Pulse RX 5700 XT, which uses RDNA1. It's still doing a great job! :)
Just bought one of these used for $360 locally. Came with the retail box and appears to be in perfect condition. Haven't had an issue yet in the last 24 hrs. Pushing it with a Seasonic Prime GX850 Gold and a 7800X3D. Running a dual-1440p monitor (5120x1440 @ 240 Hz) and it is doing great. I can't justify new $500+ midrange cards or $700 or higher upper end.
Good to see Alex having his own review.
PS: There was a time when I thought Alex and Riley were the same person... oh god.
Oh God 😂
I have never seen Alex and Riley in the same room
@@oskrm watch "Can We Make DIY Thermal Paste". They both in the same room.
Same here. I used to get confused all the time.
Riley is basically Alex in Sport Mode.
Suggestions, ignore if irrelevant. Editor's note: please use an appropriate background (I suppose templates matching the video release/posting schedule might have been used), as it makes the graphs illegible even when paused, or maybe I have some rare eye disorder, pick one. Besides that, the legends (colour samples) are also small; a hyphen/dash might not be enough area to represent the colour chosen, for example at 05:31 and in some other graphs. Hope this helps. Good luck with the next one.
I haven't watched an LTT video in a long time. I am glad I saw this one. AMD makes great GPUs and I am glad they are trading blows at the high end. (:
Well... I have an AMD 7950! It has been good to me for about 10 years... still going strong... a fan from Finland
Finally put my finger on it! LTT has become what CNET should and could have been! And I thank them for it ❤️
Thanks for having MSFS as one of your benchmarks!
Ah, I remember my 6950 that then became a CrossFire pair. Never in my digital life have I made a worse mistake, maybe second only to a RAID 10 of 64 GB C300s. Then I went 770 Phantom (usually I switched between ATi and Nvidia every 2-3 gens). But then G-Sync happened, and I've been married to adaptive sync ever since.
Had a single one and was rather happy with it, just one exception: Their driver politics began to suck around that time and the performance in DCS was utter rubbish. Had a crossfire setup with X1800XTs before, and a passively cooled 3870 as well.
Are you talking about HD 6950?
His face & voice when he says "but I can believe I am going to tell you about our sponsor" is channelling some serious "By Grabthar's Hammer......... what a savings...." (Galaxy Quest) :D
Alex's performance on camera has vastly improved in recent videos. The enthusiasm makes him far more engaging to watch. Great job!
0:05 I like your optimism.
From the future comment here:
Just bought a red devil 6950xt for $500 retail.
Why? It's a fucking bad card with terrible power consumption...better off getting the 6800xt or a power efficient 7800xt
i have been using it for years
@@JahonCross definitely not a bad card for $500 lol.
I run my PC off a 20 A battery-backup inverter, so I'm good as long as spikes aren't over 4 kW. One way to bypass that dedicated circuit would be a good UPS or solar generator. Plus, when the power goes out... you don't. You usually pair it up with 2000-3000 Wh batteries. With a 3080 Ti and 5950X pulling 500-600 W to game, I can go hours without grid power.
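The runtime claim checks out on paper. A minimal sketch of the battery math, assuming usable watt-hour capacity and a roughly 90% efficient inverter (both assumptions; real packs derate):

```python
# Back-of-envelope UPS/battery runtime: usable energy (Wh) divided by load (W).
# The 90% inverter efficiency is an assumed round number.
def runtime_hours(battery_wh: float, load_w: float, inverter_eff: float = 0.90) -> float:
    return battery_wh * inverter_eff / load_w

for wh in (2000, 3000):
    print(f"{wh} Wh pack: ~{runtime_hours(wh, 550):.1f} h at a 550 W gaming load")
```

2000-3000 Wh at a 550 W draw lands at roughly 3.3-4.9 hours, hence "hours without grid power".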
I still have my old HD 6950 in a box, now we've gone a full circle
Nice! Personally I’m waiting for a RX 7990 or RX 7970 next generation, I think it would be an amazing throwback.
@@Fractal_32 There are rumors of AMD overclocking the balls off of some top-bin chips to compete with the 4090ti. That would either be a 7950, 7970, or 7990 depending on the naming they go for. I'd love to see another 7990 personally
@@DigitalJedi Personally I've heard RDNA3 is going to be MCM, but that's according to leakers (particularly Moore's Law Is Dead); take it with a grain of salt since it's not out yet.
@@Fractal_32 My understanding is that it's mcm at the top end and only uses a single unit for the lower end stuff.
@@DigitalJedi That seems understandable. I'm hoping these leaks are correct, because it would be cool to have an MCM GPU, especially if that allows modularity in the future - imagine AMD putting dedicated chiplets on GPUs to speed up specific tasks, or building massive dies that would not be feasible with a monolithic approach.
I like that you show many more productivity benchmarks nowadays.
The only thing to optimize would maybe be a general performance comparison graph up front instead of so many single benchmarks, but even so you are still getting better - or rather, perfecting. Informative as always.
The real question is:
Clock for clock, are the 6900XT and 6950XT the same (or are AMD's claims of infinity cache true)? In addition, can an average 6900XT match or come close to a stock 6950XT?
Do you mean infinity cache? Infinity fabric is on AMD’s CPU’s and is what Intel criticized as “gluing” the cores together.
@@Fractal_32 yes thats what I meant :)
As someone who uses Siemens NX with an NVIDIA RTX A4000, my jaw kinda dropped to the floor @ 3:59.
That power increase for so little performance gain is crazy. I almost always run my GPU at fifty to sixty percent power, with a ten to twenty percent performance penalty. I hardly ever run above 70 percent, because otherwise it gets warm and loud.
Triple fans at 4k RPM is datacenter level... And that's a 200W AMD GPU.
True, the power/performance curve is insane, especially if you also adjust GPU voltage (undervolting). Hot-spot temps on the 6950 XT are high; it would be nice to do that here.
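Taking the figures from the two comments above at face value (roughly 60% of stock power for about 85% of stock performance; these are the commenters' estimates, not measurements), a quick sketch shows why power-limiting pays off:

```python
# Perf-per-watt at stock vs. a reduced power limit.
# The 335W stock figure and the 0.60/0.85 ratios are assumptions
# based on the comments above, not benchmark data.

STOCK_POWER_W = 335
LIMITED_POWER_FRACTION = 0.60
PERF_RETAINED = 0.85

stock_eff = 1.0 / STOCK_POWER_W                                   # perf units per watt
limited_eff = PERF_RETAINED / (STOCK_POWER_W * LIMITED_POWER_FRACTION)

print(f"Relative efficiency: {limited_eff / stock_eff:.0%}")      # ~142%, i.e. +42% perf/W
```

In other words, giving up roughly 15% of performance buys back over 40% in efficiency on these assumed numbers.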
0:42 that subtitle. Whoever made that deserves a raise.
I will never forgive AMD for missing this chance to name it the 6942XT
It's nothing without the zero.
amd radeon 6942.0XT
Should have called it the 6942XD
Not 6969xt?
Why not 6969XXX
In case people were curious, the difference in clock speeds isn't necessarily due to corporate greed. Common practice across all kinds of silicon is to produce a run and benchmark the chips' clock speeds. The company will then create a new SKU for a batch if it tests significantly differently from the rest, listing a new clock speed for that silicon. From a consumer POV, it looks like they made a new product that is identical except for a different clock speed.
You could call this corporate greed, but it goes both ways: companies slightly increase prices for slightly faster clock speeds and slightly decrease prices for slower clock speeds.
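As a toy illustration of that process (the thresholds and SKU names below are made up for the example, not AMD's actual bins), speed-binning tested dies into SKUs might look like this:

```python
# Toy model of speed-binning dies into SKUs.
# Clock thresholds and SKU labels are hypothetical.
import random

BINS = [  # (minimum stable clock in MHz, SKU label), best bin first
    (2310, "6950 XT"),
    (2250, "6900 XT"),
    (0,    "6800 XT"),  # everything else is sold as a cut-down part
]

def assign_sku(tested_clock_mhz: float) -> str:
    """Assign a die to the best SKU whose clock floor it meets."""
    for min_clock, sku in BINS:
        if tested_clock_mhz >= min_clock:
            return sku
    return "scrap"

# Simulate one production run: tested clocks vary die to die.
random.seed(42)
for clock in (random.gauss(2280, 40) for _ in range(8)):
    print(f"{clock:6.0f} MHz -> {assign_sku(clock)}")
```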
The MSFS 2020 results seem a bit odd to me; it should run just fine. My cousin has a 5700 XT and I have a 2060 Super, and he beats me despite having a much older system vs my brand-new 5900X, or maybe we tie... Still, it should run fine on AMD. Sure, these are RDNA 2 cards, but it's probably just a bug or a temporary issue. Sadly, that's all it takes to put a stain on one brand or the other, since the tests are rarely done again, and if they are, nobody sees or cares at that point... MSFS 2020 is probably more to blame than the GPU, tbh. As much as I love it, it can be great one week and a wreck the next, depending on what happened, what updates have dropped and so on...
I went from CTDs for 6 months to running just fine with the SU9 update a few weeks ago...
5700 XT is just a better overall card.
By getting a 2060, you missed the opportunity to take advantage of Smart Access Memory (SAM), which allows the CPU to access all of the GPU's memory; AMD GPUs pair very well with AMD CPUs because of it. You get about a 10-15% performance boost from SAM by itself.
@@luminatrixfanfiction You can still enable Resizable BAR in the BIOS to get much the same effect.
Dude, its peak power consumption is close to my whole power supply's rating. I have an i5 9400 (would've preferred an R5 3600) and an RTX 2060 powered by a 600W PSU.
It's insane how much these new cards draw
I have a gaming laptop with a 9750HK and a desktop class 1660ti (same TDP). I peak at 220W from the wall outlet when docked.
You can easily get a 6900 XT for around 1100-1200. And for the performance, you're beating 3090s and 3090 Tis in raster and even competing with 3080s in ray tracing, all for the price of an MSRP 3080 Ti. The 6900 XT has been the sleeper card this entire shortage.
Tried buying a 3080 for 2 months on GPU drops and failed. On the used market people were selling it for 3x, wtf, and I saw the 6900 XT selling for 1.1-1.2x, so I got one.
cries in 2k for a water-cooled 6900 XT
ONLY for gaming.
@@TheObsesedAnimeFreaks Just buy a custom block separately and you're good.
@@jeo228 cries in having bought the water-cooled GPU back in October or so
I just picked up a 6950XT new for 700 with 2 games on Newegg!
I swear, with all these oddly numbered cards, we're now seeing a lot of overlap between brands. I mean, Intel already made a CPU called the 6950X, and this is the 6950 XT GPU from AMD. It has definitely been long enough that it won't lead to any confusion, but it just seems odd. Especially back with the whole X299/X399 thing.
There was already a Radeon 6950 in 2011; I've got one of them in an older rig. It was a pretty decent card for its time; I used it until the 970 was released and got myself one of those. Both cards were around 300 bucks.
There was also the RX series, with the same names as older GTX cards, just with an RX instead. Ryzen as well: R7 2700 vs i7 2700. I know there are usually gaps in time to help with the confusion, but do they really have no way of just... making a uniquely named product? It was particularly bad with Intel's workstation X299 chipset: AMD came along and took X399 for THEIR workstation chip. Or just products with the exact same name that aren't remotely the same thing, like the Titan X... and the Titan X. And then, when people found a way to distinguish the two by calling the Pascal version the "Titan Xp", what did Nvidia do? They made a new Titan called... the Titan XP. It's just astounding to me, some of the naming decisions all three of them make. Sometimes it seems they are just being petty to one another at the expense of the consumer.
And I guess with the Titan X thing it was just Nvidia being stupid as hell.
There will ALWAYS be something better right around the corner. Just buy what you can afford/makes sense to you, use it for all it's worth, and THEN upgrade.
Anyways, good video, and always love seeing new tech
They will need to offer special venting systems that let you direct the heat from your GPU: during winter you can aim it to blow over your keyboard and mouse, keeping your hands toasty when the room gets cold, or pipe it out through a ceiling vent during summer so you don't end up getting heatstroke just trying to game.
Scam
@@nickcollins1528 Indeed
Still have both of my XFX Radeon HD 6950 2GB "Double D" cards from back in the Z77/2600K CrossFire days, lol. One is in my mom's Mahjongg machine and one is a spare to swap in occasionally for a clean/repaste.
Hopefully some day we don't need GPU's the size of a placemat. I thought my 2080ti was huge.
Honestly, I'm excited for the next generation of GPUs, since AMD's RDNA 3 vs RTX is going to be an actual battle, and hopefully gives Nvidia a run for their money. It'll be a nice battle to see.
All things considered, in a world where people were paying more than this for a 6700 XT/RTX 3070 a couple of months ago, this doesn't feel like a bad deal at all, TBH. And IF you're the type of person to drop over 1K on a GPU, it's really hard to justify buying a 3090/3090 Ti instead of an RX 6950 XT, unless you just have a major boner for ray tracing. Frankly, if you're playing at 4K or trying to max out 1440p FPS, this thing looks like a great option, if the prices stay real. (Side note: as of now, these are in stock for $1099 on Newegg.)
But for productivity and ML/AI-type stuff, Nvidia is better (with their proprietary CUDA cores and CUDA-accelerated workflows).
@@shre6619 For productivity you shouldn't be looking at an RDNA card anyway, you should be choosing between CDNA and Nvidia. This is a gaming card. Radeon segmented the two use-cases into different products.
5:49 as an apartment dweller, that pisses me off, lol... Just a laser printer in my room makes the lights dim and flicker when it's turned on, so it's realistically not far off for someone with a power-hungry GPU to actually demand an additional breaker.
Nice! I hope that you can buy it and don't have to pay double the price on eBay...
Lmao yup same here
@7:48
Alex: But I can believe that I'm going to tell you about our sponsor.
James: Isn't Alex.
It's in stock for €1200 while the 3090 costs €2000 (European prices are still trash). Nvidia really has to drop prices now (though I know people will still pay €800 too much because they're convinced Nvidia works for them).
I think that is the key factor here. AMD's GPUs have been so mediocre for so long that people acknowledge they exist and are better than nothing, but when they are ready to spend their money, it's on Nvidia. I remember walking through my local computer store back in December or January: the Nvidia shelves were completely empty while across the aisle the AMD section was full, with dust actually forming on the boxes.
The product was there but it wasn't moving.
It's still not necessarily better. They lack real ray-tracing support and DLSS, so an RTX card may still be the better option. Although no one buys a 3090 just for gaming.
@@pieter1234569 FSR 2 will be added to games soon (the first one will be this Thursday). From what we currently know, it seems like it can be as good as or better than DLSS, so I feel like that's not really a reason to get an RTX card.
The 3090 is better at RT, but would you really pay €800 (67%) extra just for some nicer reflections in games?
In this case the 3090 only makes sense when you need it for work
@@pieter1234569 As someone who plays older games/games without ray tracing, I really don't get why people are amazed or disappointed based on a card's ray-tracing performance. I see ray tracing as a gimmick and will continue to focus on rasterization performance until the games I play use ray tracing.
@@pieter1234569 DLSS is not much better than FSR, and FSR works with every card and every game. But yeah, AMD's RT is not as good as Nvidia's.
0:19 "and thats not the only thing that is weird" - i thought you were going to introduce your sponsor like that haha
Since every piece of today's productivity software has been optimized for Nvidia hardware from day one, those benchmark results should only get better as the card ages. But the gaming results are incredible.
I got my 6800 XT when it launched for MSRP, and it's great. I'm a full-time editor and part-time gamer.
At some point gamers and enthusiasts are going to have to stand on their principles. At some point we need to all say “enough” and make a mass commitment to stop buying Nvidia and AMD desktop GPUs for like six months. These prices have gotten out of hand - and it is not simply a product of inflationary pressures. AMD, Nvidia and their AIBs have gotten too comfortable with soaking their consumer base. Either we need to do something in the market, or the FTC needs to target both companies for their anti-consumer practices through anti-trust legislation.
That would only happen if we weren't so into gaming and other heavy workloads.
Why buy a top-tier card now if you can buy a mid-tier card later? It will perform exactly the same at a lower price and power consumption.
I agree the prices have been rising, but nobody is forcing you to buy the top-of-the-line card; you can always buy the low-end one.
In general I agree, but this card at $1100 massively undercuts the current competition, so I’m actually happy with it. If supply goes up and scalpers stop, then the secondhand market will allow for good deals which would help too
But you could just buy a 3080 or a 6800XT. Nothing other than your own ego is making you buy these super high-end cards. It's an established fact that they have diminishing returns.
It's called a reference card. "Founder's Edition" is an Nvidia marketing term introduced with the GTX 10 series.
My reference 6900 XT is overclocked to 2500MHz, memory at 2080MHz, and with a serious undervolt. All of that with a max power consumption of 270 watts, so waaaaay better than the 6950 XT tested here. This card is completely pointless if you know basic tweaking.
Seems like you hit the silicon lottery also.
Germany's stock shifted. The 3070 Ti is the same, while the 3080 Ti and 3090 Ti increased by 45% for no reason.
I saw a 3090 for 1200. Now it's gone and the 3080s cost 1200. 😁
If it's really the competitor to the 3080 Ti like you said at the start, why don't you just compare overall scores to it at the end? Why compare it only to the 6900 XT at 6:16?
At that time stamp he's comparing the fact that it costs 10% more for almost 10% better performance. Literally every other chart has the 3080ti in it.
@@roybrown6058 ok... I'm just going to say again that I would have loved to see all the 3080 Ti scores combined. I don't know why your argument is that there are many charts with the 3080 Ti in them.
It's crazy how this card now costs around $600.
Yes this or even the 6800XT are among the best deals everywhere.
Pretty sure that's why AMD priced their 7000 series so much higher... AMD gets put down for having no market share in e.g. Steam hardware surveys, but if I sort by popularity on my local retailer's website it's the 6800 XT, 3060 Ti, 6900 XT, 4070, 7900 XT. These 6000-series cards are shipping in bulk right now.
@@pieterrossouw8596 But sadly I live in India, so it's still like $700-800 here.
@@SasukeKunGaming yikes that's rough
I walked into a Best Buy and bought an MSI Mech 2X 6600 XT, only to return it the following day. Not because anything was wrong with it, but because, now that I had one in my hands, I felt like I didn't need it. So I owned a 6600 XT for about 10 hours.
The 6950 XT: a card that rivals Nvidia's 3090 and 3090 Ti while being hundreds of dollars cheaper at MSRP, although the power draw is kind of insane. GPUs are so power-hungry these days; what's next, a GPU that sucks down 1000 watts sustained?
Hell, the way my downstairs is set up, if I run the microwave and the space heater at the same time I trip a breaker, and that's with a mini fridge on the same circuit.
Once my PC starts tripping the breaker I'll just microwave popcorn upstairs.
The 4000 series from Nvidia will probably pull the same shenanigans as last time. They really found a way to double prices in one generation and got people to roll with it.
Got one on sale for $630, can't wait 😁
Ditto!!!
What is the maximum temperature on your 6950 XT?
Not strictly related, but regarding the dedicated circuit comment, I'm glad I'm not in 110V territory. Here in the UK our standard outlets can supply 3kW, and those circuits are usually wired on a ring main with a 16A breaker giving around 3.6kW, whilst the cables themselves are rated to 3.9kW.
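The arithmetic behind those figures is just P = V x I; a minimal comparison of typical circuits (the breaker ratings here are common examples and vary by installation, so treat them as assumptions):

```python
# Maximum continuous power per circuit: P = V * I.
# Ratings below are typical examples, not universal electrical code.

circuits = {
    "US 120V / 15A outlet":  120 * 15,  # 1800W; a big GPU rig gets tight here
    "US 120V / 20A circuit": 120 * 20,  # 2400W
    "UK 230V / 13A plug":    230 * 13,  # 2990W (fused plug, the ~3kW stated above)
    "UK 230V / 16A breaker": 230 * 16,  # 3680W (the ~3.6kW stated above)
}

for name, watts in circuits.items():
    print(f"{name}: {watts}W")
```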
11 months later and I got one for $610
@@JayzBeerz I didn't get my TLOU code because I ordered it on Amazon, and I'm seriously considering returning it and ordering from Newegg at the same price just so I can get the game, lol.
@@jochar3216 What is the maximum temperature on your 6950 XT?
6:19 That is absolutely NOT FINE.
It's not a FKIN improvement if the price goes up too (and so does the power consumption...).
The goal is to perform the same at a lower price,
or
perform better at the same price.
It's not a "gain" if both the price and the performance go up!
Imo this is a good deal. You get mostly 3090 Ti performance for almost half the price.
It's going to be 3 grand at least
@@chillhour6155 The MSRP is still 1100 compared to at least 2k for the 3090 Ti.
I'd like to get one... but I run a Varjo Aero and it's not compatible with Radeon cards, and that MSFS performance is strange.
No doubt these are the result of testing cards in production: those that met the higher clock rate were kept until there was enough stock (and the right timing) to make a SKU, with Nvidia's 40 series just a few months away.
I'd like to see these tests 6 months or so down the line;
AMD's software engineers tend to drop the ball on keeping performance good through updates.
Then again, with the issues I've had with Nvidia's control panel crap lately, it might still be neck and neck.
Just grabbed one a month ago for $750. Came with 2 new games, and I haven't gone below 100fps at ultra settings in 1440p in any game I play.
Technically this GPU has been out for a while, but it was intended to be sold only by OEMs, and it only came water-cooled: the AMD 6900 XT LC.
I hope so, I need a new GPU for my gaming rig.
When the sale began this morning, I managed to purchase an RX 6750 XT directly from AMD ($549.99) to replace the Strix RX 480 8GB OC in my GF's Ryzen 3600 PC build. I have no doubt that she will love the upgrade where her gaming experience is concerned.
Welp, now that these are available for around $700, they're a killer deal, lol.
Would you say it’s worth the buy?
@@Fheezi definitely
Yup, just got one for $730, heck of a deal.
@@atvkid0805 Where did you get yours? Cheapest I can find is $775.
@@Buhtbeard The MSI version is at $700 on Amazon. Honestly insane when comparing to other cards, and the fact that it launched with a price of $1100
Yeah, it doesn't really make sense to say they contributed to the shortage for money, given that they held onto chips from the peak prices to sell once the shortage was over, and the amount of computational power produced per dollar spent is the same.
Honestly, this launch seems like AMD's laughing at NVIDIA for their pricing woes, while AMD's pricing basically goes back to normal.
Thanks for including SolidWorks in your test suite.
I've had the RX 6800 for a year now and I've been pretty happy. Got it just before prices went crazy, 890 CAD, and this new release sounds like a good one, especially now that FSR 2.0 and RSR are already available.
What we need is a new budget graphics card that is really budget price
@@theplayerofus319 I agree, but not everyone can afford it
With driver updates, have the thermals and power consumption been improved at all?
It's impressive that in 5 years, AMD went from a crappy CPU and GPU company used only by people on a budget to a company that can rival Intel and Nvidia.