Might try a Ryzen 5 5600 on your bench for these 2 cards. From what I saw in benchmarks from other YTers, the 3060 Ti G6X performs closer to the 3070... I think your 12400F might be the limiting factor here
The memory on the RTX 3060 Ti was already faster than the GPU could benefit from. Nvidia should have put lower-bandwidth VRAM like the 3060's (192-bit memory bus, 360.0 GB/s) into the 3060 Ti and it would've been fine; more VRAM was much more necessary than more bandwidth. Also, keeping the same 256-bit bus width, mixing 4x 2GB chips and 4x 1GB chips would've still resulted in higher bandwidth than the 3060. I'm no technician, but that's what I think. :)
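The bandwidth figures being compared here follow from a simple formula: bus width in bytes times per-pin data rate. A quick sketch using the published per-pin speeds (15 Gbps for the 3060's GDDR6, 14 Gbps GDDR6 and 19 Gbps GDDR6X for the two 3060 Ti variants):

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s = (bus width / 8 bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

for name, bits, rate in [
    ("RTX 3060    (192-bit, 15 Gbps GDDR6) ", 192, 15.0),
    ("RTX 3060 Ti (256-bit, 14 Gbps GDDR6) ", 256, 14.0),
    ("RTX 3060 Ti (256-bit, 19 Gbps GDDR6X)", 256, 19.0),
]:
    print(f"{name}: {bandwidth_gbps(bits, rate):.1f} GB/s")  # 360.0 / 448.0 / 608.0
```

Which is why the G6X refresh is a big on-paper jump (448 → 608 GB/s) even though, as the video shows, the GPU rarely uses it.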
1:36 Be interested to see the power draw at the wall if there's any difference.... especially as that one has 2 x 8 pin when some only have a single 8 pin.
The problem with these cards, as I have the 3070 variant of this brand myself, is the stupid position of the 8-pin sockets! Surely it would not have killed them to put the sockets higher; they are so awkward to plug the cables into. And having watched this video: no, to me this card is a complete waste of money. The plusses are so minuscule, it's not worth it. But thanks again, Steve, for another interesting video!
CPU clocks are slow and the resolution is medium. If there was no bandwidth bottleneck with the GDDR6 card, why would there be one with GDDR6X? There's no signal to test, and you got exactly those results.
I got a gaming laptop in mid 2021 with an RTX 3060 and Ryzen 7 5800H. The RTX 3060 can run up to 130W, which is about as powerful as you can get in a laptop. The VRAM can be a limit sometimes, but I don't mind turning the settings down, and I think this laptop will run my applications comfortably for the next 2 years. Pretty happy with this purchase.
It's all about price. During the GPU apocalypse I was on the EVGA wait list so long I forgot about it. When the email came to purchase a 3060 Ti at MSRP I jumped on it. My oldest son has no complaints, and sure, the 8GB of VRAM may be an issue with the latest unoptimized hot-garbage games... but the games he actually plays run fantastic and we won't be replacing it anytime soon.
Eh, most people aren't that concerned with RTX performance. I can see why it's "cool" but it still doesn't have the performance to "game" on without making dramatic compromises that defeat the value of ray tracing in videogames anyways. And no, DLSS is not a valid solution for a $650+ GPU, let alone the $1,600 RTX 4090.
@@KiraSlith I know that most people don't turn RTX on, but I'm saying that testing with RTX enabled on the different memory types would show the difference in performance
I own a somewhat unknown, or just unpopular, 3060 Ti variant using GDDR6X memory, which I didn't realise until watching this video! The card is a Gigabyte 3060 Ti Eagle OC D6X 8G, a model that isn't listed in TechPowerUp's GPU database. The identifier for this specific card is nowhere to be seen, and I initially thought the card was made for the Chinese market. Works a treat now since I replaced the thermal pads for the VRM and memory with thicker ones, as the factory pads were not up to the task. GPU hotspot and memory temps were insane, reaching 70 to 80C with no changes to any OC settings in AB. Is this a rare card? Or just an unpopular variant?
Now that we've got a bunch of 3060 variants, can you do the ultimate 1060 comparison (5GB, 6GB, 6GB 9Gbps and GD5X) someday? Because this is almost the same situation now
Irrelevant graphics card for me: I have an RTX 3060 Ti, so I'm looking at the RTX 4060 Ti 16GB and will ignore the AMD RX 7600 XT 8GB because I only have a 650W PSU. Love these videos, they give me experience.
Wow, this card is actually a better upgrade than a 4060 Ti, because it's reliably faster instead of losing by surprising margins in some scenarios and games!
I remember when GDDR overclocking wasn't even a thing, but more is more. I wouldn't even have to consider the choice if the price difference isn't much. Faster is faster, no matter how little.
Zotac is the worst brand. I had a Zotac before, and while the card never gave me computer problems, it had loud coil whine even at low frame rates, it was loud at 40% fan speed, and temps were always 65 degrees+ in all games
I wonder if the addition of 6X memory will bring it closer to the AMD 6750xt. When I built my rig a few months ago I looked at both but chose AMD for the better performance/price and because it runs better under Linux than Nvidia cards do.
Considering it doesn't change where the card sits performance-wise... no, it shouldn't bring it any closer to the 6750 XT than it already is. Memory speed and capacity simply aren't a huge factor unless you're running out. There are a few cases where vastly faster memory helps at huge resolutions, but it's hard to fall back on edge cases for a general outlook on a GPU. It would be like me deciding to go AMD based on MW2 performance alone.
Anyone else notice there being a big difference in the gameplay/benchmark footage for the witcher and red dead? It seems like the gddr6x is much brighter or has more lighting effect settings enabled. I even flipped my screen upside down in windows to make sure it wasn't my monitor lol.
This is what I've been saying... instead of offering 6X memory, Nvidia could have used regular GDDR6 and given the cards more VRAM... the 6X doesn't add much performance but it is decently more expensive to use...
Is the RTX 3060 Ti GDDR6X worth buying over the RTX 4060 Ti for $60 less? Does the 3060 Ti GDDR6X heat up more compared to the GDDR6 version? Last question: which brand do you recommend, Zotac or Gigabyte?
3060 Tis have a high failure rate because of the 1GB Hynix GDDR6 non-X chips (they die in 1-2 years); all other 30xx GDDR6 non-X cards are fine: the 3050 uses 4x 2GB, the 3060 6x 2GB. This is probably just a fix for that
That's like the same scenario as when the first RTX 2000 series came out, with the Micron chips. I saw a GPU repair video on YT where Samsung memory chips die the same way... and most of the people that had this problem OC'd their memory too much and it heated up. Also note that GDDR6 doesn't include a temperature sensor but GDDR6X does, so people fry their memory chips to death by overclocking. Same problem as the RTX 2000 Micron chips: people OC'd the memory to death. My old 2060 had the same Micron chip and it lasted 4 years; it only died when my AIO spilled liquid on it LMAO
@@scra1 Did some digging and really only anecdotal evidence which cannot be used to make a proper conclusion. I'm someone who has a working 680 that was overclocked to the wall for most of its life and went through 3 hand me down systems. That may not have much to do with 3060ti failures but googling "3060ti failure rates" comes up with pretty much a handful of people on reddit and literally nothing else.
@@MDxGano The ones that had the memory problems were mostly the 2000 series, particularly notorious in the 2080Ti with the Micron chips. I haven't really heard about the 3000 series having the same issues but I think it's "better to be safe than sorry" when it comes to these things in my opinion.
Interestingly enough, the 6X card seems to draw more power (maybe 25 watts) while running cooler (2-ish degrees). Guess this has more to do with the third-party card design?
Could you consider that FH5 runs much worse in the town area and in the Hot Wheels expansion, or are you going for average-scenario results? In those places GPU usage goes up by about 20-25%, which is why I have to turn down some settings to avoid drops. Sad to see these 2 areas having an impact on the whole game; otherwise my 3060 Ti FE's utilization is only 60-70% on average. I was disappointed that I couldn't max out the game at 1080p (DLAA). I wish there was an option to have graphics scaled back in these more demanding areas so the performance is mostly consistent.
I just increased the power limit on my GDDR6 model in MSI Afterburner and got the same results, sometimes even better than the GDDR6X, so it's completely pointless to favor one over the other if you're looking for a 3060 Ti
G'day Random, 🤔 If these were still getting made late last year, maybe it's not really a SKU upgrade/rerelease but nVIDIA just switching all models to GDDR6X VRAM
wow such performance, a whole frame
Sometimes that frame can come in handy 😂
Frame wins games.
@justinh.2398 all we need is frame
25 more watts for just a frame.
@@KimBoKastekniv47 it's like overclocking but worse 😔
Well, after watching this video I've settled on upgrading from a 3060 Ti to a 3060 Ti G6X.
Seriously?for that amount of money and the minimal diff in performance, you're better off with a 4070 or even that 4060ti with 16gb if they release it.
@@jeantechnoir7702 r/woosh
im waiting until the rtx 3060ti g6xx!
I'm personally waiting for the gddr7 var
@@jeantechnoir7702 lol Oh Jean....
I was thinking:
I see so many comments asking me if I can test a GDDR6X version. Surely it's much better than the normal one, right?
...
I will start pasting this video's link under those comments 😅
Thanks for the great comparison!
Hi zwormz love your vids
@@pravnav hey! Thanks mate :)
The human eye can't see beyond GDDR6.
😂
Good one xD
Neither can the computer's eyes
The regular GDDR6 card produces pretty much the same results while consuming less power, ~25w less than the GDDR6X variant.
additionally, on gddr6 cards, you can already get a stable +1000mhz oc on the memory
@@sexdefender86 Not much demonstrable performance from it though. At best you're talking 1-3 FPS, although in my experience even a modest +500Mhz can improve 1% lows depending on the game / engine.
That doesn't say much because the thing is already using over 200 watts
@@arch1107 but doesn't require a second power connector. That second power connector might rule it out as an option for someone with a lower wattage psu, or one that doesn't have a lot of power connectors to spare, like most OEM systems.
@@gamewizard1760 the second connector is used when you overclock and can't rely on the 75 watts delivered by the PCI Express slot.
Theoretically you don't need it if you don't overclock; when you do, of course, you will go over the 150 watts from the 8-pin power cable plus the 75 watts from the PCI Express slot.
They should have used a 6-pin connector, but it is what it is
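The wattage reasoning above matches the commonly cited PCIe power limits (75 W from the slot, 150 W per 8-pin connector, 75 W per 6-pin). A minimal sketch of the budget:

```python
def max_board_power(num_8pin: int, num_6pin: int = 0) -> int:
    """Rated power budget in watts: 75 W from the PCIe slot,
    plus 150 W per 8-pin and 75 W per 6-pin auxiliary connector."""
    return 75 + 150 * num_8pin + 75 * num_6pin

print(max_board_power(1))  # 225 W: a single 8-pin covers a ~200 W card at stock
print(max_board_power(2))  # 375 W: the second 8-pin mostly buys overclocking headroom
```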
Love your channel. I didn't know they made a GDDR6X version
Yeah seen it in stock at a few places now :)
@@RandomGaminginHD Now if only they made a GDDR6X 12gb 3060
Surplus stock of G6X memory.
But can't make 4060Ti GDDR6X 😆😆😅🤣🤣🤣🤣😂😂😜🤪😝😛
It's because of the failures with Hynix
My favorite hardware testing channel by far. In depth reviews of games that show gameplay talk about important factors that determine if the hardware is worth it etc, thank you for these videos!
Thanks for watching :)
I'd be curious how the two cards respond in a VRAM limited situation. Something like Hogwarts Legacy with the textures cranked high enough to challenge the 8 gigabyte buffer.
@Captain_Morgan so just downgrade the game version...
If you actually watched the video, he did show usage of over 7GB of VRAM, and if he'd pushed it more he would hit close to 8GB. But from the video it's obvious that it's a very slight difference; this is a pure cash-grab lie.
Hogwarts Legacy had issues with VRAM allocation from day 1. Having said that, I only have an RX 6600 8gb and never had any issues with it.
RE4 remake is also pretty VRAM hungry.
Boy, that GDDR6X upgrade really unleashed the beast, huh?
All 0.2% of it🤣
GUYS, Nvidia released the GDDR6X version of the video card for a reason: they released it to prevent further problems with the GDDR6 chips burning out, because for some users the chips started to burn, due to incorrectly set voltages straight from the Hynix factory.
So you can safely take the GDDR6X version, it's very reliable
Like! Today’s video cards look like drones :))
Yeah I really like the look of this one 😁
There is a reason why we haven't heard about this version... *It doesn't say much!*
Thank you😄👍🏽
It was tough getting a 3060 Ti for MSRP this time last year. Luckily I got one and have been happy with it since.
That makes Nvidia's choice to go with this instead of a VRAM capacity upgrade questionable. It's obviously just leftover stock of chips, but they could at least have made the upgrade on the 3070 instead.
There was never a bandwidth bottleneck with the original so why would there be any real improvement.
THIS
Heck I use one limited to the lowest voltage, the VRAM downclocks to 5000mhz but it barely changes the Timespy score by 8% or something compared to 7000
and the GDDR6X is power hungry..
@@BenState You're not serious😂
It's good that someone quantified this. The 3060ti wasn't known to have a memory bottleneck, so I had a feeling the difference was negligible.
Close to measurement tolerances, but the power draw of the GDDR6X variant is also higher - around 15% as far as I could see.
That's a huge improvement, I'm gonna buy this later
About 20W more power on the 6X than the 6 for a few frames 😅, no wonder it hasn't been covered on other channels. Thanks for the video.
Would maybe have been a more interesting card if they changed the memory to 12GB VRAM on a 192 bit bus, but GDDR6X to make up the loss in bandwidth from the smaller bus. As it is, I'm guessing the improvements are minimal, because the card isn't really reaching its bandwidth limit most of the time even on the original version.
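The arithmetic behind this suggestion roughly checks out: at 19 Gbps GDDR6X speeds, a 192-bit bus would slightly exceed the original 256-bit GDDR6 card's bandwidth. A sketch (the per-pin rates are the published specs; the 12GB/192-bit card itself is hypothetical):

```python
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth in GB/s: bus width in bytes times per-pin data rate
    return bus_width_bits / 8 * data_rate_gbps

original = bandwidth_gbps(256, 14.0)  # 3060 Ti, 256-bit GDDR6:          448 GB/s
proposed = bandwidth_gbps(192, 19.0)  # hypothetical 192-bit GDDR6X:     456 GB/s
print(proposed >= original)           # True: the narrower bus is fully made up for
```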
The memory subsystem in these cards is designed to act more like the consoles'. This is Nvidia trying to be AMD: more memory to compensate for bandwidth limitations. Makes sense, but in practice? Devs need to spit-polish their turds and stop churning out non-optimized garbage.
@@Trick-Framed This. Over half of the games that broke 8gb cards in HWUB's video on the topic have been remedied.
@@MDxGano Yep, they get to it. Like I said, they need to keep at it. I'm to the point where I've stopped buying release-day games. Too expensive for the sheer volume of game-breaking moments. On console they know better... but the PC crowd will deal with it? Meanwhile we pay 10x for our "game machines".
Last AAA game that pooped the bed on console was Cyberpunk 2077. A couple years ago. As a point of reference.
@@Trick-Framed yeah, and then they release bullshit cards like the 3070 and Ti with 8GB of RAM, wtf!? 😄😆😒
So we get a stronger card with a VRAM limitation 🤦🤦
GDDR6X card was using 20-30W more than the old card. Hardly seems worth it with the minimal performance boost
Wasnt aware of the x variant, thanks mang 😊
From an engineering point of view, a consistent 1.5% uplift is definitely an improvement.
From a budget gaming point of view, the extra price and energy/power costs are not worth it, of course.
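Putting rough numbers on that trade-off (assuming a ~1.5% uplift and the roughly 25 W of extra draw on a ~200 W card that several comments report; both figures are approximations, not measurements):

```python
def efficiency_change(fps_ratio: float, power_ratio: float) -> float:
    """Relative change in frames-per-watt between two cards, as a fraction."""
    return fps_ratio / power_ratio - 1

fps_ratio = 1.015        # ~1.5% more FPS on the GDDR6X card
power_ratio = 225 / 200  # ~25 W extra draw on a ~200 W baseline
print(f"frames-per-watt change: {efficiency_change(fps_ratio, power_ratio):+.1%}")  # about -9.8%
```

So the refresh is measurably less efficient even though it is marginally faster.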
The tests with ray tracing activated were missing; perhaps the greater bandwidth would help there. Now we're left with the question...
Seeing this makes me happy with my EVGA RTX 2080 Super, which I got for doing some work on a friend's car. And I recently upgraded from an AMD Ryzen 5 1600X to an AMD Ryzen 5 5600X. I can say that I got a HUGE upgrade vs the 1600X and Zotac GTX 1070 I was running before. And it is nice to finally run my RAM at its full 3600MHz instead of the 3200MHz I got with the 1600X. Went from a 95-watt processor (stock) that only hit 4.0GHz all-core, pulling closer to 125 watts when maxed out at 70°C, to a 65-watt processor (stock) that will auto-overclock to between 4.65-4.8GHz all-core, some cores peaking at 5GHz, drawing 110-115 watts at 65°C with the same Corsair AIO. Pretty happy with all that on a B450M motherboard that I flashed a new BIOS onto.
Okay, I never saw that there are 2 different cards available. :D
In Luxembourg/Germany the price is similar, so for 2 frames extra, why not..
If the price is similar go for it 😁
So my takeaway is that if you already own the OG 3060 Ti it's not worth the upgrade. Would love to see a comparison with a 2060 6GB to see if it would be worth my time and money upgrading
If he'd used a faster CPU there would've been a noticeable increase in performance. I got almost a 10% FPS increase in some games and I'm running a 13600K. The stock 12400F with the crappy RAM he uses holds the GPU back.
And it would definitely be worth the upgrade from a 2060. But if you're running a CPU older than 11th-gen Intel or 5000-series Ryzen there isn't a point; upgrade your CPU and RAM first if that's the case.
This is usually the case with higher memory bandwidth... as game engines become more taxing on VRAM bandwidth, the difference between the 2 cards will become more noticeable. However, since most game engines today are well optimized to still run well on GDDR5 memory (and maybe GDDR6 in some titles), the extra bandwidth won't make a difference until the workload from newer game engines gets more demanding.
You'll most likely run out of VRAM by that point as well.
The higher bandwidth really comes into play at higher resolutions, but by that time 3060 Ti has already run out of steam
I would highly suggest using DX11 for Witcher 3 testing, as it generally gets approximately +10% and doesn't have any shader stutter; that way you'd be able to use Ultra+ without a performance hit. It feels very silly that Nvidia would do this with a 3060 Ti as opposed to a 3070, since that card is much more likely to have bandwidth-related performance issues.
Overall, more VRAM woulda helped this card more than faster VRAM. 3060ti shoulda had 10GB, 3070 shoulda had 12GB, and the 3080 shoulda had 16GB for this generation to shine fully for years to come. the 3060 shoulda been capped at 8GB of GDDR6 VRAM, not the TI version.
Oh nyo. Gotta upgrade mine immediately.
My £45 580 8gb arrived today... very impressed so far :) Big fps jump from my 570 and all good :D
I think I remember hearing about this 3060 ti refresh, but tbh it completely slipped my mind until this video
Not worth a price premium imo🤦♀🤦♂🙈. Great video thou💪😇🥰👍
Thanks for the vid
Stunned you didn't do a 4K test. I run an R5 3400G with an RX 5500 XT at 4K in Horizon Zero Dawn at ultra; sure, it's only 25 FPS, but it works
Found a 3060 ti for $210 on Mercari and I couldn’t be happier with this GPU especially gaming at 1080p
Looks to be using +20W over the non-X version (at least from what was being reported on screen)
That could also be down to bad binning. My Gaming OC Pro would regularly use over 200W as well, but clocked at 1945MHz on the core.
Why can't they put in 10 or 12 GB of VRAM?! That would be the best move for this 3060 Ti! Great video; I didn't know this GDDR6X version was out on the market!
They don't do it on the 4070, why would they do it on a 3060?
Love the videos; did note the power draw was an extra 20-25 watts too... Personally don't think the price difference is worth it... although my system's rocking a 3060 😁
Wonder how the GDDR6X model will compare when the VRAM starts to run out and it needs to swap more data in and out?
Great, I recently bought 2 3060TIs and never even heard about this.
You should run a Blender benchmark and see how much the extra bandwidth helps.
The higher bandwidth may be useful for productivity scenarios, like rendering videos, I guess
I was expecting a VRAM upgrade, which genuinely would've made it a great upgrade, but it seems like this is a very nominal improvement over a regular 3060 Ti
Plus it uses a lot more power.
Nah, if they make a VRAM upgrade now, how are they gonna sell "next-gen"?
Man, there are so many "nice" numbers in the results😊
BTW, another point in favor of the X model is VRAM consumption (at least in RTSS): it "eats" about 150MB less. Does that make any difference thanks to the quicker speeds, or should one opt for a 12GB card anyway?
Increased power use by quite a bit for very little gain.
Maybe try comparing the GTX 1660 super with a more modern card ❤
Yeah love that card!
@@RandomGaminginHD It'll be nice to see how it actually stacks up with hard numbers, my target is 1080p 75fps and I really haven't had to turn down a lot of settings yet
Yeah, I think I'll stick with my RX 6600, and still recommend it to others over this (or the OG version). Double the power consumption and double the price just to gain a mere 20 FPS average at 1440p in most games is not worth the cash, especially when the 7600 comes out next month (if it's priced competitively enough). I've been gaming on Nvidia for quite a while, but they lost their value charm when they went to RTX.
The regular 3060 Ti is still a better option than the 6600 or 6700; those are only a good buy if the price difference is much larger, and here where I live it's not. The 6700 XT is okay for the larger VRAM, but again, we can't really use RT on it, so if we compare it like that the 3060 Ti is still better.
@@milosstojanovic4623 First let me say that your English is not that great, so your comment is all over the place, but I will try my best to respond. I don't know where you are to see how your market's pricing is, but where I am there is a $200+ (U.S.) average difference in price between the 6600 and the 3060 Ti, and the difference matters to people that don't have "more money than brains." If RT is more important to my customer(s) than saving money, then I just point them to the 6700 XT; it's $40 less than the regular GDDR6 Zotac 3060 Ti and they are on par with one another, but most think it's not worth the cost of the card nor the cost added to their electric bill each month, especially if they have more than one gamer in the house.
@@JeremyLeePotocki First of all my English is good enough, and when you learn to speak and write Serbian as i do English, then we could discuss that, before that stop with bullshit remarks.
Second, here 6600 regular non XT costs 280e from what i found the cheapest, 3060ti Evga costs 330e (that's the cheapest, and 350e is an average price), and that ends any further comparison. That's why i specifically wrote HERE, because prices vary from country to country. The cheapest 6600xt costs 375e.
@@milosstojanovic4623 Milos, I was not trying to be insinuating, just informative, so there's no need to be this defensive or offensive in your comment. Also, if you had just stated what you pay for your cards in your area, it would have informed me a lot better and you would have received a better response. So yes, a 50e difference does make it a better option, if you don't take into account the cost of actually using the card on a daily basis. With that said, if you don't like what I said, maybe just don't reply at all, or at least be more civil about it.
@@JeremyLeePotocki This is me being civil; if I were not civil, I would use entirely different words to describe people whose first response is YoUr EngLiSh iS NoT gOOd and "I can't understand the context of what was said" ;)
I've talked to native speakers, and some of their English is less understandable than that of non-native speakers. All you and others need to do is use a couple more brain cells than usual to understand the words being spoken ;)
Best regards ✌
Instead they should have done a 16 gig 3060ti. That would be much better.
10-20% less efficient for a 1-2% performance gain. What a deal! I'm gonna get five of them.
Anyway, thanks for the video. I found out there was an X version, and I still wasn't 100% sure whether to get a 6700 XT or a 3060 Ti because of the 4 GB less VRAM; I'm confident in my choice now. No ragrets👌
Reminds me of when the GTX 1650 got a GDDR6 model, apparently because manufacturers were basically just done with GDDR5 and it was harder to source the older model memory
Exactly, there's no point in using different memory modules when they're priced almost the same and we gain nothing; it's ridiculous. XD
@@milosstojanovic4623 Well, I suppose we don’t gain nothing at all, but yes, it’s not much.
I figured the difference would be negligible. The GDDR6 version isn't really bandwidth limited in the first place.
Was there any difference in the power consumption? It seems as though GPU power usage is still on the rise, the tech is needing the next big break.
Software reports an increase of 25-30W (around 15% more) on the G6X variant. Honestly, not worth it for the ~3% extra performance you're getting.
Might try a Ryzen 5 5600 on your bench for these two cards. From what I saw in benchmarks from other YouTubers, the 3060 Ti G6X performs closer to the 3070... I think your 12400F might be the limiting factor here
The memory on the RTX 3060 Ti was already faster than the GPU could benefit from. Nvidia should have put lower-bandwidth VRAM, like the 3060's (192-bit memory bus, 360.0 GB/s bandwidth), into the 3060 Ti and it would've been fine; more VRAM was much more necessary than more bandwidth.
Also, keeping the same 256-bit bus width and mixing 4× 2 GB chips with 4× 1 GB chips of memory would have resulted in 12 GB while still offering higher bandwidth than the 3060.
I'm not a super technician, but that's what I think. :)
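Those bandwidth figures are easy to sanity-check: bandwidth is just the bus width in bytes times the effective per-pin data rate. A minimal sketch, assuming the commonly quoted data rates (15 Gbps GDDR6 on the 3060, 14 Gbps GDDR6 vs 19 Gbps GDDR6X on the 3060 Ti):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Memory bandwidth in GB/s = (bus width in bytes) * (effective data rate per pin)."""
    return bus_width_bits / 8 * data_rate_gbps

# RTX 3060: 192-bit bus, 15 Gbps GDDR6 -> matches the 360 GB/s figure above
print(bandwidth_gbs(192, 15))   # 360.0
# RTX 3060 Ti: 256-bit bus, GDDR6 (14 Gbps) vs GDDR6X (19 Gbps)
print(bandwidth_gbs(256, 14))   # 448.0
print(bandwidth_gbs(256, 19))   # 608.0
```

So the X variant adds roughly a third more bandwidth on paper, which is why the tiny fps gains in the video suggest the GPU core, not the memory, is the limit.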
1:36 I'd be interested to see the power draw at the wall if there's any difference... especially as that one has 2× 8-pin when some only have a single 8-pin.
I'd like to see games like Diablo, Borderlands and SnowRunner tested on the channel, because those are the only games I've played for years.
I'm wondering if the GDDR6X version runs faster at 4K due to the extra bandwidth
Hell yeah! 25 Watts more for 2 whole FPS boost. And how much extra does it cost for this epic upgrade?
I thought GDDR6X was supposed to be more efficient. Is this older-generation 6X memory that Nvidia wants to offload as excess stock?
Aesthetically I think the Zotac 30 Series RTX GPUs look great. Very underrated.
Underrated because it's a very loud card and it runs hot
@@Ladioz I said aesthetically. The noise a card produces has nothing to do with its aesthetics.
You need longer videos. You never make it to the end of my task. 😂
The problem with these cards (I have the 3070 variant of this brand myself) is the stupid position of the 8-pin sockets! Surely it would not have killed them to put the sockets higher; they are so awkward to plug the cables into. And after watching this video: no, to me this card is a complete waste of money. The plusses are so minuscule it's not worth it. But thanks again, Steve, for another interesting video!
The CPU clocks are slow and the resolution medium. If there was no bandwidth bottleneck with the GDDR6 setup, why would there be one with GDDR6X? You had no signal to test, and you got exactly those results.
They did a refresh but didn't add 4 GB more memory... just faster memory...
Yeah that 8 gig limit kills the 3060ti in newer games.
I got a gaming laptop in mid 2021 with an RTX 3060 and Ryzen 7 5800H. The RTX 3060 can run up to 130W, which is about as powerful as you can get in a laptop. The VRAM can be a limit sometimes, but I don't mind turning the settings down, and I think this laptop will run my applications comfortably for the next 2 years. Pretty happy with this purchase.
Legion 5 pro?
@@Thehoneybadger90 dell g15 actually.
It's all about price. During the GPU apocalypse I was on the EVGA wait list so long I forgot about it. When the email came to purchase a 3060 Ti at MSRP, I jumped on it. My oldest son has no complaints, and sure, the 8 GB of VRAM may be an issue with the latest unoptimized hot-garbage games... but the games he actually plays run fantastic and we won't be replacing it anytime soon.
The faster memory helps a lot in ray tracing. I recommend you try it; I'm almost sure it will be better
Eh, most people aren't that concerned with RTX performance. I can see why it's "cool" but it still doesn't have the performance to "game" on without making dramatic compromises that defeat the value of ray tracing in videogames anyways. And no, DLSS is not a valid solution for a $650+ GPU, let alone the $1,600 RTX 4090.
@@KiraSlith I know that most people don't turn on RTX, but I'm saying that testing RTX with the different types of memory would show the difference in performance
@@vtr_monsterextremo5145 You're right, he should have tested RT even if he needed to lower the resolution or reduce some settings.
I own a somewhat unknown, or just unpopular, 3060 Ti variant using GDDR6X memory, which I did not realise until watching this video! The card is a Gigabyte 3060 Ti Eagle OC D6X 8G, a model that is not listed in TechPowerUp's GPU database. The identifier for this specific card is nowhere to be seen, and I initially thought the card was made for the Chinese market. It works a treat now, since I replaced the thermal pads for the VRM and memory with thicker pads, as the factory ones were not up to the task. GPU hotspot and memory temps were insane, reaching 70 to 80°C with no changes to any OC settings in AB. Is this a rare card, or just an unpopular variant?
There is a GTX 1060 with GDDR5X, a KFA model apparently
I like the design of this GPU!
RandomGamingInHD tested a 3060 Ti with GDDR6X vs the GDDR6 and it was like a 1FPS difference apparently lol
I'm guessing non-X GDDR6 is no longer being produced (but 3060 Ti's still are for some reason...)
Now that we've got a bunch of 3060 variants, can you do an ultimate 1060 comparison (5GB, 6GB, 6GB 9Gbps and GDDR5X) someday? Because it's almost the same situation now
Irrelevant graphics card for me: I have an RTX 3060 Ti, so I'm looking at the RTX 4060 Ti 16GB and will ignore the AMD RX 7600xt 8GB because I only have a 650W PSU. Love these videos, they give me experience.
Wow, this card is actually a better upgrade than a 4060 Ti, because it's reliably faster instead of losing by surprising margins in some scenarios and games!
I like the 2x8pin location.
I remember when GDDR overclocking wasn't even a thing, but more is more. I wouldn't even have to consider the choice if the price difference isn't much: faster is faster, no matter how little.
Textures mostly just impact memory; if memory is fine, a higher texture size is no problem!
Probably more meant for esports titles like CS at 1080p. I doubt the bandwidth will matter at higher resolutions, since the games hit other limits way before.
In a blind test I highly doubt the 1-5 frame difference would be noticeable, but it's still handy to have just in case future games demand it
It'll be useful for OC: you can push core speeds quite high without worrying about memory speeds.
It's over $400 USD, so it's not surprising that no one is covering it at this point, even if it's a refresh.
I've had three Zotac cards in my life. Two of them failed quite quickly, third one had broken fan speed control from the start. 😬
Never again.
Zotac is the worst brand. I had a Zotac before, and while the card never gave me problems, it had loud coil whine even at low frame rates, it was loud at 40% fan speed, and temps were always 65+ degrees in all games
I wonder if the addition of 6X memory will bring it closer to the AMD 6750xt. When I built my rig a few months ago I looked at both but chose AMD for the better performance/price and because it runs better under Linux than Nvidia cards do.
Considering it doesn't change where the card sits performance-wise... no, it shouldn't bring it any closer to the 6750 XT than it already is. Memory speed and capacity simply aren't a huge factor unless you are running out. There are a few cases where vastly faster memory helps out at huge resolutions, but it's hard to fall back on edge cases for a general outlook on a GPU. It would be like me deciding to go AMD based on MW2 performance alone.
I have an MSI 3060 Ti 3x OC with the GDDR6X and 2 8-pin plugs! It's a great card!
How has it been so far? What resolution do you play at? Do games run smooth?
@@Ladioz 1440p @ 144hz. I mainly play older titles like Doom 2016. It’s been good. My cpu is a 5700x
@@Sonic_1000 Oh cool. I'm building a 3060 Ti and 5800X3D system for 1080p gaming
All I can say is that the RX 6700 XT and/or RX 6750 XT sound like a better buy imo.
Really like the 6700 XT; I had one myself for a while
Anyone else notice a big difference in the gameplay/benchmark footage for The Witcher and Red Dead? It seems like the GDDR6X is much brighter or has more lighting effects enabled. I even flipped my screen upside down in Windows to make sure it wasn't my monitor lol.
This is what I've been saying... instead of offering 6X memory, Nvidia could have used regular GDDR6 and given the cards more VRAM... the 6X doesn't add much performance but it is decently more expensive to use...
Interestingly, the GDDR6X constantly pulls 20-30 watts more than the normal one.
I mostly notice a 20 watt drop in power consumption
Is the RTX 3060 Ti GDDR6X worth buying over the RTX 4060 Ti for $60 less???
Does the 3060 Ti GDDR6X heat up more compared to the GDDR6 version??
Last question: which brand do you recommend, Zotac or Gigabyte??
3060 Tis have a big defect rate because of the 1 GB Hynix GDDR6 non-X chips (they die in 1-2 years); all the other 30xx GDDR6 non-X cards are fine: the 3050 uses 4× 2 GB, the 3060 6× 2 GB. This is probably just a fix for that
Source?
@@MDxGano YT deletes my comments, try to search it yourself
That's like the same scenario as when the first RTX 2000 series came out, which was about the Micron chips. I saw a GPU repair video on YT where Samsung memory chips die the same way, and most of the people who had this problem overclocked their memory too much so it heated up. Also note that GDDR6 doesn't include a temperature sensor but GDDR6X does, so people were frying their memory chips to death by overclocking. Same problem as the RTX 2000 Micron chips: people OC'd the memory to death. My old 2060 had the same Micron chips and it lasted 4 years; it only died when my AIO spilled liquid on it LMAO
@@scra1 Did some digging and really only anecdotal evidence which cannot be used to make a proper conclusion. I'm someone who has a working 680 that was overclocked to the wall for most of its life and went through 3 hand me down systems. That may not have much to do with 3060ti failures but googling "3060ti failure rates" comes up with pretty much a handful of people on reddit and literally nothing else.
@@MDxGano The ones that had the memory problems were mostly the 2000 series, particularly notorious in the 2080Ti with the Micron chips. I haven't really heard about the 3000 series having the same issues but I think it's "better to be safe than sorry" when it comes to these things in my opinion.
Looks like the big improvement is the 30W less power draw, which can be a big deal for some people.
A whopping 3-4% increase in fps for a 10-15% increase in power consumption, at a price maybe $20 higher. What a steal
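Running those rough numbers (the ~3% fps and ~15% power figures are assumptions taken from the comments above) shows perf-per-watt actually gets worse on the X model:

```python
fps_ratio = 1.03     # ~3% higher average fps (assumed)
power_ratio = 1.15   # ~15% higher power draw (assumed)

# Frames-per-watt relative to the plain GDDR6 card:
# a value below 1.0 means the refresh is less efficient
efficiency = fps_ratio / power_ratio
print(f"relative perf-per-watt: {efficiency:.3f}")  # ~0.896, i.e. roughly 10% worse
```

So even before the price premium, the refresh loses on efficiency under these assumptions.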
Didn’t know there was a refresh
5% more frames for a higher price? that's like buying updated drivers
Interestingly enough, the 6X card seems to draw more power (maybe 25 watts) while running cooler (2-ish degrees). I guess this has more to do with the third-party card design?
Could you consider that FH5 runs much worse in the town area and in the Hot Wheels expansion, or are you going for average-scenario results? In those places GPU usage goes up by about 20-25%, which is why I have to turn down some settings to avoid drops. It's sad to see these two areas having an impact on the whole game; otherwise my 3060 Ti FE's utilization is only 60-70% on average, and I was disappointed that I couldn't max out the game at 1080p (DLAA).
I wish there was an option to scale graphics back in these more demanding areas, so the performance is mostly consistent.
I just increased the power limit on my GDDR6 model in MSI Afterburner and got the same results, sometimes even better than the GDDR6X, so it is completely pointless to favor one over the other if you're looking for a 3060 Ti
Hey RandomGaming, what camera do you use for your videos?
G'day Random,
🤔 If these were still being made late last year, maybe it's not really a SKU upgrade/re-release but Nvidia just switching all models to GDDR6X VRAM
I also noticed the X pulling 20 to 25 more Watts.