The "New" RTX 3060 Ti - Is Gaming Performance Improved?

  • Published: 25 Oct 2024
  • Science

Comments • 470

  • @ydkma
    @ydkma 1 year ago +681

    wow such performance, a whole frame

    • @RandomGaminginHD
      @RandomGaminginHD  1 year ago +202

      Sometimes that frame can come in handy 😂

    • @justinh.2398
      @justinh.2398 1 year ago +31

      Frame wins games.

    • @ydkma
      @ydkma 1 year ago +12

      @justinh.2398 all we need is frame

    • @KimBoKastekniv47
      @KimBoKastekniv47 1 year ago +73

      25 more watts for just a frame.

    • @pelonix
      @pelonix 1 year ago +48

      @@KimBoKastekniv47 it's like overclocking but worse 😔

  • @t1e6x12
    @t1e6x12 1 year ago +434

    Well, after watching this video I've settled on upgrading from a 3060 Ti to a 3060 Ti G6X.

    • @jeantechnoir7702
      @jeantechnoir7702 1 year ago +21

      Seriously? For that amount of money and the minimal difference in performance, you're better off with a 4070, or even that 4060 Ti with 16GB if they release it.

    • @anrypettit3112
      @anrypettit3112 1 year ago +211

      @@jeantechnoir7702 r/woosh

    • @pelonix
      @pelonix 1 year ago +33

      I'm waiting until the RTX 3060 Ti G6XX!

    • @krejn
      @krejn 1 year ago +23

      I'm personally waiting for the GDDR7 variant

    • @helenHTID
      @helenHTID 1 year ago +16

      @@jeantechnoir7702 lol Oh Jean....

  • @zWORMzGaming
    @zWORMzGaming 1 year ago +14

    I was thinking:
    I see so many comments asking me if I can test a GDDR6X version. Surely it's much better than the normal one, right?
    ...
    I will start pasting this video's link under those comments 😅
    Thanks for the great comparison!

    • @pravnav
      @pravnav 1 year ago +1

      Hi zWORMz, love your vids

    • @zWORMzGaming
      @zWORMzGaming 1 year ago +1

      @@pravnav hey! Thanks mate :)

  • @TechOrigami
    @TechOrigami 1 year ago +162

    The human eye can't see beyond GDDR6.

  • @AderorhoFestus
    @AderorhoFestus 1 year ago +126

    The regular GDDR6 card produces pretty much the same results while consuming less power, ~25W less than the GDDR6X variant.

    • @sexdefender86
      @sexdefender86 1 year ago +8

      Additionally, on GDDR6 cards you can already get a stable +1000 MHz OC on the memory

    • @The_Man_In_Red
      @The_Man_In_Red 1 year ago +5

      @@sexdefender86 Not much demonstrable performance gain from it though. At best you're talking 1-3 FPS, although in my experience even a modest +500 MHz can improve 1% lows depending on the game/engine.

    • @arch1107
      @arch1107 1 year ago +4

      That doesn't say much, because the thing is already using over 200 watts.

    • @gamewizard1760
      @gamewizard1760 1 year ago +2

      ​@@arch1107 but doesn't require a second power connector. That second power connector might rule it out as an option for someone with a lower wattage psu, or one that doesn't have a lot of power connectors to spare, like most OEM systems.

    • @arch1107
      @arch1107 1 year ago

      @@gamewizard1760 The second connector is there for when you overclock and can't rely on just the 75 watts delivered by the PCI Express slot.
      Theoretically you don't need it if you don't overclock; when you do, you can of course go past the 150 watts from one 8-pin cable plus the 75 watts from the slot.
      They should have used a 6-pin connector, but it is what it is.
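
For anyone checking the connector math in the thread above: the PCI Express spec allows 75 W from the x16 slot and 150 W per 8-pin connector (75 W per 6-pin), so the theoretical ceilings work out as in this quick sketch. These are spec limits only; a card's actual limit is set by its BIOS power target, so treat the function and its numbers as illustrative assumptions.

    # PCIe power-budget arithmetic (spec ceilings, not measured draw)
    SLOT_W = 75    # PCIe x16 slot, per spec
    PIN8_W = 150   # per 8-pin PCIe connector, per spec
    PIN6_W = 75    # per 6-pin PCIe connector, per spec

    def power_ceiling(eight_pin: int = 0, six_pin: int = 0) -> int:
        """Theoretical board power ceiling in watts for a given connector mix."""
        return SLOT_W + eight_pin * PIN8_W + six_pin * PIN6_W

    print(power_ceiling(eight_pin=1))  # 225 -> one 8-pin already covers a ~200 W card
    print(power_ceiling(eight_pin=2))  # 375 -> a second 8-pin mostly buys OC headroom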

  • @Xendindrome
    @Xendindrome 1 year ago +135

    Love your channel. I didn't know they made a GDDR6X version.

  • @lees8359
    @lees8359 1 year ago +54

    My favorite hardware testing channel by far. In-depth reviews that show gameplay, talk about the important factors that determine if the hardware is worth it, etc. Thank you for these videos!

  • @irishgiant5150
    @irishgiant5150 1 year ago +63

    I'd be curious how the two cards respond in a VRAM-limited situation. Something like Hogwarts Legacy with the textures cranked high enough to challenge the 8-gigabyte buffer.

    • @rogoznicafc9672
      @rogoznicafc9672 1 year ago +3

      @Captain_Morgan so just downgrade the game version...

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago +4

      If you actually watched the video, he did show usage of over 7GB of VRAM, and if he'd pushed further he would have hit close to 8GB. But from the video it's obvious the difference is very slight; this is a pure cash-grab lie.

    • @janemba42
      @janemba42 1 year ago +1

      Hogwarts Legacy had issues with VRAM allocation from day 1. Having said that, I only have an RX 6600 8GB and never had any issues with it.

    • @RuruFIN
      @RuruFIN 1 year ago

      RE4 remake is also pretty VRAM hungry.

  • @gvii
    @gvii 1 year ago +11

    Boy, that GDDR6X upgrade really unleashed the beast, huh?

    • @raf9826
      @raf9826 1 year ago +4

      All 0.2% of it 🤣

  • @zonelore
    @zonelore 1 year ago +1

    GUYS, Nvidia released the GDDR6X version of this card for a reason: to prevent further problems with GDDR6 chips burning out. For some users the chips started to fail, due to incorrect voltages set straight from the Hynix factory.
    So you can safely get the GDDR6X version; it's very reliable.

  • @3dfxvoodoocards6
    @3dfxvoodoocards6 1 year ago +3

    Like! Today’s video cards look like drones :))

  • @upfront2375
    @upfront2375 1 year ago +1

    There is a reason why we haven't heard about this version... *It doesn't say much!*
    Thank you😄👍🏽

  • @TFSned
    @TFSned 1 year ago +7

    It was tough getting a 3060 Ti for MSRP this time last year. Luckily I got one and have been happy with it since.

    • @fabrb26
      @fabrb26 1 year ago

      That makes Nvidia's choice to go for this instead of a VRAM capacity upgrade questionable. It's obviously just leftover stock of chips, but they could at least have made the upgrade on the 3070 instead.

  • @TC-tn9tb
    @TC-tn9tb 1 year ago +16

    There was never a bandwidth bottleneck with the original, so why would there be any real improvement?

    • @BenState
      @BenState 1 year ago

      THIS

    • @RS-nq8xk
      @RS-nq8xk 1 year ago +1

      Heck, I use one limited to the lowest voltage; the VRAM downclocks to 5000 MHz, but that only changes the Time Spy score by 8% or so compared to 7000.

    • @heyitsmejm4792
      @heyitsmejm4792 1 year ago +2

      And the GDDR6X is power-hungry...

    • @TC-tn9tb
      @TC-tn9tb 1 year ago

      @@BenState You're not serious😂
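
The bandwidth gap this thread is debating is easy to put a number on. Assuming the commonly published data rates for the two variants (14 Gbps GDDR6 vs. 19 Gbps GDDR6X, both on the 3060 Ti's 256-bit bus; treat these figures as assumptions), a quick calculation gives:

    # Memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8
    def bandwidth_gbs(data_rate_gbps: float, bus_bits: int) -> float:
        return data_rate_gbps * bus_bits / 8

    gddr6 = bandwidth_gbs(14.0, 256)   # 448.0 GB/s -- original 3060 Ti
    gddr6x = bandwidth_gbs(19.0, 256)  # 608.0 GB/s -- GDDR6X refresh
    print(f"+{gddr6x / gddr6 - 1:.0%}")  # +36% raw bandwidth

    # ~36% more bandwidth only shows up as FPS if the GPU was bandwidth-bound,
    # which the near-identical benchmark results suggest it was not.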

  • @50shadesofbeige88
    @50shadesofbeige88 1 year ago +9

    It's good that someone quantified this. The 3060 Ti wasn't known to have a memory bottleneck, so I had a feeling the difference would be negligible.

  • @kai84m
    @kai84m 1 year ago +2

    Close to measurement tolerance, but the power draw of the GDDR6X variant is also higher - around 15% as far as I could see.

  • @amy-295
    @amy-295 1 year ago +2

    That's a huge improvement, I'm gonna buy this later

  • @XxJGVxX
    @XxJGVxX 1 year ago +1

    About 20W more power on 6X than 6 for a few frames 😅. No wonder it hasn't been covered by other channels. Thanks for the video.

  • @OneTwoWolf
    @OneTwoWolf 1 year ago +13

    Would maybe have been a more interesting card if they'd changed the memory to 12GB of VRAM on a 192-bit bus, using GDDR6X to make up the bandwidth lost to the smaller bus. As it is, I'm guessing the improvements are minimal, because the card isn't really reaching its bandwidth limit most of the time even in the original version.

    • @Trick-Framed
      @Trick-Framed 1 year ago +2

      The memory subsystem in these cards is designed to act more like the consoles'. This is Nvidia trying to be AMD: more memory to compensate for bandwidth limitations. Makes sense, but in practice? Devs need to spit-polish their turds and stop churning out unoptimized garbage.

    • @MDxGano
      @MDxGano 1 year ago +1

      @@Trick-Framed This. Over half of the games that broke 8GB cards in HWUB's video on the topic have been remedied.

    • @Trick-Framed
      @Trick-Framed 1 year ago

      @@MDxGano Yep. They get to it eventually. Like I said, they need to keep at it. I'm at the point where I've stopped buying release-day games. Too expensive for the sheer volume of game-breaking moments. On console they know better... but the PC crowd will deal with it? Meanwhile we pay 10x for our "game machines".

    • @Trick-Framed
      @Trick-Framed 1 year ago

      The last AAA game that pooped the bed on console was Cyberpunk 2077, a couple of years ago. As a point of reference.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago +2

      @@Trick-Framed Yeah, and then they release a bullshit card like the 3070 and Ti with 8GB of VRAM, wtf!? 😄😆😒
      So we get a stronger card with a VRAM limitation 🤦🤦

  • @Aruneh
    @Aruneh 1 year ago +5

    The GDDR6X card was using 20-30W more than the old card. Hardly seems worth it with the minimal performance boost.

  • @cybersamiches4028
    @cybersamiches4028 1 year ago

    Wasn't aware of the X variant, thanks mang 😊

  • @todorkolev7565
    @todorkolev7565 1 year ago +2

    From an engineering point of view, a consistent 1.5% uplift is definitely an improvement.
    From a budget gaming point of view, the extra price and energy costs are not worth it, of course.

  • @RetroGamingX1
    @RetroGamingX1 1 year ago +1

    Tests with ray tracing enabled were missing; perhaps the greater bandwidth would help there. Now we're left with the question...

  • @Carstuff111
    @Carstuff111 1 year ago +1

    Seeing this makes me happy with my EVGA RTX 2080 Super, which I got for doing some work on a friend's car. I also recently upgraded from an AMD Ryzen 5 1600X to an AMD Ryzen 5 5600X. I can say that it's a HUGE upgrade vs the 1600X and Zotac GTX 1070 I was running before, and it's nice to finally run my RAM at its full 3600MHz instead of the 3200MHz I got with the 1600X. Went from a 95-watt processor (stock) that only hit 4.0GHz all-core, pulling closer to 125 watts when maxed out at 70°C, to a 65-watt processor (stock) that auto-overclocks to 4.65-4.8GHz all-core, some cores peaking at 5GHz, drawing 110-115 watts at 65°C with the same Corsair AIO. Pretty happy with all that on a B450M motherboard that I flashed a new BIOS onto.

  • @Beisepimp
    @Beisepimp 1 year ago +1

    Okay, I never knew there were 2 different cards available. :D
    In Luxembourg/Germany the prices are similar, so for 2 frames extra, why not...

  • @tfgdj94694
    @tfgdj94694 1 year ago +5

    So my takeaway is that if you already own the OG 3060 Ti, it's not worth the upgrade. Would love to see a comparison with a 2060 6GB to see if it would be worth my time and money upgrading.

    • @LegionGamingTV
      @LegionGamingTV 1 year ago

      If he'd used a faster CPU there would've been a noticeable increase in performance. I got almost a 10% FPS increase in some games and I'm running a 13600K. The stock 12400F with the crappy RAM he uses holds the GPU back.

    • @LegionGamingTV
      @LegionGamingTV 1 year ago

      And it would definitely be worth the upgrade from a 2060. But if you're running a CPU older than 11th-gen Intel or 5000-series Ryzen there isn't much point; upgrade your CPU and RAM first if that's the case.

  • @hakdergaming
    @hakdergaming 1 year ago +3

    This is usually the case with higher memory bandwidth... as game engines become more taxing on VRAM bandwidth, the difference between the two cards will become more noticeable. However, since most game engines today are optimized well enough to still run fine on GDDR5 memory (and maybe GDDR6 in some titles), the extra bandwidth won't make a difference until the workload from newer engines gets more demanding.

    • @Dankuzmeemusmaximus
      @Dankuzmeemusmaximus 1 year ago +3

      You'll most likely have run out of VRAM by that point as well.

    • @MrSolvalou
      @MrSolvalou 1 year ago

      The higher bandwidth really comes into play at higher resolutions, but by that point the 3060 Ti has already run out of steam.

  • @SyndicatesFollower
    @SyndicatesFollower 1 year ago +1

    I would highly suggest using DX11 for Witcher 3 testing, as it generally gets approximately +10% and doesn't have any shader stutter; that way you'd be able to use Ultra+ without a performance hit. It feels very silly that Nvidia would do this with a 3060 Ti as opposed to a 3070, since that card is much more likely to have bandwidth-related performance issues.

  • @ThePike220
    @ThePike220 1 year ago +2

    Overall, more VRAM woulda helped this card more than faster VRAM. The 3060 Ti shoulda had 10GB, the 3070 12GB, and the 3080 16GB for this generation to shine fully for years to come. The plain 3060 shoulda been the one capped at 8GB of GDDR6 VRAM, not the Ti.

  • @ChrisGrump
    @ChrisGrump 1 year ago

    Oh nyo. Gotta upgrade mine immediately.

  • @dodgydruid
    @dodgydruid 1 year ago

    My £45 580 8GB arrived today... very impressed so far :) Big FPS jump from my 570 and all good :D

  • @sosukelele
    @sosukelele 1 year ago

    I think I remember hearing about this 3060 Ti refresh, but tbh it completely slipped my mind until this video.

  • @michaelthompson9798
    @michaelthompson9798 1 year ago +1

    Not worth a price premium imo 🤦‍♀🤦‍♂🙈. Great video though 💪😇🥰👍

  • @kingsnk3139
    @kingsnk3139 1 year ago

    Thanks for the vid

  • @phoenixangel9022
    @phoenixangel9022 1 year ago

    Stunned you didn't do a 4K test. I run an R5 3400G with an RX 5500 XT at 4K in Horizon Zero Dawn at ultra; sure, it's only 25 fps, but it works.

  • @Grandmaster_Vic
    @Grandmaster_Vic 1 year ago

    Found a 3060 Ti for $210 on Mercari and I couldn't be happier with this GPU, especially gaming at 1080p.

  • @bunsenhoneydew6551
    @bunsenhoneydew6551 1 year ago +3

    Looks to be using +20W over the non-X version (at least from what was being reported on screen).

    • @MDxGano
      @MDxGano 1 year ago

      That could also be down to bad binning. My Gaming OC Pro would regularly use over 200W as well, but clocked at 1945MHz on the core.

  • @nykraftlemagnifique
    @nykraftlemagnifique 1 year ago

    Why can't they put in 10 or 12 GB of VRAM?! That would be the best move for this 3060 Ti! Great video, I didn't know this GDDR6X version was out on the market!

    • @arch1107
      @arch1107 1 year ago

      They don't do it on the 4070, why would they do it on a 3060?

  • @iceman198021
    @iceman198021 1 year ago

    Love the videos. I did note the power draw was an extra 20-25 watts too... Personally I don't think the price difference is worth it... although my system's rocking a 3060 😁

  • @virtual-adam
    @virtual-adam 10 months ago

    Wonder how the GDDR6X model will compare when the VRAM starts to run out and it needs to swap more data in and out?

  • @JRose-zn7iw
    @JRose-zn7iw 1 year ago

    Great, I recently bought 2 3060 Tis and never even heard about this.

  • @supabass4003
    @supabass4003 1 year ago +1

    You should run a Blender benchmark and see how much the extra bandwidth helps.

  • @MichaelCH911
    @MichaelCH911 1 year ago +1

    The higher bandwidth may be useful for productivity scenarios, like rendering videos, I guess.

  • @_Devil
    @_Devil 1 year ago +2

    I was expecting a VRAM upgrade, which genuinely would've made it a great card, but it seems like this is a very nominal upgrade over a regular 3060 Ti.

    • @bills6093
      @bills6093 1 year ago

      Plus it uses a lot more power.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago

      Nah, if they do a VRAM upgrade now, how are they gonna sell "next-gen"?

  • @andyshtroymish4997
    @andyshtroymish4997 1 year ago +2

    Man, there are so many "nice" numbers in the results 😊
    BTW, another point in favor of the X model is VRAM consumption (at least in RTSS): it "eats" about 150MB less. Does that make any difference thanks to the quicker speeds, or should one opt for a 12GB card anyway?

    • @bills6093
      @bills6093 1 year ago +1

      Increased power use by quite a bit for very little gain.

  • @TSF71
    @TSF71 1 year ago +3

    Maybe try comparing the GTX 1660 Super with a more modern card ❤

    • @RandomGaminginHD
      @RandomGaminginHD  1 year ago +1

      Yeah love that card!

    • @siddarthgandubilli2786
      @siddarthgandubilli2786 1 year ago

      @@RandomGaminginHD It'd be nice to see how it actually stacks up with hard numbers; my target is 1080p 75fps and I really haven't had to turn down a lot of settings yet.

  • @JeremyLeePotocki
    @JeremyLeePotocki 1 year ago +2

    Yeah, I think I'll stick with my RX 6600, and I still recommend it to others over this (or the OG version). Double the power consumption and double the price just to gain a mere 20 fps average at 1440p in most games is not worth the cash, especially with the 7600 coming out next month (if it's priced competitively enough). I've been gaming on Nvidia for quite a while, but they lost their value charm when they went to RTX.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago

      The regular 3060 Ti is still a better option than the 6600 or 6700; those are only a good buy if the price difference is much larger, and here where I live it isn't. The 6700 XT is okay for the larger VRAM, but that's again useless because we can't really use RT, so comparing it like that, the 3060 Ti is still better.

    • @JeremyLeePotocki
      @JeremyLeePotocki 1 year ago

      @@milosstojanovic4623 First let me say that your English is not that great, so your comment is all over the place, but I will try my best to respond. I don't know where you are or how your market's pricing looks, but where I am there is a $200+ (U.S.) average difference in price between the 6600 and the 3060 Ti, and that difference matters to people who don't have "more money than brains." If RT is more important to my customer(s) than saving money, then I just point them to the 6700 XT; it's $40 less than the regular GDDR6 Zotac 3060 Ti and they are on par with one another, but most think it's not worth the cost of the card nor the cost added to their electric bill each month, especially if they have more than one gamer in the house.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago

      @@JeremyLeePotocki First of all, my English is good enough, and when you learn to speak and write Serbian as well as I do English, then we can discuss that; until then, stop with the bullshit remarks.
      Second, here the regular non-XT 6600 costs 280€ at the cheapest I found, and the cheapest 3060 Ti (an EVGA) costs 330€ (350€ is an average price), and that ends any further comparison. That's why I specifically wrote HERE, because prices vary from country to country. The cheapest 6600 XT costs 375€.

    • @JeremyLeePotocki
      @JeremyLeePotocki 1 year ago

      @@milosstojanovic4623 Milos, I was not trying to be insulting, just informative, so there's no need to be this defensive or offensive in your comment. Also, if you had just stated what you're paying for cards in your area, it would have informed me a lot better and you would have received a better response. So yes, a 50€ difference does make it a better option, if you don't count the cost of actually using the card on a daily basis. With that said, if you don't like what I said, maybe just don't reply at all, or at least be more civil about it.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago

      @@JeremyLeePotocki This is me being civil; if I were not civil, I would use entirely different words to describe people whose first response is YoUr EngLiSh iS NoT gOOd and "I can't understand the context of what was said" ;)
      I've talked to native speakers, and some of their English is less understandable than that of non-native speakers. All you and others need to do is use 2 more brain cells beyond the usual 2 to understand the words ;)
      Best regards ✌

  • @roastinpeace2320
    @roastinpeace2320 1 year ago +3

    Instead they should have done a 16-gig 3060 Ti. That would be much better.

  • @fabrb26
    @fabrb26 1 year ago

    10-20% less efficient for a 1-2% performance gain. What a deal! I'm gonna get five of them.
    Anyway, thanks for the video; I found out there was an X version. I still wasn't 100% sure whether to get a 6700 XT or a 3060 Ti because of the 4GB less VRAM, but I'm confident in my choice now. No ragrets 👌

  • @Skradgee
    @Skradgee 1 year ago

    Reminds me of when the GTX 1650 got a GDDR6 model, apparently because manufacturers were basically just done with GDDR5 and it was getting harder to source the older memory.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago

      Exactly, there's no point in the different memory modules when they're priced almost the same and we gain nothing; it's ridiculous. XD

    • @Skradgee
      @Skradgee 1 year ago

      @@milosstojanovic4623 Well, I suppose we don’t gain nothing at all, but yes, it’s not much.

  • @pengu6335
    @pengu6335 1 year ago

    I figured the difference would be negligible. The GDDR6 version isn't really bandwidth-limited in the first place.

  • @byre1000
    @byre1000 1 year ago +1

    Was there any difference in power consumption? It seems as though GPU power usage is still on the rise; the tech is needing its next big break.

    • @steph_on_yt
      @steph_on_yt 1 year ago +1

      Software reports an increase of 25-30W (around 15% more) on the G6X variant. Honestly, not worth it for the ~3% extra performance you're getting.
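
Taking this thread's figures at face value (roughly +3% performance for 25-30 W, i.e. about +15% board power; both are software-reported approximations, not wall measurements), performance per watt actually regresses:

    # Perf-per-watt change from the thread's rough numbers (assumed values)
    perf_gain = 0.03    # ~3% higher average FPS (G6X vs. plain G6)
    power_gain = 0.15   # ~15% higher board power

    efficiency_change = (1 + perf_gain) / (1 + power_gain) - 1
    print(f"{efficiency_change:+.1%} FPS per watt")  # about -10.4%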

  • @MrTechnofuzz
    @MrTechnofuzz 1 year ago +1

    Might try a Ryzen 5 5600 on your bench for these 2 cards. From what I saw in benchmarks from other YTers, the 3060 Ti G6X performs closer to the 3070... I think your 12400F might be the limiting factor here.

  • @shaheermansoor2560
    @shaheermansoor2560 1 year ago +1

    The memory on the RTX 3060 Ti was already faster than the GPU could benefit from. Nvidia could have put lower-bandwidth VRAM like the 3060's (192-bit bus, 360.0 GB/s bandwidth) into the 3060 Ti and it would've been fine; more VRAM was much more necessary than more bandwidth.
    Also, keeping the same 256-bit bus width, a mix of 4x 2GB and 4x 1GB memory chips would have resulted in 12GB with still higher bandwidth than the 3060.
    I'm not a super technician, but that's what I think. :)
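
As a sanity check on the mixed-density idea above (a hypothetical layout, not any shipping card): a 256-bit bus is filled by eight 32-bit chips, so mixing 2 GB and 1 GB chips changes capacity while bandwidth stays data rate times bus width:

    # Hypothetical GDDR6 configurations; each chip contributes 32 bits of bus
    CHIP_BITS = 32

    def config(chips_gb: list, data_rate_gbps: float):
        bus_bits = len(chips_gb) * CHIP_BITS
        return sum(chips_gb), data_rate_gbps * bus_bits / 8  # (GB, GB/s)

    print(config([1] * 8, 14))            # (8, 448.0)  -- actual 3060 Ti layout
    print(config([2] * 4 + [1] * 4, 14))  # (12, 448.0) -- the 12GB mix proposed above
    print(config([2] * 6, 15))            # (12, 360.0) -- 3060-style 192-bit bus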

  • @BillyBoy444
    @BillyBoy444 1 year ago

    1:36 I'd be interested to see the power draw at the wall if there's any difference... especially as that one has 2x 8-pin when some only have a single 8-pin.

  • @comerain2425
    @comerain2425 1 year ago

    I'd like to see games like Diablo, Borderlands and SnowRunner tested on the channel, because those are the only games I've played for years.

  • @Loundsify
    @Loundsify 1 year ago +1

    I'm wondering if the GDDR6X version runs faster at 4K due to the extra bandwidth.

  • @justinpatterson5291
    @justinpatterson5291 1 year ago +1

    Hell yeah! 25 watts more for a whole 2 FPS boost. And how much extra does it cost for this epic upgrade?

    • @roythunderplump
      @roythunderplump 1 year ago

      I thought GDDR6X was supposed to be more efficient. Is this older-version 6X memory that Nvidia wants to offload as excess stock?

  • @marcelthorsen
    @marcelthorsen 1 year ago +1

    Aesthetically I think the Zotac 30-series RTX GPUs look great. Very underrated.

    • @Ladioz
      @Ladioz 1 year ago

      Underrated because it's a very loud card and it gets high temps.

    • @marcelthorsen
      @marcelthorsen 1 year ago

      @@Ladioz I said aesthetically. The noise a card produces has nothing to do with its aesthetics.

  • @SVW1976
    @SVW1976 1 year ago

    You need longer videos. You never make it to the end of my task. 😂

  • @yogibear2k220
    @yogibear2k220 1 year ago

    The problem with these cards, and I have the 3070 variant of this brand myself, is the stupid position of the 8-pin sockets! Surely it would not have killed them to put the sockets higher; they are so awkward to plug the cables into. And having watched this video: no, to me this card is a complete waste of money. The pluses are so minuscule it's not worth it. But thanks again, Steve, for another interesting video!

  • @BenState
    @BenState 1 year ago

    CPU clocks are slow and the resolution medium. If there was no bandwidth bottleneck with the GDDR6 card, why would there be one with GDDR6X? You had no signal to test, and got exactly those results.

  • @John_Doe1980
    @John_Doe1980 1 year ago +4

    They did a refresh but didn't add 4GB more memory... just faster memory...

    • @MoultrieGeek
      @MoultrieGeek 1 year ago

      Yeah, that 8-gig limit kills the 3060 Ti in newer games.

  • @fanatiquefantastique8516
    @fanatiquefantastique8516 1 year ago

    I got a gaming laptop in mid-2021 with an RTX 3060 and a Ryzen 7 5800H. The RTX 3060 can run at up to 130W, which is about as powerful as you can get in a laptop. The VRAM can be a limit sometimes, but I don't mind turning the settings down, and I think this laptop will run my applications comfortably for the next 2 years. Pretty happy with this purchase.

  • @TheJamesKF
    @TheJamesKF 1 year ago

    It's all about price. During the GPU apocalypse I was on the EVGA wait list so long I forgot about it. When the email came to purchase a 3060 Ti at MSRP I jumped on it. My oldest son has no complaints, and sure, the 8GB of VRAM may be an issue with the latest unoptimized hot-garbage games... but the games he actually plays run fantastic and we won't be replacing it anytime soon.

  • @vtr_monsterextremo5145
    @vtr_monsterextremo5145 1 year ago +1

    The faster memory helps a lot in ray tracing. I recommend you try it; I'm almost sure it will be better.

    • @KiraSlith
      @KiraSlith 1 year ago +1

      Eh, most people aren't that concerned with RTX performance. I can see why it's "cool", but it still doesn't have the performance to game on without making dramatic compromises that defeat the value of ray tracing in video games anyway. And no, DLSS is not a valid solution for a $650+ GPU, let alone the $1,600 RTX 4090.

    • @vtr_monsterextremo5145
      @vtr_monsterextremo5145 1 year ago +1

      @@KiraSlith I know most people don't turn on RTX; I'm saying he could test RTX with the different memory types to show the difference in performance.

    • @milosstojanovic4623
      @milosstojanovic4623 1 year ago

      @@vtr_monsterextremo5145 You're right, he should have tested RT even if he'd needed to lower the resolution or reduce some settings.

  • @drruncmd
    @drruncmd 5 months ago

    I own a somewhat unknown, or just unpopular, 3060 Ti variant using GDDR6X memory, which I didn't realize until watching this video! The card is a Gigabyte 3060 Ti Eagle OC D6X 8G, a model that isn't listed in TechPowerUp's GPU database. The identifier for this specific card is nowhere to be seen, and I initially thought the card was made for the Chinese market. It works a treat now that I've replaced the thermal pads on the VRM and memory with thicker ones, as the factory pads were not up to the task; GPU hotspot and memory temps were insane, reaching 70-80°C with no changes to any OC settings in AB. Is this a rare card, or just an unpopular variant?

  • @sirfairplay9153
    @sirfairplay9153 1 year ago

    There is a GTX 1060 with GDDR5X, a KFA model apparently

  • @RaphyRaphaelOG
    @RaphyRaphaelOG 1 year ago

    I like the design of this GPU!

  • @toufusoup
    @toufusoup 2 months ago

    RandomGamingInHD tested a 3060 Ti with GDDR6X vs the GDDR6 and it was like a 1FPS difference apparently lol

  • @user-bf5sc8pn8x
    @user-bf5sc8pn8x 1 year ago +1

    I'm guessing non-X GDDR6 is no longer being produced (but 3060 Tis still are, for some reason...)

  • @KeradSnake
    @KeradSnake 1 year ago

    Now that we've got a bunch of 3060 variants, can you do an ultimate 1060 comparison (5GB, 6GB, 6GB 9Gbps and GDDR5X) someday? It's almost the same situation now.

  • @markcook4043
    @markcook4043 1 year ago

    Not a relevant graphics card for me; I have an RTX 3060 Ti, so I'm looking at the RTX 4060 Ti 16GB, and I'll ignore the AMD RX 7600 XT 8GB because I only have a 650W PSU. Love these videos, they give me experience.

  • @IdunRedstone
    @IdunRedstone 1 year ago

    Wow, this card is actually a better upgrade than a 4060 Ti, because it's reliably faster instead of losing by surprising margins in some scenarios and games!

  • @facelessvaper
    @facelessvaper 1 year ago

    I like the 2x8pin location.

  • @TheSilviu8x
    @TheSilviu8x 1 year ago

    I remember when GDDR overclocking wasn't even a thing, but more is more. I wouldn't even weigh the choices if the price difference between them isn't much; faster is faster, no matter how little.

  • @sjoervanderploeg4340
    @sjoervanderploeg4340 1 year ago

    Textures mostly impact memory; if memory is fine, a higher texture size is no problem!

  • @biglevian
    @biglevian 1 year ago

    Probably more meant for esports titles like CS at 1080p. I doubt bandwidth will matter at higher resolutions, since the games hit other limits way before.

  • @illyasarachi873
    @illyasarachi873 1 year ago

    In a blind test I highly doubt the 1-5 frame difference would be noticeable, but it's still handy to have in case future games demand it.

    • @DragonOfTheMortalKombat
      @DragonOfTheMortalKombat 1 year ago

      It'll be useful for OC; you can push core speeds quite high without worrying about memory speeds.

  • @IceCreamePudding
    @IceCreamePudding 1 year ago

    It's over $400; not surprising that no one is covering it at this point, even if it's a refresh.

  • @THU31
    @THU31 1 year ago

    I've had three Zotac cards in my life. Two of them failed quite quickly; the third had broken fan speed control from the start. 😬
    Never again.

    • @Ladioz
      @Ladioz 1 year ago

      Zotac is the worst brand. I had a Zotac before, and while the card never gave me computer problems, it had loud coil whine even at low frame rates, it was loud at 40% fan speed, and temps were always 65°C+ in all games.

  • @MoultrieGeek
    @MoultrieGeek 1 year ago +2

    I wonder if the addition of 6X memory brings it closer to the AMD 6750 XT. When I built my rig a few months ago I looked at both, but chose AMD for the better performance per price and because it runs better under Linux than Nvidia cards do.

    • @MDxGano
      @MDxGano 1 year ago +1

      Considering it doesn't change where the card sits performance-wise... no, it shouldn't bring it any closer to the 6750 XT than it already is. Memory speed and capacity simply aren't a huge factor unless you're running out. There are a few cases where vastly faster memory helps at huge resolutions, but it's hard to base a general outlook for a GPU on edge cases. It would be like me deciding to go AMD based on MW2 performance alone.

  • @Sonic_1000
    @Sonic_1000 1 year ago

    I have an MSI 3060 Ti 3X OC with the GDDR6X and 2 8-pin plugs! It's a great card!

    • @Ladioz
      @Ladioz 1 year ago

      How has it been so far? What resolution do you play at? Do games run smooth?

    • @Sonic_1000
      @Sonic_1000 1 year ago

      @@Ladioz 1440p @ 144Hz. I mainly play older titles like Doom 2016. It's been good. My CPU is a 5700X.

    • @Ladioz
      @Ladioz 1 year ago +1

      @@Sonic_1000 Oh cool. I'm building a 3060 Ti and 5800X3D rig for 1080p gaming.

  • @hoangtudaden1304
    @hoangtudaden1304 1 year ago +1

    All I can say is that the RX 6700 XT and/or RX 6750 XT sound like a better buy, imo.

  • @Nilboggen
    @Nilboggen 1 year ago

    Anyone else notice a big difference in the gameplay/benchmark footage for The Witcher and Red Dead? It seems like the GDDR6X is much brighter or has more lighting effects enabled. I even flipped my screen upside down in Windows to make sure it wasn't my monitor lol.

  • @kennethd4958
    @kennethd4958 1 year ago

    This is what I've been saying... instead of offering 6X memory, Nvidia could have used regular GDDR6 and given the cards more VRAM. The 6X doesn't add much performance, but it is decently more expensive to use...

  • @zarrar26
    @zarrar26 1 year ago +1

    Interestingly, the GDDR6X consistently pulls 20-30 watts more than the normal one.

  • @Decenium
    @Decenium 1 year ago +1

    I mostly notice a 20 watt drop in power consumption

  • @SimpleReview
    @SimpleReview 1 year ago

    Is the RTX 3060 Ti GDDR6X worth buying over an RTX 4060 Ti for $60 less?
    Does the 3060 Ti GDDR6X heat up more compared to the GDDR6 version?
    Last question: which brand do you recommend, Zotac or Gigabyte?

  • @scra1
    @scra1 1 year ago +8

    3060 Tis have a high defect rate because of the 1GB Hynix non-X GDDR6 chips (they die in 1-2 years); all the other 30-series non-X GDDR6 cards are fine: the 3050 uses 4x 2GB, the 3060 6x 2GB. This is probably just a fix for that.

    • @MDxGano
      @MDxGano 1 year ago

      Source?

    • @scra1
      @scra1 1 year ago

      @@MDxGano YT deletes my comments, try to search it yourself

    • @heyitsmejm4792
      @heyitsmejm4792 1 year ago +1

      That's like the same scenario as when the first RTX 2000 series came out, except that was about Micron chips. I saw a GPU repair video on YT where Samsung memory chips die the same way too, and most of the people who had this problem had overclocked their memory too much and let it heat up. Also note that GDDR6 doesn't include a temperature sensor but GDDR6X does, so people fry their memory chips to death by overclocking. Same problem as with the RTX 2000 Micron chips: people OC'd the memory to death. My old 2060 had the same Micron chips and it lasted 4 years; it only died when my AIO spilled liquid on it LMAO

      @MDxGano 1 year ago

      @@scra1 Did some digging and found really only anecdotal evidence, which can't be used to draw a proper conclusion. I'm someone with a still-working 680 that was overclocked to the wall for most of its life and went through 3 hand-me-down systems. That may not have much to do with 3060 Ti failures, but googling "3060ti failure rates" turns up pretty much a handful of people on Reddit and literally nothing else.

    • @JCmeister9
      @JCmeister9 1 year ago

      @@MDxGano The ones that had the memory problems were mostly the 2000 series; it was particularly notorious in the 2080 Ti with the Micron chips. I haven't really heard about the 3000 series having the same issues, but I think it's better to be safe than sorry when it comes to these things, in my opinion.

  • @RobGMun
    @RobGMun 1 year ago

    Looks like the big improvement is the 30W less power draw, which can be a big deal for some people.

  • @kmc4682
    @kmc4682 1 year ago

    A whopping 3-4% increase in fps for a 10-15% increase in energy consumption and maybe $20 more. What a steal.

  • @kingcrackedhen3572
    @kingcrackedhen3572 1 year ago +1

    Didn’t know there was a refresh

  • @blade6965
    @blade6965 1 year ago +2

    5% more frames for a higher price? That's like buying updated drivers.

  • @samstrolia
    @samstrolia 1 year ago

    Interestingly enough, the 6X card seems to draw more power (maybe 25 watts) while running cooler (2-ish degrees). I guess this has more to do with the third-party card design?

  • @Patrick76496
    @Patrick76496 1 year ago

    Could you consider that FH5 runs much worse in the town area and in the Hot Wheels expansion, or are you going for average-scenario results? In those places GPU usage goes up by about 20-25%, which is why I have to turn down some settings to avoid drops. Sad to see these 2 areas having an impact on the whole game; otherwise my 3060 Ti FE's utilization is only 60-70% on average, and I was disappointed that I couldn't max out the game at 1080p (DLAA).
    I wish there was an option to have the graphics scaled back in these more demanding areas, so the performance is mostly consistent.

  • @fareshesham8135
    @fareshesham8135 1 year ago

    I just increased the power limit on my GDDR6 model in MSI Afterburner and got the same results, sometimes even better than the GDDR6X, so it's completely pointless to favor one over the other if you're looking for a 3060 Ti.

  • @sifarid4502
    @sifarid4502 1 year ago

    Hey RandomGaming, what camera do you use for your videos?

  • @shaneeslick
    @shaneeslick 1 year ago

    G'day Random,
    🤔 If these were still being made late last year, maybe it's not really a SKU upgrade/re-release but Nvidia just switching all models to GDDR6X VRAM.

  • @wandameadows5736
    @wandameadows5736 1 year ago

    I also noticed the X pulling 20-25 more watts.