NVIDIA 4060Ti Details... NVIDIA finally listening to consumers?

  • Published: Nov 9, 2024

Comments • 2.8K

  • @TheWoolyninja4
    @TheWoolyninja4 1 year ago +709

    I don't know, I still can't get past the 4070 Ti and 4070 gap. That blunder has really put a damper on the entire generation.

    • @DenverStarkey
      @DenverStarkey 1 year ago +34

      Actually the 4070 Ti and 4070 gap isn't that large: 4070 Ti = 3090, 4070 = 3080. That's just a 25% performance gap, and there have been other Tis that gapped that far from the base before. The real massive gap is between the 4090 and the 4080. While the 4090 is like 2.5x the performance of a 3090, the 4080 is only like 1.5-1.8x the performance of a 3090; basically the 4090 is about twice as fast as the 4080. That's an enormous gap compared to most generations, where the 90 tier is usually only 30% faster than the 80 tier.
      It also screams that Nvidia was planning on filling that gap with other $1,300+ card models.

    • @2hotflavored666
      @2hotflavored666 1 year ago +75

      You don't have to; just wait until the 5000 series and completely forget this generation existed, like people should've done from the start.

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago

      @@DenverStarkey And all for the same price (4070 vs. 3070), and only $100 more (4070 Ti vs. 3070 Ti) if you take Bidenflation into account.

    • @dhaumya23gango75
      @dhaumya23gango75 1 year ago +46

      @@DenverStarkey The 4090 is 30-35 percent faster than the 4080 in raw performance (which is still much bigger than the 10-15 percent gap between the 3080 and 3090), not twice as fast lol. Check benchmarks before spreading wrong info, bud. The 4090 is also around 70 percent (or 1.7x) faster than the 3090 (see the Hardware Unboxed review), not 150 percent faster or 2.5x like you said. The 4080 is 1.3x the raw perf of the 3090 in video games (or 30 percent faster), nowhere near 1.5-1.8x. I think you got on the 4090 hype train and made up a bunch of numbers in your head; it's not that much faster than the 4080.

    • @DETERMINOLOGY
      @DETERMINOLOGY 1 year ago +3

      @@DenverStarkey The same could be said about the 4070 Ti and 4080; the gap isn't large at all, but people say the 4080 is a 4K card.
      Funny how they say that when the 4080 only gives roughly 15-30 fps more or so, and less in some games. That's not a 4K card imo. The 50 series will prove that, and the number should be much higher.
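The "X% faster" and "2x as fast" claims traded in this thread all reduce to one ratio; a minimal sketch (the FPS figures below are illustrative placeholders, not benchmark data):

```python
def speedup_percent(fps_new, fps_old):
    """Return the 'X% faster' figure: (new / old - 1) * 100."""
    return (fps_new / fps_old - 1) * 100

# Illustrative numbers only: if card A renders 140 fps where card B renders 105,
# card A is ~33% faster. "Twice as fast" would require 210 fps, not 140.
print(f"{speedup_percent(140, 105):.0f}% faster")  # -> 33% faster
```

The common mistake the thread argues about is conflating "X% faster" (a ratio above 1) with "X times as fast" (the ratio itself).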

  • @abemassri5929
    @abemassri5929 1 year ago +306

    Nvidia should not expect praise for being fair to other parties (including customers); it is simply fulfilling its basic responsibility, and that should be the norm.

    • @zaphod4245
      @zaphod4245 1 year ago +18

      Idk, you kinda have to treat Nvidia like a child; positive affirmation when they do good is how you get them to keep doing it.

    • @okthen515
      @okthen515 1 year ago +7

      @@zaphod4245 I don't think that is how businesses work lol

    • @nophone9311
      @nophone9311 1 year ago

      @@zaphod4245 Fuck no. If a company screws you over 90% of the time but is fair 10% of the time, that doesn't mean you give them your money. You go elsewhere for the 90% of the time they try to nickel-and-dime you.

    • @T33K3SS3LCH3N
      @T33K3SS3LCH3N 1 year ago +8

      Why is this their "responsibility"? They're a business, not a public service provider.
      If your priority is fairly priced goods, advocate for nationalisation.
      The simple problem is that there isn't enough competition because GPUs are so advanced that it takes too much capital to become competitive, and Nvidia/AMD are just so good that it is very hard to keep up even if you make it that far.

    • @nophone9311
      @nophone9311 1 year ago

      @@T33K3SS3LCH3N >advocate for nationalisation
      You fanboys are so out of touch it is actually hilarious.

  • @akmarksman
    @akmarksman 1 year ago +328

    I think Intel's Arc A770 GPU made an impact on Nvidia's board of directors. 16GB GDDR6 through a 256-bit bus, for only $349? It's almost always sold out at Newegg, too. They may not be moving enough GPUs to change the tide, but it's causing a change in the landscape.

    • @darthwiizius
      @darthwiizius 1 year ago +44

      The ever-improving A770. Intel is doing quite a solid job getting their drivers going too, from what I see on tech channels.

    • @RevWolf1776
      @RevWolf1776 1 year ago +12

      Agreed, and I'm very interested to see how their second gen goes now that they have driver maturity.

    • @antonhei2443
      @antonhei2443 1 year ago +15

      Yep! I bought the Arc A770 FE as my DX12 card, coming from a 970 still on my older setup. It's fun, and I love the idea of supporting competition. 👽💾

    • @kcwilsonii
      @kcwilsonii 1 year ago +2

      Yeah, that 4060 is only a 128-bit bus.

    • @teardowndan5364
      @teardowndan5364 1 year ago +5

      I'm not liking how Intel says they will fully address their idle power issues in future generations. If I'm to buy one of those, I want that fixed first.
      Also, if a $100 premium to go from 8GB to 16GB on the RTX 4060 Ti is too steep, the $120 to go from 8GB to 16GB on the A770 is 20% worse. The 16GB A770 needs to come down at least $50.
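The bus widths quoted in this thread map to theoretical bandwidth by a simple formula: bytes per transfer (bus width / 8) times the memory data rate. A sketch, with data rates that are assumptions for illustration rather than spec-sheet values:

```python
def memory_bandwidth_gbps(bus_width_bits, data_rate_gtps):
    """Theoretical peak bandwidth in GB/s: (bus width in bytes) * data rate."""
    return bus_width_bits / 8 * data_rate_gtps

# Assumed data rates: ~16 GT/s GDDR6 on a 256-bit bus vs. 18 GT/s on 128-bit.
print(memory_bandwidth_gbps(256, 16))  # 512.0 GB/s (256-bit class card)
print(memory_bandwidth_gbps(128, 18))  # 288.0 GB/s (128-bit class card)
```

Even with faster memory chips, halving the bus width roughly halves peak bandwidth, which is why the 128-bit figure draws so much criticism in these comments.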

  • @rustler08
    @rustler08 1 year ago +170

    $499 for the 4060 Ti buys you a used, under-warranty 3080. NVIDIA is just hoping that people ignore outright performance benefits because it has more VRAM.
    It doesn't matter that it has more VRAM; that's just their way of distracting you from the fact that you're spending $500 on a 60-series card (which, spoiler alert, would actually have bought you a 3070). The 3060 Ti was $399. A couple dollars' more worth of VRAM chips doesn't make it worth $100 more.

    • @IdeasAreBulletproof
      @IdeasAreBulletproof 1 year ago +28

      It also buys you a 6800xt

    • @tavinorigami
      @tavinorigami 1 year ago +10

      I know I'm an outlier, but for me it's great because I'm training ML models.

    • @marcm.
      @marcm. 1 year ago +13

      And yet this isn't a 60-series card; it's actually a 50-series card. They're just unfairly renaming it to get your dollars, for a lot more than what it should cost.

    • @BadMenite
      @BadMenite 1 year ago +3

      @@marcm. We'll see once the actual benchmarks come out, but a supposed "50 series" wouldn't have a 20% performance uptick over the previous gen's 60 Ti. The 3050 is significantly slower than a 2060 Super, for example. The only kinda shitty thing going on here is charging $100 for more VRAM.

    • @PuppetMasterdaath144
      @PuppetMasterdaath144 1 year ago

      A 1080 Ti is ~3070 performance at $200. A used 6700 XT at $300 is even better. I'd buy the AMD card; the 3080 is $500 with minimal advantage.

  • @kathleendelcourt8136
    @kathleendelcourt8136 1 year ago +241

    Nvidia listened to their consumers and decided to give them the middle finger: "So you really want more VRAM? OK, we'll give you 8 more GB for 100 bucks." The 4060 at $300 is ok-ISH, the 4060 Ti 8GB is a joke and shouldn't exist, and the 4060 Ti 16GB for $500 is a scummy answer to the legitimate criticism regarding Nvidia's GPUs' overall lack of VRAM.

    • @germanmosca
      @germanmosca 1 year ago +5

      1: Do you actually know the cost of VRAM?
      2: What do you expect to be playing with a 4060 Ti where 8 GB of VRAM wouldn't be enough?
      I do know an answer to 2, btw, and that answer would be the only game where I can potentially run out of VRAM on my 2070 (which I use for 4K gaming and VR, btw...): VRChat. (I'm sure Neos VR and ChilloutVR have that potential as well.)

    • @kylobews7764
      @kylobews7764 1 year ago +17

      The Last of Us uses more than 8 GB of VRAM.

    • @GT86crazy
      @GT86crazy 1 year ago +19

      @@germanmosca If you follow the Moore's Law Is Dead channel, you'll know the answer to 1.
      For question 2: Hogwarts Legacy, The Last of Us, the Resident Evil 4 remake, and a couple of others. And this is at 1440p with ray tracing on.

    • @megapro125
      @megapro125 1 year ago +6

      @KiznaVision Apparently it's roughly $6 per GB when you buy in bulk like Nvidia does. So adding 8 gigs of extra VRAM should cost Nvidia ~$50, meaning the margin they already make on the 4060 Ti 8GB increases by another $50 on the $100-more-expensive 16GB model.

    • @justanobody4983
      @justanobody4983 1 year ago +5

      @@GT86crazy Dude, I played Hogwarts and RE4 on a 1660 Ti with 6GB. All the complaints I see are invalid, to tell you the truth. You can play those games on 6GB. You can't really play ultra, because you'll run out of fps before you run out of VRAM.
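The ~$6/GB figure cited in this thread is an unverified estimate, but taking it at face value, the bill-of-materials math behind the $100 upcharge is one line:

```python
# Rough bill-of-materials check (the $6/GB bulk price is an assumption from
# the comments above, not a confirmed NVIDIA cost).
price_per_gb = 6    # assumed bulk GDDR6 price, USD/GB
extra_gb = 8        # going from the 8GB to the 16GB SKU
upcharge = 100      # retail delta between the two SKUs, USD

bom_cost = price_per_gb * extra_gb
print(bom_cost, upcharge - bom_cost)  # 48 52 -> ~$52 of extra margin
```

Under that assumption, roughly half of the $100 premium would be memory cost and half additional margin, which is the point @megapro125 is making.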

  • @FSAPOJake
    @FSAPOJake 1 year ago +281

    2nd generation in a row where a 60-series card has more VRAM than the 70 series, all because Nvidia fumbled the bag with VRAM yet again and is scrambling to fix it. How does a company full of brilliant engineers keep fumbling the bag like this over and over?

    • @im_godness7990
      @im_godness7990 1 year ago +7

      Fr😭

    • @MrEdennnn
      @MrEdennnn 1 year ago +22

      Having that much profit coming in causes some serious delusion; they think they can do no wrong. The sad thing is that this fumbled bag won't really harm them that much, so we may not see much change in the future.

    • @pootmahgoots8482
      @pootmahgoots8482 1 year ago +19

      Money. They're stumbling over themselves with all the other SKUs they fumbled with due to greed and now they're like "Here, poors. Have access to a decently spec'd GPU." *Frantically sweeping their dumb shit under a rug.

    • @LOT9T
      @LOT9T 1 year ago +2

      Money makes the best go soft!

    • @Psyk0h
      @Psyk0h 1 year ago +28

      I highly doubt the engineers wanted it to be like this, which really sucks

  • @rickysargulesh1053
    @rickysargulesh1053 1 year ago +879

    No thanks Nvidia

    • @Trainwheel_Time
      @Trainwheel_Time 1 year ago +56

      Don't sweat it. I'll buy two. One for me and one for you. Cheers!

    • @wahahah
      @wahahah 1 year ago +27

      @@Trainwheel_Time I will buy 3: one for me and 2 for the next two people who won't buy.

    • @barryhoggle2354
      @barryhoggle2354 1 year ago +48

      2 times the fake frames😂

    • @AnimatorBlake
      @AnimatorBlake 1 year ago +2

      @@Trainwheel_Time Lmao do it

    • @darrenskjoelsvold
      @darrenskjoelsvold 1 year ago +7

      Gee, tell me what you really think. Hehe... well, I won't hate on Nvidia, but I still plan to wait for the 7700 XT and 7800 XT to come out before making any purchases, because I will get one of those.

  • @CreateNowSleepLater
    @CreateNowSleepLater 1 year ago +276

    Hopefully benchmarks will include the Intel A770 card. It will be nice to finally see an apples-to-apples comparison.

    • @hasnieking
      @hasnieking 1 year ago +29

      How strange that I'm considering getting an AMD CPU, and an Intel GPU.

    • @jakemcgowan7928
      @jakemcgowan7928 1 year ago +16

      @@hasnieking Just remember that the Intel GPUs are affordable for a reason. They're very risky to buy for the very long term, because it's not unlikely that Intel scraps the whole GPU project and as a result eventually stops updating drivers. That being said, they are very good value, and I really hope Intel doesn't abandon them.

    • @BNR_248
      @BNR_248 1 year ago +12

      @@jakemcgowan7928 This is true; however, I don't think Intel would stop driver development until the end of the GPUs' lifecycle.

    • @406Steven
      @406Steven 1 year ago +6

      @@BNR_248 They released a new BIOS for my 7th-generation NUC a couple of months ago; hopefully they support Arc as long as they do their other older products. I'm hoping AV1 stuff finally kicks off so my A380 can get put into full-time usage.

    • @DIYTech21
      @DIYTech21 1 year ago +5

      Waiting for the 3060s to go dirt cheap so I can play old titles at 1080p.

  • @AMTgrafix
    @AMTgrafix 1 year ago +163

    You can thank EVGA for the evolution of partnership fairness. Hopefully they can agree to come back, because nobody did it better…

    • @accelement3499
      @accelement3499 1 year ago +5

      EVGA won't make another Nvidia GPU, sadly.

    • @AC3handle
      @AC3handle 1 year ago +21

      the ONLY way EVGA would go back to making NVidia cards would be if they were allowed to make the kinds of cards they WANT, some of which would be extremely experimental. And quite honestly, I don't see that happening.

    • @D1craigRob
      @D1craigRob 1 year ago +7

      Partner fairness, but customer screwing. With no FE model to compete with, the AIBs won't be making any card even close to $499.

    • @accelement3499
      @accelement3499 1 year ago

      @@D1craigRob I bet there is a $499 AIB card; Nvidia will demand it. I can't guarantee it will be easy to get, but there will be one.

    • @JragonSoul
      @JragonSoul 1 year ago +2

      @@D1craigRob Realistically I can't see the 16GB models needing new coolers; a minor tweak MIGHT be needed if they add more memory modules, but it would make more sense to just double up the RAM chips.

  • @JETWTF
    @JETWTF 1 year ago +62

    VRAM isn't just a framebuffer for the game resolution; many people like using high-resolution textures, which get stored in memory too. You can quickly run out with 4K and 8K textures.

    • @backlogbuddies
      @backlogbuddies 1 year ago +5

      With Unreal Engine 4 & 5 you run out at 1080p. With UE5, even 720p is having issues with 8GB.

    • @marshallmcluhan33
      @marshallmcluhan33 1 year ago +2

      LLMs and other AI tech are also coming that will use CUDA/RTX with lots of VRAM for simulation and generation of content. VRAM is going to be even more important soon.

    • @LordOfNihil
      @LordOfNihil 1 year ago +3

      That's why video cards need so much memory, and why texture resolution settings usually have a small impact on performance (up until you exceed the memory capacity and get bogged down with PCIe bus traffic). Framebuffers are tiny compared to the amount of VRAM you have, even with only 8GB. I doubt anything up to 4K uses more than a GB of framebuffer.

    • @NotAnonymousNo80014
      @NotAnonymousNo80014 1 year ago +2

      Somehow 288 GB/s now equals 554 GB/s for your textures because of 32MB of cache :D Nvidia is the new church.

    • @backlogbuddies
      @backlogbuddies 1 year ago

      @@LordOfNihil UE4 and UE5 are massive hogs because the engine stores everything in VRAM. This is why so many games are "unoptimized messes". Unreal Engine natively does this, forcing simple games that should be fine on 4-6 GB of VRAM into requiring 10. A Unity project I made uses 4 GB of VRAM. I swapped it to Unreal, with as many effects off as possible to parity-match Unity, and I was lagging with an 8GB card.
      UE4/5 are the most-used engines now, and they need 12GB minimum plus lots of PCIe lanes.
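The point in this thread about framebuffers being tiny relative to texture memory is easy to check: a render target's size is just width × height × bytes per pixel, times the number of buffers in the swapchain. A quick sketch:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Swapchain size in MiB for an RGBA8 target, triple-buffered by default."""
    return width * height * bytes_per_pixel * buffers / 2**20

# Even a triple-buffered 4K swapchain is tiny next to 8 GB of VRAM:
print(f"{framebuffer_mib(3840, 2160):.0f} MiB")  # -> 95 MiB
```

Real engines add depth buffers and intermediate render targets on top of this, but the total stays far below a gigabyte at 4K; textures and geometry are what actually fill VRAM.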

  • @GOPACKERSJT
    @GOPACKERSJT 1 year ago +27

    Take me back to the Pascal price to performance days... I can't believe how expensive the 60 tier cards are now.

  • @edgyjorgensen3286
    @edgyjorgensen3286 1 year ago +280

    In relative terms I can't help but think that it's still named incorrectly. Based on projected performance it should really be a 4060 and not a 4060 Ti.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 1 year ago +60

      It's a 4050 Ti, as it should be around a 3070. So $199 would be fair.

    • @thepezfeo
      @thepezfeo 1 year ago +7

      Better lineups:
      Make the 12gb (192 bit) cards 4070 and 4060 Ti .... 8gb (128 bit) cards the 4060 and 4050 Ti. ~OR~
      Make the 12gb (192bit) cards 4070 Ti, 4070, and 10gb (160 bit) 4060 Ti .... 8gb (128 bit) 4060 and 4050 Ti.

    • @fabiusmaximuscunctator7390
      @fabiusmaximuscunctator7390 1 year ago +2

      @@thepezfeo This is what the lineup should have looked like. The 4070 is a true 4060 though, as it isn't any faster than the 3080.

    • @wherearemytesticles
      @wherearemytesticles 1 year ago +11

      What does it matter what it's called? The only thing that matters is price.

    • @Bigdog1787
      @Bigdog1787 1 year ago +6

      Ya, I don't understand how they can say there is not much more room for profit. Do they realize their current profit margin is super high? They could put the card at $200 and still make a real healthy profit.

  • @marcosvictor4935
    @marcosvictor4935 1 year ago +199

    Jay, it's 4 years later; for the same price it should be at least 2x the performance without frame gen to actually be a good thing. And against the 3060 Ti, the usual 30% gen-over-gen uplift is what would make it okay for the same price. (Okay, not good, because the 3060 Ti was overpriced in its time, and so is the 4060 Ti.)

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago +5

      More than 3x shader performance, 2x RT performance, more than 6x Tensor performance, and double the VRAM 😉 All for the same price as the 2060 Super, and a bit more than the 3060 Ti if you take Bidenflation into account.

    • @TippyToes476
      @TippyToes476 1 year ago +10

      I agree here. The 4060 Ti is only rumoured to perform on the level of the RTX 3070; it should have been on par with or better than the RTX 3070 Ti. The performance advancement for mainstream Ada isn't there. Also, if you justify paying $500 for a 128-bit bus, you are crazy.
      Also, Frame Generation should NOT be a selling point on these 40-series cards; as soon as FSR3 comes out, DLSS 3 will become irrelevant, because people will be switching FSR3 on with 20/30-series cards.
      So TL;DR: you are paying the same price for ~15% more performance, which is kind of a bad deal.

    • @HonkieWithaBoomstick
      @HonkieWithaBoomstick 1 year ago +1

      My last rig had a 1060 that I ran into the ground.
      I paid $550 for the 3060 Tis in my newer rigs, and I genuinely feel like the 60s are the best cards on the market in terms of price vs. performance. I didn't even consider the 40 series. I'm going to run these in my systems for the next 10 years and then probably switch to a 5060 Ti when the 6000 series releases, or whatever. Luckily, thanks to consoles, console hardware and game releases are always like 10 years behind what PCs can handle, so my system will feel cutting-edge for the next 10 years until they release the PS6, at which point I'll upgrade again. Also, thankfully, AAA devs are incompetent, so there's nary a release I actually feel is worth downloading or playing.
      By that point a new entry-level GPU (5060 or 6060) will likely be $1,500 (the price of my new rigs themselves), and the whole rig will likely cost $4K just with entry-level parts. That's assuming inflation doesn't collapse the US dollar by then, Nvidia even exists anymore, and I'm even in a financial state where PC gaming is still viable. I switched to PC maybe 5 years ago, and while we do enjoy QoL things like mods, if prices don't come to earth (which they won't) PC gaming might die just as it's really getting off the ground.
      PCs are having a harder and harder time competing with consoles because the bar of entry is getting set higher and higher. In 5 years we'll all have the choice of paying $800 for a PS6 or paying $2,500 for a new entry-level rig, with the more intensive builds reaching $5-10K on the regular.

    • @flame6466
      @flame6466 1 year ago +8

      @@Gay-is-_-trash What do these numbers even mean? You are just repeating Nvidia's arguments. In real performance it's more like just 1.5x the performance of the 2060 Super.

    • @Noob._gamer
      @Noob._gamer 1 year ago

      @@HonkieWithaBoomstick PC gaming is dead anyway; you can get a PS5 instead of trash.

  • @aggregatecrab2989
    @aggregatecrab2989 1 year ago +6

    How is this at all positive, Jay?
    The 4060 Ti should be a 12GB card at $399, and the 4070 should have had 16GB. The 4060 Ti is the same price as a 3060 Ti two years later with a whopping 10-15% performance increase (by NVIDIA's own slide). It had better be 60% faster than a 2060 Super; a 3070 is (look it up on TechPowerUp). $500 for the 16GB that will perform equal to a 3070, except with more VRAM... TWO YEARS LATER! The 4060 will apparently perform worse than a 3060 Ti according to, again, NVIDIA's own slides. NVIDIA's performance numbers also include ray tracing and DLSS 2, just not DLSS 3. Not even the extra RT/Tensor cores can save these cards. Skip them; they are pure trash.
    DLSS 3 is not a selling point on lower-tier cards. It only makes sense if you have a high enough frame rate to make up for the input delay in the first place. If you don't have the frame rate, it's useless. Steve from HWU, talking about this exact issue (DLSS 3 at a 30fps or lower starting point): "you will feel like you are playing while you're drunk". DLSS 3 is only useful in single-player titles while trying to take advantage of a high-refresh-rate monitor, again assuming you have a frame rate of 60+ in the first place.
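The input-delay argument above follows from frame-time arithmetic: frame generation raises the *presented* fps, but input is still sampled at the base rate, so felt latency tracks the base frame time (plus generation overhead). A sketch:

```python
def frame_time_ms(fps):
    """Milliseconds between frames at a given frame rate."""
    return 1000 / fps

# Generated frames don't carry new input, so a 30 fps base still *feels* like
# ~33 ms input intervals even if 60 fps is presented on screen:
print(f"{frame_time_ms(30):.1f} ms")  # 33.3 ms base -- sluggish input
print(f"{frame_time_ms(60):.1f} ms")  # 16.7 ms base -- much more usable
```

This is why the comment argues frame generation helps most when the base frame rate is already 60+: the latency floor it inherits is low enough not to be felt.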

  • @simo_486_3
    @simo_486_3 1 year ago +37

    RX 6700 (non-XT), 10GB VRAM. The most underappreciated card. It came out cheap during the miner wars and handles 1440p really well.

    • @kevincr19
      @kevincr19 1 year ago +1

      For 280 bucks it's a great deal.

    • @JonLake
      @JonLake 1 year ago +2

      Completely agree! I bought a 6700 XT from a reputable miner (in person, not online) for $250 when Ethereum crashed. It's been solid at 1440p, and even at 4K it does 75-100fps on medium/high.
      I probably won't upgrade for at least 2 years.

    • @RadarLeon
      @RadarLeon 1 year ago

      @@JonLake I love the 5700 XT price point. I've just been wanting to get the best performance I can; I keep staring at the 7900 XT..... I need it, I don't need it..... money.....
      My GPU is an R9 290X 8GB... and I know if I buy the 5700 XT I won't upgrade for another 4 years.

  • @chefcc90
    @chefcc90 1 year ago +19

    $500 for a 4060Ti... No thanks. I don't care how much VRAM it has

    • @limmel3588
      @limmel3588 1 year ago

      I got a 6900 XT for the same price on FB Marketplace. Nvidia has lost their mind.

    • @rhettbryan7520
      @rhettbryan7520 1 year ago +1

      @@limmel3588 The fact people just disregard this is insane to me. The amount of "Nvidia or nothing" people out there is sickening. I just ordered a 6800 XT.

  • @Baesic999
    @Baesic999 1 year ago +200

    Remember when Nvidia abandoned gamers for crypto miners? I think everyone should avoid the 40 series altogether if they can lol

    • @Capiosus
      @Capiosus 1 year ago +8

      they still make quality cards though, and they did kill mining afterwards with their gaming lineup of cards.

    • @Sylas6540
      @Sylas6540 1 year ago +7

      If you like using NVIDIA products then you gotta realize they need to make money to survive… Covid was hard on everyone, especially gaming. It's hard to fully blame them for chasing some money.

    • @mattgreenfield8038
      @mattgreenfield8038 1 year ago +5

      4090... too dominant to ignore

    • @ComputerProfessor
      @ComputerProfessor 1 year ago +21

      @@Sylas6540 They will lower their price if their cards don't sell.

    • @lat78610
      @lat78610 1 year ago +1

      You should avoid GPUs then.
      AMD ain't better.

  • @russb7
    @russb7 1 year ago +202

    The 16GB SKU is in response to Hardware Unboxed videos showing how some modern games are using more than 8GB even at lower-than-4K resolutions.

    • @BriBCG
      @BriBCG 1 year ago +43

      Yeah, and that problem is only going to keep getting worse; buying an 8GB card or even a 12GB card today is a bit of a mistake.

    • @brando3342
      @brando3342 1 year ago +52

      That's only because developers are not spending the time to optimize their games properly. Instead they'd rather save the money and depend on things like DLSS, FSR, and cards that have more disposable VRAM.
      People need to figure this out; we're getting scammed here.

    • @DeadPhoenix86DP
      @DeadPhoenix86DP 1 year ago +35

      @@BriBCG And within a few years even 24GB GPUs won't be enough. PC developers have gotten lazy.

    • @killersberg1
      @killersberg1 1 year ago +7

      @@brando3342 Use lower settings if your card doesn't have enough VRAM. 8GB was the standard for six years, and because of the new consoles the requirements are rising. This always happens in a cycle. New tech requires more VRAM, and if you don't have it you can lower the settings. That's what they are for: to optimise for your specific use case.

    • @MeatNinja
      @MeatNinja 1 year ago +11

      @@DeadPhoenix86DP It's not about devs getting lazy. It's about publishers cutting corners.

  • @Phate8263
    @Phate8263 1 year ago +11

    My 1060 6GB cost me $225 in 2016. An inflation calculator says that's ~$290 in April 2023. I've always been a PC gamer, but if this is the new persistent state of GPUs, it's hard to justify not just buying a console. Oh well, my 1060 will live on for a while yet. Maybe Intel or AMD will eventually get me with a reasonably priced card.

    • @tallpaul9475
      @tallpaul9475 1 year ago

      I understand your 1060 situation; it's the same with my 1080. It will live on until it dies, or until I really, actually have a need to replace it.
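The inflation adjustment above is internally consistent: $225 → ~$290 implies roughly 29% cumulative inflation over 2016-2023. The rate below is back-derived from the comment's own numbers, not official CPI data:

```python
# Cumulative-inflation sanity check using the figures quoted in the comment.
nominal_2016 = 225          # launch-era purchase price, USD
cumulative_inflation = 0.29 # assumed total 2016->2023 inflation (~29%)

adjusted = nominal_2016 * (1 + cumulative_inflation)
print(round(adjusted))  # -> 290
```

The takeaway the commenter is making: even after inflation, a 1060-class budget of ~$290 no longer buys a current 60-class card.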

  • @FlockofAngels
    @FlockofAngels 1 year ago +1

    I bought 2x 3090s and NO ONE, including Google, told me that NVLINK does not work on ANYTHING. I bought the NVLINK bridge, and Windows 11 can't use it in ANY of my 3D programs (including Blender). I needed 48GB of memory "pooled" for 3D models. Now NVIDIA puts out 8GB cards... These cards are absolute junk to me. I have all the CUDA cores I need; I need more video memory, and seeing Nvidia going in the opposite direction is really sad. I bought a Viking camp village environment model that I cannot even load on two 3090s, because it is too big and memory pooling does not work in any 3D app. NVLINK is a total gimmick, and seeing 8GB cards is, again, simply junk. Now I could buy an NVIDIA RTX A6000 for $4,000+, but what about my gaming as well? I was hoping to be able to do both my gaming and my 3D work on an RTX-level card. It would have been nice to have gotten a heads-up on the NVLINK scam, as if gaming is the only use for a graphics card... And when a 48GB 4090 does finally come out, it will probably be priced out of the sky... Who needs 200fps in a video game, really... especially when you might be streaming that video game online at 2400 kbps over a copper cable wire...

    • @FlockofAngels
      @FlockofAngels 1 year ago +1

      @🩸Vash🩸 I guess that is what I will have to do; my problem is the hype around NVLINK, which has no use whatsoever. We do depend on this info coming from the "reviewers", but I did not hear a peep about NVLINK pooling not working. Nor does Google really come out and say NVLINK memory pooling "is a TOTAL scam". I guess Microsoft and NVIDIA have some sort of war going on. Everyone finds this out after they upgrade their computers, buy 2x 3090 cards and NVLINK, and go to test it on their own. Windows says "memory is pooled", but it is not really pooled, and nothing works with this "pooled memory". Windows can't pool the memory because Microsoft has a proprietary "file paging" system that prevents memory pooling altogether. Linux has the paging system that Windows lacks, but none of the 3D programs for Linux are implemented to use it (including Blender). And who really wants to dual-boot Linux and Windows when file systems get all confused... The NVLINK is a total gimmick invented to get people to buy two cards, expensive motherboards, power supplies, RAM, and an $80 NVLINK, all to find out it doesn't work on ANYTHING 3D. Yes, I will have to wait for a 48GB card to come out... Thanks. In the meantime, Nvidia is putting out 8GB cards. It is hard not to be a little bit disappointed when you are thousands of dollars in the hole with cards that are useless for the purpose you bought them for.

  • @wtflolomg
    @wtflolomg 1 year ago +161

    These cards are still at least a tier too high. The "12GB 4080" was really a 4060 Ti. Nvidia shorted gamers because they gambled on crypto and had tons of Ampere stock to move (while competing with used crypto cards potentially flooding the market), so they combined the stacks. Then they doubled down on screwing gamers by keeping the stingy VRAM configurations as a means to differentiate gamer cards from the huge-margin server cards. Gamers were an afterthought for Nvidia in this generation.

    • @TheZROLimit
      @TheZROLimit 1 year ago +11

      Hell, even the 4090 should probably have been the 4080 Ti.

    • @Gay-is-_-trash
      @Gay-is-_-trash 1 year ago +8

      You don't take Bidenflation into account, or the price of new process nodes and R&D for new technology.

    • @2hotflavored666
      @2hotflavored666 1 year ago +14

      And the gamers still bought the cards. Meaning the gamers, by default, support Nvidia screwing them over. What a bunch of masochists.

    • @namegoeshere197
      @namegoeshere197 1 year ago

      @2HotFlavored Imagine having a choice. I needed something that can run my VR setup; the 4090 was my only option.

    • @nguyenkhue4021
      @nguyenkhue4021 1 year ago +1

      @@2hotflavored666 Bought what? In my country the GPUs are collecting dust; lots of people are selling second-hand 40-series cards with only 1 month of use at a $200 discount, yet they haven't sold.

  • @Ben-Rogue
    @Ben-Rogue 1 year ago +150

    $400 for 8GB? Lol! NVIDIA just pumping out the flops!

    • @Noob._gamer
      @Noob._gamer 1 year ago +3

      Get a PS5 instead of trash (PC). That shit (a 4090 GPU) can't get 4K 30 fps, but a PS5 can get 4K 60 to 120 fps for only $500, and PC gaming is dead anyway. Don't invest your money in trash.

    • @BlackJesus8463
      @BlackJesus8463 1 year ago +1

      ikr

    • @jayyh_01
      @jayyh_01 1 year ago +24

      @@Noob._gamer You're not that guy.

    • @zecikk_5651
      @zecikk_5651 1 year ago +14

      @@Noob._gamer The PS5 can't get 4K 60fps. More like 30.

    • @Frozoken
      @Frozoken 1 year ago +3

      Not the TFLOPS, I'll tell you that much. What a slap in the face to not only stagnate CUDA core counts but outright regress them, especially when the 4090 got 60% more than its older counterpart.

  • @StolenJoker84
    @StolenJoker84 1 year ago +23

    I think this is Nvidia doing something to appease the critics. If Nvidia wants to save face, they need to do exactly this across the board: bring prices down to normal and give us useful specs.
    IMO, this is a step in the right direction, but it's too late to save the reputation of the 40 series.

    • @aratosm
      @aratosm 1 year ago +5

      The 4090 has better specs (and likely performance) per cost than every single card below it. Nvidia this gen gave a massive middle finger to regular consumers.

    • @chengli1589
      @chengli1589 1 year ago +3

      ​@@aratosm That's why this gen actually warrants a rejection.

    • @aratosm
      @aratosm 1 year ago +1

      @@chengli1589 I agree. You're spot on, my friend.

  • @mashrien
    @mashrien 1 year ago +22

    What I want to know, specifically, is what this thing's performance is versus a flat 3070. I'm guessing it'll be about the same, maybe single-digit-percentage faster.
    Edit: Just did a bit of research, and it looks like 1440p RDR2 on a 3070 runs right about ~100FPS, whereas the 4060 Ti will be pushing about ~90FPS.
    Long story short, if you've already got a 3070 and don't care about DLSS 3, then skip this card and aim for a 4070... or a last-gen 3080 (maybe Ti) if you can find one used.
    If I were a betting man, I'd say the 3080 Ti will still be on par with a 4070 Ti, the only real performance difference being in RT and DLSS 3... And I don't care about either until the performance hit of ray tracing (without DLSS fake upscaling) is similar to that of anti-aliasing. When it's like a 10% performance hit, I'll be all for it. But right now, RT is 90% hype, just like bloom lighting was back in the Elder Scrolls: Oblivion days, and there aren't enough games that make functional use of it to bother with dishing out an extra $400 just so I can turn pretty-reflection eye candy on.

    • @potvinsuks8730
      @potvinsuks8730 Год назад +2

      With just a little searching here on YouTube, you will EASILY find your answer. Btw, the 4070 Ti is much faster than even a 3080 Ti.

    • @Izanagi-Arsene
      @Izanagi-Arsene Год назад +3

      It'd be best to skip the 3080 Ti, since it had a glaring issue that became apparent during the Diablo 4 beta.

    • @malazan6004
      @malazan6004 Год назад +1

      No, get a 4070 Ti; in many countries it's one of the better value options, and unlike the 4070 it's an actual nice upgrade for many 3000-series users. Yes, it's still expensive, but they are all expensive. The 4070 Ti hate was overblown.

    • @malazan6004
      @malazan6004 Год назад +1

      The 3080 Ti and 4070 Ti are NOT on par, and the gap keeps widening with driver releases. The 4070 Ti competes more with the 3090 and 3090 Ti. It's still expensive, but yeah.

    • @thejollysloth5743
      @thejollysloth5743 Год назад

      If you don't care about ray tracing and you are just gaming, the 6800 XT from AMD makes FAR MORE SENSE!
      It beats the 3080 at 1440p in a 50-game average, has 16GB of VRAM, and a larger memory bus than the new Nvidia card being released. And if you can't afford the 6800 XT, the 6800 non-XT also has 16GB; Hardware Unboxed compared it to the RTX 3070 in new releases, and it beats the 3070 AND the 3070 Ti in ray tracing even at 1080p Ultra with RT, because the 3070 and 3070 Ti both run out of VRAM at those settings!
      The 3080 10GB isn't even safe from future releases, since both new consoles have 16GB of VRAM. 12GB needs to be the MINIMUM, unless you are looking at budget cards like a 4050 or 4050 Ti (if they get released).
      Anything over $300 now should have 12GB.

  • @406Steven
    @406Steven Год назад +4

    I'm very happy about them keeping the pricing consistent. I think they should stick to pricing as we've been used to over the years and build a card appropriate to the price. The **90 or Titan or whatever halo product they have can, as always, be priced however ludicrously higher but give us an $80 upgrade from integrated, a $150 card which blows it out of the water, a $250 card that'll satisfy budget gamers with current AAA titles at moderate settings, $350-500 for midrange cards, then $500-700 for high-end stuff (again, the super high tier stuff like a 4090 can just cost whatever, like the old Intel Extreme Editions which gave you a little more performance for 3x the cost).

  • @Frozoken
    @Frozoken Год назад +53

    Yeah, Jay, I'm sure 2060 owners are jumping with joy to see a 15% performance uplift after 3 years, with no VRAM increase either.

    • @rare6499
      @rare6499 Год назад +15

      Jay is losing it

    • @Messiah666rc
      @Messiah666rc Год назад +7

      Wait, what video did you see? It's a 60% uplift vs a 2060, and 8GB vs 6 or 3...

    • @afauxican_american
      @afauxican_american Год назад +5

      2060 Super user since 2019 and I bought a 4070 FE basically when they dropped. It'll do exactly what I need it to do and the price was within my budget. No regrets so far.

    • @trickerdgaming
      @trickerdgaming Год назад +4

      @@afauxican_american I have a 4070 as well and it's an awesome card for 1440p, and the frame-generation hardware is really great too.

    • @magicaces13
      @magicaces13 Год назад

      ​@@Messiah666rc depends on the game and what it supports

  • @EveningOfficer
    @EveningOfficer Год назад +81

    Jay: It might be time to upgrade from your current 60 series card, like your 2060 super.
    Me and my GTX 760 2GB: Nah I don’t think so. My “60 series” card cost me $200 in 2015 🙃

    • @SomeOneOneOne
      @SomeOneOneOne Год назад +2

      Can't be true... I paid $260 for my RX 580 Nitro+ in 2017.

    • @Infinity_Ghost
      @Infinity_Ghost Год назад +2

      ​@@SomeOneOneOne not everyone lives in a country where the gpu market was affordable in the past...

    • @maxrobe
      @maxrobe Год назад +1

      An MSI GTX 970 cost £290 in 2015. I still use it. So with inflation, the 4060 doesn't look that bad. However, I can't afford to drop that sort of money nowadays (or, for that matter, on the rest of the kit needed to go around it).

    • @DGCastell
      @DGCastell Год назад +4

      you're a GOAT for keeping that 760 firing 🤙🏼

    • @username8644
      @username8644 Год назад +10

      I paid $250 for my 1070 Ti over 6 years ago. It still works perfectly today, overclocks just as high as it did new, and I have 8GB of VRAM, so I don't see the point of buying a new card 6+ years later that has the exact same amount of VRAM while costing way more.

  • @ComputerProfessor
    @ComputerProfessor Год назад +62

    Too much for an 8GB card.
    8GB should be reserved for the 4010, 4020, and 4030, assuming they get released, although I doubt it. Anything higher than that, or anything over $150, should have more than 8GB.

    • @mytestbrandkonto3040
      @mytestbrandkonto3040 Год назад

      I'm waiting for a 4050, a decent successor to the GTX 1650.

    • @bobbythomas6520
      @bobbythomas6520 Год назад +6

      A 4010-30 isn't a thing, lol; it would be a 4050, and probably a 2660 and a 2660 Ti. Those cards didn't offer DLSS or Tensor cores, so they can be cheaper.

    • @marcogenovesi8570
      @marcogenovesi8570 Год назад +1

      (laughing in Jensen Leather Jacket Man)

    • @rindviech8173
      @rindviech8173 Год назад

      @@mytestbrandkonto3040 From the spec leaks, the 4050 will be garbage pressed into the form of a GPU.

  • @jsnotlout3312
    @jsnotlout3312 Год назад +4

    Am I just out of touch? These prices are crazy. I spent $100 for a 5700 XT; sure, it was used, but it has the same VRAM count and pretty similar performance. Who is going to pay $399 for that? Let alone $499.

  • @DrMuFFinMan
    @DrMuFFinMan Год назад +28

    Also, there is a huge difference between "actual" memory bandwidth and "effective" bandwidth. Actual is a guarantee, whereas effective just means it could reach that figure; but will the game engine or program actually use it? I don't have any confidence in Nvidia at this point. Will it be better than the 3060 Ti? Most likely, but will it be better enough?

    • @nuffsaid7759
      @nuffsaid7759 Год назад +2

      Effective also means it's not going to work at 1440p and above.

    • @arketsjenkins5016
      @arketsjenkins5016 Год назад

      They cut down the RAM because it's expensive, and that's how they make money while selling you the functional minimum.
      They saw how AMD's Infinity Cache worked and copied it. They saw the potential to milk you guys.

    • @alexandersedore7720
      @alexandersedore7720 Год назад

      @@arketsjenkins5016 VRAM is literally $8 a GB in bulk; a quoted source from NVIDIA has said this to multiple people. VRAM isn't expensive; they are just trying to rip you off. 8 extra GB for $100 on a still-128-bit bus is insane.

  • @brianm.595
    @brianm.595 Год назад +46

    I can't say I have good feelings about it. For me it makes the whole stack not make sense. The 4070 and 4070 Ti have 12GB; it's weird to have a 4060 with 16GB. I know the bandwidth makes a difference, but the 4060 should have been 12GB, then the 4070 16GB, the 4080 24GB, and the 4090 32GB, or something along those lines. 8GB isn't enough VRAM and shouldn't be a spec in 2023. There are already games where 8GB just flat out isn't enough, because they are being made for consoles which have something like 16GB of "VRAM". Nobody should be shooting for 1080p in 2023; 1440p is accessible enough that it should be the standard. You know they are going to launch a 4050. What's that going to have? Are we going back 10 years to give it 4GB?

    • @evilofalfa
      @evilofalfa Год назад +1

      IKR

    • @BenedictPenguin
      @BenedictPenguin Год назад

      If Nvidia did that, they would pretty much be competing against themselves in the next generation. It makes sense for us, but not for them.

    • @MaethorDerien
      @MaethorDerien Год назад

      The 4090 or 4080 was never going to have that much RAM; that would massively cannibalize their professional GPUs, where they get a much larger markup. The A6000, for example, is pretty much a 3090 with 48GB of memory, but costs just under 7,000 dollars. So I wouldn't expect more than 24GB at the top end. That is probably a big part of why they don't put a lot of memory even on cards like the 4070 as well: they want to make those unattractive for workstation use. AMD can put more RAM on their cards because they don't really sell workstation GPUs, and probably never will until they get something like CUDA.

    • @DenverStarkey
      @DenverStarkey Год назад +5

      The consoles DO NOT HAVE 16GB of VRAM. They have 16GB of unified memory that is shared across the entire system. In some games, the GPU portion of the APU will likely only ever access between 10 and 11GB of video RAM; the reality is the average game will most often access less than 10GB, more like 9.5GB in the average case.

    • @David-yx3bd
      @David-yx3bd Год назад +6

      "Nobody should be shooting for 1080p in 2023." Dude, that's 90% of gamers.

  • @kewellhoo520
    @kewellhoo520 Год назад +7

    $499 for a 128-bit-bus "mainstream" card, with performance only 15% better than the predecessor? Uhhh, NOPE.
    Remember, the 3060 Ti had almost the same performance as the 2080 Super.

  • @brando3342
    @brando3342 Год назад +24

    Who cares… don’t buy their garbage for at least a few generations. They haven’t lost enough revenue yet.

    • @VeiLofCognition
      @VeiLofCognition Год назад

      I'm not happy about this video; I hope to god more people are thinking like you and I.

  • @KimBoKastekniv47
    @KimBoKastekniv47 Год назад +2

    So many bad takes:
    "Who needs 16GB?" - has Jay been living under a rock?
    "Just get the 8GB one" - $400 for 8GB in 2023?
    "Same price so that's good" - 2060S to 3060Ti jump is massive, 3060Ti to 4060Ti is 15%.

    • @atomicfrijole7542
      @atomicfrijole7542 Год назад

      Compare your extensive game library to the new games that utilize more than 8GB of VRAM. You'll see that there are very few games that need 16GB. 8GB is still fine, and in 3 years you'll want something better anyway.

  • @evenAndre
    @evenAndre Год назад +2

    I feel you are much too positive. This is really a 4050 Ti-ish model, so we are NOT getting a good deal on this. It is dirt cheap for Nvidia to make; I bet it costs them about $100 in pure production cost. But since AMD's gen-on-gen uplift is trash this time, they are training consumers to accept new price/performance levels.
    8GB at $400 is DOA, and a $100 increase for 8GB of extra VRAM is also DOA. I expect these cards to sell very poorly at these prices. Only the 4060 at $300 is somewhat OK, due to the new features, NOT raw performance.

  • @sedixmrboss5625
    @sedixmrboss5625 Год назад +25

    The 3060 Ti runs out of VRAM in Forza Horizon 5 at max settings at 1440p. That results in dropped frames for 5 minutes and then a crash; otherwise it would run just fine. So from my point of view, 16GB on a 1440p-capable card makes sense.

    • @sedixmrboss5625
      @sedixmrboss5625 Год назад +10

      1.15x the speed of a 3060 Ti is waaaay too little. WTF. W.T.F.

    • @KimBoKastekniv47
      @KimBoKastekniv47 Год назад +2

      @@sedixmrboss5625 Exactly, 3060 Ti was a massive 50% jump over the 2060 Super.

    • @kahlernygard809
      @kahlernygard809 Год назад

      @Sedi xMrBoss What's the power draw difference? The reduced heat on the 4060 Ti is a large step forward, and it has more overclocking headroom.

    • @vroomzoom4206
      @vroomzoom4206 Год назад +2

      ​@Sedi xMrBoss yeah Jay shilling tbh. These cards are trash.

    • @YuokoII
      @YuokoII Год назад +1

      I have a 3070, and in FH5 the VRAM message sometimes pops up, but it's never crashed because of it. Kinda weird.

  • @1stPersonFarmer
    @1stPersonFarmer Год назад +18

    Would love to see a video on the best-matched GPU for each CPU, for both AMD and Intel, to help viewers who are looking to build a PC.

    • @aftermax01
      @aftermax01 Год назад

      Well, intel already did

    • @T33K3SS3LCH3N
      @T33K3SS3LCH3N Год назад

      For most gamers, the current situation is that investing in a powerful CPU is pointless anyway. CPUs hit the point of being way overpowered for games a couple of generations ago.
      I upgraded to a 4090 when Cyberpunk Overdrive came out, and I was still GPU-bottlenecked in Cyberpunk, Total War: Warhammer 3, and even Satisfactory... with an i5-11600K. Typical peak loads were 100% GPU, 80% CPU. I don't recall a single time when the CPU was the bottleneck.
      The RTX 4060 will probably often be recommended with something like an i5-13600K, but I can already tell you that is CPU overkill. I switched to the 13600KF last week to get the jump to DDR5 done, already knowing it was a pure vanity upgrade. A 1500€ GPU vs a 300€ CPU: that's the state of the market.

  • @isaacpovey4031
    @isaacpovey4031 Год назад +15

    I know exactly what happened here: Nvidia was going to release the 4060 Ti with 8GB at the higher price, but they are scared of the upcoming AMD midrange cards that will have more VRAM, so they dropped the price to keep their launch date and pushed back the release of the 16GB, real 4060 Ti. I suspect we will see few 8GB cards once the initial stock runs out.

    • @TotallySlapdash
      @TotallySlapdash Год назад +9

      My guess is the other way around: the 8GB card is the only one being mass-produced, and the 16GB will effectively be vaporware. Either way, I'm not interested in paying $399-499 for a $249-299 card, so NV can go eat dirt.

    • @HFRG-zq1qm
      @HFRG-zq1qm Год назад +1

      Thing is, the Ti releases in each Nvidia GPU line ALWAYS have less memory than the standard model, at a higher memory bandwidth. So your assumption that the 16GB is "the real" 4060 Ti is WAY off base, I am certain. I would wager the standard 4060 will have 12GB, or perhaps 8GB, but at a lower bandwidth.

    • @slayerr4365
      @slayerr4365 Год назад +1

      @@TotallySlapdash Everyone called the 3060 Ti a great value card even at the 600+ dollar scalper price, but now the 4060 Ti is supposed to be a 250 dollar card? OK, bot.

  • @thematrixredpill
    @thematrixredpill Год назад +2

    Déjà vu. They pulled this crap with the first release of the 40 series, only showing DLSS 3 40-series numbers vs everything else. The fact that they've done it again vs the 3060 Ti means the performance is crap. They shifted the product stack; it's a 50-class card. Also, going down the cache route lowered the bus width and locked in stupid RAM allocations. All that on-die cache is useless if you run out of video RAM. If you got a 4070 Ti, you've gotta be pissed.

  • @rhoharane
    @rhoharane Год назад +1

    The "16GB doesn't belong on a 60-series card" argument is off.
    The whole VRAM controversy came from saying a 3070 can't handle certain games because of spillover from not having enough VRAM, and that it would have been worth the money, lasting you many years of new games at 1440p, if only it had more VRAM.
    And a 4060 Ti will be at least as fast as a 3070.
    So it absolutely makes sense for a 4060 Ti to have 12GB, if not 16GB.
    Reviewers hesitate to say it, but in a way they want to say it's dead in the water at 8GB. Many games will run at lowered settings whose quality won't be particularly prioritized by devs, because they will typically target a higher VRAM budget. Accommodating 8GB ends up being carelessly downscaled textures, because devs have a million other things to finish.
    If the 4060 Ti isn't at least as good as a 3070, then it's a disappointing card anyway.
    But we can't speculate on this stuff too much; we need actual test numbers.
    On the other hand, it feels like Nvidia wants to use the existence of the 8GB version to make it seem like they priced the 4060 Ti nicely, when some could consider the 16GB version the "real" version of the card. Adding 8 extra GB of GDDR6 doesn't cost them anywhere near $100. It's just like how MacBooks "start at" a certain price, but the base specs are so bad that you wouldn't actually buy them. (Once again, 8GB on a modern MacBook is not enough, and tests have shown that going up to 16GB makes a huge difference, both in speed and in how badly spillover hits the soldered-on SSD.)

  • @megatronopera
    @megatronopera Год назад +19

    Price still too high. No thanks.

    • @samson_the_great
      @samson_the_great Год назад

      bum lol

    • @megatronopera
      @megatronopera Год назад +1

      @@samson_the_great I have a 4080 laptop, and i got a good deal on it. Price on desktop cards is atrocious.

    • @samson_the_great
      @samson_the_great Год назад +1

      @@megatronopera are you kidding me? Laptops are extremely overpriced for what you get, but you complain about 500 bucks for a gpu?

    • @SeventhCircle77
      @SeventhCircle77 Год назад

      @@samson_the_great$500 for a $350 card lmfao. Get a 6800 or 6800xt for $500 and get 16gb and way better performance.

    • @samson_the_great
      @samson_the_great Год назад

      @@SeventhCircle77 they don't have any of the good features the rtx series has.

  • @DrNooberious
    @DrNooberious Год назад +29

    I am very skeptical about the low memory throughput of the 4060 TI.

    • @dphidt
      @dphidt Год назад +1

      The 8x L2 cache size makes the bandwidth difference a non-issue under most circumstances. Especially if the memory controller has a decent prefetch mechanism. I’ve been designing algorithms for high performance embedded computing for 30+ years, and cache management is where the majority of performance gains occur.
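
      The back-of-the-envelope arithmetic behind that claim can be sketched with a simple weighted-average model. The hit rates and the on-die L2 bandwidth below are illustrative guesses, not NVIDIA's published figures:

```python
# Simple average-bandwidth model: a larger L2 raises the hit rate,
# so fewer requests ever touch the narrow 128-bit GDDR6 bus.
def effective_bandwidth(mem_bw_gbs: float, l2_bw_gbs: float, hit_rate: float) -> float:
    """Weighted average of L2 and DRAM bandwidth by cache hit rate."""
    return hit_rate * l2_bw_gbs + (1 - hit_rate) * mem_bw_gbs

dram = 288.0   # 4060 Ti's actual GDDR6 bandwidth, GB/s
l2 = 2000.0    # hypothetical on-die L2 bandwidth, GB/s (assumption)

for hit in (0.3, 0.6, 0.9):
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth(dram, l2, hit):.0f} GB/s effective")
```

      The model also shows where the argument breaks down: when the working set spills past the L2 (hit rate falls), effective bandwidth collapses back toward the raw 288 GB/s, which is the point the skeptics in this thread are making.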

    • @Gay-is-_-trash
      @Gay-is-_-trash Год назад

      Ok, red fanboy. Nvidia made Infinity Cache, but better. How do you feel about it now?

    • @DETERMINOLOGY
      @DETERMINOLOGY Год назад

      For 1080p it won't be a major issue. The 4060 Ti is definitely a pure 1080p card, so it will do just fine.

    • @PendulumCancel
      @PendulumCancel Год назад

      I wouldn't be if the 16GB version were $400, but at $500, you and every midrange-and-below PC gamer should be skeptical. Nvidia could have doubled the memory bus to 256-bit as well, but then that would invalidate the existence of the $600 4070, and we can't have that, lol.

  • @Tony_Calvert
    @Tony_Calvert Год назад +4

    This card is DOA, even the 16GB version. The 16GB version is the only one that should exist, and it should cost $399.

  • @MET3
    @MET3 Год назад +7

    One reason I really want to pay more for the 16GB version is so I can experiment with some LLMs for local AI. 16GB is perfect for local testing. Good bang for the buck if you're interested in AI.

    • @ZenAndPsychedelicHealingCenter
      @ZenAndPsychedelicHealingCenter Год назад +1

      It's way overpriced. Stop enabling corporations shafting customers like this.

    • @frontrangejrs
      @frontrangejrs Год назад

      @Spooderderg Yeah, this will be good for productivity ONLY. I was looking forward to this, but based on the benchmarks already out for the 8GB version, which you can buy on Newegg, the 16GB has the same specs, so it will only benefit when 8GB is full. Might as well get a 3070 Ti with its GDDR6X VRAM. Of course, if you're not doing productivity, you're better off just getting a used RX 6000-series card for the money, since the lack of VRAM in select games negates Nvidia's ray tracing advantage now. NVIDIA really sucks this generation. AMD sucks too for not releasing anything with the performance and efficiency of the non-XT 6800 this generation, but they don't have to because of Nvidia's fumble; AMD's own pitch to shareholders is for customers to just go buy discounted 6000-series cards...

    • @Tomogeny
      @Tomogeny Год назад

      Eh, the RTX 3060 has 12GB of VRAM and is doing a killer job at AI stuff for me. The 4060 will come out at the same time as the 16GB 4060 Ti, so you might as well go for the 4060 then (assuming it'll have 12GB+ of VRAM) if you're talking value. Grab an extra coffee for the 20% longer compute times. Also, let's not talk productivity with a 128-bit bus; this thing is actually worse than the 3060 Ti at most productivity tasks. It's a complete screw-you to gamers and an even bigger offense to people who use it for productivity. There's really no scenario in which buying this piece of rubbish is warranted, except if you only play Cyberpunk with DLSS 3.

  • @ABaumstumpf
    @ABaumstumpf Год назад +2

    The memory bandwidth... that is not how that works. Sure, a bigger L2 is really nice, but it is absolutely not the same as having more bandwidth. Depending on the game it could be way better, or way worse: for large textures that are in constant use but never need much access, memory bandwidth is king; for small detail textures that are reused dozens or hundreds of times, cache is king.

  • @soapa4279
    @soapa4279 Год назад +7

    Nvidia put out an explanation about the VRAM and L2 cache. While it makes theoretical sense, it doesn't change the fact that some game engines still want to store more than 8GB in VRAM (at particular settings, like RT on or 4K).
    It would be interesting to see the 4060 vs the 3070 in the games that suffered (RE4 and Hogwarts come to mind), to see if their whole thing about L2 cache actually works.
    (And I know these are marketed as 1080p cards, but for the sake of knowledge it would be interesting to see.)

    • @ABaumstumpf
      @ABaumstumpf Год назад +1

      Yeah, but most of the games that would take advantage of more VRAM also need more performance, performance that these cards don't have.
      And then we have the handful of broken games where modders have fixed the memory problems that the lazy publishers caused.

    • @mromutt
      @mromutt Год назад +1

      I don't know how they can try to call it a 1080p card; that's like saying the 2080 is a 1080p card, or the 3070 is a 1080p card.

    • @Chrisp707-
      @Chrisp707- Год назад +1

      Actually the TI is a 1440P card soooo

    • @rare6499
      @rare6499 Год назад

      L2 cache won't save these cards when they run into heavy VRAM requirements.

    • @mromutt
      @mromutt Год назад

      @@varitytopic4440 I was actually wondering how it would negotiate that: would it report to a Gen 3 slot as x16-capable, or just stupidly still say x8 even though it can run at Gen 3 x16 speeds (Gen 4 x8)?

  • @Ms.Fowlbwahhh
    @Ms.Fowlbwahhh Год назад +56

    About your mention of VRAM and 1080p: MSI Afterburner says I've used 12GB of VRAM in RDR2 maxed out at 1080p sometimes. Afterburner could be lying though, lol.

    • @dauntae24
      @dauntae24 Год назад +7

      I’m playing Wolfenstein New Colossus on ultra/uber with a 4070ti at 1080p and I keep getting a message about running out of vram.

    • @billbollinger3748
      @billbollinger3748 Год назад

      Same; there are many games that hunger for VRAM. You can play them with only 4GB of VRAM, but in reality, I am sure most of them could take a chunk of a 16GB option and use it well.

    • @PyromancerRift
      @PyromancerRift Год назад +4

      It's RAM allocation. I have played the game in 4K with a 3080 10GB and I had no stutters.

    • @andersbjorkman8666
      @andersbjorkman8666 Год назад

      I run everything on ultra in RDR2 and never go over 6.5GB of VRAM usage (a newer GDDR6X Nvidia card). Something some videos have demonstrated, though, is that AMD cards will, in some games, use more VRAM than Nvidia cards. Are you on an AMD card?

    • @duncanidaho5834
      @duncanidaho5834 Год назад +2

      @@dauntae24 You need a new monitor - 4070ti at 1080p is a big L

  • @siliconlegion6406
    @siliconlegion6406 Год назад +5

    Love the videos. You have inspired me to go into the field of electronics; I am currently going for a degree in computer engineering, and whenever I have a bad day or a hard class, your videos inspire me to keep striving to be the best that I can.

  • @aratosm
    @aratosm Год назад +1

    I remember when GPU companies would "optimize" performance in ways that created artifacts, and reviewers viewed it as cheating for higher benchmarks. Oh, how times have changed. Now artifacts are a feature, since there is next to zero gen-to-gen improvement. 3 years, and almost zero raw performance improvement. Crazy.

  • @AwankO
    @AwankO Год назад +1

    No amount of VRAM could satisfy me with the cut-down bus and reduced bandwidth (6:45 - 9:45), along with the lower overall specs.

  • @chosen1one930
    @chosen1one930 Год назад +11

    It would be interesting to see Nvidia throwing AIB partners more bones; they really need to let them do more.

  • @katsaras1
    @katsaras1 Год назад +4

    Nothing about the 4060, 4060 Ti 8GB, or 4060 Ti 16GB is good. The money-to-performance ratio and the pricing for what is an entry-level card are simply silly and ridiculous. Staying with my AMD 6800 XT for the next 4-5 years.

  • @Brad25King
    @Brad25King Год назад +26

    I don't think it's worth the extra $100, but Jay, you haven't said how much more the extra 8GB of VRAM should cost. If the 8GB 4060 Ti were $350 and the 16GB $425, I would feel it makes a lot more sense. Perhaps $400 and $465. Frame generation is DLSS 3 only and, for me personally, doesn't look good in many of the few games that support it.
    And yay, 32MB of cache. I would love to see some high-res texture pack testing of the 3060 Ti and the 8GB 4060 Ti to test those 'extra cache hits make half the memory speed meaningless' arguments...

    • @blacknemesy
      @blacknemesy Год назад +2

      You don't like frame generation because it "doesn't look good in many of the few games that support it"? That makes no sense at all

    • @antoniohagopian213
      @antoniohagopian213 Год назад +4

      Just like Reflex, DLSS 2 looks horrendous in most games that use it. The only game with great DLSS that I know of is Crossout.

    • @mchonkler7225
      @mchonkler7225 Год назад +4

      VRAM costs $4-6 per GB. 8 extra GB should be no more than a $50 increase.
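
      The arithmetic behind that reply works out like this (the $4-6/GB figure is the commenter's claim, not a confirmed supplier price):

```python
# Rough bill-of-materials math for the claimed $4-6/GB GDDR6 spot price.
cost_per_gb_low, cost_per_gb_high = 4, 6   # claimed $/GB range
extra_gb = 8                               # 8GB -> 16GB upgrade

low = extra_gb * cost_per_gb_low           # -> $32
high = extra_gb * cost_per_gb_high         # -> $48
markup = 100 / high                        # Nvidia's $100 upcharge vs the high estimate

print(f"extra VRAM BOM cost: ${low}-${high}")
print(f"$100 upcharge is ~{markup:.1f}x the high estimate")
```

      Even at the high end of the claimed range, the $100 price gap is roughly double the memory's bill-of-materials cost, before accounting for PCB changes, which is the point the replies below debate.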

    • @kolle128
      @kolle128 Год назад +2

      @@mchonkler7225 Kind of. You also need space on the PCB for the chips. But I also think that +$50 would be fair, not +$100.

    • @sysbofh
      @sysbofh Год назад +3

      @@kolle128 You need space on the PCB only if the memory chips are already at the max size. If there is the option to just put in one bigger chip... it's just like system memory: we can buy 2x8 or 2x16 kits.
      Also, it's far cheaper to do this when they are creating the PCB: more than 90% of the cost (PCB, cooler, voltage regulator, GPU, etc.) is already sunk.

  • @EmblemParade
    @EmblemParade Год назад +1

    Jay, it's not an "ideology" that 8GB is not enough. It's proven that even at 1080p, 8GB can hamper your ability to use high settings in newer games. It's simply not a good idea to invest in just 8GB in 2023.

  • @marctech1996
    @marctech1996 Год назад +1

    Bro, what kind of reaction is that? We aren't in the shortage anymore. Plenty of consumers have shown that they are willing to wait out a gen or two if performance gains are this minimal. How are you recommending an 8GB graphics card to anyone in 2023, when you could get a PS5 with 12-16GB back in late 2020? No wonder so many people are going with consoles instead, especially if that's the kind of coverage you get from a huge PC channel.

  • @ilhanbayramoglu2811
    @ilhanbayramoglu2811 Год назад +4

    Another dumb move that makes the whole situation worse. These things should have been thought through. Easy segmentation; hear me out:
    4060/60Ti -> 12 GB (192 bit)
    4070/70Ti -> 16 GB (256 bit as always)
    4080/80Ti -> 20 GB (320 bit)
    4090/90Ti -> 24 GB (384 bit as always)

  • @tech339
    @tech339 Год назад +7

    16GB would be better for hobbyists and learning 3D artists rendering on the GPU, for loading large scenes into GPU memory. I have been waiting for a 60-tier card with more VRAM because I don't have the budget for any model above that. As I am still learning and not earning from my work, it would be a boon. I have been limited to small scenes until now, as I have a GTX 1660 Super with 6GB of VRAM, and I would try to get the 16GB one. Hope in the future you will start looking into the 3D rendering aspect of consumer GPUs. Thanks for all the great information you provide; everything I know about PCs is because of you, Linus, both Steves, Paul, and also Kyle.

  • @demontekdigital1704
    @demontekdigital1704 Год назад +5

    I'm just waiting for the time when parts stop shipping with some sort of catastrophic manufacturer failure or controversy. I personally think $329 would be the perfect price point for these cards too. I mean, it's still expensive and would still maybe require some saving to buy one, but saving cash from a couple of paychecks is a hell of a lot better than buying a GPU for the same price as a used car or a month's worth of rent.

    • @jamesgodfrey1322
      @jamesgodfrey1322 Год назад

      so true

    • @disguiseddv8ant486
      @disguiseddv8ant486 Год назад

      This new generation of PC hardware isn't for you. So stop complaining about the prices when you're living check to check and not planning on buying anyway.

    • @demontekdigital1704
      @demontekdigital1704 Год назад

      @@disguiseddv8ant486 I didn't ask you, nor do I care.

    • @disguiseddv8ant486
      @disguiseddv8ant486 Год назад

      @@demontekdigital1704 Actually you're speaking to anyone/everyone that's reading your public post. And if you didn't care you wouldn't have replied to me. But this is the behavioral pattern of a complaining individual who doesn't like the truth and wanting others to feed into what you are only saying. Either get your money up or stay in your lane with the Nvidia GT730.

    • @jamesgodfrey1322
      @jamesgodfrey1322 Год назад

      @@disguiseddv8ant486 I'm part of the group that sees this price as being on the greedy side for an 8GB GPU, one that runs the risk of not lasting given that games are becoming more VRAM-hungry.
      As for me, I've always been a fan of last-generation hardware: the price is better, all the nasty bugs have been found, and the drivers are sorted.

  • @ImaITman
    @ImaITman Год назад +2

    No, Nvidia's not listening to their consumers. They launched a $400 card for $500, and then they launched a steaming pile of garbage at $400. Jay, don't buy into the bullshit... please, for the love of God, we can't lose you too.

  • @Chrisp707-
    @Chrisp707- Год назад +1

    The 16GB should be the only version. Paying $100 extra just for 8GB more is fucking ludicrous.

  • @grimdicer152
    @grimdicer152 Год назад +14

    R9 390 - 8GB for $330 in 2015
    RTX 4060 Ti - 8GB for $400 in 2023
    Really shows you how greed has become the standard in the GPU market.

    • @dusty_reaper96
      @dusty_reaper96 Год назад +1

      You forget the 1080 Ti had 11 gigs too.

    • @bubba966
      @bubba966 Год назад

      Adjust that $330 for inflation and it's $422.37 today
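
      That adjustment is just a CPI ratio. The sketch below uses approximate annual-average CPI-U values, so it lands a couple of dollars off the reply's $422.37 (which depends on exactly which months are compared):

```python
# Inflation adjustment via the Consumer Price Index ratio:
# price_then * (CPI_now / CPI_then).
cpi_2015 = 237.0   # approximate 2015 annual-average CPI-U
cpi_2023 = 304.7   # approximate 2023 annual-average CPI-U

price_2015 = 330.0  # R9 390 launch price from the comment above
adjusted = price_2015 * cpi_2023 / cpi_2015

print(f"${price_2015:.0f} in 2015 is roughly ${adjusted:.2f} in 2023 dollars")
```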

    • @AHawksDive
      @AHawksDive Год назад +1

      @@bubba966 While you're not wrong, he's talking about a high-end card (for 2015) and a low-end to mid range card now.

    • @AHawksDive
      @AHawksDive Год назад

      @HumanBeing Nvidia's 70- and 80-series cards were considered high end at the time; you said yourself it struggled to beat them, so they are comparable.
      I also never stated it was the fastest card...

  • @milesdyson7429
    @milesdyson7429 Год назад +21

    The 8GB cards might seem like decent value, but they only perform 10-15% better than their last-gen equivalents (without DLSS 3). After almost two and a half years, that's not great, especially when performance is going to collapse above 1080p because of the narrow bus width.

    • @Lucromis
      @Lucromis Год назад +5

      I was thinking the same thing. 10 percent better than last gen is pathetic; that's nearly the difference overclocking used to make on the same model.

    • @香噴噴-p6f
      @香噴噴-p6f Год назад +1

      But for ITX builders the lower power consumption is a nice thing, since they can get something like the MSI Aero 4060 and deal with less heat

    • @slayerr4365
      @slayerr4365 Год назад +3

      But for the many, many people still sitting on a 2060 or even a 1060 who were thinking about getting a 3060 Ti, now they can get a 4060 Ti for the exact same price, so I'm happy for those guys. Not everyone is looking to upgrade every single generation, champ :)

    • @arcace1
      @arcace1 Год назад +1

      Performance of the 8GB cards is going to die if you run out of vram.

    • @jesusbarrera6916
      @jesusbarrera6916 Год назад +1

      @@slayerr4365 or get a similarly priced 6800 that stomps even the 3070 Ti..... stop being Nvidia's doormats, you guys

  • @KevinBein
    @KevinBein Год назад +12

    I can see the 16gb model being useful for creators. The additional VRAM is useful for many applications not just gaming.

    • @Wurstbrot03
      @Wurstbrot03 Год назад

      This. But since 3D sculpting + rendering is just a hobby of mine, I will not pay an extra 100 bucks for 16GB of VRAM. Other than 3D art, I play mostly indie games that don't need a powerful card. I'll probably settle for a normal 4060, since the power consumption seems to be so low that I can keep my current be quiet! PSU. A 4060 with 12GB would be ideal for me, but I guess that's not an option.

    • @VoldoronGaming
      @VoldoronGaming Год назад

      Creators should buy a Lovelace Quadro, since the Quadro driver is application-tested and certified, and it also unlocks some features on the card that aren't useful to gamers.

    • @Lxcx311
      @Lxcx311 Год назад +3

      ​@@VoldoronGaming not every creator is a professional creator. Most people do it as a hobby and also play games with their PC

    • @funi1083
      @funi1083 Год назад +3

      @@VoldoronGaming quadro is stupid expensive compared to mass market rtx models

    • @a-alex-p7033
      @a-alex-p7033 Год назад

      @@Wurstbrot03 you could wait for amd options too, unless you must have Nvidia. 6700xt has 12gb so maybe the new 7700xt will have 12

  • @tyrannicpuppy
    @tyrannicpuppy Год назад +1

    I get that maybe the tech channels I watch just don't care about these factors, but it's kind of frustrating only ever hearing a card's worth rated by "Will it make games faster?" And if it doesn't, the conclusion is don't bother, go for the cheaper model. Looking at those specs, that 16GB card looks like it will be a monster for low-cost 3D render hobbyists. It's WAY cheaper than even the last-gen big boy cards, has enough VRAM to hold a bunch of assets ready for render, and you get all the benefits of the new-gen architecture. I can definitely think of a bunch of people I chat with in that group regularly who will probably be thrilled to see that larger buffer card available, regardless of whether the VRAM helps their games, which will still be getting a boost as well.

  • @basmus
    @basmus Год назад +4

    I love my 3060 Ti Founders Edition. I'm happy to wait for the 5000 series, or whatever weird naming scheme Nvidia comes up with next

  • @IAlbert93I
    @IAlbert93I Год назад +12

    I think they added the 4060 Ti 16GB later in response to consumers; that's why they don't plan a Founders Edition.
    PS: if you consider inflation, dollars per performance gets even better (it can be difficult to analyze correctly, but we've got a big inflation spike atm)

    • @HFRG-zq1qm
      @HFRG-zq1qm Год назад +1

      The current inflation spike is total BS. The inflation over the last three years is 500% of the amount of inflation of the prior three decades put together.

    • @Archer_Legend
      @Archer_Legend Год назад +2

      Well in Europe the 3070 had a 519 euros MSRP, that crap of the 4060Ti has an MSRP of 549 euros, that card is a scam here.

    • @IAlbert93I
      @IAlbert93I Год назад

      @@Archer_Legend yeah i guess we'll see real prices soon

    • @Archer_Legend
      @Archer_Legend Год назад

      @@IAlbert93I However, I do not understand how Jay can be positive about these cards; even in the US they are bad.

  • @pdamasco
    @pdamasco Год назад +7

    Honestly I hope mid-generation they do make another 4070 SKU. Maybe with a slightly different configuration and 16GB of ram as well. They can just retire the 12GB options and I know consumers would be very happy with that decision. As long as the pricing was a VERY SMALL increase basically to cover the extra VRAM. If they do it, then at least we know NVIDIA is finally somewhat listening to the market and adapting, rather than just raising prices and re-upping the higher end options. Mainstream is important too!!
    Also I wonder what they will do with the RTX 4050 SKUs. I really think that is the only SKU that should have 10 or 12GB of VRAM. That's the low-end SKU. I really think all of the other SKUs should have 14GB or more up the stack. Maybe they will target those numbers with the 50 series or a 40 series refresh. It is a real shame the 30 series never got those rumored 16GB SKUs. I have a feeling NVIDIA knew people would hold onto those 3080 16GB cards just like the 1080 and 1080 Ti owners. They didn't want a bunch of people skipping multiple generations of upgrades again, but to be honest, they shouldn't care about that. To me, if you make a great product and it lasts 3-5 years that is a POSITIVE thing. Why? Because we go back and buy another NVIDIA product. We all know NVIDIA doesn't make the majority of its money from us gamers anymore, but customer loyalty is important to every company. Making good products at good prices that last is how all companies retain customers. I will always buy the card that I think will last longer for the same money. AMD has a history of releasing better specs for less money and supporting them for a long time. Why shouldn't I get 5 years out of an RTX 4090, for example???

    • @potvinsuks8730
      @potvinsuks8730 Год назад

      So I suppose, you did not purchase a 4070 then lol?

    • @malazan6004
      @malazan6004 Год назад

      While I think 12GB is fine for the 4070 Ti going forward, especially for 1440p and some 4K games, it should have been 14 to 16. I doubt there will be another SKU though; people are buying these, so yeah

  • @mikeal425
    @mikeal425 Год назад

    Thank you for explaining the memory bandwidth. When I saw such low memory bandwidth it was a no-go for me, but your explanation of how the memory bus works really helped
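The bus-width math behind that explanation is simple; a minimal sketch (the 256-bit/14 Gbps and 128-bit/18 Gbps figures are the public spec-sheet numbers for the 3060 Ti and 4060 Ti):

```python
# Peak memory bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 14))  # 3060 Ti: 448.0 GB/s
print(bandwidth_gb_s(128, 18))  # 4060 Ti: 288.0 GB/s (Nvidia leans on the much larger L2 cache to offset this)
```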

  • @duckilythelovely3040
    @duckilythelovely3040 Год назад +2

    I work at Micro Center; we got 9 4060 Tis delivered to us.
    It'll be interesting to see how the launch goes.

    • @flimermithrandir
      @flimermithrandir Год назад +1

      Let me guess. At first it will sell, and 10 days later you get at least half of them back again.

    • @duckilythelovely3040
      @duckilythelovely3040 Год назад +2

      @@flimermithrandir You aren't wrong hahaha. Happens every time.
      These are cheaper, so it'll be interesting. The cheap ones tend to not be returned. Our returns consist mostly of 4090s, 7900 XTXs, 7900 XTs and AMD 6000 series cards. (Surprise surprise)

  • @maniacos2801
    @maniacos2801 Год назад +13

    16GB - for running Stable Diffusion locally, and also for rendering bigger scenes with HD textures. With SD implemented in Blender, I can see a lot of use for a 16GB card. At this price point of $500, this could be a valid upgrade for some hobbyist 3D artists.

    • @darthgregor9232
      @darthgregor9232 Год назад +1

      If you needed 16GB you would have bought a 6800, 6800 XT or 6900 XT by now. Actually, in Germany you can get 6900 XT and 6950 XT models for the same price the 4060 Ti 16GB will be released at...
      But yeah, I totally agree that 16GB makes total sense. It should have been the case for the 4070 already.

    • @artemis908k
      @artemis908k Год назад +1

      @@darthgregor9232 AI software like stable diffusion has pretty poor support on amd gpus. You lose certain features and generally have worse performance when you can actually get it running

    • @dominikbeitat4450
      @dominikbeitat4450 Год назад

      It's just the hobbyists willing to upgrade their 2080 who still get the shaft. At that price it's a 4070 Ti with 12GB, and I just can't when 16GB cards are a thing.

    • @maniacos2801
      @maniacos2801 Год назад

      @@darthgregor9232 ROCm is sadly not as far along as CUDA. PyTorch shows significantly worse performance with ROCm than with original CUDA. Until AMD manages to get ROCm on par with CUDA, ML users are stuck with Nvidia.

  • @freedombyfire573
    @freedombyfire573 Год назад +6

    My 7-year-old 1060 (6GB) is still running strong for 1080p gaming. It's crazy to think that you only get 2GB more after 7 years!!! I hoped 4K gaming would be affordable after all this time. Guess I'll have to wait another generation or two and switch to Intel or AMD by then... or just keep playing Age of Empires games till the end of time, which will most likely be more fun and a lot cheaper than paying these crazy money-grab inflated prices.

  • @civil_leuthie
    @civil_leuthie Год назад +2

    'Ti' should have remained their name for the improved version of the vanilla card released a few months later. Releasing the 'Ti' version first creates the problems they're having now. This should be just the 4060. The 4070 Ti should have been the 4070. Someone at Nvidia messed up and keeps messing up.

    • @jorgiotoldo8636
      @jorgiotoldo8636 Год назад

      The biggest Nvidia scam is their laptop GPUs.
      A 3080 laptop GPU is barely a 3060 desktop GPU in performance

  • @3-valdiondreemur564
    @3-valdiondreemur564 Год назад +1

    This card is just $150 too expensive for both models. The 3060 Ti was already a capable 4K card, so giving the excuse that the 4060 Ti is targeted at 1080p is really just a shitty excuse. If my 2070 SUPER can handle 4K in some games, and the 3060 Ti beats it, then the 4060 Ti is more than capable even at high resolutions. Just give us the damn VRAM, Nvidia.

  • @ShinyHelmet
    @ShinyHelmet Год назад +1

    An extra 8Gb of VRAM for an extra $100 on the Ti. What a bargain....said no-one ever!

  • @Pond721
    @Pond721 Год назад +8

    Why does the 4060 Ti have 16GB yet the 4070 Ti has 12GB? The 4060 should have 8GB and 12GB options, and the 4070 should be 12GB with 16GB for the Ti. It makes no sense.

  • @DunkSouth
    @DunkSouth Год назад +18

    Jay, there is a small consumer & enthusiast market out there for local AI applications as well. These apps are highly VRAM-intensive--and swapping to RAM is disastrous for performance--so there are actually many AI applications & models that a 4060Ti with 16GB could run at speed, but a 3080 with 12 GB could not. I hope Nvidia continues the trend of providing models with extra VRAM for these things, especially as AI becomes more integrated into consumer applications.

    • @GamingLovesJohn
      @GamingLovesJohn Год назад +2

      Too bad the performance uplift is such bad value compared to all the other 60-series cards; it's just dead in the water.

    • @DunkSouth
      @DunkSouth Год назад +4

      @@GamingLovesJohn The rasterization/RT uplift will not be big except in extremely memory-intensive workloads; however, that is exactly what I'm talking about.
      There are simply some jobs which a mid-range 16GB card can do, but a high-range 10/12GB card can't. 🥲

    • @marshallmcluhan33
      @marshallmcluhan33 Год назад +2

      @@DunkSouth Out of curiosity what models are you playing with? Have you tried out the LLM Wizard-Vicuna-7B-Uncensored yet? I'm stuck on CPU only so I use GPT4ALL but I'm just glad that someone else understands why vram is important. I need 24GB lol

    • @potvinsuks8730
      @potvinsuks8730 Год назад +2

      I'd rather strike back at the root of this problem and somehow demand that we get true PC game titles made on PCs for PCs, not games which are made on PCs for consoles and then ported over to PC.

    • @DunkSouth
      @DunkSouth Год назад +2

      @@marshallmcluhan33 Nowadays I am usually using vicuna-free 13B. It is very coherent and conversational, and has had a lot of the OpenAI bias removed.
      I'm really interested in playing with llama-30B-type models. I've heard that a lot of so-called "emergent behaviors" only really start to appear around there, and that the difference between 13B and 30B is night and day.
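For anyone wondering why VRAM size gates which local models run at all, a rough back-of-envelope (illustrative only; real usage adds overhead for activations and the KV cache on top of the weights):

```python
# Approximate VRAM needed just to hold a model's weights.
def weights_vram_gb(params_billion, bytes_per_param):
    return params_billion * 1e9 * bytes_per_param / 1024**3

# 13B parameters at fp16 (2 bytes/param) vs 4-bit quantized (0.5 bytes/param).
print(round(weights_vram_gb(13, 2), 1))    # ~24.2 GB: won't fit a 16GB card
print(round(weights_vram_gb(13, 0.5), 1))  # ~6.1 GB: fits with room to spare
```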

  • @nukedathlonman
    @nukedathlonman Год назад +3

    I really think this generation of cards is going to be remembered for the price increases more than anything else. I also don't think people are going to enjoy the look of DLSS when gaming at 1080p on the RTX 4060 (though the frame generator might help - that is, if the game supports it).

    • @JoeMaranophotography
      @JoeMaranophotography Год назад

      I liked DLSS 2 on a laptop 2070, so I'm sure they will be fine with DLSS 3 tbh. It's fine apart from a few games, but you can just replace it with the later DLSS versions.

    • @nukedathlonman
      @nukedathlonman Год назад

      Well, in my experience DLSS and FSR do okay at 1440p and 4K... But at 1080p it doesn't look good at all - I'd rather have a GPU that stays playable without relying on it.

  • @justin9202
    @justin9202 Год назад

    As a game artist I do appreciate the option for 16GB of VRAM, as it cuts down render times in programs like Maya, UE5, RenderMan, Substance, etc.

  • @matp93
    @matp93 Год назад +1

    To me the 16GB version being AIB only feels a bit like one more nail in the coffin for partners, as it's likely going to be difficult to justify the price difference for just more memory, and more people are going to opt for the lower spec version, i.e. more $$$ for Nvidia directly.

  • @enmitynz4613
    @enmitynz4613 Год назад +3

    I'm looking forward to the video on frame gen. We need to look closely at latency when it comes to frame generation, not just the fps numbers, as they can be misleading if your base fps (without frame gen) isn't already well over 60. From what I have seen so far, a game can show fps numbers over 60 that you would assume means a smooth experience, while in reality it "feels" like 30 due to the latency.

  • @bobaruni
    @bobaruni Год назад +4

    This is seriously one of the worst generational uplifts - only 15 percent, with a smaller memory bus on a smaller node. It is a cost-cutting exercise at best. You are under the spell of Nvidia, buying into their frame generation BS that does nothing for latency. The only real improvement here is efficiency, and that is mostly because of TSMC vs Samsung. As a reviewer, please stop buying into Nvidia's BS and try to think a bit more independently and critically.

  • @gucky4717
    @gucky4717 Год назад +15

    Jay, 8GB VRAM is already too little for some games at 1080p, causing low 1% FPS stutter or outright crashes when using RT. The 3070 has those problems.
    That is why there's a consensus among many people that 8GB is too little in 2023 and onward.

    • @HFRG-zq1qm
      @HFRG-zq1qm Год назад +3

      There aren't really that many final-release games that have that issue, and most of the games that 8GB isn't enough for are still-in-development alpha titles.

    • @Ghostlynotme445
      @Ghostlynotme445 Год назад +1

      You know this ain't 4GB of VRAM, right? 😂 8GB of VRAM is fine for high, sometimes ultra settings for another year or two

    • @gucky4717
      @gucky4717 Год назад +1

      @@Ghostlynotme445 Games like TLOU1 already have bad muddy textures with 8GB VRAM. It won't get better.

    • @WirxawTanev
      @WirxawTanev Год назад +1

      @@Ghostlynotme445 It's not. It hasn't been since 2020 or even earlier. Just monitor your VRAM usage. Most games will try to eat up 8GB. Hogwarts is unplayable at 6GB unless you are a magician. Cyberpunk gets to the point of crashing after a few hours.
      At this point, 8GB is *NOT* enough for even above-average 1080p settings, and sure as hell isn't enough for max settings. Period.
      Which is why the 3060 was a sweet hobo deal, though the 4060 is still a solid hobo card as an upgrade from a 3050.
      But any card that wants to promote itself as the "optimal 1080p experience" has to have 12GB. It may not need 16GB for 1080p, but 8GB is no longer optimal.
      And the 4060 Ti 16GB at $500 is almost twice the price of the 4060 for barely any higher performance, and it's just 20% shy of the 4070, which may have only 12GB of VRAM, but that's GDDR6X and yet more performance.

    • @HFRG-zq1qm
      @HFRG-zq1qm Год назад

      @@WirxawTanev Ok, for the games you are referencing (Hogwarts and Cyberpunk), I have seen plenty of streamers playing AND streaming a playable 1440p experience on 1060s and 1650s paired with Ryzen 7s & 9s and i9s, and meanwhile plenty of choppy/jumpy/laggy, barely playable 1080p footage from i7s and Ryzen 5s paired with 3090s and 4080s, recorded/streamed from a second machine for resource preservation. Graphical performance isn't always about the GPU, and the GPU isn't the bottleneck as often as something else in the system is. Somewhat surprisingly, the most common bottleneck is the quantity of system RAM or RAM clocks, and the second most common is CPU core count and frequency, especially with Intel CPUs. Those particular titles aren't optimized, and as such are very much CPU-bound and system-memory-bound; though they claim to require 8GB of system memory and recommend 16GB, they tend to actually need 32GB+ of free RAM for stability and smooth performance - another failing of non-optimized games. There are also a number of Windows settings that, when enabled, throttle your 3D performance while providing no benefit to desktop performance, and those will be enabled by default or if you let Windows decide what's best.

  • @michaelkaster5058
    @michaelkaster5058 Год назад +1

    Look at Jay taking nvidia charts at face value, it is so cute.

  • @jakehutchens
    @jakehutchens Год назад

    What needs to happen with the VRAM stack moving forward: Professional series->48GB+, 90->24GB, 80->20GB, 70->16GB, 60->12GB, 50->8GB. Entry-level cards have to come in at under $300 and be scaled up from that point.

  • @MD-wn4ui
    @MD-wn4ui Год назад +8

    Bought a 6950XT for 600 USD new instead.

  • @konstantinlozev2272
    @konstantinlozev2272 Год назад +3

    RTX 3060ti beat the RTX 2080 Super
    What does this "4060 Ti" (a 4050 Ti in disguise) do?
    Does it beat the RTX 3080ti?
    Or at least the RTX 3080 12GB?
    No!
    Can't even beat a RTX 3070!
    After 3 years!
    😂😂😂😂😂
    What an abomination of a "next gen".

  • @M3h3ndr3
    @M3h3ndr3 Год назад +4

    It's gonna be 600-700€ in Europe. Sorry, but that card is not even worth 300€; I'll stop gaming altogether before buying this shit.

  • @demonhellcat
    @demonhellcat Год назад +2

    I upgraded from a 2070 to a 4070. $600 turned out to be my magic price to switch. The performance jump from the 2070 is very noticeable. Only bad news is now I feel like I need to upgrade my CPU to keep up in some games even at 1440p.

    • @lobsterjohnson8642
      @lobsterjohnson8642 Год назад +2

      CPU, motherboard, power supply, ram etc. PC gaming is a constant upgrade 😢 my wallet

    • @robbycoker84
      @robbycoker84 Год назад

      On an older generation motherboard, slow buses may also be a bottleneck, even if the currently installed CPU is fast enough to handle the 4070 well. Therefore, you wouldn't be able to get the most possible FPS out of the 4070 & CPU combo on that board.

    • @lobsterjohnson8642
      @lobsterjohnson8642 Год назад

      @@overused6632 I mean, it's definitely worth it if you game. You can get so many free games and discounts. On top of that you can emulate or torrent games as well. I just downloaded and played Zelda: Tears of the Kingdom days before it even released 💀 and on top of that, games have a longer shelf life on PC. I can't go back to controllers after keyboard and mouse. Despite all the annoyances that come with PC, the pros outweigh it all.

    • @malazan6004
      @malazan6004 Год назад

      The 4070 Ti was the best deal in my country lol - a 3080 goes for the same price in many places, and I'm not taking a risk on a used miner card. It was still pricey for sure, but I came from a 2060 laptop, so it's a beast to me.

  • @AMGameGoesOn
    @AMGameGoesOn Год назад +1

    Frame generation is the latest scam. And hell no, I ain't buying an 8GB VRAM 4060 Ti for $399. 8GB ain't enough for 1080p gaming even now. You are wrong, my friend. The 8GB variant shouldn't even have existed; only the 16GB at $399 should have been there.

  • @Mysticsoldier88
    @Mysticsoldier88 Год назад +3

    Jay's review… kind of praising it. GN's review… shitting on it at every turn for essentially being a 3060 Ti. Odd.

  • @OficinadoArdito
    @OficinadoArdito Год назад +4

    Nvidia always launches a 60-tier card with at least as much VRAM as an 80-tier, or more. They are surely targeting the entry-level creator market. I myself got into rendering with a 2060 12GB. That was crazy value - it made me several times more money than what I paid for the card.

  • @Andrew-qb3oq
    @Andrew-qb3oq Год назад +9

    I'm sorta interested in the 4060 Ti 16GB as an upgrade over my 3060 Ti 8GB, mainly for VR. I don't VR with anything fancy, just a Quest 2, but I regularly see titles nearly use up the 8GB of VRAM. I can only imagine it getting worse when I get a Quest 3, if it has a higher resolution.

    • @master_baiter1873
      @master_baiter1873 Год назад +3

      Get a better GPU then. Not a 60-series card.

    • @123clyd3
      @123clyd3 Год назад +2

      Nah dude. Get a 3080 Ti or 3090 or 6950 XT if you want an upgrade. Don't buy this piece of sh*t 4060 Ti. It may have 16GB of VRAM, but it's on a 128-bit memory bus; the 3080 Ti and 3090 are on a 384-bit memory bus. I'm looking out for ur best interest!!

    • @Quast
      @Quast Год назад +1

      @@123clyd3 I concur

    • @malazan6004
      @malazan6004 Год назад +1

      Buy what is good for you and don't let all the comments dissuade you. I see many comments of people happy with their 4070ti despite so many comments over the months saying to get a used 3000 card that is likely slower and maybe mined on.

    • @malazan6004
      @malazan6004 Год назад +1

      ​@@123clyd3 No, don't get a 3000 series, especially if you're in my country where a 3080 Ti is the same price as a 4070 Ti. Unless it's the deal of the century, stop telling people to buy older tech lol

  • @Republic3D
    @Republic3D Год назад +2

    A 15% increase in performance from the 3060 Ti to the 4060 Ti is good? Or even 60% over the 2060 Super? Those are not impressive numbers.

  • @23wjam
    @23wjam Год назад +5

    16GB would be great for a low-cost AI workstation.

    • @arunachalpradesh399
      @arunachalpradesh399 Год назад

      Hello, I'm on a budget and happy about 16GB in a mid-range card; I do lots of productivity work. But I'm still skeptical about the lower bandwidth, bus width, and CUDA core count compared to the 3060 Ti. Will it make a difference?

    • @23wjam
      @23wjam Год назад

      @@arunachalpradesh399 To be honest, I use a 2060 for Stable Diffusion (which I think only has 6GB of VRAM) and I've had no problems. My main issue is the challenge of running some decent large language models.
      A 4060 might be a nice upgrade, but if I upgrade I'd probably save for a better GPU, as I also want better VR.

    • @arunachalpradesh399
      @arunachalpradesh399 Год назад

      @@23wjam How about increasing PC RAM? Does a large language model use system RAM if VRAM is not enough?

    • @23wjam
      @23wjam Год назад

      @@arunachalpradesh399 No. The model needs to be loaded onto the GPU, so there has to be enough VRAM

    • @arunachalpradesh399
      @arunachalpradesh399 Год назад

      @@23wjam There are also language models which run on the CPU and system RAM

  • @U_Geek
    @U_Geek Год назад +7

    The 16GB card is a good thing for people who want to run something like Llama-based LLMs on their PC. Just being able to fit most if not all of the model into VRAM will increase speeds a lot, so I like the option.

    • @Darksector88
      @Darksector88 Год назад +4

      I know plenty of Blender artists who are still struggling with 8-gig cards. CUDA + 16GB for $500 will help them greatly. Too much focus is on gaming only, which is a shame.

    • @berkertaskiran
      @berkertaskiran Год назад

      Guys, just get a used 3090 24GB for $500. Why are you trying to get ripped off?

    • @Darksector88
      @Darksector88 Год назад +1

      @@berkertaskiran It's not too much of a rip-off; the new CUDA efficiency is superior to the 3090's. I have a 3090 now and love it for Blender, so I'm not disagreeing, it's just that I can see the appeal. Some people are really turned off by older cards that have been used. Also, there are those who want the latest and greatest, even though the 4060 Ti will have 8GB less VRAM. The 16GB should at least be more efficient on the 4060.
      $500 for a brand-new CUDA card with 16GB of RAM isn't that much of a rip-off, to be honest. Many jumped on the 3080, which has less RAM and was a little more expensive.

    • @berkertaskiran
      @berkertaskiran Год назад

      @@Darksector88 It's just not good enough. The 4060 Ti is 22 Tflops; the 3090 is 35.5. I don't want a used card or one that uses more power either, but there's a big performance difference. And it's actually worth it for the 3090 to use more power because of that; it can even be more efficient, since it carries out the task faster.
      In gaming it's a bit different, but it still wouldn't be very different. I wanted DLSS 3 until recently, but seeing it affect the UI and cause other pretty noticeable artifacts is bad. I figure games are too unpredictable for fake frames to be really usable.
      I always considered the 4070, seeing it's a better jump than the others and that the 4070 Ti is too expensive for 12GB. But with the release of these things, now I think even the 4070 is a bad deal. Why would I pay more to get less VRAM?
      Nvidia is basically trying to upsell their whole lineup. They want you to want more performance, but then you see less VRAM, so they force you toward the 4080, which only has the same VRAM as the $500 card. So the higher you go, the worse the deal gets - but it's also pretty bad on the low end. Only the 4090 makes sense, but that's a huge price and a huge card that draws a ton of power, and it still has the same VRAM as the $500 used card! I mean, sure, 24GB is plenty, but when you pay that much, you want something better than a card that costs less than a grand!
      Nvidia's strategy backfired on me, and it seems it backfired on a lot of people. I was considering the 4070 but wanted to see the new cards. Now I'm thinking I'll just go with a used 3060 12GB or something. It's not really far behind any of these, it's pretty good at both games and rendering, and 12GB isn't much different from 16GB - and certainly no different from the 4070 or 4070 Ti's 12GB!

    • @U_Geek
      @U_Geek Год назад

      @@berkertaskiran I'm kinda looking a few years down the line, when you can buy them used for nothing so anyone can access one, even broke university students. But $500 seems too good for that card. And even if it wasn't, I don't have the PSU or the case space for a huge power-hungry 3090

  • @OniCr0w
    @OniCr0w Год назад +3

    I'm so confused. This card gets so much hate, but it performs on par with the 3070 for $50 to $100 less.

    • @mlg4lyf
      @mlg4lyf Год назад +3

      It's a pretty fantastic card, if a little pricey compared to AMD's counterpart. If you play at 1080p you most likely won't hit that 8GB VRAM buffer. Also, if you already have a 3060 or better, you don't really need to upgrade to this. Most people are only upset that it has only 8GB of VRAM, and that's pretty valid

    • @AvroBellow
      @AvroBellow Год назад

      You're confused alright. It LOSES to the RTX 3060 Ti in some cases, and the 16GB version gets SLAUGHTERED by the RX 6800 XT at the same price.

    • @mlg4lyf
      @mlg4lyf Год назад

      @@AvroBellow yeah, the only reason not to get the 6800 XT is if you don't want to upgrade your power supply

  • @FelixVyra
    @FelixVyra Год назад

    I'm glad they have a 16GB model. To a lot of people it might seem stupid, but not all of us who have to do professional work can afford a $1200 card.
    I have a 12GB 3060 at the moment, and I have run into the problem of filling up the VRAM. It's not fun when you have work to do.

  • @czbrat
    @czbrat Год назад +2

    Pay $400 for an 8GB card? Nah bro, I'm good. Especially since one of the big features (frame generation) has a VRAM cost when in use.

  • @tex69420
    @tex69420 Год назад +8

    Why was the 4060 ti shill taken down?

    • @2hotflavored666
      @2hotflavored666 Год назад +1

      Because AMD fanboys swarmed the video and cried their eyes out.

    • @tex69420
      @tex69420 Год назад +7

      @@2hotflavored666 delusional

    • @2hotflavored666
      @2hotflavored666 Год назад +1

      @@tex69420 AMD fanboys are indeed.

    • @tex69420
      @tex69420 Год назад

      @@2hotflavored666 ok chief

    • @2hotflavored666
      @2hotflavored666 Год назад +1

      @@tex69420 👍

  • @SaviorKirbo
    @SaviorKirbo Год назад +3

    $500 for a 60 Ti GPU 💀💀💀