NVIDIA RTX 4060 & RTX 4060 Ti Specs: VRAM, Price, Release Date, & Cache

  • Published: 20 Sep 2024

Comments • 2.2K

  • @GamersNexus  1 year ago +416

    UPDATE: NVIDIA just announced the 4060 non-Ti pricing (they didn't share the price with us prior to this video going up). The 4060 will be $300.
    Learn about the PC part RIP list: ruclips.net/video/PAAhBTkLG4o/видео.html
    Get the LIMITED GN 15-Year Anniversary shirt! Sales close soon! store.gamersnexus.net/products/limited-edition-foil-gn15-tshirt-design

    • @ManuSaraswat  1 year ago +59

      When the 6700 XT is going for less than a 3060, it's really hard to justify any other card in the budget lineup.

    • @shaneeslick  1 year ago +13

      G'day Steve,
      Always love your technical explanations, especially accompanied by Andrew's AWESOME 3D animations!

    • @911canihelpu  1 year ago +11

      With respect to traction control: if it's turned off on both cars, then weight, weight distribution, wheelbase, etc. determine traction. It's better to use this as a comeback analogy to Nvidia and fling their own shit back at them.

    • @rawdez_  1 year ago +1

      NOBODY SHOULD BUY ANY OF THAT OVERPRICED AF CRAP EVER AND IT SHOULD ROT ON SHELVES until ngreedia and ayyymd go bankrupt with their 2-3x overpriced GPUs.
      During the last 5 years, 1000-3000 series ngreedia GPUs were money printing machines, literally. Now they're still priced like they are money printing machines BUT THEY ARE NOT capable of making money anymore.
      Prices should drop to pre-mining levels and be at least 2-3x LOWER, because they make 0 sense without mining.
      After that we would finally get some decent progress in gaming and in gaming hardware. Currently we have almost 0 progress in 6-7 years.
      And let me remind you that even at pre-mining prices GPUs were never cheap, and ngreedia made BILLIONS IN PROFITS even then. Now they just make even more billions with insane margins.

    • @H4rdeen  1 year ago +6

      The informational bits are highly appreciated!

  • @davidwagner3710  1 year ago +464

    Got to love NVidia including single-digit performance gains for the 4060 Ti. It seems going from the 3060 Ti to the 4060 Ti is like going from the i7-6700K to the i7-7700K.

    • @sihamhamda47  1 year ago +19

      Intel's processor progression at that time was worse than that lol, stuck on 14nm+++++++ all the way into 11th gen without any notable performance/IPC increase except higher clock speeds.

    • @YourSkyliner  1 year ago +94

      @@sihamhamda47 Even more embarrassing for Nvidia, given how Intel pulled off these kinds of performance gains while staying on the same node and core counts for ages.

    • @sihamhamda47  1 year ago +9

      @@YourSkyliner Yeah, fortunately Intel finally listened to their customers, dropping 12th and 13th gen with a massive ~50% performance increase each generation (12th gen is 40-50% faster than 11th gen, and 13th gen is 40-50% faster than 12th gen).
      Let's see how long it takes for Nvidia to finally listen to their consumers as well.

    • @zurvanlol3869  1 year ago +2

      I mean, in games that provide DLSS 3.0 support the 4060 Ti should perform well; other than that... mhhh, yes... lol

    • @Giliver  1 year ago +1

      Odd comparison

  • @theOldestMaan  1 year ago +522

    AMD has a genuine opening here to gain footing in the marketplace. They'll botch it, but it's there.

    • @travotravo6190  1 year ago +83

      I don't think either company cares. It's lackluster because they can sell old stock, and fall back on console deals and stuff. Gamers are the losers this generation.
      If by some miracle Intel could triple down on GPUs right now we might see something competitive, but that's going to be unlikely as well.

    • @I2ed3ye  1 year ago +40

      Getting ready for the same AMD marketing campaign as last time, where they acknowledge that they screwed up the previous time and promise to do right by gamers, then go back to their shareholders, completely erase their whiteboard, and replace it with dollar signs.

    • @liberteus  1 year ago

      Unfortunately, Always Miscalculated Decisions will call their Awful Marketing Department and botch the whole thing.

    • @-INFERNUS-  1 year ago

      All 3 GPU companies are pathetic, but the sad part is we need to pick one. Nvidia are greedy shit bags, AMD is not even bringing out their new RDNA 3 cards to compete with Nvidia, and Intel is a sad joke; they don't seem to take the GPU industry seriously. They have no new GPU until 2024. WTF 😑

    • @kazioo2  1 year ago +16

      Nope, there is NO opening. AMD not having any kind of upper technical margin in GPUs over Nvidia means they can't do that, because every single move can be easily countered by Nvidia, and in a race-to-the-bottom war it would be AMD struggling more than Nvidia. AMD knows it; that's why they don't want a price war, unlike with Ryzen, where they DID have a technical scaling advantage over Intel with the glue trick.

  • @Shadowauratechno  1 year ago +66

    Back when the GTX 960 launched, it was criticized for using a 128-bit bus when the 760 used a 256-bit bus. The defense often given was that Nvidia had managed to optimize and compress the way it handled data, and thanks to a bigger cache and the new MFAA technology it could get around the limitations of the slower memory bus. The 960 showed initial promise in the few games that supported MFAA when compared to the 760. Then MFAA became irrelevant. Even GeForce Experience advises against turning it on in games that supported it. The 960, despite having significantly more compute power, started achieving similar frame rates to its predecessor in new titles, especially DX11 games. I will not be getting my hopes up about Nvidia's marketing this time. At least it's good to see GN attempting to hold Nvidia a bit more accountable than they did in the review of the 960.

    • @PQED  1 year ago +7

      Precisely.
      nVidia is looking for the most sales at the highest price they can charge, with products becoming irrelevant just in time for the next gen (planned obsolescence).
      Really leaves a bad taste in my mouth, and the worst part is that it's getting more common in everything (even outside electronics); but in this very case other companies are very much following nVidia's lead, and that's just horrible for us consumers who have nowhere near the disposable income to get something even half-decent.
      The 40-series is a real milestone, because it's where we're paying the same _or more than last gen_, but with near-zero performance uplift.
      I wish more people could understand that, and stop defending the megacorps that hardly need it.
      We truly need to look out for each other if we want fair pricing as well as fair products.

    • @imnotusingmyrealname4566  1 year ago +5

      Nvidia cards age like milk while AMD's age like wine. The R9 390X was laughed at for its ludicrous 8GB of VRAM, but it still offers relatively respectable performance, while the GTX 970 with its 3.5GB of VRAM completely fell behind. Also, thanks for this history lesson on that GN review. I wasn't around then and checked out of the PC market until a couple of days ago, and I am scared at how little I know. For example, in my country the 4080 is now available very close to 1200€, and I thought "hmm, MSRP is $1200, so that's pretty good," while in reality that is apparently awful value.

    • @imnotusingmyrealname4566  1 year ago +2

      I remember Maxwell having REALLY good compression compared to the competition and its predecessors, but it just can't replace double the memory bus.

    • @PQED  1 year ago +2

      @@imnotusingmyrealname4566 AMD typically doesn't skimp on the VRAM and memory bus, so yeah, they age pretty well most of the time.
      Their problem was always drivers (software), but it looks like they finally sorted that out, which is a relief.
      I haven't personally owned an AMD card since the 4870 (great if you could find the right driver), but if a good deal came my way nowadays I wouldn't hesitate.
      And yeah, Maxwell was probably one of the last great architectures, and it performs well even today if you have a model with enough VRAM.
      Its advances very much paved the way for Pascal and the legendary 1080 Ti.

    • @imnotusingmyrealname4566  1 year ago +1

      @@PQED I regret buying a 1070 for 440€ and not just stretching to the 1080 Ti; such value will never be achieved again, I'm afraid.
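
The 128-bit vs 256-bit debate in this thread comes down to simple arithmetic: peak memory bandwidth is bus width times per-pin data rate. A minimal sketch; the data rates below are typical GDDR5/GDDR6 figures assumed for illustration, not verified card specs:

```python
# Peak GPU memory bandwidth = (bus width in bits / 8 bits-per-byte) * per-pin
# data rate in Gbps. Data rates here are illustrative assumptions.

def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

print(memory_bandwidth_gbs(256, 6.0))  # 256-bit bus at 6 Gbps -> 192.0 GB/s
print(memory_bandwidth_gbs(128, 7.0))  # 128-bit bus at 7 Gbps -> 112.0 GB/s
```

The same formula with GDDR6 at 18 Gbps on a 128-bit bus gives 288 GB/s, the raw figure discussed for the 4060 Ti later in the thread.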

  • @TheGameBench  1 year ago +43

    Nice to see someone addressing the memory bandwidth and cache. Doesn't seem to be that big of an issue for gaming, but it seems it can be a problem for software that doesn't really benefit from the cache. There are some professional workloads where the 16GB 3060 Ti could end up being a good value, and there could be others where it won't be a good option.

    • @ItsNotAProblem  1 year ago +12

      I like how you called it a 3060 Ti. I think it's a good name for it.

    • @TheGameBench  1 year ago +5

      @@ItsNotAProblem Haha... crap. Still screwing that up.

    • @pfizerpricehike9747  1 year ago +3

      Might as well just call it a bus-crippled 3070 with fake frames; other than that one extremely niche, artificially locked-down software "feature", there is nothing different between the 3000 and 4000 series.

    • @TheGameBench  1 year ago +1

      Seems like they saw the 6500 XT and thought... we should do that to everything under the 80 class.

    • @KefazX  1 year ago

      The 3060 12GB is a good card for Iray rendering. Hopefully the 4060 Ti 16GB will be too.

  • @zyhawk42  1 year ago +171

    Really liked the explanation of the cache system, it was easy to understand! Hope you'll keep doing more educational stuff in other videos

    • @joseseda5914  1 year ago +3

      Same here: CPU, memory, SSD, etc., whenever you have the chance. As a PC noob, I would love that, even if it's just a 5-minute explanation like you guys did in this video.

    • @LudwigHorsecock  1 year ago +1

      Agreed. People constantly bring it up; hopefully this will explain just why it's so important.

    • @BlueDrew10  1 year ago +1

      It's crazy, I was actually just looking for an explanation on how GPUs do texture draw calls, in order to figure out if GDDR7 will truly result in lower VRAM capacity needs. While this doesn't fully answer that question, I am now much more informed on the subject than yesterday. 😆

    • @bloberglawp9981  1 year ago

      +1

    • @kwerboom  1 year ago

      Same here. It's generally good to hear about the basics especially when there are changes in the way a new generation of hardware uses the basics.

  • @davidwagner3710  1 year ago +502

    $500 for a 16GB 3070. I can't wait for another DOA NVidia launch!

    • @DragonOfTheMortalKombat  1 year ago +2

      Agreed.

    • @GregoryShtevensh  1 year ago +20

      Same launch as the 3070, but more VRAM, better features, better basically everything... Have a cry though

    • @rawdez_  1 year ago +22

      @@blaken3824 A 4090 die at 608mm² costs $300 MAX to make according to wafer calculators and 2-year-old TSMC prices per wafer; it's more like $200-250 now. The $300 number from the calculators is too high because it uses 2-year-old prices; ngreedia cut TSMC orders, so they are getting better prices so they don't cut even more orders. Plus, on the same wafer where big dies don't fit you can make smaller dies like a 4080 or a 4070 Ti, which ngreedia gets "for free" if you account for them (unlike the wafer calculators), lowering the cost of a wafer significantly.
      $1600-$2000 for a less-than-$300 4090 die (more likely $200-250)? Just lol. I would understand if they wanted $800 for that, max. It would still be at least $200 overpriced.
      A 4080 die at 379mm² costs $150 MAX, again without accounting for the smaller dies made "for free" on the same wafer and other things. $1200+ for that?
      And ngreedia is trying to sell 60-class cut-down silicon at 295mm², worth around $50, as a 4070 Ti for $800 with 12GB of VRAM; they even tried to sell it as 80-class silicon for $900 at first. 12GB of VRAM on $800-900+ cards, Carl! 12GB of VRAM... in 2023... on $800-900+ GPUs!
      12GB of VRAM is fine at the 3060 performance level; a 4070 Ti should have at least 16GB, or 24GB like a 3090. Otherwise it's a morally obsolete GPU, especially for $800-900+.
      The 4070 Ti is basically a morally obsolete piece of crap suitable only for either extreme-budget 1080p monitors or for old games at 1440p.
      And ngreedia wants $800-900+ for that ridiculous piece of crap GPU with 12GB of VRAM. The 4070 Ti sucks at higher resolutions as it has less memory bandwidth than even a 3070 Ti. It's an extremely crippled card.
      The RTX 4070 Ti's main selling point is to milk dumb boiz' wallets with a stupidly overpriced GPU with low VRAM that's already obsolete because it can't run shit natively without upscaling and glitchy fake frames.
      DLSS 3 is glitchy fake frames that don't improve shit but sell morally obsolete hardware that can't run games natively to stupid ngreedia fanboiz.
      The 7900 XTX is 529mm², and of those 529 only ~300 are actually the GPU; the rest are memory controllers on 6nm, and those are CHEAP. AyyMD's GPUs are WAY cheaper to make than ngreedia's overpriced crap.
      ngreedia is killing PC gaming on purpose to sell its crap laggy GaaS subscription for 20 bucks/month.
      That's why it releases crap slow overpriced morally obsolete GPUs with low VRAM that make no sense.
      And AMD does exactly the same, killing PC gaming on purpose to sell more morally obsolete APUs to Sony/MS for their consoles.
      Plus both corporations milk the market with overpriced GPUs.

    • @marcogenovesi8570  1 year ago +86

      @@GregoryShtevensh Hmm, yes, "better everything", sure. The RAM interface is 128-bit instead of 256, so the bandwidth is halved, but sure, it's "better everything" and therefore clearly better.
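
The wafer-calculator math referenced in this thread can be reproduced with the standard dies-per-wafer approximation. A minimal sketch; the wafer price and yield below are assumptions for illustration, not known TSMC figures, and the formula ignores scribe lines and defect-density yield models:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: gross dies by area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius ** 2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def cost_per_good_die(wafer_price_usd: float, die_area_mm2: float,
                      yield_rate: float) -> float:
    """Wafer cost spread over the dies that come out working."""
    return wafer_price_usd / (dies_per_wafer(die_area_mm2) * yield_rate)

# 608 mm^2 (a 4090-class die), with an ASSUMED $17,000 wafer and 70% yield:
print(dies_per_wafer(608))                         # ~89 candidate dies
print(round(cost_per_good_die(17_000, 608, 0.7)))  # ballpark cost per good die
```

Whether the resulting per-die number lands near the commenter's $200-300 claim depends almost entirely on the assumed wafer price and yield, which is exactly why such estimates vary so widely.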

  • @dil6969  1 year ago +414

    Looking back to the Pascal days, it's hard not to be cynical about current generation GPU launches. The only positive thing I can say is that AMD is actually pretty competitive currently, but the prices for both options are still out of control.

    • @BBWahoo  1 year ago +34

      1080ti for LIFE (until the 22gb 4080ti)

    • @ProjectPhysX  1 year ago +6

      I'm still running Pascal (Titan Xp) and I see no reason to upgrade within the next 3 years. The only viable upgrade right now would be the 3090; all other cards have either less VRAM capacity or less bandwidth, or don't fit in the PC case. And only a 2x speedup is not really worth it.

    • @SarcasticDragonGaming  1 year ago +18

      I replaced my 1080Ti back in December for a 7900 XTX. A year ago the idea of switching away from Nvidia was totally unthinkable but I gotta say I have absolutely no regrets, especially considering my only other option would’ve left me with 8GB less VRAM.

    • @w04h  1 year ago +6

      No reason not to go AMD if you have the budget; unless you specifically need Nvidia, AMD is the better choice for gaming.

    • @sihamhamda47  1 year ago +1

      The only important thing for AMD GPUs is hoping AMD does not stay in the same VRAM rabbit hole for longer, as AMD still uses the same VRAM scaling for their lineup (although at a bit lower prices than Nvidia's):
      4-8GB for entry level
      12-16GB for mid range
      20GB for high end
      24GB for highest end/flagship
      That needs to be increased further next generation if AMD wants to stay ahead of everyone.

  • @rhoharane  1 year ago +29

    This is a really good format, Steve! Thanks so much for going the extra mile to reach out to nvidia for technical details and their statements on specific things and explaining it well.

  • @daviddavid4962  1 year ago +61

    The explanation of how cache works was great. Way better than reading technical jargon on some websites. Thanks for another great video.

    • @messiahcze  1 year ago +7

      Loved it too. Please include more of these educational pieces into your videos, GN!

  • @ilikepie1974  1 year ago +284

    For DLSS, I think a really good car analogy would be GM super cruise. It's their self-driving thing but it's trained on a road per road basis, so it only works on certain roads.

    • @GamersNexus  1 year ago +86

      I think that one makes sense. Great point!

    • @danielparks9035  1 year ago +19

      Right, it's only useful in specific scenarios. I don't know how Nvidia could possibly be confused here. DLSS 3 is only available in a select number of games, it adds artifacts, and not only does it not decrease input lag despite producing more frames, it adds more lol. So it's useless in old games, multiplayer games, most indie games, etc. It's not that hard.

    • @CapaNoisyCapa  1 year ago +11

      Car analogies rarely work, and they're often a trap for the person making them. If there were even one person who really understood how TC works (ABS too) and could drive a car relatively fast, Nvidia would be in the corner begging for mercy in less than a minute.
      You see, a car is objectively faster without TC and ABS if the driver is somewhat competent. I'm not talking F1-level driver here; just some guy who did maybe a dozen track days in his own car will get good enough to turn off all the driving aids, be faster, and have more fun.
      It's the same with DLSS 3 with frame generation. You'll soon realize that all those frames are making you worse at most games because you can't react fast enough with all the latency introduced by fake frames.

    • @0Synergy  1 year ago +4

      @@CapaNoisyCapa ABS is actually faster these days since it has individual control over all 4 wheels vs 1 brake pedal.

    • @CapaNoisyCapa  1 year ago +3

      @@0Synergy Ahem... ABS has always had independent control over the four calipers biting the discs at all four corners, since its inception.
      I agree that nowadays it takes a really fast driver in a very fast car to be faster than modern ABS implementations, but TC remains largely the same. The sound of TC cutting the engine is depressing...

  • @rickysargulesh1053  1 year ago +810

    Am I the only one sick of graphics cards thanks to Nvidia and AMD price gouging consumers?

    • @rickysargulesh1053  1 year ago +157

      Also, $500 is a joke for a 4060 Ti.

    • @TimberWulfIsHere  1 year ago +24

      Wym amd? Amd is far cheaper df

    • @SweatyFeetGirl  1 year ago +84

      @@TimberWulfIsHere AMD will release a $299 8GB card as well, as fast as the RX 6700 10GB, which can already be bought for the same price...

    • @callofmetals24  1 year ago +10

      I like AMD prices 🤷🏾‍♂️

    • @GamersNexus  1 year ago +415

      Definitely not the only one who is 'checked out' on GPUs until prices come down.

  • @Hezzadude12  1 year ago +423

    Ah yes, Nvidia finally gave into the pressure and are offering higher VRAM... on a card that will probably still be stomped by last generation cards that ARE gimped on VRAM. We basically can't win - but at least the pressure is getting through to them.

    • @rkan2  1 year ago +38

      Well meh, the 3060 had 12GB already. Nothing new here..

    • @conyo985  1 year ago +71

      The price is bad. $499 for a 16GB card? It really makes the Arc A770 an even better deal.

    • @DragonOfTheMortalKombat  1 year ago +33

      @@conyo985 The A770 is honestly a far better deal than any Nvidia GPU. Had they kept it at $450, it would've been good. Most likely it'll sit on shelves and end up under MSRP, just like the so-called "4070" and the 4080.

    • @Stanley-px3bt  1 year ago +18

      My strategy is the same, wait for the 5090 to come out, and buy a used 4090.

  • @fastamx069box8  1 year ago +17

    Hey Steve,
    Excellent presentation. Please continue to provide this educational format. This is just one of many aspects of GN that sets you apart from the rest.
    Kindest Regards
    Fastamx

  • @new1080  1 year ago +60

    I'm looking forward to seeing the benchmarks for how these compare to the Intel Arc A750 and A770, especially in both gaming and transcoding/encoding.

    • @Corristo89  1 year ago +5

      The A770 16GB typically performs a bit better than the RTX 3060. Where it really shines is with Raytracing activated, due to having 16GB of VRAM. But the A770 can't hold a candle to the RTX 3060 Ti. As a card for 1080p gaming the A770 is alright, especially since its price is OK. But the moment you want to move on to 1440p gaming, it runs out of steam far too quickly for the 16GB of VRAM to matter.
      The AV1 encoding capability was cool initially, but now all of Nvidia's and AMD's current cards have it. If you want to add AV1 encoding to your PC running a last gen (RX 6xxx or RTX 3xxx) card, then just get a A380 for cheap, which has AV1 encoding too. Although that's kind of like buying one of those PhysX cards over 10 years ago, only for Nvidia to just add PhysX to their GPUs.

    • @imnotusingmyrealname4566  1 year ago +2

      Yeah, being on the cutting edge with features often required weird solutions, like extra cards, or two DP 1.4 cables for 8K, or just bad value for the performance.

    • @evan9536  1 year ago +2

      I sold my 3070 build for an A770 build in February and it's performed noticeably better than my 3070 in both productivity and gaming.

  • @MrAlBester  1 year ago +22

    DOA
    8GB: $300-330
    16GB: $350-380
    When you can get the RX 6700 10GB (≈3060 Ti) for $280

    • @tringuyen7519  1 year ago

      Agreed. A $400 4060 Ti with 8GB of VRAM vs a $400 6750 XT with 12GB of VRAM on Newegg! Only extreme Nvidia fanboys will buy the 4060 Ti.

  • @OhHeyItsShan  1 year ago +218

    Looking forward to seeing my 3060 compared to the 4060 in July.

    • @iwillnoteatzebugs  1 year ago +35

      The 4060 is basically a 3060 Ti.

    • @bobvila4381  1 year ago

      Frame Gen would be the only reason to get a 4060 over a 3060...

    • @Snow.2040  1 year ago +17

      @@iwillnoteatzebugs It is definitely gonna be much worse than an RTX 3060 Ti; it is basically a 3060 with 4GB less VRAM, 10% extra performance, and frame gen.

    • @kneppernicus  1 year ago +21

      ​@@bobvila4381 Nvidia's frame gen sucks, it's a complete embarrassment for games and destroys latency. Frame gen is trash and a crap reason to "upgrade", a complete snake-oil technology meant to rip off more consumers and fill the pockets of Nvidia while not giving a single !@#$ about the consumer (other than your cash and wallet).

    • @damasterpiece08  1 year ago

      @@kneppernicus could be nice to play with at 1080i :))))))))))

  • @RurouTube  1 year ago +175

    So back when Nvidia presented the 3060 Ti, they benchmarked it against previous-gen GPUs (2060 Super and 2080 Super) at 1440p. Now they benchmarked the 4060 Ti against... the 3060 Ti and 2060 Super, at probably 1080p.
    Basically they are going backwards in performance target; most likely the bandwidth is not enough for 1440p gaming. Based on what AMD did with Infinity Cache, an extra 32MB of cache is likely not enough for 1440p (the performance boost is less universal at 1440p vs 1080p).

    • @Quast  1 year ago

      Yeah, I doubt that, but I wouldn't doubt that Nvidia will shrink the x60-class chip size as far as they can in the future.

    • @estamnar6092  1 year ago +10

      I game at 1440p on a 3060... it's not "ultra everything", but most games run at ~120fps on medium settings.

    • @Lucky-rl6gg  1 year ago +3

      Sad fact is people will still buy it.

    • @Mako2401  1 year ago +1

      @@estamnar6092 Is the 3060 128-bit?

    • @estamnar6092  1 year ago

      @@Mako2401 idk. I know it runs games at 1440p@~120fps
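
The worry above, that a fixed L2 covers a shrinking share of the working set as resolution rises, can be sketched with a toy model. Every number here is an illustrative assumption (the working-set sizes and the linear hit-rate model are not measured data):

```python
# Toy model: hit rate ~ cache size / working set, and the bus looks faster
# by 1 / (miss fraction). Illustrative assumptions only, not measured rates.

def est_hit_rate(cache_mb: float, working_set_mb: float) -> float:
    """Crude linear model, capped at 100%."""
    return min(1.0, cache_mb / working_set_mb)

def effective_bw(dram_bw_gbs: float, hit_rate: float) -> float:
    """Accesses served by cache never touch DRAM."""
    return dram_bw_gbs / (1.0 - hit_rate)

CACHE_MB = 32.0   # 4060 Ti L2 size
DRAM_BW = 288.0   # GB/s on a 128-bit GDDR6 bus

# Assumed per-frame working sets growing with resolution:
for res, working_set in [("1080p", 64.0), ("1440p", 114.0), ("4K", 256.0)]:
    hit = est_hit_rate(CACHE_MB, working_set)
    print(res, round(hit, 2), round(effective_bw(DRAM_BW, hit)))
```

Under these assumptions the "effective" bandwidth advantage shrinks steadily with resolution, which is the mechanism the comment is pointing at.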

  • @VRGamingTherapy  1 year ago +2

    So everyone tore Nvidia apart when they announced two 4080 variants. Now they're back at it with 3 fucking 4060 variants with TWO ti models. Holy shit.

  • @jolness1  1 year ago +10

    I think it's interesting that AMD and Nvidia are taking different routes to deal with the fact that I/O is not shrinking with node improvements. AMD is splitting off the I/O as much as possible onto an older node to save money, just like they did with Ryzen, while Nvidia is just trying to use the smallest bus possible (which means they more than likely NEED GDDR7 next gen) to save space.

  • @jeevana.6391  1 year ago +45

    It's important to know raw performance as many games don't even support DLSS, and like you said, it provides a full picture. Or, in a way that Nvidia marketing can understand, most people only use their car in a straight line.

  • @vatglut  1 year ago +43

    Keep the explanations coming. I think an informed consumer is always better than an uninformed or under informed consumer. As you pointed out there are time stamps if you wish to skip.

  • @jasonjavelin  1 year ago +84

    That $450-500 range was what I was optimistically hoping the 4070 would be, but that was a long shot. My 3070 is solid for now.

    • @BBWahoo  1 year ago +11

      This is a load of barnacles.

    • @a-alex-p7033  1 year ago +1

      @@BBWahoo staawp😭

    • @mr.electronx9036  1 year ago +15

      The 4070 is a good card with bad pricing, like all 40-series cards.
      Look at the 4080: for 600 bucks it could outsell all AMD cards.
      Nvidia is just pathetic.

    • @boss_niko  1 year ago

      So, noob, will you buy a 3060 or a 4060? Tell me now.

    • @ItsNotAProblem  1 year ago

      @@mr.electronx9036 The 4060 is a shit card with a shit price.

  • @Fiztex553  1 year ago +2

    If we look carefully at the graphs presented by NVIDIA, we see that for the 4060 Ti 8GB vs 16GB they've used different ray tracing presets for some games. The only reason to do that is the VRAM limit leading to a significant performance drop with the 8GB version. So while the increased L2 cache can help with some complex shading and effects used often in the pipeline, it won't help in applications where big chunks of data must be processed at the same time, like real-time ray tracing. And this is only at 1080p!
    Looking forward to the testing results; hope they will clarify the real deal to everyone. Probably the only positive thing from this release is that the 6750 XT will drop in price even more.
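
To get a rough sense of why ray tracing can push an 8GB card over the edge, the major VRAM consumers can be tallied. Every figure below is an illustrative assumption, not a measured game footprint:

```python
# Back-of-envelope VRAM tally for a 1080p scene with ray tracing enabled.
# All sizes are illustrative assumptions, not measurements from any game.

def mb(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one full-screen render target in MiB."""
    return width * height * bytes_per_pixel / 2**20

W, H = 1920, 1080
budget = {
    "g-buffer (4 targets, 8 B/px)": 4 * mb(W, H, 8),
    "textures (assumed)": 4500.0,
    "RT BVH + geometry (assumed)": 1800.0,
    "frame queue / misc (assumed)": 700.0,
}
total = sum(budget.values())
print(round(total))  # compare against the 8192 MB on an 8GB card
```

Even with modest assumptions the total sits close to the 8GB ceiling, so bumping one RT preset up (or the game silently streaming lower-quality assets) is enough to change the benchmark picture, which is consistent with the preset mismatch the comment spotted.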

  • @koranthus1
    @koranthus1 1 year ago +12

    I really enjoyed the educational portion, even though a lot of it was already known to me, but I still learned a little more about the interplay between the caches, VRAM, and system memory.
    Also, loved the formula for calculating the bus width from bandwidth. Looks like they took the 288 real-world number and divided by .52 to get their estimated "effective" number.
    On the topic of educational sections, have you considered cutting that part into a separate video, so that in the future you can say "If you'd like to learn how cache and VRAM play together, click the info card or find the link in the description"?
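The formula mentioned above can be sketched in a few lines. This is a rough reconstruction of the math, assuming the public 4060 Ti figures (18 Gbps GDDR6 on a 128-bit bus); the 0.52 divisor is the hit-rate factor inferred in the comment, not an official NVIDIA number, and the function name is just for illustration.

```python
# Peak GDDR bandwidth: per-pin data rate (Gbps) times bus width (bits),
# divided by 8 to convert bits to bytes.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

real = bandwidth_gb_s(18, 128)   # 4060 Ti: 18 Gbps GDDR6, 128-bit bus
print(real)                      # 288.0 (GB/s)

# NVIDIA's "effective" figure appears to scale the real number by the
# fraction of requests that still go out to VRAM after the larger cache:
effective = real / 0.52
print(round(effective))          # 554 (GB/s), close to the quoted figure
```

Running the division the other way recovers the bus width from a known bandwidth and data rate, which is the deduction described in the comment.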

  • @Djoki1
    @Djoki1 1 year ago +81

    Knowing my local market, the pricing for these GPUs is going to be awful.
    Some shops still sell 3060 Tis at roughly $700.
    My recent 4-day trip to Austria cost me about the same, every expense included.
    I could literally go to another country, get a GPU there for half the price, look around a different city for a while because most European cities are beautiful, and come back home with like $200 left in my pocket.
    It's like most of the money these sellers make somehow vanishes into thin air /s.

    • @martinbernath
      @martinbernath 1 year ago +10

      Where are you from? Computer parts are ridiculously overpriced in europe compared to the US and UK. I live in Slovakia (right next to Austria) and you can pretty much multiply all the costs by a factor of 1.5 and get price in euros, making all the new cards and cpus really pricey.

    • @PinePizza
      @PinePizza 1 year ago +1

      Almost happy that I purchased a 3070 Ti for 580€ a while ago now, lol. Btw, that was just a few weeks before the 4070 got released at a semi-normal price here.

    • @dwardyblancky236
      @dwardyblancky236 1 year ago +5

      It's pretty weird. Even in my place in Russia, under sanctions and parallel imports of electronics, which is an insanely complicated way through the devil's ass, 3060Ti ~ $450-500.

    • @LudwigHorsecock
      @LudwigHorsecock 1 year ago

      Redback spiders, hospitals hours away, casinos running congress, and overpriced gpus who piss on the msrp. Still not as bad as Ohio, but it’s up there

    • @ironbru1986
      @ironbru1986 1 year ago +1

      @@martinbernath GPU prices in the UK are terrible too, I should know I live there.

  • @giant47
    @giant47 1 year ago +7

    DLSS 3 doesn't improve input latency, which is one of the primary benefits of higher frame rates. Nvidia is trying to sell us a living example of Goodhart's law: the metric became a target, and they found a way to improve the metric without the actual benefit the metric stood for. It's madness that they think it's acceptable to report DLSS 3 numbers as if the figures are comparable.

  • @jezusrvd
    @jezusrvd 1 year ago +23

    Yeah I like the memory breakdown. Especially when explaining how it falls into manufacturing and the business side of things.

  • @ArmchairMagpie
    @ArmchairMagpie 1 year ago +48

    It's actually even easier to deduce the bus width. We know that they are using Micron's GDDR6X chips, probably a down-clocked variant of the MT61K512M32KPA, which are in essence 2 GB each, tied to a 32-bit memory controller in normal mode, which means: 4x32 = 128 bit. I suspect, since the number of cores and shaders remains the same, that they are using a sub-variant of the AD106-350 which will utilize clamshell mode to accommodate the increase to 16 GB.

    • @dex6316
      @dex6316 1 year ago +5

      “We know that they are using […]”. How is an average person supposed to know that? I read the white paper and didn’t see it listed. Also, how do you have this information on an unreleased card when Nvidia has historically used multiple memory types from multiple memory vendors? If you know they are using 32Gb dies it is easy, but you are requiring more technical knowledge than is required to calculate from bandwidth. Bandwidth formula is pretty consistent across generations, while memory module density changes pretty much every couple of years. Someone would have to consistently keep track of the memory industry which is such an extreme niche.

    • @ArmchairMagpie
      @ArmchairMagpie 1 year ago +10

      @@dex6316 I would consider myself at most an average person, lol.
      How would you know? You only have to look at the PCB and its layout: you see the parts and the part codes, and you can easily look them up directly on the vendor's site. Normally, the memory chips all sit very close to the GPU for latency reasons. Nvidia has been using Micron chips last generation and this one, but they could also use a different vendor; it doesn't matter much, because SK Hynix (which AMD is using) offers basically similar products. Last gen used chips with a 256x32 layout (words x bits), which is why the limit was at 12 GB, since basically all slots were filled. This gen has its capacity doubled to a 512 layout. So, the math is easy.
      I am not questioning the way someone deduces the bus width, just showing how you could make it even easier.
      Yes, you are right, it is pretty technical information, but I thought it was appropriate for this channel; no offense intended.
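The chip-counting shortcut above can be written out as a small sketch. The chip counts and capacities are the ones quoted in the comment, and the helper name is made up for illustration: each GDDR6/GDDR6X package exposes a 32-bit interface, and clamshell mode hangs two chips off one 32-bit channel, doubling capacity without widening the bus.

```python
PINS_PER_CHIP = 32  # each GDDR6/6X package ties to a 32-bit controller

def bus_width_bits(num_chips: int, clamshell: bool = False) -> int:
    # In clamshell mode two chips share one 32-bit channel, so capacity
    # doubles while the bus width stays the same.
    channels = num_chips // 2 if clamshell else num_chips
    return channels * PINS_PER_CHIP

print(bus_width_bits(4))                  # 128 -> 4060 Ti 8GB: four 2GB chips
print(bus_width_bits(8, clamshell=True))  # 128 -> 4060 Ti 16GB: eight 2GB chips
```

This is why the 16GB model keeps the same 128-bit bus and bandwidth as the 8GB model.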

    • @eetoonamamanakooo
      @eetoonamamanakooo 1 year ago +14

      @@ArmchairMagpie You're right, looking at part numbers and searching the net with basic knowledge can make you sound a hell of a lot smarter. In any case, good job, this paragraph has more research than some articles.

    • @gt4lex
      @gt4lex 1 year ago +1

      Or you could have asked.

    • @ArmchairMagpie
      @ArmchairMagpie 1 year ago

      @@gt4lex Ask about what? Even GN didn't have the bus width information outright and offered a way of deducing it. Which works, but I kinda like mine too.

  • @Marauder1981
    @Marauder1981 1 year ago +3

    Saying that this card can do RTX is like saying "Yes, you can totally participate in the Rallye Dakar with your wheelbarrow!"

  • @a.sanford8731
    @a.sanford8731 1 year ago +43

    I pretty much decided when I bought my 3070 that I'd wait until the 6 series to upgrade. Nvidia keeps reminding me that's probably still a good idea.

    • @BBWahoo
      @BBWahoo 1 year ago +16

      I'm in the awkward 1080 Ti scenario: no Nvidia upgrades until I get 4090-adjacent perf for 1000 or less.

    • @Captain-Chats
      @Captain-Chats 1 year ago +1

      I have a 3080 and was always told to wait a few generations before upgrading. I feel right now that is very good advice 😂

    • @slandgkearth
      @slandgkearth 1 year ago +1

      Totally not worth it, Nvidia releases are full of BS

    • @mantid83
      @mantid83 1 year ago +1

      I bought a 3070 for $300 10 months ago. I guess I'll be waiting also.

    • @PiPArtemis
      @PiPArtemis 1 year ago

      I'm considering saving and shooting for a 3080 on ebay or something so I can skip the next generation or three. The 40 series just doesn't make a compelling argument for itself

  • @WielkaKasza
    @WielkaKasza 1 year ago +54

    $400 to $500 for a low-end GPU lmao. Nvidia is insane.

    • @user-mfsc-2024
      @user-mfsc-2024 1 year ago +2

      true budget card😂😂😂😂

    • @freefall_910
      @freefall_910 1 year ago +1

      They are not; they know fans will buy anyway, in big lines.

    • @tringuyen7519
      @tringuyen7519 1 year ago +2

      3060Ti with 8G VRAM is still $420 to $470 on Newegg. Nvidia has brainwashed gamers well…

    • @mr.electronx9036
      @mr.electronx9036 1 year ago +5

      Imagine: my GTX 1080 was 550 bucks, and that was a top-tier GPU.
      Now you pay 500 bucks for a 60-tier GPU, lol.

    • @ItsNotAProblem
      @ItsNotAProblem 1 year ago

      People talk smack in the comments, but you still see $400 3060's in the best sellers

  • @twiggsherman3641
    @twiggsherman3641 1 year ago +7

    I bought a 5800X3D and 6950XT recently for LESS THAN what Jensen wants for a 4080. I upgraded from a 3600X and 1070 Ti. I'd much rather have an Nvidia GPU, but the pricing is just too dumb. Jensen can keep his cards.

  • @ariellaboy7982
    @ariellaboy7982 1 year ago +4

    As for the educational parts, I really enjoyed the refresher on how cache works (I've relearned a few things I forgot about how the cache levels and the bus width interplay) and very much enjoyed learning how manufacturing costs and die real estate must be balanced to deliver a product that is both effective and cost-efficient for the consumer and the company selling it.

  • @bleack8701
    @bleack8701 1 year ago +2

    Educational section was very appreciated. It's nice to refresh on the basics from time to time

  • @Noneya5555
    @Noneya5555 1 year ago +29

    The deep dive into how gpu cache works was extremely informative and helpful. I've been wondering why Nvidia reduced the memory bus on the 40 series cards, and how those reductions impacted performance.
    Please keep the educational segments. And keep up the good work! 👍🏾

  • @sonicgd
    @sonicgd 1 year ago +21

    That cache segment is awesome. Keep doing educational stuff!

  • @Vexonia_Music
    @Vexonia_Music 1 year ago +71

    I'm personally wondering if the 4060 is going to be able to compete with the 16 GB A770 or the A750 at a similar price, since those two cards have been getting better over time.

    • @zivzulander
      @zivzulander 1 year ago +20

      I have a 16GB A770 (Acer model GN reviewed) and it's been great thus far. Only area where it's lacking is using it for any type of machine learning applications, where Nvidia still has a big edge. But the price difference makes Arc quite solid for gaming. Intel has improved the card a fair bit since launch in terms of bug fixes and performance.

    • @davidbutton3500
      @davidbutton3500 1 year ago +3

      No Intel A770's available in Canada. Was hoping to get one, but it seems it's a no-go.

    • @AIChameleonMusic
      @AIChameleonMusic 1 year ago

      A valid point!

    • @ProjectPhysX
      @ProjectPhysX 1 year ago +4

      Think of it this way: the memory bandwidth on Arc is about *double*, and they cost less.

    • @bottomtext5872
      @bottomtext5872 1 year ago +2

      @@davidbutton3500 Memex and Canada Computers have 'em stocked and ready for you to buy. Not sure where you're looking.

  • @BobBobson
    @BobBobson 1 year ago

    I don't really need the educational bits, but I don't mind them, and they're important for people who don't already know. Having the info in the video means nobody has to stop and search for the information, and you give that information to people who won't bother doing a simple internet search.
    Good shit.

  • @paxvesania2008
    @paxvesania2008 1 year ago +2

    I'm planning on building a PC around the end of the year or early next year for 1080p gaming.
    Seems like this 4060 Ti is a solid choice. Can't wait for the performance review to see how it fares.

  • @DerpishBird
    @DerpishBird 1 year ago +27

    $500 for the 16GB 4060 Ti, while the 3060 Ti came out at $400. I hope the performance and larger VRAM are a big enough improvement to justify the extra $100.

    • @stephenn504
      @stephenn504 1 year ago +44

      It won't.

    • @SupraSav
      @SupraSav 1 year ago +22

      It will not. They will call it inflation plus the cost of new tech/manufacturing.

    • @TimberWulfIsHere
      @TimberWulfIsHere 1 year ago +1

      It won't.

    • @joemarais7683
      @joemarais7683 1 year ago +10

      Nope. It’ll just be a 16gb 3070 for $500. Which is what we should have gotten in 2020.

    • @oliberrr
      @oliberrr 1 year ago +6

      Looks like it won't. 128-bit vs 256?

  • @Raika63
    @Raika63 1 year ago +15

    I appreciate even the basic explainers, despite already knowing how it works. For one, sometimes stuff changes over time, and other times it's just good to be sure your understanding is correct.

  • @GermanKerman
    @GermanKerman 1 year ago +63

    The 4060 Ti being $400 to $500 is so expensive. Wtf

    • @coreylineberry8557
      @coreylineberry8557 1 year ago +3

      I imagine, with how AMD has priced their cards, for that price you'll be able to get their 4070 competitor.

    • @primerivz6239
      @primerivz6239 1 year ago +13

      It's wild that you can get a 6800xt for $500 today which will blow the doors off the 4060ti

    • @elmariachi5133
      @elmariachi5133 1 year ago +2

      A 4060 Ti would be expensive at 500. But as this 4060 really is a 4030 by performance, there are no words to describe this anymore.

    • @GermanKerman
      @GermanKerman 1 year ago +2

      ​@@elmariachi5133 greed is a good word

    • @mayonotes9849
      @mayonotes9849 1 year ago +2

      Remember when 60-class GPUs only cost $200-300?

  • @georgioszampoukis1966
    @georgioszampoukis1966 1 year ago +2

    The only reason that we see memory boosts on the lower tier cards is because these cards do not threaten the sales of the workstation cards at the same memory capacity. Having said that, workstation cards should start with at least 16GB in 2023+.

  • @oatbear8243
    @oatbear8243 1 year ago +3

    Am I missing something? More cache hits are great, but if the textures don’t fit in memory, then it’s not going to help you, right?

  • @user-ck9cw8fs5n
    @user-ck9cw8fs5n 1 year ago +10

    Appreciate the educational section about the cache. I super appreciate how simple you made it

  • @SirCaco
    @SirCaco 1 year ago +30

    Absolutely out of the question. These prices are unacceptable, I'll pass.

  • @purplegill10
    @purplegill10 1 year ago +37

    For me, the only really interesting thing in this lineup is that absolutely TINY 115W TDP number. I really, really hope we get some mini-ITX cards this time around and, if possible, even a half-height card. That's the only reason I'd be willing to pay that much though, given how high the prices are in a lower-wage/higher-rent/higher-inflation world.

    • @funi1083
      @funi1083 1 year ago +3

      @bruh maybe some people have less space in their room?

    • @SapphireThunder
      @SapphireThunder 1 year ago +3

      @@funi1083 There are many small mATX cases that fit proper, full-size GPUs. If they don't have space for something like that, they'd be better off with a laptop.

    • @purplegill10
      @purplegill10 1 year ago +3

      @bruh It's not about the case cost, it's about people who like doing SFF for many different reasons. While I have a regular mid-tower today, back in the day I built an SFF for myself because I was moving around a lot and couchsurfing. Had I had a full-sized desktop, I probably wouldn't have had the opportunities I did, given that a laptop was unaffordable to me at the time. Moreover, there are people who build SFF for other legitimate reasons, like aesthetics or just having fun. Plus, if an ITX-sized card _did_ come out with that low a power draw, assuming the wattage/board power is accurate and similar to the TDP, you could drop it into an office prebuilt with a basic SATA-to-PCIe adapter because of how little wattage you need after accounting for slot power.

    • @funi1083
      @funi1083 1 year ago +1

      @@SapphireThunder fair

    • @purplegill10
      @purplegill10 1 year ago +1

      @@SapphireThunder Before I lived in the place I have now, I couldn't run a desktop system under my table, so my only option was on top of my desk, which was a folding card table. A mini-ITX cube was realistically the only thing that would have worked at the time, and a long GPU wouldn't have fit. Nowadays there are great cases that fit much larger GPUs while still maintaining a small footprint, but that doesn't mean small cards don't still have their uses.

  • @HolyDuckTurtle
    @HolyDuckTurtle 1 year ago +8

    Thank you for the graphs and highlighted images! Sometimes the technical descriptions go over my head and I'd love to see things like illustrated flow diagrams in certain cases. e.g. In a failure analysis, a diagram of what is supposed to happen vs what actually did.

  • @chrisbarnes5141
    @chrisbarnes5141 1 year ago

    I really appreciated that little education segment. As someone who's built a handful of gaming PCs over recent years, it's nice to start learning these aspects of the parts as opposed to just looking at pretty visuals and saying "hurr durr numbers bigger, I want that card".

  • @Jeff-bn9nd
    @Jeff-bn9nd 1 year ago +13

    That 4060Ti box is EPIC

  • @tabletaccountforyoutube
    @tabletaccountforyoutube 1 year ago +6

    Nvidia doesn't seem to think anyone wouldn't want to use DLSS 3 frame generation. I don't. I don't care about fake frames: I want my game to BE smoother, not just LOOK smoother, and I want the full reduced-input-latency benefit of real frames. These things aren't comparable AT ALL in my opinion.

  • @corndog9482
    @corndog9482 1 year ago +4

    Been a PC gamer since the 90's, when entry-level cards were between $150 - $250. $400 - $500 for '60-class' entry-level cards?????!!!! Woooooooooooooow, seriously? 🤡🤡🤡🤡

  • @Jackson-ol5xt
    @Jackson-ol5xt 1 year ago

    As a recent graduate with a computer engineering degree, I appreciate the cache explanation. Very informative and interesting to understand design decision wise. I’d also like to provide some further insight.
    One thing to note is the miss time for L1 cache is the hit time for L2, and the miss time for L2 is the hit for L3, or whatever the next layer is. Generally speaking, a larger L2 cache allows your processor more cycles computing because if there is a miss in L1, the hit time of L2 is significantly faster than the hit time of the next layer (in this case VRAM). Physical memory is often in the hit time of hundreds of cycles, whilst a good L2 hits in usually 2 cycles (if the L1 hits on request). If the L2 is missing a lot more, then your GPU is stuck waiting for a memory transaction in one of its thread warps and it cannot do anything with that set of data until it gets what it needs. That’s why having fast memory is so important.
    It’s also why AMD’s decision to greatly increase the L3 and L2 caches on their CPUs have had such a positive effect on gaming and other memory intensive applications.
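The level-by-level relationship described above is what the classic average-memory-access-time (AMAT) formula captures: each level's miss path is the next level's hit path. A minimal sketch with illustrative cycle counts (not measured GPU figures):

```python
def amat(hit_time: float, miss_rate: float, next_level_time: float) -> float:
    """Average access time for one cache level, in cycles."""
    return hit_time + miss_rate * next_level_time

VRAM = 300                                                  # hundreds of cycles
l2 = amat(hit_time=2, miss_rate=0.3, next_level_time=VRAM)  # 92.0 cycles
l1 = amat(hit_time=1, miss_rate=0.2, next_level_time=l2)
print(round(l1, 1))                                         # 19.4 cycles on average

# Double the L2 miss rate (e.g. a much smaller L2) and the average balloons:
l2_bad = amat(2, 0.6, VRAM)                                 # 182.0 cycles
print(round(amat(1, 0.2, l2_bad), 1))                       # 37.4 cycles
```

This is why a bigger L2 (lower L2 miss rate) can partly compensate for a slower or narrower VRAM path, as discussed in the video.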

  • @MemeticsX
    @MemeticsX 1 year ago

    The educational content is always welcome. Better to err on the side of providing more context rather than less; people can skip past it if they want, but they can't view it if it's not there.

  • @nicholasthesilly
    @nicholasthesilly 1 year ago +19

    $400 is too steep for a card with 8 gigs of memory in 2023.
    $500 is too steep for a card with this level of performance in 2023.
    They should have just made the alternate version 12gb and priced it at $450 (at most.) Going from 12 to 16 is much less useful than 8 to 12, and at this pricing, it makes more sense to just spring for a 4070, even though that is still kinda overpriced.
    At least they are giving their GPUs a sizable amount of cache now. I do like that decision.

    • @tjep2670
      @tjep2670 1 year ago

      My conspiracy theory is that the 4060 Ti 16GB is designed to push people into getting the 4070. I doubt they will even produce that many cards; they'll wait and see if any sell.

    • @krissman123
      @krissman123 1 year ago +2

      @@tjep2670 This is how all of the 4000 series cards have been (with the exception of the 4070). The 4080 makes the 4090 look better, the 4070 ti makes the 4080 look better and the 4060 ti makes the 4070 look better. But the easier way to explain all of this is that ALL of the cards are way overpriced which is why everything seems this way.

  • @kyleharder3654
    @kyleharder3654 1 year ago +4

    The margins on these cards must be insane

    • @karatwilight
      @karatwilight 1 year ago +1

      A few dollars per card for Nvidia. Less for partner models. Some partner models measure profit margins in cents per card, not dollars. nvidia has never considered the consumer segment to be a significant portion of their revenue. It's just a branding and bragging thing at this point, staying a step ahead of AMD in any technical sense. Beyond that Nvidia could eliminate consumer GPUs entirely and not see more than a minor blip in their profits. Odds are that cutting consumer-focused employees, support, and research divisions would actually improve their bottom line.

  • @davidgunther8428
    @davidgunther8428 1 year ago +7

    This is like the GTX 960 again: barely faster than the previous model with not much memory, but coming with double the memory later.

  • @michaeljaystaufferjoyce7235
    @michaeljaystaufferjoyce7235 1 year ago +2

    So $500 now gets you basically an entry-level card. When I started building computers, $500 would get you the best GPU on the market. I hate how inflated PC parts have gotten, especially in the GPU market.

    • @zdspider6778
      @zdspider6778 1 year ago

      There's something very weird about the GPU and motherboard markets... You used to buy an $80-100 motherboard and be happy with it, now you need like $400-500 for a motherboard that is MISSING features. The other components have stayed mostly the same, with DDR4 being pretty affordable.

  • @TR-kn1xx
    @TR-kn1xx 1 year ago

    On a slight (read "complete") side-note:
    My BMW has TC and ESP, and it's always on as default setting. It can also be set to a slightly more forgiving mode or even switched off altogether.
    I usually turn it off completely however, at least in town, during winter. The added fun of it is immeasurable in numbers.
    Still, it's really interesting to compare to how other cars behave with all systems off. The handling characteristics and "temperament" of the car really comes through with all systems off - a longer, front-heavy car with FWD and a turbocharger is completely different to an RWD car with N/A engine, a shorter wheelbase and a neutral weight distribution.
    ESP and TC will mask a car's poor basic setup...
    So, to sum up:
    An RWD car with N/A will have the most natural feel to it, all other things equal
    It's really important to keep testing every graphics card's raw performance as best you can. HUB's statement on the matter explains this well...
    ...though like Steve GN said, showing the ESP/TC numbers as well isn't a bad thing, but it's got to be abundantly clear for a fair comparison.

  • @E_Sunbro
    @E_Sunbro 1 year ago +13

    And here I thought they would learn from the 3060 12GB confusion. Nope.

  • @ShaneMcGrath.
    @ShaneMcGrath. 1 year ago +4

    Hey Steve, might want to check out NorthridgeFix.
    He has been getting quite a few 4090's coming in for repairs; the connectors are still melting even with the CableMod adapter.
    Seems like it isn't the users' fault. I think they (the 4090's) should all be recalled.

    • @argonzeit
      @argonzeit 1 year ago

      When a user is afraid that they are going to break the most expensive part in their PC because of how hard they have to push a connector in, that's not user error that's a design flaw.

  • @Alt3mediagroup
    @Alt3mediagroup 1 year ago +9

    I believe without DLSS 3 Nvidia wouldn't be winning; their newer cards seem to lag behind without DLSS turned on. It's quite depressing that they need AI to make it look better.
    Love all the videos and information

  • @WarrenLeggatt
    @WarrenLeggatt 1 year ago +2

    My previous card was an RX 480 8GB that lasted years and proved a far better bet than the 1060 in the long run; now I have a 3060. I just can't wrap my head around releasing these cards with 8GB, less than the previous generation, in the current world of high VRAM usage. I know the 12GB on the 3060 was sort of forced on them, as people would have laughed at 6GB again on the 60 class at that point.
    I sort of feel my 3060 will outlast the 4060; the latter will soon suffer VRAM limitations even at 1080p. I have managed to get Skyrim to eat 11GB of VRAM with mods in the past lol
    This is a backwards move driven by greed and nothing else really

  • @Olivyay
    @Olivyay 1 year ago +1

    Traction control in cars also doesn't slow throttle response or introduce steering lag, and doesn't introduce glitches in the suspension.

  • @headbreakable
    @headbreakable 1 year ago +5

    DLSS 2 and 3 are two different technologies game devs need to implement, both exclusive to Nvidia, and one belongs only to 40-series GPUs. Yeah, sounds like a lot of work for very few players on PC if you ask me.

    • @Voyajer.
      @Voyajer. 1 year ago

      Especially when factoring in the effort to implement dlss3 just to make your game look wonky.

  • @kvahlman
    @kvahlman 1 year ago +16

    I agree about having all the bars, but I'm kind of confused how the part of the slide that HAD games without FG support was ignored completely. After all, the only "unfair" advantage DLSS 3 offers IS the FG.

    • @winebartender6653
      @winebartender6653 1 year ago +3

      Because the point he is making is that the ones that DO have FG can also just disable FG to actually see the difference between it being enabled and disabled.
      As he explained, it's just dandy that FG is a thing, but it is only in select titles, and a lot of people do not like using it (myself included; I own a 4090 for reference) as the graphical issues are quite noticeable.
      To steal Nvidia's car analogy: it's like having a sport mode that unlocks 30% more horsepower but ONLY at a race track, and then advertising the car as having that maximum horsepower.

    • @kvahlman
      @kvahlman 1 year ago +2

      @@winebartender6653 I know the point, it was made very clearly. What I was looking for was an ADDITIONAL comment about the comparisons shown which did not have this issue. The commentary made it seem there was none, which wasn't the case. But to be honest the manufacturer charts can/should be largely ignored anyway so no great loss there. Just a little weird.

    • @GamersNexus
      @GamersNexus 1 year ago +2

      We noticed it. The problem with it was just that they largely ignored it in the spiel/presentation to us -- they were pretty much only focused on the DLSS3 titles, which meant we were focused on those. If equal time had been given to both -- sure.

  • @andrewhudson5826
    @andrewhudson5826 1 year ago +4

    The tutorials were great, I love learning about these things. More please!

  • @micheledemurtas5233
    @micheledemurtas5233 1 year ago

    6:44 "Now you know how to do the formula" Thanks Steve

  • @peepweep01
    @peepweep01 1 year ago +3

    Interesting to learn how all of this works; I'd love to see more in-depth explanations of how each component functions in relation to the others. It would also be beneficial to learn more about how each set of components behaves when faced with different scenarios. Keep up the great work.

  • @RAGERPLAYS
    @RAGERPLAYS 1 year ago +8

    Absolutely appreciate the educational bits of this video; always appreciate additional insight. And a nice diagram as well!

  • @MayaPosch
    @MayaPosch 1 year ago +7

    Announcements like these make me more excited for the new AMD cards and a possible Intel Alchemist+ refresh.

  • @DisheveledHuman
    @DisheveledHuman 1 year ago +4

    So I'm guessing you get more VRAM but can also expect more texture pop-in, because they cheated the VRAM in at the cost of memory access speed.

  • @rojovision
    @rojovision 1 year ago +1

    You have traction control in multiple states until you realize that you forgot to pay the multistate traction control subscription fee.

  • @Search4TruthReality
    @Search4TruthReality 1 year ago

    I appreciate seeing "the basics". Please continue to share. Thank you.

  • @BlackJesus8463
    @BlackJesus8463 1 year ago +4

    1% lows clearly indicate the cache isn't good enough for 8GB cards. 😂😂✌

  • @LordApophis100
    @LordApophis100 1 year ago +5

    For cache, not only die area is relevant, but also access times. The bigger the cache, the slower it is; that's why there are multiple levels of cache instead of just one big cache. Adding another 1 ns of latency will kill all the performance increase at some point.

  • @zivzulander
    @zivzulander 1 year ago +37

    The 16GB 4060 Ti is the only card thus far that might be _almost_ reasonable from Nvidia this gen. "Almost reasonable" because even accounting for inflation it's priced more like a *70 class card.

    • @Richard-tj1yh
      @Richard-tj1yh 1 year ago +16

      It's still an overpriced scam

    • @SpoonHurler
      @SpoonHurler 1 year ago +3

      I think the 4090 is reasonable for what it is, but beyond that I agree with you... I want to see how well the 4060 Ti 16GB handles 1440p before I say it's reasonable though... because having a 50 fps 1% low with a 55 fps average is not worth 500 USD imho (I'm making up random numbers btw)

    • @Richard-tj1yh
      @Richard-tj1yh 1 year ago +10

      128 bit bus lmao

    • @JoeL-xk6bo
      @JoeL-xk6bo 1 year ago +7

      3070 performance for the same price in 2023 is reasonable? LMAO

    • @sihamhamda47
      @sihamhamda47 1 year ago

      @@Richard-tj1yh bruh, that could be a HUGE bottleneck

  • @cracklingice
    @cracklingice 1 year ago +1

    Dear Nvidia: traction control doesn't lead to a lesser experience like DLSS does (unless you intend to create smoke). Nor is it required to get sufficient performance because the car company decided to replace the engine with one from a tier lower and make up for it in software. We all know the 4070 is 60-class silicon and these new cards are 50-class silicon.

  • @shanez1215
    @shanez1215 1 year ago +18

    Just saw on their site that the 4060 is confirmed at $299. They may have updated it very recently.
    I'm genuinely surprised that an Ada card is launching for cheaper than Ampere.

    • @333toxin
      @333toxin 1 year ago +8

      Looking at the die size and die class in general,
      it's basically a 4050 🤷🏻‍♂️

    • @Antagon666
      @Antagon666 1 year ago +8

      Because it's a 4050 with an upscaled model number.

    • @MrAlBester
      @MrAlBester 1 year ago +1

      Still DOA with the 6700 10GB ($280)

    • @mchonkler7225
      @mchonkler7225 1 year ago +2

      It's the same rasterization performance for $30 cheaper, 2 years later 😂.

    • @zdspider6778
      @zdspider6778 1 year ago +6

      It's a 50-class card in disguise. It uses the AD107 chip, and 107 has always been 50-class silicon. They are taking the piss. A 50-class card shouldn't cost more than $119-149.

  • @Lord_of_Dread
    @Lord_of_Dread 1 year ago +3

    I appreciated the stuff around the effective bandwidth. I've been waiting to buy a new GPU for VR since 2019, and it seems every card that has come out has a fatal flaw that stops it being useful, last gen it was both price/availability (scalpers) and power draw. This time around, the 4070; the card I was expecting to be 'the one', only came out with 12GB VRAM, and a price way higher than 12GB warrants. I still want 16GB because most of the games I play are modded (the best VR games are all heavily modded), but its been useful to understand how the lower memory bandwidth and smaller memory bus might not be as brutal on VR as I was expecting due to the increased cache (VR is kinda analogous to 4k as I understand it).

    • @shiinc0
      @shiinc0 1 year ago

      A 6950 XT or a 6800 XT should work very well for you

    • @Lord_of_Dread
      @Lord_of_Dread 1 year ago

      @@shiinc0 The fatal flaw with those is power draw though. I am hoping AMD can bring a 16GB card with that level of performance, better efficiency and price

  • @TheNerdy1
    @TheNerdy1 1 year ago +48

    Thanks for always covering the news and interesting topics! Keep up the awesome work!

    • @rawdez_
      @rawdez_ 1 year ago +1

      Steve should ask Dr. Ian Cutress to calculate ngreedia's die prices for him, then look at ngreedia's billions in profits, and remember that mining is dead and GPUs AREN'T MONEY PRINTING MACHINES ANYMORE but are priced as such. ALL GPUs are AT LEAST 2-3 times OVERPRICED, ngreedia's margins are insane, ngreedia's profits are insane. And he keeps talking about "costs to make better silicon and passing it to the consumer".
      c'mon, Steve. he should be WAY smarter than repeating ngreedia's shilling points about why their crap is OVERPRICED AF.
      NOBODY SHOULD BUY ANY OF THAT OVERPRICED AF CRAP EVER AND IT SHOULD ROT ON SHELVES until ngreedia and ayyymd go bankrupt with their 2-3 times overpriced GPUs.
      during last 5 years, 1000-3000 ngreedia GPUs were money printing machines, literally. now they're still priced like they are money printing machines BUT THEY ARE NOT capable of making money anymore.
      prices should drop to pre-mining levels and be at least x2-3 times LOWER, because they make 0 sense without mining.
      after that we would finally get some decent progress in gaming and in gaming hardware. currently we have almost 0 progress in 6-7 years.
      and let me remind you that even with pre-mining prices GPUs never were cheap and ngreedia made BILLIONS IN PROFITS even then. now they just make even more billions with insane margins.

    • @rawdez_
      @rawdez_ 1 year ago +2

      a 4090 die (608mm2) costs 300 bucks MAX to make according to wafer calcs and 2-year-old TSMC prices per wafer; it's more like 200-250 bucks now. the 300-bucks number from the calcs is too high because it's based on 2-year-old prices, ngreedia cut TSMC orders = they are getting better prices so they don't cut even more orders, + on the same wafer where the big dies don't fit you can make smaller dies like a 4080, a 4070 Ti etc., which ngreedia gets "for free" - if you account for them, unlike the wafer calcs do, they lower the cost of a wafer significantly.
      $1600-$2000 for a less than 300 bucks 4090 die (more likely $200-250)? just lol. I understand if they'd want 800 bucks for that, max. it still would be at least 200 bucks overpriced.
      a 4080 die 379mm2 costs 150 bucks MAX - again without accounting for smaller dies that are made "for free" on the same wafer and other things. $1200+ for that?
      and ngreedia is trying to sell 60-class cut-down silicon (295mm2) that's worth around 50 bucks as the 4070 Ti for 800 bucks with 12gb VRAM; they even tried to sell it like 80-class silicon for 900 bucks at first. 12gb VRAM on 800-900+ bucks cards, Carl! 12gb VRAM... in 2023... on 800-900+ bucks GPUs!!!111
      12gb VRAM is fine for 3060 performance level, a 4070tie should have at least 16GB or 24Gb as a 3090. otherwise its a morally obsolete GPU, especially for 800-900+ bucks.
      the 4070 Ti is basically a morally obsolete piece of crap suitable only for either extreme-budget 1080p monitors or for old games @1440p.
      and ngreedia wants 800-900+ bucks for that ridiculous piece of crap GPU with 12GB VRAM. The 4070 Ti sucks at higher resolutions as it has less memory bandwidth than even a 3070 Ti. It's an extremely crippled card.
      the RTX 4070 Ti's main selling point is to milk dumb boiz' wallets with a stupid overpriced AF GPU with low VRAM that's already obsolete because it can't run shit natively without upscaling and glitchy fake frames.
      DLSS 3 is glitchy fake frames that don't improve shit but sell morally obsolete hardware that can't run games natively to stupid ngreedia fanboiz.
      7900xtx 529mm2 and of those 529 only 300 are actually the GPU, the rest are memory controllers on 6nm and those are CHEAP. AyyMD's GPUs are WAY cheaper to make than ngreedia's overpriced AF crap.
      ngreedia is killing PC gaming on purpose to sell its crap laggy af GaaS subscription for 20 bucks/month.
      that's why it releases crap slow overpriced AF morally obsolete GPUs with low VRAM that make no sense.
      and AMD does exactly the same - killing PC gaming on purpose to sell more morally obsolete APUs to sony/ms for their consoles.
      + both corporations milk the market with overpriced AF GPUs
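The "wafer calc" arithmetic the comment above leans on can be sketched in a few lines. Everything below is an illustrative assumption (wafer price, defect density, die size rounding), not TSMC's or NVIDIA's actual numbers, and it ignores scribe lines, edge exclusion, and partial-die harvesting:

```python
# Rough sketch of the per-die cost arithmetic a "wafer calculator" does.
# All inputs are illustrative assumptions, not real foundry pricing.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic candidate-dies-per-wafer approximation."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yielded_die_cost(die_area_mm2: float, wafer_cost_usd: float,
                     defect_density_per_cm2: float = 0.1) -> float:
    """Wafer cost divided by good dies, using a Poisson yield model."""
    yield_fraction = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100)
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return wafer_cost_usd / good_dies

# Hypothetical: a 608 mm^2 die (AD102-sized) on a $17,000 wafer
print(f"${yielded_die_cost(608, 17000):.0f} per good die")
```

Note that die cost is only one input to a card's bill of materials; VRAM, VRM, PCB, cooler, assembly, R&D, and margin all sit on top of it.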

    • @AIChameleonMusic
      @AIChameleonMusic 1 year ago +1

      Will second that ty for the comment Nerdy1

    • @mjc0961
      @mjc0961 1 year ago

      @@rawdez_ I ain't reading all that. I'm happy for u tho. Or sorry that happened.

    • @rawdez_
      @rawdez_ 1 year ago

      @@mjc0961 sorry it's too many letters for you, mate. you are a perfect buyer/customer.

  • @rustler08
    @rustler08 1 year ago +2

    Ah yes. Either get a used 3080 with 1-2 years of remaining warranty or a 4060 Ti that's slower for the same price, but at least it has VRAM!

  • @MTGOFerret
    @MTGOFerret 1 year ago +1

    To use NVIDIA's traction control argument: in ultra-high-horsepower drag race scenarios you actually DO turn it off so it doesn't limit the performance of the vehicle. So yes, in a car it would make sense to test both.

  • @vlastimil-furst
    @vlastimil-furst 1 year ago +30

    It is interesting how AMD is generally expected to be the one copying nVidia's trends and introducing them one generation later.
    But this trend with lower bandwidth and more cache is what AMD introduced in the Radeon RX 6000 family. And nVidia clearly sees some advantage in it, too.

    • @benjaminoechsli1941
      @benjaminoechsli1941 1 year ago

      I bet it's because AMD was the first to the pricey TSMC nodes. Nvidia probably went, "dang, we're paying a lot more for this stuff than Samsung's... offerings. How does AMD save a buck?"

    • @coryray8436
      @coryray8436 1 year ago

      Yep. They introduced the 6600XT which is basically a 5700XT at the same price with less memory bandwidth.

    • @vlastimil-furst
      @vlastimil-furst 1 year ago +1

      @@coryray8436 Well, that was during a crypto rush. Now that the rush is over, the pricing of said card is very reasonable, and it's hard to argue for nVidia from a value point of view unless you value their software and their proprietary features.
      Anyway, it's nice to see that the 6600 XT slightly outperforms the 5700 XT with a 4/5 core config and half the memory lanes (bandwidth is lowered a little less due to an increase in VRAM clocks).

    • @coryray8436
      @coryray8436 1 year ago +1

      @@vlastimil-furst AMD, unlike Nvidia, is willing to let prices fall and purge old inventory. I know it's more complicated with 3rd parties involved, but y'know what I mean. I can foresee there being lots of 30 series on shelves for quite a while aging like fish.

    • @vlastimil-furst
      @vlastimil-furst 1 year ago

      @@coryray8436 And I like their business model when it comes to CPUs, too. The Ryzen 5 3600 CPUs they sell dirt cheap; I think they're just a byproduct of Zen 2 Epyc CPUs, silicon not worthy of being in an Epyc product. So, they just made it their low end, costing below $100.

  • @davyrando1203
    @davyrando1203 1 year ago +4

    I'm actually quite curious about this card - the increased cache size could offset the lower memory bandwidth, and it might punch way above its weight in games with lower graphical/texture fidelity, such as competitive shooters, small game spaces/arenas, and e-sports titles.
    That price though... we'll see what happens.
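The idea that a bigger cache can stand in for a wider bus can be expressed as a toy model: if a larger L2 raises the hit rate, DRAM only has to serve the misses, so the memory system behaves as if it were faster than its raw spec. The hit-rate values below are made-up illustrative numbers, not figures NVIDIA has published:

```python
# Toy model of "effective" bandwidth from a larger L2 cache.
# DRAM only serves cache misses, so halving the miss rate roughly
# doubles the bandwidth the memory system appears to deliver.

def effective_bandwidth(raw_bw_gbs: float, old_hit_rate: float,
                        new_hit_rate: float) -> float:
    """Apparent bandwidth relative to a baseline cache, assuming
    performance is dominated by miss traffic."""
    return raw_bw_gbs * (1 - old_hit_rate) / (1 - new_hit_rate)

# Hypothetical: 288 GB/s raw (4060 Ti) with the hit rate rising 30% -> 65%
print(f"{effective_bandwidth(288, 0.30, 0.65):.0f} GB/s effective")
```

The catch, as the video notes, is that hit rate falls as working sets grow (higher resolutions, bigger textures), so the "effective" figure is workload-dependent rather than a fixed property of the card.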

    • @lukilladog
      @lukilladog 1 year ago +1

      Bro, games use textures up to 80 MB in size; the cache is there for bandwidth.

    • @hsanrb
      @hsanrb 1 year ago +1

      Unless you think "above its weight" is 400+ FPS, it will probably be irrelevant next to cards already on the market. To Nvidia (and AMD with the 7600) this is just providing a new card that plays those low-end games for people who do not have a card that's come out in the past 10 years. The "I need a new gfx card but I don't need anything fancy" people.

    • @wytfish4855
      @wytfish4855 1 year ago

      most games that would "stoop so low" would've run plenty fine on current hardware already though.

  • @fanwu3628
    @fanwu3628 1 year ago +3

    I love the technical breakdown, I'm fairly new to learning about PC parts so this definitely helped in regards to model comparisons.

  • @stevemaxwell5559
    @stevemaxwell5559 1 year ago

    Part of the reason I watch you is because I learn stuff. Even if I'm not likely to be so concentrated as to take absolutely everything in, some of it sticks.
    It's also why I will watch your stuff on stuff that I'm not interested in.
    So, keep the educational bits in is my vote.

  • @InternetListener
    @InternetListener 1 year ago

    5:32 Can you order a customized card through an online configurator, just like if you were buying a car or an Apple product?
    -Pre-built RTX $300 mini: 4060, 15 Tflop 3.0 shaders, 35 Tflop 4.0 RT cores, 242 Tflop 4.0 Tensor cores, DLSS 3, 8 GB buffer, 24 MB cache, 272 GB/s VRAM BW (but like 453), TGP: 7-115W
    -Pre-built RTX $400 pro: 4060 Ti, 22 Tflop 3.0 shaders, 51 Tflop 4.0 RT cores, 353 Tflop 4.0 Tensor cores, DLSS 3, 8 GB buffer, 32 MB cache, 288 GB/s VRAM BW (but like 554), TGP: 7-160W
    -Pre-built RTX $500 max: 4060 Ti, 22 Tflop 3.0 shaders, 51 Tflop 4.0 RT cores, 353 Tflop 4.0 Tensor cores, DLSS 3, 16 GB buffer, 32 MB cache, 288 GB/s VRAM BW (but like 554), TGP: 7-165W
    -Pre-built RTX $1600 ultra: 4090, 83 Tflop 3.0 shaders, 191 Tflop 4.0 RT cores, 1320 Tflop 4.0 Tensor cores, DLSS 3, 24 GB buffer, 96 MB cache, 1008 GB/s VRAM BW (but like 2016), TGP: 21-450W
    Just wondering:
    -Custom RTX below $60000 Titanic Ti: Call me RTX 40, 30 Tflop 3.0 shaders, 191 Tflop 4.0 RT cores, 1320 Tflop 4.0 Tensor cores, DLSS 3, 32 GB buffer, 96 MB cache, 1008 GB/s VRAM BW (but like 2016), TGP: 7-225W

  • @joerussell9574
    @joerussell9574 1 year ago +5

    I like the inclusion of the algebraic formulae: those of us who suck at math can easily use outside calculation tools, since the formula is known and we don't have to rely on sketchy marketing shills for (mis)information! Thanks GN. These cards look like a miss. I am definitely going all-AMD on my next gaming laptop; I am so happy with my Zen 3 at the moment, and in the future I want all-AMD compared to my current AMD/NGREEDIA machine with a dedicated 1650 (refit with GDDR6). It serves my gaming needs since I play older games, and at $500 at the time last year it was a good deal for me.

  • @dcarpenter85
    @dcarpenter85 1 year ago +12

    As someone who owns and uses a 3060 Ti on a daily basis, DO NOT buy the 8gb 4060 Ti. Please, for the love of all things good and righteous, do not buy the 4060Ti 8gb. Don't do it. Just don't.

    • @existentialselkath1264
      @existentialselkath1264 1 year ago +2

      Even my 2060 super was limited by its 8gb in games when it launched!
      Unless you're the type to play at lowest settings but the highest framerate possible, none of the extra power of the 3060ti and 4060 8gb will mean anything

    • @SweatyFeetGirl
      @SweatyFeetGirl 1 year ago +1

      people will still do it. people bought the 1050 Ti, which is 70% slower than the RX 470, for the same price

    • @fnorgen
      @fnorgen 1 year ago +2

      I gamed on a 3070 for a while. The performance was good... until it ran out of VRAM, which happened a little too often for my liking. This was back in 2021, so I expect the problem would be even worse these days. Even 12 gigs feels a little tight at times, but at least it's still tolerable.

    • @Megneous
      @Megneous 1 year ago

      This. If we're going to get ripped off, we might as well get a decent amount of vram while getting ripped off, you know?

    • @MoultrieGeek
      @MoultrieGeek 1 year ago +1

      Agreed, I went with the Radeon 6750 XT over the 3060 Ti simply because of the higher VRAM. Depending on the game, performance is roughly similar (slight edge to AMD), but it's 12GB vs 8. Easy choice for me.

  • @baroncalamityplus
    @baroncalamityplus 1 year ago +9

    The 4070 Ti should have shipped in the 4080 box with blue tape covering the "80" and "70 Ti" written on it in pen, like Steve has with the 4060 Ti here.

  • @deandowling
    @deandowling 1 year ago +1

    Long time viewer who already knows about cache and whatnot.
    I enjoyed the educational bit - you folks always seem to strike a great balance between technical info and lay people understanding, so it was great to relearn it.
    Like you mentioned, people can always skip over it if they don't want to see it, and the information helps inform the online debate that inevitably arises from these types of launches.

  • @brucethen
    @brucethen 1 year ago +1

    A 128-bit bus also greatly reduces board complexity and the cost of routing traces

  • @WelcomeToDERPLAND
    @WelcomeToDERPLAND 1 year ago +4

    This gen is definitely a skip. Hopefully gen 5 offers better performance per dollar, because this entire gen has offered barely any gains over last gen at the same price, or even double the price. Absurd.

  • @puregarbage2329
    @puregarbage2329 1 year ago +3

    Those prices on the 4060 ti are insane 😂

  • @stephanhart9941
    @stephanhart9941 1 year ago +7

    Wow the 4050!! Oh 4060.😢

  • @vigorgaming8
    @vigorgaming8 1 year ago

    I love the part with the formula for calculating video memory bandwidth
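The formula from that segment is simple enough to check by hand: peak bandwidth in GB/s is the per-pin data rate (Gbps) times the bus width (bits), divided by 8 bits per byte. A minimal sketch, using the published 4060 Ti and 4090 memory configurations:

```python
# Peak VRAM bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8.
# 18 Gbps / 128-bit and 21 Gbps / 384-bit are the published specs for the
# RTX 4060 Ti and RTX 4090 respectively.

def memory_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(memory_bandwidth_gbs(18, 128))   # → 288.0 GB/s (4060 Ti)
print(memory_bandwidth_gbs(21, 384))   # → 1008.0 GB/s (4090)
```

This is the raw figure; NVIDIA's larger "effective" numbers fold in assumed L2 cache hit rates on top of it.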

  • @adams071
    @adams071 1 year ago

    It could be a fun side project for GN's 3D animation work: a 3D animation of how current computers operate. Kinda reminds me of an old computer museum where you literally walk through a giant computer. Forgot the name of the place, but it was so awesome.

  • @Z0eff
    @Z0eff 1 year ago +7

    13:23 *technically* nvidia does show a few titles here without DLSS, which is a more apples-to-apples comparison; it's the same +15% uplift described in another slide.
    Gotta give the devil a tiny bit of credit when it makes the data somewhat useful ;p