
Why The RTX 4060 Ti Sucks, Really Bad! GeForce RTX 4060 Ti vs. RTX 3060 Ti, 40 Game Benchmark

  • Published: 15 Aug 2024
  • ThermalTake CTE C750 Case - bit.ly/Hardwar...
    Support us on Patreon: / hardwareunboxed
    Join us on Floatplane: www.floatplane...
    Buy relevant products from Amazon, Newegg and others below:
    GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
    GeForce RTX 4090 - geni.us/puJry
    GeForce RTX 4080 - geni.us/wpg4zl
    GeForce RTX 4070 Ti - geni.us/AVijBg
    GeForce RTX 4070 - geni.us/8dn6Bt
    GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
    GeForce RTX 4060 - geni.us/7QKyyLM
    GeForce RTX 3070 - geni.us/Kfso1
    GeForce RTX 3060 Ti - geni.us/yqtTGn3
    GeForce RTX 3060 - geni.us/MQT2VG
    GeForce RTX 3050 - geni.us/fF9YeC
    Radeon RX 7900 XTX - geni.us/OKTo
    Radeon RX 7900 XT - geni.us/iMi32
    Radeon RX 7800 XT - geni.us/Jagv
    Radeon RX 7700 XT - geni.us/vzzndOB
    Radeon RX 7600 - geni.us/j2BgwXv
    Radeon RX 6950 XT - geni.us/nasW
    Radeon RX 6800 XT - geni.us/yxrJUJm
    Radeon RX 6800 - geni.us/Ps1fpex
    Radeon RX 6750 XT - geni.us/53sUN7
    Radeon RX 6700 XT - geni.us/3b7PJub
    Radeon RX 6650 XT - geni.us/8Awx3
    Radeon RX 6600 XT - geni.us/aPMwG
    Radeon RX 6600 - geni.us/cCrY
    00:00 - Welcome back to Hardware Unboxed
    00:24 - Ad Spot
    01:05 - Why is the RTX 4060Ti so bad?
    04:29 - Test System Specs
    04:50 - Star Wars Jedi Survivor
    05:14 - Cyberpunk 2077 [RT]
    05:37 - Cyberpunk 2077
    05:53 - Call of Duty Modern Warfare 2 [Best Case]
    06:30 - Control
    06:48 - Doom: Eternal
    07:24 - Resident Evil 4 [RT]
    07:44 - Hogwarts Legacy
    08:07 - For Honor
    08:23 - Fortnite [DX11]
    08:38 - The Callisto Protocol
    08:54 - Hunt Showdown
    09:07 - Far Cry 6 [RT]
    09:19 - A Plague Tale Requiem
    09:32 - The Last of Us Part 1
    10:05 - 1080p Average
    11:02 - 1440p Average
    11:42 - Final Thoughts
    Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
    Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
    FOLLOW US IN THESE PLACES FOR UPDATES
    Twitter - / hardwareunboxed
    Facebook - / hardwareunboxed
    Instagram - / hardwareunboxed
    Outro music by David Vonk/DaJaVo

Comments • 2.9K

  • @MaunoMattila
    @MaunoMattila Год назад +621

    These kinds of reviews just expose how poor YouTube reviews can be. The 4060 Ti is the best GPU to get for $400, and here's why.
    + DLSS 3 capability is the future
    + Strong Ray Tracing performance
    + Excellent 1080p performance (100 fps on average at Tom's Hardware)
    + Competitive 1440p performance (75 fps on average at Tom's Hardware)
    + Reliable 8-pin power adapter
    + Low power consumption (160W TDP)
    + HDMI 2.1 Connector
    + Compact physical size
    + New, launched on May 24th, 2023
    You can run any 2-5 year old game at 90+ FPS in FullHD/QHD, and you can run the latest AAA games with DLSS and ray tracing on. In future games you can use the new tech like AI support, ray tracing, DLSS 3, etc. A very competitive product, and I truly cannot understand why anybody would buy an AMD GPU. Most probably Intel will replace AMD in the GPU market in the coming years.

    • @Hardwareunboxed
      @Hardwareunboxed  Год назад +1987

      Just Buy It!

    • @swagheel
      @swagheel Год назад +794

      Ohh bro about to get cooked!! 🔥

    • @mariodrv
      @mariodrv Год назад +253

      Suuure...

    • @Cuddlesworth220
      @Cuddlesworth220 Год назад +463

      Not even worth the time

    • @andyk2181
      @andyk2181 Год назад +2056

      I like how a reliable power adapter is now a selling point

  • @HighYield
    @HighYield Год назад +1180

    Nvidia basically dropped the entire mid-range & entry-level SKUs down one GPU tier. Now the RTX 4060 is based on AD107, which always was a 50-class chip, and the 4060 Ti is based on AD106, which in the past was used for the non-Ti 60-class. But they can't hide that fact from your reviews.

    • @therealsunnyk
      @therealsunnyk Год назад +18

      I'd be interested to see this compared with the 3060 (non-Ti) for this reason. Yes, the price/performance is the same, but what could NVidia's current gen be giving us?

    • @ironman-fg3jh
      @ironman-fg3jh Год назад +34

      @@therealsunnyk it's just so they can clear the 30 series stock

    • @IVPixel
      @IVPixel Год назад +62

      Yeah but muh AI muh framegen

    • @MrGrzegorzD
      @MrGrzegorzD Год назад +15

      This. Since the 4080, everything is scummily named. And I am not an NV fanboy, nor an AMD one given their current practices. I do own a 6800 XT (son's PC) and a 4090 myself...

    • @Zerion
      @Zerion Год назад +9

      I bet they will phase out all mid and low range cards, and their future products would only consist of 80 and 90 class.

  • @lesliestanfield2733
    @lesliestanfield2733 Год назад +473

    Look at the die size, the 4060 Ti is the successor to the 3050 mobile.

    • @djchotus1
      @djchotus1 Год назад +11

      That's not the whole story. One is Samsung silicon: huge and power hungry. The other is TSMC: smaller, more powerful and more efficient. It's not the same thing, apples to oranges.

    • @lmaoded4854
      @lmaoded4854 Год назад +68

      @@djchotus1 The 3060m is more power efficient than the 4050m

    • @TheFriendlyInvader
      @TheFriendlyInvader Год назад +16

      @@djchotus1 This isn't how generational die tape-outs work...

    • @stoneymahoney9106
      @stoneymahoney9106 Год назад +9

      @@djchotus1 That's not how die binning works. All the good quality dies capable of running at the lowest voltages go into pro cards tuned for best efficiency and get sold at a premium to data centres with deep pockets, and gamers get the leftover crap running at speeds that require voltages way above the efficiency sweet spot, with half the VRAM and double the power consumption for 10% more performance.

    • @toaster_bloke9999
      @toaster_bloke9999 Год назад +2

      @@lmaoded4854 Why lie?
      The 4050m is 40% faster at the same power draw as the 3050m.
      And the 3000 series absolutely did have bigger dies at each model level compared to previous generations.

  • @middleclassthrash
    @middleclassthrash Год назад +80

    At this point, if you don't feel that your intelligence has been insulted by Nvidia, then you have been thoroughly brainwashed by their marketing department.

  • @BogdanM116
    @BogdanM116 Год назад +357

    As a 3060Ti owner, thanks Nvidia for keeping my GPU competitive so I can stretch it another 1-2 years before moving to AMD.

    • @paradigmshift7541
      @paradigmshift7541 Год назад +4

      and you'll be using way more power for those 1-2 years, it's like energy is irrelevant to people lmao

    • @BogdanM116
      @BogdanM116 Год назад +81

      @@paradigmshift7541 I undervolted it and it's using 180W, 20W more than a 4060Ti. Also the undervolt has improved the performance by about 2-3% so for 1440p (the res I'm using) it's basically a 4060Ti. Energy is relevant to me, but I don't think 20W extra power draw for another 1-2 years justifies spending 520 USD (That's how much a 4060Ti is in Europe), especially since it's basically the same performance. Y'all Nvidia fanboys need to find something else to convince us, talk about FG idk.

    • @kaitogg322
      @kaitogg322 Год назад +9

      @@paradigmshift7541 Yeah, I would love to save the $20 a year that a 4060 Ti would save over my current 3060 Ti, ty Nvidia

    • @BogdanM116
      @BogdanM116 Год назад +9

      @@Relex_92 You're right, I wasn't very clear in my comment. My intention was to say that Nvidia keeps my 3060Ti competitive with the 40-class equivalents. I know that now, with the RX 6700XT and RX 6800 the 3060Ti/3070 is not really competitive, only if you're the biggest fan ever of Ray Tracing. I am fully aware that 8gb of VRAM will be a serious issue for people playing 2024-2025 AAA games on this card. The thing is I'm not really excited for any newly announced games and probably won't be until GTA 6 or smth. I mainly play simulator stuff like Asseto Corsa, Automobilista, F1 with some story games like HZD and The Witcher when I'm bored. This card will be enough for me to enjoy the games I play at 1440p high settings for the next 1-2 years, can't say the same for everyone and that's why you're right. By the time I want new games that need more GPU muscle and VRAM, I will probably already be on an AMD GPU with 16 or 20gb of VRAM. Probably gonna buy the RX 7700XT/7800 long after release when you start to get those sweet deals.

    • @MrSolvalou
      @MrSolvalou Год назад

      Fingers crossed yours doesn't have the faulty Hynix memory

  • @nipa5961
    @nipa5961 Год назад +1458

    Nvidia never fails to disappoint.

    • @jorelldye4346
      @jorelldye4346 Год назад +56

      And we never fail to buy their shit.

    • @korinogaro
      @korinogaro Год назад +18

      Nvidia has far fewer fails to its name than AMD. The 30 series was awesome, with good pricing (the COVID BS messed it up). The 10 series was also incredible. 12 was ok. And you can go back into the past and at least two-thirds of the generations were really good, and half of the rest were at least ok. The 40 series is a disaster.

    • @logirex
      @logirex Год назад +5

      Come on it is 5-8% faster that is nothing to zzzzzzzzzzzzzzz
      and it has only been 30+ months since the 3060Ti launched 😂

    • @theplayerofus319
      @theplayerofus319 Год назад +9

      @@korinogaro What matters is what we pay, and the cards were way too expensive... AMD's 6000 series is more viable now because it's cheaper and has more VRAM with the same performance lol

    • @RjayjayC
      @RjayjayC Год назад +14

      AMD always fails to capitalize.

  • @Geulivier.Martinez
    @Geulivier.Martinez Год назад +440

    This product shouldn't have cost more than $250, and probably should have also been a 4050 Ti.

    • @corvoattano9303
      @corvoattano9303 Год назад +37

      My exact sentiments! And the 4060 should've been a 4050 at 170 USD.

    • @thelith
      @thelith Год назад +33

      Almost everything in the 40 series is marketed as a full tier higher card and priced accordingly. This is nothing new, unfortunately.

    • @bigmatth23
      @bigmatth23 Год назад +3

      You got it right

    • @delresearch5416
      @delresearch5416 Год назад

      It should have been a 4060 at most.

    • @chronossage
      @chronossage Год назад +3

      Even worse: back when Nvidia was still making two 4080 cards, this card would have been the 4070.

  • @edcoleman7019
    @edcoleman7019 Год назад +34

    I got a new 3060TI for $305 recently, and given all the info on this channel it feels like that's more than a reasonable price for the performance it delivers.

  • @hunterhearsthelmsy1
    @hunterhearsthelmsy1 10 месяцев назад +5

    Here in India the 4060 Ti on sale is 35k, the 4070 is 60k minimum, and the 3060 Ti is 33k minimum, so yeah, the 4060 Ti is a good deal. Prices vary a lot by country; AMD is more expensive here than Nvidia, so the 4060 Ti is definitely a good buy for people in India.

    • @TonyChan-eh3nz
      @TonyChan-eh3nz 10 месяцев назад

      How about Intel?

    • @hunterhearsthelmsy1
      @hunterhearsthelmsy1 10 месяцев назад

      @@TonyChan-eh3nz AMD chips are a bit cheaper but motherboards are too expensive.

    • @TonyChan-eh3nz
      @TonyChan-eh3nz 10 месяцев назад

      @@hunterhearsthelmsy1 I was also talking about GPU’s

    • @hunterhearsthelmsy1
      @hunterhearsthelmsy1 10 месяцев назад

      Arc 770 for 39k and 750 for 31k, a bit pricey as we can see

    • @sergiolandz6056
      @sergiolandz6056 8 месяцев назад

      @@hunterhearsthelmsy1 The problem with Intel right now is its wonky drivers; they still have a huge hill to climb to make them relevant. If you take the A770, the 3060 Ti is basically better than the A770... it's a sad state

  • @FeTiProductions
    @FeTiProductions Год назад +3110

    Use my comment to show NVIDIA how many of us hate the 4060 Ti

  • @avishekmitra3698
    @avishekmitra3698 Год назад +170

    1060 matches 980. 2060 matches 1080. 3060 matches 2070. 4060Ti fumbles against a 3060Ti. Someone is not trying at all.

    • @user-jt5ky9tp7d
      @user-jt5ky9tp7d Год назад +1

      Keep in mind that those graphics cards have a wider memory bus and more bandwidth than the 4060 Ti 😂😂😂

    • @lloydaran
      @lloydaran Год назад +58

      But but but! bUT! fRaMe GeNeRaTiOn! More FPS for free!!1!1

    • @Aegie
      @Aegie Год назад +30

      ​@@lloydaranyes aNd dLsS yeah evErYboDy wanTs tHAt

    • @J0derVIVIVI
      @J0derVIVIVI Год назад +1

      Oh, they are and have been, upselling low-tier cards as a tier higher for years; they just got caught this time.

    • @ekinteko
      @ekinteko Год назад +10

      1080 is NOT matched by 2060, but rather the 2070. The GTX-16 Series/Super Variants were okay (for the low-end/midrange), but the Vanilla RTX-20 Series was poor value all round. AMD did have the edge on value with the RX-5000 duo cards.
      The RTX-30 Series was actually not great, it was decent, it only seems great since the RTX-20 Series was so bad. Again, AMD had the edge on value with the RX-6000 cards.
      Same with the RTX-40 Series, it's actually not that bad. It only seems bad because the RTX-30 Series was decent. All Nvidia needs to do is shave $100 off the price of the midrange cards ($200 off the high-end), and it's looking good again. This time AMD does NOT have the edge on value; the RX-7000 cards are similarly disappointing and also need a substantial price drop.

  • @rnelson1415
    @rnelson1415 Год назад +109

    It's a shame Hardware Unboxed doesn't share Nvidia's vision for what games really want: more AI-generated frames than rendered frames, and 10 more years of 1080p.

    • @PrashantMishra-kh1xt
      @PrashantMishra-kh1xt Год назад +24

      And the irony is that Frame Generation uses even more VRAM on a poor 8 GB card. Gamers definitely want more 8 GB cards for 10 more years.

    • @superdave8248
      @superdave8248 Год назад +2

      LOL😆

    • @Lord_of_Dread
      @Lord_of_Dread Год назад

      TBF I'll probably still be on 1080p in ten years. My 3DTV is 1080p, and they don't make 3D monitors or TVs anymore, so any monitor or TV you can buy right now would be a massive downgrade.

    • @astronemir
      @astronemir Год назад +1

      My phone is 1080p. Why the fuck would I spend $300+ on a gpu to play games on a $20 monitor? You can literally get 1080p monitors for free 2nd hand

    • @Enmanuel248
      @Enmanuel248 Год назад

      @@astronemir There are 1080p monitors that cost up to $700-800. Just because you don't see the benefit of playing at 1080p on a fast TN panel doesn't mean other people don't lol.

  • @stephenkelly8312
    @stephenkelly8312 Год назад +126

    I can’t be the only person to switch from Nvidia to AMD (6000 series) when I saw what my options were from the green team. I’m interested to see what the steam hardware survey looks like this time next year.

    • @afelias
      @afelias Год назад +27

      I don't think the numbers will shift big on that scale. Steam will probably report that plenty of people will still use 1060s and 1650s. NVIDIA is, on the big numbers scale, just losing to NVIDIA of a half-decade ago. There's very little push to do much upgrading ever since 1080p reliable FPS was cracked all the way back then. Because NVIDIA of today is practically refusing to make 1440p reliable and affordable. Why else would their x60 card cost way more than it used to, but still target 1080p?

    • @Javious_Rex
      @Javious_Rex Год назад +28

      You're not, switched both gaming machines in my house. 2060 6GB to 6700XT and 3070ti to 7900XT. We should be good for a couple of generations. We don't ray trace and team red gives us the power to get all the detail and FPS we want without team greens "special features". No regrets here.

    • @termitreter6545
      @termitreter6545 Год назад +7

      I don't have a preference, but I actually would've preferred to try an Nvidia GPU again after two AMD cards (the latter had better price/performance). But Nvidia's GPUs this gen are just so bad/overpriced there's no point even trying.
      Ended up with a 6800 XT. Sometimes AMD drivers can be wonky, but they have gotten a lot better these days, and any issue can't be as bad as 8 or 12GB of VRAM with a slow memory bus. I just hate that memory stutter and screwed-up frametimes.

    • @Altrop
      @Altrop Год назад +15

      AMD currently holds 20% market share, last year it was 12%.
      So you're not the only one.
      RTX5000 better be really good with plenty of VRAM otherwise Nvidia will lose more gaming customers. On the other hand, their gaming revenue is nothing compared to their AI revenue so I wouldn't be surprised if they just keep going down this road.
      The RTX4080, a powerful chip with only 16GB VRAM that will 100% be bottlenecked by VRAM in 2024 games at high resolutions/settings, just dropped to $999.. it used to be over $1200.. absolutely bonkers when some 2023 AAA games already come close to using all of that 16GB VRAM. Despite being slower, I would choose a 20GB 7900XT over a 16GB RTX4080 if I could get either of them for free.
      That 20-24GB VRAM on AMD's 7900 cards will not go unused within a normal card's lifespan (4 years on average before people upgrade), it's not there for show lol.

    • @PineyJustice
      @PineyJustice Год назад +21

      The Steam survey isn't accurate; due to the way it works, it gives huge boosts to whatever is in use by gaming cafes: 1000 accounts on 1 PC with a 3060 and Steam counts it as 1000 3060s.

  • @ZZUtopia
    @ZZUtopia Год назад +430

    Would've much rather had 16GB on the 4070 ti

    • @julianB93
      @julianB93 Год назад +23

      With a 280W power target that would actually be a card I would buy. Well, maybe a tiny bit cheaper too.

    • @Eleganttf2
      @Eleganttf2 Год назад +17

      Yep, idfk what they were thinking tbh. I really hope mid next year they introduce a 4000 Super series to fix this horrible mess

    • @AlexFeCity46
      @AlexFeCity46 Год назад +19

      yes, and for 4080 then 20GB

    • @AndyViant
      @AndyViant Год назад +47

      Reality is the 4060 Ti doesn't have the grunt to need 16GB. 12 would have been heaps.
      Meanwhile the 4070 Ti looks like it will be limited by RAM going forward.

    • @syncmonism
      @syncmonism Год назад +35

      @@Eleganttf2 The first mid-range graphics card with 8GB of VRAM, the 8GB version of the RX 480, was released SEVEN years ago, for a mere 240 USD (about 305 USD if you adjust for inflation).
      It's RIDICULOUS that 8GB is still being used on 400 dollar cards SEVEN years after it became normal for a card in the 300 dollar range (after adjusting for inflation).
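
      A quick back-of-the-envelope check of that inflation figure, as a minimal Python sketch; the ~1.27 cumulative US CPI factor from mid-2016 to mid-2023 is an approximation, and the only card prices used are the ones quoted in the comment above.

        # Rough inflation check for the RX 480 8GB launch price quoted above.
        # Assumption: ~27% cumulative US CPI inflation between mid-2016 and mid-2023.
        CPI_FACTOR_2016_TO_2023 = 1.27

        def adjust_for_inflation(price_usd: float, factor: float = CPI_FACTOR_2016_TO_2023) -> float:
            """Express a 2016 USD price in roughly 2023 dollars."""
            return price_usd * factor

        rx480_8gb_msrp_2016 = 240  # USD, as quoted in the comment
        print(f"RX 480 8GB in 2023 dollars: ~${adjust_for_inflation(rx480_8gb_msrp_2016):.0f}")
        # -> ~$305, in line with the comment's figure, versus $399 MSRP for the 8GB RTX 4060 Ti.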

  • @mariodrv
    @mariodrv Год назад +164

    Just get a 6700XT or an RX 6800. Nothing else is even closely worth the money.

    • @AmirEpix
      @AmirEpix Год назад +27

      Me personally, I'm gonna buy an XFX RX 6900 XT Merc for only $400 in my region. It's a used card but as good as new

    • @Hector52441
      @Hector52441 Год назад +21

      @@AmirEpix
      6900XT for $400?
      What the Hell?

    • @Yerinjibbang
      @Yerinjibbang Год назад

      absolutely lol

    • @64bitmodels66
      @64bitmodels66 Год назад +42

      @@wiLdchiLd2k the only Nvidia cards that can match those 2 in performance are bottlenecked in VRAM and cost double that, and the 4000 series is unaffordable unless you're willing to burn money/have enough for a 4090. most people don't

    • @retakleksum111
      @retakleksum111 Год назад +26

      @@wiLdchiLd2k VRAM is the answer. Textures on medium or low are much worse than low fps.

  • @FIVESTRZ
    @FIVESTRZ Год назад +25

    Gives me the 11900K vs 10900K vibes for sure!

    • @JJFX-
      @JJFX- Год назад

      Well the difference here is that it isn't simply a refresh pushed to the limit. It would be if the 11th gen chips were ~50% smaller and the memory controller was essentially running everything in single channel.
      This is worse because it truly is a generational improvement but instead of passing that to us they cut it down until it performs the same as last gen to maximize profits. It would be extremely impressive what Nvidia has done if it wasn't so damn malicious.

    • @FIVESTRZ
      @FIVESTRZ Год назад

      @@JJFX- Without getting into architectural changes, I'm talking about how one generation you're getting 100 FPS in one title and 90 in another, then the newer gen gets 110 in the first title and 87 in the second, like huh? 😂 This would have been more impressive as a 3060 "Super", as you mentioned, rather than a new generation, but I think that involves admitting defeat on the entry-level 4000 series as you release a new 3000 series card

    • @JJFX-
      @JJFX- Год назад +1

      @@FIVESTRZ Oh no you're absolutely right, I'm simply saying this is even worse because Intel was essentially making the best of bad situation whereas Nvidia is making the worst of a good situation lol

    • @FIVESTRZ
      @FIVESTRZ Год назад

      @@JJFX- 😆 facts facts. Either way it is what it is can’t be too mad if they are phasing out 3000 series with each launch - just means nothing changed. If the 3000 series lingers this will become a head scratcher

  • @Hexenkind1
    @Hexenkind1 Год назад +56

    Igor's Lab also discovered that there might be something wrong with the telemetry of the 40 series cards. They apparently draw more energy than we thought they would. Which is quite the scandal if you ask me.

    • @defeqel6537
      @defeqel6537 Год назад +3

      Pretty sure HUB gets their power figures "from the wall" instead of software

    • @janodelic2615
      @janodelic2615 Год назад

      As far as I know this is only the case for the entry-level 4060 non-Ti. They basically saved money on the monitoring chip.

  • @mariodrv
    @mariodrv Год назад +75

    Why do people insist on calling it a 60? It's a REBRANDED 50!!! That's why it sucks at higher resolutions!

    • @Chrissy717
      @Chrissy717 Год назад +26

      The whole product stack doesn't make sense. The RTX 4080 is cut down to 60% of the power of an RTX 4090. Historically speaking, 60% of the top-tier GPU has always meant that it's a 70-class card, not an 80 one.
      And this means that basically every card aside from the 4090 this generation has the wrong price and name. The 4070 Ti and 4070 are really a 60 Ti and a 60, while the 4060 Ti and 4060 are actually 50-class GPUs.
      Very, very concerning.
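
      The ~60% figure can be sanity-checked against published CUDA core counts. A minimal sketch: the core counts below are Nvidia's public spec numbers, and raw core ratios ignore clocks, cache and memory bandwidth, so treat the percentages as rough tier positioning rather than performance.

        # Rough "how much of the flagship do you get" comparison by CUDA core count.
        cuda_cores = {
            # Ada Lovelace (RTX 40)
            "RTX 4090": 16384, "RTX 4080": 9728, "RTX 4070 Ti": 7680,
            "RTX 4070": 5888, "RTX 4060 Ti": 4352, "RTX 4060": 3072,
            # Ampere (RTX 30), flagship here = 3090 Ti
            "RTX 3090 Ti": 10752, "RTX 3080": 8704, "RTX 3070": 5888,
            "RTX 3060 Ti": 4864, "RTX 3060": 3584,
        }

        def share_of_flagship(card: str, flagship: str) -> float:
            return 100 * cuda_cores[card] / cuda_cores[flagship]

        for card in ("RTX 4080", "RTX 4070 Ti", "RTX 4060 Ti"):
            print(f"{card}: {share_of_flagship(card, 'RTX 4090'):.0f}% of the 4090's cores")
        for card in ("RTX 3080", "RTX 3060 Ti"):
            print(f"{card}: {share_of_flagship(card, 'RTX 3090 Ti'):.0f}% of the 3090 Ti's cores")
        # 4080 ~59%, 4060 Ti ~27% of the flagship; last gen the 3080 was ~81%
        # and the 3060 Ti ~45%, which is the downshift the comment is describing.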

    • @mjc0961
      @mjc0961 Год назад +7

      Because that's the name Nvidia gave it, so we have to use it so that other people will know what product is being discussed?

    • @Chrissy717
      @Chrissy717 Год назад +21

      @@wiLdchiLd2k No, names are meaningless anyway. But they carry a specific expectation with them in terms of power and performance.

    • @Magnum_Express
      @Magnum_Express Год назад +9

      @@wiLdchiLd2k The issue is... imagine this: a car company offers a 1500 series truck, meaning it has a payload of roughly 1500 lbs, 1000 in the bed, 500 in the cab. But then it suddenly reduces the payload to 1200 while charging the 1500 price, so you think the price is fair; I mean, that's what they charged for the previous version... Meanwhile they are making a truck that has an actual 1500 lb payload, but they are calling it the 2000 series truck (it's got a bigger number, it has to be better, right?)... for an extra few grand. See the problem? You're paying more and getting less while also being deceived about what the product should be able to handle.

    • @himekomurata72
      @himekomurata72 Год назад +1

      The 4090 should've been called Titan (fitting for its size too), the 4080 should've been the 4090, and so on

  • @tyre1337
    @tyre1337 Год назад +43

    mad lad benches 50 games because he was curious, love this channel

  • @davea.9927
    @davea.9927 Год назад +6

    EVGA's decision to drop Nvidia is making more and more sense now

  • @logi2447
    @logi2447 Год назад +76

    It would be nice if you could test PCIe 3.0. Many of us are on AM4 300- and 400-series mobos. I bet that 5% difference at 1440p might be gone.

    • @yensteel
      @yensteel Год назад +15

      Agreed! There are many systems that still use PCIe Gen 3. It could be a big performance penalty vs Gen 4, let alone Gen 5.

    • @fabianrosenthal4644
      @fabianrosenthal4644 Год назад +17

      I've seen two German reviewers tackle this issue: der8auer in a 4060 Ti/3060 Ti video, and PCGH in a website article on the 4060 on PCIe 3 (the diagrams should be self-explanatory even for non-German speakers).
      Results were of course mixed, but some were really interesting, because in some games you'd lose up to 10% at 1080p max details, e.g. Forza 5 and Hogwarts Legacy.

    • @RandarTheBarbarian
      @RandarTheBarbarian Год назад +7

      It's entirely possible, considering the 4060 and Ti only use 8 lanes. So if you drop it down a gen you're operating with the modern equivalent of a 4-lane PCIe bus, or, to go the other direction, the equivalent of a full slot at PCIe 2.0 speeds.
      It's like they saw the RX 6400 putting a mobile-class GPU on a desktop card, halved bandwidth and all, and said "we could do that", but instead of making it an XX40 or even an XX50 they put it in the 60 class, which feels all kinds of wrong. That wouldn't be so bad if it was a clear improvement on the 3060 Ti, but nah, it trades blows with its previous-gen counterpart.
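
      For reference, the lane math works out roughly as below. A minimal sketch: the per-lane figures are nominal PCIe throughput per direction (~0.5 GB/s for Gen 2, ~0.985 GB/s for Gen 3, ~1.969 GB/s for Gen 4), ignoring protocol overhead.

        # Approximate usable PCIe bandwidth per direction, in GB/s.
        GBPS_PER_LANE = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

        def link_bandwidth(gen: str, lanes: int) -> float:
            return GBPS_PER_LANE[gen] * lanes

        print(f"4060 Ti on PCIe 4.0 x8 : ~{link_bandwidth('PCIe 4.0', 8):.1f} GB/s")
        print(f"4060 Ti on PCIe 3.0 x8 : ~{link_bandwidth('PCIe 3.0', 8):.1f} GB/s")
        print(f"PCIe 2.0 x16 (for scale): ~{link_bandwidth('PCIe 2.0', 16):.1f} GB/s")
        # On a PCIe 3.0 board the x8 link drops to ~7.9 GB/s, which is indeed
        # in the same ballpark as an old PCIe 2.0 x16 slot, as the comment says.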

    • @termitreter6545
      @termitreter6545 Год назад +4

      @@fabianrosenthal4644 Oh wow, 10% is really bad. I know the 6600 XT had the issue, but only 2% or so average.

    • @TK-11
      @TK-11 Год назад +2

      Just look for PCIe x16 vs x8 comparisons of similar Nvidia cards like this one: ruclips.net/video/COcHHX2MdKs/видео.html In most games it doesn't really matter, with only a few FPS difference, but some will suffer more. It's the 128-bit memory bus that kills the performance of this product, not being limited to x8 PCIe.

  • @ZerodragYT
    @ZerodragYT Год назад +74

    I'm glad that I won't need to upgrade from my 3060Ti for a very long time

    • @adamek9750
      @adamek9750 Год назад +10

      You do because 8gb vram 😂
      Don’t worry I’ll have to upgrade mine myself😢

    • @UWG3
      @UWG3 Год назад +1

      @@adamek9750 Anyone only interested in 1080p at medium-high settings is still set for the next 2 years. Where the 3060 Ti starts to struggle is 1440p high. Steam hardware charts show an uptick in 1440p monitors, but 1080p still reigns supreme on the platform, so most are still safe.

    • @Fhwgads11
      @Fhwgads11 Год назад +4

      @@UWG3 Me with my 12gb 3080 and a 5k monitor:
      “I’m in danger!”

    • @RafitoOoO
      @RafitoOoO Год назад +1

      If you play in 1440p I have some bad news.

    • @danivinci
      @danivinci Год назад +1

      You need to upgrade it now if you are interested in 4K, or 1440p ultra and ray tracing. Just because the 4060 Ti sucks doesn't mean you don't have to upgrade a 3060 Ti. It just means you have to spend more money; this was Nvidia's plan all along.

  • @Shapershift
    @Shapershift Год назад +59

    I wonder what excuses the shills will use to justify this card.

    • @saynotomanifestv3101
      @saynotomanifestv3101 Год назад +61

      Muh fake frame generation

    • @XX-_-XX420
      @XX-_-XX420 Год назад +18

      Usually efficiency, which it's not even good at. A 6700 XT uses less power for better performance. And considering it's a 50-class card, the power draw is really bad, as those usually got by with just 75W.

    • @J0derVIVIVI
      @J0derVIVIVI Год назад +2

      @@saynotomanifestv3101 Indeed i know of one that is all about the "technologies" you do not get with AMD and how said tech makes the product future proof.

    • @termitreter6545
      @termitreter6545 Год назад +7

      @@XX-_-XX420 Funny thing is, you can often buy a better AMD card, downclock it, and you might even get better performance/frame than some new Nvidia GPUs.

    • @XX-_-XX420
      @XX-_-XX420 Год назад +4

      @@termitreter6545 Not sure about RTX 4000 vs RX 7000 as I haven't looked into that, but a 6700 XT is very efficient; you basically get 3070-3070 Ti performance at 130W. In some games you'll be far better off due to the 50% more VRAM you have.
      And since a 4060 Ti is worse than a 3060 Ti, which draws about 200W, it's pretty easy to look at the 130W card offering more performance and pick that for any possible reason, as it's better at literally everything.
      For RX 7000 I saw some guy getting decent gains with a 7900 XT after undervolting; he was talking about 250W vs like 400 or something like that. But with the 6700 XT I know exactly how good it is, as I had one. Even stock it only uses 186W under full load.
      So RX 6000 vs RTX 4000 is a no-brainer, as AMD is better in every single way (looking at the mid range at least).

  • @liaminwales
    @liaminwales Год назад +7

    Nice to know my 3060 TI is almost the same as the 4060 TI, only problem is I wanted an upgrade with more VRAM for video editing. Guess it's time to look at used options.
    As a bonus I can get about 5% more FPS from a small OC/UV.

  • @VincentVonDudler
    @VincentVonDudler Год назад +4

    14:10 - "The 4060Ti is a terrible product at $400."
    Great video. Sums it up nicely.
    Would be good to know at what price reviewers believe this card becomes reasonable and/or desirable.

  • @xouri8009
    @xouri8009 Год назад +203

    The people that were shouting in the wind that the 4060 SKU was in reality a 4050 should be feeling fairly vindicated about now... jesus, this is hard to believe

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 Год назад +8

      Reviewers need to be a bit cagey regarding leaks, rumors, and predictions to protect their reputations (and avoid time-wasting semantic arguments with the peanut gallery), but it's pretty easy for laypeople to look at die size, bus width, and VRAM capacity, as well as what's going on with the rest of the stack, to make informed predictions. The positioning of these products shouldn't be surprising to anyone who has been paying attention to how die and bus cuts have worked across generations. GTFO with these claims that a few dozen MB of cache is some sort of performance magic. That's meaningful for code (especially crypto mining code) but means little for general rasterization, as has been shown empirically.

    • @awebuser5914
      @awebuser5914 Год назад +4

      The problem is that there is *no* Lovelace GPU design that fits the performance uplift one would expect from a 4060/4060 Ti. It's not like they have something in their back pocket to fill the gap from the 4070 down to the 4060 (which is HUGE!). It was a conscious decision to _really_ segregate the market between midrange and upper-midrange. The 4070 does a decent job trouncing a 3080 in modern games and gets close to, or sometimes beats, a 3080 Ti, whereas the 4060/4060 Ti struggle to match same-tier prior products!

    • @chronossage
      @chronossage Год назад +12

      @@awebuser5914 Remember back in the day when the new xx60 card was matching last year's xx80 card.

    • @awebuser5914
      @awebuser5914 Год назад +4

      @@chronossage "Remember back in the day when the new xx60 card was matching last years xx80 card..." No, because that's complete bullshit and never happened. *NO* xx60 (non-Ti) card has ever matched a xx80 (non-Ti) card, EVER.

    • @ArmadilloAl
      @ArmadilloAl Год назад +4

      @@awebuser5914 If they can't build a 4060Ti, then they shouldn't sell a 4060Ti. It's not really that hard of a concept.

  • @frank.bullitt
    @frank.bullitt Год назад +267

    Awesome content from the Benchmarking God,
    Your content helped me buy the 6700xt in Jan 22,
    Keep up the great work

    • @randxalthor
      @randxalthor Год назад +7

      Got mine in Nov. Very happy with my choice. HUB was also my starting point for researching which card was the best buy.

    • @HazewinDog
      @HazewinDog Год назад +1

      a few months before the big price crash? ouch. I sold my 6700XT for the same price I bought it for in April 2022, just as prices were collapsing. A month later they had lost 40% of their value.

    • @Takisgr07
      @Takisgr07 Год назад

      Imagine being that gullible, spending money based on someone who's clearly biased and doesn't even use the newer cards to their fullest lmao.

    • @maximus3294
      @maximus3294 Год назад +4

      @@Takisgr07 Afaik, they're mostly grading the performance of the GPUs in rasterization, which is still the predominant way of rendering games. Don't blame them for catering to most gamers instead of using tech that's exclusive to this generation.

    • @frank.bullitt
      @frank.bullitt Год назад

      @@Takisgr07 RT doesn't interest me in the least. I got the card for the longevity(12gb vram) and I'm still very happy with it.

  • @marksaxon
    @marksaxon Год назад +28

    Great review!! I just bought my friend a 6750 for $350 on Newegg as I'm putting together a new gaming rig for him. I wanted to get him something from team green, but the 40 series just stinks. It was more to do with the VRAM, since the 6750 has 12GB and I think it will age better over time. He only plays at 1080p and doesn't care about RT. He's coming from a GTX 760, so the 6750 will be a monumental upgrade. Let's see if the 50 series course-corrects!

    • @allxtend4005
      @allxtend4005 Год назад +3

      The RX 6750 is way, way, WAY better than the 4060 Ti.
      You wanted to get him something from team green? Every time I build a PC for friends I get them something from AMD, because the price to performance was always better, and that's what matters most. And the downgrade over time is real on team green: a friend who had a GTX 1060 back then now gets worse performance in the same games than a friend with an RX 580 does nowadays.

    • @dmcd7333
      @dmcd7333 Год назад

      @@allxtend4005 Not an Nvidia shill, but UserBenchmark has the 4060 Ti 13% faster than the RX 6750. I came from a 1060 card, so either the RX 6750 or the 4060 Ti is a huge improvement over what I have. I have to ask myself if that 13% faster is worth it: $394 on Micro Center vs $429.00.

    • @forced4motorsports
      @forced4motorsports 10 месяцев назад

      @@Annihilator-zv6xh Except that it is. I just returned a crash-happy, jittery, driver-uninstalling POS of an RX 6750 XT - it was a bloody daily occurrence. I replaced it with a 4060 Ti and it's faster, cooler, has not crashed a single time, and is even more efficient by about 35-40%, running in the 130-watt range for most games I tested. Even Starfield, unoptimized crap that it is, runs at 60-80 FPS on average.

  • @k24skin
    @k24skin Год назад +1

    This is the same business model the automotive manufacturers are using: "If we make our affordable lower-tier products suck enough, then our customers will be forced to bump up to our higher-priced product line where our profit margins are much larger"... AKA "Feed the lower class a SH1T sandwich and they'll eventually pay for our overpriced menu items" :/

  • @Quint1975
    @Quint1975 Год назад +278

    Consumer-oriented vids and the actual facts make your vids worth all my time and more, thanks.

    • @zachcanreed8549
      @zachcanreed8549 Год назад +1

      It’s why they are the best in the business rn!

    • @PainterVierax
      @PainterVierax Год назад +5

      Yep, compared to the last two drama videos it's good to see benchmarks and real consumer oriented content.

  • @Raider_MXD
    @Raider_MXD Год назад +92

    Wow, the performance of that GPU is just sad

    • @DyceFreak
      @DyceFreak Год назад +2

      Not at all if you care about its power consumption relative to performance. It will pay for itself over time vs the 200-watt behemoths. But if you're expecting a linear performance graph between generations, in line with cancerous consumerist progress, then this isn't your girl.

    • @andrew6978
      @andrew6978 Год назад +19

      @@DyceFreak It's the price that's sad, not the performance.

    • @XX-_-XX420
      @XX-_-XX420 Год назад +19

      @@DyceFreak Yeah, if you care about efficiency it's even worse. A 6700 XT will use less power for far better performance.

    • @XX-_-XX420
      @XX-_-XX420 Год назад +2

      @@andrew6978 Yeah, this would be a decent 4050, but the power draw is definitely way too high, as 50-class cards typically got by with 75W; this thing on the other hand...

    • @GewelReal
      @GewelReal Год назад +6

      Performance is really good if you imagine it's a 4050Ti for 250$

  • @mrcnorth7149
    @mrcnorth7149 Год назад +5

    Thx for this, HUB. Your video highlights just how important it is to do research before buying any new CPUs or GPUs.

  • @Atlanticmantic
    @Atlanticmantic Год назад +2

    The 4060 Ti is a 4050 non-Ti card for $400. I cannot understand how Nvidia fans still defend Nvidia. Come on people, stop fanboying and start screaming, or next generation you're going to get RTX 5080s that should be RTX 5060 non-Ti cards for $1000! Think I'm joking? "We charge you more and you get less. Suck it." - Nvidia 2022-23. Frame Generation and DLSS are just there to make a less powerful card look more powerful. Reviewers need to remove all these "performance" tricks from testing and do PURE raster P E R I O D ! ! ! ! !

  • @jake..A
    @jake..A Год назад +60

    I get worried seeing the latest model of cards doing worse than the previous gen, thinking Nvidia might nerf the previous gen in an update to make the newer gen look better.

    • @fs5866
      @fs5866 Год назад +5

      It will happen when they get close to releasing the 5000 series; they will nerf the 3000 series so they can clear stock of the 4000 series.

    • @varmastiko2908
      @varmastiko2908 Год назад

      Yes it will happen at some point, but then just revert to older drivers.

    • @JJFX-
      @JJFX- Год назад +4

      They're too clever for that. People would immediately know their cards got 10-20% slower and why. What they would do is simply make sure new games perform better and of course additional features like DLSS and RT become the selling point.

    • @mazz85-
      @mazz85- Год назад +1

      That's why I'm still using the 3050 I got last year...

    • @Eric-ct2ri
      @Eric-ct2ri Год назад

      I'm more worried Nvidia might release decent cards and then, not too long later, gimp them without telling people, so people expect a certain performance but don't get it, because the box doesn't actually tell you the difference between the GPUs. Kinda like what they did with the 3060s, 1060s, 1030s and whatnot. Without reviewers, people would get absolutely scammed and they wouldn't know it.

  • @NaClSandwich
    @NaClSandwich Год назад +69

    Thanks for grinding out all these benchmarks; a really interesting picture of the generational performance "increase"

    • @nickyang1143
      @nickyang1143 Год назад +4

      Yeah I'd love to see a graph of dollar per frame adjusted for inflation from GTX 780 onwards

    • @Deathyman
      @Deathyman Год назад +4

      @@nickyang1143 Start at the 680; that's when Nvidia started their fuckery with using smaller dies meant for smaller cards. The 680 was meant to be a 660, but AMD screwed that generation up so badly Nvidia could get away with it.

  • @the.human.spirit
    @the.human.spirit Год назад +1

    To the guy who said the 4060 Ti is the best $400 GPU to get right now:
    If you think the 4060 Ti is the best GPU for $400, you haven't done your research. At all. There's a handful of cards that you can buy right now that literally have better FPS per dollar than this thing.
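
    Cost per frame (the inverse of FPS per dollar) is a simple metric to compute. A minimal sketch: the average-FPS values and the second card are placeholders to be filled in from a benchmark chart, and only the 4060 Ti's $399 MSRP is a real figure here.

      # Cost-per-frame comparison sketch. Replace the FPS values with real
      # benchmark averages; only the 4060 Ti's $399 MSRP is an actual figure.
      cards = {
          "RTX 4060 Ti": {"price_usd": 399, "avg_fps": 100},        # placeholder FPS
          "Hypothetical card X": {"price_usd": 330, "avg_fps": 95},  # placeholder card
      }

      for name, c in cards.items():
          cost_per_frame = c["price_usd"] / c["avg_fps"]
          print(f"{name}: ${cost_per_frame:.2f} per average frame")
      # Lower is better: a cheaper card only slightly behind in FPS can easily
      # undercut the 4060 Ti on cost per frame, which is the commenter's point.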

  • @kellowattentertainment
    @kellowattentertainment Год назад +1

    The fact that the 3060 Ti can beat the 4060 Ti says a lot about how Nvidia is/was trying to rip off consumers. This is despicable.

  • @romank1099
    @romank1099 Год назад +13

    The more you buy... the less you get -Jensen Huang

  • @thefurmidablecatlucky3
    @thefurmidablecatlucky3 Год назад +59

    The RX 6800 non-XT can be had for $450 in the US, and for not much more than a 4060 Ti in other regions, so I really don't see why anyone would buy the 4060 Ti right now.

    • @hasnihossainsami8375
      @hasnihossainsami8375 Год назад +5

      The only reason I can think of is SFF PC builders, for whom these low-power cards are an absolute godsend. Then again, they're low power precisely because they're using a tier-down silicon per price point, hence the non-existent performance uplift. SFF builders are used to overspending on their parts, so it'll be an easier pill to swallow for them I guess.

    • @hasnihossainsami8375
      @hasnihossainsami8375 Год назад +9

      @@wiLdchiLd2k That's not a selling point, it's a gimmick - a nice-to-have, and it shouldn't dictate the price whatsoever. And it should always come AFTER a raw performance uplift. Same with FSR and all other variations of DLSS.
      For GPUs, a feature becomes a selling point only when it's an industry standard and/or is supported by the vast majority of applications relevant at the time. For example, RT was entirely a gimmick back in 2018, but not today. Even if you were to argue that DLSS 3 will eventually reach where RT is today, today's hardware will fade into irrelevance by then, just like the entire RTX 2000 series did.

    • @pinakmiku4999
      @pinakmiku4999 Год назад +1

      @@hasnihossainsami8375 Totally disagree with your points. First of all, the 20 series is quite relevant today; it hasn't faded into irrelevance. Second, features are an important part of product pricing, as they improve the product experience, so a company can charge more if it is providing more features. Why are you paying for any software then!? lol. Software is basically a set of features to improve the experience. This is not to excuse Nvidia's greedy price increases. But as a general rule, features do have value if they improve the experience, as simple as that. And DLSS 3 does improve that experience in several games now, like Spider-Man and Hogwarts Legacy, and don't ignore the fact that AMD-sponsored titles don't have DLSS 3 due to AMD's scummy practices.

    • @rafaelandrade2029
      @rafaelandrade2029 Год назад

      @@pinakmiku4999 Do you really need to buy a new gpu to only get a new software improvement???? I mean if you were getting more rasterization and an improved software i could understand going for nvidia.

    • @hasnihossainsami8375
      @hasnihossainsami8375 Год назад +8

      ​@@pinakmiku4999 spoken like a true Nvidia shill.
      Everything below the 2080Ti, a $1200 card, is borderline unusable with RT on in modern games without DLSS. Fact check yourself.
      And DLSS3 is not a selling point because it fosters the idea that same hardware performance for the same price 2.5 years later with only software uplift is okay. It is not okay. We're paying for the full package, hardware included, not just the software. Imagine if the 3070 had the same performance as the 2070 but could do DLSS2 even better - same scenario.
      The 4060ti is not even a stagnation, but instead a downgrade from the 3060Ti in every physical performance metric except power consumption, and it makes up for lost ground with software. That's disgusting.

  • @forestR1
    @forestR1 Год назад +1

    Steve, you blokes are doing a great job of calling out this garbage customer abuse. I'm not going to touch them.

  • @MWcrazyhorse
    @MWcrazyhorse Год назад +1

    Realistically the comparison should be done on a pci-e 3.0 board, because that's what 90% of people thinking of upgrading in this price category will be running.

  • @mycosys
    @mycosys Год назад +119

    I'm thrilled with the 4060Ti release - means my 6700XT will be relevant for AAA gaming for quite a few years

    • @20seconds64
      @20seconds64 Год назад +2

      LOL felt the same

    • @DyceFreak
      @DyceFreak Год назад +4

      Who cares about AAA gaming, the 6000 series supports Win7 so you can make your computer into a classic gaming behemoth instead!

    • @NyangisKhan
      @NyangisKhan Год назад +3

      Just like how I didn't have any reason to upgrade my i7 2700K for 7 years. We just went through the CPU dark ages. Now we're in the GPU dark ages, where every generation is garbage.

    • @DragunnitumGaming
      @DragunnitumGaming Год назад

      my thoughts exactly! finally AMD cards will get some spotlight and marketshare

    • @XX-_-XX420
      @XX-_-XX420 Год назад

      @@DyceFreak Pretty sure after the R9 390X you had to get Win 10 or newer. I specifically didn't buy RX 5000 at the time as it required Win 10 or newer and I was using a Win 8.1/10 dual boot.

  • @Bi5muth_
    @Bi5muth_ Год назад +28

    The other problem that I see with the 4060s (Ti or not) is that they use x8 PCIe, so even in the older computers you'd probably put this card in to give them a little oomph (like a Ryzen 2000 series system), the card can't be fed as well as it would have been over a full x16 link. So basically even the market that would buy it can't use it as well as a 3060.

    • @mircomputers
      @mircomputers Год назад +2

      amd cards are just so much better on older cpus because of lower driver overhead

    • @RinoAP
      @RinoAP Год назад +4

      PCIe 3 x8 won't bottleneck those (supposedly) 50-class, 128-bit GPUs

    • @GewelReal
      @GewelReal Год назад +2

      x4 would make a difference. x8 no way

    • @danieloberhofer9035
      @danieloberhofer9035 Год назад +4

      It has already been tested that the PCIe x8 interface makes this an even worse product than it already is for upgrading older systems that use PCIe 3. The reduced width of the memory bus and the insufficient frame buffer capacity make it access main memory more frequently - and that hurts even more if the PCIe bandwidth is halved.

    • @ThunderingRoar
      @ThunderingRoar Год назад +1

      @@GewelReal well derbauer already tested it

  • @His_Noodly_Appendages
    @His_Noodly_Appendages Год назад +2

    Nvidia is trying to convince you that a turd smells good.

  • @LeadStarDude
    @LeadStarDude Год назад

    I think what most people don't realize is that the shrinking of the bus is the main reason there was no real uplift in performance this generation. For well over a decade the 50-series cards had a 128-bit bus, the 60 series 192-bit, the 70 series 256-bit, and the 80 series a 384-bit bus. With the 4000 models, everything has been stepped down a notch: now the 60 series is 128-bit, the 70 series is 192-bit, and the 80 series is 256-bit. So basically Nvidia lowered the bar. They created a bottleneck with these newer GPUs that wasn't present in their previous generations. So while the newer process is truly more powerful, they cut the bandwidth down so much that we're seeing basically the same performance as last gen, but with a decent reduction in power consumption. That being said, it is a true ripoff, because if they had kept the bus sizes the same we would have seen a 30+% performance uplift this generation. 🤦‍♂️
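
    The bandwidth gap described above is easy to put into numbers. A minimal sketch using the published specs (256-bit/14 Gbps GDDR6 for the 3060 Ti versus 128-bit/18 Gbps for the 4060 Ti); it deliberately ignores the 4060 Ti's much larger L2 cache, which claws part of this back.

      # Peak memory bandwidth = (bus width in bits / 8) * data rate in Gbps -> GB/s.
      def memory_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
          return bus_width_bits / 8 * data_rate_gbps

      specs = {
          "RTX 3060 Ti": (256, 14.0),  # 256-bit bus, 14 Gbps GDDR6
          "RTX 4060 Ti": (128, 18.0),  # 128-bit bus, 18 Gbps GDDR6
      }

      for card, (bus, rate) in specs.items():
          print(f"{card}: {memory_bandwidth_gbs(bus, rate):.0f} GB/s")
      # 3060 Ti: 448 GB/s vs 4060 Ti: 288 GB/s -- a ~36% cut in raw bandwidth,
      # which the larger L2 cache only partially hides, especially at 1440p and up.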

  • @gargantuan2810
    @gargantuan2810 Год назад +9

    Nvidia is using DLSS as an excuse to release new products with hardly any generational improvements. They claim that DLSS 3 only works on the newest GPUs, but I'm sure they just designed it that way. In the future we will probably have newer and better upscalers that only work on the newest products to encourage upgrading.

  • @Mark-yz7ci
    @Mark-yz7ci Год назад +15

    watching this with my 6800xt 16 GB and a bag of popcorn

  • @kutark
    @kutark Год назад +1

    What's so stupid about all this is that if they had just done a 160-bit bus with 10GB and maybe 10% more CUDA cores than it has, we pretty much wouldn't be having this conversation, and Nvidia would still have fantastic margins.

  • @Pedone_Rosso
    @Pedone_Rosso Год назад +1

    You guys are always so negative!
    For example, for this gen mainstream graphic card, Nvidia's 4060,
    the desktop and the laptop versions are finally delivering nearly the same level of performance.
    They listened, and they stopped BS-ing us with same name products
    that are built and behave in completely different ways, on different platforms:
    how can you fail to see the benefit for us?
    That's pure integrity if you ask me!
    But here at Hardware Unboxed,
    and also on other so called "pro consumer" focused media,
    you never look at the bright side.
    You always have to dump your negativity,
    and your disgustingly materialistic love of "cost per frame" metrics,
    on our beloved fantastillion dollar companies.
    Shame on you!
    (LOL! ... Let me specify, as we're on the internet, that I'm trying to be sarcastic in this comment)
    Seriously:
    thanks for your work, guys.

  • @WybremGaming
    @WybremGaming Год назад +11

    4060ti = a 4050 in disguise. 150usd max.

  • @yissnakklives8866
    @yissnakklives8866 Год назад +38

    I've seen the 6800 non-xt for just over $300 recently, and used 6800xt's are selling for very close to $400

    • @jurgengalke8127
      @jurgengalke8127 Год назад +4

      Yep I paid $330 inc tax and shipping for my new 6800, would have got the 6800xt for $400 but it sold out as I carted

    • @sparkythewildcat97
      @sparkythewildcat97 Год назад

      @@jurgengalke8127 I think you probably got the better deal. And unless you're playing at 4K or are unhappy with 120-144 fps, the 6800 is probably plenty.

    • @hfa3309
      @hfa3309 Год назад +5

      An undervolted RX 6800 can eat like 120-150W while being faster than that 4060 Ti crap :D

    • @allxtend4005
      @allxtend4005 Год назад +2

      @@sparkythewildcat97 I mean, if you want to play in 4K then the RTX 4060 Ti will suck even more; the card can't even put out 60 fps in 1440p games, so how do you expect to play in 4K then?
      By lowering graphics to low settings just to play in 4K? Kinda stupid.

    • @wallacesousuke1433
      @wallacesousuke1433 Год назад

      But it's Radeon 🤮

  • @Bobis32
    @Bobis32 Год назад +2

    If you're considering a 4060 Ti, buy a 6700 XT before they go out of stock: $100 less, within 5% of the performance, and 4GB more VRAM for better longevity.

  • @youreright7534
    @youreright7534 Год назад +2

    The 4070 (which isn't a great buy) matches the 3080, which is 2 tiers above it, while having 50% more VRAM than its last-gen tier, yet the 4060 Ti can barely match its own tier, with the same amount of RAM AND less bandwidth 😂

  • @antraxbeta23
    @antraxbeta23 Год назад +16

    It's simple: Nvidia wants to sell you frame gen and AV1 stuff, not extra performance. They really think people will buy into that and rush out to buy 40XX cards. I guess that has its good sides too: more used 30XX cards on the market = less $$ for used parts.

  • @yogibbear
    @yogibbear Год назад +40

    My popcorn is ready for the 4050 benchmarks! Let's go! 🥳

    • @Zuggeerr
      @Zuggeerr Год назад +13

      Damn, if the 4060 is really a 4050, then what exactly is the 4050? A GT 4030?

    • @kotztotz3530
      @kotztotz3530 Год назад +6

      Yeah...I agree. They can't release lower class cards when the 4060 is AD107 which is the smallest die lol

    • @bb5307
      @bb5307 Год назад +4

      @@Zuggeerr Yeah there's a reason they don't/can't make x030 chips anymore. The tiniest and cheapest chips all moved up a couple of tiers (in price, not performance)

    • @akiesa559
      @akiesa559 Год назад

      If they keep the same silicon as the laptop version, it should be around 20% slower than the 4060.

    • @Dhrazor
      @Dhrazor Год назад +3

      there is no and will be no 4050, they don't have a smaller die than this...

  • @Knorrkator
    @Knorrkator Год назад +1

    Please don't conflate memory bandwidth and memory bus. The problem is low memory bandwidth; bus width alone doesn't tell you anything.

  • @tourmaline07
    @tourmaline07 Год назад +1

    Even more pleased as a 2080Ti owner, after five years it still smokes this garbage and probably is within 25% of a 4070 in many games ;)

  • @eddsson
    @eddsson Год назад +101

    It'd be a superb 4050ti if priced as such.

    • @corvoattano9303
      @corvoattano9303 Год назад +7

      250 Max

    • @InternetListener
      @InternetListener Год назад

      I'd buy one for sure.

    • @andersjjensen
      @andersjjensen Год назад +8

      It "is" a 4050ti. Nvidia bumped the entire stack down because not enough people bought the 3090 and Nvidia thinks everyone is absolutely loaded. The truth is that the 4080 should have been a cut down AD102 with 20GB VRAM, the 4070ti should have been the fully enabled AD103 die, the 4070 a slightly cut down AD103, both with 16GB. The 4060ti should have been a fully enabled AD104 with 12GB, etc. The fully enabled AD106 die should have been the 4050ti (where 8GB makes any sense at all!).

    • @joshbittner
      @joshbittner Год назад +2

      This should be $200, if made at all

    • @MrGiHunt
      @MrGiHunt Год назад +4

      Exactly, as a 4050ti for like 200 bucks, it would be a great card..

  • @36cores
    @36cores Год назад +10

    Forget the name on the box, it's a $400 50 class GPU.

  • @dsirius1500
    @dsirius1500 Год назад +8

    What a great, precise and surgical review, thank you for it, Steve. Congratulations, and keep posting.

  • @coreystill7197
    @coreystill7197 Год назад +1

    I really don't understand why people keep saying the 3070 graphics card is not great! I own more than like 600 games on steam and every last one of them plays amazingly. I'm not talking indie games either! I mean all the heavy-hitting games!

    • @riven4121
      @riven4121 Год назад

      Because Nvidia crippled it by not giving it enough VRAM. 8 gigs is not enough for that class of card.
      The 2080 Ti will outperform it in Ray tracing because it has more VRAM than it does

  • @supabass4003
    @supabass4003 Год назад +41

    Day 1 3060 Ti buyer, $800 AUD (+shipping), and worth every cent. I have not bought a GPU with less than a 256-bit bus since 2003. I also replaced my 2060 Super with the 3060 Ti, and was expecting to get the 4060 Ti, as speculation suggested it should have been 3080-level in performance, but this is what we got instead. FX5200 vibes.

    • @f15active
      @f15active Год назад

      Me too. I had some trepidation at the time but ultimately very happy I got it.

    • @DyceFreak
      @DyceFreak Год назад

      The FX5200 is actually currently a go-to card for Win98, so it's held up quite well despite the FX line's bellyflop of an existence. I just went from a Vega 64 to an RX 7600; I waited 4 years and through the entire price-gouging fiasco before buying again. I pity your rampant consumerism.

    • @temperedglass1130
      @temperedglass1130 Год назад +1

      Simp

    • @Ahmad_8412
      @Ahmad_8412 Год назад +1

      Yeah dude, and we thought the 20 series was the worst gen ever given price to performance... The 2060 Super managed to match the legendary GTX 1080 for $100 less MSRP with new features. This generation, you would at least hope that the 60 Ti card would get close to the 3080... oh well, it can't even beat the 3060 Ti properly lmao

    • @xFrancisMichaelx
      @xFrancisMichaelx Год назад

      3060ti also, and running with a 5800x3D, 32gb ram.. get 140-150fps 1440p on tarkov and WZ2 👌

  • @mercurio822
    @mercurio822 Год назад +13

    The radeon 6700xt is the goat of this generation of cards

    • @mjkittredge
      @mjkittredge Год назад

      I'd say it was, until the 6800/xt came down under 500 dollars. But still happy with my choice, 6700xt is great midrange

  • @ItsAkile
    @ItsAkile Год назад +2

    No one's saying it's not a solid buy by itself, in a vacuum, but it's stagnant, even downgraded, and relies only on power consumption/frame gen to look decent. As a new buy from a weak lineup of cards it's a pretty solid $400 card, but it really isn't doing anything for mid-range 3060 Ti/RX 6700 card owners like me, so I retract my compliments.

  • @jerrywatson1958
    @jerrywatson1958 Год назад +5

    This Thumbnail got me! Great work Steve. Click attracting w/o being clickbait. Thanks Steve.

  • @NCCountryLiving
    @NCCountryLiving Год назад +3

    It makes total sense now that EVGA said fk it!

  • @awebuser5914
    @awebuser5914 1 year ago

    The "problem" with the 4060 Ti and 4060 is that Nvidia decided to create a huge gulf between the midrange and upper midrange. The 4070 Ti was exceptional, giving near-3090 Ti performance, and the same goes for the 4070 to a slightly lesser degree, getting close to a 3080 Ti in some games.
    TBH, the root of the issue is that the 4090, and the 4080 to a degree, set entirely new levels of performance, far beyond any previous generational uplift, and that set expectations far too high. The 4070 Ti was positioned "correctly" in naming and delivered expected results, and the 4070 was "fine" and at least leapfrogged the 3080, but both of them are priced as premium products, far above the previous-gen cards.
    Architecturally, there is _no_ Lovelace 4060-level GPU. The gap between the 4060 and 4070 is colossal, and this was obviously done for margins. Nvidia decided to keep prices the "same" on the low midrange and cut down the 4060 GPU design drastically to pad margins as much as possible.
    Like he said in the video, in the early days of Lovelace it was perfectly reasonable to extrapolate from the high(er)-end Lovelace lineup and assume that if a 4070 Ti can jump almost two tiers, why wouldn't the midrange? Unfortunately, it didn't turn out that way...

  • @PoRRasturvaT
    @PoRRasturvaT 1 year ago +3

    The issue is we can't opt out of paying for the optical flow and tensor hardware, which I think is what's inflating the price. Nvidia is pushing for DLSS 3 to be adopted everywhere.
    The cards do have insane energy efficiency, though. The 4070 has 3080 performance at 200W, which is a big leap. If it were priced correctly, it would have been the real 4060 Ti at $400, and this 4060 Ti should've been a 4060 at $300.

    • @renatoramos8834
      @renatoramos8834 7 months ago

      This isn't even a 4060: only a 128-bit bus and 8GB of VRAM. It needed 192-bit and 12GB, respectively.

  • @tryhard28
    @tryhard28 1 year ago +4

    Nvidia trying to bring the RX 6700 XT up on the Steam hardware survey.

  • @joelconolly5574
    @joelconolly5574 1 year ago +3

    Even with 8GB of VRAM, it would make sense if it were equal to the 3070 but used 60 fewer watts. The fact that it's barely faster than the 3060 Ti, or worse, slower than it in places, just goes to show how crippled the midrange 40 series is. Worst card in existence.

  • @timometsanoja9666
    @timometsanoja9666 1 year ago +1

    The artificial gimping is what kills this... they are making the hardware slower on purpose, which is very counterproductive...

  • @johnboylan3832
    @johnboylan3832 1 year ago +2

    'Oh, buy AMD because they give you 400GB of VRAM in cards that get you 12fps when you actually need it!!!'
    There. You don't have to watch the video now

    • @rob4222
      @rob4222 1 year ago +1

      8GB isn't enough sometimes, even today.
      Turn textures down and RT off if it makes you happy.

  • @projectc1rca048
    @projectc1rca048 1 year ago +5

    Yup, it doesn't matter how much VRAM they put on this card; if the memory bus is too narrow, it's pointless. Memory bus width is more important than VRAM capacity imo (rough numbers below).
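
    To put rough numbers on the bus-width point: peak memory bandwidth is just bus width times effective data rate. A minimal back-of-the-envelope sketch in Python, using approximate public spec figures for these cards (the figures are assumptions from published specs, not taken from the video):

        # Rough peak memory bandwidth: (bus width in bits / 8) * effective data rate in GT/s
        cards = {
            "RTX 3060 Ti (256-bit, 14 GT/s GDDR6)": (256, 14),
            "RTX 3060    (192-bit, 15 GT/s GDDR6)": (192, 15),
            "RTX 4060 Ti (128-bit, 18 GT/s GDDR6)": (128, 18),
        }
        for name, (bus_bits, data_rate) in cards.items():
            bandwidth_gb_s = bus_bits / 8 * data_rate  # GB/s
            print(f"{name}: ~{bandwidth_gb_s:.0f} GB/s")

    That works out to roughly 448 GB/s for the 3060 Ti versus roughly 288 GB/s for the 4060 Ti; faster GDDR6 and a larger L2 cache only partly offset the much narrower bus.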

  • @Thakkii
    @Thakkii 1 year ago +14

    I think what upsets me the most is that I was legitimately waiting for this gen of cards and ended up getting a 6700 XT anyway; yes, I could have bought that card two years ago, oh well.
    I am happy with the card coming from an RX 570, the uplift was substantial, but it just shows that a new-gen card isn't always worth waiting for (talking about entry/mid-level cards). Now I'm deciding whether to go AM5 or just get a 5600/5800X3D; I'm currently using an R5 1600AF.

    • @anjaanmusafir8556
      @anjaanmusafir8556 1 year ago +12

      5800X3D

    • @ThePurplePassage
      @ThePurplePassage 1 year ago +6

      Yeah, the 5800X3D would be attractive here imo; keep your existing DDR4 RAM and mobo (assuming it's compatible).

    • @sweetsurrender815
      @sweetsurrender815 1 year ago +2

      Get the 5800X3D. You don't need to go with AM5 just yet.

    • @Thakkii
      @Thakkii 1 year ago

      @@ThePurplePassage Yeah, but in order to get the 5800X3D I'd need to swap the mobo and RAM, as my ancient Gigabyte GA-B350M Gaming 3 isn't great for that CPU. Yes, it's supported, but dealing with BIOS updates from Gigabyte is just a pain; it's not as simple as downloading the BIOS and updating, you actually need to go through a lot of steps and update other things too.
      This is why I was considering AM5 or Intel (I've never used an Intel setup tbh, but I have no preference), although I've heard AM5 gives some headaches since it's not mature yet; not sure how it is now.
      My budget right now is 400€: the 5800X3D is 300€, a new mobo around 100€, and 32GB of 3600MHz CL18 RAM is 70€ (CL16 is 100€). That exceeds my budget at the moment, but I don't mind waiting a little longer and keeping the R5 1600AF, as it's alright in most scenarios (currently playing BDO and it works).
      I have two options here:
      - Get an R5 5600/B550/32GB of RAM tomorrow (411€).
      - Wait a month and get the 5800X3D instead (or an AM5 setup with a 7700 or 7600). I'm having some FOMO here; will the 5800X3D sell out? I've never had FOMO buying a component lol, but the 5800X3D has been out for a year or so and AM4 is at the end of the line. I could grab the CPU now and get the rest of the parts next month (unless the CPU drops in price, but it's hard to know these things).

    • @Masterfeedbackprod
      @Masterfeedbackprod 1 year ago

      If a BIOS update, i.e. downloading and installing @BIOS and the latest BIOS (F51g), is a lot of work at about three clicks, I'm not sure you'd enjoy the half a day of work a new PC build takes. Also, Gigabyte has that security issue, so everyone on a Gigabyte motherboard should update their BIOS now! That includes if you buy a new motherboard; the first job is always to update the BIOS. That said, there's no point building a new AM4 system; just go AM5 or Intel if you're building fresh.

  • @alexbaldes
    @alexbaldes 2 months ago +1

    It's a good thing I bought a PC with the 16GB version of the 4060 Ti instead of the 8GB one.

  • @cracklingice
    @cracklingice 1 year ago +1

    The issue is Nvidia making a 4050 Ti and then calling it / pricing it as a 4060 Ti.

  • @not_so_native_native
    @not_so_native_native 1 year ago +12

    Steve, is there a way to simulate a 4060 Ti with a 4080, like you did with the 3D V-Cache 12/16-core chips where you disabled cores etc.? Just to show what numbers we could get with a bigger bus and more VRAM (keeping the core count the same), or is that sort of simulation not possible?
    If it is possible, it would be great to see how this card should've performed if Nvidia hadn't cut it back so much.

    • @RinoAP
      @RinoAP 1 year ago +3

      It's impossible short of heavily editing the vBIOS, which even enthusiasts can't easily do.

    • @termitreter6545
      @termitreter6545 1 year ago

      No, GPUs are too complicated for that. I think there's going to be a 4060 Ti with 16GB of memory? Not sure whether that card will differ in other ways, though.

    • @ThunderingRoar
      @ThunderingRoar 1 year ago

      Not really; you can't disable SMs on GPUs like you can disable CPU cores in the BIOS.

  • @malccy72
    @malccy72 1 year ago +38

    Great video, and so glad you didn't go down the Nvidia DLSS false-frame road. When you buy a GPU you want it to be significantly better than the previous series, not to have to depend on high-latency, ugly, fake DLSS frames to make it appear better - thank you very much.

    • @FLCLimaxxx
      @FLCLimaxxx 1 year ago +6

      The frames are real in my heart.

    • @flintfrommother3gaming
      @flintfrommother3gaming 1 year ago +5

      Maybe the real frames were the friends we made along the way.

    • @harleydavidson2218
      @harleydavidson2218 1 year ago +2

      @@wiLdchiLd2k asking for a friend are you a left leaning democrat ?

    • @Takisgr07
      @Takisgr07 1 year ago

      Cope.

    • @harleydavidson2218
      @harleydavidson2218 1 year ago

      @@Takisgr07 I mean, my friend just wanted to know, because usually the corporate shills are moronic left-leaning Democrats. He just wanted to know.

  • @shadowfire246
    @shadowfire246 1 year ago +3

    I upgraded from a 3060 Ti to a 4070. The extra VRAM is very nice, and overall the card is faster. But I wouldn't have gone with the 4060 Ti.

  • @nathanixslade
    @nathanixslade 10 months ago +1

    Do you think it could be a driver problem, with drivers not yet optimized for the new Lovelace architecture? The 3000 series uses Ampere, an older but very well-established architecture.

  • @colinhughessimracing2757
    @colinhughessimracing2757 1 year ago +23

    Love these comparisons, keep up the great work!
    So glad I got a 3070 for only £250 about six months ago after an age of waiting. I had a feeling the 40 series was going to be out of my budget, and the 4060 Ti just confirms I made the right choice. A little worried about VRAM moving forward, but so far it's been great.

    • @KillShot227
      @KillShot227 1 year ago +8

      Unless you're worried about maxing out settings, you're good. I have a 3070 as well, with 80-something games in my Steam library. No problem getting high frame rates for my 170Hz monitor at 1440p.

    • @andrew6978
      @andrew6978 1 year ago +5

      I think the VRAM thing is overblown.

    • @_barncat
      @_barncat 1 year ago +1

      I got scalped for a 3070 back in 2021, but fk it anyway, it does well for me at 1440p.

    • @guitarsolos89
      @guitarsolos89 1 year ago

      Still rocking a 3070 with little to no issues. Overpaid like crazy here in Sweden when it released; it was almost 800 euros, and the card took me months of searching to get. GPU prices are still crazy high in my country; they're still at 680-700 euros retail.

    • @jasonking1284
      @jasonking1284 1 year ago +3

      @@andrew6978 It IS overblown, but still definitely a thing.... especially if you want to play maximum everything...

  • @Patrick-S
    @Patrick-S 1 year ago +17

    Very informative video as always. Thank you for the hard work!

  • @simonedagostino9358
    @simonedagostino9358 1 year ago +2

    At this point I'm not even disillusioned, I'm just amazed

    • @memitim171
      @memitim171 1 year ago +1

      I'm fairly certain this is the first time since 3D acceleration began that we've seen actual regression that wasn't limited to one or two titles and caused by driver issues. Truly a landmark card...

  • @VicharB
    @VicharB 1 year ago +1

    For under $500 you can now find a used but really good 3080 Ti 12GB, period. If your budget is $350-$400, get a used 3070/Ti.

  • @anonyshinki
    @anonyshinki 1 year ago +8

    The real mystery is why Nvidia is releasing gaming GPUs at all right now. They'd probably save money with a tax break by cancelling them, and then reusing the chips for AI and other data center usage.

    • @dycedargselderbrother5353
      @dycedargselderbrother5353 1 year ago +1

      Best guess is they feel the need to release regularly to maintain mindshare and AIB relationships. If GPU investment for AI goes tits up and Nvidia needs to drop back to gaming, it will be a bit of a problem if they've been out of the market for several years.

    • @fVNzO
      @fVNzO 1 year ago +1

      I think this is simply to hold onto the existing workforce. They have a lot of people working specifically on GeForce stuff, and they feel some kind of obligation to keep drivers up to date, etc. So the easiest thing for them is to keep the whole operation at least afloat, to avoid any risks associated with leaving the market altogether.
      It's for convenience's sake, nothing more. They have no need or desire to produce compelling products for normal people anymore. That is clear.

  • @johndoe-gd9uw
    @johndoe-gd9uw 1 year ago +7

    Why is PC gaming hardware moving backwards?

    • @ia3630
      @ia3630 1 year ago +2

      Jensen, Lisa and their investors need new leather jackets, oc

    • @e_Moses
      @e_Moses 1 year ago

      Because people are morons and buy whatever shit these companies put out.

    • @HWMonster
      @HWMonster 1 year ago +1

      Greed.

    • @ofon2000
      @ofon2000 5 months ago

      They overproduced RTX 3000 and Radeon 6000 GPUs during the Ethereum mining craze... once it ended, they had a bunch of stock that hadn't sold yet. Now they've gotta offer sub-par value on the new generation so the old stuff will still sell at good prices. Next gen (a year or so from now) should be considerably better. If not... then I don't know what to say.

  • @linpawz3029
    @linpawz3029 1 year ago +1

    Hate how Nvidia got one step ahead of the YouTube reviewers who roasted the 90-class GPUs for poor value.
    Unfortunately, now the 90 class is the only one worth buying. 😢

  • @descai10
    @descai10 1 year ago +6

    Glad I got my 3060 Ti for $399 right at release.

    • @Radek494
      @Radek494 1 year ago

      Lucky

    • @WeeManXL
      @WeeManXL 1 year ago

      Bought a used 3070 for under $300 one year ago. Happy for now on a 1440p monitor, but I don't play the latest games.

    • @saliva1305
      @saliva1305 1 year ago

      @@WeeManXL Where did you buy it from? I have a barely used 3070 I've been wanting to sell.

  • @jabbathehut6850
    @jabbathehut6850 1 year ago +3

    The same GPU will soon be sold for $500 with extra VRAM... next-level bullshitting.

  • @flyjum
    @flyjum 1 year ago +1

    Wait... they're selling a 16GB version of the 4060 Ti for $499? What the frick. It's very clear Nvidia tried to shift the entire 40-series lineup up a tier, which is why they "cancelled" the 4080 12GB and just re-released it as the 4070 Ti. It should have been the 4070. Remember, the 3060 had a 192-bit bus, the same width as the current 4070 Ti, while the 3050 has a 128-bit bus like this 4060 Ti.

  • @Pruflas-Watts
    @Pruflas-Watts 1 year ago +1

    You can drop all the facts in the world, but consumers need to wake up and stop paying for trash. Nvidia has gotten away with this because they can, and AMD used that opportunity to do the same; both are purposely cutting back their midrange and lower-range variants while selling them at sky-high prices.
    128-bit memory buses, x8 PCIe lanes and 8GB of VRAM at the $200-and-above price point are absolutely criminal in 2023.

  • @jyllianrainbow7371
    @jyllianrainbow7371 1 year ago +9

    I look forward to many more years with my RX 6600 at this rate.

    • @Aegie
      @Aegie 1 year ago +2

      UE5 will kill most entry-level cards.

  • @andrew6978
    @andrew6978 1 year ago +4

    AMD and Intel have an opportunity to blow Nvidia away in the low-end segment; will they take it?

    • @TheCountess666
      @TheCountess666 1 year ago

      People will continue to buy Nvidia, and Nvidia will justify the poor performance with DLSS.
      The GPU market's F-ed.

    • @chanzeyu1
      @chanzeyu1 1 year ago

      If people are willing, the RX 7600 is just that.
      Outside of the US, it's probably the first card post-COVID with a reasonable price tag: close to 3060 Ti performance while being cheaper than the 4060 Ti and 4060.

  • @mint3
    @mint3 1 year ago +1

    Got my girlfriend a 12GB 3060 for $280 brand new. For 1080p that's going to last her for years to come. The 128-bit bus and 8GB of VRAM on the 4060 Ti are going to age SO poorly.

  • @tecnoPTY
    @tecnoPTY 1 year ago +1

    This is what I think it should be (actual = what it should have been):
    - RTX 4060 8GB = RTX 4050 Ti
    - RTX 4060 Ti 8GB = RTX 4060
    - RTX 4060 Ti 16GB = RTX 4060 Ti
    And all of them $50-$75 cheaper.

    • @valentin3186
      @valentin3186 1 year ago

      An extra 8GB of VRAM won't change the fact that it's not an xx60 Ti die.

  • @MrNside
    @MrNside 1 year ago +3

    Nvidia Engineer: "Hey boss, good news! We're getting close to finalizing the 4050 series cards, and we're getting better than expected performance."
    Nvidia Exec: "That is good news. I wonder if we can call it the 4060 series and charge another $100 for it? Hahaha, who am I kidding? Of course we can!"

  • @CM0G
    @CM0G 1 year ago +5

    Do you know if they're going to sell the DLSS3+AV1 DLC separately?

    • @HazewinDog
      @HazewinDog 1 year ago

      Not if you buy now; mine came with both enabled :P

    • @memitim171
      @memitim171 1 year ago

      🤣

  • @zystico
    @zystico 3 months ago

    As a 3060 Ti owner, I would like to thank Nvidia for keeping my GPU 'up to date' in terms of performance by releasing the 4060 Ti.

  • @spymasterigi
    @spymasterigi 1 year ago +1

    Don't forget to mention that the PCIe interface on this card is capped at x8 lanes. That means that for everyone on a PCIe Gen 3 system, this card isn't even an option when upgrading from an older series of GPUs. And there are a lot of 3.0 systems out there. (Rough lane-bandwidth numbers after this thread.)

    • @Hardwareunboxed
      @Hardwareunboxed  1 year ago +1

      Doesn't really matter unless you run out of VRAM, at which point you're stuffed anyway.

    • @spymasterigi
      @spymasterigi 1 year ago

      @@Hardwareunboxed Yes, you're right. That was really my point, since you also mentioned the 8GB of VRAM.

    • @Hardwareunboxed
      @Hardwareunboxed  1 year ago

      As I said, it makes no real difference.
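
    On the x8-lane point above: the narrower link mainly matters once data spills out of the 8GB VRAM buffer and has to cross PCIe. A minimal sketch of the host link bandwidth in Python, using the standard per-lane transfer rates and 128b/130b line encoding (approximate figures, ignoring other protocol overhead):

        # Approximate usable bandwidth per PCIe lane in GB/s: GT/s * (128/130 encoding) / 8 bits per byte
        per_lane_gb_s = {"PCIe 3.0": 8 * 128 / 130 / 8, "PCIe 4.0": 16 * 128 / 130 / 8}
        for gen, per_lane in per_lane_gb_s.items():
            print(f"{gen}: x8 ~ {per_lane * 8:.1f} GB/s, x16 ~ {per_lane * 16:.1f} GB/s")

    An x8 card in a PCIe 3.0 slot gets roughly 7.9 GB/s to the host, half of what the same card sees on PCIe 4.0, which lines up with the reply above that it only really hurts once VRAM overflows.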

  • @Bruhhhoff
    @Bruhhhoff 1 year ago +20

    As a 3060 Ti owner I see this as an absolute W 🗿

    • @SB-pf5rc
      @SB-pf5rc 1 year ago

      I don't. I love seeing what game devs can do when they can let their graphics stretch their legs. The mainstream experience will always be dictated by the consoles, and this represents another generation where custom PCs can't offer a console experience for console money. That sucks.
      Your 3060 Ti isn't going to deliver fewer frames just because the newest hardware can do more, though.

    • @Bruhhhoff
      @Bruhhhoff 1 year ago +1

      @@SB-pf5rc I agree.

    • @SB-pf5rc
      @SB-pf5rc 1 year ago

      @@Bruhhhoff Agreeing in YouTube comments?! FORBIDDEN
      ...but yeah, as a launch-day 6900 XT owner I'm very happy to sit this generation out. Probably gonna sit the next one out too. Saving money is OK with me!