Nvidia GeForce RTX 4060 vs. GeForce RTX 4060 Ti, 40 Game Benchmark: 1080p & 1440p
- Published: 15 Jul 2024
- Hetzner: www.hetzner.com/hub/sa
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 3050 - geni.us/fF9YeC
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3080 - geni.us/7xgj
GeForce RTX 3090 - geni.us/R8gg
Radeon RX 6500 XT - geni.us/dym2r
Radeon RX 6600 - geni.us/cCrY
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6900 XT - geni.us/5baeGU
Video Index
00:00 - Welcome back to Hardware Unboxed
00:25 - Ad Spot
00:58 - Introduction
02:18 - Cost Per Frame Initial Thoughts
02:54 - Test System Specs
03:15 - Cyberpunk 2077
03:36 - Cyberpunk 2077 [RT]
04:04 - Call of Duty Modern Warfare II
04:23 - A Plague Tale: Requiem
04:38 - The Callisto Protocol
04:51 - The Callisto Protocol [RT]
05:46 - Fortnite DX11
06:02 - Fortnite DX12 [RT]
06:20 - Resident Evil 4
06:44 - Resident Evil 4 [RT]
06:54 - Marvel’s Spider-Man Remastered
07:07 - Marvel’s Spider-Man Remastered [RT]
07:19 - Doom Eternal
07:49 - Hogwarts Legacy
08:05 - Hogwarts Legacy [RT]
08:30 - Star Wars Jedi: Survivor
09:02 - 1080p Average
09:47 - 1440p Average
10:05 - Final Thoughts
Read this review on TechSpot: www.techspot.com/review/2709-...
Nvidia GeForce RTX 4060 vs. GeForce RTX 4060 Ti
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo
Man how can this generation of GPUs be so bad? Maybe they are doing it to make the next gen GPUs look like a massive upgrade?
Ahaha you naive little one. Let me introduce you to the 5090 for just $2999.
Probably me in 2025.
Yes they should be. Given that 2000 series is a disappointment while 3000 was massive uplift. Sadly it was timed with the mining boom.
One word: Greed
They were hoping tech-illiterate people would tunnel-vision on DLSS 3 frame generation, since the hardware is gimped in areas like memory bus width, etc.
Good software but introduces a lot of ghosting on moving objects
The 4090 is the only decent card in the stack. If I had known the rest of them were going to be this bad I'd have bought it from the start and been happy. Now I'll just wait a generation.
The 4060 vs 4060Ti is basically the Giant Douche vs Turd Sandwich competition from South Park.
One of the best analogies ever presented in a youtube comments section.
Ty for reminding me why I don’t watch South Park 😂
Or the old Simpson episode with the two aliens running for president.
"I'll vote third party!"
"Hahaha... and throw your vote away?!"
At least here the 3060 Ti and 4060 are the same price at the moment, so I'd get the 3060 Ti if I needed a GPU. At least while the sale lasts.
This.
Both are $100 over what they should be, and the Ti is lacking an additional 4GB of VRAM.
get 2nd hand cards, 3060 and 3060ti are $150+ cheaper than the new 40 series counterparts
Can only agree. At this point you could get the feeling Nvidia wants to release seasonal GPUs, since they won't have the power to survive another year.
And how should they do that extra 4GB, hmmm? Should they gimp the bandwidth more to 3 lanes of 4GB, or should they complicate routing because half of the lanes are twice the memory capacity of the other half, or should they throw those chips out entirely to the skip and make a new mask for a new chip? Come on, you have SO MANY IDEAS about how it HAS TO HAVE an extra 4GB, you MUST have thought HOW, right? Right????
Gave up on new gpus and got a 3070 for $300 instead
@@markhackett2302 they should have used the 104 die on 4060ti like they did on 3060ti thats how
Nice comparison of the 4050 and 4050 ti.
True
AKA the battle of the ewaste.
I really enjoyed this matchup between the 3061 and 3061 ti
@@Collin_J You mean 3060.5 ?
You mean 4050 and 4060* otherwise is exaggerated
Nvidia: The Way Gamers Are Meant To Be Played™
The more you buy the more we cash!
Problem is, alternatives are not out there and they don't seem to be coming any time soon... Is this a monopoly? Should we buy the Arc A750, although YouTubers keep not recommending it? Is AMD not much better, just less criticized, and why?
@ALaPhresca you're miles off base with that one.
@@InternetListenerAMD less criticized? Always got the impression of people holding them to a higher standard, because they are already used to bs from Nvidia and cling to the hope of AMD reintroducing competition.
@@InternetListener LOL, your info is outdated, the A750 is $180-200 right now, which makes it unbeatable value. YouTubers are recommending it. Much better than the trash 3050, which costs more and performs much worse. Performance is on par with or even better than the 12GB 3060 in most cases (including RT) while costing $100 less.
I like seeing you get out from behind the desk and changing things up a bit mate. It's an awesome setup, good to show it a bit.
I agree!
He needs the exercise...
Thank you for comparing the 4050 and 4050ti. These are both great options at $150 and $200, but the comparison helps people choose either of the two anyways.
Hahaha exactly! If they dropped both cards $100 we would be in a different situation
@@Tpecep it's not the performance that's the issue, it's the price of the cards. A whole new generation two years later should not have VRAM and bus width cut. And the performance improvement over the previous gen is minuscule.
@@Tpecep They aren't powerful. The specs are in line with other 50 class cards of the past. The price is high because they are selling it based on the software it has, which most games still don't use. Not to mention the person using a 1080p focused card likely wouldn't be using DLSS3 upscaling with input lag in the first place, so it's a wasted purchase.
Those GPUs are bad value; I got a 2nd-hand RX 5700 XT for $124.
@@Dregomz02 good choice! The used gpu market is awesome right now
Nvidia casually made their midrange cards into lowend this gen.
stay on ur high end rx 550
A midrange card that plays Warzone at 144+ fps at 1440p? That's a proper midrange card for sure.
@@damomuller6796 Ah, yes, that game released 2012 is playable on a 2023 card. How on earth could they manage such magic?!?!?!!
@@markhackett2302 warzone came out in 2012? lmao
@@markhackett2302 try 2022 lol 2012 ahaha put the pipe down bud
I went from a 1050 to a 4060 and was very happy with my upgrade. I can’t afford 4070 so the 4060 was a bit overpriced for what it was, but it was my best option. Also DLSS is very nice in some games. I’m quite pleased given I upgraded from an old 1050, but I can see big disappointments if you’re upgrading from 2000 or 3000 series
i really need your help here
So i am thinking about upgrading my graphics card
The system i have right now is :
Aorus b450 elite v2
Ryzen 5 3600
16gb ram corsair 3200
And gtx1050ti
I want to buy 4060ti 8gb (version)
But i am confused so someone's opinion might help my situation
@lefterisdiamantopoylos7708 I have a very similar setup, but I have an RTX 3060, and I'm thinking about a 4060 for 1440p gaming, but idk, I don't know much about PCs, that's why I'm watching this video lol
@@lefterisdiamantopoylos7708 I'm currently upgrading my computer as well. My Ryzen 5600X just got delivered and I'm switching between the RTX 4060 and the Ti version. Many people say to get the RTX 3060 Ti, however the 4060 Ti is often cheaper than the 30-series card. I'm kinda pulling my hair out over it, frankly.
@@josiahsegovia04 save up for a 4070, 4080 or 4090 instead. I've seen a lot of benchmark tests and their performance is very similar. About the only thing the 4060 wins over the 3060 Ti is lower power consumption.
Same here happy with my upgrade from 1060 to 4060, if you got 3000 series don't buy 4060.
Can't wait to see the 4060 ti vs the 4060 ti 16GB video, that will be fun.
In VRAM-non-demanding games (basically 99% of games at 1080p) there will be no difference at all, with literally the same fps.
@@daesz with more people switching to 1440p it could still make a difference there, but the price is still too high, even more with the 16GB version
@@Chris3s with high-VRAM-demand games at 1440p, the only thing 16GB of VRAM can do is normalize the 1% lows and eliminate stuttering (with lower average FPS due to higher res). Only an Nvidia NPC would pay $100 more to eliminate the stuttering 😂
@@Chris3s It wouldn't make a lot of difference because it's still bandwidth starved.
@@Chris3s I would rather consider a 6700 XT or even a 4070 than the 4060 Ti 16GB at that price tag :)
Love your summary graphs and quick conclusions. Leaving a like and a comment to offset my short watch time!
Nvidia gives you the bare minimum at a premium price to run your games at 1080p, so they can profit from you again in the near future.
Planned obsolescence and greed at its finest.
Nvidia gimpworks 2.0 except now it's the hardware not the software.
If you need a 1080p card, there are better cheaper options. These two cards may be fine for today's games but not much future.
The RX 5700 XT would be one of those choices.
@@mitchjames9350 and the RX 6700 XT
@@mitchjames9350 Or the never-dying Polaris cards with 8GB VRAM like the 570 or 580. You can get good models for €50-80 here in Germany. I am getting one for my old computer for my daughter.
I got the 4060 replacing my old 1060, because there is nothing else under 120W at the same performance level for my m-itx box. Waited long enough, 50 series not coming before 2025.
At that point for the price of one 4060, I might as well get something like a used 3070 and just sit on that for a while. 4060's start from basically £300 and you can get a used 3070 for around that or less if you're lucky
Nice to finally see more reviewers being a lot more blunt about the upsell, and actually stating the model it SHOULD have been (would like to see this done with a more complete list of consumer cards. Call it analysis if you will).
A lot of people clearly need this information properly hammered in to avoid obvious mistakes, so I don't mind seeing pieces like this, since repetition seems to be all some amount of consumers react to (the remainder either don't care, or are flatout just too stubborn/senseless brand loyalists).
Thumbs up!
Thank you for saying what they are, 50 class cards! Love the in depth analysis you guys do, much appreciated.
Ah, you fooled me. I clicked expecting to see the 16GB review today...
Wasn’t sampled to reviewers.
Same
@@VoldoronGaming That's how you know Nvidia know it's shit.
@@VoldoronGaming I know, they still might have got one through another channel and rumor said they were on sale today. Though I don't see them selling anywhere, so maybe the rumor was wrong.
I thought the timing was quite curious on this one - the 16GB for the 4060ti and extra $100 wouldn't change the conclusion though 🤔
Steve, don't forget to mention that these cards use 8 lanes of PCIe, while the previous generation used all 16. This makes this gen SLOWER if used on a motherboard with PCIe 3.0. der8auer did testing on PCIe 3 and 4, check it out.
For 288 GB/s, 8 lanes is plenty. Now if they had put in the proper 448 GB/s of the 3060 Ti for comparison, then 16 lanes would have been appropriate...
@@jasonking1284 the problem is exactly that 8 lanes of PCIe gen 3 are as fast as 4 lanes of PCIe gen 4, so you get 8 GB/s. On an x16 slot you would get 16 GB/s. Considering DDR4 RAM has a bandwidth between 19.2 GB/s and 25.6 GB/s depending on MT/s, you're definitely gonna feel the difference here.
These are meant to be the "budget" option and unless you have a brand new system (which might even have PCIe 5) or a higher/decent end one from last gen, you WILL be stuck on gen 3.
Which wouldn't be a problem if it wasn't just x8. It won't be as bad as the 6500 XT's x4, which dropped it to 1050 Ti performance in gen 3 systems, but it will be a problem.
@Anankin12 I have seen some reviews regarding this very issue. It makes very little difference to the 4060. So for the 4060, it doesn't appear to be a significant "problem"
@@jasonking1284 that's "good" then
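The lane math in this thread can be sketched quickly. A minimal sketch of per-direction PCIe link throughput, using the standard PCIe figures (8 GT/s for Gen 3, 16 GT/s for Gen 4, 128b/130b encoding); these numbers are general spec values, not measurements from the video:

```python
# Approximate per-direction PCIe throughput per lane, in GB/s.
# Gen 3 runs at 8 GT/s, Gen 4 at 16 GT/s, both with 128b/130b encoding.
PER_LANE_GBPS = {
    "3.0": 8 * 128 / 130 / 8,   # ~0.985 GB/s per lane
    "4.0": 16 * 128 / 130 / 8,  # ~1.969 GB/s per lane
}

def pcie_bandwidth(gen: str, lanes: int) -> float:
    """Per-direction link bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBPS[gen] * lanes

# An x8 card dropped into a Gen 3 slot runs at the same link speed as
# Gen 4 x4 - about half of what a Gen 3 x16 link would have provided.
print(round(pcie_bandwidth("3.0", 8), 2))   # ~7.88 GB/s
print(round(pcie_bandwidth("4.0", 8), 2))   # ~15.75 GB/s
print(round(pcie_bandwidth("3.0", 16), 2))  # ~15.75 GB/s
```

This is why the x8 link only hurts on older Gen 3 boards: on Gen 4 the x8 link already matches Gen 3 x16.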
Also, remember people: a 128-bit bus means they'll age WAY more poorly than cards in the same performance tier today that have a decent bus width (like, say, a 3070).
All in all, this video felt like "what bug should you eat? Well, the smaller and less squishy one... if you just have to..." and I'm all for it...
It's been this way since the 9700pro
That's why I go 256-bit bus or nothing, kid. A 256-bit bus and 16GB VRAM should be the standard, but Cocaine Jensen gotta milk it.
The 3060ti will stomp on this card when you bench it in games coming out 2-3 years from now.
@@supabass4003 We desperately need another 9700 Pro. But the 3000 series was okay, like the 3080 with its 320-bit bus; if it just had more VRAM it would age well.
Right. I had a 4070 Ti running MW2 at 1440p with ultra/extreme settings and DLSS on, and though I was getting 150+ fps mostly, with the low memory bus and bandwidth you felt slight hitching in the 1% lows. I think that's what people don't take into account when buying a GPU: they look at the average or highest fps but forget about the 1% lows, and there's a reason they're felt on the 4070 Ti. If the new COD is even more demanding you'll end up turning settings down to maybe balanced, which looks muddy. I know comp players play that low to get more fps, and in COD that's not a big issue to a lot of people, but you buy a GPU to turn settings up, not down, so it can go either way.
I've seen the same test on YouTube with a 4090 with no problems at all, and even a 4080, being 256-bit with 700+ GB/s of memory bandwidth, would handle those settings much better.
TL;DR: buy the most balanced GPU you can get. The 4070 Ti, again, was holding frames most of the time, but those 1% lows can be an issue (quick hitching), and a lot of the time you may not see the fps counter move, but sometimes you will.
Hi, will there be any benchmarks with these GPUs using PCI-E 3.0?
I would think that a lot of gamers that are looking at buying any of these two GPUs are still on older, maybe AM4 systems that are using PCI-E Gen 3.0. It would be interesting to see if it makes a difference.
Are you planning a Thermal Grizzly KryoSheet review? I wonder if there's much difference between it and thermal paste for non-delidded CPUs, and also GPUs.
Again: the scaling problems shown here, caused by bandwidth limits, will not be solved by doubling VRAM with the Ti 16GB. The bus width will remain 128-bit.
So what?
But I say you are entirely wrong: that extra 8GB will usually be used as cache, and in ray tracing games it will be used for, you know, RAY TRACING.
Not that there is much point doing more than light RT on this speed of card.
@@markhackett2302 You're not understanding that all the cases where the 4060 Ti was barely any faster than the 4060 were due to memory bandwidth limitations. The extra 8GB will do nothing to help that.
@@DragonOfTheMortalKombat You're not understanding. The cases where it failed were bandwidth starvation, but that occurs WHEN, exactly? When VRAM doesn't hold the assets. Double the VRAM and it doesn't "double the bandwidth", it reduces the NEED for bandwidth. If the card had 2TB and loaded up EVERY ASSET, it would take a long time to start the game (bandwidth), but it could then use 0 bits/sec of bandwidth afterwards BECAUSE IT NEVER HAS TO GO OUT OF VRAM EVER AGAIN. What YOU are failing to comprehend is that this is an extreme case for illustration purposes. An increase to 16GB won't hold the ENTIRE game's assets, but it will increase the VRAM hits.
Now completely muddying the waters, there is a bandwidth from VRAM to the GPU, but if the card had 2TB of L2 cache...
So, an increase of VRAM will not increase GPU-VRAM bandwidth, but that isn't all the difference, is it? Some of those assets are bigger than 8GB can hold.
See Hogwarts not loading in textures, or Halo Infinite not loading textures, etc. THAT won't happen any more. So do you want SOME games to render BAD frames quickly or GOOD frames slower?
So SOME of the loss is not "runs out of VRAM bandwidth", it is a mix of "runs out of VRAM bandwidth" AND "runs out of VRAM". And you ignore and don't comprehend that it is BOTH things because you want to complain about only one.
@@markhackett2302 people go by results either way, and at this price this card is trash.
Feel free to defend it and buy as many as you like for yourself. The rest of us are chilling till next gen.
Who has claimed scaling will be improved with more VRAM? I don't think anyone ever has. Running out of VRAM with just 8GB will be solved with 16GB.
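The capacity-versus-bandwidth argument in this thread comes down to one formula: peak memory bandwidth is bus width times per-pin data rate, and fitting double-density VRAM chips changes neither term. A quick sketch using the cards' published bus widths and GDDR6 speeds:

```python
def vram_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak VRAM bandwidth in GB/s: bus width (bits) x per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# RTX 4060 Ti: 128-bit bus, 18 Gbps GDDR6 -> 288 GB/s
# RTX 3060 Ti: 256-bit bus, 14 Gbps GDDR6 -> 448 GB/s
print(vram_bandwidth_gbps(128, 18))  # 288.0
print(vram_bandwidth_gbps(256, 14))  # 448.0

# A 16GB variant on the same 128-bit bus at the same data rate:
# capacity doubles, bandwidth is unchanged.
print(vram_bandwidth_gbps(128, 18))  # still 288.0
```

So the 16GB card fixes "ran out of VRAM" cases but leaves the 288 GB/s ceiling exactly where it was.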
I'm looking forward to the 4060 Ti 16GB review. I'm expecting it to make me feel really pleased about the ex-mining 6800 XT I purchased for $390 USD.
Awesome job mate! thx
With the lack of anything new to take a look at and the fact that many of us are still using older cards (1080 ti here), how about some performance comparisons between older cards and newer cards.
I don't know how much impact newer drivers would have on older cards or how far back that support extends, and I'm certainly not suggesting you test every card released for the last ten years or anything, but surely you have data available from previous testing that could be displayed in chart form.
Cheers
My RTX 3070 keeps laughing while watching this comparison 😂
F for buying an 8GB GPU for probably around $750-800 back then :)
@@NostalgicMem0ries Actually 300 USD used,but no mining with it. Bought from a friend.
Hey Steve, you could test the RTX 3060/3060 Ti vs RTX 4060/4060 Ti in an old PCI Express 3.0 system. I have an X370 motherboard with a Ryzen 5600 and 32GB of RAM, and I dunno if the narrow PCI Express x8 link could cripple the newer cards to the point that the older 30 series is the only viable choice I have.
There is a 4060 Ti vs 3060 Ti comparison out there on a PCIe 3.0 system, and they were pretty much tied at 1440p, with a 4% difference at 1080p in favour of the 4060 Ti.
Other YouTubers have already done side-by-side comparisons. No performance difference at PCIe 3 x8.
What cpu are you using?
When is the RTX 4050 launching, and would you recommend it over the Arc A750?
This generation makes me feel great about my reference AMD 6800 I got around launch for MSRP. At the time, I thought it was overpriced (technically was), but this generation makes it look like a great investment.
Well when you watch channels like this and it shows you Cyberpunk 2077 RT running at 25fps instead of the 80fps or whatever it is with DLSS3 you're getting a skewed reality.
@@Phil_529 As a mainly FPS player though, I would never use DLSS 3 because of the added latency. When Im buying a card, I want to see raw performance. Not performance with add-ons like DLSS or FSR. Those of us who are watching these various reviews should know our own use case. As an FPS player, frame generation does not make sense. But perhaps as a player who mainly plays single player games or games that are less twitch based, by all means, frame gen it up. So I'm going to look at reviews that don't lean heavily into DLSS or FSR, whereas you may lean more heavily into reviews that do emphasize DLSS or FSR.
@@Phil_529 it is true and each to their own. In esports games, dlss and rt are meaningless.
@@Phil_529 software upscaling isn't a hardware performance issue.
@@Phil_529 only 130 or so games even support DLSS 3, so it's not even in question at this point.
Sooooooo glad I bought a 3080 12g instead of a 4070, which is what I was considering. 4080 would be nice to have, but the price is insane.
The 4080 is the best GPU this gen if we don't include its price: great bus width, great VRAM capacity, performs like a juggernaut in all titles. But yeah, the price is just insanity... if it was $799 or at least $899, many would have bought it instantly. Now... damn, I can build a great midrange PC for $1200.
Past the 1000 series, ALL Nvidia prices are insane...
Yup, bought a 3080 12GB after the crypto crash and I suspect I'll be staying with it for another GPU generation or two... or three...
@@Varmint260 at minimum until the 50xx series is fully complete.
@@jasonking1284 The 30 series would've redeemed the 20 series pricing if crypto hadn't happened. The 3080 at $700 was a great deal.
Think I'll wait to see what next gen offers...hopefully some proper VRAM for a start (plus memory bandwidth)
Isn't today the release of the 16GB version?
I didn't realise it yesterday, but this video is very well timed. It's quite a sick burn on Nvidia given the 16GB card: to avoid being 'as bad', it has to raise performance enough for the cost per frame to stay the same value, and it just won't.
Can you do an update on the 7900 XTX? The latest drivers brought a crazy FPS boost in some games.
Dude, mine hits 3GHz and over 1.1 TB/s on its 24GB of VRAM. This thing destroys, and you've got people buying 12GB 4070 Tis for the same $$ 😂😂
How much is the RAM in this pc that you used?
It would be nice to see a comparison of the specs table in the beginning of the video.
It's like asking if a swapped 100bhp moped is worth it
Thanks for the video, but i'm gonna pass on both of these, and stick with my AMD RX 6650XT 8GB, as the open source drivers are great on Linux.
I never expected the Ti model to be faster by an amount equal to its price premium. Have they ever been? Usually you pay more per fps for each extra fps; it's not linear.
Is Tim going to review the 7040 mobile CPUs? There are a ton of products hitting the market but no in-depth review so far.
the title should be RTX 4050 Ti vs RTX 4050
4050 ti vs the 4060
@@VoldoronGaming nope, none of the relative core counts reflect a 60-class card.
So glad I got my RX 6800 a couple years ago.. this current generation will age like fine milk.
Is 4060/4060Ti worth buying if what you need is AV1 encoding/decoding support plus low power consumption and you do close to no gaming?
Hell no! Just buy an Intel Arc A380. The AIB ones are even powered by the slot alone. All of the Arc GPUs have AV1 encode/decode support.
@@cielazul713 But I've heard they're not that stable in use. There are many bugs.
@@amandeeps9842 That is old information. Arc launch drivers were very bad, but that was a long time ago, and YouTubers never reported honestly on the current drivers, which are very good now. You say you do almost no gaming, so you should not worry, as Arc has always been perfect for creator work like AV1 since launch. The A380 is a perfect match for your usage requirements, plus it is a very affordable option. If you do not trust me, please do some research on Google for "recent Arc drivers"; do not only get tech information from YouTube.
@@cielazul713 That’s nice. I’ll check more about it. Thanks!
@@amandeeps9842 You are very welcome 😀
So glad I got a 3060 Ti a year ago. My friends were telling me to wait for the 40 series and I didn't. It was the right decision, as I got a huge discount on the price at the time.
To me, these two look more like 4050 and 4050 ti.
Even as an Nvidia fanboy I can see these are really a 4050 and a 4060. Hope the stack is less confused in the 50 series.
More like 4050 and 4050ti
We'll wait and see next year 2024 👍🏻
@@EarthIsFlat456 The "Ti" was, canonically, the higher-tier cut-down. Either it is a 60-class card because Nvidia isn't making anything SLOWER, but it should be PRICED appropriately to the 50 class because that is where the hardware leads (not compared to the 3060's price, because if you DO that, then it is not an uplift, or barely one, around 5%, and the verdict is entirely "don't get this card, get a 3060 instead, it is cheaper"). Or it should have been called a 4050, so that if/when a 5060 comes out, customers aren't expecting a 60-class card to be 30% or more cheaper than previous 60-class cards.
@@Shadowsmoke11 probably 2025 tbh
@@The23rdGamer Any facts to back up your claim?
This reminds me of the end of the Radeon GCN architecture: Vega. Starved from memory bandwidth to keep up the gaming performance. The only difference being that lower-end GCN: Polaris didn't suck.
Why didn't you test with DLSS? That makes RT Overdrive in Cyberpunk run at close to 100fps on the 4060 Ti.
4060 seems solid for 1080p 60fps, with dlss and fg room for anything super new or with RT or UE5. Mainly depends on what price you can get it for vs other (and potentially used) cards
Very true. Even though at that particular price point you can get more out of AMD, it still largely depends on whether you can get them in your region, and sometimes the prices don't follow MSRP like in the US.
I got mine for 240. I thought it was pretty good value for that.
A brand new card should not need to be limping around and relying on gimmicks straight out of the gate. This goes to show that these cards will not age well and you will be forced into an upgrade sooner rather than later if you ever want to see any real frames again lol.
@@leanlifter1 I get your point and I agree the 8 gigs is a bit anemic for 2023 but it's power efficient for a 1080p card and is the cheapest way into dlss3.5 and frame generation. I agree that 12 GB of vram would help a lot, but to me (who hasn't bought a GPU since 2007) it was a nice upgrade for under 250 and with every new game that uses dlss3 it makes it better value than older cards in the same price range (for me- for others maybe it isn't)
@@leanlifter1 I don't disagree. I have a pc I put together at the moment for 400 pounds (i7 6700k, 32gb DDR4, 512gb nvme, 1tb HDD, Asus Rog hero VIII motherboard and a 4060) I'm gonna upgrade the CPU, ram and motherboard at Xmas and resell the old case and components then- should make most of my money back before I've even bought it. So I can probably get a low to mid range new budget build done for about 500 quid. (12gen intel or AMD with the 4060 not sure yet)
Point is I'm happy with 1080p gaming. I don't need to have every game maxed out at 1440p or 4k. As long as I can get 60fps in most games I'm cool with it. Too many people just regurgitate verbatim any crud that they read online without actually thinking about it first. Like I said for someone like me that hasn't pc gamed for 15 years, this will be a nice little upgrade. Personally I'll be shocked to even see a game run on my computer that isn't minesweeper 😂
While the 8x slot makes little to no difference on PCIE4.0, I'd like to see how it would hold up in a 3.0 slot.
I would think that most people don't upgrade their whole rig when they get a new GPU.
I don't know, but I expect the cards to be too slow for it to matter. My old 1080Ti hardly performed better in a PCIe 3.0 X16 slot than in an 8X slot. But they are different architectures, and Pascal wasn't very powerful at random compute, so it may have needed less PCIe bandwidth. Only testing could tell. And I do think it's relevant because a 5800X3D could easily run in an old X370 or X470 board.
A 3.0 slot is usually paired with RAM slower than DDR4-3200, a CPU slower than a Ryzen 5600 (except for the 5700G), and an older motherboard, so you may be taking direct FPS hits from things more worrisome than the lane count on your very slow PCIe link between CPU and GPU. There is no need to go beyond an RX 6600/5700 XT, or at most a 3060, on such a platform... the 4060 may be good enough most of the time, and the RX 6700 with 12GB and x16 will also be on the brink of delivering 100% of the available performance, but you'll have a bigger bottleneck somewhere else. It will depend on each case which is the cheaper way to upgrade only one thing and get the most bang for your buck for some months or at least a year.
Makes no difference either. Other YouTubers have done side-by-side PCIe 3 vs 4 tests on these GPUs. The difference was less than 1%, which is within the margin of error.
@Argedis the difference is bigger than what you claim. der8auer did a test on a 4060 Ti and 3060 Ti using PCIe 3.0, and the 3060 Ti ended up being as fast as the 4060 Ti. Search "der8auer 4060 ti" to see his test. He also talks about how little the power saving really saves you.
Edit: This means the 15% lead the 4060 Ti had over the 3060 Ti with PCIe 4.0 is reduced to 0-5% when using PCIe 3.0. der8auer even showed one game where the 3060 Ti beats the 4060 Ti at 1440p.
Yeah, I'm on an AMD B450 board which doesn't support PCIe 4 no matter what CPU is in it.
Steve, great video, but I must say that if someone is juggling between the 4060 and the 4060 Ti, it would be really interesting to take a look at DLSS frame-gen differences, if any, at both 1080p and 1440p, as for better or worse it's one of the biggest selling points of the 40 series.
newer DLSS just might be the only reason to get 4060 over 3060 or 6700
So which one should I get?
When will you guys upload the 4050Ti to 4050Ti 16GB comparison ?
And another gen that makes me pray that my 1070 survives until another launch.
Buying 1070 or 1080 at that time was a great decision. They were high end yet affordable.
If they release a card that will let me play 4k 144hz without evaporating my wallet, I'll buy it. Until then I'll stick to 6600.
Purely between them, I'd definitely pick the 4060, but 6700 XT destroys them both.
The 3060 Ti too, if the person doesn't plan to change the GPU in the next 3 years, because of the bus for upcoming games, or if they need Nvidia for productivity tasks.
Unfortunately, that is not true. The 4060ti can easily match the 6700XT.
@@jasonking1284 In terms of value, doh.
Ti stands for Tiny improvements
Nvidia really said "EFF the gamers!!" and dropped the RTX 4000 series.
If/When NVIDIA makes the prices right they will be good products. It’s too late to change the naming but they can always drop the price.
The 4060 will be a turd at any price. Drop the price and it becomes a shiny turd.
Nvidia must be the only one who can bring laptop graphics to those who want it in their desktops!
Just look at their actual laptop graphics too 😮: the power-constrained 4080M being called a 4090, etc.
Thanks, Steve!
Just love the outro music! 😂
I would say that acceptable pricing of RTX 4060 cards looks like this:
RTX 4060 8GB Base - $249.99
RTX 4060-Ti 8GB - $299.99
RTX 4060-Ti 16GB - $339.99
As you said, the jump from the Base Model to Ti 8GB is certainly not worth $100. But it could be worth $50 to some. Even if some specific scenarios show very small differences, some scenarios show as much as 30% - so you gotta balance that out.
A couple of years ago people were buying the RX 580 as the "bargain-bin last-gen" 1080p gaming option for $200.
With the RTX 4060 Base being a current-gen card offering much more performance than the RX 580 did even when it was new, coupled with the DLSS-3 feature, I feel a $50 upcharge is acceptable as the new "bargain-bin" option.
The RTX 3050 had an original MSRP of $250 anyway, and since these RTX 4060s use 50-class GPU dies and bus widths, having the same price seems perfectly reasonable now, doesn't it?
The $100 jump from Ti 8GB to Ti 16GB is absolutely asinine. It costs Nvidia literally less than $20 for 8GB of VRAM. So in this new pricing structure we give Nvidia 100% profit on the extra VRAM and call it $40 to go from Ti 8GB to Ti 16GB; again, seems perfectly reasonable, doesn't it?
Now you might be thinking if the 4060-Ti 16GB only costs $340, then the RTX 4070 Base pricing seems stupid as hell - and you are exactly correct. We didn't even get an uplift in Cuda Core count but we did get 30% more performance through other means and an extra 4GB of VRAM - so the acceptable price would be $500 - same as the 70 series has been for 3 generations now.
You gotta remember that ALL GPUs are still overpriced right now despite the fact that GPU sales are at an all-time low. Its stupid but its just the way it is. Unfortunately absolutely NOTHING from the RTX 4000 series is even a remotely decent value at MSRP, companies are still looking to benefit from the scalping days.
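Since the thread above is really a cost-per-frame argument, here's a trivial sketch of the arithmetic. The prices are the ones proposed above; the average-FPS figures are made-up placeholders, not benchmark results:

```python
# Cost-per-frame sketch. Prices are the suggested ones from the comment
# above; the average-FPS figures are hypothetical placeholders.
def cost_per_frame(price_usd, avg_fps):
    return price_usd / avg_fps

cards = {
    "RTX 4060 8GB":     (249.99, 80),   # hypothetical 1080p average
    "RTX 4060 Ti 8GB":  (299.99, 95),
    "RTX 4060 Ti 16GB": (339.99, 95),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${cost_per_frame(price, fps):.2f} per frame")
```

Swap in the averages from the video's 1080p/1440p charts to see how the proposed prices would change the cost-per-frame standings.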
bruh where u gettin a 4060ti at for 300
@@rileymaynard2132 I'm saying that's where the prices SHOULD be before people actually buy them
@@rileymaynard2132??
The 4060 Ti is so close to being an actually great card. How much would it really have cost Nvidia to give it a better bus and 12GB for $400? It would have sold orders of magnitude better for barely less margin.
It isn't just memory. Remember, the 3060 Ti performed similarly to the 2080 Super.
@@fruitcake4910 It wouldn't be as good as it could be. It'd still have the performance of a SKU lower than it should, but it would go from an incompetent chunk of e-waste to an actually great option for 1440p gaming.
The only 12GB options Nvidia has are the 3060, which is starting to fall behind in power for 1440p, and the 3080, which is still much more expensive.
So HOW do they give it 4GB more, hmm? Should they gimp the bandwidth further with 3 channels of 4GB, or should they complicate routing because half of the channels would have twice the memory capacity of the other half, or should they throw those chips in the skip entirely and make a new mask for a new chip? Come on, you have SO MANY IDEAS about how it HAS TO HAVE an extra 4GB, you MUST have thought about HOW, right? Right????
@@existentialselkath1264 And the 3060 only got 12GB because the 6600 existed at 8GB, so 6GB was off the marketing table.
@@markhackett2302 they could have just designed a better chip from the start instead of cheaping out?
Clearly it's too late to change now; that's why they've messed everything up with the 16GB model to compensate, similar to how they screwed themselves over with the 12GB 3060.
Nvidia keeps cheaping out on the bus width, only to realise it too late and be forced to overcompensate in VRAM, because they can't change it after it's done.
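For what it's worth, the capacity constraint being argued about here follows from how GDDR6 is attached: each chip sits on a 32-bit channel and ships in 1GB or 2GB densities, with clamshell mode putting two chips on one channel. So total capacity is locked to multiples of the channel count. A simplified sketch (it ignores exotic mixed-density configs):

```python
# Which VRAM capacities a given bus width allows, assuming standard
# GDDR6: one 32-bit channel per chip, 1GB or 2GB chip densities, and
# optional clamshell mode (two chips per channel).
def possible_capacities(bus_width_bits):
    channels = bus_width_bits // 32
    caps = set()
    for density_gb in (1, 2):
        caps.add(channels * density_gb)        # normal config
        caps.add(channels * density_gb * 2)    # clamshell config
    return sorted(caps)

print(possible_capacities(128))  # -> [4, 8, 16]
print(possible_capacities(192))  # -> [6, 12, 24]
```

This is why a 128-bit card jumps straight from 8GB to 16GB, and why "just add 4GB" would have required a wider bus in the first place.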
In my country there is about a $50 difference between them. Is the Ti worth it at that price?
None of them
For a $50 difference, if you only want an Nvidia card, then I would take the Ti, unless power consumption is very important to you, because the 4060 is only around 110 watts, I think.
I'm kind of doubting it's memory bandwidth that's the issue in some titles, but rather PCIe 4.0 bandwidth. When you go over 8GB they both have to use the same x8 PCIe 4.0 bus.
No Nvidia GPU is "worth" it for its price at the moment, not even the 4090.
Agreed. Their whole lineup is priced 1 tier higher than it should be. It's intentional.
The 4090 doesn't even have displayport 2.1. So there is no futureproofing. What a rip off.
3060? It is $260, literally the cheapest GPU with 12GB VRAM. All the others are trash, but this one's good.
@@DragonOfTheMortalKombat While the 3060 was a very weak generational uplift, the 12GB makes the card actually quite OK, but I would still prefer a $320 6700 XT, because it has more raw horsepower and ray tracing is a no-no on the 3060 anyway. Otherwise the 3060 is fine.
For Nvidia's 40 series it basically boils down to either 1) you have the money to just throw at the 4090, or 2) wait until 2025 and the 50 series.
Hello, I have one question. I am building a new computer from scratch; what would be better for me to buy, a 4060 (Gigabyte Gaming) or a 3060 Ti (ASUS TUF)? At prices in my country it's 390 for the 4060 and 460 for the 3060 Ti. The main condition is that I do not want to upgrade the computer for a long time.
My processor is a Ryzen 5 5600.
How about AMD GPUs (6600 XT, 6650 XT, 6700, 6700 XT, 6750 XT)? Have you also considered used?
@@TonyChan-eh3nz I thought about AMD's GPUs but came to the conclusion that I want to start doing basic work with 3D graphics, and that option is not for me. The used market is also out because I'm not the most careful user, so it's better to have everything under warranty 😁
@@user-lv9gb6gd8n Get the 4060, the 3060ti is not worth that much more.
The really confusing aspect of almost the entirety of the RTX 40 series is just how good Ada Lovelace is as a uArch, yet Nvidia has deliberately sabotaged every card from the RTX 4070 Ti and downwards by nerfing the hell out of the memory subsystems. What is the point of this? It's like buying an exotic sports car and replacing the engine with a four-cylinder Hyundai juicebox.
"Value is poor" true words said by a wise Man
Battle of the E-waste
Would be interesting to see how much you can OC the 4060 Ti's memory and how that scales, considering the narrow interface... Could be worth a try?
If it's ever cheap enough to justify buying, maybe, but don't hold your breath for that. It's never going to come anywhere close to the memory bandwidth of a 6800 XT.
@@syncmonism The PATA to SATA principle...
Hi Steve, can we get a comparison of the 4060 Ti vs the RX 6700 XT?
The 4060 & 4060 Ti should've indeed been called the 4050 & 4050 Ti. But Nvidia is trying to push the market, quite unsuccessfully so far. I will not upgrade from my 3060 12GB to an 8GB card, even if the 4060 Ti is in theory much faster, especially as I've been on a 1440p monitor for a few years now.
Also, the 4.0 x8 interface is a disaster; not everyone who wants to upgrade has PCIe 4.0. I'm on Z490, and a card running at 3.0 x8 would not run optimally.
I was so impressed with the 4060s of both stripes that I went out and bought an RTX 3060 12GB. Locally the RTX 4060 is 15 to 20 percent more expensive than the RTX 3060. It would be interesting to see how these cards would perform if they were clocked the same as the RTX 3060 / Ti.
Bruuuh, the clock speed is 50% higher, but there's only a 10% perf increase. That would mean that at the same clock, the cards are around 27% slower (1.10 / 1.50 ≈ 0.73)...
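A quick sanity check on that iso-clock arithmetic, using the rough 50% / 10% figures from the comment above (the commenter's estimates, not measured numbers):

```python
# If clocks are 50% higher but performance is only 10% higher, then
# performance per clock is the ratio of the two uplifts.
clock_ratio = 1.50   # +50% clock speed (commenter's rough figure)
perf_ratio = 1.10    # +10% performance (commenter's rough figure)

perf_per_clock = perf_ratio / clock_ratio
print(f"Relative perf per clock: {perf_per_clock:.2f}")        # ~0.73
print(f"Deficit at matched clocks: {1 - perf_per_clock:.0%}")  # ~27%
```

So at matched clocks the new card would be roughly a quarter slower per clock, which is the comparison an underclocked head-to-head test would show directly.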
The problem with that is it's what Nvidia wants you to do. You are buying a cheap-to-produce last-gen GPU that Nvidia needs to get rid of, only at a much higher price than typical. In any other GPU generation, if you bought a last-gen product you got a steep discount.
@@giglioflex Yeah, but now he isn't upgrading for a while, and Nvidia's existing 4000-series stock will gather dust on the shelves until they bring serious cuts. Other than buying used, it's the best option in a shitty situation.
Hopefully the GPU market starts becoming competitive again soon
@@scroopynooperz9051 Dunno if it's everywhere, but what I see on eBay are sellers of used cards who must be high or something: they're asking only 50 dollars less for used cards than what they go for new. The RTX 3050 is the biggest offender.
@@scroopynooperz9051 The problem is that Nvidia has simply reduced 4000 series GPU production to meet demand. Nvidia simply allocated more wafers to AI chips instead. It's a win for Nvidia either way.
So should I get the 4060 or the 4060 Ti for my first PC (I don't own a previous PC)? The difference is $65.
Neither
Why do you review the 4060 Ti 8GB version on the day the 4060 Ti 16GB comes out? Are you fishing now?
If AMD keeps prices down on the upcoming 7700 and 7800, and the VRAM is 12 or 16GB, then Nvidia isn't going to sell many more 4060-series cards...
Roll on benchmarks/release day in the coming months.
Go check European prices for the 6700, 6800 and 6900 (and 7900) AMD cards and answer that yourself. I may end up buying an NVIDIA card sooner than expected even without wanting to... Also check the prices and availability of Intel cards in Europe and cool down your hopes and wishes.
I would love to see AMD be willing to take a bit less profit and completely wipe the floor with Nvidia with the 7700 and 7800 series cards. Nvidia has left that opening with a really disappointing 4060 lineup. AMD Radeon could win a lot of favor and mind share that way.
As always you benchmark GPUs fast Steve! And I've come to watch the video very fast as well!
you peasant
Would you consider adding Diablo 4 to benchmark list?
Let's race these horses. They've both been starved. We broke two of this one's knees.
The 4060 should have a 192-bit bus, and the 4060 Ti either a 192- or 256-bit bus, to reap full performance. There also needs to be a $75-100 price drop for each.
8GB of VRAM isn't worth more than $300, and this level of performance isn't worth more than $400. And at those prices it is "meh" AT BEST.
But then they should rename them to what they are: the 4050 and 4050ti.
@@Velerios No, naming is, for us, not an issue. Call it the 4060 and 4060Ti, but price them according to the 3050 (250$) and any hypothetical 3050Ti (because it wasn't made). If this is their bottom tier, it really doesn't matter if they "leave a hole at the bottom" or leave a hole in the middle, but doing the former is more reasoned, so call their bottom tier 4060s, just PRICE them as bottom tier. It isn't "4050, so it is their slowest card", it is "It is their slowest card".
@@markhackett2302 The naming is very important. It sets the market price expectation. That is why they have named this a 4060/TI because it would be easier to price it higher than if it was labelled it's true name, 4050/TI. Then the prices as they are set now would be obviously ludicrous. It's a scam any way you look at it.
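For context on why the bus width keeps coming up in this thread: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch using the published GDDR6 data rates (18 Gbps on the 4060 Ti, 14 Gbps on the 3060 Ti; the 192-bit line is a hypothetical config, not a real card):

```python
# Peak memory bandwidth in GB/s: bytes transferred per cycle across the
# whole bus (bus width / 8) times the per-pin data rate in Gbps.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(128, 18))  # 4060 Ti (128-bit, 18 Gbps): 288.0
print(bandwidth_gb_s(256, 14))  # 3060 Ti (256-bit, 14 Gbps): 448.0
print(bandwidth_gb_s(192, 18))  # hypothetical 192-bit config: 432.0
```

Nvidia's counterargument is the much larger L2 cache, which raises effective bandwidth, but the raw number is what the 256-bit last-gen cards are being compared on.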
I'm going to assume no, will be a fun watch.
What is the CPU used in the video for the results?
2:54
Did you consider testing the 4060 Ti 16GB?
At this point I'm no longer watching the videos for information, instead just for the Nvidia roasting.
On that note: looking forward to the 4060 Ti 16GB review.
This is the problem. HUB's content comparing GPUs is getting dry, particularly ones that they don't really recommend.
In my opinion, it's just a bunch of benchmarks. We all understand that current gen GPUs are bad value for money, but it's time to shift the focus onto something else. Like, why are they not talking about video game tech? For instance, the advancements of GI seem to be moving the needle forward, making games look significantly better, as accurate lighting is extremely important in creating an immersive experience. That's much more exciting than "moar fps".
Maybe it's not their area of expertise, but they can at least try to be excited about the development of graphics, especially the UE5 tech, or even what's being achieved in games like Starfield.
@@someasianguy8493 First, HUB is more about _hardware_ (the channel is called *Hardware* Unboxed for a reason). Second, it's hard to get excited about global illumination when the processors needed to render said global illumination are too expensive, too weak, or both.
@@sonicboy678 I knew someone would mention this, but ray tracing capability is still something that needs to be considered on hardware. They don't really have much content on ray tracing, and that does colour our perception.
Tim advocates on buying OLEDs to have the best possible picture quality, but when you play Dying Light 2 without ray traced GI, then how are you getting the best visual experience? Like, that game just looks terrible once you turn off ray tracing.
@@someasianguy8493 Uh, Dying Light 2 looks terrible full stop. With or without Global Illumination Ray Tracing. It isn't a game that I want to play or look at. Now YOU insist that it looks good, but only with GI and RT. But others say it looks good without GI and RT. Why do you get to define what reality is and not others?
Now add on the 12 FPS you would get (oh sorry, doubled up to 24 FPS, sorry, sorry), and it is fine because you like its looks, but without RT and GI you get 120 FPS.
Haven’t watched the video yet but I’m gonna guess the recommendation is AMD based on the current pricing structure.
Could you make it more obvious what monitor you're using, as that makes the most difference.
Nice comparison of the 4050 and 4050 ti
I would be interested in seeing a comparison of these cards from an all-round point of view, not just "gaming" and who pushes more FPS. I mean gaming, productivity, AI generation, power consumption. I may game quite often in my free time, but for work I need to do some Photoshop (hence AI generation with Stable Diffusion), some Premiere for short videos, and mostly code generation with local language models. I also work from home 90% of the time, so I'm watchful of my electricity bill. I hear that this generation is not crazy, but if I were to build (not upgrade) a new rig, how does this generation compare when you put all of this in the balance? Which is the all-around champion in this price category?
Ah, yes, AI, which you don't do on these cards, and anyone "trying" (nee arguing) to do so is just trying to find a justification to say "Not AMD".
If it's for work, buy a 4090. Everything else will be a compromise.
You won't see that on the internet. Hardware reviews have been taken away from tech media journalists, and you will only find infomercials about visceral gaming buy-now things and sponsors... Almost any DDR4-3600 CL16-capable CPU paired with an A380/RX 6400 or 4060 will deliver a good bunch of technologies to use/work with, on a very limited power/heat budget. You may even consider a Mac Mini (typical
@@dat_21 that's not how commercial use works though, is it? You won't find beginner carpenters buying the top end tools, because they won't have enough business to pay for it. I'm pretty sure Mathieu would have already known that the 4090 is the best consumer card, and that the server cards are even better. But VRAM is massively important for Stable Diffusion so going from an old 8GB card to a 4060 might dramatically improve his workflow and yet be within his budget, which does not magically balloon just because you're making money off the results.
I've run Stable Diffusion on an 8GB 1080 and it's doable, but incredibly slow compared to people I know with more recent cards, particularly ones with more VRAM. I'm absolutely interested in hearing how it runs on a 4060 Ti 16GB, which I think is dramatically more powerful than my 1080 and might become the best bang for buck for that application. Not to mention it would probably be far better than my 1080 for all other purposes as well.
I don't think HUB is going to test Stable Diffusion or productivity (more's the pity but I get why) but hopefully some of the SD channels will at least test that aspect for us.
Any commenters who have actual answers for you would probably need to know what you're upgrading from - if it's a 30 series card with lots of VRAM it'll be wildly different to my situation for instance.
The main reason Nvidia isn't giving you an RTX 4060 Ti 16GB for review is that you'd need to buy it. They aren't selling many of those, you know.
Those will be the only sales they will have: from reviewers.
@@alrizo1115 I'd like to think that if it's truly awful (I don't see a reason to think it won't be; I just prefer to count hatched chickens) average consumers won't buy it. But... I have friends who don't read these things called 'reviews' of PC games or hardware, or won't trust sources I suggest like HUB or LTT or Gamers Nexus, and just buy all sorts of stuff and then moan about how bad it is (or never even realise). A fool and their money etc etc (also the well off, the ignorant, the lazy...).
@@jonevansauthor It's true that some people just watch reviews after purchasing the hardware and then get disappointed.
@@alrizo1115 If the 16GB were priced at the same MSRP as the 4060 Ti 8GB, SOME sales would arise. And dropping the price of the 8GB version would be "justified" by that price, so drop its price by $50. At that price it won't sell well, but it WILL sell some.
@@markhackett2302 Yes, maybe, man. A 192-bit bus, x16 PCIe lanes and 12GB of VRAM could've saved the 4060 Ti, even with fewer CUDA cores.
I use a 4060 Ti 16GB. I just got it, so I have no idea how "well" it'll perform. I am biased toward it because my prior card was a 6GB GTX 1060 SSC. I plan on upgrading in the future, but at the moment I am good with what I have.
Thank you
"4060" is 62% faster than the 3050
"4070" is 75% faster than the 3060
"4070ti" is 73% faster than the 3060ti
"4080" is 78% faster than the 3070ti (1440p because at 4K 3070ti is limited by the VRAM)
4090 is 73% faster than the 3090 (78% faster than the 3080ti)
Specs, performance, and the VRAM amount are OK, Nvidia just put the wrong cards into the wrong boxes with crazy prices on top.
From the hardware perspective it's the same story:
The 4060 in reality is a cut-down 4050 (an AD107/OEM-class die):
3050 - 107 die, 21.4% of full die cores, 200mm², 128-bit bus, 8GB GD6, PCIe4 x8
4060 - 107 die, 16.7% of full die cores, 190mm², 128-bit bus, 8GB GD6, PCIe4 x8
Laptop 4060 - 32MB L2 vs Desktop 4060 - 24MB L2
The 4060 Ti in reality is the 4050:
_3050_ - 106 die, 23.8% of full die cores, 276mm², 128-bit bus, 8GB GD6, PCIe4 x8
4060ti - 106 die, 23.6% of full die cores, 190mm², 128-bit bus, 8GB GD6, PCIe4 x8
Where is the 4060, you will ask? Inside the 4070 box. And it is 75% faster than the 3060, as it should be, because Ada is over 75% faster than Ampere:
3060 - 104 die, 33.3% of full die cores, 276mm², 192-bit bus, 12GB GD6, PCIe4 x16
4070 - 104 die, 31.9% of full die cores, 295mm², 192-bit bus, 12GB GD6X, PCIe4 x16
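The "% of full die cores" figures above can be reproduced from publicly listed CUDA core counts, taking the flagship full dies as the baseline (GA102 at 10752 cores for Ampere, AD102 at 18432 for Ada). A quick sketch:

```python
# Fraction of the flagship full die each card's CUDA core count
# represents (core counts are the publicly listed ones).
FULL_DIE = {"Ampere": 10752, "Ada": 18432}  # GA102, AD102

cards = {
    "RTX 3050":    ("Ampere", 2560),
    "RTX 3060":    ("Ampere", 3584),
    "RTX 4060":    ("Ada",    3072),
    "RTX 4060 Ti": ("Ada",    4352),
    "RTX 4070":    ("Ada",    5888),
}

for name, (arch, cores) in cards.items():
    pct = 100 * cores / FULL_DIE[arch]
    print(f"{name}: {pct:.1f}% of the full {arch} die")
# RTX 3050: 23.8%, RTX 3060: 33.3%, RTX 4060: 16.7%,
# RTX 4060 Ti: 23.6%, RTX 4070: 31.9%
```

Note the 21.4% figure quoted above for the 107-die 3050 corresponds to a lower-core variant; the 2560-core configuration works out to 23.8%.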
sad to know what could have been, but we live in a world aye
Thing is, if the 4060 were CALLED the 4050, dropping its price SO IT COULD SELL would not make customers think that a 4060 should be cheaper each generation.
NVidia's greed was that they wanted that card's price to be compared to the price of the 60 class that came before.
I ended up getting a 4060, as price-wise it made sense where I am; people are still trying to sell 3060s for more! Absolute madness.
I've been very happy with it, as much as people seem to dislike it. It should be cheaper, but it still makes sense right now price-to-performance-wise, also counting AV1/RT/DLSS.
Old AMD cards like the 6700 XT are almost nonexistent, and where they do exist they cost more... I'm just happy we're talking about $50-100 of pricing and not $500+ like a couple of years ago.
If you are happy with your purchase that's all that matters. But the card still deserves most, if not all the critique it's getting imo
@@iyaramonk Yup, the problem is a lot of people then go and require other people to tell them they made the right choice. The 4060 is not worth any more than a 3060. But if you saw a 4060 and got it, well, you got it. You didn't get a 3060, you didn't get a 6700 XT, you didn't get a 6600. If price-to-performance were the deciding factor, that is the order of increasing usefulness in buying. BUT if someone wants current-gen 1440p, then the 6600 is off the table, but so is the 4060. The frame rates for "preferred games" may be "too low" on the 6600 but "enough" on the others. It all depends on what that specific person WANTS TO PLAY. "Ah, but ray tracing!!!" is not an argument; the 6700 XT is faster at it, and much faster at rasterised workloads. And any workload you'd turn RT on for with a 4060 is also RT territory for the 7600 et al. THAT, however, isn't a "you made the WRONG choice" but a "THIS is how NVidia gets away with overpriced tat"; if "but muh RT" weren't an excuse, NVidia wouldn't be able to price their cards quite so high.
And prices in a specific region on a specific day, on a specific site someone prefers to shop at, are why someone might find that FOR THEIR SPECIFIC SEARCH the 6700 XT is more expensive than a 4060 and a 3060 is just plain unavailable, leaving a Hobson's choice of one: the 4060.
when is the 4060ti 16gb video?
yo
Would a 6650 XT beat a 4060 without DLSS?
It must take a lot of thought and prep to make an interesting video about the most boring gpus probably ever. Good video, man!
I honestly check with you guys before I purchase GPUs. It led me to the RX 6950 XT I use now.
I got the XFX 6750 XT because it was the same price as the 6700 XT and also a better model with three fans.
It's awesome, nothing else under £350 made sense.
No idea what you tested, but my 4060 Ti 16GB does produce fluid frame rates (40+) in Cyberpunk at 3440x1440 with ray tracing overdrive. So, yeah... Did you disable DLSS? Why? Combine that with the fact that 16GB is enough to run 13B LLMs locally (in contrast to a 12GB 4070, which cannot), and your conclusion is just plain wrong. But you do you.