I just used this service and it was great: sold my old 2060 after getting offered a quote, with hassle-free shipping. Got $80 for the 2060, and sure, I probably could have gotten a bit more if I sold it on my own, but I didn't want to go through the hassle of listing it, paying for shipping, etc.
I made a Daniel cursor with an index-finger pointer for my Windows 11 main PC. Now I use his index finger for everything, from browsing to turning up the volume in Winamp!
one of the best methods to tell the viewers where to look, really appreciate it, not many people do this
Let's see.. if you actually need those 16 GB, you probably also need a faster GPU. So you might as well grab the 4070 (or something faster) or, perhaps go AMD. I ... kinda don't get what they were thinking with the 4060 Ti 16GB. Ideally, the 4060 should have 8 GB, the 4060 Ti should have 12 and the 4070 (and 4070 Ti) should have 16 GB. That would make a bit more sense and would make Nvidia's mid-to-high range a bit less confusing.
The 4060 should be 12 gigs, since 8 GB was already not enough on the previous 3060... But yeah, the 4060 Ti could be 12 or 16, and the 4070s should be 16 GB. For the 4060, 8 GB is not enough.
That's exactly what they did with the 3060 12GB. Back when it came out it was a lot weaker than e.g. the 3070 Ti, so people were saying it was stupid for such a weak GPU to have so much VRAM. I feel like they want their more expensive upper-mid-range cards like the 3070 (Ti) and now 4070 (Ti) to become obsolete faster, to force consumers to upgrade earlier. With the lower end they don't care as much about VRAM, as those cards will begin to struggle earlier either way due to being weaker, and low-end consumers may be more susceptible to "more VRAM must mean more power" marketing.
Doesn't work like that. If you have more VRAM you can use better quality textures, which doesn't affect performance as long as you have the VRAM for them. That's the main difference people just don't know about, and textures transform the visuals the most obviously. And the 6800 XT completely shits on the 4060 Ti even in ray tracing, as tested by Hardware Unboxed, making the 4060 Ti obsolete for gaming.
@@yellowflash511 That's true, texture quality is very VRAM-intensive, so having a GPU with a lot more VRAM helps a lot in that regard, and it's also the thing most noticeable in games. That's why I believe on consoles, to keep a consistent experience, most titles run at medium settings with texture-related settings maxed out: the PS5 has around 16 GB, so VRAM isn't an issue, and games look great while performing consistently.
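The VRAM cost of textures discussed in this thread can be roughly quantified. A minimal sketch, assuming uncompressed RGBA8 textures (an illustrative simplification; real engines use block compression such as BC7, which cuts the footprint several-fold):

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Uncompressed RGBA8 texture footprint; a full mip chain adds ~1/3."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

# A single uncompressed 4K texture with mipmaps:
mb = texture_vram_bytes(4096, 4096) / 2**20
print(f"{mb:.0f} MiB")  # ~85 MiB -- a few hundred of these fill 8 GB fast
```

Even with compression, this is why the texture-quality slider is usually the first casualty on an 8 GB card.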
nah, it should start at 12, no 8 GB unless it's a 4050 or 4030 :P, but yeah the 4070/Ti should have 16 because it's fast enough to use it, then the 4080 with 20 and the 4090 with 24, that would've been great
@@md.fazlerabbi5265 To make a long answer short: because AD103 exists. If you go back and look at every previous generation, there are no odd-numbered dies. The 40-series is a sham.
Great work on the review. It's a little depressing how Nvidia is using VRAM to add another layer of segmentation to a market that is already sliced and diced into a dozen performance tiers. I miss the old days when every new GPU was better than the previous one. Now there's overlap between several generations with crap like this.
@@jstegall44 I used ATI/AMD from 1995 until I switched over at the 1080 generation due to the completely lacking performance on the AMD side. I hope to go back some day, but AMD keeps pulling defeat from the jaws of victory.
The 4060 Ti 16GB is great by performance but not for price... Actually, the 16GB should be the only version; 8GB shouldn't exist... The 4060 non-Ti should be 12GB, and the 4070 normal/Ti should be 16 gigs.
Some people realized this weeks ago, not just from the first actual performance leaks but from the confirmed specifications, which could be extrapolated to expected performance thanks to previous RTX cards existing in professional versions or modded with more VRAM... RTX 30 was already VRAM-bottlenecked by capacity (and not so much by PCIe 3.0 vs 4.0), and there was a push to make the RX 6000 or RTX 30 seem like better GPUs to buy over RTX 40... YouTubers spent too long helping sell old stock instead of saying what was coming would perform better... some people were accused of being fanboys... when the reality is that the lower (still too expensive) 4060 and 4070 are technically good, with the faults you point out (the 4070 chip gets 16GB when it's sold as the RTX 4090 mobile 16GB). So why has Nvidia launched GPUs limited by design, and why does nobody point out the evidence until after they're launched and tested? It was obvious... The 4060 Ti 8GB and 16GB got, and still get, so many trash reviews... when the important thing is to test the >8GB-to-12GB and >12GB-to-16GB VRAM-use scenarios... We've known for weeks that the 4060 Ti 16GB can be as fast as the 4070 12GB in some cases, like 0:01 shows...
@@Atheismo9760 But it's a good alternative if you're thinking of upgrading; it's also much more power efficient, draws way less power, and gives a slightly better experience for people who care about electricity bills.
Hope the fanboys wake up sooner. Nvidia is ripping them off big time: the 8GB GPU was planned so it's obsolete within a year or two. Your hard-earned $400 is only good for two years of use. It's like a subscription system. Nvidia is the dirtiest company. Wake up, fanboys, learn to say no.
You are the best. I love that you go in depth on the experience of actually playing the titles rather than simply giving us the results. A perk of being a teacher I guess!
The 6800 is excellent, but not totally. This card cannot be bought for $500 at a local store in 5 minutes the way the 4070 can, which means the comparison is incorrect. And the 6800 XT is a hot video card: you'll have to tinker with it to make it quieter, you can run into an unlucky cooling system, and then you can lose time and money altogether. In addition, the 4070 has FG, and the 6800 XT has +4 GB, which most likely will never be needed.
@@Aleksey-vd9oc For someone like me who needs more VRAM, the 4060 Ti with 16GB is perfect, since it also lets me use the Nvidia features that the games I play often have.
@@Aleksey-vd9oc I have a 4070 and FG had zero to do with my purchase; couldn't care less about that. It was power draw and size (it would fit into my case). I wish it had 16GB though. 12GB is not an issue, at least today.
@@sonatafx the 6800 XT is as good as, if not better than, the 4070 even though it's last gen, and the non-XT is not a bad gaming card either. The 4070 is okay; it's not worth 600 dollars though.
Yes, absolutely. The 4070 ti in particular is god awful for 800. It only looks good if you compare it to the new pricing of the 3090 and 3090 ti, but those cards can now be found for somewhat reasonable prices on the used market, and they have DOUBLE the Vram of the 4070 ti, and much higher memory bandwidth too.
Here in Oz, using the cheapest local prices I can find, it's $813AUD for the 16GB 4060 Ti, $849 for the 6800 XT and $885 for the 4070. The $36 difference between the 6800 XT and 4070 makes for arguments either way depending on use, but the 4060 Ti for $36 less than the 6800 XT is a bad joke.
I also made this exact decision, but unfortunately my 6950 XT was black-screening at 4K high fps, so I sent it back and got the 4070... definitely not as strong in brute force, but something has to be said for DLSS and almost half the power usage. After trying both I'm glad I ended up with the 4070... for now, until VRAM hurts me 😅
At least now all the games that had VRAM issues have finally been fixed by their devs, and not a single game exceeds 12GB of VRAM at 1440p. Well, just for now of course! 😂
@@TwinkleTutsies DLSS will not help you in the upcoming games like Starfield that will not have DLSS support. Adding DLSS is an extra headache for developers. The time and effort saved by not including DLSS seems to be worth it for more developers. Unless Nvidia starts paying devs to include their proprietary stuff instead of expecting devs to do extra work for free.
@@sammiller6631 if only that were the case... AMD seems to be stopping AMD-sponsored games from implementing DLSS, as has recently come out, if you're keeping up with YouTube channels like Hardware Unboxed
@@df8340 And that would have made a lot of sense to do, especially with the 4070, maybe not so much with the 4060 ti, especially because the 4060 ti was actually designed with laptops in mind, so Nvidia needed that 128-bit version regardless of what they were going to do with desktop designs.
The 4070 in terms of value isn't that bad, at least compared to the 4060 Ti 16 gb. A decent performance jump from the 3070 and is now comparable to a 3080, 50% more VRAM than last gen, a higher memory bus able to use the VRAM more efficiently (still lower than the 3070 because screw Nvidia), draws much less power etc... I wish it had 16 gb instead of 12 gb and that it was priced like last gen, but for what it is it could be worse (Like the 4060 Ti 16 gb lol)
The 4070 has higher memory bandwidth than the 3070 despite the narrower bus width, and that's without the L2 cache "effective bandwidth" shenanigans 🤷
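That bandwidth claim checks out against the published specs: peak bandwidth is just bus width times per-pin data rate. A quick sketch using the cards' spec-sheet numbers (192-bit GDDR6X at 21 Gbps for the 4070 vs 256-bit GDDR6 at 14 Gbps for the 3070):

```python
def mem_bandwidth_gbs(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width (bits) / 8 * per-pin rate (Gbps)."""
    return bus_width_bits / 8 * gbps_per_pin

print(mem_bandwidth_gbs(192, 21))  # 4070: 504.0 GB/s
print(mem_bandwidth_gbs(256, 14))  # 3070: 448.0 GB/s
```

So the narrower bus is more than compensated by the faster GDDR6X, before any L2-cache "effective bandwidth" accounting.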
I went with the 4070 for my new build. I have no regrets. Did my research before purchasing and went with the best value for the money out of current generation graphics cards. I wanted to buy new and current gen for my brand new build. Works fantastic with my 1440p monitor.
@@ObamaPhoneProMax5G same with me. I didn't have to replace my 550W PSU and small case, since I bought a dual-fan RTX 4070. If I'd bought the AMD counterpart I'd have had to upgrade my PSU and case, and the electricity cost is much higher too. And I mostly play AAA games, so I need Frame Gen and DLSS.
I just saw a big Brazilian tech YouTuber I follow mention him. Kind of a weird experience, considering that guy has 10x more subs than Daniel and more than a decade on YouTube. That's a testament to Daniel's hard work.
Love your analysis!! 👍 I may be in the minority here, but I would like to see how the 4060 Ti 16GB performs in 4K RT DLSS-Q FG across the most demanding games that support those settings. Does it manage to stay above 30fps? Seeing the 4070 run into a VRAM limitation in R&C at 4K RT gave me a chill. Would you please run a detailed test of 4K RT with Frame Gen (stress-testing VRAM capacity), compared against the 4060 Ti 16GB, to find whether any other game poses issues for 12GB cards?
Wow I learned so much watching this. Thank you for your time and effort in explaining. Your final analysis is very helpful and the viewer should still watch all the segments in this video. Thanks again. 🇨🇦
@@eric_so_good Depends what exactly you are planning to do with it. My current build is now dual gpu, 1x 4060 ti 16gb and 1x Tesla P40 24GB, for a total of 40GB of VRAM, both of those are of nearly equal performance, individually.
@@gaius100bc Thank you a lot for your fast response! I'm buying the 16GB in the next few hours, so a fast response would be soooo appreciated! I heard that the CUDA core count compared to the 3060 was reduced, as well as the memory bandwidth; how is your experience with this on DL/AI tasks? Thank you! And I also have a Tesla GPU too haha!
@@eric_so_good it's not enough VRAM for training, so I never tried training or fine-tuning with that card; instead I tend to focus on inference. For inference, the 4060 Ti 16GB is quite cut down but usable: a 7B-parameter model with llama.cpp quantized at 4-bit will get you inference speeds of ~40 tokens per second (with low context size), which is perfectly fine imho.
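The VRAM side of that comment is easy to sanity-check: a quantized model's weight footprint is roughly parameter count times bits per weight. A rough sketch, weights only (the KV cache and activations add a few more GB on top; the exact overhead depends on context size and is not shown here):

```python
def model_weight_gib(params_billion, bits_per_weight):
    """Approximate weight memory for a quantized LLM, in GiB.

    Weights only -- KV cache and activations are extra.
    """
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 2**30

# 7B model at 4-bit vs fp16 -- why 16 GB handles inference but not training:
print(f"{model_weight_gib(7, 4):.1f} GiB")   # ~3.3 GiB quantized
print(f"{model_weight_gib(7, 16):.1f} GiB")  # ~13.0 GiB at fp16
```

Training needs optimizer state and gradients on top of fp16 weights, which is why it blows well past 16 GB even for a 7B model.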
As a new RTX 4070 owner myself, it's worth mentioning that the RTX 4070's power usage in Rift Apart is substantially lower than its 200W TDP or Nvidia's rated "average gaming power consumption" of around 180-190W. So obviously the game isn't using the GPU fully, despite the fact that it shows 95% GPU usage.
The 4070 is the only card with a consistent, significant advantage among these three. Yes it costs more, but you get more in every case. Can't say that about the 4060 Ti with 16GB VRAM. I can't see buying a mid-to-high-tier 8GB card today. I'm currently building a PC with the 4070.
4070 user here, the card is amazing, go for it! I ended up being extremely pleasantly surprised by how good DLSS Frame Gen is in single-player titles (it's pretty garbage in multiplayer though, from my testing)
I agree that the 4070 is a better value than the 16GB 4060 Ti at 100 USD more. However, the 16GB 4060 Ti got to within 1 FPS, and tied in 1% lows, at 1440p Very High (with RT) vs. the 4070, and actually beat it at 4K with those settings, though neither card ran fast enough at 4K with those settings for it to matter. Given that there's even 1 game where the 16GB 4060 Ti was able to match the 4070, that means the 4070 would have FOR SURE been faster if it had more VRAM. If that's already happening only a few months after the card was first released, then it's likely to happen more and more over the next few years. Higher-capacity versions of graphics cards virtually always end up proving to be a lot better, sooner or later. Usually it takes around two years or more, as with the 8GB versions of the RX 480 and 580, as well as with the 6GB GTX 1060. If a graphics card only has barely enough VRAM when it first launches to reach its full potential, that's not good, and will likely limit what settings you can use with it significantly in the future. The 4070 won't be a terrible card, and won't become obsolete until a very long time from now, but it would for sure be a significantly better card if it had been designed to have 16GB of VRAM. The 16GB AMD cards are going to age well against the 4070, but most people will just not care enough about that VRAM difference, especially if the 7800 XT ends up being slower than the 4070 and the 6800 XT in standard raster performance, but still costs 550 (as rumored).
If the 4060 Ti 16GB were $450, it would make the 4070 look just as stupid, because that's 30% more money for the same 30% more performance but with less VRAM, and that's not how it's supposed to work in the midrange. But the 4060 Ti 16GB isn't $450, so that makes the 4070 look actually kind of good, even though it isn't.
I wouldn't go that far to say that it would make the 4070 "look stupid" at 450, but it definitely would make a lot of people ask why the hell the 4070 doesn't also have 16GB. However, I think the 6800 XT already does a better job of making the 4070 look stupid.
Kudos on the presentation of fps as well as their relative percentages; it's something I like more compared to Digital Foundry's layout, in which they only show one or the other at a time. Great analysis, Daniel
The biggest problem, aside from price, on the 16GB version is that you're not getting a larger bus or more CUDA cores; nothing else is better to justify the price bump aside from the VRAM. The 3060 8GB vs 12GB has a more noticeable difference.
The 8GB 3060 also has a narrower memory bus and less memory bandwidth than the 12GB version, which really hurts it. Also, it's generally two or more years later when the higher-capacity VRAM cards start to really show their value. The 4070 is great right now, but it's not going to age as well as cards in the same performance tier which have 16GB of VRAM. It's already coming up against VRAM limitations, as seen in Ratchet and Clank at 1440p Very High with RT, where it only manages 1 FPS better than the 16GB 4060 Ti, and the same 1% lows. That's a clear case of its VRAM holding it back. It should be a lot more powerful than the 4060 Ti in RT at 1440p if it has enough VRAM to work with. If you look back at cards like the 4GB RX 480, they were totally fine for years, but eventually the 8GB version pulled way ahead in a lot of games. The 4GB 480 was still not a bad value, but the 8GB version definitely proved to be at least as good of a value in the long run. If the 4070 is coming up against VRAM limitations in games only a few months after its release, just imagine how much worse it's going to get in two or three years. If you upgrade your card every two years, it doesn't matter as much, but it will still affect the resale value of your card, and GPU prices can fluctuate a lot, so you can't count on there being good upgrade options available every two years anymore. Keep in mind also that the first mainstream 8GB graphics card was released NINE years ago now, for 330 USD, which is equivalent to around 410 USD today if you adjust for inflation. Nvidia's first mainstream 8GB card was released 7 years ago, for 380 USD at the time. A 500 dollar PS5 has an 8-core Zen 2 CPU, a GPU, a very fast SSD, 3.5 GB of system memory, and 12.5 GB of dedicated VRAM. VRAM costs Nvidia less than 4 dollars per gigabyte. Considering these things, I think we should be expecting more than 12GB of VRAM from a 600 dollar graphics card.
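The inflation adjustment in that comment can be reproduced with a one-liner. The CPI values below are approximate assumptions (US CPI-U for 2014 and mid-2023), so the result lands slightly above the commenter's ~410 figure:

```python
def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of CPI index values."""
    return price * cpi_now / cpi_then

# $330 in 2014 dollars, using assumed CPI-U values (~236.7 then, ~305.1 now):
print(round(adjust_for_inflation(330, 236.7, 305.1)))  # ~425
```

The exact answer depends on which months' CPI values you pick, but either way a 2014-era $330 card maps to well over $400 today.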
Yea the 4050 is a really fast card for 220$. Let's hope the 4060ti 16gb has faster memory bus and more cores. Maybe even cheaper like 350$. How can nvidia even screw up their launches anyway?
It's a lower-mid-range card at best, and I'd say that it's well worth 350 USD right now, at least relative to other products in the current market. If they had called it the "4050 ti", people would have been really really impressed with its performance. Even at 400, I'm sure it would be a very popular card, though it wouldn't really interest me very much at 400.
@@Aleksey-vd9oc Yeah man you are right, I don't know what I was thinking looking back at the last decade showing us what code names (die names) they use for the 50 cards along with the RAM bus width but some random dumb ass in the youtube comments changed my mind. Weird how these things work, it is almost as if you have no fucking clue what you are talking about.
I love this comparison! I managed to find a 4070 at Best Buy open box "excellent condition" for $480! Stress tested the hell out of it and it's working great!
The 3080 uses twice as much power, so it's not worth it; also, 10GB hits the VRAM wall way more often than 12GB, which only has issues with 3-4 games out of thousands. The 4070 is the better buy, with frame gen and more VRAM, and it's also way more efficient. Most people use that GPU with a 550-600W PSU, with no need to upgrade it like for a 4080, 3080, 3090, 4090, or power-hungry AMD GPUs.
@@NostalgicMem0ries are you dumb? You can undervolt AMD GPUs with their own software, lol, nobody runs their cards stock. Only poor people complain about power consumption, and if you're buying a top-tier card, power should not be a factor 🤡
@@NostalgicMem0ries The 3080 handles the VRAM-spill better though, even with just 10GB (and it's overall a slightly more powerful GPU). Ideally you'd get the 12GB version, but it's fairly rare. Either way you're more likely to be able to find a used card too. Not that I'd recommend a GPU with less than 16GB VRAM today (except for the 4060ti and 4080; they're terrible value), which generally means going AMD. If you can find a vanilla RX 6800 you'd get great raster performance, 16GB, and not-so-high power usage (especially if you undervolt; which you should). RT performance may not be great, but then again the tech is still in its infancy. No frame gen, but I personally don't see that as a negative. DLSS? Maybe, though it's not that useful unless you run at 4k. What you'd spend on electricity over the coming years, you'd might as well save on the initial GPU purchase; if we are to abide by that logic. I'd be more concerned with CPU/platform choice for that, since you're more likely to keep those parts longer.
@Navi_xx I think, like many things, "not worth the price" is always highly subjective. That's why these videos matter. Of course, the TI is and always should be faster. How much better in the real world is the question. Many are purchasing the 4070 TI, so I'd say it's quite clear many do not think the 4070 is quite worth it. A video breaking the two down would help people on the fence.
Have you ever done a comparison of the differences between nvidia’s one click game optimization settings across different GPUs? I’d be interested in seeing how much it actually tunes to the specific cards capabilities.
Ggs. Bought the Pulse 7900 XTX for $999 two days ago and it's so awesome for 4K. It underclocks well, all temps are below 75C, and power draw is around 250 to 330W.
Power usage is part of why Nvidia has priced where they have: pay more for energy with AMD, or pay more for the card with Nvidia, which could save that extra energy cost over 1-5 years. At the moment the 4070 = £570 and the 7800 XT = £700 (including 20% tax), but the AMD card can draw an extra 100W; at £115/yr, over 5 years you'd spend an extra £570 (excluding 5% tax), if the 4070 lasts that long, for say 25-30% less performance. Tough one: Nvidia looks overpriced, but average it out and it becomes competitive overall. I still feel both are overpriced.
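The break-even arithmetic in that comment can be sketched. The tariff and hours-per-day figures below are illustrative assumptions, not the commenter's exact inputs:

```python
def annual_energy_cost_gbp(extra_watts, hours_per_day, price_per_kwh=0.30):
    """Yearly cost of an extra power draw at an assumed UK tariff (GBP/kWh)."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 100 W of extra draw at 4 hours of gaming per day:
print(f"£{annual_energy_cost_gbp(100, 4):.0f}/yr")  # ~£44/yr
```

At that assumed tariff, the £115/yr figure quoted above would require roughly 10 hours of gaming per day, so the 5-year savings depend heavily on actual usage.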
$600 for a card already hitting the VRAM wall in the same year as its launch; I wouldn't call that a reasonable beast. The only good point is indeed the efficiency. It would be an OK $350 card.
As someone who was eyeing the 3060 Ti a year or so ago while compiling parts for an upgrade, I'm happy with my 4060 Ti 8GB. I reeeeeaaally needed a GPU upgrade from my GTX 970. I look at the 4060 Ti as more of a revision of the 3060 Ti, since it was priced the same. It doesn't heat up my room, uses less wattage, has better ray tracing than the 30 series, and actually gets performance very close to a 3070. I basically only got the 4060 Ti 8GB because I really needed to replace my 970, and it will hold me over until the 50 series launches in 2025, because those cards will be absolute monsters. At that point I'll probably sell the 4060 Ti. If you don't need to upgrade now, don't. At least get something with more than 8GB VRAM, like the 4070. I can already tell VRAM spill is going to be a problem. $600 is damn steep, but the 4070 will certainly have more longevity with its 12GB of VRAM. Maybe there will be a price drop? Wouldn't hold my breath, but maybe? I'm personally interested in Nvidia's features like Nvidia Broadcast and ray tracing, and DLSS 2 if I ever get a 1440p monitor. DLSS looks a bit blurry to me at 1080p, but not terrible if I really needed the performance. DLSS 3 could be cool. If someone wasn't interested in those things, I'd probably point them over to AMD if they really want to upgrade, because they have more VRAM at a better price point, but the 50-series cards are gonna be good in 2025, so hold off if you can.
I will tell you, and you will see: the 50 series comes in 2025 with PCIe 5.0, DLSS 5 and FG, lower power consumption, and an even higher price than now. It will follow the same logic as the 40 series release. You will be disappointed. Monsters? You're delusional. This is Nvidia, and they're starting to use that new playbook; I don't think they're giving up on the idea.
Only once during the whole video did the 4070 have an obvious problem with VRAM: Ratchet & Clank running at 4K maxed out with RT. And even then the overall framerate was so low that additional VRAM would not make any significant difference; the 4070 does not have enough horsepower to run big modern games at 4K/60+fps maxed out with RT anyway. And if you reduce settings/resolution or use an upscaler, VRAM requirements drop pretty substantially. Modders are the only ones who can benefit from a 16+GB GPU of this tier, as running a game with a ton of mods increases the VRAM capacity requirements.
@stangamer1151 Yeah the 4070 runs at its absolute best at 1440p. In my opinion 1440p is the best resolution because it has the best balance of good looking visuals, good performance, and cost. I have a 27 inch 1440p monitor and it looks so amazing compared to 1080p. 4K in my opinion is too performance hungry, to get good performance you need a 4080, 4090, or a RX7900XT. 4K also doesn't make too much sense on a 27 inch screen in my opinion
@@stangamer1151 That wasn't really a problem, given that neither card could run properly at 4k with those settings anyway. However, the 16GB 4060 ti got within 1FPS, and tied with 1% lows, at 1440p very high with RT (43 vs. 44 FPS), and that is a CLEAR case of the 4070 not having enough Vram. There's no other good explanation for how the 16GB 4060 ti could get that close in that situation. It's obviously not a CPU bottleneck, and it's obviously not a memory bandwidth bottleneck. This is also a clear example of an actually usable set of settings that WOULD have been able to give around 60 FPS or more, but instead, the Vram causes the 4070 to only perform to the level of a 4060 ti, which has a much weaker GPU.
@@syncmonism I think this game is just broken now, when RT is enabled. Because even at 1080p maxxed out with RT 4070 is just 7% faster than 4060 Ti 16GB. While it should be at least 25% faster. The devs will probably fix this issue in future patches. Same thing happened with TLoU and many other games. The best rule for a gamer these days - never play AAA games at launch, unless you are fine with suboptimal experience.
Another game that exceeds 12GB of VRAM is The Outer Worlds: Spacer's Choice Edition. Between the RTX 3070 and the 4060 Ti 16GB, the performance was sometimes 2x with more VRAM.
Kinda sad to see a new $600 **70 class card in 2023 struggling with 1440P in recent titles and only including 12GB VRAM. But I will say that of all the mid range 4000 series cards, the 4070 is the least offensive. A **60ti card at $500? And only 10% faster than previous gen? Ughhh. Sadly no AMD product showing up to compete. 6800XT? almost 3 years old and only a good deal at $500 IMO at this point. I hope we see some ray of light from the 7800XT now. 7900XT is finally a good product since it dropped below $800. Please AMD. Start at the right price with this one... you still have time to make sure.
@danielowentech Hey Daniel, I wanted to say that people don't realize how difficult it is to create content and edit videos. Stitching together clips with voiceover on top of a full-time job... just want to say good luck man. Wish you the best!
I have a 4060 Ti and I completely agree with you. However, in India the price of the GPU is decent, and imo it's a 1080p card, not a 1440p one. For a first-time PC build, I can recommend it for 1080p gaming.
literally my only complaint about the 4070; if it were 16GB I would have bought it on release day... perfect for my 4K 60 fps build, yet the VRAM kills it in a few years
@@NostalgicMem0ries I'm running 4K on a 6950 XT; you'd be surprised how often I see over 15GB allocated... 16GB is just OK for now. Baldur's Gate 3 can take over 10GB 😊
AMD GPUs tend to use way more VRAM; for example, in RDR2 Nvidia GPUs use 9-10 GB at 4K, while AMD uses 12-13. I've seen that in a few other benchmarks too. And a game like Diablo 4 uses all the VRAM you have; the 4090 and 7900 XTX max out at 24GB at 4K @@marcinkarpiuk7797
I'm looking to get a new graphics card, and I can't believe that out of all the videos I've watched where people talk about the 4060 Ti's 8GB and 16GB variants so simply, not one of them mentioned the relationship between system RAM and the video card's RAM, and you've explained it to an idiot like me! Thank you!
A 4050 Ti 16GB, aka the 4060 Ti 16GB, would be a perfect all-rounder card if it had a 256-bit bus with full x16 lanes at a lower price point, around $320 with the same specs.
I'm on the verge of buying a new GPU. I'm waiting for the special offers in November, like I'm sure most people are. I was holding out since January to see what the 40 series was going to deliver, but I think after this my mind is pretty much set on the 4070. Thank you Daniel, you've really made my choice quite easy. I'll be using this GPU for the next 5 to 10 years. Do you think, coming from the 1060 3GB card that now can only run new titles due to my excessive amount of RAM, that the 4070 will last me another 5-8 years?
Depends on what titles, resolutions and settings. Honestly I doubt it will hold up 5-10 years with usage like in this video for upcoming titles; if it's struggling to hit 60fps in some games on ultra or at 1440p right now, then give it another 3 years of bad game development and you'll be running low settings to bounce around an unstable sub-60. The 10 series was such a good marketing beacon for Nvidia for long-lasting, reliable cards, and the 20 series was good too; the 30 & 40 series don't look good for upcoming titles. I think the blame is on Nvidia taking advantage of the international shortages, and on game developers making shit games. Maybe look at AMD alternatives with more VRAM, or if you can afford it, maybe a 4080 (the real 4070).
@@tomglover98 Playing on ultra is honestly a waste though. And if you turn off RT and just run 1440p high with DLSS enabled, even without frame gen on, you'll be looking at 100+ fps in a game like Cyberpunk. I just don't see the value in paying $200 more for a Ti that the same can be said of. =/
I would suggest a used 3080-12gb. I think they are all LHR cards, so they are unlikely to have been used for mining. They were one of the last 30 series cards launched. It will beat a 4070 if you don't use frame gen. It's definitely going to be a bigger card than a 4070, so if case size is an issue go with a 4070. I bought a 3080-12gb in June. I'm happy with it. Another choice could be a 6800xt. Depending on what types of games you play and if you play at 1080 or 1440 these cards might last you 5 years.
@@walter274 yes, the 3080 was on my mind also, but then I'd need to upgrade my PSU, and at 1440p with DLSS the 4070 will for sure last me 5 years at medium to high settings with more than enough FPS. My main screen is a 32" 144Hz and I can't wait to play Hogwarts Legacy without the marshmallow floors 😂
I'm torn between a 4080 and a used 3090 Ti right now; the only things stopping me from pulling the trigger are the electricity cost of up to 400 watts on the 3090 Ti and the ridiculous 1100 price tag of the 4080 (which has just under half the VRAM btw) @@walter274
It's funny, because you can probably get Chinese repair shops to solder on extra VRAM for less (not all cards will work; the most famous example is the RX 480/580 4GB to 8GB, where the memory interface wasn't cut down).
You're also playing roulette with whether the RAM will be balanced or not. I don't think I'd try this myself, but there would be nothing stopping you from just removing the added RAM.
I skipped to the DLSS Quality sections. Benchmarking the latest games in 2023 on mid-range cards without upscaling makes no sense to me. Also. When are we going to see the AMD 7700/7800 offerings? Looking forward to seeing them getting incinerated like what is happening with nvidia. If I was AMD I'd just skip the pain. Keep selling their last gen high end cards at a discount and making nvidia look bad because of it. That's working out pretty well for them.
It's not trash because it only has 12GB, but it would have been a lot better if it had had 16GB. It barely has enough Vram currently, but cards which are released with barely enough Vram always suffer a lot more than cards which are released with more than enough Vram. Just compare how well the 8GB version of the RX 580 aged vs. the 4GB version, and they were released six years ago. You can't easily upgrade your Vram, so it makes sense to pay a little more to have more than what you need in the first year. But, with AMD, you can pay LESS and still get more Vram, and about the same amount of performance. I believe that the 6800 XT will prove to be a better value than the 4070, at 500, and the 7800 XT probably will too, especially if it's also 100 less than the 4070. The 4070 is certainly by no means a "bad" card though, I just think it's a bit pricey for what it is, and that it would have been a better optimized design to put 16GB of Vram on it, though not necessarily better optimized for Nvidia's bottom line, because most people don't seem to care or worry that much about how well their card is going to perform two or three years later. But, it's when you first buy your card that performance typically matters the least, because that's when your card is going to have the easiest time running games with nice settings and with high frame-rates.
@@syncmonism If you buy the 70-series card you fall for Nvidia's trap. Just get the 7900 XT 20GB; like you said, pay a bit more but future-proof for a bit.
Great quick explanation of everything before getting started… This is how it should be done every once in a while to remind or introduce people to these stats…
The biggest red flag is that even the 12GB of the 4070 is spilling over to system RAM. Not a huge amount, but it's there, and that leads to spikes in the frame time. 14-16GB really is the minimum spec for Ultra settings plus high ray tracing going forward, which means every GPU with less than 14-16GB is a low-end graphics card and not worth spending money on. Also note that on Ratchet & Clank, the reason the 4070 is so close to the 4060 Ti's performance with identical 1% lows is entirely due to its 12GB of VRAM bottlenecking the card. If the 4070 were a 256-bit, 16GB card it would be way faster. Nvidia is selling us trash.
It's hard to say that any given amount of VRAM is a hard minimum, but I definitely think that an additional 4GB on the 4070 would have made a very big difference, and that this difference would become especially useful in an increasing number of games over the next three years. And VRAM costs Nvidia less than 4 dollars per gigabyte. Increasing the memory bus from 192 to 256 bits also wouldn't have cost all that much, but would have come with further performance improvements.
@@syncmonism The die used for the 4070 is actually a 60-class die. They repackaged their RTX 4060 12GB as a 4070 and doubled the selling price. At ~$350-380 this 4060 12GB would have been a solid budget card, but at $600 it's complete garbage. If you're spending over $600 for a graphics card you should only be looking at a 7900 XT; at $700-750 that seems to be the only "mid-range" GPU that's actually worth the money you're paying for it. It pretty much does 4K Ultra settings without any garbage FSR/DLSS, and it will get you a 60FPS target lock in most optimized UE5 games. Unless Nvidia launches a 4070-SUPER card with a 256-bit bus and 16GB of VRAM, you should not even be looking at team green in the midrange. Either spend $1600 on a 4090, buy a 7900 XT for ~$750, or buy a beater card on the used market. Everything else is trash.
I bought one of these cards. I'm a 1080p player with an expensive Eizo Coloredge screen (which I will keep for 5 years) and I value energy consumption. Guess what I bought?
Can you please recheck your Ratchet&Clank results at 1440p Ultra with RT. Techpowerup made a test and they got 4070=62.6fps, 4060Ti=43.9fps, 4060Ti 8GB=38.7fps. Great work by the way! :)
Your price comparisons only make sense when you already have a system excluding the GPU, but most people probably buy entire new systems, and if the GPU is the bottleneck the value changes drastically. Would be sweet to have some comparisons from that POV as well.
So I tried the 8GB and the 16GB model. If for some reason you use a 4K monitor and you want a 4060 Ti, do yourself a favour and look for a good 16GB deal. The extra VRAM saves you from a lot of headache. The MSRP of the 16GB model is of course a bad joke and certainly not recommended.
Why? The 6800 XT has already been compared to the 4070 and 4060 Ti, and it was an obvious big win for the 6800 XT vs the 4060 Ti, and an even matchup between the 4070 and 6800 XT, because the 6800 XT was slightly cheaper and had more VRAM but was worse at everything else: RT, upscaling tech, encoding, workstation use, consistent driver support and optimization, power efficiency, size and form factor. Pretty sure he said he'd pick the 4070, but he doesn't think there's a right or wrong choice between the 4070, 6800 XT, and 6950 XT; it completely depends on what you do and don't need from your card.
@@Angel7black If you had a 650W PSU you'd need to upgrade it with a 6800 XT, while you don't with the 4070 thanks to its insanely low power draw ;). I do respect that this is a niche situation, but it just means you can save money on a power supply.
@@khalednajjar3852 You are lying. The 6800 XT spikes as high as 580W according to TechPowerUp. No way you aren't bluescreening with a 650W PSU when it takes roughly 50W to power everything else (also accounting for whatever CPU is in use).
@@WilFitzz Really bro, are you high? The 6800 XT doesn't draw more than 300W, and even the 4090 doesn't draw more than 500W. Where did you get your information from? 😂 Even on TechPowerUp it says the same.
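For readers following this power-draw back-and-forth, here's a rough sketch of the PSU headroom math behind it. The wattage figures and the 20% safety margin are assumptions based on typical review numbers, not measurements:

```python
# Rough PSU headroom check for the 6800 XT vs 4070 debate above.
# All figures are approximate assumptions (typical board power from public reviews).

def psu_headroom(psu_watts, gpu_watts, rest_of_system_watts=150, margin=0.2):
    """Return watts to spare after the GPU, the rest of the system, and a safety margin."""
    budget = psu_watts * (1 - margin)  # keep ~20% headroom for transient spikes
    return budget - gpu_watts - rest_of_system_watts

# ~300 W typical gaming draw for a 6800 XT, ~200 W for a 4070 (assumed)
print(psu_headroom(650, 300))  # 6800 XT on a 650 W unit: 70 W to spare
print(psu_headroom(650, 200))  # 4070 on the same unit: 170 W to spare
```

By this rough math a decent 650W unit covers either card at typical gaming draw; it's millisecond transient spikes and low-quality units that cause the bluescreens people argue about.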
Thanks Daniel, as always, for such a comprehensive comparison. However, I don't understand the advancement in GPUs: are we moving on to higher-res gaming or going backwards, with $350-400 GPUs struggling at 1440p?
Is it just me, or are the cards very close at 1080p while at 1440p the 4070 suddenly dominates? What makes it so efficient at 1440p but not consistently dominant?
8GB of VRAM is not enough for today's and future gaming, as games tend to get more demanding. But I also think it's unfeasible to play on ultra, since you will not notice any differences in 99.99% of the gameplay time. Thus, it is more reasonable to lower your settings to high or very high, which will result in more stable frames. Gaming on ultra makes very little sense. But I admit that getting above 60 fps while having everything tweaked to ultra feels incredibly satisfying.
Good video. I bought the 8GB version mainly for AI work but it does seem you get parity between these two cards when settings are reasonable. Even on my 4080 which I did buy hoping to do some RT, I never turn it on because I always end up tanking my performance and even if it doesn't then I seem to end up with weird stutters no matter what settings I use.
Seriously though, I really think that 12GB will hold back the 4070 quite a lot over the next few years. It's not that the 4070 will be useless or obsolete with 12GB any time in the next four years, but it will really limit what it can do in an increasing number of games, and in some games, textures will just not always load in. It will require you to spend more time messing around with settings to find the right balance, and you're also more likely to find that some parts of some games slow down, so you'll have to change settings more often for different parts of a game. That could be immersion-breaking if it starts happening at some epic moment partway through a single-player game.
I still rock a 1080 with 8GB. I've been wanting to upgrade for a year now but still haven't. I don't know; I'm on a 1440p 144Hz screen and may change to 3440x1440 ultrawide, but none of these cards since the 20xx series really makes me want to change, as I have to spend my money wisely. What do you guys think: is it the right time to buy an AMD 6000/7000 series or a 4070, or should I wait? Usually I feel confident upgrading after 4-5 years, but times have changed. Lemme know your thoughts please.
The best cheap upgrade is the 6800xt or get the 12gb radeon card that’s even cheaper and still more than capable at 1440p. I feel ya bro I had the 1060 and now 3060ti and even I want to upgrade
@@Aaronnpool23 That's very good advice. If the rumors about the 7800 XT are true, it will be a bit slower than the 6800 XT, and cost around 550. However, the 7800 XT will probably also be a good option, even if it ends up being a little slower than the 6800 XT. I suspect that the 7800 XT would eventually catch up to the 6800 XT in performance because it has a newer architecture, and/or it might eventually end up with superior upscaling abilities, and it will be better at ray tracing than the 6800 XT on day one, though I really don't think that ray tracing performance is going to be all that useful at this performance tier, as it's going to be the first thing you want to turn off in order to improve performance in demanding games. Also, basic "Lumen" lighting, which is a type of ray tracing used in Unreal Engine 5, works just as well on RX 6000 series cards as it does on Nvidia cards, and that might be the most common and practical form of ray tracing in games over the next three years for this performance tier.
@@syncmonism Well, the 7800 XT is over 3 years newer, so it should be better, but I'd choose the 6800 XT over it. 16GB of VRAM is 16GB regardless of whatever "upscaling ability" or extra units the card comes with. Both have a 256-bit bus, so really you're only paying for more processing power.
i'm stuck with a 1440p monitor, and only willing to pay for a $400 card. i think the 4060ti is an upscaler card at that resolution. so it will be a fun 960p/720p gaming experience for me in the next 5 years
Sell your old GPU to fund your upgrade at Jawa! bit.ly/JawaOwenGPUAug23
no 🙂
Tekken 8 CNT has forced upscaling. Also you should review that email from me. You need to test VSM, Nanite and Lumen.
You should compare DLSS 3 frame generation to SLI in Mount & Blade II: Bannerlord.
I love how Daniel used his actual index finger as a pointer on screen by moving his camera around
Me too. That's why I prefer to follow this guy's Channel
🤓☝️
As long as we don't get the middle finger, it's alright :)
@@yellowflash511 That's true, texture quality is very VRAM-intensive, so having a GPU with a lot more VRAM helps a lot in that regard, and it's also the thing most noticeable in games. That's why I believe that on consoles, to keep a consistent experience, most titles run on medium while having texture-related settings maxed out: since the PS5 has like 16GB, VRAM isn't an issue, so they look great while performing consistently.
Nah, it should start at 12; no 8GB unless it's a 4050 or 4030 :P. But yeah, the 4070/Ti should have 16 because it's fast enough to use it, then the 4080 with 20 and the 4090 with 24. Would've been great.
An RTX 4070 12GB would have been an awesome RTX 3060 12GB replacement as a 60-class card.
it is a 60 class die lol
@@GewelReal I was so confused for a sec. I thought you were telling him to die.
@@GewelReal How is AD104 60-class??
@@md.fazlerabbi5265 To make a long answer short: Because AD103 exists.
If you go back and look at every previous generation, there are no odd-numbered dies.
40-series is a sham.
@@PQED GA103 exists too.
the whole generation exists to upsell you to the 4090💀
Great work on the review. It's a little depressing how Nvidia is using VRAM to add another layer of segmentation to a market that is already sliced and diced into a dozen performance tiers. I miss the old days when every new GPU was better than the previous one. Now there's overlap between several generations with crap like this.
Yup nvidia kinda blows. I have gone AMD and have no regerts.
@@jstegall44 I used ATI/AMD from 1995 until I switched over at the 1080 generation due to the completely lacking performance on the AMD side. I hope to go back some day, but AMD keeps pulling defeat from the jaws of victory.
@@jstegall44 regerts.. lol
it's actually a smart marketing tactic as much as people hate it :)
Nvidia: hold my 4070 Super and 4080 Super!
The 4060 Ti 16GB is great on performance but not on price...
Actually, the 16GB should be the only version, 8GB shouldn't exist...
4060 non-ti should be 12GB
and the 4070 normal/Ti should be 16 gigs
Some people realized that weeks ago, not only from the first actual performance leaks but from the confirmed specifications, from which expected performance could be extrapolated thanks to previous RTX cards existing in professional versions or modded with more VRAM. RTX 30 was already VRAM-bottlenecked by size (and not so much by PCIe 3.0 vs 4.0), and there was attention on making the RX 6000 or RTX 30 seem like better GPUs to buy over RTX 40. YouTubers were too slow and too focused on helping sell old stock, not saying what was coming would perform better; some were accused of being fanboys. The reality is that the lower (still too expensive) 4060 and 4070 are technically good, with the faults you point out (the same 4070 chip gets 16GB of VRAM stock when it's sold as the RTX 4090 mobile 16GB). With that solved, why has Nvidia launched GPUs limited by design? And why does nobody point out the evidence until after they're launched and tested? It was obvious. The 4060 Ti 8GB-to-16GB comparison got and gets so many trash reviews, when the important thing is to test the >8GB-to-12GB and >12GB-to-16GB VRAM use scenarios. It's been weeks since we've known the 4060 Ti 16GB can be as fast as the 4070 12GB in some cases, like 0:01 shows...
Actually the performance is very poor, given that it's just slightly better than the 3060 ti.
@@Atheismo9760 But it's a good alternative if you're thinking of upgrading; it's also much more power efficient, draws way less power, and gives a slightly better experience for people who care about electricity bills.
@@Abi_Official The only reason the 4060 Ti is slightly better than the 3060 is DLSS 3.
Hope fanboys wake up sooner. Nvidia is ripping them off big time; the 8GB GPU was planned so it's obsolete within a year or 2. Your hard-earned $400 is only good for 2 years of use; it's like a subscription system. Nvidia is the dirtiest company. Wake up, fanboys, and learn to say no.
You are the best. I love that you go in depth on the experience of actually playing the titles rather than simply giving us the results. A perk of being a teacher I guess!
He should consider making a cursor theme pack with him as the default pointer. The 6800xt was/is a totally excellent card, the 4070 is also okay.
The 6800 XT is excellent, but not totally.
This card cannot be bought for $500 at a local store in 5 minutes, as can be done with the 4070, which means the comparison is incorrect.
And the 6800 XT is a hot video card; you will have to tinker with it to make it quieter, and you can run into an unsuccessful cooling system and then lose time and money altogether.
In addition, the 4070 has FG, and the 6800 XT has +4GB, which most likely will never be needed.
@@Aleksey-vd9oc For someone like me who needs more VRAM, the 4060 Ti with 16GB is perfect, since it also lets me use the Nvidia features that the games I play often have.
@@Aleksey-vd9oc I have a 4070 and FG had zero to do with my purchase. Couldn't care less about that. It was power draw and size (it would fit into my case). I wish it had 16GB though. 12GB is not an issue, though, at least today.
@Aleksey-vd9oc Unfortunately, where I live in Europe, even now 5 months later the 4070 is still sitting at 640 while the 6800 is 440 and the 6800 XT is 519.
@@sonatafx The 6800 XT is as good as, if not better than, the 4070 even though it is last gen, and the non-XT is not a bad gaming card either. The 4070 is okay; it's not $600-okay though.
Imagine paying $600 in 2023 for a GPU and you cant play some games at max settings 60fps in 4k or 1440p. ;_;
My old RTX2080ti, from 2018 for $1300, does 30 FPS in the same game. 🤡
@@funkonedemy god, a PC with a GT 630 gets less than 20 fps at 1080p in Fortnite 😮
Conclusion: all these cards are overpriced for the performance they offer, and the 4070 (and for sure the 4070 Ti) should have 16GB.
Yes, absolutely.
The 4070 Ti in particular is god-awful for $800. It only looks good if you compare it to the new pricing of the 3090 and 3090 Ti, but those cards can now be found at somewhat reasonable prices on the used market, and they have DOUBLE the VRAM of the 4070 Ti, and much higher memory bandwidth too.
Me when I pay 100$ more for a GPU with more VRAM but still can't target 60fps constantly: 💀
For those buying that card, it shouldn't be too difficult getting the right settings with pleasing graphics.
And you stop listening to assholes, and just turn on DLSS ;)
Here in Oz, using the cheapest local prices I can find, it's $813AUD for the 16GB 4060 Ti, $849 for the 6800 XT and $885 for the 4070. The $36 difference between the 6800 XT and 4070 makes for arguments either way depending on use, but the 4060 Ti for $36 less than the 6800 XT is a bad joke.
That is the pricing here in the states as well, the 6800XT is the better deal.
$885 for the 4070? I live in Australia and I have yet to see it at that price!
@@Valkian24 Umart/MSY for the PNY 2 fan version.
Honestly probably would have bought a 4070 if it had 16GB, went with 6950XT instead.
You're becoming a staple alongside GN and HUB for me, nice work!
I also made this exact decision, but unfortunately my 6950 XT was black-screening at 4K high fps, so I sent it back and got the 4070. Definitely not as strong in brute force, but something has to be said for DLSS and almost half the power usage. After trying both I'm glad I ended up with the 4070... for now, until VRAM hurts me 😅
At least now all the games that had VRAM issues have finally been fixed by their devs, and not a single game exceeds 12GB of VRAM at 1440p. Well, just for now of course! 😂
@@TwinkleTutsies DLSS will not help you in the upcoming games like Starfield that will not have DLSS support. Adding DLSS is an extra headache for developers. The time and effort saved by not including DLSS seems to be worth it for more developers. Unless Nvidia starts paying devs to include their proprietary stuff instead of expecting devs to do extra work for free.
@@sammiller6631 If only that were the case.. AMD seems to be stopping AMD-sponsored games from implementing DLSS, as has recently come out if you're keeping up with YouTube channels like Hardware Unboxed.
@@sammiller6631 And developers would rather shift resources to tech that can be used across a wide range of GPUs and benefits most of us.
It's weird how they put 16GB on a card that doesn't have the power for the higher settings that would need those gigabytes. The 4060 should have been 12GB, the 4070 16GB.
You'd have to increase the bus to do either; at that point it's just renaming the 70 to the 60, etc.
The 70 class is basically a 60 class die so they definitely did a renaming already.
@@toxicavenger6172 AD104 is far from a 60-class chip. The 4060 Ti is more like it.
@@df8340 And that would have made a lot of sense to do, especially with the 4070, maybe not so much with the 4060 ti, especially because the 4060 ti was actually designed with laptops in mind, so Nvidia needed that 128-bit version regardless of what they were going to do with desktop designs.
The 4070 in terms of value isn't that bad, at least compared to the 4060 Ti 16 gb. A decent performance jump from the 3070 and is now comparable to a 3080, 50% more VRAM than last gen, a higher memory bus able to use the VRAM more efficiently (still lower than the 3070 because screw Nvidia), draws much less power etc... I wish it had 16 gb instead of 12 gb and that it was priced like last gen, but for what it is it could be worse (Like the 4060 Ti 16 gb lol)
+ you get free diablo iv if you buy it on launch, so basically it was $530 lol
The 4070 has higher memory bandwidth than the 3070 despite having a narrower bus width, and that too without the L2 cache "effective" bandwidth shenanigans 🤷
I went with the 4070 for my new build. I have no regrets. Did my research before purchasing and went with the best value for the money out of current generation graphics cards. I wanted to buy new and current gen for my brand new build. Works fantastic with my 1440p monitor.
@@ObamaPhoneProMax5G Same with me. I didn't have to replace my 550W PSU and small case, since I bought a dual-fan RTX 4070. If I'd bought the AMD counterparts I'd have had to upgrade my PSU and case, and the electricity cost is much higher too.
And I mostly play AAA games, so I need those frame gens and DLSS.
@@raihanirsyadi7516 Which dual-fan model did you get? I'm looking at the dual fans as well. How's the cooling performance?
Full GARBAGE vs Overpriced GARBAGE vs Just GARBAGE.
😂🤣 lmao
Daniel, you have carved yourself out a nice spot as a tech YouTuber with your work and perspective, well done.
I just saw a big Brazilian tech youtuber I follow mention him. Kind of a weird experience considering this guy has 10x more subs than Daniel and it's more than a decade on youtube. That's a testament to Daniel's hard work.
Love your analysis!! 👍
I may be a minority here but I would like to see how the 4060 Ti 16GB performs in 4K RT DLSS-Q FG across the most demanding games that support such settings. Does it manage to stay above 30fps?
The 4070 running into VRAM limitations in Ratchet & Clank at 4K with RT gave me a chill. Would you please run a detailed test of 4K RT with Frame Gen only (stress-testing VRAM capacity), compared to the 4060 Ti 16GB, to find out whether any other game poses issues for 12GB cards?
Wow I learned so much watching this. Thank you for your time and effort in explaining. Your final analysis is very helpful and the viewer should still watch all the segments in this video. Thanks again. 🇨🇦
With the 3060 Ti completely gone from the market and the 6700 XT going out of stock now,
the 4060 Ti is the last option in the $350-400 range for a new GPU 😮
tragic
AMD is supposed to launch their lower tier GPU in a month or so.
@@AaronShenghao they already did- the 7600
@@kennethpereyda5707 He's talking about the 7700 XT and 7800 XT, which were reported as releasing in a month.
@@pkpnyt4711 Um, the 7800 XT isn't a low-end GPU; it should compete against Nvidia's third-tier card, the 4070 Ti.
Going from a 1660 Ti to a 4060 Ti. Gonna gift my lil bro my old gaming rig this Christmas. I can't wait to see his reaction!
The 16GB 4060 Ti is the best of the 3 for deep learning/AI tasks, and it's absolutely worth the few extra dollars over the 8GB version.
I'm here for this. Was really debating 3060 (12gb), 4060 (16gb) :(
@@eric_so_good Depends what exactly you are planning to do with it.
My current build is now dual-GPU, 1x 4060 Ti 16GB and 1x Tesla P40 24GB, for a total of 40GB of VRAM; both of those are nearly equal in performance individually.
@@gaius100bc Thank you a lot for your fast response! I'm buying the 16GB in the next few hours, so a fast response would be soooo appreciated! I heard that the CUDA core count compared to the 3060 was reduced, as well as the memory bandwidth; how is your experience regarding this on DL/AI tasks? Thank you! And I also have a Tesla GPU too haha!
@@gaius100bc * my tesla gpu is the M40 with 24 GB VRAM
@@eric_so_good It's not enough VRAM for training, so I never tried training or fine-tuning with that card; instead I tend to focus on inference.
As for inference, the 4060 Ti 16GB is quite cut down but usable; a 7B-parameter model with llama.cpp quantized at 4-bit will get you an inference speed of ~40 tokens per second (with low context size), which is perfectly fine imho.
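To put a rough number on why 16GB is comfortable for this kind of workload, here's a back-of-envelope VRAM estimate for quantized models. The fixed overhead figure is an assumption; real usage varies with context size, quant format, and backend:

```python
# Back-of-envelope VRAM estimate for the quantized llama.cpp setup described above.
# The 1.5 GB overhead (KV cache, activation buffers) is an assumed round figure.

def model_vram_gb(params_billion, bits_per_weight, overhead_gb=1.5):
    """Approximate VRAM in GB: weights at the given quantization plus fixed overhead."""
    weight_gb = params_billion * (bits_per_weight / 8)  # e.g. 7B at 4-bit -> 3.5 GB
    return weight_gb + overhead_gb

print(round(model_vram_gb(7, 4), 1))   # ~5.0 GB -> fits easily in 16 GB
print(round(model_vram_gb(33, 4), 1))  # ~18.0 GB -> too big for a 16 GB card alone
```

This is why the 16GB card handles 7B and 13B-class models comfortably but larger models need the second GPU or partial CPU offload.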
You could tweak the graphics settings in Ratchet and Clank: A Rift Apart, and get 4K 60fps on a 4060 TI 16GB.
What’s your setup look like? Also have u tried this on any heavy games like warzone?
As a new RTX 4070 owner myself, its worth mentioning that the RTX 4070's power usage in Rift Apart is substantially lower than its 200W TDP or Nvidia's rated "average gaming power consumption" of around 180-190W. So obviously the game isn't using the GPU fully, despite the fact that it shows 95% GPU usage.
copium
@@Magusslettewestberg Corporate bootlicker lmao
The 4070 is the only card with a consistent, significant advantage among these three. Yes, it costs more, but you get more in every case. Can't say that about the 4060 Ti with 16GB of VRAM. I can't see buying a mid-to-high-tier 8GB card today. I am currently building a PC with the 4070.
4070 user here, the card is amazing, go for it! I ended up being extremely pleasantly surprised by how good DLSS Frame Gen is in single-player titles (it is pretty garbage in multiplayer though, from my testing).
I agree that the 4070 is a better value than the 16GB 4060 ti at 100 USD more. However, the 16GB 4060 ti got to within 1 FPS, and tied in 1% lows, at 1440p Very High (with RT), vs. the 4070, and actually beat it at 4k with those settings though neither card ran fast enough at 4k with those settings for it to matter.
Given that there's even 1 game where the 16GB 4060 ti was able to match the 4070, that means that the 4070 would have FOR SURE been faster if it had more Vram. If that's already happening only a few months after the card was first released, then it's likely to happen more and more over the next few years.
Higher-capacity versions of graphics cards virtually always end up proving to be a lot better, sooner or later. Usually it takes around two years or more, as with the 8GB versions of the RX 480 and 580, and with the 6GB GTX 1060. If a graphics card has only barely enough VRAM at launch to reach its full potential, that's not good, and will likely limit what settings you can use with it significantly in the future. The 4070 won't be a terrible card, and won't become obsolete until a very long time from now, but it would for sure be a significantly better card if it had been designed with 16GB of VRAM. The 16GB AMD cards are going to age well against the 4070, but most people just won't care enough about that VRAM difference, especially if the 7800 XT ends up being slower than the 4070 and the 6800 XT in standard raster performance but still costs $550 (as rumored).
If the 4060 Ti 16GB was $450, it would make the 4070 look just as stupid, because that's 30% more money for the same 30% more performance but with less VRAM, and that's not how it's supposed to work in the midrange.
But the 4060 Ti 16GB isn't $450, so that makes the 4070 look actually kind of good, even though it isn't.
I wouldn't go that far to say that it would make the 4070 "look stupid" at 450, but it definitely would make a lot of people ask why the hell the 4070 doesn't also have 16GB. However, I think the 6800 XT already does a better job of making the 4070 look stupid.
Kudos on the presentation of fps as well as their relative percentages its something that I like more in comparison to digital foundry's layout in which they only show one or the other at a time. Great analysis Daniel
Biggest problem aside from price on the 16GB version is that you're not getting a larger bus, not getting more CUDA cores; nothing else is better to justify the price bump aside from the VRAM.
3060 8gb vs 12gb has a more noticeable difference
The 8GB 3060 also has a narrower memory bus and less memory bandwidth than the 12GB version, which really hurts it. Also, it's generally two or more years later when the higher-capacity VRAM cards start to really show their value.
The 4070 is great right now, but it's not going to age as well as cards in the same performance tier which have 16GB of VRAM. It's already coming up against VRAM limitations, as seen in Ratchet and Clank at 1440p Very High with RT, where it only manages 1 FPS better than the 16GB 4060 Ti, with the same 1% lows. That's a clear case of its VRAM holding it back. It should be a lot more powerful than the 4060 Ti in RT at 1440p if it has enough VRAM to work with.
If you look back to cards like the 4GB RX 480, they were totally fine for years, but eventually the 8GB version pulled way ahead in a lot of games. The 4GB 480 was still not a bad value, but the 8GB version definitely proved to be at least as good a value in the long run. If the 4070 is coming up against VRAM limitations in games only a few months after release, just imagine how much worse it's going to get two or three years from now.
If you upgrade your card every two years, it doesn't matter as much, but it will still affect the resale value of your card, and GPU prices can fluctuate a lot, so you can't count on there being good upgrade options available every two years anymore.
Keep in mind also that the first mainstream 8GB graphics card was released NINE years ago now, for $330, which is equivalent to around $410 today if you adjust for inflation. Nvidia's first mainstream 8GB card was released 7 years ago, for $380 at the time. A $500 PS5 has an 8-core Zen 2 CPU, a GPU, a very fast SSD, 3.5GB of system memory, and 12.5GB of dedicated VRAM. VRAM costs Nvidia less than 4 dollars per gigabyte. Considering these things, I think we should be expecting more than 12GB of VRAM from a $600 graphics card.
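The two figures quoted above can be sanity-checked with trivial arithmetic. The cumulative inflation factor and the per-gigabyte memory price are rough assumptions echoing the comment, not official numbers:

```python
# Sanity check of the comment's two figures: the inflation-adjusted launch price
# of the first mainstream 8 GB card, and the cost of 4 extra GB of VRAM.
# The ~24% cumulative inflation and $4/GB price are assumed round figures.

launch_price_2014 = 330
cumulative_inflation = 1.24  # assumed rough 2014 -> 2023 factor
print(round(launch_price_2014 * cumulative_inflation))  # ~409, i.e. about $410 today

vram_cost_per_gb = 4  # upper bound quoted in the comment
extra_gb = 4          # a hypothetical 12 GB -> 16 GB bump on the 4070
print(vram_cost_per_gb * extra_gb)  # under $16 in added memory cost
```

So on the comment's own assumptions, the extra VRAM people are asking for would cost Nvidia less than the price of a game key.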
the 4050 16GB has a good showing here in the games that use more than 8GB
Yea the 4050 is a really fast card for 220$. Let's hope the 4060ti 16gb has faster memory bus and more cores. Maybe even cheaper like 350$. How can nvidia even screw up their launches anyway?
It's a lower-mid-range card at best, and I'd say that it's well worth 350 USD right now, at least relative to other products in the current market. If they had called it the "4050 ti", people would have been really really impressed with its performance. Even at 400, I'm sure it would be a very popular card, though it wouldn't really interest me very much at 400.
When you open your own card production, then you can come up with the names and set the prices, smart guy.
@@Aleksey-vd9oc Yeah man you are right, I don't know what I was thinking looking back at the last decade showing us what code names (die names) they use for the 50 cards along with the RAM bus width but some random dumb ass in the youtube comments changed my mind. Weird how these things work, it is almost as if you have no fucking clue what you are talking about.
I love this comparison! I managed to find a 4070 at Best Buy open box "excellent condition" for $480! Stress tested the hell out of it and it's working great!
I’d love to see something like a 3080 10GB in the comparison too. Given a used one is also about 400 USD/EUR
The 3080 uses twice as much power, so it's not worth it; also, 10GB hits the VRAM wall way more often than 12GB, which only has issues with 3-4 games out of thousands. The 4070 is the better buy with frame gen and more VRAM, and it's way more efficient. Most people use that GPU with a 550-600W PSU, with no need to upgrade like for the 4080, 3080, 3090, 4090 or AMD's power-hungry GPUs.
@@NostalgicMem0ries Are you dumb? You can undervolt AMD GPUs with their own software, lol; nobody runs their cards stock. Only poor people complain about power consumption, and if you're buying a top-tier card, power should not be a factor 🤡
@@NostalgicMem0ries The 3080 handles the VRAM-spill better though, even with just 10GB (and it's overall a slightly more powerful GPU).
Ideally you'd get the 12GB version, but it's fairly rare. Either way you're more likely to be able to find a used card too.
Not that I'd recommend a GPU with less than 16GB VRAM today (except for the 4060ti and 4080; they're terrible value), which generally means going AMD.
If you can find a vanilla RX 6800 you'd get great raster performance, 16GB, and not-so-high power usage (especially if you undervolt; which you should).
RT performance may not be great, but then again the tech is still in its infancy. No frame gen, but I personally don't see that as a negative.
DLSS? Maybe, though it's not that useful unless you run at 4k.
What you'd spend on electricity over the coming years, you might as well save on the initial GPU purchase, if we are to abide by that logic.
I'd be more concerned with CPU/platform choice for that, since you're more likely to keep those parts longer.
@@NostalgicMem0ries +1, I agree.
No point, AMD 6800XT. The end. £499.99, 16 gig of VRAM. Slightly slower than RTX 4070. But cheaper.
Can still buy brand new.
AyyyyMD 300W pocket heater
@@GewelReal Yes, also added bonus of winter warmth.
@@GewelReal my 6950xt dont take 300w...
@@marcinkarpiuk7797 liar
Indeed. Just take a 6800XT or 6950XT. Ada isn't worth considering at these prices.
I finished my all white 4090/7800x3d/64 gig build and I can't wait to play my games.
:O
Hope you enjoy it
@@khalednajjar3852 oh yes I will.
Im jealous. Have fun!
I'd like to see a 4070 vs 4070 TI comparison now. I'm curious if the value is there for the 4070 TI comparably.
@Navi_xx I think, like many things, "not worth the price" is always highly subjective. That's why these videos matter. Of course, the TI is and always should be faster. How much better in the real world is the question.
Many are purchasing the 4070 TI, so I'd say it's quite clear many do not think the 4070 is quite worth it. A video breaking the two down would help people on the fence.
So, on lower end cards where frame generation could potentially be useful, it's crippled in actual use because of the hardware. Ironic.
Have you ever done a comparison of the differences between nvidia’s one click game optimization settings across different GPUs? I’d be interested in seeing how much it actually tunes to the specific cards capabilities.
Just ordered the XFX Merc 310 7900XTX (1st AMD gpu for 18 years) as I’ve given up on Nvidia. I hope it works perfectly. Cheers Ps it was £938
Use DDU before and you'll be fine
@@yellowflash511 yes and thanks I was going to. Cheers
Ggs. Bought the Pulse 7900xtx for 999$ two days ago and it's so awesome for 4K. Underclocks well and all temps are below 75C, power draw around 250 to 330W
Hope it goes well for you. I'm thinking of getting the 7900xt for the same reason. Got no enthusiasm to get a gimped Nvidia card at an absurd price.
Same, Powercolor Red Devil 7900xtx, dont believe the hype about RT and drivers, card is amazing!
Power usage is the reason Nvidia has priced where they have: pay more for energy with AMD, or pay more for the card with Nvidia, which could save you the difference in energy over 1-5 yrs. Atm the 4070 = £570 and the 7800XT = £700 (including 20% tax), but the AMD card can use an extra 100W at £115/yr, so over 5 yrs you save £570 (excluding 5% tax) if the 4070 lasts that long, for say 25-30% less performance. Tough one, as Nvidia looks overpriced, but average it out and it becomes competitive overall, which is why they are selling; I still feel both are overpriced.
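That break-even arithmetic is easy to rerun with your own numbers. Here's a minimal Python sketch; the 100W delta, 10.5 hours/day of load, and £0.30/kWh tariff are assumptions chosen to roughly reproduce the £115/yr figure above, not measured values.

```python
# Electricity cost of an extra 100W of GPU draw -- a sketch of the
# break-even math above. The usage pattern and tariff are ASSUMED;
# substitute your own.

EXTRA_WATTS = 100        # extra draw of the hungrier card
HOURS_PER_DAY = 10.5     # assumed very heavy usage
GBP_PER_KWH = 0.30       # assumed UK tariff

def yearly_cost(extra_watts=EXTRA_WATTS, hours=HOURS_PER_DAY, rate=GBP_PER_KWH):
    """Yearly running-cost premium in GBP for the extra wattage."""
    kwh_per_year = extra_watts / 1000 * hours * 365
    return kwh_per_year * rate

print(f"~£{yearly_cost():.0f}/yr, ~£{5 * yearly_cost():.0f} over 5 years")
```

At lighter usage (say 2-3 hours a day) the same formula gives roughly £20-35/yr, and the running-cost argument mostly evaporates.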
I swapped from a 4060Ti to a 4070.
The 4070 is a beast. Reasonable size (dual fan version) and low 200W power consumption.
600$ for a card that's already hitting the VRAM wall the same year as its launch? I wouldn't call that a reasonable beast. The only good point is indeed the efficiency. It would be an OK 350$ card
As someone who was eyeing the 3060 Ti a year or so ago while compiling parts for an upgrade, I'm happy with my 4060 Ti 8GB. I reeeeeaaally needed a GPU upgrade from my GTX 970. I look at the 4060 Ti as more of a revision to the 3060 Ti since it was priced the same. Doesn't heat up my room, uses less wattage, has better ray tracing over the 30 series, and actually gets performance very close to a 3070.
I basically only got the 4060 Ti 8GB because I really needed to replace my 970 and it will hold me over until the 50 series launches in 2025 because those cards will be absolute monsters. At that point I'll probably sell the 4060 Ti. If you don't need to upgrade now, don't. At least get something with more than 8GB VRAM like the 4070. I can already tell VRAM spill is going to be a problem. $600 is damn steep but 4070 will certainly have more longevity with its 12GB of VRAM. Maybe there will be a price drop? Wouldn't hold my breath but maybe?
But I'm personally interested in Nvidia's features like Nvidia Broadcast and ray tracing, and DLSS 2 if I ever get a 1440p monitor. DLSS looks a bit blurry to me at 1080p, but not terrible if I really needed the performance. DLSS 3 could be cool. If someone wasn't interested in those things, I'd probably point them over to AMD if they really want to upgrade, because AMD has more VRAM at a better price point, but the 50XX cards are gonna be good in 2025, so hold off if you can.
"...but the 50XX cards gonna be good in 2025..." How do you know? Magical wishful thinking?
@@sammiller6631 Nvidia currently has the beefiest GPU on the market. The 40 series cards are a stop gap while Nvidia cooks up the 50 series.
@@OniCr0w aren't people tired of buying GPUs every year?
I will tell you, and you will see: the 50 series comes in 2025 with PCIe 5.0, DLSS 5 and frame gen, low power consumption, and an even higher price than now. It will launch with the same logic as the 40 series. You will be disappointed. Monsters? You're delusional. This is Nvidia, and they're just starting to use a new tech; I don't think they're giving up that idea.
Got a 4070 for $470 new in box, too good of a deal not to take.
wow how
Got a coupon due to a previous issue; it was set to expire, so I pulled the trigger instead of letting it go to waste. @@lukilsn
That's a much better value at that price. I'm guessing it was an open-box item, or maybe he's accounting for a Steam gift card that came with it.
4060 Ti 16gb should not cost more than $400. 8gb at most $250-300 AT MOST, it's a joke at $400.
WTF... manufacturers are giving us graphics cards that can't handle graphics well
The 4000 series is a failed gen aside from the 4090... and even that is failing with its shitty burning power connector.
As someone who works with AI, I had to get the 4060 Ti 16GB for the VRAM; sadly they don't have many GPUs with 16GB.
I wonder how many people actually use RT. I try hard but never actually see any difference lol
The 4070 could be around 47% faster if VRAM weren't an issue, i.e. if it had 16 gigs instead of 12 gigs of VRAM
Only once during the whole video did the 4070 have an obvious problem with VRAM - it was Ratchet & Clank running at 4K maxxed out with RT. And even then the overall framerate was so low that additional VRAM would not make any significant difference. The 4070 does not have enough horsepower to run big modern games at 4K/60+fps maxxed out with RT anyway. And if you reduce settings/resolution or use an upscaler, VRAM requirements drop pretty substantially.
Modders are the only ones, who can benefit from 16+GB GPU of this tier, as running a game with a ton of mods increases the requirements for VRAM capacity.
@stangamer1151 Yeah the 4070 runs at its absolute best at 1440p. In my opinion 1440p is the best resolution because it has the best balance of good looking visuals, good performance, and cost. I have a 27 inch 1440p monitor and it looks so amazing compared to 1080p. 4K in my opinion is too performance hungry, to get good performance you need a 4080, 4090, or a RX7900XT. 4K also doesn't make too much sense on a 27 inch screen in my opinion
@@stangamer1151 That wasn't really a problem, given that neither card could run properly at 4k with those settings anyway. However, the 16GB 4060 ti got within 1FPS, and tied with 1% lows, at 1440p very high with RT (43 vs. 44 FPS), and that is a CLEAR case of the 4070 not having enough Vram. There's no other good explanation for how the 16GB 4060 ti could get that close in that situation. It's obviously not a CPU bottleneck, and it's obviously not a memory bandwidth bottleneck.
This is also a clear example of an actually usable set of settings that WOULD have been able to give around 60 FPS or more, but instead, the Vram causes the 4070 to only perform to the level of a 4060 ti, which has a much weaker GPU.
@@syncmonism I think this game is just broken now, when RT is enabled. Because even at 1080p maxxed out with RT 4070 is just 7% faster than 4060 Ti 16GB. While it should be at least 25% faster.
The devs will probably fix this issue in future patches. Same thing happened with TLoU and many other games. The best rule for a gamer these days - never play AAA games at launch, unless you are fine with suboptimal experience.
Another game that exceeds 12GB of VRAM is The Outer Worlds: Spacer's Choice Edition. Between the RTX 3070 and the 4060 Ti 16GB, sometimes the performance was 2x with more VRAM.
Kinda sad to see a new $600 **70 class card in 2023 struggling with 1440P in recent titles and only including 12GB VRAM. But I will say that of all the mid range 4000 series cards, the 4070 is the least offensive. A **60ti card at $500? And only 10% faster than previous gen? Ughhh. Sadly no AMD product showing up to compete. 6800XT? almost 3 years old and only a good deal at $500 IMO at this point. I hope we see some ray of light from the 7800XT now. 7900XT is finally a good product since it dropped below $800. Please AMD. Start at the right price with this one... you still have time to make sure.
in my country 4060 ti 16 gb is 430 usd and 7600 xt is 420 usd. 4060 ti 16 is the better pick any day
@danielowentech Hey Daniel, I wanted to say that people don't realize how difficult it is to create content and edit videos: stitching together clips and recording voiceover while working a full-time job. Just want to say good luck, man. Wish you the best!
The 4060 Ti 8gb is so bad, that it makes the 16Gb version and the 4070 look like a good deal
4060 ti 16gb is a joke when it comes to value
RTX 4070 is not a bad deal at all. It's basically RTX 3080 with 12 GB of VRAM and DLSS3
@@Skulka BRO WATCH THE REVIEWS FIRST AND COMPARE IT TO THE 3070 WHICH CAME OUT 2.5 YEARS AGO
@@Skulka it's pathetic, go watch reviews
@@yellowflash511I don't need to watch reviews , I own it
I have a 4060 Ti and I completely agree with you. However, in India the price of the GPU is decent, and imo it is a 1080p card, not a 1440p one. For a first-time PC build, I can recommend this for 1080p gaming.
DLSS 3 is pointless at this rate of fps. Input lag becomes unbearable, and the gameplay feels like DLSS 3 was never used.
Always happy to see a new GPU comparison in the morning. Cheers!
Doesn't matter, did you see the HUB 6800XT vs 4060ti video? 😂
Didn't Daniel also do one of those?
The 4070 should have had 16GB too, that's the reality.
I dont expect NVIDIA to be more generous with new Gddr7 for 5000 series..
literally my only complaint about the 4070; if it was 16GB I would have bought it on release day... perfect for my 4K 60 fps build, yet the VRAM kills it in a few years
@@NostalgicMem0ries I'm running 4K on a 6950XT, you'd be surprised how often I see over 15GB allocated... 16GB is just OK for now
Baldur's Gate 3 can take over 10GB 😊
@@marcinkarpiuk7797 AMD GPUs tend to use way more VRAM; for example, Nvidia GPUs in RDR2 use 9-10GB at 4K, while AMD uses 12-13GB. Seen that in a few other benchmarks. And a game like Diablo 4 uses all the VRAM you have; the 4090 and 7900 XTX max out at 24GB at 4K.
@@marcinkarpiuk7797 Yes but how much actually used? Some games just allocate as much as they can get without actually using it "just in case".
I'm looking to get a new graphics card, and I can't believe that out of all the videos I've watched where people talk about the 4060 Ti's 8GB and 16GB variants so simply, not one of them mentioned the relationship between system RAM and the video card's RAM, and you've explained it to an idiot like me! Thank you!
The 6800XT it is then.
My GPU sold on ebay for £399 Jawa offered £166 :|
The 4050 Ti 16GB, aka the 4060 Ti 16GB, would be a perfect all-rounder card if it had a 256-bit bus with full x16 lanes at some lower price point, around $320 with the same specs
They dont care about us
I'm on the verge of buying a new GPU. I'm waiting for the special offers in November, like I'm sure most people are. I was holding out since January to see what the 40 range was going to deliver, but I think after this, my mind is pretty much set on the 4070. Thank you Daniel, you've really made my choice quite easy. I'll be using this GPU for the next 5 to 10 years. Do you think, coming from the 1060 3GB card (which now can only run new titles due to my excessive amount of RAM), the 4070 will last me another 5-8 years?
depends on what titles, resolutions and settings. Honestly I doubt it will hold up 5-10 years of usage like in this video for upcoming titles; if it's struggling to hit 60fps in some games on ultra or at 1440p right now, then give it another 3 years of bad game development and you will be running low settings to bounce around an unstable sub-60
The 10 series was such a good marketing beacon for Nvidia for long-lasting, reliable cards, and the 20 series was good too. The 30 and 40 series seem not so good for upcoming titles. I think the blame is on Nvidia themselves for taking advantage of the international shortages, and on game developers making shit games
maybe look at amd alternatives with more vram or if you can afford it maybe a 4080 (real 4070)
@@tomglover98 Playing on ultra is honestly a waste though. And if you turn off RT and just run 1440p high with DLSS enabled, even without frame gen on, you will be looking at 100+ fps in a game like Cyberpunk. I just don't see the value in paying $200 more for a Ti that the same can be said of. =/
I would suggest a used 3080-12gb. I think they are all LHR cards, so they are unlikely to have been used for mining. They were one of the last 30 series cards launched. It will beat a 4070 if you don't use frame gen. It's definitely going to be a bigger card than a 4070, so if case size is an issue go with a 4070. I bought a 3080-12gb in June. I'm happy with it. Another choice could be a 6800xt.
Depending on what types of games you play and if you play at 1080 or 1440 these cards might last you 5 years.
@@walter274 yes, the 3080 was on my mind also, but then I'd need to upgrade my PSU, and at 1440p with DLSS the 4070 will for sure last me 5 years at medium to high settings with more than enough FPS. My main screen is a 32" 144Hz and I can't wait to play Hogwarts Legacy without the marshmallow floors 😂
I'm torn between a 4080 and a used 3090 Ti right now. The only things stopping me from pulling the trigger are the up-to-400-watt electricity costs and the ridiculous 1100 price tag of the 4080 (which has just under half the VRAM btw) @@walter274
When testing in Ratchet & Clank, Little Junk Town around the vendor is the hardest area to run. Best place to test performance.
It's funny because you can probably get Chinese repair shops to solder on extra VRAM for less. (Not all cards will work; the most famous example is the RX 480/580 4GB to 8GB, where they didn't cut down the memory interface.)
you're also playing roulette with whether the ram will be balanced or not. I don't think I'd try this myself but there would be nothing stopping you from just removing the added ram.
I skipped to the DLSS Quality sections. Benchmarking the latest games in 2023 on mid-range cards without upscaling makes no sense to me. Also. When are we going to see the AMD 7700/7800 offerings? Looking forward to seeing them getting incinerated like what is happening with nvidia. If I was AMD I'd just skip the pain. Keep selling their last gen high end cards at a discount and making nvidia look bad because of it. That's working out pretty well for them.
Thank you for this video. RTX 4070 is a good card and not trash because of only 12 GB.
It's not trash because it only has 12GB, but it would have been a lot better if it had had 16GB. It barely has enough Vram currently, but cards which are released with barely enough Vram always suffer a lot more than cards which are released with more than enough Vram. Just compare how well the 8GB version of the RX 580 aged vs. the 4GB version, and they were released six years ago.
You can't easily upgrade your Vram, so it makes sense to pay a little more to have more than what you need in the first year. But, with AMD, you can pay LESS and still get more Vram, and about the same amount of performance. I believe that the 6800 XT will prove to be a better value than the 4070, at 500, and the 7800 XT probably will too, especially if it's also 100 less than the 4070.
The 4070 is certainly by no means a "bad" card though, I just think it's a bit pricey for what it is, and that it would have been a better optimized design to put 16GB of Vram on it, though not necessarily better optimized for Nvidia's bottom line, because most people don't seem to care or worry that much about how well their card is going to perform two or three years later. But, it's when you first buy your card that performance typically matters the least, because that's when your card is going to have the easiest time running games with nice settings and with high frame-rates.
@@syncmonism if you buy the 70’ series card you fall for Nividea trap. Just get the 7900xt 20gb like you said pay a bit more but future proof for a bit
Great quick explanation of everything before getting started…This is how it should be done every once in awhile to remind or introduce people of these stats…
There shouldn't be any 8gb options anymore.
Daniel thank you - we really appreciate your hard work, and I enjoy that you talk us through your presentation - Brava
The biggest red flag is that even the 12GB of the 4070 is spilling over to system RAM. Not a huge amount, but it's there, and it leads to spikes in the frame time. 14-16GB really is the minimum spec for ultra settings plus high ray tracing going forward, which means every GPU with less than 14-16GB is a low-end graphics card and not worth spending money on. Also note that in Ratchet & Clank, the reason the 4070 is so close to the 4060 Ti's performance, with identical 1% lows, is entirely due to its 12GB of VRAM bottlenecking the card. If the 4070 were a 256-bit, 16GB card it would be way faster. Nvidia is selling us trash.
All the best goes to AI accelerators, and we just get trash...
Even the 4090 is a cut-down edition...
It's hard to say that any given amount of Vram is a hard minimum, but I definitely think that an additional 4GB on the 4070 would have made a very big difference, and that this difference would become especially useful in an increasing number of games over the next three years. And Vram costs Nvidia less than 4 dollars per gigabyte. Increasing the memory bus to 256 from 192 also wouldn't have actually cost all that much, but would have also come with further performance improvements.
@@syncmonism The die used for the 4070 is actually a 60-class die. They repackaged their RTX 4060 12GB as a 4070 and doubled the selling price. At ~350-380$ this 4060 12GB would have been a solid budget card, but at 600$ it's complete garbage. If you're spending over 600$ for a graphics card, you should only be looking at a 7900XT; at 700-750$ that seems to be the only "mid-range" GPU that's actually worth the money you're paying for it. It pretty much does 4K ultra settings without any garbage FSR/DLSS, and it will get you a 60FPS target lock in most optimized UE5 games. Unless Nvidia launches a 4070 SUPER card with a 256-bit bus and 16GB of VRAM, you should not even be looking at team green in the midrange. Either spend 1600$ on a 4090, or buy a 7900XT for ~750$, or buy a beater card on the used market. Everything else is trash.
$200 between the 4060ti and 4070 lol what a time to be alive 😂! Spoiler just bought the 6800xt
Can you test a 3070 8GB in PCIe 4.0 x16 mode against 4060 Ti 16GB?
I went with the 4070. No ragrets
Thanks for all your hard work and comparisons! 😊
I bought one of these cards. I'm a 1080p player with an expensive Eizo Coloredge screen (which I will keep for 5 years) and I value energy consumption. Guess what I bought?
Love the quality and time put into the vids!
The 4060 ti 16gb is aimed more at content creation I would argue, where the larger vram is brilliant without the cost of the 4080 etc..
Can you please recheck your Ratchet&Clank results at 1440p Ultra with RT. Techpowerup made a test and they got 4070=62.6fps, 4060Ti=43.9fps, 4060Ti 8GB=38.7fps. Great work by the way! :)
Testing different areas of the game likely produce different results.
@@danielowentech Yes, they say there is always bottleneck somewhere in the system that is limiting final FPS.
Perfect video. Ty! And very well explained
It’s almost like it needed to be 4060 Ti 12 Gig for $400 and the 4070 16 Gig for $500 or $550
I play Ratchet on my 3090 at 4k ultra, high RT, DLSS quality, locked at 60 FPS. I see over 14gb of VRAM used. It is a beautiful game, but wow.
Talk about comprehensive! Excellent work again!
You should have the flying nimbus underneath you if you're going float like that😂
Your price comparisons only make sense when you already have a system excluding the GPU, but most people probably buy entire new systems, and if the GPU is the bottleneck the value changes drastically. Would be sweet to have some comparisons from that POV as well
most people do not buy new entire systems
@@sammiller6631 most people do in fact buy full systems; otherwise the mainstream chain stores selling prebuilts wouldn't exist?
@@tomglover98 Well, if you're buying a prebuilt, you're more than likely not comparing price-to-performance of GPUs
Fantastic video! Thank you for making something like this! :)
Keep saving up money. Pc gaming is an expensive hobby 💀
So I tried both the 8GB and the 16GB model. If for some reason you use a 4K monitor and you want a 4060 Ti, do yourself a favour and look for a good 16GB deal; the extra VRAM saves you from a lot of headaches. The MSRP of the 16GB model is of course a bad joke and certainly not recommended.
Thank you for the hard work/information, not sure now if I am more knowledgeable or more confused, lol.
Love your reviews! It's not like other big name reviewers that don't even show the gameplay.
Great review as always, could have included 6800 XT
Why? The 6800XT has already been compared to the 4070 and 4060 Ti. It was an obvious big win for the 6800XT vs the 4060 Ti, and an even matchup between the 4070 and 6800XT, because the 6800XT was slightly cheaper and had more VRAM but was worse at everything else: RT, upscaling tech, encoding, workstation, consistent driver support and optimization, power efficiency, size and form factor. Pretty sure he said he'd pick the 4070, but he doesn't think there's a right or wrong choice between the 4070, 6800XT, and 6950XT; it completely depends on what you do and don't need from your card
@@Angel7black if you had a 650W PSU you'd need to upgrade it with a 6800XT, while you don't with the 4070, given the 6800XT's insane power draw ;). I do respect that this is a niche situation, but it just means you can save money on a power supply.
@@WilFitzz That's a lie, I have the RX 6800 XT with a 650w PSU and it works like wonders.
@@khalednajjar3852 you are lying. The 6800XT spikes as high as 580W according to TechPowerUp. No way you aren't bluescreening with a 650W PSU when it takes roughly 50W to power everything else (also accounting for whatever CPU is in use)
@@WilFitzz Really bro, are you high? The 6800 XT doesn't draw more than 300W, and even the 4090 doesn't draw more than 500W. Where did you get your information from? 😂 Even on TechPowerUp it says the same.
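For what it's worth, this argument is really just headroom arithmetic, and both commenters can be quoting real numbers: sustained board power (the "300W" claim) and millisecond transient spikes (the "580W" claim) are different measurements. Here's a hedged Python sketch; the 150W rest-of-system figure is an assumption, not a measurement.

```python
# PSU headroom sketch -- illustrative numbers only.
# Sustained board power and millisecond transient spikes are different
# measurements; ATX PSUs usually tolerate brief transients above the
# sustained load, so headroom is normally judged on sustained draw.

PSU_WATTS = 650
REST_OF_SYSTEM_WATTS = 150   # ASSUMED: CPU + board + drives + fans under load

def sustained_headroom(gpu_sustained_watts, psu=PSU_WATTS, rest=REST_OF_SYSTEM_WATTS):
    """Continuous wattage left over after the GPU and rest of system."""
    return psu - rest - gpu_sustained_watts

# ~300W sustained (6800 XT class) leaves positive headroom on a 650W unit;
# treating a 580W transient as if it were sustained draw would not.
print(sustained_headroom(300))
print(sustained_headroom(580))
```

So a 650W PSU surviving a 6800 XT is plausible on sustained numbers, even though the transient figure looks alarming on paper.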
Thanks Daniel, as always, for such a comprehensive comparison. However, I don't understand the advancement in GPUs; are we moving on to higher-res gaming or going backwards, with $350-400 GPUs struggling at 1440p?
Just upgraded from a Nvidia GTX 1050 TI to a Nvidia RTX 4060 OC EVO and it is the best thing ever!!!
16gb?
Same thing happened with 1060 3gb that no one wanted
Is it just me, or at 1080p are the cards very close, while at 1440p the 4070 suddenly dominates? What makes it so strong at 1440p but not consistently dominant?
bus width
8GB of VRAM is not enough for today's and future gaming, as games keep getting more demanding. But what I think is that it's unfeasible to play on ultra, since you will not notice any difference in 99.99% of the gameplay time. Thus, it is more reasonable to lower your settings to high or very high, which will result in more stable frames. Gaming on ultra makes very little sense. But I admit that getting above 60 fps while having everything tweaked to ultra feels incredibly satisfying.
Good video. I bought the 8GB version mainly for AI work but it does seem you get parity between these two cards when settings are reasonable. Even on my 4080 which I did buy hoping to do some RT, I never turn it on because I always end up tanking my performance and even if it doesn't then I seem to end up with weird stutters no matter what settings I use.
nice vid was looking for this :)
I thought the 4070 would do better. 12gb is garbage at that price. I have a 3060ti and I'm going 16gb on my next card no exceptions.
I think that's wise, but what about 15GB? :D
;)
Seriously though, I really think that 12GB will hold back the 4070 quite a lot over the next few years.
It's not that 12GB will make the 4070 useless or obsolete any time in the next four years, but it will really limit what the card can do in an increasing number of games, and in some games textures will just not always load in. It will require you to spend more time messing around with settings to find the right balance. You're also more likely to find that some parts of some games slow down, so you'll have to change settings more often for different parts of the game, which could be immersion-breaking if it starts happening at some epic moment partway through a single-player game.
I still rock a 1080 with 8GB. I've been wanting to upgrade for a year now but still haven't. I don't know - I'm on a 1440p 144Hz screen and may change to 3440x1440 ultrawide, but none of these cards since the 20xx series really makes me want to change, as I have to spend my money wisely. What do you guys think: is it the right time to buy an AMD 6xxx/7xxx series or a 4070, or should I wait? Usually I feel confident upgrading after 4-5 years, but times have changed. Lemme know your thoughts please
The best cheap upgrade is the 6800XT, or get the 12GB Radeon card that's even cheaper and still more than capable at 1440p. I feel ya bro, I had the 1060 and now a 3060 Ti, and even I want to upgrade.
@@Aaronnpool23 That's very good advice. If the rumors about the 7800 XT are true, it will be a bit slower than the 6800 XT, and cost around 550. However, the 7800 XT will probably also be a good option, even if it ends up being a little slower than the 6800 XT. I suspect that the 7800 XT would eventually catch up to the 6800 XT in performance because it has a newer architecture, and/or it might eventually end up with superior upscaling abilities, and it will be better at ray tracing than the 6800 XT on day one, though I really don't think that ray tracing performance is going to be all that useful at this performance tier, as it's going to be the first thing you want to turn off in order to improve performance in demanding games.
Also, basic "Lumen" lighting, which is a type of ray tracing used in Unreal Engine 5, works just as well on RX 6000 series cards as it does on Nvidia cards, and that might be the most common and practical form of ray tracing in games over the next three years for this performance tier.
@@syncmonism well, the 7800XT is over 3 years newer, so it should be better, but I'd choose the 6800XT over it. 16GB of VRAM is 16GB regardless of whatever "upscaling ability" or extra units the card comes with. Both have a 256-bit bus, so you're really only paying for more processing power.
I'm stuck with a 1440p monitor and only willing to pay for a $400 card. I think the 4060 Ti is an upscaler card at that resolution, so it will be a fun 960p/720p gaming experience for me for the next 5 years.