It's a 60 class card with 80 class pricing. This was originally supposed to be 4070 TI. Think about that. Nvidia wanted to sell this card as two steps above where it actually performed. And their original MSRP was probably going to be $750.
I think Nvidia figured they could cut raw power and use DLSS 3 as the excuse for higher prices, like "yeah it's weaker, but it has DLSS 3!" The 50 series will have DLSS 4, so those cards can be weaker than last-gen GPUs in raw power but "twice as good" with DLSS.
This 4070 was supposed to be the 4070 Ti. Did you forget the 4080 12GB? If Nvidia hadn't renamed that one the 4070 Ti, this 4070 was next in line and would have been named the 4070 Ti and sold for $750. But the backlash on the 4080 12GB was too big and Nvidia got cold feet.
@@luxaly9510 Using DLSS 3 to justify the price when you're paying $600+ is absurd. Acting like frame generation is worth paying for, when you should have native power at that price range, is ridiculous.
It’s almost as if the Ti was the 70 and the 70 was the 60. They didn’t get away with it when it came to the 80 but have just gone and done it with the 60/70. Nice bait and switch!
You guys read what Jensen recently said about pricing? Knowing that, I'm jumping ship like EVGA did. I may not be as fast in AI or DLSS, but I'm no longer going to ride Jensen's boomstick. YMMV, but expect this to get even worse with the 5000 series. As long as that man is alive and CEO, this is only going to get worse.
@@generalawareness101 That's why I'm looking at 7900 XTX's and trying to figure which to buy (must be white or light silver to fit the case and motherboard I'm getting/have got), they look decent and a slight upgrade from my laptop 3080. I won't buy Nvidia after the scummy things they've done so far.
I dunno, for its performance the 3060 definitely isn't power efficient: a 6700 XT uses the same amount of power and outperforms it by a ton. Besides that, a 2060/2060 Super used far less power than the 3060 and is barely outperformed by it.
@@dustinharvey7394 The 3000 series was pretty power hungry; the 4000 series is definitely a leap in power efficiency and overall a decent leap in performance. It's just a shame that Nvidia decided to name and price them the way they did.
@@dustinharvey7394 That's a node difference. Compared to the higher-tier cards the 3060 was more power efficient because it wasn't pushed as hard. If you look at it realistically, the bigger cards are more power efficient if you just power-limit them, because they have more cores...
Went shopping for a GPU upgrade yesterday, thinking about the new 4070 at around $600-650. Left with an RX 6950 XT reference model, which is $649 right now. Honestly, I could not be happier; that was the best "change my mind on the fly at the store" moment I've ever had.
@@SEGArianer it's the top end card from last gen....... Why would you compare tdp...? Obviously it's gonna be better on the newer and lower end model in the lineup... Weird.
My wife still rocks her Sapphire Nitro 6950 xt and laughs at these newer gen cards that can't keep up without fake frames. That being said AMD fake frames are coming lol
@@megadeth8592 Because of form factor and the power supply needed. The 4070 will fit in more cases and shouldn't require a PSU upgrade. That's why you shouldn't only compare FPS.
Most reviewers, from the 5 or 6 I have watched, have said the "card is OK, but overpriced". They then usually go on to say how much better Ray Tracing and DLSS are, and point out power draw is low, tech is latest... Not really a bashing. Jay's video is a well constructed takedown, and well supported with data and methodology.
6950xt owner here. I switched from nVidia when I saw the 40 series pricing was simply unaffordable this generation. I’ve been nothing but impressed. The software is powerful and intuitive, the performance is next generation and the price was a steal. Give AMD a chance, you won’t regret it.
@@Gabu_Dono It's pretty loud and power consuming, I have the red devil variant. If you have a really low watt psu, the 4070 is honestly a pretty good alternative
Where I live this card is pretty cheap and amd cards are really pricy so I might actually jump the gun and go for it but yeah, I’ll wait a couple months
I may not have bought much EVGA stuff, but this is why we need them in the video card market. EVGA would have found a way to unlock it, forcing Nvidia to unlock voltage on all cards.
I keep saying that this 4070 was originally meant to be the 4060 or 4060 Ti. What makes this so much worse is the MASSIVE increase we got in CUDA cores from the 2000 series to the 3000 series. Look at the generational growth in CUDA core counts leading up to each 4000-series card:

1070 (1920 CUDA) to 2070 (2304 CUDA): 20% increase
2070 (2304 CUDA) to 3070 (5888 CUDA): 156% increase
3070 (5888 CUDA) to 4070 (5888 CUDA): 0% increase

1070 Ti (2432 CUDA) to 2070 Super (2560 CUDA): 5% increase
2070 Super (2560 CUDA) to 3070 Ti (6144 CUDA): 140% increase
3070 Ti (6144 CUDA) to 4070 Ti (7680 CUDA): 25% increase

1080 (2560 CUDA) to 2080 (2944 CUDA): 15% increase
2080 (2944 CUDA) to 3080 (8704 CUDA): 196% increase (this was such a big upgrade)
3080 (8704 CUDA) to 4080 (9728 CUDA): 12% increase

1080 Ti (3584 CUDA) to 2080 Ti (4352 CUDA): 21% increase
2080 Ti (4352 CUDA) to 3080 Ti (10240 CUDA): 135% increase
3080 Ti (10240 CUDA) to 4080 Ti? (11800 CUDA?): 15% increase?

Titan X (3072 CUDA) to Titan RTX (4608 CUDA): 50% increase
Titan RTX (4608 CUDA) to 3090 (10496 CUDA): 128% increase
3090 (10496 CUDA) to 4090 (16384 CUDA): 56% increase
Future 4090 Ti: 18176 CUDA?

I took a guess that the 4080 Ti will be a 15% increase over the 3080 Ti. It's pretty clear that Nvidia had no interest in giving such an enormous generational uplift again. They gave us an RTX 4080 with just 12% more CUDA cores and 60% more VRAM than the RTX 3080 10GB FE, yet raised the price from $699 to $1199, a 72% increase. Nvidia expected to charge 72% more for 12% more CUDA cores and roughly 30% more non-RT/non-DLSS performance. When we went from the 2080 to the 3080, we kept the same $699 MSRP but got roughly 70% more gaming performance. The 4090's CUDA count was clearly grown simply to keep the performance at the top unattainable by anything AMD could launch.
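The percentages in that list are easy to double-check yourself; here's a quick sketch (the core counts are Nvidia's published specs for a few of the cards above, and the percentages are just rounded to the nearest whole number):

```python
# Generational CUDA core growth, using Nvidia's published core counts
# for a subset of the cards listed above.
cores = {
    "1070": 1920, "2070": 2304, "3070": 5888, "4070": 5888,
    "2080": 2944, "3080": 8704, "4080": 9728,
    "3090": 10496, "4090": 16384,
}

def growth(old: str, new: str) -> int:
    """Percent increase in CUDA cores going from `old` to `new`, rounded."""
    return round(100 * (cores[new] - cores[old]) / cores[old])

for old, new in [("1070", "2070"), ("2070", "3070"), ("3070", "4070"),
                 ("2080", "3080"), ("3080", "4080"), ("3090", "4090")]:
    print(f"{old} -> {new}: {growth(old, new)}% more CUDA cores")
```

Running it reproduces the numbers quoted above: 20%, 156%, and 0% for the 70-class cards, and a 12% bump for 3080 to 4080 versus 56% for 3090 to 4090.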
The 4080 is literally almost half of the 4090; it should have cost $799, which would have factored in inflation. (edit: accidentally used the future 4090 Ti CUDA count instead of the 4090's, fixed)
@@SamfisherSam They are pretty comparable within the RTX generations, meaning they perform roughly the same function. I understand that going from RTX 2000 to RTX 3000, the huge increase in the CUDA core count was due to the FP32 ALUs being doubled for each SM. That doesn't change the generational horsepower discrepancy, which, when excluding ray tracing, DLSS, and some 4K loads, is worse than the pitiful ~20% increase we saw going from the GTX 1070 to the RTX 2070.

Nvidia tiered this 4070 on the minimum amount of GPU they thought they could get away with while still being able to "sell" the story of large generational improvements, heavily reliant on DLSS 3 / Frame Generation. That's why all of their marketing materials discussing improvement over last gen only quote DLSS 3 numbers, never raw rendering performance. They took DLSS 3 / Frame Generation and used it to sell us an RTX 4060 as an RTX 4070 for 20% more than they demanded for the RTX 3070, and 82% more than their MSRP for the ACTUAL previous-generation RTX 3060 12GB. If Nvidia was making $30 gross profit on every RTX 3060 12GB (within industry ranges), then they're making closer to $300 in gross profit on the RTX 4070, and I don't know about you, but I'd personally feel pretty damned abused knowing a company had multiplied its gross profit per unit like that compared to the 3060/3070.

Nvidia's gaming revenue exploded in CY 2021 (FY 2022) due to mining, morons paying over MSRP, and price increases. Then it was down almost 30% year over year for CY 2022 (FY 2023), yet net income (before a nearly $4B increase in operating expenses) remained nearly flat due to increased prices, and they still made sure they had enough to run an $11B stock buyback and $2.5B in stock-based compensation (exec bonuses).
I'm getting tired of people defending what Nvidia is doing, they make incredible products, yet they've gone anti-consumer. The best we can hope for is another couple of quarters with 25%-30% YoY gaming revenue losses to help them recognize that they still need consumers, although with the AI explosion that is just starting to take off, I'm betting they go from $15B in data center revenue in FY 2023 to $25B or more in FY 2024, the gaming segment won't really matter at that point and profit margins for datacenter will double, possibly triple because they can charge whatever they want at this point.
@@racerex340 Wow. I heard Samsung offered Nvidia their 8nm node at half the price of TSMC's 7nm, so in a way they earned way more per GPU, and considering the GPU price crisis they made some serious profits. I think their focus was more on increasing profit per GPU than on increasing performance. It's a very pro-investor move from a business standpoint, but their consumers are showing awareness now by not buying 4070s.
It was going to be the 4070ti but the whole two 4080s thing happened. That's all marketing anyway. Like all other generations new models perform around the level of the higher model in the previous generation, just like now. Marketing is not the tech.
@@bloxoss each generation the 70 tier cards were at least 20% faster than the 80 tier from the past, this time it barely matches the 80 tier card, and sometimes loses to it. Since when is it 3090 level?
@@bloxoss No shit. If there's no improvement then why would people upgrade to the new generation? You do realize that the entire point of a new generation is to make people upgrade from what they currently have because the new gen is better. A new generation being faster does not justify a price increase, the entire point of a new generation is that it performs better.
@@username8644 Yes, and Nvidia's CEO once promised to release a new product every 6 months, not every 2 years! After a 2-year gap, Nvidia should be providing at least 2x the speed at the same price point, so the whole 40 series is overpriced!
And I won't buy it. I'm so pi$$ed at NVidia that I sold my one card from them and bought an Intel to replace it. I lost a tad of horsepower, but NVidia needs a good strong dose of competition and loss of sales to help them get the message, and Intel needs some encouragement to keep making cards.
@@ScottGrammer Honestly, Nvidia does not care too much about consumer graphics cards anymore. They are shifting over to AI, which is probably why they increased the prices so much for their graphics cards. If they can get much much larger margins from the same TSMC node by producing AI accelerators, why bother with consumer graphics cards ? Thus, they increase the price so much, that it is worth it for them. As much as I hate the game ( capitalism ), Nvidia is doing everything right within its rules. Don't hate the player, hate the game...
No, they're giving more for less. The 3080's price was like $1000 and up; the 4070 is $599 and uses at least 30% less power than the 3080. So how is it giving less? Well... less in terms of the power bill.
@@SayWhaaaaaaaaaaaaaaaaaaaaaaat We're talking about raw graphics power, not just saving power. The TSMC node is more power efficient, but that's nothing special. And the 3080 was $699, not $1000; you just got scammed on that price.
Dog, he’s gaming on a 4090 in his free time. Think about that. He doesn’t really give a crap. He’s feigning outrage to try to relate to the everyday poor person.
I got lucky with acquiring a evga 3080 ftw3 in the middle of all the madness a couple years back and was hoping to get a 4070 on launch thinking it would be a huge improvement. Glad I don't need to spend that money now.
@@mtrx1708 Historically the 70 card of the next generation is much better than the previous 80. I was hoping the RT cores would be much better, and they really aren't.
This was much more informative than most review videos from launch day. (Not a criticism of the work done during that time crunch, just applause for the work put into this video.)
Was never planned to be sold at $750, that was a leaked memo saying "DO NOT EVER PRICE THIS OVER 750!" as a guideline for partners. It was never the MSRP.
I believe the price is more an inflation issue than prices actually being hiked... money keeps getting more worthless, and people nowadays aren't earning much more than they used to. The price is never the problem; the bank account always is :)
@@LetsFixITJoe Inflation between the 3070's release (late 2020) and the 4070's release totaled about 10%. That means adjusted for inflation, the 4070 should have been $549. At $599 that's double inflation, not even counting the biggest problem: the performance gain is nowhere near what both vendors have delivered historically. From the 2070 to the 3070 there was a 40% rasterization gain; from the 3070 to the 4070 it's only 27%, despite the absurd price jump we all know was supposed to be $750, aka a 50% increase. Compare that to the 980 Ti vs the 1080 Ti: a 7.5% price increase for a ~100% performance gain. Screw Nvidia, so glad I bought AMD last year.
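For anyone who wants to sanity-check that inflation argument, the adjustment is a single multiplication; here's a quick sketch (the 10% cumulative inflation figure and the $499/$599 MSRPs are the comment's own numbers, not official CPI data):

```python
# Inflation-adjusted MSRP check, using the comment's own figures.
msrp_3070 = 499          # RTX 3070 launch MSRP, USD
msrp_4070 = 599          # RTX 4070 launch MSRP, USD
inflation = 0.10         # claimed cumulative inflation between launches

adjusted = msrp_3070 * (1 + inflation)   # what $499 becomes after ~10% inflation
premium = msrp_4070 - adjusted           # how far the 4070 overshoots that

print(f"inflation-adjusted 3070 MSRP:  ${adjusted:.0f}")
print(f"4070 premium beyond inflation: ${premium:.0f}")
```

That gives roughly $549 as the inflation-adjusted price and about a $50 premium on top, matching the comment's "double inflation" point.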
I commented on your last video that I thought this 4070 with its specs and performance looks like it should have been a 4060, everyone called me crazy, looks like I was right.
Especially since the 3060 has the same VRAM size and memory bus width! Screw NVIDIA, I'm looking at AMD's and Intel's offerings at the end of this year.
@@racerex340 I said the same thing. This is the one series in Nvidia's lineup that saw ZERO generational CUDA core increase. 1070 to 2070 grew, 2070 to 3070 grew (big time), 3070 to 4070 = 0%. Shit, 2080 to 3080 was almost a 200% increase in CUDA cores. Even 3090 to 4090 was a 56% increase, while 3080 to 4080 was only 12%. We were getting less of a bump this time no matter what, as Nvidia had always planned on DLSS 3 / Frame Generation with RT providing most of the generational uplift over the 3000 series, as evidenced by their marketing materials only focusing on RT/DLSS/FG numbers and NEVER native rendering/rasterization improvements. The 4080 is really only a 30% increase over the 3080, which makes sense: the new node process gave them almost 20%, and the 12% core increase covered the rest. But with everything else going on, crypto and now the AI/ML boom, Nvidia is basically telling us they don't want to give us more than a ~20% increase in frames per dollar over the original 3000 series.
@@racerex340 I have a 3080 Ti, so I'm not really looking to upgrade at all this generation. However, I think (I may be wrong) that people who know about this stuff are seeing through the BS of NVIDIA giving us DLSS and frame generation instead of real hardware improvements in the mid-to-high tier range. The only exception this generation is the 4090, and the price of that card here in the UK from AIBs is around £2000, so all the other cards exist to upsell it. NVIDIA gets away with it because there's no competition in that tier, but I really think consumers should start buying AMD for a generation or two. They're good cards. Not the best, no, but they can certainly get the job done until this BS gets sorted out.
Their customer base *IS* that dumb. Anyone who didn't see this happening years ago is blindly chugging Koolaid. AMD isn't much better in some regards, but their cards have longevity and better performance per dollar.
@@GrizzAxxemann Amd isn't better in any way. Their cards are always bad at launch until at least a year later due to their shitty drivers that they still can't seem to get control of. And all they do is match Nvidia's pricing and then lower their price a little bit to make up for their instability with their cards. Anybody who's bought a GPU since the 20 series is to blame for this. There hasn't been a single good deal since the 10 series and people keep buying cards, it's ridiculous. Too bad most of the population is stupid because the rest of us have to suffer due to their poor decisions.
@@username8644 It's a competition issue, not a customer issue. PC Gamers are enthusiasts. They don't care about cost as much as they do performance...if cost was the primary concern they would be console gamers and not pc gamers. The problem with this gen is that the performance does not even justify the cost, which is why sales are shit...make no mistake, if the costs were the same as they are, but performance for each card was 20% better, these cards would be sold out at these prices. I guarantee you when Nvidia releases next gen, they will keep the same pricing scheme, but the cards are going to be sold out because the performance gains that are missing right now, will be there in the next gen.
@@poofy0121 Customers are as much the issue as the companies right now. They are so desperate to play games at 4k 200fps instead of just settling on 1440p gaming with a few settings in game settings turned down from ultra. You don't need an rtx 4090. My 1070 ti still works great to this day and is totally usable even with my 3440x1440p monitor. An rtx 3080 is more than enough for anything gaming related. Yet these people think they need more powerful GPUs and keep giving money to companies that are scamming us. They are the reason the companies keep scamming us with every generation.
I'm sort of glad I went for a 6000-series AMD card (6800 XT), since on paper it performs about the same as the 4070 most of the time, give or take, except of course in RT, which I don't think anyone really uses (at least I don't). Also, having paid a lot less than for a 3080 or 3080 Ti, I'm very happy with my card and its performance.
If you've seen any of the UE5 games, RT is about to become far more important, particularly in the AAA space (search Unrecord for where we're heading). AMD MUST improve their RT performance, as they're lagging in third place. RT/PT will be standardized in game engines in the next two years, given NV's market share. Intel has already outdone AMD, and their next-gen cards could really damage NV's market share at the all-important low-to-mid range.
@@Takisgr07 Up until now it's a valid argument, as RT is hardly ubiquitous. However, every next-gen engine is going to have some version of it baked in. You still don't have to use it, but I don't think AMD can afford to be in third place, especially if Intel's Battlemage cards continue to outclass them in that department.
@@zhardy323 It's not Nvidia who are the stupid ones - the stupid ones don't want to admit it though but still keep opening their wallets(cuz they "need" a new gen card of course)...
yeah, especially the fact that it has exactly the same number of CUDA cores as the 3070, and they have NEVER kept CUDA core counts the same generationally, always increased except in this one 3070 to 4070 case.
Oh, good to know. I just got my Strix 3080 OC and was worried I should have gotten the 4070, but the 3080 was only AUD $600 (USD $380-400ish), so of course I couldn't pass that up. Looks like I made the right choice.
Nah, even AMD is overpricing their cards because of NVIDIA. No gpu company is your "friend" when they're clearly working with each other's interests. Just wait for the next generation and stop giving these companies your money
@@hhkl3bhhksm466 The next generation will likely be even more expensive. While you're right that no GPU company is your friend, I tend to want to go with one that's at least less my enemy. So I'm also looking hard at a RX6xxx card to be my next, probably a RX6950XT.
@@Calamity_Jack Wait just a couple more months. AMD is about to release their new 7800 XT, the competitor to the 4070. It also uses RDNA 3 and is significantly faster than the previous gen.
The skip method worked for the 2000 series. If a large chunk of consumers do that as well then Nvidia may be forced to come to their senses for next gen.
@@chilpeeps The PS5 has definitely given disappointed NVIDIA customers another option but NVIDIA's fundamental problem in the PC Gaming market is self inflicted.
Not going to work. NGreedia would sooner keep the pricing next gen and push gamers to buy their subscription service to use network GPUs. NGreedia couldn't give less of a shit about the gaming community or how 'affordable' their adult Legos are.
They are selling now mainly to people who use AI, stuff like that. Productivity related activities. Who do not care how much it costs, because they either make money with it or universities are paying for them etc. So as long as they don't have a competitor(AMD is not a competitor in that, at all), they can set whatever price they basically want. I hope I'm wrong, but I think I'm not.
Yeah, I went from my 3070 FE 8GB to a 3090 Ti 24GB on a spontaneous "oh, that looks like a good price" whim just before they announced the 40 series. I had a brief "maybe I should have waited" moment, but now I'm kinda glad I got that 3090 Ti.
@@Kinvarus1 I went from a 1080 Ti to a 3090; that was a huge gap :D The auction listing said "lightly used for simulation project", so it was surely mining for the past 2 years, but I tried to murder it with Kombustor and other programs and it passed with no problems. Not the best bin considering max clocks, but undervolted it still performs nicely enough. I even made a 3D-printed shroud for it with three 92mm Noctuas, so it maxes out at 62C in close-to-perfect silence.
I bought the 3090 at a higher price than MSRP (though not that much higher) and was kicking myself for wasting money, but all in all I think it was a good purchase. With DLSS etc. there's no real need to upgrade for a long time now.
just got the 4070 fe for my mini itx. It is actually a pretty good card for a mini itx build. it is quiet, cool, efficient and small. However, if you are not building a mini itx then it has pretty bad value.
You thought the 4070 was a higher tier card that they limited? Well that's certainly a first I've heard, most people strongly believe this is a lower tier card that boosted "the number" to get more money out of it, just like they tried to do with the 4080 12GB edition
@@RacingAnt Sorry, I typed that wrong; I meant the 4080 12GB edition. The mind gets a little mixed up thinking through the generations of screw jobs Nvidia has been releasing.
People who believe that aren't doing the math right. If there originally was a 4080 16GB and a 4080 12GB, and Nvidia had gotten their way with no one complaining, they had to have had a card lined up to be the 4070 Ti. It would have been this card (the one now named the 4070). After they bumped the 4080 12GB down to be the 4070 Ti, they had to bump down all the cards below it as well.
@@hastyscorpion That's an interesting take, and certainly a feasible one. However, another theory is they simply wouldn't have released a 4070 Ti at all and the 4070 would have been the first *70 card. After all, except for the 3060 Ti, all the other day-one cards were non-Ti variants (or Super, in the case of the 20 series), with the Ti versions coming later. Then Nvidia could really push the idea that those super-pricey flagship products are well worth the money: just look at the difference in performance! That said, the 4070 Ti wasn't a limited 4080. Those cards were already produced by everyone (Nvidia and AIBs), boxed, and ready to ship to retailers, until word came down from on high; then boxes got swapped out and any nameplates or stickers reading "4080" were removed and replaced with 4070 Ti branding.
I had to rebuild my five year old system at the beginning of the year. Went from a 1050ti to a 3060. So far everything I've seen in the past few months tells me it was definitely better to get the 3060 rather than wait for a 4070 or any other 4000 series card. I'll be fine for the next few years when hopefully the graphics card market will finally fix itself.
If I upgrade from the 3060 Ti it'll be to a 6800 or 6900 XT. Prices will only keep rising thanks to the warmongers and countries that just keep printing money.
It's not because the 4070 was destined for more, it's because it was destined for less! It's the actual successor to the 3060ti. So yeah there SHOULD be at least an extra card between the 4070 and 4070ti
Nope, they upsold all the cards by one tier. The 4090 should be the 4080; the 90 series is a sham to make the flagships cost more. The 4080 is the true 4070, the 4070 Ti is the 4060 Ti, the 4070 is the 4060, and the sorry excuse of a card that the 4060 will be is actually an overpriced entry-level 4050.
@@guilhermelopes986 The 4070 Ti's performance with 16GB of VRAM should be the real 4070, given that the 70 card always matches last gen's top-tier performance. Hence the actual 4070 is more like a 4060 Ti (the 3060 Ti and 2080 were basically the same).
As someone who's been using this graphics card for over a year, I've got to say that although the value of it could be better, the performance is pretty good for my standards. I do not have older GPUs to compare to, but what I did have was a laptop with a 3050 TI in it. As we all know, laptop GPUs are much weaker than their desktop counterparts. So a 4070 was definitely a massive upgrade from a 3050 TI laptop GPU. I will probably keep using this GPU until it just stops working at which point I will hopefully have enough money saved up to buy something better.
I bought PNY 4070 and have no issue turning on the voltage or any other settings. My 3D Mark Time Spy is over 17,000. Maybe its just the Founders Edition that's locked down. Have photos if you want them.
64-bit bus, PCIe 5.0 x2, 6GB, and NVENC disabled, for $469... it'll run at 3050 speed on any system with less than PCIe 5.0... I mean, I'm just plucking that out of thin air, don't wanna give 'em ideas...
Was thinking of getting this GPU. No thanks, gonna stick with my 1080 Ti and get the 6950 XT or 7900 XT; both are looking real sweet now. Never tried AMD before, but it's looking better and better.
I guess I'll keep my 4070 Ti then. I was feeling a little guilty for spending $900 on it (Gigabyte Aero/White build), but since you can build a whole civilization in the valley between the 4070 and 4070 Ti, I guess it'll be worth it, lol.
@@RaptorFPV It ended up crashing games, so I returned it before my window closed. Back to deciding what to get again. The 7800 XT is looking like a good candidate now.
@@Clint_the_Audio-Photo_Guy Crazy driver issues with the 7800 XT I just got; returning it to get a 4070, tbh. Windows Update loves to replace AMD drivers with its own and screws up your drivers. You then have to constantly roll back the driver and re-install it.
@@Clint_the_Audio-Photo_Guy I know it's a Windows issue. I tried literally every fix and nothing solved it, and I didn't want to risk getting past the return window and enduring future driver BS. I saw it as paying extra for something that works with no headaches.
Hey Jay and the Two Cents team! Big fan of the channel for years, thank you for the videos; I've been poring over them over the last few days. The reason is that I'm being asked to travel to the US for business, and I'm going to take the opportunity to visit Micro Center for the first time so I can upgrade my PC!

So, the reason I write here: I built my PC about 5-6 years ago and have tipped away at upgrades over that time (cooler, PSU, storage, etc.). Originally I thought to do the same and buy a 4070 Ti, but my i7 6700K probably isn't going to cut it... So I'm looking at a new motherboard, CPU, RAM, and GPU that give me a little flexibility to upgrade later. The problems I'm seeing online are bottlenecking, compatibility, power delivery, and most importantly, bang for buck. I've saved about $1,000 for this, but I'm getting such great deals in the US (compared to home, where the 12600K is $300-350) that I'm tempted to use the credit card to bump up a choice or two (like the 13600K). I'm just nervous that will balloon into unnecessary purchasing. Could you please cast your eyes over my build and nudge me in the right direction? Any help at all would be most appreciated!

CPU: i7 6700K > i5 12600K
Cooler: NZXT Kraken X62
Motherboard: Asus Z170 > Z690 (DDR5)
RAM: 16GB DDR4 3200 CL16 > 32GB DDR5 5600 CL40
Storage: Samsung 970 EVO Plus 1TB + 3TB Toshiba P300
GPU: EVGA FTW 1070 > MSI Ventus 4070 Ti
PSU: EVGA Supernova 850W Gold
Case: NZXT S340 Elite

I use Photoshop, Illustrator & Premiere for work and game at 3440x1440 144Hz. Thanks in advance!
Buy what you can afford, Robert; don't go loading yourself with debt for an extra 10fps. I'll get savaged by the fanboys, but if you're using the GPU for anything other than gaming I'd always go Nvidia, so I'm not going to advise a last-gen AMD card even though they're much better value. I'd also consider sticking with a DDR4 board to fit the 13600K within budget.
I think what's illustrated best here is that NVIDIA straight up thought that the 4080 12GB scam was gonna fly, mainly due to that large performance gap. I think they originally planned to have the 4070Ti be the 4070, and the 4070 be the 4060, but then figured they'd be able to milk the consumer by doing a tier split on the 4080 class thinking nobody would notice. Gotta make up that mining money somehow. ¯\_(ツ)_/¯
I am stunned that nobody actually gives this card any points for performance per watt (or silence)... it dominates that category. Is influencers' target audience kids who don't care about energy bills, or what? I went for the 4070 solely so I could keep using my 600W passive PSU, undervolt the GPU, and have a very silent rig that consumes power like a 2015-era PC while still getting great 1440p ultrawide performance.
These YouTube channels, while giving out decent info, also have to have hyperbole etc. to get views. I gave my 3070 to my son and ended up getting a new PC with a 4070. I wouldn't usually upgrade by one generation, but I wanted my own PC and not to share my son's. Seems like a decent card to me: a solid improvement over the 3070, with better performance, better DLSS, frame generation, and very good energy consumption... and it will also benefit from future updates.
So, here's a crazy thing about this card. I bought a 4070 Gigabyte Eagle OC a couple of weeks ago (replacing my 1650 Super OC), and I am absolutely blown away. Keep in mind, everything else in my computer is nearly 5 years old at this point, bought at the same time as the old 1650 Super: a Core i5 9400 and 32GB of DDR4 3600 RAM, both slow by today's standards and only low/mid tier. Yet I can play Warzone 2 at 4K (on a Gigabyte Aorus 48" 4K 120Hz FO48U monitor that I also bought alongside the 4070) and still average around 100 FPS. On an i5 9400! 😂 I'm definitely CPU limited, but man, 100+ FPS in Warzone at 4K is just insane; I'd average like 30 FPS on the 1650 Super in this very same computer, and nothing else has changed. I'm super happy with my purchase, although the price was definitely up there...
At this point these commenters are just crying babies. I decided to buy the 4070 for the price/performance/power/availability combination. Of course, if Nvidia dropped the price another 50 bucks or more, even better. In my country AMD cards aren't cheap; these ivory-tower YouTubers don't really know what they're talking about. Their eyes are only on the US market 😂
No, they're not very interested in energy usage. As for silence, some say most gamers use headphones, so they don't care. I undervolted an RX 6700 XT and trimmed the max frequency to run cooler in summer, with similar success at 1440p. Most of the most interesting games aren't the GFX-intensive AAA types anyway, and for the few that are, AMD's game profiles made it easy to push the GPU harder.
Gonna be honest, I bought one of the 4070's, but I got the Asus non-OC'd one, not the reference model. I'm pretty happy with it, but I do feel it could've been a bit cheaper. Like $50-$100 cheaper. And before anyone says "Why didn't you buy X instead???"... 1) It was going in an eGPU box, thus limited space. Otherwise I'd have just ripped the 3080 Ti out of my desktop. 2) Power usage. Electricity prices are nuts here in the UK, so I'm trying to limit power usage where I can (also why I don't use my desktop much right now with its 3080 Ti). 3) What I use my rig for really benefits from CUDA cores to speed up rendering. I'd be very interested to see if there's a way to unlock the voltage to get it closer to the max wattage it can pull... I think it's 225W from the 8-pin and the PCI-E slot? Might give it a nice boost.
It seems like skipping the 4000 generation altogether is the right move financially. I'm building a new PC now and found a lightly used 3080 for $400 - it should be plenty good enough until 5000 gpu's get released!
Thinking along the same lines, but getting hold of a low mileage 3080 at that price would be optimistic to say the least. Decent 3080s in the UK at the $400 equivalent are like trying to search for Unicorn Poo but much harder; harder to find that is, not harder Unicorn Poo, which is even harder to find (didn't want to get things confused, that's Nvidia's job).
1080 Ti still running strong, but it's getting old and slowing down quickly! I usually build a new top-of-the-line PC roughly every 7 years and it's time to do so, but the prices on graphics cards and mobos are just atrocious. Also, Intel's next CPU will be on a different socket, so building a 13900K is just silly for future proofing. Between prices and the tech changes that are happening, it just sucks to build a new TOTL PC right now.
Could I ask why you would take a 3080 over a 4070? They're still selling for more than the 4070 new anywhere I see, and are about the same on performance, with a higher wattage.
I wondered the same thing, but then got WAY more confused when he said he would get a 3070 Ti over a 4070. That makes absolutely no sense to me. Why would you get something slower, with much less VRAM, much higher power, and fewer features, for the same price?
The 3080 has more raw performance than the 4070. The 4070 has to use its gimmicks to match it. Especially at higher resolutions, the 3080 beats it in most games. I'm sticking with my FTW3 3080.
@@jedpratte Every comparison I've seen had them trading blows, but mostly the same. I could see the 4k maybe, but I only 1440p game anyway. But totally, if you already have a 3080, there would be zero reason to upgrade. I have a 1080 though, and had gone back and forth between the 3080 and 4070.
@@umbles7007 Yeah, the ones I saw showed the 3080 wins at 4K; the rest is similar performance. That's kinda sad, as usually the 70 series would trade blows with the previous 80 Ti.
0:25 🤣that was me about 2 months ago. I built a rig with 14 Corsair fans and 3 iCUE commanders. Took a while to get the cabling organized. Makes me curious as to what they have coming! 🤔
The 4070 Ti was supposed to be a watered-down 4080 with less memory. This was already confirmed. They changed the labeling last minute when consumers complained, so this threw the whole lineup out of whack: 4090, 4080, 4070 Ti (4080 12GB), 4070.
There was something I saw about performance on another channel: the previous gen's 80 series becomes the new gen's 60 series, e.g. the 1080 and 2080 became the 2060 and 3060. This 4070 mirrors the performance of the 3080, lending more evidence to the idea that it should have been a 4060 card.
I worked my ass off and bought a 4080. Told myself to ignore the price tag and just keep saving. I couldn't be happier performance-wise. The card is a beast. But not everyone can pay that much, in fact most can't, which sucks.
I really have no complaints about my new 4070. That being said, I upgraded from a 1080, and just let me say, wow, it's the difference between night and day for me. I don't really understand why people complain so much; it's good for what I want to use it for.
Yeah, comparing the 4070 to the 4070 Ti: the Ti gives 20% more performance for 25% more price. Yeah, this comment is on a video that is 5 months old, but when you're talking about price for performance, per dollar the 4070 is better. It's not really a fair comparison when reviews don't take prices into account. If price isn't a concern, the 4070 Ti is better; if price is a concern, you get more bang for your buck with the 4070. With the prices of GPUs the last 5 years or so, price is a huge selling point.
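The per-dollar argument above is easy to sketch in a couple of lines (the 20%/25% figures are the commenter's, not measured benchmark data):

```python
# Rough sketch of the price-for-performance comparison above.
# The ratios are the commenter's figures, not benchmark numbers.
perf_ratio = 1.20    # 4070 Ti performance relative to the 4070
price_ratio = 1.25   # 4070 Ti price relative to the 4070
perf_per_dollar = perf_ratio / price_ratio
# 0.96: the Ti delivers ~4% less performance per dollar than the plain 4070
print(f"4070 Ti perf per dollar vs 4070: {perf_per_dollar:.2f}")
```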
I have an RTX 4070 as well, but I can adjust the voltage slider just fine. I have mine running at 3050MHz. So that's a 150MHz overclock on the core clock and 1000MHz on the memory clock. No stability issues.
@@joegreezy It's my money. I'll use it how I please. I'm not paying $1000+ for a 4070 Ti. $650 was my max budget, and I was looking for a GPU with 3080 performance.
Me too with my second-hand 2080 Ti. Was planning on getting the 3080 before its price jumped and now it just makes no sense to upgrade to it. The 40 series might as well not exist until the 4080 sells cheap second-hand.
I'm always on the move and really enjoy GeForce Now. But, decided not to go for the 4080 subscription as I did not see the point of it, considering that I can game in 4k 60FPS DTS with the 3080 stream subscription. Glad I was right.
The RTX 40 series (with the exception of the 4090/4080) is basically the RTX 20 series: minor performance boosts in exchange for a new early-beta feature, while charging a major premium over the previous gen.
Performance difference is massive, but the price difference is massive too. Checked Amazon earlier today and the cheapest 4070 is £560 while the 4070 Ti starts at £850 here in the UK. I went for the 4070 as it's enough, but I would have gone for the 4070 Ti if it was cheaper.
When the 4060 releases, I would love to see the power difference comparison between the 30 series cards. At this point I wouldn’t be surprised if 4090 would be the 3090 and the 4060 would be comparable to like a 2080ti.
So glad I got my 3080 12gb model last August for 800 after taxes and shipping costs to outside of the US. This is getting ridiculous. It took me 12 years to upgrade and I feel I made the best choice at the best possible time. Edit: Was coming from an EVGA 970 (got that one borrowed after my HD7950 burnt out) and an FX8320.
Yep me too. Didn't listen to the warnings about its 8GB vram buffer. I'm ok with the fact that every year new games will demand more GPU power so framerate drops a bit but lowering textures? ...nope.
Nvidia is smoking meth. This was *supposed* to be the 4070 Ti (as the current 4070 Ti was originally the "4080 12GB"). A 60-class GPU priced as an 80-class... Weaker than last gen in terms of raw power; "But guys, it's better with DLSS3, which we locked behind firmware of our new gen!"
Thanks for taking the trouble to do things like this Jay, it really helps a lot of us and is really appreciated. Even though I have already purchased a new graphics card recently, I still found this very useful to help warn my friend not to waste his money when buying a new graphics card for his son on a limited budget. It's not right to slam people who can just afford something decent with such a huge performance gap.
@@raresmacovei8382 You can't compare core counts between generations, what... You compare performance. It's like comparing bus speeds when they increased cache by 3x this generation (meaning the bus isn't needed as much with local cache).
I've been nvidia and intel for a long long time. Recently, I picked up an amd cpu and I was blown away by performance and lower wattage all around. Looks like I'll be going with AMD for my gpu next aswell.
I just got a Sapphire 6950 XT Nitro+ last week. I never had an AMD card before. This card is a beast... runs everything at high speed. The only problem is the heat it generates (280W), but with optimised airflow it should be ok.
I think the 4070 is exactly where Nvidia originally planned it to be, and that there was supposed to be a 4070 Ti between it and the 4080 12GB. When that got renamed to the 4070 Ti instead, they basically dropped the middle card; I'm betting they already had a lot of the coolers, cards etc. built for the 4070, and the middle-card Ti variant was to be made later. I could see it coming up as a 4070 Super tbh.
nVidia is really just causing themselves extra headaches by compressing their naming scheme. Why does every usable card have to be either 60, 70, 80 or 90? Why not 4010, 4020, 4030, 4040, 4050, 4060, 4070, 4080 and 4090? Then push the Ti models as refreshes as they used to do.
I was going to get the 3070 too, but I chose the 7900 XT and my god I'm glad I did; Atomic Heart used 14GB of VRAM while maxed out, 165Hz, not one dropped frame.
Wish I would have watched this before building my pc. I was going for the 4080 or 4090 but ran low on money, so I went with the next one down because it was half the price of the 4090. Lesson learned! I will just have to replace it when I get the money.
I agree with every point made in the video. However, to be fair, a 7.4% performance increase from overclocking is actually pretty decent, especially considering this is an FE card. The 4070 would have been a very good deal at $450-500.
Still not a good deal. Decent at best. The "4070" is basically just a renamed 4060 which should at best have a price tag of $400. And ask yourself, would $400 be a good price for a 4060?
@@CanIHasThisName No you don't. That is what Nvidia wants you to believe. You just need to take R&D and the production cost into account. And suddenly (as the R&D cost of the graphics division hasn't changed significantly in 5 years) it is not that good of a deal anymore.
If you're disappointed now, just think about next release season. Their best offering will be a 4080 at best, I feel. Nvidia has turned its back on what helped build them up; same for AMD. My hope is that Intel keeps their general consumer GPU lines going. I like what they've got so far, and hope they can better it next time.
This card is already just an overclocked 60 class card so it's not like there's any reason to expect much more in the tank. The '12GB 4080 / 4070 Ti' is the real 4070.
This reminds me of the mini Apple PCs, where they soldered in the RAM so you couldn't replace it with better RAM, forcing you to pay a lot more for higher and faster gigs. I am truly despising these dirty tactics.
I didn't know there was such a gulf between the two, its insane. I was lucky enough to find a 4070ti for $635 open box at my local Microcenter, Upgraded from a 2070 super. So It's a massive upgrade for me, but I would not have bought it at MSRP.
I'm going with the RTX 4070 because it's the only card that doesn't draw more than 200W, has a single 8-pin connection, isn't the size of a cinder block, and gets exactly double the frame rate my 3-year-old RTX 2060 gets on mature drivers. I'm not sad and I won't regret it.
I believe the 4070 was always supposed to be the 4070, and they just renamed the 4080 12GB to the 4070 Ti. The main reason is Nvidia often released non Ti cards, and then later the Ti cards. In other words, originally the "4080 12GB" was going to stay called that, and eventually a "4070 Ti" was going to be slotted in between.
Buyer's remorse is real 😢 Though I have upgraded from an Asus RX 480 8GB OC to a Zotac 4070 Trinity, I'm really starting to think I should have waited a few extra months before pulling the trigger. It's not like I haven't waited 6 years to upgrade. Though I must admit, going from a lower tier to the 4070 has really impressed me, but apparently I was small minded.
If I am remembering correctly, there was a significant gap between the 3060 and the 3060 Ti? I was actually hoping for that same gap with the rest of the 40 series, cause a 10% gain does not seem to be worth jumping for.
It was a similar gap, yes. That Cyberpunk scenario for example: 81 vs 104 is a 28.4% lead for the 4070 Ti. 3060 is 39 fps, 3060 Ti is 52 fps. That's a 33.33% lead for the 3060 Ti.
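The gap math quoted here is easy to verify; a quick sketch using the fps numbers from the thread:

```python
# Recomputing the leads quoted above from the fps numbers in the thread.
def lead_pct(fast_fps, slow_fps):
    """Percent lead of the faster card over the slower one."""
    return (fast_fps / slow_fps - 1) * 100

print(f"4070 Ti over 4070 (Cyberpunk): {lead_pct(104, 81):.1f}%")   # ~28.4%
print(f"3060 Ti over 3060:            {lead_pct(52, 39):.2f}%")     # ~33.33%
```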
It's disappointing because you're doing the wrong thing. What this card is amazing at is UNDERVOLTING. I'm literally sipping 165 watts of power with no performance loss. This card is amazing and great at many things: staying ice cold, zero noise, great performance per watt, and the ability to undervolt like an absolute CHAMP. Loving mine!!
Bro, how do you get zero noise? My Founders Edition RTX 4070 spins up 5 times per hour for 5 minutes each during office desktop use, which is audible through the whole room.
@@NoobProTV Wtf?! LMAOOO I got the cheap MSI Ventus, which costs the same as the default card, has 3 fans and a large (longer) heatsink. Mine never goes above 55C in gaming. It's ice cold with zero fan noise. I'd return that one and get the MSI Ventus, do yourself a favor. Also undervolt your card, seriously. They undervolt VERY well.
It's a 60 class card with 80 class pricing. This was originally supposed to be 4070 TI. Think about that. Nvidia wanted to sell this card as two steps above where it actually performed. And their original MSRP was probably going to be $750.
i think nvidia did think that they can reduce raw power but can use DLSS 3 as an excuse for higher prices like yeah it sucks but with DLSS 3! 50 series has DLSS4 then they are weaker than last gen GPUs in Raw power but double as good with DLSS
I literally spat my coffee out when Jay said it was always meant to be a higher tier card in the intro. WTF has he been smoking...
This 4070 was supposed to be the 4070 Ti. Did you forget the 4080 12GB? If Nvidia hadn't renamed that one to 4070 Ti, this 4070 was next in line and would have been named the 4070 Ti by Nvidia, with a $750 price tag. But the backlash on the 4080 12GB was too big and Nvidia got cold feet.
@@stevemoon2136 Nvidia wanted to sell it as a 4070ti. If they hadn't renamed the 4080 12gb, this would have been your 4070 TI.
@@luxaly9510 using DLSS 3 when you're paying $600+ is absurd.. Acting like frame generation is worth paying for when you should have native power at that price range is ridiculous
It’s almost as if the Ti was the 70 and the 70 was the 60. They didn’t get away with it when it came to the 80 but have just gone and done it with the 60/70. Nice bait and switch!
Everything from the 80 down is a tier too high in the naming.
Nvidia were definitely going to charge us $600 for a 4060. 🤡🤡🤡
Ding Ding Ding! We have a winner
You guys read what Jensen recently said about pricing? Knowing this, I am jumping ship as EVGA did. I may not be as fast in AI, or DLSS, but I am no longer going to ride Jensen's boomstick. YMMV, but expect this to get even worse with the 5k series. As long as that man is alive and CEO, this is only going to get worse.
@@generalawareness101 That's why I'm looking at 7900 XTX's and trying to figure which to buy (must be white or light silver to fit the case and motherboard I'm getting/have got), they look decent and a slight upgrade from my laptop 3080. I won't buy Nvidia after the scummy things they've done so far.
There has never been a less inspirational range of GPUs.
@travisclark2642 yea I'm also buying a 4090 next week!
Interesting never thought you would be interested in GPUs. Haven't watched your debates in a while.
Geforce FX has entered the chat.
The 4090 is at least fun to window shop. But the 70 and 80 are hot garbage.
@@BlackParade01 I can't even afford a 4090. 2000 euro here.
It is not a surprise that the 4070 is power efficient; I have never seen a 60 class card which is inefficient.
I dunno, for the performance of the 3060 it definitely isn't power efficient; a 6700 XT uses the same amount of power and outperforms it by a ton. Besides the fact that a 2060/2060 Super used far less power than the 3060, and they are barely outperformed by the 3060.
@@dustinharvey7394 The 3000 series was pretty power hungry; the 4000 series is definitely a leap in power efficiency and overall a decent leap in performance. It's just a shame that Nvidia decided to name and price them the way they did.
@@dustinharvey7394 source trust me bro.
@@Takisgr07 it's all information you can find online from multiple sources so what are you on about?
@@dustinharvey7394 that’s a node difference. Compared to the higher tier cards the 3060 was more power efficient, because it wasn’t pushed as hard.
If you look at it realistically the bigger cards are more power efficient if you just limit them because they have more cores…
Went shopping for a GPU upgrade yesterday, was thinking about the new 4070 for about 600-650 dollars. Left with an RX 6950 XT reference model, which is at $649 right now. Honestly, could not be happier. That was the best "change my mind on the fly at the store" I've ever had.
Ok but 335W vs 200W TDP/TGP, hope AMD will drop TDP for their next GPUs.
No "OK but" here.. you made a great choice!
@@SEGArianer it's the top end card from last gen....... Why would you compare tdp...? Obviously it's gonna be better on the newer and lower end model in the lineup... Weird.
My wife still rocks her Sapphire Nitro 6950 xt and laughs at these newer gen cards that can't keep up without fake frames. That being said AMD fake frames are coming lol
@@megadeth8592 Because of form factor and the power supply needed. The 4070 will fit in more cases and should not require a PSU upgrade. That's why you should not only compare fps.
The only disappointing thing is seeing youtubers NOT bashing the hell out of this rip off 4060 for 600 bucks tbh!
I don't think I have seen a youtuber not bash it lol. It's a joke, a 4060 with a 4070 badge and price.
every single review ive seen of it said it was poor
not enough.
Since the 4080 12GB bullshit, it's gone almost silent.
Most are being fairly critical. The only review I've seen sing its praises is Ars Technica, which is a written article.
Most reviewers, from the 5 or 6 I have watched, have said the "card is OK, but overpriced". They then usually go on to say how much better Ray Tracing and DLSS are, and point out power draw is low, tech is latest... Not really a bashing. Jay's video is a well constructed takedown, and well supported with data and methodology.
6950xt owner here. I switched from nVidia when I saw the 40 series pricing was simply unaffordable this generation. I’ve been nothing but impressed. The software is powerful and intuitive, the performance is next generation and the price was a steal. Give AMD a chance, you won’t regret it.
Is the high power draw not a problem? It has one of the highest power draws out there, and I'm concerned it might be a bit loud.
Same - never regret it so far.
@Gabriel They're quiet. The coolers are pretty good... I've got the reference model.
Put an AMD GPU into a 3D environment and tell me how it goes... Not everyone is into the gaming stuff.
@@Gabu_Dono It's pretty loud and power consuming, I have the red devil variant. If you have a really low watt psu, the 4070 is honestly a pretty good alternative
@@sturmtruppen-1916 what or which 3D Environment?
It's nice to see the community speak with their wallet. The card ain't doing too hot sales wise
Still not enough. These ridiculous prices need to go. The 80 Tis need to be at $699 and the rest will follow.
Nvidia will just make less cards to keep the price up lol
Where I live this card is pretty cheap and amd cards are really pricy so I might actually jump the gun and go for it but yeah, I’ll wait a couple months
Which is why Nvidia are reducing production instead of the price.
@@kenos911: try to get it used then, there ought to be some who want to move on to a better card
I may not have bought much EVGA stuff, but this is why we need them in the video card market. EVGA would have found a way to unlock it, forcing Nvidia to unlock voltage for all cards.
Remember EVGA left because Nvidia was starting to punish people for doing that sort of stuff and because they kept screwing over board partners
Honestly voltage has been locked since 1000 series. That voltage slider just allows it just a few more millivolts.
I have an EVGA 980 SC and am looking to upgrade right now; first I've heard of this.
I keep saying that this 4070 was originally meant to be the 4060 or 4060 Ti. What's making this so much worse was the MASSIVE increase we got in CUDA from 2000 series to 3000 series.
If you look at all of the other 4000 series and look at CUDA core counts generational growth:
1070 (1920 Cuda) to 2070 (2304 Cuda) 20% Cuda core increase
2070 (2304 Cuda) to 3070 (5888 Cuda) 156% Cuda core increase
3070 (5888 Cuda) to 4070 (5888 Cuda) 0% Cuda core increase
1070 Ti (2432 Cuda) to 2070S (2560 Cuda) 9% Cuda core increase
2070S (2560 Cuda) to 3070 Ti (6144 Cuda) 140% Cuda core increase
3070 Ti (6144 Cuda) to 4070 Ti (7680 Cuda) 25% Cuda core increase
1080 (2560 Cuda) to 2080 (2944 Cuda) 15% Cuda core increase
2080 (2944 Cuda) to 3080 (8704 Cuda) 195% Cuda core increase (this was such a big upgrade)
3080 (8704 Cuda) to 4080 (9728 Cuda) 12% Cuda core increase
1080 Ti (3584 Cuda) to 2080 Ti (4532 Cuda) 26% Cuda core increase
2080 Ti (4532 Cuda) to 3080 Ti (10240 Cuda) 126% Cuda core increase
3080 Ti (10240 Cuda) to 4080 Ti? (11800 Cuda?) 15% Cuda core increase?
TitanX (3072 Cuda) to TitanRTX (4608 Cuda) 50% Cuda core increase
Titan RTX (4608 Cuda) to 3090 (10496 Cuda) 128% increase
3090 (10496 Cuda) to 4090 (16384 Cuda) 56% increase
Future 4090 Ti, 18176 Cuda
I took a guess that 4080 Ti will be 15% increase over 3080 Ti.
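The percentages in this list are straightforward to recompute from the core counts; a minimal sketch (counts as quoted above; a figure or two may land one point off the comment's numbers depending on rounding):

```python
# Recomputing a sample of the generational CUDA-core increases from the
# counts listed above. Percentages are rounded to the nearest whole
# percent, so some figures may differ slightly from the comment's.
pairs = {
    "1070 -> 2070": (1920, 2304),
    "2070 -> 3070": (2304, 5888),
    "3070 -> 4070": (5888, 5888),
    "2080 -> 3080": (2944, 8704),
    "3080 -> 4080": (8704, 9728),
    "3090 -> 4090": (10496, 16384),
}
for label, (old, new) in pairs.items():
    print(f"{label}: {round((new - old) / old * 100)}% increase")
```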
It's pretty clear that Nvidia had no interest in giving such an enormous generational uplift again. They gave us an RTX 4080 that had just 12% more Cuda cores and 60% more VRAM than the RTX 3080 10GB FE, yet increased the price from $699 to $1199, or 72%. Nvidia expected to charge 72% more for 12% more CUDA cores and roughly 30% more non-RT/non-DLSS performance. When we went from the 2080 to the 3080, we kept the same $699 MSRP but got a 70% increase in gaming performance. The 4090's Cuda count was clearly grown simply to keep the performance at the top unattainable by anything AMD could launch. The 4080 is literally almost half of the 4090; it should have cost $799, which would have factored in inflation.
(edit: accidentally used the future 4090 Ti Cuda count instead of the 4090, fixed)
That's not really how CUDA cores work... They are not comparable generation to generation.
@@SamfisherSam They are pretty comparable within the RTX generations, meaning, they perform roughly the same function. I understand that going from RTX 2000 to RTX 3000, the huge increase in the CUDA core count was due to the FP32 ALUs being doubled for each SM. That doesn't change the generational horsepower discrepancy, which when excluding Ray Tracing, DLSS and some 4K loads, is worse than the pitiful ~20% increase we saw going from GTX 1070 to RTX 2070.
Nvidia is tiering this 4070 based on the minimum amount of GPU that they thought they could get away with while still being able to "sell" the story of large generational improvements, heavily reliant on DLSS 3 / Frame Generation, which is why all of their marketing materials discussing performance improvements over last gen reference only metrics that use DLSS 3, not raw rendering performance.
They took DLSS 3 / Frame Generation and decided to use it to sell us an RTX 4060 as an RTX 4070 for 20% more than they demanded for the RTX 3070, and 82% more than their MSRP for the ACTUAL previous generation RTX 3060 12GB. If Nvidia was making $30 gross profit on every RTX 3060 12GB (within industry ranges), then they are making closer to $300 in gross profit from the RTX 4070, and I don't know about you, but I'd personally feel pretty damned abused if I knew that a company decided to double their gross profit on a given product, but Nvidia has certainly more than quintupled their gross profit per unit on the 4070 compared to the 3060/3070.
Nvidia's gaming revenue exploded in CY 2021 (FY 2022) due to mining and morons paying over MSRP and price increases, then they were down almost 30% year over year for CY 2022 (FY 2023), yet net income (before a nearly $4B increase in operating expenses) remained nearly flat due to increased prices, yet they made sure they had enough to run an $11B stock buyback and $2.5B in Stock-based compensation expenses (exec bonuses). I'm getting tired of people defending what Nvidia is doing, they make incredible products, yet they've gone anti-consumer. The best we can hope for is another couple of quarters with 25%-30% YoY gaming revenue losses to help them recognize that they still need consumers, although with the AI explosion that is just starting to take off, I'm betting they go from $15B in data center revenue in FY 2023 to $25B or more in FY 2024, the gaming segment won't really matter at that point and profit margins for datacenter will double, possibly triple because they can charge whatever they want at this point.
very good information, thanks to you my knowledge about rtx40 series cards has increased dramatically.
@@racerex340 Wow. I heard that Samsung offered Nvidia half the price for their 8nm node compared to TSMC's 8nm node, so in a way they earned way more revenue, and considering the GPU price crisis, they made some serious profits. I think their focus was more on increasing the profit per GPU than on increasing performance. It is a very pro-investor move from a business standpoint, but their consumers are showing awareness now by not buying 4070 GPUs.
It was going to be the 4070ti but the whole two 4080s thing happened. That's all marketing anyway. Like all other generations new models perform around the level of the higher model in the previous generation, just like now. Marketing is not the tech.
The 4070 is a 3060 Ti replacement according to power draw; same TDP class. If it were 200 bucks cheaper it would be OK.
Yes 4070 only worth $399
But it ties the 3080 Ti and 3090 in some cases??
@@bloxoss each generation the 70 tier cards were at least 20% faster than the 80 tier from the past, this time it barely matches the 80 tier card, and sometimes loses to it. Since when is it 3090 level?
@@bloxoss No shit. If there's no improvement then why would people upgrade to the new generation? You do realize that the entire point of a new generation is to make people upgrade from what they currently have because the new gen is better. A new generation being faster does not justify a price increase, the entire point of a new generation is that it performs better.
@@username8644 Yes, and Nvidia's CEO promised to release new products every 6 months, not every 2 years like now! So over 2 years, Nvidia has to provide at least 2x the speed at the same price point, which is why the whole 40 series is overpriced!
Nvidia is definitely going to come out with a Super series. Probably with extra VRam if they keep getting backlash.
And I won't buy it. I'm so pi$$ed at NVidia that I sold my one card from them and bought an Intel to replace it. I lost a tad of horsepower, but NVidia needs a good strong dose of competition and loss of sales to help them get the message, and Intel needs some encouragement to keep making cards.
Aren't the "Super" cards similar in performance to their "ti" counterparts?
@@ScottGrammer Honestly, Nvidia does not care too much about consumer graphics cards anymore.
They are shifting over to AI, which is probably why they increased the prices so much for their graphics cards. If they can get much much larger margins from the same TSMC node by producing AI accelerators, why bother with consumer graphics cards ? Thus, they increase the price so much, that it is worth it for them.
As much as I hate the game ( capitalism ), Nvidia is doing everything right within its rules.
Don't hate the player, hate the game...
Nvidias Low end 4000 cards don't even have enough vram to load the textures let alone the new AI models coming. I feel bad for people who buy these.
More VRAM isn't possible unless they come out with 4GB-density memory modules... meaning a 24GB 4070.
So you’re saying nvidia is charging more, and giving less?? 🤯
Sure
You got that right, Skippy!
No, they're giving more for less. The 3080's price was like $1000 and up... the 4070 is $599, and uses at least 30% less power than the 3080. So how is it giving less? Yeah... less in terms of power bill.
The more you buy the more you save … say it with me
@@SayWhaaaaaaaaaaaaaaaaaaaaaaat We're talking about raw graphics power, not just saving power. Using TSMC is more power efficient but nothing special. And the 3080 was $699, not $1000. You just got scammed on that price.
7:57 this aged like fine wine
You can get 3080TI Performance at 140-150W (undervolted and overclocked) plus DLSS3 support. I like the card.
Love how honest you are. Nice to see not all youtubers are too worried about their relationship with Nvidia to be honest.
@@2leggedpirate265 In any relationship honesty is key. You're not doing anyone any favors by letting them ram you from behind.
2 legged pirate you are insane
Dog, he’s gaming on a 4090 in his free time. Think about that. He doesn’t really give a crap. He’s feigning outrage to try to relate to the everyday poor person.
@@93836 Just because you're needlessy cynical, doesn't mean everyone else is. Some people *do* care about other people.
@@2leggedpirate265 are you.. ok?
I love that the 4070 go more disappointing
ikr 😂😂
I hear this in Linus' overly high pitched voice: "LET'S GOOOOOO (more disappointing)"
The rumor come out: Does 4070 is more disappointing?
@@Pickelhaube808 The answers may answer your question!
@@corndog9482 I would say a 4060
I got lucky acquiring an EVGA 3080 FTW3 in the middle of all the madness a couple years back, and was hoping to get a 4070 on launch thinking it would be a huge improvement. Glad I don't need to spend that money now.
Deep down u know u want a 4090
Why would you think a 4070 would be a huge improvement over the 3080...
@@sevsnk3043well of course, we all do.
@@mtrx1708 historically the 70's of the next generation are much better than the previous 80s. I was hoping the rt cores would be much better and they really aren't
@@mtrx1708 Historically it was, but the 4070 is not.
This was much more informative than most review videos from launch day.
(not a critic of the work done during that time crunch, just an applause for the work done for this video)
I can't believe that Jay actually called the release of the 4070 Super lol 8:00
"I don't think nvidia is dumb enough to do that this time" 🤣🤣
It's worse when you consider that it was originally planned to sell for 750.
Was never planned to be sold at $750, that was a leaked memo saying "DO NOT EVER PRICE THIS OVER 750!" as a guideline for partners. It was never the MSRP.
@@CyberneticArgumentCreator Yeah, but I get what he means; any time they put a ceiling on a product, most of that product ends up at that price.
I believe the price is more an inflation issue than anything else... money is getting more worthless, and people nowadays aren't earning that much more than back in the day... a price is never a problem, but the bank account always is one :)
FOR HOW MUCH!?
@@LetsFixITJoe Inflation between 3070 release (2021) and 4070 release totaled 10%. That means adjusted for inflation, the 4070 should have been $549. At $599 that's double inflation not even counting the biggest problem ... it's not even close to the performance gains seen from both sides for all history. From 2070 to 3070 there was a 40% rasterization gain. From 3070 to 4070 it's only 27% despite the absurd price jump that we all know was supposed to be $750, aka a 50% increase. Compare that to the 980 ti vs 1080 ti. There was a 7.5% price increase and 100% performance gain. Screw Nvidia, so glad I bought AMD last year.
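The inflation adjustment above checks out; a quick sketch (the 10% total-inflation figure is the commenter's, and $499 is the 3070's launch MSRP):

```python
# Inflation-adjusting the 3070's $499 MSRP by the ~10% total inflation
# the commenter cites between the 3070 and 4070 launches.
msrp_3070 = 499
total_inflation = 0.10          # commenter's figure, not an official CPI number
adjusted = msrp_3070 * (1 + total_inflation)
print(f"Inflation-adjusted 4070 price: ${adjusted:.0f}")  # -> $549
```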
I commented on your last video that I thought this 4070 with its specs and performance looks like it should have been a 4060, everyone called me crazy, looks like I was right.
Especially since the 3060 has the same VRAM size and PCI bus bandwidth!
Screw NVIDIA, I’m looking at AMD and Intel’s offerings at the end of this year.
You're still crazy (in a good way), but you, sir, were correct.
I said the same thing. This is the one "series" in Nvidia's portfolio that saw ZERO CUDA core increase generationally. 1070 to 2070 increased, 2070 to 3070 increased (big time), 3070 to 4070 = 0%. Shit, the 2080 to 3080 was almost a 200% increase in CUDA cores. Even the 3090 to the 4090 was about a 56% increase, while 3080 to 4080 was a 12% increase.
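A quick sketch of that generational CUDA core math, for anyone who wants to check it. The core counts are the commonly cited spec-sheet numbers and should be treated as assumptions here, not authoritative data:

```python
# Generational change in CUDA core counts for the tiers mentioned above.
# Core counts are the commonly cited spec-sheet numbers (assumptions,
# not pulled from an authoritative source here).
cores = {
    "2070": 2304, "3070": 5888, "4070": 5888,
    "2080": 2944, "3080": 8704, "4080": 9728,
    "3090": 10496, "4090": 16384,
}

def pct_increase(old: str, new: str) -> float:
    """Percent increase in core count from the old SKU to the new one."""
    return (cores[new] / cores[old] - 1) * 100

for old, new in [("2070", "3070"), ("3070", "4070"),
                 ("2080", "3080"), ("3080", "4080"),
                 ("3090", "4090")]:
    print(f"{old} -> {new}: {pct_increase(old, new):+.0f}%")
```

With these counts, the 3070-to-4070 step is the only 0% increase in the lineup.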
We were getting less of a bump this time no matter what. Nvidia had always planned on DLSS 3 / Frame Generation with RT providing most of the generational uplift over the 3000 series, as evidenced by their marketing materials focusing only on RT/DLSS/FG numbers and NEVER on native rendering/rasterization improvements. The 4080 is really only a 30% increase over the 3080, which makes sense: the new node process gave them almost 20%, and the 12% increase in cores covered the rest. But with everything else going on, the crypto and now AI/ML booms, Nvidia is basically telling us they don't want to give us more than a 20% increase in frames per dollar over the original 3000 series.
@@michaelmaness5493 thank you, as my wife says it’s not often that I’m right 😂
@@racerex340 I have a 3080 Ti, so I'm not really looking to upgrade at all this generation.
However, I think, and I may be wrong, that people who know about this stuff are seeing through the BS of Nvidia giving us DLSS and frame generation instead of real hardware improvements in the mid to high tier range. The only exception this generation is the 4090, and the price of that card here in the UK from AIBs is around £2,000. So all the other cards exist to upsell it. Nvidia gets away with it because there is no competition in that tier, but I really think consumers should start buying AMD for a generation or two. They are good cards, not the best, no, but they can certainly get the job done until this BS gets sorted out.
"Nvidia isn't that dumb."
True. Their customer base just might be.
Their customer base *IS* that dumb. Anyone who didn't see this happening years ago is blindly chugging Koolaid.
AMD isn't much better in some regards, but their cards have longevity and better performance per dollar.
@@GrizzAxxemann Amd isn't better in any way. Their cards are always bad at launch until at least a year later due to their shitty drivers that they still can't seem to get control of. And all they do is match Nvidia's pricing and then lower their price a little bit to make up for their instability with their cards. Anybody who's bought a GPU since the 20 series is to blame for this. There hasn't been a single good deal since the 10 series and people keep buying cards, it's ridiculous. Too bad most of the population is stupid because the rest of us have to suffer due to their poor decisions.
@@username8644 It's a competition issue, not a customer issue. PC Gamers are enthusiasts. They don't care about cost as much as they do performance...if cost was the primary concern they would be console gamers and not pc gamers. The problem with this gen is that the performance does not even justify the cost, which is why sales are shit...make no mistake, if the costs were the same as they are, but performance for each card was 20% better, these cards would be sold out at these prices.
I guarantee you when Nvidia releases next gen, they will keep the same pricing scheme, but the cards are going to be sold out because the performance gains that are missing right now, will be there in the next gen.
@@poofy0121 Customers are as much the issue as the companies right now. They are so desperate to play games at 4k 200fps instead of just settling on 1440p gaming with a few settings in game settings turned down from ultra. You don't need an rtx 4090. My 1070 ti still works great to this day and is totally usable even with my 3440x1440p monitor. An rtx 3080 is more than enough for anything gaming related. Yet these people think they need more powerful GPUs and keep giving money to companies that are scamming us. They are the reason the companies keep scamming us with every generation.
@@username8644 if you're running old triple A games sure i guess
It's funny to watch this 6 months later and the announcement of the super series of 40 series cards... LOL!
I was just thinking the same as I got to that part LOLOL
I'm sort of glad that I went for a 6000-series AMD card (6800 XT), as on paper my card performs about the same as the 4070 most of the time, give or take, except of course in RT performance, which I don't think anyone really uses (at least I don't). Also, having paid a lot less than a 3080 or 3080 Ti costs, I'm very happy with my card and its performance.
If you've seen any of the UE5 games, RT is about to become far more important, particularly in the AAA space (search Unrecord for where we are heading). AMD MUST improve their RT capability, as they are lagging in third place. RT/PT will be standardised in game engines in the next two years, given Nvidia's market share. Intel have already outdone AMD there, and their next gen cards could really damage Nvidia's market share at the all-important low-to-mid range.
Source: trust me bro. As every AMD buyer tells us, "I don't use RT, so I assume no one else does" 😂😂
@@Takisgr07 Up until now it's a valid argument, as RT is hardly ubiquitous. However, every next gen engine is going to have some version of it baked in. You still don't have to use it, but I don't think AMD can afford to be in third place, especially if Intel's Battlemage cards continue to outclass them in that department.
"I don't think Nvidia is stupid enough to do that..." - Jay underestimating Nvidia.
"Hold my beer" - Nvidia
they arent stupid. they know exactly what they are doing, but if people keep buying from them then why stop?
They're not stupid, just greedy.
Also Jay,... "I don't know what Nvidia is capable of anymore"
@@zhardy323 It's not Nvidia who are the stupid ones - the stupid ones don't want to admit it though but still keep opening their wallets(cuz they "need" a new gen card of course)...
Everything i know about this card convinces me this is a 4060 class card.
yeah, especially the fact that it has exactly the same number of CUDA cores as the 3070, and they have NEVER kept CUDA core counts the same generationally, always increased except in this one 3070 to 4070 case.
"The 4070 just go more disappointing..." was the original title before they changed it.
i just noticed you were right hahahaha good catch
Not changed after 41 minutes still
Oh, good to know. I just got my Strix 3080 OC and was worried I should have gotten the 4070 instead, but the 3080 was only AUD$600 (USD $380-400ish), so of course I couldn't pass that up. Looks like I made the right choice.
Given the usual prices in Aus, the 4070 would be, what, $250 - 300 more?
@@steveleadbeater8662 That's about right. As of today the "4070" is AUD$999 and the "4070 Ti" is between $1200-1300.
The tale of why EVGA left the GPU market: Jensen became gamers' enemy, pretty much.
Never have I been more certain that my next GPU will be from AMD.. even if it has to be a used card.
If you aren't shy about open box, newegg was selling 6950xts open box for $569 just a few days ago. I ordered one.
Nah, even AMD is overpricing their cards because of NVIDIA. No gpu company is your "friend" when they're clearly working with each other's interests. Just wait for the next generation and stop giving these companies your money
@@hhkl3bhhksm466 What makes you think the next generation will be any different? One thing is for sure, the next gen will be more expensive.
@@hhkl3bhhksm466 The next generation will likely be even more expensive. While you're right that no GPU company is your friend, I tend to want to go with one that's at least less my enemy. So I'm also looking hard at a RX6xxx card to be my next, probably a RX6950XT.
@@Calamity_Jack Wait just a couple more months. AMD is about to release their new 7800 XT, which is the competitor to the 4070. It also uses RDNA 3 and is significantly faster than the previous gen.
The skip method worked for the 2000 series. If a large chunk of consumers do that as well then Nvidia may be forced to come to their senses for next gen.
I think what's really hurting Nvidia is the $399 PS5 console.
I'll believe it when we see it.
@@chilpeeps The PS5 has definitely given disappointed NVIDIA customers another option but NVIDIA's fundamental problem in the PC Gaming market is self inflicted.
Not going to work. NGreedia would sooner keep the pricing next gen and push gamers to buy their subscription service to use network GPUs.
NGreedia couldn't give less of a shit about the gaming community or how 'affordable' their adult Legos are.
They are selling now mainly to people who use AI, stuff like that. Productivity related activities. Who do not care how much it costs, because they either make money with it or universities are paying for them etc. So as long as they don't have a competitor(AMD is not a competitor in that, at all), they can set whatever price they basically want. I hope I'm wrong, but I think I'm not.
I'm starting to think that my spontaneous purchase of a used 3090 was a good deal.
Yeah, I went from my 3070 FE 8GB to a 3090 Ti 24GB on a spontaneous "oh, that looks like a good price" whim just before they announced the 40 series, and after a brief "maybe I should have waited", now I'm kinda glad I got that 3090 Ti.
Tbh, buying anything current gen seems like a rip-off unless you can afford the 4090.
@@Kinvarus1 I went from a 1080 Ti to a 3090; that was a huge gap :D The auction listing said "lightly used for simulation project", so it was surely mining for the past 2 years, but I tried to murder it with Kombustor and other programs and it passed with no problems. Not the best bin considering max clocks, but undervolted it still performs nicely enough. I even made a 3D-printed shroud for it with three 92mm Noctuas, so it maxes out at 62C in near-perfect silence.
@@vroomzoom4206 The 7900 XTs are starting to look a bit better now that prices are coming down; at $700-750 they'd be pretty solid.
I bought the 3090 at a higher price than MSRP (though not that much higher) and was kicking myself for wasting money, but all in all I think it was a good purchase. With DLSS etc. there's no real need to upgrade for a long time now.
just got the 4070 fe for my mini itx. It is actually a pretty good card for a mini itx build. it is quiet, cool, efficient and small. However, if you are not building a mini itx then it has pretty bad value.
lmao, 7 months ago this guy called out Nvidia's plan for a Super 😂😂 love the vids brother
You thought the 4070 was a higher tier card that they limited? Well that's certainly a first I've heard, most people strongly believe this is a lower tier card that boosted "the number" to get more money out of it, just like they tried to do with the 4080 12GB edition
3080 12gb is a great card.
If you can buy it for 3080 10gb price.
Closer to Ti performance than the 10gb.
@@RacingAnt Sorry, typed wrong, I mean the 4080 12GB edition. Mind gets a little mixed up when thinking of generations of screw jobs Nvidia has been releasing.
@@Mike__B that makes a lot more sense now 👍Yeah, the whole 40 series has just been a long line of NVidia treating their customers like 💩
People who believe that aren't doing the math right. If there originally was a 4080 16GB and a 4080 12GB, and Nvidia had gotten their way with no complaints, they had to have had a card lined up to be the 4070 Ti. It should have been this card (the one now named the 4070). After they bumped the 4080 12GB down to be the 4070 Ti, they had to bump down all the cards below it as well.
@@hastyscorpion That's an interesting take, and certainly a feasible one. However, another theory is they simply would not have released a 4070 Ti at all, and the 4070 would have been the only *70 card at launch; after all, except for the 3060 Ti, all the other "day 1" cards were non-Ti variants (or Super, in the case of the 20 series), with the Ti versions coming later. Then Nvidia could really push the idea that those super-pricey flagship products are well worth the money: just look at the difference in performance!
That said, the 4070 Ti wasn't a limited 4080; those cards were already produced by everyone (Nvidia and the AIBs), already boxed and ready to ship to retailers, until word came down from on high. Then boxes got changed out, and any nameplates or stickers on the cards that said "4080" were removed and swapped for 4070 Ti naming.
I had to rebuild my five year old system at the beginning of the year. Went from a 1050ti to a 3060. So far everything I've seen in the past few months tells me it was definitely better to get the 3060 rather than wait for a 4070 or any other 4000 series card. I'll be fine for the next few years when hopefully the graphics card market will finally fix itself.
If I upgrade from the 3060 Ti, it'll be to a 6800 or 6900 XT. Prices will only keep rising thanks to the warmongers and the countries that just keep printing money.
@@mikepatrona472 AMD is not your friend
It's not that the 4070 was destined for more; it was destined for less! It's the actual successor to the 3060 Ti. So yeah, there SHOULD be at least one extra card between the 4070 and 4070 Ti.
Nope the 4070 is really a 3060 successor.
Nope, they upsold all the cards by one tier.
4090 should be the 4080, the 90 series is a sham to make the flagships cost more.
4080 is the true 4070.
4070ti is the 4060ti.
4070 is the 4060.
The sorry excuse of a card that the 4060 will be is actually the overpriced entry level 4050.
Plz no! Don't give NGreedia any more ideas!
More like a 4050.
@@guilhermelopes986 The 4070 Ti's performance with 16GB of VRAM should have been the real 4070, given that the 70 card always matches last gen's top-tier performance. Hence the actual 4070 is more like a 4060 Ti (the 3060 Ti and 2080 were basically the same).
As someone who's been using this graphics card for over a year, I've got to say that although the value of it could be better, the performance is pretty good for my standards. I do not have older GPUs to compare to, but what I did have was a laptop with a 3050 TI in it. As we all know, laptop GPUs are much weaker than their desktop counterparts. So a 4070 was definitely a massive upgrade from a 3050 TI laptop GPU. I will probably keep using this GPU until it just stops working at which point I will hopefully have enough money saved up to buy something better.
I bought a PNY 4070 and have no issue adjusting the voltage or any other settings. My 3DMark Time Spy score is over 17,000. Maybe it's just the Founders Edition that's locked down. I have photos if you want them.
Now imagine how great the 4060 will be...
64-bit bus, PCIe 5.0 x2, 6GB, and NVENC disabled, for $469... it'll be 3050 speed on any system with less than PCIe 5.0... I mean, I'm just plucking that out of thin air, don't wanna give 'em ideas..
😂 It's going to be pretty awful
You mean the RTX 4050, right?
@@racerex340 no
@@ferdgerbeler8494 thats more likely going to be amd's for its 6500xt replacement...
I was thinking of getting this GPU... no thanks. I'm going to stick with my 1080 Ti and get a 6950 XT or 7900 XT; both are looking real sweet now. I've never tried AMD before, but it's looking better and better.
I prefer AMD Adrenalin over GeForce Experience. The 6950 XT is a good card; I haven't had any driver issues.
I guess I'll keep my 4070 Ti then. I was feeling a little guilty for spending $900 on it (Gigabyte Aero/White build), but since you can build a whole civilization in the valley between the 4070 and 4070 Ti, I guess it'll be worth it, lol.
Best price/performance ratio! Good choice!
@@RaptorFPV It ended up crashing games, so I returned it before my window closed. Back to deciding what to get again. The 7800 XT is looking like a good candidate now.
@@Clint_the_Audio-Photo_Guy Crazy driver issues with the 7800 XT that I just got; returning it to get a 4070, tbh. Windows Update loves to replace the AMD drivers with its own, which breaks your drivers. You then have to constantly roll back the driver or re-install it.
@@EliasOwnage95 There's a video on how to fix that windows update issue. That's a windows problem not a GPU problem.
@@Clint_the_Audio-Photo_Guy I know it's a Windows issue. I tried literally every fix and nothing solved it, don't want to risk getting past being able to return it and endure future driver bs. I saw it as paying extra for something that works with no headaches.
@8:00 Foreshadowing.
Hey Jay and the Two Cents team!
Big fan of the channel for years, thank you for the videos. I've been poring over them over the last few days, the reason being that I'm being asked to travel to the US for business and I'm going to take the opportunity to visit Micro Center for the first time so I can upgrade my PC!
So, the reason I write here is;
I built my PC about 5-6 years ago, and have tipped away at upgrades over that time, upgrading the cooler, PSU, storage etc...
Originally, I had thought to do the same and was going to buy a 4070ti, but, my i7 6700k probably isn't going to cut it...
So I'm looking at a new motherboard, CPU, RAM & GPU that give me a little flexibility to upgrade later.
The problem I'm seeing online is bottlenecking, compatibility, power delivery, and most importantly, bang for buck.
I've saved about $1,000 for this, but I'm getting such great deals in the US (compared to home, where the 12600K is $300-350) that I'm tempted to use the credit card to bump up a choice or two (like the 13600K). I'm nervous that will balloon into unnecessary purchasing, though. Could you please cast your eyes over my build and nudge me in the right direction?
Any help at all would be most appreciated!
i7 6700k > i5 12600k
NZXT Kraken x62
Asus z170 > z690 DDR5
16gb DDR4 3200 CL16 > 32gb DDR5 5600 CL40
Samsung 970 EVO Plus 1TB + 3TB Toshiba p300
EVGA FTW 1070 > MSI Ventus 4070ti
EVGA Supernova 850w Gold
NZXT s340 Elite
I use Photoshop, Illustrator & Premiere for work and game at 3440 x 1440 144Hz
Thanks in advance!
The 13600k is almost 28% faster (all core load). That's worth it in my opinion. Single thread only gains 5% though.
Buy what you can afford Robert, don't go loading yourself with debt for an extra 10fps. I'll get savaged by the fanboys but if you are using the GPU for anything other than gaming I'd always go NVidia so I'm not going to advise getting a last gen AMD card, even though they are much better value. I'd also consider sticking with a DDR4 board to pick up the 13600 within budget.
I wonder how it'd behave with a shunt mod, or other ways to get around the power limit
I think what's illustrated best here is that NVIDIA straight up thought that the 4080 12GB scam was gonna fly, mainly due to that large performance gap.
I think they originally planned to have the 4070Ti be the 4070, and the 4070 be the 4060, but then figured they'd be able to milk the consumer by doing a tier split on the 4080 class thinking nobody would notice. Gotta make up that mining money somehow. ¯\_(ツ)_/¯
I am stunned that nobody gives this card any points for performance per watt (or silence)... it dominates that category. Is the influencers' target audience kids who don't care about energy bills, or what? I went for the 4070 solely so I could keep using my 600W passive PSU, undervolt the GPU, and have a very silent rig that consumes like a 2015-era PC while still getting great 1440p ultrawide performance.
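Performance per watt is easy to quantify if you have average FPS and board power from a review run. The numbers below are purely illustrative placeholders, not measured data:

```python
# Illustrative performance-per-watt comparison. The FPS and board-power
# numbers below are hypothetical placeholders, NOT measured results.
cards = {
    "RTX 4070 (hypothetical)": {"fps": 100, "watts": 200},
    "RTX 3080 (hypothetical)": {"fps": 100, "watts": 320},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.2f} FPS/W")
# Equal frame rates at a much lower board power mean a clearly better
# FPS-per-watt ratio, which is the commenter's point about efficiency.
```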
These YouTube channels, while giving out decent info, also have to use hyperbole etc. to get views. I gave my 3070 to my son and ended up getting a new PC with a 4070. I wouldn't usually upgrade by one generation, but I wanted my own PC and not to share my son's. Seems like a decent card to me: a solid improvement over the 3070, better performance, better DLSS, frame generation, and very good energy consumption... and it will also benefit from future updates.
So, here's a crazy thing about this card. I bought a 4070 Gigabyte Eagle OC a couple of weeks ago (replaced my 1650 Super OC), and I am absolutely blown away. Now, keep in mind, everything else about my computer, is nearly 5 years old at this point; bought at the same time as my old 1650 Super OC. So I'm running a Core i5 9400, and 32GB of DDR4 3600 ram. Both slow compared to today's standards, and only low/mid tier. But yet still, I can manage to play Warzone 2 at 4K (I have a Gigabyte Aorus 48" 4K 120hz 'FO48U' monitor that I also bought at the same time as the 4070), and still average out at around 100FPS! On an i5 9400 CPU!! 😂I'm definitely CPU limited, but man, 100+FPS on Warzone at 4K(!!) is just insane. I'd average like 30FPS on the 1650 Super on this very same computer; nothing else has changed. I'm super happy with my purchase; although the price was definitely up there...
At this point these commenters are just crying babies. I decided to buy the 4070 for the price/performance/power/availability combination. Of course, if Nvidia dropped the price another 50 bucks or more, that would be even better. In my country AMD cards aren't cheap; these ivory-tower YouTubers don't really know what they're talking about. Their eyes are only on the US market 😂
No they're not very interested in energy usage.
As for silence, some say most gamers use earphones, so they don't care.
I undervolted an RX 6700 XT and trimmed the max frequency to run cooler in summer, with similar success at 1440p.
Most of the most interesting games aren't the GFX-intensive AAA types anyway. For the few that are, AMD's game profiles made it easy to push the GPU harder.
Gonna be honest, I bought one of the 4070's, but I got the Asus non-OC'd one, not the reference model.
I'm pretty happy with it, but I do feel it could've been a bit cheaper. Like $50-$100 cheaper.
And before anyone says "Why didn't you buy X instead???"...
1) It was going in an eGPU box, thus limited space. Otherwise I'd have just ripped the 3080ti out of my desktop.
2) Power usage. Electricity prices are nuts here in the UK, so I'm trying to limit power usage where I can (also why I don't use my desktop much right now with its 3080 Ti).
3) What I use my rig for really benefits from CUDA cores to speed up rendering.
I'd be very interested to see if there's a way to unlock the voltage to get it closer to the wattage it can pull max... I think it's 225W from the 8-pin and the PCI-E slot? Might give it a nice boost.
It seems like skipping the 4000 generation altogether is the right move financially. I'm building a new PC now and found a lightly used 3080 for $400 - it should be plenty good enough until 5000 gpu's get released!
Thinking on the same lines but getting hold of a low mileage 3080 at that price would be optimistic to say the least . Decent 3080's in the UK at $400 equivalent, is like trying to search for Unicorn Poo but much harder, harder to find that is, not harder Unicorn Poo which is even harder to find ( didn't want to get things confused, that's Nvideas job.)
1080Ti still running strong, but getting old and slowing down quickly!
I usually build a new top-of-the-line PC roughly every 7 years, and it's time to do so, but the prices on graphics cards and mobos are just atrocious.
Also, Intel's next CPU will use a different socket, so building a 13900K now is just silly for future-proofing.
Between the prices and the tech changes happening, building a new TOTL PC just sucks right now.
Also, what games?
I also have a 1080 Ti. I hit it with a decent OC, but it's feeling its age now. Hoping to push it until Intel Battlemage.
Could I ask why you would take a 3080 over a 4070? They're still selling for more than the 4070 new anywhere I see, and are about the same on performance, with a higher wattage.
@Something Diabolical The 3080 that's close in price to the 4070 is the 10GB model though, so it has less VRAM than the 4070.
I wondered the same thing, but then got WAY more confused when he said he would get a 3070 Ti over a 4070. That makes absolutely no sense to me.
Why would you get something slower, with much less VRAM, much higher power, and fewer features, for the same price?
The 3080 has more raw performance than the 4070. The 4070 has to use its gimmicks to match it. Especially at higher resolutions, the 3080 beats it in most games. I'm sticking with my FTW3 3080.
@@jedpratte Every comparison I've seen had them trading blows, but mostly the same. I could see the 4k maybe, but I only 1440p game anyway. But totally, if you already have a 3080, there would be zero reason to upgrade. I have a 1080 though, and had gone back and forth between the 3080 and 4070.
@@umbles7007 Yeah, the ones I saw showed the 3080 winning at 4K; the rest is similar performance. That's kinda sad, as usually the 70 series would trade blows with the previous 80 Ti.
Try the newest version of Afterburner (4.6.5). It has added support for 40 series.
0:25 🤣that was me about 2 months ago. I built a rig with 14 Corsair fans and 3 iCUE commanders. Took a while to get the cabling organized. Makes me curious as to what they have coming! 🤔
The 4070 Ti was supposed to be a watered-down 4080 with less memory. This was already confirmed. They changed the labeling last minute when consumers complained, which threw the whole lineup out of whack.
4090
4080
4070ti (4080 12GB)
4070
There was something I saw with performance on another channel:
The previous gen's 80 series becomes the new gen's 60 series, e.g. the 1080 and 2080 became the 2060 and 3060.
This 4070 mirrors the performance of the 3080, lending more evidence that it should have been a 4060 card.
"the worrying thing about this is that there's almost room for them to slide a Super card in there"
This comment aged very well
The 4070 is so gimped it even got rid of the 't' in 'got'.
They want to push you to get a 4070ti if you want more t's
The "t" is a DLC.
I worked my ass off and bought a 4080. Told myself to ignore the price tag and just keep saving. I couldn't be happier performance-wise; the card is a beast. But not everyone can pay that much, in fact most can't, which sucks.
I really have no complaints about my new 4070. That being said, I upgraded from a 1080, and just let me say: wow, it's the difference between night and day for me. I don't really understand why people complain so much; it's good for what I want to use it for.
Yeah, the 4070 looks bad next to the 4070 Ti: about 20% more performance for 25% more price. Yes, this comment is on a video that is 5 months old, but when you're talking price-to-performance, per dollar the 4070 is better. It's not really a fair comparison when reviews don't take prices into account. If price is no object, the 4070 Ti is better; if price matters, you get more bang for your buck with the 4070. With GPU prices the last 5 years or so, price is a huge selling point.
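The value claim above can be made concrete with a quick calculation. The +20% performance and +25% price deltas are the comment's own figures, treated here as assumptions:

```python
# Value comparison using the comment's own deltas: the 4070 Ti is taken
# as +20% performance for +25% price (the comment's figures, assumed).
perf_4070, price_4070 = 1.00, 1.00   # 4070 normalized to 1.0
perf_ti, price_ti = 1.20, 1.25

value_4070 = perf_4070 / price_4070  # performance per (relative) dollar
value_ti = perf_ti / price_ti

print(f"4070:    {value_4070:.2f} perf/$")   # 1.00
print(f"4070 Ti: {value_ti:.2f} perf/$")     # 0.96
# Under these numbers the Ti delivers ~4% less performance per dollar.
```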
I have an RTX 4070 as well, but I can adjust the voltage slider just fine. I have mine running at 3050MHz, so that's a 150MHz overclock on the core clock and 1000MHz on the memory clock. No stability issues.
That’s crazy bruh you paid $600+ for a 192bit bus 😂
It's a Founder's Edition bruh. Get help.
@@BlackJesus8463 They're all using the same boards. Plus you get help.
You should have gotten a 6950xt
@@joegreezy It's my money; I'll use it how I please. I'm not paying $1,000+ for a 4070 Ti. $650 was my max budget, and I was looking for a GPU with 3080 performance.
Sounds like I'll be sticking with my 3080 for a while longer.
you absolutely should, for a LOT longer...
For sure, we have a 3080 & 3090 in our house and they are going nowhere
Me too with my second-hand 2080 Ti. Was planning on getting the 3080 before its price jumped and now it just makes no sense to upgrade to it. The 40 series might as well not exist until the 4080 sells cheap second-hand.
@@ashleyobrien4937 100%
I'm always on the move and really enjoy GeForce Now. But, decided not to go for the 4080 subscription as I did not see the point of it, considering that I can game in 4k 60FPS DTS with the 3080 stream subscription. Glad I was right.
The RTX 40 series (with the exception of the 4090/4080) is basically the RTX 20 series: minor performance boosts in exchange for a new early-beta feature, while charging a major premium over the previous gen.
yep, and screw that...
The performance difference is massive, but the price difference is massive too. Checked Amazon earlier today: the cheapest 4070 is £560 and the 4070 Ti starts at £850 here in the UK. I went for the 4070 as it's enough, but I would have gone for the 4070 Ti if it were cheaper.
When the 4060 releases, I would love to see the power difference comparison between the 30 series cards. At this point I wouldn’t be surprised if 4090 would be the 3090 and the 4060 would be comparable to like a 2080ti.
yeah, a 2080ti with 8gb of vram. but hey, DLSS BRO! DLSS IS AN AMAZING MIRACLE!
So glad I got my 3080 12GB model last August for $800 after taxes and shipping costs to outside the US. This is getting ridiculous. It took me 12 years to upgrade, and I feel I made the best choice at the best possible time.
Edit: Was coming from an EVGA 970 (got that one borrowed after my HD7950 burnt out) and an FX8320.
I overclocked mine works fantastic
😂😂😂😂 Jay's "slimy mother" ending was priceless and hilariously funny
I screwed up getting the 3070, but again this was in the mining boom.
Yep me too. Didn't listen to the warnings about its 8GB vram buffer. I'm ok with the fact that every year new games will demand more GPU power so framerate drops a bit but lowering textures? ...nope.
@@RobertFromEarth Lol, im with ya
Nvidia is smoking meth. This was *supposed* to be the 4070 Ti (as the current 4070 Ti was originally the "4080 12GB"). A 60-class GPU priced as an 80-class... Weaker than last gen in terms of raw power; "But guys, it's better with DLSS3, which we locked behind firmware of our new gen!"
Thanks for taking the trouble to do things like this, Jay; it really helps a lot of us and is really appreciated. Even though I recently purchased a new graphics card, I still found this very useful for warning my friend not to waste his money when buying a new graphics card for his son on a limited budget. It's not right to slam people who can just barely afford something decent with such a huge performance gap.
I'm a 71-year-old, Jay. I've watched these videos for 100 years... well... I forgot how many years. Thanks for the great content.
I just bought a EVGA FTW3 Ultra 3080 for $700 CAD ($515 USD) used. It has maybe one year mileage on it. Very content with my purchase.
Jay has it backwards, the 4070 is a 4060 class card they are upselling you lol
It's a 4050 class of card. The 4070 Ti was 4060, 4080 was 4060 Ti.
@@raresmacovei8382 by your logic a 4050 should match a 3080? What
@@RayanMADAO No, going by core count vs 4090.
@@raresmacovei8382 Yeah...No.
@@raresmacovei8382 You can't compare core counts between generations, what...
You compare performance.
It's like comparing bus speeds when they increased the cache by 3x this generation (meaning the bus isn't needed as much with the larger local cache).
I think this has to do with the 4070ti originally being the 4080 12gb, so it's higher in performance as it was categorized as an 80 card.
I completely agree. Not sure why this was not mentioned
I've been Nvidia and Intel for a long, long time. Recently I picked up an AMD CPU and was blown away by the performance and lower wattage all around. Looks like I'll be going with AMD for my GPU next as well.
Same. Deciding between the RX 6950 XT and RX 7800 XT.
And which AMD CPU did you pick up that blew you away, which you forgot to mention?
Open test bench looks great! Hell, it all does. Awesome job, guys.
I'm so happy I didn't wait for this. I got a used 3080 Ti with 12GB of VRAM for $600 USD.
The temps are perfect
I just got a Sapphire 6950XT Nitro+ last week. I never had a AMD card before. This card is a beast...runs everything at high speed. The only problem is the heat it generates (280W) but with an optimised airflow, it should be ok.
Yeah, and it's good only for games; no 3D, no nothing.
@@denisbaz6682 I wouldn't know. I only game with it.
I think the 4070 is exactly where Nvidia originally planned it to be, and that there was supposed to be a 4070 Ti between it and the 4080 12GB. When the latter got renamed to the 4070 Ti instead, they basically dropped the middle card; I'm betting they already had a lot of the coolers, cards etc. built for the 4070, and the middle-card Ti variant was to be made later. I could see it coming back as a 4070 Super, tbh.
My thoughts too while watching this. The extra-wide performance gap is a result of the 4080 12gb "unlaunch".
nVidia is really just causing themselves extra headaches by compressing their naming scheme. Why does every usable card have to be either 60, 70, 80 or 90?
Why not 4010, 4020, 4030, 4040, 4050, 4060, 4070, 4080 and 4090? Then push the Ti models as refreshes as they used to do.
@@nlflint isn't the 4070ti considerably worse than the 4080 though?
so what is coming for corsair??? wireless rgb?!
101% daisy-chainable, no-wire fans like the Uni Fans / D30, etc.
Lian li style connections I’d imagine. Or a single fan / RGB connector?
Maybe wireless wiring 😂🤭🙃
I was going to get the 3070 too, but I chose the 7900 XT and my god I'm glad I did. Atomic Heart used 14 GB of VRAM maxed out: 165 Hz, not one dropped frame.
Wish I had watched this before building my PC. I was going for the 4080 or 4090 but ran low on money, so I went with the next one down because it was half the price of the 4090. Lesson learned! I'll just have to replace it when I get the money.
I agree with every point made in the video. However, to be fair, a 7.4% performance increase from overclocking is actually pretty decent, especially considering this is an FE card. The 4070 would have been a very good deal at $450-500.
Still not a good deal. Decent at best. The "4070" is basically just a renamed 4060 which should at best have a price tag of $400. And ask yourself, would $400 be a good price for a 4060?
@@2nd_Directorate You need to take actual performance into account, and at 400$ it would be an amazing deal.
@@CanIHasThisName No, you don't. That's what Nvidia wants you to believe. You just need to take R&D and production cost into account. And suddenly (as the R&D cost of the graphics division hasn't changed significantly in 5 years) it's not that good of a deal anymore.
Not with those 12GB of VRAM... 1080p cards should cost no more than $400
@@2nd_Directorate That's the dumbest take I've seen in a while.
Title typo lol
If you're disappointed now, just think about next release season; their best offering will be a 4080 at best. I feel Nvidia has turned its back on what helped build them up, same for AMD. My hope is that Intel keeps their general-consumer GPU lines going. I like what they've got so far and hope they can better it next time.
Hope for the next release, then never buy anything... hate how that works.
This card is already just an overclocked 60 class card so it's not like there's any reason to expect much more in the tank. The '12GB 4080 / 4070 Ti' is the real 4070.
Fr
This reminds me of the Apple Mac minis, where they soldered in the RAM so you couldn't replace it with better RAM, forcing you to pay a lot more for higher and faster gigs. I am truly despising these dirty tactics.
I didn't know there was such a gulf between the two; it's insane. I was lucky enough to find a 4070 Ti for $635 open-box at my local Micro Center, upgraded from a 2070 Super. It's a massive upgrade for me, but I would not have bought it at MSRP.
I'm going with the RTX 4070 because it's the only card that doesn't draw more than 200 W, has a single 8-pin connection, isn't the size of a cinder block, and gets exactly double the frame rate my 3-year-old RTX 2060 gets on mature drivers.
I'm not sad and I won't regret it.
Going to buy it for the same reasons; I'm coming from an R9 380, so it's going to be a huge leap in performance.
@@fernandoferraz4146 I got a Zotac Twin Edge OC and it's great. OCs to +250 on the core before certain games crash after 10 minutes.
I wanted to upgrade my 1080 Ti; guess I'll wait another generation. The 6950 XT isn't an option on a 650 W PSU 😕
I will be sticking with my 1060 and 3070 Ti in my slow but great performance for the buck machines.
I'm sticking with my GTX 770
@@morpheus9137 GTX 480.
You should buy a $1 card then.
It gives you one fps, but hey, look at the performance for the buck.
Just insane.
To be honest, compared to my old 3060 my 4070 has been a godsend. I do agree that the prices on the 40 series are a tad higher than what they should be.
I believe the 4070 was always supposed to be the 4070, and they just renamed the 4080 12GB to the 4070 Ti. The main reason is Nvidia often released non Ti cards, and then later the Ti cards. In other words, originally the "4080 12GB" was going to stay called that, and eventually a "4070 Ti" was going to be slotted in between.
I totally agree with this
Buyer's remorse is real 😢 Though I upgraded from an Asus RX 480 8GB OC to a Zotac 4070 Trinity, I'm really starting to think I should have waited a few extra months before pulling the trigger. It's not like I haven't waited 6 years to upgrade. Though I must admit, going from a lower tier to the 4070 has really impressed me, but apparently I was small-minded.
If I'm remembering correctly, there was a significant gap between the 3060 and the 3060 Ti? I was actually hoping for that same gap with the rest of the 40 series, because a 10% gain does not seem worth jumping for.
It was a similar gap, yes.
That Cyberpunk scenario for example: 81 vs 104 is a 28.4% lead for the 4070 Ti.
3060 is 39 fps, 3060 Ti is 52 fps. That's a 33.33% lead for the 3060 Ti.
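The arithmetic behind those two leads checks out; a quick sketch of the relative-performance calculation, using only the fps numbers quoted in the comment above:

```python
def lead_pct(fast_fps: float, slow_fps: float) -> float:
    """Percentage lead of the faster card over the slower one."""
    return (fast_fps / slow_fps - 1) * 100

# Cyberpunk numbers quoted above
print(f"4070 Ti over 4070: {lead_pct(104, 81):.1f}%")  # 28.4%
print(f"3060 Ti over 3060: {lead_pct(52, 39):.1f}%")   # 33.3%
```

So the intra-tier gap this generation (28.4%) is actually slightly narrower than the 3060-to-3060 Ti gap (33.3%) in that scenario.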
It's disappointing because you're doing the wrong thing.
What this card is amazing at, is UNDERVOLTING.
I'm literally sipping 165 watts of power with no performance loss.
This card is amazing, and great at many things.
Staying ice cold.
Zero noise
Great performance per watt
and the ability to undervolt like an absolute CHAMP.
Loving mine!!
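For anyone curious how much the efficiency claim above is worth, here's a rough perf-per-watt sketch. The 165 W figure is from this comment and the ~200 W stock board power is mentioned elsewhere in the thread; the 100 fps value is a hypothetical placeholder, held constant per the "no performance loss" claim:

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of board power."""
    return fps / watts

# Same fps at stock (~200 W) vs undervolted (165 W)
stock = perf_per_watt(100, 200)        # 0.50 fps/W
undervolted = perf_per_watt(100, 165)  # ~0.61 fps/W
print(f"Efficiency gain: {(undervolted / stock - 1) * 100:.0f}%")  # 21%
```

With fps unchanged, the efficiency gain reduces to the power ratio (200/165), so roughly a 21% improvement regardless of the fps number you plug in.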
Bro, how do you get zero noise? My Founders Edition RTX 4070 spins up 5 times per hour, for 5 minutes each, during office desktop use, which is audible through the whole room.
@@NoobProTV Wtf?! LMAOOO
I got the cheap MSI Ventus, which costs the same as the default card, has 3 fans, and a larger (longer) heatsink.
Mine never goes above 55°C in gaming. It's ice cold with zero fan noise. I'd return that one and get the MSI Ventus; do yourself a favor.
Also undervolt your card, seriously. They undervolt VERY well.
No surprise that if you ignore the performance or price or price to performance ratio, then yeah its pretty amazing.
Just got more disappointing.