it's more of a 3080Ti competitor since both have 12GB. 4070 Super can only match a 3090 at 1080p/1440p. Also a $600 card matching a $2K card at all while using 120W less is mighty impressive imo.
Bro what? Even I still use my RTX 3080 to play games at 4K. I hope you know you can just lower the graphics settings if the ultra settings don't run at the desired frame rate?@@mnoise626
Point is, I never use the stock presets in games like medium or high or anything like that. I always set textures to ultra along with some other VRAM-heavy options on ultra, while the options that hit fps/GPU cores go to low/medium with some turned off. All of that for max fps, and the best look in games needs a lot of VRAM. I got VRAM-limited by my RTX 3070 xD @@daniil3815
You defo don't need 24GB of VRAM to play. Maybe you need it if you're making stuff in Unreal and rendering a movie/game with MetaHumans etc., but today 24GB is overkill.
@@daniil3815 5 years? Lol no, that would happen in 2026 when the next gen consoles arrive and all games again start using more VRAM. It's a cycle, so time is running out.
The 3090 stock voltage settings are pretty inefficient, you can match stock performance at 0.8-0.85v while lowering power draw by 100w+. Mine draws between 200-250w while gaming and is great for VR games that use a lot of VRAM.
Yeah, I am currently trying to decide what to do. I have a 34" 1440p monitor and VR with a 3070. Thinking of upgrading to the 4070 Super but not sure it's enough for what I'd need. I want Nvidia so I can still pair DLSS with DLDSR. I think I would be fine for my monitor, as I'm already fine with my 3070 for basically all games without dropping too many settings. I see a couple of 3090s available and it's hard to pass up that much VRAM, but I would like the new DLSS version and path tracing.
@str8ripn881 A used 3090 for £500 is good for VRAM intensive VR games. If going new I would go for a 16gb 4070ti Super or a 4080 if you can stretch the budget. The 12gb 4070 super is a good card but with half the VRAM of the 3090 with similar performance.
@@RimzoSky Yeah, I have been torn. I've never really bought used PC parts in all my years. I fear the same about just the Super and would probably go 4070 Ti Super at the least, for the 16GB to double the VRAM I have now from my 3070. I may go the whole 4080 Super at like $1000, but that just seems ridiculous to me, though sure, it would work for many years. If I decide to go used, I may wait for the new 50 series and snag a 4090 off someone.
True, but while energy may be a concern for some, in any household the AC/appliances use so much more energy that the GPU difference is marginal: you would see less than a $1 difference in energy cost. The rate is on average $0.14/kWh in Texas. RTX 3090 = 350 watts, RTX 4070 Super = 285 watts, so it would cost you $0.04 to run your 4070 for an hour vs $0.05 for the 3090 with the metrics provided. Mind you, even gaming at peak power consumption (which in the real world you aren't) for 120 hours a month (4 hours a day), it would cost $6 to run the 3090 and $4.80 to run the 4070 Super. Is the power efficiency worth the extra cost if they have the same performance? If energy is so expensive that it makes a big impact, you have other concerns. A refined architecture isn't worth shelling out hundreds of extra $$$ unless you want to burn money.
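The arithmetic in that comment can be sketched in a few lines. This is a back-of-the-envelope script, and the 350W/285W draws, 120 hours/month and $0.14/kWh rate are just the figures the comment assumes, not measured values:

```python
# Back-of-the-envelope GPU energy cost, using the figures quoted in the comment.
# Assumed inputs: 350W (RTX 3090), 285W (RTX 4070 Super), $0.14/kWh (Texas average).

def monthly_cost(watts, hours_per_month, price_per_kwh):
    """Cost in dollars of running a load of `watts` for `hours_per_month`."""
    kwh = watts / 1000 * hours_per_month
    return kwh * price_per_kwh

hours = 120  # 4 hours a day
rate = 0.14  # $/kWh

cost_3090 = monthly_cost(350, hours, rate)
cost_4070s = monthly_cost(285, hours, rate)

print(f"3090:  ${cost_3090:.2f}/month")   # $5.88
print(f"4070S: ${cost_4070s:.2f}/month")  # $4.79
print(f"delta: ${cost_3090 - cost_4070s:.2f}/month")
```

Which backs up the point: at these rates the gap is about a dollar a month.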
$0.33 in Los Angeles. And the 4070 Super isn't just saving power, it makes much less heat, so in summer you don't have to crank the A/C as much, and that can save you hundreds of dollars. @@rea280
Not to mention price. All of these 40 series haters can't get over the fact that Nvidia made something decent with its most recent launch. They mask their herd-like hatred (which they most likely obtained from their little echo chambers on Reddit) by claiming the performance is not impressive / they expected more. Listen, the fact that a $600 card that consumes less power comes so close to, and even surpasses, a card that's still listed at $1500+ is incredibly impressive. My only gripe with this card is the VRAM. Other than that, I'd say this card is fucking great.
Hey Daniel, in the RE4 Remake the 4070 Super loses performance at 4K solely due to VRAM bandwidth or an insufficient L2 cache, but it is definitely not a VRAM capacity issue. I performed a comparison between 4K/Max/High (8GB) textures and 4K/Max/High (1GB) textures on my 4070 Ti and found that although VRAM allocation/usage is higher with the "High 8GB" setting, the performance is identical, as are RAM usage/allocation and PCI-E bus usage (this is shown in my latest uploaded video). Looks like RE4 just tends to reserve spare VRAM. So, at least, Nvidia 12GB cards are fine in this game (not so sure about AMD counterparts with the same VRAM capacity).
Am I reading this right? Does the 4070 Super perform almost the same as the 3090, or better with FG, at a fraction of the 3090's wattage? Dang, that's a good tech advance right there, right?
Again the performance difference at 4K in Avatar is not memory. It’s the power limit. You’re hitting the default 220w limit on the FE card. Had you set the slider in Afterburner or GeForce Experience to allow Nvidia max of 240w it would have likely had a tie or Super winning. At the same time you could always do the same for the 3090 FE up to 400w. Both cards are power limited in that 4K Avatar test.
Rtx40 is one of the very biggest jumps in GPU history, you're just coping. The fact that a 70 class is beating the previous 90 class is a technological miracle. Stop lying to yourself.
Regardless of whether it's 3% ahead or 3% behind in whatever game and setting you test, a -70 staying basically a tie with the previous generation's -90 class SKU is pretty cool. This is a great value.
I have a 3090 and was thinking about getting the 4080 Super since the price cut and extra performance, but at the same time I think I'll just wait. The 3090 is still a very capable card for today's games.
Wish you had shown the numbers for 4K path traced without DLSS or FG in Alan Wake 2. The 4070 Super's 3 fps in the TechPowerUp review illustrates the problem with 12GB cards going forward when texture sizes go up.
When the 3090 was brand new it didn't make much sense to spend 800 USD more than the 3080 for just 10-15% more performance and more VRAM. As a last gen product on the used market, though, it's in my opinion totally worth the premium over a 3080 10GB, 3080 12GB or 3080 Ti. In Canada where I am, the 3080 is $550-600 CAD, 3080 Tis are $650-700 CAD and 3090s are $800-1000 CAD depending on the model. So it's 45% more money, or $250 CAD more, for 10-15% more performance and 2.4X more VRAM. Totally worth the premium at those prices. I picked up a used 3090 for $900 CAD in July of 2023 and zero regrets so far. The massive amount of VRAM allows me to play games at 4K on high settings with ultra texture quality, and it never ever is a limit. It's so nice.
Hard to tell if we don't see fan speeds, but even if they run the fans at a similar speed, denser packed transistors (smaller process nodes) tend to trap heat between transistors (AMD CPU's encounter the same challenge on top of having that very thick IHS). That can potentially cause some cooling problems.
As an owner of the 3090, I was really hoping that the 4070 series would be a viable upgrade without spending over $1000 again (which I won't do again). I was thinking the RTX 4070 Ti Super might be, based on projections, but not after seeing the real world results. The only two cards that would be a significant upgrade are the 4090 or the 4080 Super... and at those prices I'll be waiting for the 5xxx series or for AMD to catch up on ray tracing.
Dude who goes from a last generation 90 class card to a next generation 70 class lmao. That is what we call a side grade not an upgrade. Smart choice to wait for I would say a 5080.
yeah, unless it's a 4090 for "cheap" you should not get any 40 series at all, and should wait for the 50 series/AMD/Intel. If you take care of your 3090 you'll be more than OK for a while to come imho (also owning a 3090)
Thanks, there's a lot of great info here. I just can't help but wonder what performance would be like if Nvidia hadn't weirdly reduced memory bandwidth on most 40 series models vs previous gens (minus the 4090).
@@Wobbothe3rd Nvidia certainly harmed performance with the 4060 Ti's 128 bit bus. I still remember when the Yuzu emulation team put out a warning on how they're seeing worse performance with 4060 Ti vs. 3060 Ti in Zelda TOTK due to the reduced 128 bit memory bus (3060 Ti = 256 bit), the emulator relies on it with certain games due to the way texture streaming is emulated. Personally I'm waiting until 50 series as I doubt Nvidia will pull this again.
at mw3, more fps means more frequent data transfer, so it makes sense that the cache plays a bigger role. at higher res, the fps simply decreases and so does the load on the cache, therefore the performance equalizes.
Given that we know the 30 series is absolutely capable of running Nvidia's frame generation process I don't think you should be including it in these comparisons without at least throwing up a big disclaimer that Nvidia is feature locking 30 series cards to upsell you on 40 series cards.
Who would have thought a 128-bit bus 16GB 50-series card masquerading as a 60-series card was bad value lol. Nvidia botched everything below the 4080 this generation. At least the 4070 Ti Super will actually be a 70-series card, vs the 60-series memory config of the original 4070 Ti.
@@mobarakjama5570 This is also a way to save memory because you can use the same assets as many times as you want, but you don't have to load them more than once :)
I would be surprised if next generation of GPUs was the same or somehow worse than the previous one. Then again it's NVidia, they did GeForce FX once. But asking price of $700 from 3rd party for what is essentially a 12 GB 3090 is kinda meh.
@@potatogod3882 you're not supposed to measure smoothness by the video. Daniel said his capture card only goes to 60 fps or sum, which should make footage above and below that fps look stuttery sometimes
@@FeherViktor-zl8bm 💯 FINALLY someone that gets it! I have been telling ppl this and they get in their feelings. Kinda like the other guy that commented on your comment lmao
I don't like artificially gimping products like this, but I get the sentiment. Textures are free assets (computation-wise) that can improve the visual fidelity of anything; I wish they would cut their margins elsewhere. If this wasn't the case with Nvidia, Radeon cards wouldn't even exist after this gen... that's literally the only thing they can offer with RDNA3. We had to wait ~18 months for Nvidia to offer more than 12GB below $1200 MSRP.
Honestly, the amount of work Daniel is putting into every video is insane. He should do a feature with Gamers Nexus; they would understand each other ^^ Nevertheless I really respect your work.
Agree.. he's such a geek in such a positive way. But making this his main job might take the fun out of it, since once it becomes a job it might be stressful. Maybe..
@ak33mc it's great for creators because of the feature sets that the other cards don't have, but you're right when regarding VRAM. That's a bit disappointing. The 4070 TI Super will be a sweet spot for price with 16GB behind the 4090. But I'm hoping the 5000 series can have a bit better uplift in this regard.
I don't care what anybody says about performance. The 24GB of vram from the 3090 is priceless at 1440p and 4K, compared to 4070 Super which may run out in the next 1-2 years due to bad game optimisation.
It is true that it's great to have lots of VRAM just in case. The reality, however, is that game settings that would warrant the use of >12-14 GB of VRAM wouldn't run well on either of these cards regardless of whether or not you have enough VRAM. Example: Alan Wake 2 4K + FG + PT would easily break the 12 GB VRAM requirement, but would run terribly on both of these cards. Same goes for Cyberpunk at those settings. Also, I believe game devs will be forced to make sure their games can run on the hardware most people actually own.
If either of these GPUs even runs games at settings that utilise more than 12GB of VRAM in 3 years, I'd be surprised. The 1080 Ti was rather quickly relegated to a 1440p card (for performance reasons) and never really had its 11GB of VRAM used in any games, even once 4K finally started creeping up toward the 10GB mark.
@@MLWJ1993 Not necessarily. High-res texture packs are a thing, and they don't hit the shader units the same way visual effects do. I think the 1080 Ti comparison is rather like the 4090's situation with its 24GB of VRAM; I think a cheaper 4090 with 16GB would be more balanced. But who knows, maybe next-gen RPGs will have local AI chatbot capabilities where you can freely converse with NPCs, and that will certainly eat a lot of VRAM.
@@konstantinlozev2272 Higher resolution textures are bottlenecked by your input & output resolution though. If you're at 1080P using 4K textures there's more than 3x the pixels in the texture that your resolution is capable of displaying. Meaning you'd need to zoom in at least 3x to actually see those details & the object would need to fill your full view before zooming in. That's a lot of IF's to meet requirements of your example.
@@MLWJ1993 Amh, no, I am afraid. I don't know where that "4k textures" comes from, but textures in a game come at all resolutions. In addition, a "texture" is typically a "texture atlas", where different parts of that "4k texture" are assigned to different body parts of (for example) the character that you are looking at. So, you are actually never seeing the whole "4k" texture atlas at the same time. If you are looking at the face of the NPC, you are looking at a small fraction of the texture atlas. Just look at the textures on people in Cyberpunk 2077. Not good.
@@konstantinlozev2272 Yeah I agree, if I were to get a 4000 series gpu it would be at least RTX 4070 Ti Super, definitely if you use it for at least 5 years.
I thought about waiting for the Super release last year. Then I got lucky and bought a used 6950 XT for 280€. Since the 6950 XT trades blows with the 3090 and Ti, I think there's no need to regret that decision. Still a great card for that price.
...in rasterization only. It gets stomped in any RT or AI workloads. Also AMD has their thumbs up their butts with FSR still not looking that great compared to DLSS.
@@Aurummorituri luckily most games I play don't support dlss or fsr and I have to rely on pure raster performance. I tried RSR and FMF and wasn't that unhappy with the outcome. It's just a gimmick for me though. Edit: not that I don't believe you. A friend of mine uses Nvidia cards and my (travel) laptop uses Nvidia as well. If Dlss is an option it works well imho. A 3090 just costs so much more where I live. I had to pay at least 600€ for a used one. The performance difference between a 3090 and 6950 just isn't big enough to justify that difference in price
@@ProtossOP nothing wrong with the card. The previous owner installed new thermal pads and better paste for good temps, but he wanted to upgrade to a 7900 XTX shortly after. I tried a pretty heavy OC and UV. The card runs fine on its air cooler with a 400W power limit. Usually I just use my UV for lower temps and silent fans, but it's reassuring to know that I can still push it a bit if I have to.
@Aurummorituri What gamer cares about "AI workload"? If you're into multiplayer games or sim racing like me, RT will never be turned on, so for damn near $1600 less he got a better card compared to the 3090... I say that's a damn win
Got my 3090 FE used for less than 600 bucks right after the mining crash. Still feels like a great deal as they now usually go for 900 bucks used where I live.
I wish this would quiet all the people who talk endlessly about VRAM and act like it is somehow a major limiting factor. Daniel even made a video on this, on top of this video being a perfect example. How many people bought the 3090 going "ooh, it'll have enough VRAM for years" yada yada, "don't get a 4070 or 4070 Super, it only has 12GB of memory" yada yada, trash, trash, trash... AND YET, a SINGLE GENERATION forward from the 3090, we see the difference between 24GB vs 12GB is less than 10 FPS. More particularly, the limiting factor according to multiple videos is likely the memory bus, which would change in tandem with the memory size. You could argue "isn't it basically the same thing, since 16GB would increase the bus," and the answer is, well, basically yes. But that's not what anyone in the comments section anywhere is arguing. The last thing worth noting is that there must be performance cuts somewhere, otherwise there would be little difference between a 4070 Super with 16GB (if it existed) and a 4070 Ti Super. Nvidia wants you to bump up to the $800 tier, as their main objective is to make money. Case in point: anyone's free to have their own opinions, but this "it doesn't have enough VRAM" stuff is nonsense in the context of GPU performance at certain price points. If you're going to complain about it, go buy a 4070 Ti Super with 16GB for $800, or get a 4080 Super for $1000. That's what they exist for. The performance you get out of those cards is far less a factor of their VRAM size than of their increased CUDA cores, RT cores, ROPs, and power limits. Just sharing the facts.
Since you have the cards available and we’ve now seen how comparable they are, one test that could be interesting is DLSS3 frame gen vs 3000-series DLSS2 with Radeon frame gen. Even though I’m personally extremely cynical about frame gen entirely, it would be interesting “for science”.
My 3070 still runs everything, although it's slowly becoming a little bit harder to run newer stuff on max, I will wait for a little while with upgrades, this generation of graphics cards isn't really doing it for me! If it does for someone else be my guest!
Shiiiit, my 1080 Ti is still kicking ass at 1440p 60fps medium-high settings. 4K playback is still buttery smooth with games like Hades, CrossCode, Cuphead etc... Still rocks for emulation, including Switch. I've never been happier with a GPU long term like this. Decided to skip the 3000 series entirely and save up for 40, only to say fuck it and keep saving so I can get a 5090. Ya know, really make it worth my while.
Nice work. That was my first question, and soooo many of the more BIG name reviewers all did reviews, but none of them challenged the claim, which I do not get..... Goes to show quality content comes from reviewers who take the time to test, and don't just spit out content as fast as they can to try and get those early clicks.
The problem is these new games are being made too fast and the optimization is soo fucking bad that you need a 4090 to run everything nowadays.. it’s ridiculous..
The price difference, even today, between a 3090 and 4070super makes it an easy choice in my opinion. But let's see what the Ti super comes in at. As far as lying, it is normal to claim this and that, in the end the games and the applications need to be able to address the memory. My 4070 does perfectly fine at 1440p, don't need 4k gaming as most games still run at 1080p. The best gaming card is still the 4080 and the 7900XTX, simple as that. A card with 12Gb memory being called a "super" is a bit of a stretch and a huge missed opportunity.
I would take the 3090 over the 4070 whatever any day of the week. It has way more VRAM and the same if not superior performance in raster which is 90% of what I use it for.
Yes, the 3090 is faster..., but there is one spectacular difference between the two: power consumption. I personally have a 3090 and always need to worry about the oven inside my case :-). The sweet spot is a power limit set to 280W. This horrible experience tells me "avoid buying any new GPU above 300W," which rules out even the 4080.
Honestly, who cares about the perf vs a last gen flagship when the price delta is so huge. The 4070S is cheaper on average; some 3090s are still over $1k. The fact it effectively punches with a 3090 is impressive. At the end of the day, frames per $ at desired settings is all that matters, and the 4070S pretty much beats all the mid-tier cards at $600. The only card remotely competing with it now is the $100-less 7800 XT, and going down that road you are seeing about a 7-10% framerate decrease, which jumps to 20% when upscaling or RT is turned on. If AMD dropped the 7900 XT by about $50 to $660 new, it would smash everything in this bracket.
Try testing Ark Survival Ascended. It is by far the most demanding UE5 game; the 3090 probably couldn't handle it at 1080p without upscaling for a 60fps experience.
No but the base 4070TI is literally slightly better than 3090. 4090 owner here. This stomps XTX, enjoy your raw FPS and I'll enjoy my DLSS+FG more FPS :D
And the 3080 Ti is right in the mix with these cards. The 4070 Super is almost exactly as powerful as the 3080 Ti, except it has the new and better RT cores, but at the same time it has half the memory bandwidth (a 192-bit bus vs 384-bit). That's actually starting to be a pretty good value card, at least compared to the initial pricing of the 40 series cards.
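The "half the bandwidth" point follows from the usual formula: peak bandwidth = bus width in bits / 8 bytes * effective data rate per pin. The 21 and 19 Gbps GDDR6X rates below are the commonly quoted specs for these cards, so treat them as assumptions:

```python
# Rough peak memory bandwidth: (bus_bits / 8) bytes per transfer * Gbps per pin.
# The 21 / 19 Gbps GDDR6X data rates are commonly quoted specs, not measured here.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s for a GDDR bus."""
    return bus_bits / 8 * gbps_per_pin

gpu_specs = {
    "RTX 4070 Super": (192, 21),  # 192-bit bus
    "RTX 3080 Ti":    (384, 19),  # 384-bit bus
}

for name, (bus, rate) in gpu_specs.items():
    print(f"{name}: {bandwidth_gbs(bus, rate):.0f} GB/s")
# 4070 Super ends up around 504 GB/s vs roughly 912 GB/s for the 3080 Ti,
# i.e. close to half, as the comment says (before counting the bigger L2 cache).
```

The 40 series leans on a much larger L2 cache to compensate for the narrower bus, which is why the raw numbers overstate the real-world gap.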
@@lifemocker85 yes, I know. I've been closely following the industry for well over 10 years. Increasing demand on the top tier GPU performance doesn't close market for lower tier budget and mid-tier systems. 1080p high and 1440p medium gaming will be fine with a 12Gb GPU for many years to come.
The advantage of the 3090 is the work-level performance, such as 3D work needing 24GB of VRAM. Plus it can play games well. It's only missing FG, but I believe hacks or Nvidia will just expand the support over time.
Look up DLSSG to FSR3. Digital Foundry just did a vid on it the other day. FSR has always been open source, so I guess it was only a matter of time. But I'm dicking around with it in Cyberpunk on my 3090 and it's surprisingly stable. 1080p ultra + RT overdrive + DLSS Q, I'm averaging 121 FPS, just to compare it to the 4070 Super here in this vid... lol very similar results, yet again.
An RTX 3090 with optimized settings at 4K with the DLSS/FSR FG mod is still a very good GPU. I have seen Ratchet and Clank and Star Wars Jedi: Survivor at DLSS 4K ultra with the FG mod consume up to 21GB of VRAM.
@@stevieC11Hanworth I do not own a 1440p monitor. I either play @ 4k on my 49inch VRR 120 Hz Oled monitor or 3440*1440p 165 Hz Ultrawide. Otherwise, I do moonlight streaming via Sunshine to either of my handhelds: ROG Ally (120 Hz, 1080p) and Legion Go (144Hz, 1600p).
@@zacthegamer6145 Thanks for the detailed response. I believe 4K is a scam not worth wasting my resources on, and you can't notice the difference from 1440p. Happy gaming
They allocated 21gb VRAM, there's a big difference between usage & allocation. The bigger the attic the more stuff you keep there, doesn't need to be useful stuff.
By year 2030 the name of the graphic card will be: GTX 4070 Super trouper beams are gonna blind me But I won't feel blue Like I always do 'Cause somewhere in the crowd there's you. Lyric from ABBA
You are looking at this the wrong way. Yes, it's a little bit slower, but not by much. The 4070 Super is performing near the same as a 3090 for $599, compared to a 3090 at $1,500. The 4070 Super has better power efficiency, delivering the same performance for way less money. That's pretty impressive. Then you add the feature sets and you beat a 3090 by a lot.
Nvidia making questionable claims about their GPUs again. You need a 4070 Ti to be reliably faster than a 3090 in the vast majority of games. 4070 Super doesn't quite get there.
@@Archaoen0 Of course it is. No question about that. But I feel like some people act like Nvidia is god-like, and that's just not true. "AMD is worse" is the more accurate statement. That does not mean Nvidia is terrible. What Nvidia did is use the best TSMC node and a very small die compared to what's usual. That's how they get to this "very low" power usage (we are still talking about 220 watts, which is not very low). The architecture is good, as I said. They are doing fine, but they aren't gods like some think or constantly want others to think. That's all.
@@flimermithrandir True , some people worship Nvidia and completely swear by it and i hate that, i personally have a 4060 but i don't go to people saying that they shouldn't care about AMD when they clearly offer way better value across the board, but what i feel like AMD should be doing is not trying to catch up to Nvidia's features and rather try to fix gaming's other issues and make their own unique features as selling points.
If I were a sacred animal and were given both of those cards as an offering, I would choose the 4070S without a second thought and send my blessings to leather jacket man!
200 Watts consumption of 4070 super vs 350 Watts of 3090 for similar raw performance or better performance with FG. Seems very straight forward easy decision to me.
@@TheAcadianGuy Not very much.. FG is fantastic technology.. Please show me real world usage of the extra 12GB of VRAM and a corresponding increase in performance?
So solid 3080 TI levels of performance and VRAM for half the price at 60 percent the power?!?!?!?!, yet somehow this product is bad according to AMD fanboys
I understand the call for 16GB of ram but clearly the card gets outdated before the amount of ram does. Look at all those 24GBs on the 3090, pointless.
The Alan Wake 2 4K Native example had both GPUs in the 30s and the noteworthy 16% win for the 3090 there slides into obscurity as people would turn on DLSS Quality at that point. Despite some saying there are no "4k" or "1440p" cards, I think this comparison shows what people mean: Where a "1440p" and a "4K" GPU have similar average performance, they will take the lead over each other on their respective resolutions. E.g. if you play on 4K native use the 3090 (assuming both cards cost the same, which they don't).
I don't understand why people care so much about frame rate in single player games. FPS only matters in fps-based games and competitive play, such as shooters for example. Other than that, as long as the game runs smoothly, you're fine.
Happy user of the 4070 Super. Even if it's 12GB instead of 16GB (not to mention 24GB...). Just play on high settings. Ultra always sucked, on any card (from the 8800 GTX... to the GTX 285 to the GTX 980 Ti and 1080 Ti, etc). Just don't go ultra.
Are you out of your mind? Who the hell plays games at 30-40 fps? Cyberpunk at 50fps? Why don't you show real world use cases? Tweak the settings so they're suitable for the 120Hz/165Hz monitors most people have. The data in this video is 90% irrelevant. 34fps vs 31fps at graphics settings no one would seriously play with, for real?
Not paying $600 for 12 GB VRAM.... If I'm playing 1440p there will be games where the visual fidelity will suffer. Even if the texture packs are well designed not to tank the video cards performance, something has to give and that is visual fidelity. 600 bucks for 12 GB of VRAM is insulting
It could be attributed to the 12 GB maximum video memory on the RTX 4070, in contrast to the 24 GB on the RTX 3090. The latter has more RAM chips, contributing to increased stability.@@BBWahoo
I'm just an enthusiast and like to build a new machine every-other generation. When the 5090 comes out it'll be time for a new build. I'll give my current computer 5950X/3090Ti to one of my grandkids.@@lucasrem
My god, some 1080p runs of both these cards in the mid-30 fps only, my god. 25fps for a 3090 in Alan Wake 2 with those ray/path tracing settings. That's terrible; so many settings have to be turned down, or altered with frame gen, to get anything decent. Wowzers, wasn't the 3090 like a $1400+ card just 2 years ago, if available, with some near $1800 or more due to stock issues? To now only be able to play some games (with the best 4K settings) at 20-30 fps is kinda nuts, and with that high power draw. While it's good to see the efficiency and performance of the 4070 get better, games are still outstripping cards' abilities unless you have a bottomless pit of money to spend. I'm not sure, but I suspect many Nvidia buyers are feeling value has gone to hell: the 30 series was a stock price gouging sensation, and the 40 series had the VRAM issues, the dual-4080 initial rollback, and price-for-value broken at launch. With Nvidia's mistakes, early customers of the first 40 series cards had their investments trashed within just 6 months. I dunno, what a mess. I would think trust in their business practices and model is eroding fast. I'm sure price fixing, drip-feeding tech improvements, and double dipping are all going on; the value and availability of what's released seem too manipulative. But AMD isn't much better, with matching pricing and similar issues 😂 Still finding it all crazily fascinating from the sidelines atm. All we can say is: holy hell, what a tour de force of tests of games, cards, and card-v-card comparisons Daniel is doing this month! ❤❤
Just the natural order of things. There's always something better and faster around the corner. Sure you can get 15-20% better performance for the same money now but I've had my 4070 for the last 9 months. You will never buy anything if you worry about this shit. In another 12-16 months the 5070 will be out and be another 30% faster than the 4070S.
Technology evolves. Of course newer cards will run faster and older cards will depreciate in value. Same for TV, cars etc. You certainly didn't expect tech to stagnate did you?
@@ehenningsen Nope, I know that of course; we all love tech and tech improvements (it doesn't stand still, for sure, and we wouldn't want it to), so I'm not against tech advancement! It's more the point about business pricing, supply, and release scheduling with respect to customer trust/value, and what I often suspect is purposeful price fixing etc., plus purposefully downgrading what's possible into a drip feed of small tech improvements in the recent cycles. But as long as we keep buying, they keep doing it; I do feel many have lost trust in them for future purchases. You must see what they're doing? Release something not that good for high prices with low stock, while already having another better version ready (at the same time), only to release it 4-6 months later for double dipping etc. So maybe I didn't make my point well enough: it's the business practices and the erosion of customer trust that I'm worried about / interested in, to see how this pans out. Business practices have changed over the last 5 years; you cannot really say consumers feel the 30 and 40 series releases went well or delivered value. Anyhow, just my rambling 😂
I think the VRAM is going to be an issue within the next year or so. However, Nvidia blocking the 30 series from frame generation means these cards are money grabs, and you should go with AMD until Nvidia stops trying to artificially enhance the same level of performance to create value.
The performance is impressive for the price against what the 3090 retailed at, and even against its sale price at the 40 series launch; what is more impressive is the power draw comparison. I ran an EVGA 3080 Ti Hybrid and it pulled way more power than the 4070 Super, and based on this comparison I assume it would have lost an FPS comparison too. I'm not saying Nvidia's pricing model is where it should be (upgrading to an MSI 4090 Hybrid cost an arm and a leg), but these Supers seem like decent cards, and upgrading to one may not mean needing to buy a new PSU to run it.
It's impressive only because the 3090 was so bad in terms of price/performance that even the 4070 Super, which is still pretty bad value, seems "so much better". But it isn't. They should cost 100 to 150 euro/USD less to be just "good" value. I miss the golden age of the RX 580, GTX 1070 and 5700 XT.
Okay, if you look at both images, yes, both have the same fps, but if you look closely the left side is for some reason running more stuttery. The big giveaway was the animals moving; the right side has a much more fluid feel to the playback, even in the gunfire.
@saintcastle 192-bit bus on the 4070 Super. In other words: you wouldn't put a 1600hp engine in a Reliant Robin that has 3 wheels, keeps falling over, and has no all-wheel drive - it'd be stuck doing burnouts, squeezing the life out of the engine and getting nowhere. Think of a tap: turned down to a drip (the 4070 Super), then opened up some more but not to max (256-bit). 384-bit and up is considered better for higher-end GPUs. It's basically a bottleneck - you aren't able to push any more data through the pipeline, and that ends up causing stutters.
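To put rough numbers on the bus-width point: memory bandwidth is just the bus width times the memory's effective data rate. A minimal sketch in Python - the clock figures below are the commonly quoted spec-sheet values, not measurements from the video:

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 21.0))   # 4070 Super, GDDR6X @ 21 Gbps   -> 504.0 GB/s
print(bandwidth_gbs(384, 19.5))   # 3090, GDDR6X @ 19.5 Gbps       -> 936.0 GB/s
```

So the 3090 has nearly twice the raw bandwidth; the 40 series leans on its much larger L2 cache to compensate, which works better at lower resolutions.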
which one would you get if you had the money? I have a 3070 now but I was thinking about getting a 40 series but this video made me question it. So more cores isnt always better?
Why not try Lossless Scaling frame generation on these? That can give up to 4x the performance. Nice piece of software, and a big plus for older games locked at 60fps 😉
I've loved my 2080 Super up to this point. I think the Super cards offer peak performance to dollar ratio for their respective generation, and look forward to upgrading to a 4080 Super
@@WololoWololo2 You get diminishing returns when talking on a dollar to performance ratio, typically paying 50% more for 10-20% extra performance out of the Ti card. So yeah, the Ti cards offer the best outright performance, but my comment specifically says, Super cards offer the best performance to dollar ratio, matching the launch prices of their non Super counterpart, while being just shy of the Ti performance
@@ZZPxFTW Ti cards you can use for way longer; RTX 3090 Ti owners won't change their hardware for at least 4 years because of how powerful they are. More expensive, but more life in it. This best-in-class hardware from years ago can still do amazing things better than cheap hardware.
@@WololoWololo2 The last sentence is not even coherent, so I have no idea what point you’re trying to make. And the first point you make is already addressed in my first reply to you, and what you’re trying to say makes no sense. The 30 series didn’t have a Super, but let’s use your example anyway. The 3090 Ti is only 12% more powerful than a 3090, so it’s not “THAT much more powerful,” as you’re claiming. 12% more performance DOES NOT equal an extra 4 years, lmao. My 2080 Super is actually 18% less powerful than a 2080 Ti - a bigger gap than the 3090's - but even then, the 2080 Ti becomes outdated around the same time as my card, given a 4070 Ti is 56% faster than a 2080 Ti and 82% faster than the 2080 Super. So instead of getting 100 fps while gaming, I’d be getting 118 fps if I had the Ti… where’s this mega-powerful Ti performance and lifespan you’re talking about? It doesn’t exist. Both users would find value in upgrading around the same time. All this without reiterating that Ti cards are more than 50% more expensive than their Super counterparts. If your philosophy is “get ripped off for an extra 12-18% more fps,” then yeah, get the Ti. The rest of us are smarter than that. Don’t call my card cheap hardware either; $700 for a GPU is not cheap.
I still don't understand not counting FG/DLSS as "performance" when Nvidia literally added more physical cores to do it - to add computational performance that's only different now because it's handled by AI, which is yet again just another calculation method, one now proven to be more efficient than traditional rasterization. We all know AI is going to be the future anyway, so why are we still disregarding it...
I don't know what everyone is bitching about with the 4070 Super. It's great value for the performance. The 3090 had an MSRP of $1,500, and of course the scalping made it even more ridiculous. Granted that was 2020, but at least now scalpers don't want anything to do with it, and I get a card at MSRP that lets me play 1440p at high frame rates, especially with DLSS and frame gen. Also, this card will last maybe 1 or 2 years, then I can get a new card for $600 that keeps up, and I'm only out $1,200 over 4 years instead of double that or more.
In the UK you can get a second hand 3090 for £750. So you get a card that's on par with 4070S for around the same price but with double the VRAM. As someone who plays games and uses AI, I went with the 3090 and it's been fantastic.
Same here. I couldn't justify buying a 4090 this late in its life cycle at full price, but I needed the VRAM for AI workloads, so I went with an Asus TUF 3090 for 750 USD.
The 3090 is a large and power-hungry card; you need a case and PSU to match. I could have bought a used 3090 for the same price I paid for my 4070, but I would have had to buy a new case and PSU, so I didn't bother.
The worst part is that the 3090's and 3080's potential has never been truly tapped - we just push an insanely expensive newer series and abandon the old one before its potential has been even marginally realized.
This video shows it all: all these high-end, super expensive RTX 4090s sold for $2,000, and when the next gen comes out they will lose to $600 70-class GPUs. Same for the people who bought the RTX 2080 Ti for $1,200, then the RTX 3070 beat it for $500. Same thing happened here: the RTX 3090 sold for almost $3,000 and is now beaten by a 70-class GPU for $600. People should know by now that all high end becomes mid range in a few years. Why spend that much money? Fanboys just can't stop the itch, making Jensen a billionaire.
Side conclusion: the cards perform almost identically at 1440p without DLSS and at 4K with DLSS. Taking into consideration how good DLSS is and how cheap 4K monitors are becoming, I would pick a 4K screen - especially since GPUs will only get faster, and most of us don't buy a new monitor every 2 years or so.
I've been telling ppl this but then I'm called dumb lmao. I bought a 4K 144Hz monitor and I play singleplayer games at 4K / 4K with DLSS Quality, and FPS games at native 1080p 200+ fps. A 4K monitor can display 1080p waaaaaaaaay better than a 1440p monitor because 4K divides evenly into 1080p and 1440p doesn't. 1440p should be skipped because 4K with DLSS Quality will look better than any native 1440p monitor while requiring nearly the same GPU performance.
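The "4K divides into 1080p" point is just per-axis arithmetic: crisp 1:1 pixel mapping needs an integer scale factor. A quick sketch:

```python
# 1:1 pixel mapping needs an integer per-axis scale factor:
# 4K / 1080p = exactly 2, so each 1080p pixel becomes a clean 2x2 block;
# 1440p / 1080p = 1.33..., so pixels have to be interpolated (blur).
def scale_factor(panel_w, panel_h, content_w, content_h):
    return panel_w / content_w, panel_h / content_h

print(scale_factor(3840, 2160, 1920, 1080))  # (2.0, 2.0) -> integer, clean
print(scale_factor(2560, 1440, 1920, 1080))  # (1.33.., 1.33..) -> non-integer, soft
```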
I don't know about you, but I find it baffling that some games can't even reach the 100 fps mark with these graphics cards at 1080p - it's not like they're weak. I swear optimization is a dumpster fire in newer games. Companies are like "enable DLSS and frame generation 🤓" - hell yeah, 66% of 1080p resolution, now I can really start counting pixels.
You are comparing an x70-class GPU with an x90; of course the x70 is more power efficient, lol... The GTX 1070 was also incredible compared to the 980 Ti: same performance, less power consumption... and guess what? The 5070 will also perform somewhere between the 4080 and 4090 while consuming less...
The only thing I don't like about my RTX 3090 is how much power it draws---420W+ for the GPU alone! It's a nice space heater during winter, but in spring it's like a sauna inside my room. 😅
I can't wait for Nvidia to release the 4070 Super Duper with 16GBs of VRAM.
@@captain000schmand Never work in marketing, dude.
@@yzfool6639 idk why but that gave me a good laugh
The Super Duper is coming out next week.
More RAM? Not gonna happen.
The 'duper' only comes with the ti super range cards 😂
Isn't it crazy how just one generation gives the same performance for 150W less power consumption? It's almost half at this point.
But 'VrAm' - says the person who thinks it's the end of the world because of 12GB of VRAM.
Yeah it's crazy how Nvidia is artificially gimping their releases
makes the headroom for 5xxx series look goooood
That mostly shows how relatively bad the samsung 8n node is. Almost all efficiency comes from the node used, and very little from architecture.
@@MrStoffzor that L2 cache also helps
it's more of a 3080Ti competitor since both have 12GB. 4070 Super can only match a 3090 at 1080p/1440p. Also a $600 card matching a $2K card at all while using 120W less is mighty impressive imo.
"Only" lol. No one would play at 4k with the 3090 to begin with.
@@mnoise626 Not anymore at least. I had a 3090 and played games in 4K.
@@mnoise626 People buy 24gb vram cards to play at higher resolution, rock for brains.
3090 MSRP was $1.5K mb
Bro what? Even I still use my RTX 3080 to play games at 4K. I hope you know you can just lower the graphics settings if the ultra settings don't run at the desired frame rate?@@mnoise626
"Faster than an RTX 3090....until you run out of memory and when using frame gen which we purposely made exclusive to Ada"
You won’t, maybe one day in 5 years, but at that point both 3090 and 4070s will be outdated anyway.
Point is, I never use stock presets in games like medium or high or whatever. I always set textures to ultra along with some other VRAM-heavy options, while options that hit fps/GPU cores go on low/medium, with some turned off - all of that for max fps and the best look. Games need a lot of VRAM for that; I got VRAM-limited by my RTX 3070 xD @@daniil3815
The RTX 3070's 8GB of VRAM didn't even hold up 3 years xD scammed again @@daniil3815
You defo don't need 24GB of VRAM to play. Maybe you need it if you're making stuff in Unreal and rendering a movie/game with MetaHumans etc., but today 24GB is overkill.
@@daniil3815 5 years? Lol no, that will happen in 2026 when the next-gen consoles arrive and all games start using more VRAM again. It's a cycle, so time is running out.
The 3090 stock voltage settings are pretty inefficient, you can match stock performance at 0.8-0.85v while lowering power draw by 100w+. Mine draws between 200-250w while gaming and is great for VR games that use a lot of VRAM.
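A rough sanity check on that undervolt claim: dynamic power scales roughly with f·V², so at similar clocks, power goes roughly with voltage squared. The ~1.05 V stock figure below is an assumption, not from the video:

```python
# Back-of-envelope: dynamic power ~ V^2 at similar clocks.
# Stock core voltage of ~1.05 V is an assumed typical value for a 3090.
def power_ratio(v_new: float, v_stock: float = 1.05) -> float:
    return (v_new / v_stock) ** 2

for v in (0.85, 0.80):
    print(f"{v:.2f} V -> ~{power_ratio(v):.0%} of stock dynamic power")
# Around 0.85 V that's roughly two-thirds of stock, i.e. a ~350 W card
# landing in the 220-230 W range -- in line with the 200-250 W figure above.
```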
I can make mine eat 700W running Cyberpunk with ray tracing on. First card I've actually wanted to water cool, to see what it can really do.
Yup, that's what so few people notice: how bad stock cards are. Plus copper mods etc.
Yeah, I am currently trying to decide what to do. I have a 34" 1440p monitor and VR with a 3070. Thinking of upgrading to the 4070 Super, but not sure it's enough for what I'd need. I want Nvidia so I can still pair DLSS with DLDSR. I think I'd be fine for my monitor, as I'm already fine with my 3070 in basically all games without dropping too many settings. I see a couple of 3090s available, and it's hard to pass up that much VRAM, but I would like the new DLSS version and path tracing.
@str8ripn881 A used 3090 for £500 is good for VRAM intensive VR games. If going new I would go for a 16gb 4070ti Super or a 4080 if you can stretch the budget. The 12gb 4070 super is a good card but with half the VRAM of the 3090 with similar performance.
@@RimzoSky Yeah, I have been torn. I've never really bought used PC parts in all my years. I have the same fear about the plain Super and would probably go 4070 Ti Super at the least for the 16GB, doubling the VRAM I have now from my 3070. I may go all the way to the 4080 Super at like $1,000, but that just seems ridiculous to me - though I'm sure it would work for many years. If I decide to go used, I may wait for the new 50 series and snag a 4090 off someone.
If you calculate power/performance, the 4070 Super wins for sure.
True, but while energy may be a concern for some, in any household the AC/appliances will use so much more energy that you'd see less than a $1 difference. Electricity averages $0.14/kWh in Texas. RTX 3090 = 350 watts, RTX 4070 Super = 285 watts: it would cost you $0.04 to run your 4070 for an hour vs $0.05 for the 3090 with those figures. Mind you, if you were gaming at peak power consumption (which in the real world you aren't) for 120 hours a month (4 hours a day), it would cost $6 a month to run the 3090 and $4.80 for the 4070 Super. Is the power efficiency worth the extra cost if they have the same performance? If energy is so expensive that it makes a big impact, you have other concerns. A refined architecture isn't worth shelling out hundreds of extra $$$ unless you want to burn money.
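The arithmetic above can be reproduced in a few lines; the wattages and rate are the figures cited in the comment, not measured values:

```python
# Monthly running cost = watts / 1000 * hours * rate ($/kWh).
RATE = 0.14    # USD per kWh (Texas average cited above)
HOURS = 120    # ~4 hours of gaming per day

def monthly_cost(watts: float) -> float:
    return watts / 1000 * HOURS * RATE

print(f"3090:       ${monthly_cost(350):.2f}/month")   # ~$5.88
print(f"4070 Super: ${monthly_cost(285):.2f}/month")   # ~$4.79
```

So at US rates the efficiency gap is about a dollar a month; the savings argument only really bites at much higher electricity prices (or via reduced A/C load).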
$0.33 in Los Angeles. And the 4070 Super isn't just saving power, it makes much less heat, so in summer you don't have to turn the A/C down as much - that can save you hundreds of dollars. @@rea280
@@KelvinKMS You're right, I didn't factor in heat dissipation.
@@KelvinKMS Hundreds in savings from heat dissipation is rather speculative though. I don't think it's that high.
Not to mention price. All of these 40 series haters can't get over the fact that Nvidia made something decent with its most recent launch. They mask their herd-like hatred (which they most likely picked up from their little echo chambers on Reddit) by claiming the performance is not impressive / they expected more. Listen, the fact that a $600 card that consumes less power comes so close to, and even surpasses, a card that's still listed at $1,500+ is incredibly impressive. My only gripe with this card is the VRAM. Other than that, I'd say this card is fucking great.
Hey Daniel, in RE4 Remake the 4070 Super loses performance at 4K solely due to VRAM bandwidth or insufficient L2 cache size - it is definitely not a VRAM capacity issue. I compared 4K/Max/High (8GB) textures against 4K/Max/High (1GB) textures on my 4070 Ti and found that although VRAM allocation/usage is higher with the "High (8GB)" setting, the performance is identical, as are RAM usage/allocation and PCIe bus usage (shown in my latest uploaded video). It looks like RE4 just tends to reserve spare VRAM. So at least Nvidia's 12GB cards are fine in this game (not so sure about AMD counterparts with the same VRAM capacity).
Am I reading this right? Does the 4070 Super perform almost the same as the 3090, or better with FG, at a fraction of the wattage of the 3090? Dang, that's good tech advancement right there, right?
Yep, but don't ask the idiots here who will hate Nvidia if they cure cancer.
@Wobbothe3rd How does Jensen's boot taste like ? They definitely need you to defend them
@@GWT1m0 You crash like AMD drivers, constantly, so you don't have time to taste much at all.
@@dalaransadoringfan5267 amd drivers are fine
Again the performance difference at 4K in Avatar is not memory. It’s the power limit. You’re hitting the default 220w limit on the FE card. Had you set the slider in Afterburner or GeForce Experience to allow Nvidia max of 240w it would have likely had a tie or Super winning. At the same time you could always do the same for the 3090 FE up to 400w. Both cards are power limited in that 4K Avatar test.
3090 more expensive, but 4070 is cheaper, so draw your own conclusions.
I don't know. To me it seems anyone who has at least a 3060 Ti is still well served until the next Nvidia series; the 4000 series is skippable.
pretty much
Rtx40 is one of the very biggest jumps in GPU history, you're just coping. The fact that a 70 class is beating the previous 90 class is a technological miracle. Stop lying to yourself.
im skipping 5000 as well 😂
3080 ti owner
Regardless of whether it's 3% ahead or 3% behind in whatever game and setting you test, a -70 staying basically a tie with the previous generation's -90 class SKU is pretty cool. This is a great value.
I have a 3090 and was thinking about getting the 4080 Super since the price cut and extra performance, but at the same time I think I'll just wait - the 3090 is still a very capable card for today's games.
Same, let's stick with our 3090s. No need to upgrade just yet.
Wish you had shown the numbers for 4K path traced without DLSS or FG in Alan Wake 2. The 4070 Super's 3 fps in the TechPowerUp review illustrates the problem with 12GB cards going forward when texture sizes go up.
Both cards in this mode will not show more than 30 fps.
This mode even brings the 4090 to a crawl.
@@Sayan2233 I play Alan Wake 2 with everything maxed out including PT, DLSS Balanced with the FSR 3 FG mod, on my 3090 at 4K with 65 FPS avg.
@@zacthegamer6145 You do realize DLSS Balanced at 4K renders internally well below 4K, right? It's closer to 1080p than to native 4K - upscaled to 4K, not actually 4K.
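For reference, the internal render resolutions behind the DLSS mode names can be computed from their per-axis scale factors. The factors below are the commonly cited ones (Quality ~66.7%, Balanced ~58%, Performance 50%), not figures from the video:

```python
# Internal render resolution for DLSS modes at a 4K output.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(out_w: int, out_h: int, factor: float) -> tuple:
    return round(out_w * factor), round(out_h * factor)

for mode, f in MODES.items():
    print(mode, internal_res(3840, 2160, f))
# Performance (1920, 1080) really is 1080p; Balanced lands at (2227, 1253),
# between 1080p and 1440p in pixel count.
```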
@@zacthegamer6145What's your framerate without the FSR mod?
When the 3090 was brand new, it didn't make much sense to spend 800 USD more than the 3080 for just 10-15% more performance and more VRAM.
As a last-gen product on the used market, though, it's in my opinion totally worth the premium over a 3080 10GB, 3080 12GB or 3080 Ti.
In Canada, where I am, 3080s go for $550-600 CAD, 3080 Tis for $650-700 CAD, and 3090s for $800-1,000 CAD depending on the model.
So it's 45% more money, or $250 CAD extra, for 10-15% more performance and 2.4x the VRAM. Totally worth the premium at those prices.
I picked up a used 3090 for $900 CAD in July of 2023 and have zero regrets so far. The massive amount of VRAM lets me play games at 4K on high settings with ultra texture quality, and it's never a limit. It's so nice.
Shouldn't the 4070 Super run cooler too? It's running 5°C hotter despite using over 100W less, in the Avatar 1440p section.
Hard to tell if we don't see fan speeds, but even if the fans run at a similar speed, densely packed transistors (smaller process nodes) tend to trap heat between transistors (AMD CPUs encounter the same challenge, on top of having a very thick IHS).
That can potentially cause some cooling problems.
As an owner of a 3090, I was really hoping the 4070 series would be a viable upgrade without spending over $1,000 again (which I won't do again). I thought the RTX 4070 Ti Super might be, based on projections, but not after seeing the real-world results. The only two cards that would be a significant upgrade are the 4090 or the 4080 Super... and at those prices I'll be waiting for the 5xxx series, or for AMD to catch up in ray tracing.
Dude who goes from a last generation 90 class card to a next generation 70 class lmao. That is what we call a side grade not an upgrade. Smart choice to wait for I would say a 5080.
Yeah, unless it's a 4090 for "cheap" you shouldn't get any 40 series at all - wait for the 50 series/AMD/Intel. If you take care of your 3090 you'll be more than OK for a while to come, imho (also a 3090 owner).
I went from a 3070 to a 4070S - worth it?
@@ZackSNetwork The 4070 Ti Super is going to crush the 3090 when the reviews come out next week, so it's not really a side move.
@@IncredibleLyrics big time
Thanks, there's a lot of great info here. I just can't help but wonder what performance would be like if Nvidia hadn't weirdly reduced memory bandwidth on most 40 series models vs previous gens (minus the 4090).
Not as much as you think. Nvidia aren't idiots, they choose carefully based on power draw and AI performance. Gaming isn't everything.
@@Wobbothe3rd Nvidia certainly harmed performance with the 4060 Ti's 128-bit bus. I still remember when the Yuzu emulation team put out a warning about seeing worse performance with the 4060 Ti vs the 3060 Ti in Zelda TOTK due to the reduced 128-bit memory bus (3060 Ti = 256-bit); the emulator relies on it in certain games because of the way texture streaming is emulated. Personally I'm waiting for the 50 series, as I doubt Nvidia will pull this again.
In MW3, more fps means more frequent data transfer, so it makes sense that the cache plays a bigger role. At higher resolutions the fps simply decreases, and so does the load on the cache, so the performance equalizes.
I think Nvidia spends 4 gb of VRAM cost on the damn boxes they send them in. Speaking of the founders edition models. Those boxes are nuts.
Given that we know the 30 series is absolutely capable of running Nvidia's frame generation process I don't think you should be including it in these comparisons without at least throwing up a big disclaimer that Nvidia is feature locking 30 series cards to upsell you on 40 series cards.
The 4060 Ti is really showing how bad its price to performance is compared to these Super cards.
Who would have thought a 128-bit bus 16GB 50-class card masquerading as a 60-class card was bad value, lol. Nvidia botched everything below the 4080 this generation. At least the 4070 Ti Super will actually be a 70-class card, vs the 60-class memory config of the original 4070 Ti.
Why is Nvidia scamming us like that?
Its
now we need 2060 supers
At minimum we should expect a decent price cut on the 4060s, to let them belong in the budget gaming tier.
0:14 This is getting out of hand! Now, there are two of them!
And I love it 😂
@@mobarakjama5570 This is also a way to save memory because you can use the same assets as many times as you want, but you don't have to load them more than once :)
Imagine both of them flying around.
@@PinHeadSuplicium wtf, one flying around is already disturbing enough...
Oh no!
That the 4070 Super is drawing so much less power from the wall, while being extremely close to the performance of the 3090, is incredibly impressive.
I would be surprised if next generation of GPUs was the same or somehow worse than the previous one.
Then again it's NVidia, they did GeForce FX once.
But the asking price of $700 from 3rd parties for what is essentially a 12GB 3090 is kinda meh.
Why does the 4070 stutter at 3:30? I mean, the fps is nearly identical, but the 3090 footage looks smooth and the 4070 looks like it's skipping frames.
@@potatogod3882 You're not supposed to measure smoothness from the video. Daniel said his capture card only goes up to 60 fps or so, which can make footage above or below that fps look stuttery sometimes.
@@potatogod3882 raw drivers / 12 GBs VRAM / video recording issue
100%.
If the 4070S had 16GB of VRAM it would be a really nice product.
Bro, it's a 1440p card; 12GB is enough for that. If you want to play at 4K, buy the 4080 Super for $999.
@@FeherViktor-zl8bm Thanks Jeff Bezos
@@FeherViktor-zl8bm jensen's burner account
@@FeherViktor-zl8bm 💯 FINALLY someone that gets it! I have been telling ppl this and they get in their feelings. Kinda like the other guy that commented on your comment lmao
I don't like artificially gimping products like this, but I get the sentiment. Textures are free assets (computation wise) that can improve the visual fidelity of anything, I wish they would cut their margins elsewhere.
If this wasn't the case with Nvidia, Radeon cards wouldn't even exist after this gen - that's literally the only thing they can offer with RDNA3.
We had to wait ~18 months for Nvidia to offer more than 12GB below $1,200 MSRP.
The 3090 out there looking like a former A-list celebrity appearing in car insurance ads. ~55 fps at 1440p? What!? Wasn't it marketed as a 4K card?
in 3 years the 4090 will be a 1080p card. I mean its already basically a 1440p card for UE5
@@Dempig That's just not true and you sound stupid saying it.
@@Dempig that’s insane lol
@@Dempig Your tweaking bad
Well that was 3.5 years ago. 🤣
Honestly, the amount of work Daniel puts into every video is insane. He should do a feature with Gamers Nexus - they would understand each other ^^ Anyway, I really respect your work.
Completely agree. it takes days to get all this done!
Agree.. he's such a geek in such a positive way. But making this his main job might take the fun out of it, since once it becomes a job it might be stressful. Maybe..
This would really be the GOAT collab for years to come.
Blah blah blah
whats your problem?@@CraneOPBR
The 24 gigs is heaps better for all the creative and AI applications, but other than that, the 4070S isn't bad.
That bandwidth and VRAM make a huge difference in those workflows! The 40 series is so lacking for creators.
@ak33mc it's great for creators because of the feature sets that the other cards don't have, but you're right when regarding VRAM. That's a bit disappointing.
The 4070 Ti Super will be the sweet spot for price, with 16GB, behind the 4090. But I'm hoping the 5000 series has a bit better uplift in this regard.
12gb is bad
@@lifemocker85 Yeah, the RX 7700 is shit
@@MrSeb-S every 12gb gpu is
I don't care what anybody says about performance. The 24GB of VRAM on the 3090 is priceless at 1440p and 4K, compared to the 4070 Super, which may run out in the next 1-2 years due to bad game optimisation.
just buy the 4080 S..
@@catfacecat. Yeah nah, I got my 3090 for only £700 a year and a half ago. It may not be faster, but it can still pull its weight in games.
It is true that it's great to have lots of VRAM just in case. The reality, however, is that game settings that would warrant the use of >12-14 GB of VRAM wouldn't run well on either of these cards, regardless of whether you have enough VRAM. Example: Alan Wake 2 at 4K + FG + PT would easily break the 12 GB VRAM requirement, but would run terribly on both of these cards. Same goes for Cyberpunk at those settings. Also, I believe game devs will be forced to make sure their games can run on <12 GB cards.
In 3 years, 12GB of VRAM will be like today's 8GB.
It's a pass for me on this one.
Better to fork out for an RTX 4070 Ti Super 16GB.
If either of these GPUs even runs games at settings that utilise more than 12GB of VRAM in 3 years, I'd be surprised.
Consider that the 1080 Ti was rather quickly relegated to a 1440p card (for performance reasons) once 4K started creeping past the limits of 10GB cards, & it never really had its own 11GB of VRAM used in any games.
@@MLWJ1993 Not necessarily. High-res texture packs are a thing, and they don't hit the shader units the way visual effects do.
I think the 1080 Ti comparison is rather like the 4090's situation with its 24GB of VRAM.
A cheaper 4090 with 16GB of VRAM would be more balanced.
But who knows, maybe next-gen RPGs will have local AI chatbot capabilities where you can freely converse with NPCs, and that will certainly eat a lot of VRAM.
@@konstantinlozev2272 Higher resolution textures are bottlenecked by your output resolution though.
If you're at 1080p using 4K textures, the texture holds more than 3x the pixels your resolution is capable of displaying, meaning you'd need to zoom in at least 3x to actually see those details, & the object would need to fill your full view before zooming in.
That's a lot of IFs to meet the requirements of your example.
@@MLWJ1993 Amh, no, I'm afraid not. I don't know where that "4K textures" idea comes from, but textures in a game come at all resolutions.
In addition, a "texture" is typically a "texture atlas", where different parts of that "4K texture" are assigned to different body parts of (for example) the character you are looking at.
So you are actually never seeing the whole "4K" texture atlas at the same time. If you are looking at the face of an NPC, you are looking at a small fraction of the texture atlas. Just look at the textures on people in Cyberpunk 2077. Not good.
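On the atlas point: texture memory cost scales with resolution squared, which is why texture-quality settings move VRAM usage so much. A back-of-envelope sketch, assuming uncompressed RGBA8 (real games use block compression, which cuts these numbers roughly 4-8x):

```python
# Rough VRAM footprint of one square texture/atlas in MiB,
# uncompressed RGBA8 (4 bytes per pixel); a full mip chain adds ~1/3.
def texture_mib(side_px: int, bytes_per_px: int = 4, mips: bool = True) -> float:
    base = side_px * side_px * bytes_per_px
    return (base * 4 / 3 if mips else base) / 2**20

print(f"{texture_mib(4096):.1f} MiB")  # a 4096x4096 atlas: ~85.3 MiB with mips
print(f"{texture_mib(2048):.1f} MiB")  # 2048x2048: ~21.3 MiB with mips
```

Quadrupling texture area (2048 → 4096) quadruples the footprint, so a few hundred high-res atlases is how an "ultra textures" setting eats gigabytes of VRAM.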
@@konstantinlozev2272 Yeah, I agree. If I were to get a 4000 series GPU it would be at least the RTX 4070 Ti Super, definitely if you plan to use it for at least 5 years.
I thought about waiting for the Super release last year. Then I got lucky and bought a used 6950 XT for 280€. Since the 6950 XT trades blows with the 3090 and Ti, I think there's no need to regret that decision. Still a great card for that price.
What was wrong with the card? That's super cheap. Where I live, 280€ buys you something like a used 6700 XT. What a snatch!
...in rasterization only. It gets stomped in any RT or AI workloads. Also AMD has their thumbs up their butts with FSR still not looking that great compared to DLSS.
@@Aurummorituri Luckily most games I play don't support DLSS or FSR, so I rely on pure raster performance. I tried RSR and FMF and wasn't unhappy with the outcome; it's just a gimmick for me though.
Edit: not that I don't believe you. A friend of mine uses Nvidia cards and my (travel) laptop uses Nvidia as well; if DLSS is an option it works well, imho. A 3090 just costs so much more where I live - I'd have to pay at least 600€ for a used one. The performance difference between a 3090 and a 6950 XT just isn't big enough to justify that difference in price.
@@ProtossOP Nothing wrong with the card. The previous owner installed new thermal pads and better paste for good temps, but he wanted to upgrade to a 7900 XTX shortly after. I tried a pretty heavy OC and UV; the card runs fine on its air cooler with a 400W power limit. Usually I just use my undervolt for lower temps and silent fans, but it's reassuring to know I can still push it a bit if I have to.
@Aurummorituri What gamer cares about "AI workloads"? If you're into multiplayer games or sim racing like me, RT will never be turned on, so for damn near $1,600 less he got a better card compared to the 3090...
I say that's a damn win
Got my 3090 FE used for less than 600 bucks right after the mining crash. Still feels like a great deal as they now usually go for 900 bucks used where I live.
gg wp no re
I just got my 3090 FE for $720 two weeks ago! But that's a crazy find for you!
Got my EVGA FTW3 Ultra for $800 CAD. 4080S's are $2K here. Talked the guy down a bit; they usually go for $1,000-$1,400 on the low side for a 3090 around here.
Now you can sell it to me for 100 bucks because you're a good person.
@@AnimeGamer
I got an all-liquid-cooled Ryzen 5950X + 3090 PC for 1200, lol.
I wish this would quiet all the people who talk endlessly about VRAM and act like it's somehow a major limiting factor.
Daniel even made a video on this, on top of this video being a perfect example.
How many people bought the 3090 going "ohh, it'll have enough VRAM for years" yada yada yada, "don't get a 4070 or 4070 Super, it only has 12GB of memory" yada yada, trash, trash, trash... AND YET, a SINGLE GENERATION on from the 3090, we see the difference between 24GB and 12GB is less than 10 FPS.
More particularly, the limiting factor according to multiple videos is likely the memory bus, which would change in tandem with the memory size. You could argue "isn't that basically the same thing, since 16GB would mean a wider bus"... and the answer is, well, basically yes. But that's not what anyone in the comment sections is arguing.
The last thing worth noting is that there must be performance cuts somewhere, otherwise there would be little difference between a 4070 Super with 16GB (if it existed) and a 4070 Ti Super. Nvidia wants you to bump up to the $800 tier; their main objective is to make money. Case in point: anyone's free to have their own opinion, but this "it doesn't have enough VRAM" line is nonsense in the context of GPU performance at certain price points. If you're going to complain about it, go buy a 4070 Ti Super with 16GB for $800, or a 4080 Super for $1,000 - that's what they exist for. The performance you get out of those cards is far less a factor of their VRAM size than of their increased CUDA cores, RT cores, ROPs, and power limits. Just sharing the facts.
It's an AMD fanboy cope.
Since you have the cards available and we’ve now seen how comparable they are, one test that could be interesting is DLSS3 frame gen vs 3000-series DLSS2 with Radeon frame gen. Even though I’m personally extremely cynical about frame gen entirely, it would be interesting “for science”.
it's sad that the cards vram is already maxed out🤣🤣🤣🤣🤣
My 3070 still runs everything, although it's slowly becoming a little harder to run newer stuff maxed out. I'll wait a little while to upgrade; this generation of graphics cards isn't really doing it for me! If it does for someone else, be my guest!
The 5000 series is less than a year away. If you're still happy with your card, hold on to it till then
Same here on a 3070. Gonna wait to see how the 5070 performs, and hope it has a 256-bit bus and 16GB of VRAM.
Yeah, I have a 4090 but end up using the 3070 more often. I might just sell the 4090, use the 3070 for a while, and get a 6090 in the future.
Well, I’ve got great news!!!! The FSR3-to-DLSS mod enables frame gen on the 3000 series!!
Shiiiit, my 1080 Ti is still kicking ass at 1440p 60fps on medium-high settings. 4K playback is still buttery smooth with games like Hades, CrossCode, Cuphead etc... It still rocks for emulation, including Switch. I've never been happier with a GPU long-term like this. I decided to skip the 3000 series entirely and save up for a 40 card, only to say fuck it and keep saving so I can get a 5090. Ya know, really make it worth my while.
Nice work - that was my first question, and soooo many of the BIG-name reviewers all did reviews, but none of them challenged the claim, which I don't get... Goes to show quality content comes from reviewers who take the time to test, and don't just spit out content as fast as they can to grab those early clicks.
you see you see he's picking his own nose yet again🤣🤣🤣🤣🤣
The problem is these new games are being made too fast and the optimization is soo fucking bad that you need a 4090 to run everything nowadays.. it’s ridiculous..
4070 Super = 3080Ti
Lower consumption would be the only deal on 4070 Super 🤷♂
yeah, and the 3080 Ti is like 3-5% slower than a 3090 at most
@@DELTA9XTC amount of VRAM tho.
False. It's just below the 4070ti for $300 less.
Oh good I already have one.
The price difference, even today, between a 3090 and a 4070 Super makes it an easy choice in my opinion, but let's see what the Ti Super comes in at. As for lying, it's normal to claim this and that; in the end the games and applications need to be able to address the memory. My 4070 does perfectly fine at 1440p; I don't need 4K gaming, as most games still run at 1080p. The best gaming cards are still the 4080 and the 7900 XTX, simple as that. A card with 12GB of memory being called a "Super" is a bit of a stretch and a huge missed opportunity.
It’s a joke not being able to play 4K on a 4070. 1080p was last decade.
How does this guy not understand marketing, yet run a channel here?
He will never survive here!
I would take the 3090 over the 4070 whatever any day of the week. It has way more VRAM and the same if not superior performance in raster which is 90% of what I use it for.
BUT MUH POWER EFFICIENCY!!!
The 3090 has more of everything: 10,496 CUDA cores, a 384-bit bus, ~936 GB/s bandwidth, and mine overclocks to 2125 MHz core / 9875 MHz memory. Only 2 games can't do 120fps maxed out at 1440p: Cyberpunk 2077 and RDR2.
They're about even.
There, I saved you half an hour.
The 4070 won almost every time. You idiots act like the 3090 is a slow card; the thing cost over a thousand dollars. Stop hating so dishonestly.
Yes, the 3090 is faster..., but there is one spectacular difference between them: power consumption. I personally own a 3090 and always have to worry about the oven inside my case :-). The sweet spot is a power limit of 280W. That horrible experience taught me to avoid buying any new GPU above 300W, which rules out even the 4080.
Honestly, who cares about the perf vs. a last-gen flagship when the price delta is so huge? The 4070S is cheaper on average; some 3090s are still over $1K. The fact it effectively punches with a 3090 is impressive. At the end of the day, frames per dollar at your desired settings is all that matters, and the 4070S pretty much beats all the mid-tier cards at $600. The only card remotely competing with it now is the $100-cheaper 7800 XT, and going down that road you're looking at about a 7-10% framerate decrease, which jumps to 20% when upscaling or RT is turned on.
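The frames-per-dollar metric mentioned above is easy to sanity-check. Here's a quick sketch using the comment's own figures ($600 vs $500, 7-10% raster deficit, ~20% with RT); the 100 fps baseline is purely hypothetical:

```python
def fps_per_dollar(fps: float, price: float) -> float:
    """Value metric: average frames per second per dollar of price."""
    return fps / price

base_fps = 100.0  # hypothetical 4070 Super baseline at some fixed settings

v_4070s = fps_per_dollar(base_fps, 600)
v_7800xt_raster = fps_per_dollar(base_fps * 0.92, 500)  # ~8% slower in raster
v_7800xt_rt = fps_per_dollar(base_fps * 0.80, 500)      # ~20% slower with RT on

print(f"4070 Super:      {v_4070s:.4f} fps/$")
print(f"7800 XT raster:  {v_7800xt_raster:.4f} fps/$")
print(f"7800 XT with RT: {v_7800xt_rt:.4f} fps/$")
```

On these numbers the cheaper card still wins pure-raster value, and the 4070 Super only pulls ahead once RT is in the mix, which is exactly the trade-off described above.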
If AMD dropped the 7900 XT by about $50 to $660 new, it would smash everything in this bracket.
The 7900XT is not as good in RT as EITHER of these cards.
Try testing Ark Survival Ascended; it's by far the most demanding UE5 game. The 3090 probably couldn't manage a 60fps experience at 1080p without upscaling.
Question: on the Baldur's Gate 3 comparison at 13:25, why is the performance similar, yet the game looks smooth on the right and a jittery mess on the left?
I noticed that with the first Avatar benchmark as well.
Capture card. He mentioned it before; it's due to the 60fps capture card limit, which sometimes causes stutters in recordings.
No but the base 4070TI is literally slightly better than 3090.
4090 owner here. This stomps XTX, enjoy your raw FPS and I'll enjoy my DLSS+FG more FPS :D
3090 still selling for £2k on Amazon uk 🤣
Got my evga ftw3 ultra for $800 cad
And the 3080 Ti is right in the mix with these cards. The 4070 Super is almost exactly as powerful as the 3080 Ti, except it has the new and better RT cores, but at the same time it has half the memory bandwidth (192 vs 384-bit). That's actually starting to be a pretty good value card, at least after the initial pricing of the 40 series cards.
No value with 12gb
@lifemocker85 it's going to be fine for a couple of years. Just like the 1080 Ti with 11GB of memory still offers great value.
@@OrcCorp demands are only increasing
@@lifemocker85 yes, I know. I've been closely following the industry for well over 10 years. Increasing demand at the top tier of GPU performance doesn't close the market for budget and mid-tier systems. 1080p high and 1440p medium gaming will be fine with a 12GB GPU for many years to come.
@@OrcCorp one doesn't pay several hundreds just to play on medium
The advantage of the 3090 is workstation-level performance, such as 3D work needing 24GB of VRAM. Plus it can play games well. It's only missing FG, but I believe mods or NVIDIA will expand support over time.
Look up DLSSG to FSR3. Digital Foundry just did a vid on it the other day. FSR has always been open source, so I guess it was only a matter of time. But I'm dicking around with it in Cyberpunk on my 3090 and it's surprisingly stable. 1080p ultra + RT overdrive + DLSS Q, I'm averaging 121 FPS, just to compare it to the 4070 Super here in this vid... lol very similar results, yet again.
@@OnBroGrave Why are you running a 3090 on 1080p? XD???
@@RainOrbs because he has a 1080p monitor
@@willy7968 the 3090 is a 4K-capable GPU at 40-50 fps; using it on a 1080p monitor is just a huge performance loss
@@RainOrbs that’s valid in games.
If you use it for production, it doesn’t really matter
Given the choice, i'd take a 3090 over any card with less than 24GB. I upgraded my 3090 to 4090, but my 2nd choice would still be 3090.
An RTX 3090 with optimized settings plus the DLSS/FSR FG mod is still a very good GPU at 4K. I have seen Ratchet & Clank and Star Wars Jedi: Survivor at DLSS 4K ultra with the FG mod consume up to 21GB of VRAM.
4K is not smart; resolution at 50fps on Starfield 😂 I have the best 3090 money can buy and I use 1440p.
@@stevieC11Hanworth Starfield ultra settings @ 4k, DLSS Q, FSR 3 FG mod is 95 FPS avg in New Atlantis.
@@stevieC11Hanworth I do not own a 1440p monitor. I either play @ 4k on my 49inch VRR 120 Hz Oled monitor or 3440*1440p 165 Hz Ultrawide.
Otherwise, I do moonlight streaming via Sunshine to either of my handhelds: ROG Ally (120 Hz, 1080p) and Legion Go (144Hz, 1600p).
@@zacthegamer6145 thanks for the detailed response. I believe 4K is a scam not worth wasting my resources on, and you can't notice the difference from 1440p. Happy gaming.
They allocated 21gb VRAM, there's a big difference between usage & allocation. The bigger the attic the more stuff you keep there, doesn't need to be useful stuff.
By year 2030 the name of the graphics card will be: GTX 4070 Super Trouper
"beams are gonna blind me / but I won't feel blue / like I always do / 'cause somewhere in the crowd there's you"
Lyric from ABBA
but yeah, they lied, no surprise ;)
You're looking at this the wrong way. Yes, it's a little bit slower, but not by much. The 4070 Super performs nearly the same as a 3090 for 799 compared to 1500 dollars, with better power efficiency delivering the same performance for way less money. That's pretty impressive. Then you add the feature set and it beats a 3090 by a lot.
@@davidfrazier6308 it's more like $599 USD
@@davidfrazier6308 This is precisely how Nvidia wants you to think. Accepting mediocrity.
@@davidfrazier6308 yep, great, but they lied: it's not faster/better in performance, and that's what this video is about ;)
@@zhila5958 nearly as fast as the previous flagship at nearly half the cost is mediocre?
Nvidia making questionable claims about their GPUs again. You need a 4070 Ti to be reliably faster than a 3090 in the vast majority of games. 4070 Super doesn't quite get there.
220w vs 340w tho. Not bad.
You expect it to use more?
well, it doesn't use even nearly the same power, nor does it cost the same, so it's still technically an improvement regardless @@flimermithrandir
Yeah, the Ada cards are crazy efficient. I'm almost certain you can undervolt this and still keep 97% of that performance.
@@Archaoen0 Of course it is, no question about that. But I feel like some people act like Nvidia is god-like, and that's just not true; "AMD is worse" is the more accurate statement. That doesn't mean Nvidia is terrible. What Nvidia did is use the best TSMC node and a very small die compared to what's usual. That's how they get to this "very low" power usage (we're still talking about 220 watts, which is not very low). The architecture is good, as I said. They're doing fine, but they aren't gods, like some people think or constantly want others to think. That's all.
@@flimermithrandir True, some people worship Nvidia and completely swear by it, and I hate that. I personally have a 4060, but I don't go around telling people they shouldn't care about AMD when AMD clearly offers way better value across the board. What I feel AMD should be doing is not trying to catch up to Nvidia's features, but instead fixing gaming's other issues and building their own unique features as selling points.
If I were a sacred animal and were given both of those cards as an offering, I would choose the 4070S without a second thought and send my blessings to the leather jacket man!
200 Watts consumption of 4070 super vs 350 Watts of 3090 for similar raw performance or better performance with FG. Seems very straight forward easy decision to me.
12gb VRAM will be a handicap
@@TheAcadianGuy Not in 1440p, this card does not target 4K. But I agree 4070Ti super does feel like a better choice (close to a 4080 and same vram)
@@TheAcadianGuy Not very much.. FG is fantastic technology..
Please show me real-world usage of that extra 12GB of VRAM and a corresponding increase in performance?
It is sad to see you pay for an RTX 3090 or 4070 Super and get a barely playable experience at 1440p.
But that power draw is less than the 3090's, which is nice.
Absolutely right! The 4070S draws roughly 1.6x less power than the 3090 (about 220W vs 350W).
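For what it's worth, the arithmetic roughly checks out. A quick sketch using the typical gaming power draws quoted throughout this thread (illustrative figures, not spec-sheet TGP):

```python
# Typical gaming board power quoted in the thread (illustrative, not measured).
p_3090 = 350   # watts
p_4070s = 220  # watts

ratio = p_3090 / p_4070s
savings = p_3090 - p_4070s
print(f"The 3090 draws {ratio:.2f}x the power of the 4070S ({savings} W more).")
# At similar average fps, performance-per-watt differs by that same ~1.6x factor.
```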
So solid 3080 TI levels of performance and VRAM for half the price at 60 percent the power?!?!?!?!, yet somehow this product is bad according to AMD fanboys
Doh, the 3080 Ti was crypto grift at nearly double the 3080's MSRP for 10% more performance. Yeah, 12GB at $600 is shit in 2024.
@@Vextime77 AMD fanboy cope. The 4070 Super is an excellent value. Wayyyyy better than anything AMD offers.
12gb is Bad
@@Wobbothe3rd 12gb is stupidity
I understand the call for 16GB of VRAM, but clearly the card gets outdated before the amount of VRAM does. Look at all those 24GBs on the 3090, pointless.
Gotta wait for the PS6 for the 12GB to be outdated. Developers aim for that limitation because of the consoles.
Cards don't feel like their cost... still rocking a 1080ti Strix.
NVidia lied - shocked Pikachu face.
But it does beat it in 1080/1440 in some of these games. So partial truth?
Did you idiots even watch the video!? Nvidia told the truth.
I'll probably never use DLSS. I'd rather have raw performance or turn down the resolution.
The Alan Wake 2 4K Native example had both GPUs in the 30s and the noteworthy 16% win for the 3090 there slides into obscurity as people would turn on DLSS Quality at that point.
Despite some saying there are no "4k" or "1440p" cards, I think this comparison shows what people mean: Where a "1440p" and a "4K" GPU have similar average performance, they will take the lead over each other on their respective resolutions. E.g. if you play on 4K native use the 3090 (assuming both cards cost the same, which they don't).
The 3090 used is $100-150 extra over the 4070 Super new. Imo, the extra 12GB of vram makes it worth it.
@@filip9587 LOL keep thinking VRAM is that important.
@@amerrill351 Well, 12GB wasn't important 1½ years ago, but now it's the minimum. Sooo ¯\_(ツ)_/¯
@@amerrill351 compare the 3070 to the 6800 non-XT at 1440p lol, you will see why VRAM is important lmao
I don’t understand why people care so much about frame rate in single-player games. FPS only matters in competitive play, shooters for example. Other than that, as long as the game runs smoothly you're fine.
I'm so glad I managed to snag a 3090 Ti for 650 euros last year.
Many games use 15-17+gb of VRAM on my 3090.
A 4070s seem to be a 3090 with half the VRAM
What games?
Allocation is not the same as utilisation. There is literally no game out today that actually uses that much VRAM on any setting.
Happy user of the 4070 Super, even if it's 12GB instead of 16GB (not to mention 24GB...). Just play on high settings. Ultra always sucked, on any card (from the GTX 8800 to the GTX 285 to the GTX 980 Ti and 1080 Ti, etc.). Just don't go ultra.
Thank you. I have a 3090 and I never see anyone compare it to any new cards. I was wondering if I should wait or upgrade. I'll wait 🔥🔥
If you buy a flagship card, you can be assured that you can skip a generation
Brother you are good for at least a few generations.
4070 isn’t a good upgrade from 3090. If you’re gonna upgrade sell the 3090 and buy a 4080 super
The 3090 was only 10% faster than a 3080 and was stupidly priced, like all of the Ada cards.
If you watch carefully... you can see in Avatar the 3090's footage looks smoother compared to the 4070, and this holds at all resolutions. The 4070's looks a little laggy.
Does no one spend 1 minute to watch the warning at the start of the video??
Are you out of your mind? Who the hell plays games at 30-40 fps? Cyberpunk at 50fps? Why don't you show real-world use cases? Tweak the settings so they're suitable for the 120Hz/165Hz monitors most people have. The data in this video is 90% irrelevant. 34fps vs 31fps on graphics settings that no one would seriously play with, for real?
Tbh the entire 40 series lineup is a slap on the consumers face.
Wow, I guess 11% of all PC Gamers love getting slapped in the face. COPE.
Not paying $600 for 12GB of VRAM.... If I'm playing at 1440p there will be games where the visual fidelity suffers. Even if the texture packs are well designed and don't tank the card's performance, something has to give, and that's visual fidelity. $600 for 12GB of VRAM is insulting.
Thank you daniel for your hard work ! Always super informative and interesting videos !! Cheers (from italy)
Why does the RTX 4070 Super footage look so laggy, like 30 fps instead of 60 fps as seen in the RTX 3090 footage?
You noticed this too?
I have two theories, shader compilation or the 192-bit bus is choking it.
It can be as simple as him recording this footage with a frame rate setting lower than 60fps (e.g., 30 fps), while the other footage is not. @@BBWahoo
It could be attributed to the 12 GB maximum video memory on the RTX 4070, in contrast to the 24 GB on the RTX 3090. The latter has more RAM chips, contributing to increased stability.@@BBWahoo
Waiting for the 5090 before upgrading from my 3090Ti
What is it you need, 4k monitor, 3090 is too slow ?
I'm just an enthusiast and like to build a new machine every-other generation. When the 5090 comes out it'll be time for a new build. I'll give my current computer 5950X/3090Ti to one of my grandkids.@@lucasrem
@@lucasrem the 3090 is too slow for 4K
yes, 5090 & 4k@@lucasrem
Something that stands out a lot: the 4070 Super is running at only 60-70% usage while the 3090 is at 100%. The 4070 is significantly more efficient.
Recheck. Misplaced temps and %usage numbers.
4070 super good! 3090 bad! End of story!
Relative performance (4070 SUPER = 100%):
Radeon RX 7800 XT: 93%
Radeon RX 6900 XT: 95%
GeForce RTX 4070 SUPER: 100%
GeForce RTX 3080 Ti: 103%
Radeon RX 6950 XT: 103%
GeForce RTX 3090: 105%
My god, both of these cards in the mid-30s fps at 1080p in some titles, and 25fps for a 3090 in Alan Wake 2 with those ray/path tracing settings. That's terrible; so many settings have to be turned down, or frame gen used, to get anything decent. Wowzers.
The 3090 was like a $1,400+ card just 2 years ago, if available at all, and some were near $1,800 or more due to stock issues. Only being able to play some games at the best 4K settings at 20-30 fps is kinda nuts, and with that high power draw. While it's good to see efficiency and 4070-class performance improve, games are still outstripping cards' abilities unless you have a bottomless pit of money to spend.
I'm not sure, but I suspect many Nvidia buyers feel value has gone to hell: the 30 series was a stock price-gouging sensation, the 40 series had the VRAM situation and the dual-4080 launch walk-back, and price-for-value was broken at launch.
With Nvidia's mistakes, customers from only the last 6 months, the early buyers of the first 40 series cards, had their investments trashed.
I dunno, what a mess. I would think trust in their business practices and model is eroding fast. I'm sure price fixing, drip-feeding tech improvements, and double dipping are all going on; the value and availability of what's released seems too manipulative. But AMD isn't much better, with matching pricing and similar issues.
😂
Still finding it all crazily fascinating from the sidelines atm
All we can say is holy hell what a tour de force in tests of games, cards, cards v comparison Daniel is doing this month! ❤❤
I paid over $2000 for my 3090 at Microcenter in July 2021.
@@brianrobinson3961 👀👀
Just the natural order of things. There's always something better and faster around the corner. Sure you can get 15-20% better performance for the same money now but I've had my 4070 for the last 9 months. You will never buy anything if you worry about this shit. In another 12-16 months the 5070 will be out and be another 30% faster than the 4070S.
Technology evolves. Of course newer cards will run faster and older cards will depreciate in value.
Same for TV, cars etc. You certainly didn't expect tech to stagnate did you?
@@ehenningsen Nope, I know that of course; we all love tech and tech improvements (it doesn't stand still, and we wouldn't want it to), so I'm not against advancement! It's more the point about business pricing, supply, and release scheduling versus customer trust/value, and what I often suspect is purposeful price fixing and deliberately drip-feeding small tech improvements in recent cycles. As long as we keep buying, they keep doing it. I feel many have lost trust and future consumer confidence. You must see what they're doing: release something mediocre for a high price with low stock, while already having a better version ready at the same time, then release that 4-6 months later for double dipping, etc.
So maybe I didn't make my point well enough: it's the business practices and the erosion of customer trust that I'm worried about and interested to see how it pans out.
Business practices have changed over the last 5 years; you cannot really say consumers feel the 30 and 40 series releases have gone well or been good value. Anyhow, just my rambling 😂
I think the VRAM is going to be an issue within the next year or so. Beyond that, Nvidia blocking the 30 series from frame gen means these cards are money grabs, and you should go with AMD until Nvidia stops artificially segmenting the same level of performance to create value.
The performance is impressive for the price against what the 3090 retailed at, and even at its sale price at the 40 series launch; what is more impressive is the power draw comparison. I ran an EVGA 3080 Ti hybrid that pulled way more power than the 4070 Super and, based on this comparison, I assume would have lost an FPS comparison. I'm not saying Nvidia's pricing model is where it should be (upgrading to an MSI 4090 hybrid cost an arm and a leg), but these Supers seem like decent cards, and upgrading to one may not mean buying a new PSU to run it.
It's impressive only because the 3090 was so bad in terms of price/performance that even the 4070 Super, itself still a really bad value, seems "so much better." It isn't. They should cost 100 to 150 euros/USD less to be just "good" value.
I miss the golden age of the rx580, gtx1070 and 5700xt
I see one problem with this test: my RTX 3090 FTW3 draws 450W and holds clocks above 2000MHz, and that makes a big difference.
Okay, if you look at both images, yes, both have the same fps, but if you look closely the left side is running more stuttery for some reason. The big giveaway was the animals moving: the right side has a much more fluid feel in playback, even in the gunfire.
Yes, I agree. I saw more stutter on the 4070S. The extra cores in the 3090 take the win 🥇
@saintcastle It's the 192-bit bus on the 4070 Super. In other words: you wouldn't put a 1600hp engine in a Reliant Robin that has 3 wheels, keeps falling over, and has no all-wheel drive; it'd be stuck doing burnouts, squeezing the life out of the engine and getting nowhere. Think of a tap: turned down to a drip (the 4070 Super's 192-bit), then opened up some more but not to max (256-bit). 384-bit and up is considered better for higher-end cards. It's basically a bottleneck: you can't push any more data through the pipeline, and that ends up causing stutters.
Which one would you get if you had the money? I have a 3070 now and was thinking about getting a 40 series, but this video made me question it. So more cores isn't always better?
@saintcastle just wait until the 50 series
@@NrGSourcEsurfeR ok, will do. The 30 series card is working well, so I should be ok until the 50 series comes out.
Why not try Lossless Scaling frame generation on these? That can give up to 4x the performance. Nice piece of software, a big plus for older games locked at 60fps 😉
I've loved my 2080 Super up to this point. I think the Super cards offer peak performance to dollar ratio for their respective generation, and look forward to upgrading to a 4080 Super
That will be a great upgrade. Hope you get one at msrp man, good luck.
Ti cards are best in their generation
@@WololoWololo2 You get diminishing returns when talking on a dollar to performance ratio, typically paying 50% more for 10-20% extra performance out of the Ti card. So yeah, the Ti cards offer the best outright performance, but my comment specifically says, Super cards offer the best performance to dollar ratio, matching the launch prices of their non Super counterpart, while being just shy of the Ti performance
@@ZZPxFTW You can use Ti cards way longer. RTX 3090 Ti owners won't need to change their hardware for at least 4 years because of how powerful they are. More expensive, but more life in it. You won't find anything like it: best-in-class hardware from years ago can still do amazing things better than cheap hardware.
@@WololoWololo2 The last sentence isn't even coherent, so I have no idea what point you're trying to make. Your first point is already addressed in my first reply, and what you're saying makes no sense. The 30 series didn't have a Super, but let's use your example anyway: the 3090 Ti is only 12% more powerful than a 3090, so it's not "THAT much more powerful" as you claim, and 12% more performance does NOT equal an extra 4 years, lmao. My 2080 Super is actually 18% less powerful than a 2080 Ti, a bigger gap than the 3090's, but even then the 2080 Ti goes out of date around the same time as my card, given a 4070 Ti is 56% faster than a 2080 Ti and 82% faster than the 2080 Super. So instead of 100 fps while gaming, I'd be getting 118 fps with the Ti... where's this mega powerful Ti performance and lifespan you're talking about? It doesn't exist. Both users would find value in upgrading around the same time. All this without reiterating that Ti cards cost more than 50% more than their Super counterparts. If your philosophy is "get ripped off for an extra 12-18% fps," then yeah, get the Ti. The rest of us are smarter than that. And don't call my card cheap hardware; $700 for a GPU is not cheap.
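The diminishing-returns math in the comment above can be sketched in a few lines (the 18% performance gap, the ~50% price premium, and the $700 price are the commenter's figures, not measured data):

```python
def perf_per_dollar(relative_perf: float, price: float) -> float:
    """Value metric: relative performance per dollar of purchase price."""
    return relative_perf / price

# Figures from the comment: 2080 Super at $700 as the 1.00 baseline,
# 2080 Ti ~18% faster at roughly a 50% price premium.
super_value = perf_per_dollar(1.00, 700)
ti_value = perf_per_dollar(1.18, 700 * 1.5)

print(f"2080 Super: {super_value:.5f} perf/$")
print(f"2080 Ti:    {ti_value:.5f} perf/$")
print(f"The Ti delivers about {ti_value / super_value:.0%} of the Super's value per dollar.")
```

On these numbers the Ti buyer pays roughly 27% more per unit of performance, which is the diminishing-returns argument in a nutshell.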
I still don't understand not counting FG/DLSS as "performance" when the cards literally add dedicated cores to do it, adding calculation performance. It's only different now because it's handled by AI, which is just another calculation method that's now proven more efficient than traditional rasterization. We all know AI is the future anyway, so why are we still disregarding it?
I don't know what everyone is bitching about with the 4070 Super. It's a great value for the performance. The 3090 had an MSRP of $1,500, and of course scalping made it even more ridiculous. Granted, that was 2020, but at least now scalpers don't want anything to do with it, and I get a card at MSRP that lets me play 1440p at high frame rates, especially with DLSS and frame gen. Also, this card will last maybe 1 or 2 years, then I can get a new $600 card that keeps up, and I'm only out $1,200 over 4 years instead of double that or more.
In the UK you can get a second hand 3090 for £750. So you get a card that's on par with 4070S for around the same price but with double the VRAM.
As someone who plays games and uses AI, I went with the 3090 and it's been fantastic.
Same here. I couldn't justify buying a 4090 this late in its life cycle at full price, but I needed the VRAM for AI workloads, so I went with an Asus TUF 3090 for 750 USD.
The 3090 is a large and power-hungry card. You need a case and PSU to match. I could have bought a used 3090 for the same price I paid for my 4070, but I would have had to buy a new case and PSU, so I didn't bother.
The worst part is that the 3090's and 3080's potential has never been truly tapped. We just push an insanely expensive new series and abandon the old one before its potential has been even marginally realized.
You don't get FG, which I think makes the 4070 worth it
@@MrDarcykampe It can easily be undervolted to perform similarly at 250W power draw.
This video shows it all: all these high-end, super expensive RTX 4090s sold for $2,000 will lose to $600 70-class GPUs when the next gen comes out.
Same with the morons who bought the RTX 2080 Ti for $1,200, then the RTX 3070 beat it for $500.
Same thing happened here: the RTX 3090 sold for almost $3,000, and now it's beaten by a 70-class GPU again, for $600.
People should know by now that all high-end becomes mid-range in a few years. Why spend that much money? Stupid fanboys can't stop the itch, making Jensen a billionaire.
Side conclusion: the cards perform almost identically at 1440p without DLSS and at 4K with DLSS. Considering how good DLSS is and how cheap 4K monitors are becoming, I would pick a 4K screen, especially since future GPUs will only get faster and most of us don't buy a new monitor every 2 years or so.
I've been telling people this, but then I'm called dumb lmao.
I bought a 4K 144Hz monitor, and I play single-player games at 4K / 4K with DLSS Quality, and FPS games at native 1080p at 200+ fps.
A 4K monitor can display 1080p waaaaaaaaay better than a 1440p monitor because 4K divides evenly into 1080p and 1440p doesn't. 1440p should be skipped, because 4K with DLSS Quality will look better than any native 1440p monitor while requiring nearly the same GPU performance.
@@XxAtomic646xX based
Could the 3090 get a new lease of life with FSR3? What would a comparison between the 4070 Super and the 3090 with FSR3 look like?
All the numbers mean nothing except the 1% lows. Just saying.
I don't know about you, but I find it baffling that some games can't even reach the 100 fps mark with these graphics cards at 1080p; it's not like they're weak. I swear optimization is a dumpster fire in newer games. Companies are like "enable DLSS and frame generation 🤓". Hell yeah, 66% resolution of 1080p, now I can really start counting pixels.
Damn that 3090 uses a lot of power just to match the 4070. Pretty insane how efficient 40 series is
It's what we should have gotten from the 40 series: the 70-class cards perform around the 90-class of the previous gen.
that is what we got? @@clownavenger0
4070 Super not just "4070".
You're comparing an x70-class GPU with an x90; of course the x70 is more power efficient, lol... The GTX 1070 was also incredible compared to the 980 Ti: same performance, less power consumption... and guess what? The 5070 will also perform between the 4080 and 4090 while consuming less...
@@SubzeroBlack68 yes, same price, it replaced the 4070
Much smoother experience on the 3090; the 4070 Super is stuttering like crazy.
Basically if you have a 3090, then wait for the 5000 or 6000 series...
I have RTX 4090 skipped 30 series due to miners. I will wait for 6000 series.
The only thing I don't like about my RTX 3090 is how much power it draws: 420W+ for the GPU alone! It's a nice space heater during winter, but in spring it's like a sauna in my room. 😅