They did the same when they announced the 4000 series at CES a few years ago using dlss with frame gen only available on the 4000 series vs. the 3000 series. Admittedly, there was a decent bump in straight raster performance, tho.
The brain has its own DLSS "software," so to speak, that lets us see a clean, uninterrupted picture. There are physical limits on the hardware processing power you can dedicate to vision or rendering. I'm not excusing Nvidia in any way, but they should be more honest with their claims and prices.
Fake frames should NEVER be used to compare performance. They should be shown as a separate feature that makes the new generation a better option for those who want it. Performance should only be compared at native. Taking DLSS upscaling into account is okay as long as it's shown separately. This new generation is a scam; just use FSR and FSR frame gen. Literally the same thing, but free.
I just used some sort of Intel frame gen on my GTX 1050 to get 45-60 fps in Cyberpunk 2077, so according to Nvidia's logic, my 1050 can run modern games at 60fps. Realistically the 1050 only gets 15-25 fps on low settings at 1600x900.
Yep, some people said that :D and it makes total sense. But people also forget that those prices are without any taxes etc. €1,250 for a 5080? And that's without any AIB slapping their €300 cooler on it. The 5080 will be closer to €1,500-1,700 here in Europe, especially when it launches.
@@Bastyyyyyy This is a problem for anyone not living in the US, but those costs come from outside factors not related to Nvidia. You want to blame Nvidia for them?
This is exactly why I was 100% fine buying a high-end Zotac open-box 4080. Original MSRP was $1,400, paid 900. Thank u and it means no new purchase $$ to NGREEDia 💰💰
Yeah GDDR7 is worse than GDDR6X isn't it? This happens with many generations and the new cards are still faster, those specs don't mean anything when comparing different generations.
@@DrizzleT1 That's not really true. GDDR7 is faster than GDDR6 and 6X: specifically, it gets about 32 Gbps per pin where 6X got 21 Gbps per pin. It's more power efficient as well. 6X was only faster than 6 because it used more power, so 7 is better in both respects.

There are a lot more specifics about RAM and VRAM these days that really don't apply to the end user, things like error rates, error checking, etc. Some of these were issues in the past and have come up again with newer RAM/VRAM versions, but by the time the standards are set, the issues are largely resolved, so they're of no concern to us, the consumer. For instance, ECC was never a thing for consumer RAM chips in the past, but with DDR5 some error checking is required, since it's so fast that errors happen naturally; and because it is so fast, even with the included (sort-of) ECC, DDR5 ends up faster than DDR4. I believe something similar might be going on with GDDR7 vs GDDR6/6X, but I can't confirm, and it wouldn't matter anyway: GDDR7 is faster per pin, per module, per GB, and per watt. It's better in every measure.
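Those per-pin figures translate directly into per-module bandwidth. A quick sketch, assuming the 32 Gbps (GDDR7) and 21 Gbps (GDDR6X) per-pin rates quoted above and the standard 32-bit (x32) interface per GDDR memory module:

```python
# Per-module bandwidth from per-pin data rate, assuming an x32 module.
PINS_PER_MODULE = 32

def module_bandwidth_gbs(gbps_per_pin: float) -> float:
    """Bandwidth of one GDDR module in GB/s (8 bits per byte)."""
    return gbps_per_pin * PINS_PER_MODULE / 8

gddr6x = module_bandwidth_gbs(21)  # 84 GB/s per module
gddr7 = module_bandwidth_gbs(32)   # 128 GB/s per module
print(gddr6x, gddr7, gddr7 / gddr6x)  # GDDR7 is ~1.52x per module
```

Same math scales up to a full bus: multiply by the number of modules (bus width / 32).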
They talk about ray tracing and AI, only to deliver GPUs with low VRAM capacity, even though Nvidia knows these two features need tons of VRAM.
@@yanis_s500 The 4090 cost $1,600 at release; the price has been $2,500-2,700 for a while. The price went up by $900-1,100. How can you justify that? You can't. The x090 is focused on gamers, not creators; it's an enthusiast video game GPU, made for 1440p-4K and high fps. For creators there are dedicated workstation GPUs, but they cost like $10,000 and up, so people buy the 4090 because it's cheaper. Do you see AMD's Epyc CPUs used by gamers? No, it's a workstation CPU. There are GPUs dedicated to workstations.
See, most people don't want to hear this, but the 50-series has custom silicon specifically designed for this workload. It's simply not feasible to integrate MFG with the 40-series, as response time would go down the shitter.
@@nickwong2525 I expect Jensen to come out one time with his arms straight out, in every jacket he's added to his collection and worn per presentation. YOULIKEMYJACKETS??
NVIDIA doesn't care about gaming at all, we already knew that. They've focused all their updates on boosting AI performance. I think DLSS 4 was created just to showcase their capabilities with generative AI models.
Moore's law isn't what it used to be; it's harder and harder to get those performance leaps. That's why AI is the focus now, because that's something that is getting a lot better.
@@dante19890 Moore's law has been thoroughly broken for 10+ years now if you think about transistors per chip the old way (we used to think transistors per CPU core back in the old days of one core per chip). Parallelism has been one of the few things keeping performance increasing. Process nodes aren't making chips much denser anymore, chips can't get much bigger, clock speeds stalled years and years ago, and so on.
It's not just third world countries; it will probably cost the same in most of Europe too. It's really only the US and maybe Canada that get these cheap-ish prices.
Wait, or look around more; there's a chance local retail stores will have something like an Arc B580 for a fair $290 with tax. Edit: I mean local stores similar to the likes of EasyPC; it could take weeks or a month.
In 2022, Nvidia also claimed that the RTX 4080 would be 2x to 4x faster than the RTX 3080 Ti. After testing, it turned out to be only 33% more powerful... BS marketing and fake laggy frames, no info regarding real performance uplift.
4:48 I really don't understand how you would even spend 10k on a PC unless you were filling it up with stupid amounts of RAM and storage and an insane custom water loop.
$10k on an ultra entertainment command center for PC gaming? He's smoking, but there are ways to spend that much on PC gaming: sim racing, flight sim, triple-screen OLED, high-end VR rigs, etc. To play Indiana Jones? Or Cyberpunk? On a single screen at 4K 240? Yeah, no need to even break the halfway mark.
I got close... Threadripper with 128gb ram and 8TB m.2 storage plus an RTX 3070 that I will upgrade to the 5090 as soon as I have seen some reviews. I use it for work and for gaming.
Yeah? And fuel and food used to be cheaper too. What else are you going to moan about? Graphics cards are non-essential luxury items. You don't like the price? Don't buy 'em.
Is it over for AMD? Will Jensen dominate Lisa Su? Who'll be the jobber of this generation? Does Trump being a literal WWE hall of famer have anything to do with this? Find out on the next exciting episode of Dragon Ball Z!
AMD can't win in GPUs. Nvidia has brand loyalty. The 4070 at $600+ outsold the 7600 at $300. How did this happen? Brand loyalty. AMD beating Nvidia is like thinking the iPhone will lose in phones. Ain't gonna happen = brand loyalty.
When we voted for that $1,400 5080 MSRP, we were expecting it to be 4090-level performance. If it's not 4090-level, if it just beats the 4080 by 5%, then... yeah, $1,000 sounds reasonable. They moved down the stack again: what was supposed to be a 5070 Ti for $800 with 4080 performance, you now pay $1,000 for and it's called a 5080. And so on and so forth.
You can look at Far Cry 6 and A Plague Tale: Requiem; it's like a 30% lead with regular DLSS/some ray tracing compared to the 4080, which would make it comparable to the 4090. Not bad for the same price as the 4080 Super right now.
@@Chris67-p9v It's just RayTracing performance. Not Raster performance. And yes. It has better RT performance. What about games that don't have/use RayTracing? What if the game I'm playing RayTracing, but I don't my framerate to dip at all, therefore, I'm not turning it on? +5% Rasterized That's how we measure actual performance. Until every game out there will have RayTracing that you can't turn off, this will be the way we judge Performance.
@@gruiadevil even hardware unboxed is expecting at least 20% increase gen on gen(just raster). Which with the memory speed increase with gddr7 and increased bandwidth, as well as the admittedly lackluster core count increase and architectural improvements, is perfectly reasonable. Some new games do have mandatory rt btw, and far cry 6 as well as plague tale Requiem aren't doing full rt.
@@Chris67-p9v
1. They are expecting 20%. Doesn't mean they'll actually get it. Unfortunately we're only going to get that at the 90-class level; I'm expecting 30% for the 5090 vs 4090, judging by the specs alone. Anything else is looking ridiculous. They are heavily relying on fake frames to market as "performance uplift".
2. Check the boost clocks. Most of these cards have significantly reduced boost clocks to fit into the power envelope, and the core counts don't look much improved.
3. GDDR7 speeds will help in games where memory speed matters. For instance, A Plague Tale: Requiem is famous for favoring GPUs with faster memory; no wonder they showcased it.
4. "Some" new games have mandatory RT, correct. In that specific handful of games, having better RT is nice. In the hundreds of thousands of other games on the market it won't matter. I repeat: until ALL the games on the market have mandatory RT, testing ONLY with RT on is deceiving the customer.
Frame generation is where I draw the line. Upscaling is fine by me, as DLSS produces a really good quality image at 4K and you can hardly tell the difference at a sitting distance, especially when playing on a TV (couch gaming with your PC, anyone?). But I absolutely refuse to use fake frames, as fake frames don't equal performance or lower latency, nor do they really look as good as the real deal. And if you're into playing older titles at crazy high resolutions such as 8K, without those games having DLSS, raw performance is what you need. Aside from the RTX 5090, the rest of the models are a disappointment in that regard.
Yeah, why don't they put their R&D into making DLSS work in older titles, or all titles, or create a branch of DLSS that does? It seems AMD is doing that, and that makes me pretty jealous of them, even if their implementations aren't as good.
@@CockpitCrusherdude Its literally cheaper for them to just double vram and charge us an extra 50 bucks Will they do it? Nope. Because that doesn't help them strengthen their monopoly.
@@FLMKane My point is that if their form of DLSS doesn't work on older titles and newer titles that weren't tailored to have it, but AMD's AFMF does, then they won't have a true monopoly. Because there are games still coming out now that don't support DLSS, so that is a huge library of games without any support for the feature.
I just bought a 4070 Ti Super and didn't know about the 50 series release date, and now I regret it. Or should I? Because the multi frame generation feature is basically Lossless Scaling.
@@Retrogamer-wmh Ah... shoulda waited. I'm glad I did. Paying barely $150 more for current gen and high native fps? Hell yeah. We'll see real test results soon; I'll buy in three months' time.
People are so laser focused on how much they hate frame gen that they're missing a respectable 25% uplift when it's turned off. Considering it comes with a price drop aside from the 90 that's better than anything we got with 40 series. I'm cautiously optimistic, but if Reflex 2 works half as well as they claim maybe you finally _will_ want to enable frame gen.
Is no one going to mention how a new 70 series card only has 12 GB of VRAM for the 2nd time in a row now? It's ridiculous. Edit: And the base RX 9070 is going to have 16GB GDDR6/GDDR7 lmao
12GB of VRAM for a 70-class card isn't the worst thing. Back in the day we would usually get a quarter less VRAM than the 80 series, and sometimes slower VRAM. I would rather have VRAM that is the same speed as the 80's and less of it. If you want to game at 4K, pay 4K prices, unfortunately; but with how close people sit to their monitors it's unnecessary to go past 2K resolution anyway, which a 70-class card can easily run.
@@Tjtellsthetruth Factors that influence VRAM usage:
- Resolution: the higher the resolution (e.g., 1440p or 4K), the more VRAM the game uses, as more pixels need to be processed and stored in memory.
- Texture quality: setting texture quality to Ultra significantly increases VRAM usage, as high-quality textures require more memory.
- Ray tracing and other effects: enabling ray tracing, DLSS, and other advanced visual effects also increases VRAM usage, especially on high settings.
- Mods: any mods that increase graphical fidelity can also increase the amount of VRAM used.

The issue with not having enough VRAM is that you can reach a point where you're unable to play a game smoothly at any settings, especially in big AAA single-player games with big maps, like all the AAA RPGs. So in practical terms the amount of VRAM gatekeeps you from playing newer AAA titles, especially the open-world ones.
I think the reason we aren't seeing non-DLSS/ray tracing/frame generation performance is that their hardware is getting to the point where the only improvements can come from DLSS, ray tracing, and frame generation.
Also got to keep an eye out for future Ti and Super cards, like a 5080 Super, 5070 Ti Super, and 5070 Super, if they follow what they did for the 40 series 🤔
The price-to-bandwidth of the 5070 is about 1.224 GB/s per dollar (672 GB/s / $549), the 5070 Ti is 1.196 (896 / $749), the 5080 is 0.961 (960 / $999), and the 5090 is 0.896 (1792 / $1999) at MSRP, so the 5070 and 5070 Ti are better price-to-performance in terms of bandwidth.
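The ratios above can be reproduced in a few lines, assuming the announced MSRPs and the commonly reported total memory bandwidths for each card (bus width × per-pin rate / 8):

```python
# MSRP in dollars, total memory bandwidth in GB/s.
cards = {
    "5070":    (549, 672),    # 192-bit GDDR7 @ 28 Gbps
    "5070 Ti": (749, 896),    # 256-bit GDDR7 @ 28 Gbps
    "5080":    (999, 960),    # 256-bit GDDR7 @ 30 Gbps
    "5090":    (1999, 1792),  # 512-bit GDDR7 @ 28 Gbps
}
for name, (price, bw) in cards.items():
    # GB/s of bandwidth per dollar of MSRP.
    print(f"{name}: {bw / price:.3f} GB/s per dollar")
```

The 5070 and 5070 Ti come out on top, matching the comment's conclusion.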
Wish they'd stop focusing on that garbage, including raw compute, and just focus on power efficiency, reliability, cooling, and the highest VRAM in that price range. The 8GB and 12GB mindset needs to stop, ffs.
@@three_wood_sen Nvidia sheep can't cope with AMD always being the smarter price/performance choice 😂 They spend so much money just to see other guys getting the same performance for way less 😂😂
That's true, but relative to the raster performance of other cards on the market, the raster will be worth it if you're upgrading from a 30 series or below. Otherwise stick with what you have; it's honestly not worth upgrading for.
I also suspect Nvidia is pricing the GPUs a bit cheaper to gain some social credit among the gamer community, but they're going to raise prices as soon as Trump's tariffs kick in and take the opportunity to blame the new presidency for the increase. Which I'm sure is partially true, but I bet you anything Nvidia will raise prices by more than the tariffs cost, seizing the opportunity to charge what they originally wanted to charge for the new GPUs.
This always happens when you are in a monopoly. Intel did it, GM did it and now NVIDIA. Honestly we need another GPU maker shake up. I am hoping for an entry from one of the big electronic players.
This is exactly what Threat Interactive was pointing out in their recent video about the video game industry now no longer doing proper optimization. The devs are relying more and more on blurry "topping effects" for games instead of actually using optimization techniques, because it fosters an economy in the gaming industry where people are forced to buy the next new GPU that only really gets to tout "amazing frame rates" while using said "topping effects". The hardware is starting to shift prioritization towards enabling these blurry *ss excuses instead of actually bringing out innovation in raw performance and efficiency. Meanwhile the prices just keep ramping up to oblivion.
Getting GPUs at their MSRP is another delusion for me. As someone who bought an RTX 4060 Ti for $455 even after waiting 7 months for a price drop, I can confirm it
I bought the 4080 Super online at MSRP on the first try around launch. My friend also did, at Best Buy, in store, first try. I'm not saying that's the case for everyone all the time, but to call it a "delusion" is... delusional.
Yeah, that's true. Then you have the few unicorns that pop in here saying they scored it at MSRP. Admittedly, I'm one of them. I scored the 4090 at MSRP tax free. Buuut, I don't expect lightning to strike twice when it comes to the 5090. My nephew is trying to buy one, so maybe it can happen. I guess we'll see once the trackers are spun up and the wait begins. 😅
Don't forget there's going to be a vast difference between the announced price and what you'll actually pay to get one when stock is near non-existent.
Honestly, they should stop releasing more powerful cards and just make old cards a bit more power efficient and with higher VRAM. Have that lineup be for normal gamers, while all this garbage is for elitists.
Because RTX cards are secretly weak GPUs, using ray/path tracing and DLSS as an excuse to raise prices for their marketing. AMD has had superior cards for years now, though not in 2017.
You were very effing lucky, dude. But yeah, the pricing is just ridiculous. I still feel kinda bad for buying a 4070 Super, 12GB of VRAM for €550, because my 2070 Super really ran out of steam. Like, yeah, I'm happy with the performance, but the pricing for a card with such poor specs is just laughably bad, and AMD is not much better; their prices also went up with Nvidia's.
Aren't those the prices announced by Nvidia for FE cards? Then you add brand markup (Asus, Gigabyte, etc.), store markup, and VAT, and you end up paying a fortune for a mid-range 12GB GPU.
Yeah, that is kinda wild. I think AMD is actually worse, because they price their much inferior cards close to the Nvidia ones when they should be a LOT cheaper.
I think the 5000 series being on basically the same process node as the 4000 series means they can only do so much without lowering efficiency. The biggest difference is definitely GDDR7, but it's not huge.
Putting the raw specs in front of me really put my eagerness in check. Every feature they announced today will be on other RTX cards aside from MFG which I can live without, I've never really used it before except in CP2077. In most cases I would only get an extra 20fps, like Alan Wake II only went from about 63 to 84 or something like that. I could also notice the input delay; it wasn't enough of a boost to justify the delay for me. The new FG updates might change this going forward as it's more efficient, but that'll largely be something I can test once the updates are out for 40 series.
There is no game on the market today that would warrant spending $2,000 on a 5090. I'd make the case that there isn't even a game that would warrant spending $1,500 on a 4090. For $2,000 you can build a whole PC yourself, and if you're willing to use used parts and scrounge around, you can get a better deal and have money to spare to buy and play the games you like.
I'm on a 3070, so I can't really skip the 50 series. That's 2 more years at least on an 8GB, 5-year-old card for someone who wants 1440p high settings at 80+ fps in modern games 😢
The discount is there to give the illusion of a fair price, when in reality, compared to the last two generations, it's still overpriced. It's done to get people used to insanely expensive mid-range GPUs that cost what high-end GPUs did years ago.
People are getting tired of the AI and upscaling methods. It makes the game look disgusting compared to native resolutions. I would rather play anything at native 1080p over any upscaling because of how much worse upscaling makes your games look. From someone who's always done native resolution, ai upscaling looks like trash.
Bus width! Memory cheapness, overpricing based on software like DLSS (2:03). I'm sticking to used cards and aiming for an upgrade from a 3080 to a 4070 Ti Super/4080 Super.
I'm new to PC gaming (I don't even have one), so I'm confused. What exactly is upscaling, and why is it such a big problem? Isn't better performance and graphics a good thing? What am I missing?
It's because game developers use upscaling as a way to avoid optimizations. Basically, it's a way to increase performance and developers see that and go 'Well, we can reach 60 FPS using this, good enough'.
Kinda reasonable? Those are prices before VAT or sales tax and from nvidia not their AIB partners. Living in a country with a 20% VAT where pretty much all cards will come from AIBs, no, those prices aren't reasonable.
I dont care about AI and i dont want fake frames. I bought a vega 64 for like 100 bucks a year ago and overclocked the snot out of it while undervolting it too. It runs fast and cool and all my games at 1080p run well over 60fps including space marine 2. Dont feel like upgrading kinda ever with these prices ngl.
Maybe the 7090 will be able to run Cyberpunk 2077 at 4K 60fps native with path tracing... maybe! Oh, and yeah, for a $3,000 MSRP, I'm sure. That means if you want an Asus ROG, for example, with third-party and country/region taxes it will be more like $3,500. Impressive.
Prices will be about what people predicted from OEMs; MSRP hasn't been the actual price for over 20 years. The BOM might be less just because it's physically a smaller PCB, so manufacturing efficiency may be passed on to the consumer for the lower-spec cards.
Great job, Vex. Good analysis. What times we live in, where you can get excited about a "good" RTX 5070 price, but then, after some thought, you see it's not actually that good in terms of performance. Seems a 5070 Super for $600 will be a good choice, unless it's only +5% more cores etc. compared to the 4070 Super.
AMD makes great GPUs, no doubt. It kind of sickens me that they keep copying Nvidia, because everything RTX brings makes absolutely no sense for gaming. Nvidia's rasterization is bloody weak; take away DLSS and there's no marketing trick left.
I just think it's ridiculous that all the qualifiers for GPU performance are tied to video games. There are performance-related tasks, such as editing, that cannot use AI to create better, fake performance. Native performance is what matters. Oh, also, thanks for not segueing into a sponsor like Linus does. The entire GPU community appreciates it.
Basically, the 5xxx series has more AI to deal with the shitty performance you got with the 4xxx series and ray tracing. So for single-player games that look fancy, you have the 5xxx series; for first-person shooters, I wouldn't bother changing if you're happy with your current performance.
I hope I'm not the only one who thinks the focus on using AI for more frames is meh; the damn artifacts with DLSS annoy the heck out of me. Also, it's kinda hilarious seeing a guy in a jacket expensive enough to easily cover a year of my salary, or more, talking about a $10k home multimedia command center. Like most gamers don't use the same monitor for 10+ years.
I still think those leaks are closer to the truth. MSRPs on Nvidia cards haven't been the price you actually pay for the past two generations. So the real price is MSRP + $200-500, because there are still going to be tariffs too. When for the longest time you could get a 7900 XTX for less than what a regular 4070 Ti cost at the time, it's hard to believe these Nvidia MSRPs mean anything.
If you are one of those gamers who hates the modern smeared look of games, the 50 series has little to offer. Almost all of its performance comes from DLSS 4: more frame generation, more frame smear, more TAA smear on upscaled image output. If you enjoy playing at native resolution and native FPS... do you even need to upgrade?

And I really hate the tolerance we have for Nvidia's frame smear technology that they call "frame generation". It's basically the same thing cheap Android TVs have been doing for like 15 years, and you always turn it off. People say you're only supposed to use frame gen if you already have 60 fps, but that's not what Nvidia is showing; they're showing it used to boost sub-30 performance.

And it doesn't even help with latency, because generated frames are not processed by the game. The game does not process input on generated frames, so the more generated frames you have, the worse latency you'll get.
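The latency point can be sketched with a toy model. These are my own illustrative assumptions, not Nvidia's published numbers: the game samples input only on real frames, and interpolation holds back roughly one real frame before inserting generated ones between it and the next:

```python
def perceived_metrics(base_fps: float, gen_factor: int):
    """Return (displayed fps, approximate input latency in ms)."""
    real_frame_ms = 1000 / base_fps
    # Each real frame is multiplied into gen_factor displayed frames.
    displayed_fps = base_fps * gen_factor
    # Input is sampled once per real frame, and interpolation delays
    # display by roughly one extra real frame: ~2 real frame times.
    latency_ms = real_frame_ms * 2
    return displayed_fps, latency_ms

print(perceived_metrics(30, 4))  # smooth-looking 120 fps, ~66.7 ms latency
print(perceived_metrics(60, 1))  # native 60 fps, ~33.3 ms latency
```

The takeaway of the model: the displayed framerate scales with the generation factor, but latency stays pinned to the (low) real framerate.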
Have GPUs actually gotten better? Like, how much real performance have we gained since the 1080 Ti? DLSS and frame gen are fake performance; they're inherently lossy. There isn't really a good way to bench performance, because all the game engines now are "optimized" for this fake, make-believe performance.
Nvidia allocated their resources to improving the AI hardware a bit and the AI software a lot more. Why do you think they locked DLSS 4 or DLSS 3 to specific card variants? They're selling bullshit now.
Nvidia did leave themselves room for a 5090 Ti/Super, as the 5090 isn't using the full Blackwell die. There's still room for more CUDA cores, AI cores, and ray tracing cores, up to the maximum Jensen said was possible for Blackwell.
Nah, off a glitch in the website I got my 4090 new for 1000, sold my old card and paid 400 total. None of this frame generation bs is going to sell me on the 50 series or the 60 series. I need to see pure, native frame rates that justify the upgrade. If they keep trying to use frame gen as a selling tool to compare to previous gens that's just dishonest.
@@ironhulk1979 That saying is always in favor of the one working smarter and doing less work, which in this case is Nvidia. This is bad for the consumer, not a win. So unless you work for nvidia or receive a portion of their profits, them working smarter and not harder is not good for you either.
I think you're the first channel to mention something which I've been thinking about since the announcement ... the graphic showing the GPU exploded with the PCB in the middle fails to show how the outputs and PCIE are connected! The board is either extended towards the output side and has a big hole in the middle of it or it needs to run a PCB or cables along the edges!
You get a like just for 7:36. Love these vids, and I know you're a gamer, but dive into AI with all your cards and I bet this channel will blow up. Go down the rabbit hole of ComfyUI.
Dude, $999 is the US price without taxes. The prices you saw from AUS are with tax; $1,350 will be the price for the card, and I believe it will be even more than that. So you were not wrong. What's even worse is that these cards aren't upgrades at all; they'll just have the DLSS 4 performance jump, not a real one. They're cheaper because they aren't a real upgrade, just a hardware fix to handle new software.
What's interesting about the 5080 is that with only a 5% increase in RT cores, they managed to get around a 30% increase in RT performance (Far Cry) without DLSS/FG. Personally, I'm gunning for the 5080 to replace my 4070 Ti.
Based on the 5080 to 4080, it's 'sort of' fairly priced. He never said THEY ALL were. He even mentioned 4080 at $1000 was still not fairly priced before, that's what he means. Based on last gen, it kind of is.
@@Daltonisntabot That’s bad logic because last gen prices were based on what scalpers were charging during the pandemic. Thus, the pricing for last gen GPUs were based on pure greed… and the same goes for the 50-series. There’s no way to validate how the prices from the 30 to 40 series doubled and once again the prices are going up. Don’t be fooled by the prices shown for the founder’s editions. NVIDIA is simply manipulating the media with how they announced the 50 series. The 5080 should probably be the 5070 and so forth down the line. For example, it’s ridiculous that the 4080 had 24gb of memory, but the 5080 only has 16gb. The point is some naive people are saying the 5080 isn’t so bad because it’s at least cheaper than the 4080. However, I’m sure they’ll launch a 5080 ti with 24gb memory (that should’ve been the 5080) and they’ll charge more for it than the 4080.
@@DamienGWilson I didn't say otherwise. I think the 40 and 50 series are overpriced. They undersell VRam and sell purely software, keeping it locked behind their cards. It's absurd.
They can make all the pricing/MSRP claims they want, but they also play with the supply. And don't forget that tariffs are coming, so these cards will go for way more than MSRP, in my opinion.
@@Daltonisntabot Yep, but I'm more financially successful, and he would get cooked alive in a roast battle or a live chat, being called out on his BS... so I'll call him what I want.
DLSS still can't escape the fact that FG adds latency, and you still need the base framerate to be high enough. So how is the 5070 going to beat the 4090 if it can't reach the base framerate? It's all BS.
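A back-of-envelope way to see the base-framerate problem, assuming (as this comment does) the "5070 = 4090" claim leans on 4x multi frame generation; the 120 fps figure below is purely hypothetical:

```python
def required_base_fps(target_fps: float, gen_factor: int) -> float:
    """Real, game-rendered fps needed to hit target_fps with frame gen."""
    return target_fps / gen_factor

# Hypothetical example: if a 4090 renders 120 fps natively, a 5070 with
# 4x MFG "matches" it while really rendering only 30 fps of game frames.
print(required_base_fps(120, 4))  # 30.0
```

If the card can't even hit that quarter-rate base framerate, the claimed parity falls apart; and even if it can, input response is tied to the 30 real frames, not the 120 displayed ones.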
They conveniently excluded the RTX 5060..... hmmmmmmmmmmm
It's because they did not have a model that could be competitively priced against the Intel Battlemage B580. The model they originally had planned for release got beat out so the revision on that model has been postponed until August 11th of this year. It's just competitors biting at their lower end models, nothing too serious. It's good to see competition.
To avoid backlash from 8gb vram?
A 4060 Ti rebranded with GDDR7 will be fire!!
🔥🔥🔥🔥
@@vextakes How about you make a video about how AMD isn't even showcasing their new RX 9070 XT, instead of yapping about AI the same as Nvidia? Your AMD fanboyism is showing.
@paincake2595 and x8 pcie again 🤣
People "upgrading" from a 4070 to a 5070 are paying $549 for a software upgrade.
They can sell the 4070, I guess, so it's probably more like $200. Not that that's much better, considering it's a new generation.
@@sravans149 yeah....similar to 4070 ti with less vram. When the biggest problem of 4070 was vram.
@paincake2595. 4070ti also 12gb
@paincake2595 The 4070 Ti only has 12 GB. The 4070 Ti Super is the only 16 GB one.
@paincake2595 How is the biggest problem for the 4070 (a 1440P card) VRAM?
Nvidia faking leaks to make too expensive cards look ok is crazy
Clear as day too. Wish more people pointed out this obvious ploy. On top of them overselling the shit out of this generation. They know how to play their customers that's for sure.
Consumers are dumb. A lot of redditors are saying they are priced correctly and want to buy them.
You mean like AMD faking leaks to make their "8800XT" look like it will match a 4080 for $599?
The raster uplift being about 15% at most already tells you enough. And they called native rendering "brute force rendering" as a cherry on top. Clowns
i think some ppl just hate nvidia so much that whatever they do, someone has to say something negative about it.
i dont want to defend nvidia honestly, but at least the price has adjusted a bit. and performance wise? we'll see about that claim in a few months.
if it is as good as they claim, i dont care even if it's fake frames or anything. as long as it works... then it works.
even tho: with more frames generated by AI, i can sense wider errors/inaccurate image representation coming. but i guess we'll see.
The people who think this is remotely close to the 2080 ti vs 3070 are fools
Those were the Days 😂😂😂😂.
Please tell me more how the 20 series is relevant to this 😂
yeah that was honestly crazy,doubt it’ll ever happen again though😅
Care to develop?
2080Ti vs 3070 was a foolish debate. Everybody bought into the 3070 hype, only to then find out it was an 8GB scam. The 2080Ti was overpriced for sure, but ain't no way it is worse than a 3070 in the long run
*Nvidia's upcoming launch lineup features*
RTX 4000: AI frame gen
RTX 5000: AI multi frame gen
RTX 6000: AI ultra frame gen
RTX 7000: Only one frame will be the game itself & rest is AI generated
RTX 8000: There is no actual rendering, the brand new RTX 8000 series will AI generate all of the frames for you
RTX 9000: AI will generate whatever game you want to play real time just like chat gpt
RTX 10,000: AI will generate android girlfriend to make sure gamers get laid
rtx 11000 make you AI
@DragonOfTheMortalKombat with RTX 11000 you will achieve CHIM
I dont know if anyone will get this reference
rtx 10,000 looks promising.
RTX ULTRA: There isn't a gpu anymore. It's only Cloud AI...
RTX 69,000 - Have sexy time in the Clouds 😋😛😛
Nvidia compared native to AI upscaling with framegen and then acted like the 50 series is some huge leap over the 40 series.
They did the same when they announced the 4000 series at CES a few years ago using dlss with frame gen only available on the 4000 series vs. the 3000 series. Admittedly, there was a decent bump in straight raster performance, tho.
The brain has its own DLSS "software," so to speak, that allows us to see a clean, uninterrupted picture.
There are physical limits with hardware processing power that you can dedicate to vision or rendering.
I am not excusing nvidia in any way but they should be more honest with claims and prices.
no they compared upscaled vs upscaled but the new upscaler doubles the frames from last gen.
Fake frames SHOULD NEVER be used to compare performance.
It should be shown as a separate feature that makes the new generation a better option for those who want it.
Performance should only be compared with native.
Taking DLSS upscaling into account is okay as long as it is shown separately.
This new generation is a scam, just use FSR and FSR frame gen.
Literally the same thing but free
FSR is copium
yeah 9070 is a scam 😂😂 and get a better job instead of complaining here
@@three_wood_sen its the truth
I just used some sort of intel frame gen on my gtx 1050 to get 45-60 fps on Cyberpunk 2077, so according to Nvidia, my 1050 can run modern games at 60fps. Realistically the 1050 only gets 15-25 fps on low settings at 1600x900.
@@three_wood_sen Better job? You're the one getting scammed buying a 12GB VRAM card for the same price as a 16GB one lol
so it works: make the 5080 rumours seem super high on price, then reveal it at 1000 and people think it's priced okay.
yep, some ppl said that :D and it make total sense.
but ppl also forget that those prices are without any taxes etc. 1250 for a 5080? and that is without any AIB slapping their 300€ cooler on it. the 5080 will be close to 1.5-1.7k here in europe, especially when it launches
@@Bastyyyyyy This is a problem for anyone not living in the US; your costs come from outside factors not related to Nvidia, but you want to blame Nvidia for them?
This is exactly why I was 100% fine buying a high-end Zotac open-box 4080. Original MSRP was $1,400, paid 900. Thank u and it means no new purchase $$ to NGREEDia 💰💰
yep. they make up the prices last second
@@Bastyyyyyy bro that is US specific problem, jensen is not trump
1999 msrp for the 5090 is 2500 in real life. Ridiculous😂😂
more like 3500 on ebay :)
It will equalise to 2k just like 3090 series
I'm going to have to really hover over that cart button.
Forget thinking about the 90 series like they used to be. The 90 series is a titan class
@@Aethelbeorn want it for msrp - this.
Holy, I didn't even notice! 5070 having 10% worse specs than a 4070 super is insane!
True.
Yeah GDDR7 is worse than GDDR6X isn't it? This happens with many generations and the new cards are still faster, those specs don't mean anything when comparing different generations.
Good, I was going to be pissed I spent $730 (800ish after tax) on a 4070 ti super just to see a $550 card beat it.
@@DrizzleT1 that's not really true. GDDR7 is faster than GDDR6 and 6X. specifically, it gets about 32Gbps per pin where 6X got 21Gbps per pin. It's more power efficient as well. 6X was only faster than 6 because it used more power, so 7 is better in both respects. There's a lot more specifics about RAM and VRAM these days, that really don't apply to the end user. Things like, error rates, error checking, etc, and some of these things were issues in the past, and have become issues with newer RAM/VRAM versions, but by time the standards are set, the issues are largely resolved such that it is of no concern to us, the consumer. For instance, ECC RAM was never a thing for consumer RAM chips in the past, but with DDR5, some error checking is required since it's so fast that errors happen naturally, but because it is so fast and with the included ECC (sort of), DDR5 ends up being faster than DDR4. I believe something similar might be going on with GDDR7 vs GDDR6/6X, but i cannot confirm, and it wouldn't matter anyway, GDDR7 is faster per pin, per module, per GB, and per watt. it's better in every measure.
@@CockpitCrusher It's not going to beat it in pure performance, don't worry. This is just Nvidia upselling with fake performance.
They talk about RayTracing and AI, only to deliver GPUs with low VRAM capacity, even though they (nVidia) know that these two features need tons of VRAM.
you think that the majority of people know about vram, etc? nope. you'll see when 5060 with 8gb outsells B580 12gb.
@@SapiaNt0mata ppl don't know, but Nvidia knows
They have technology in the 50 series cards that cuts down on the VRAM usage when using any of the AI features
the 5070 is borderline false advertising. I can't wait for the Gamers Nexus review; Steve Burke can be very savage.
At 4K resolution with full RT in Cyberpunk, the 4090 gets about 21 fps; the 5090, about 28 fps. To get more fps, the rest is AI and fake frames.
Did you not learn how percentages work in school? 21 to 28 is a 33% increase….
@@ryanvasei8412 I mean that is an increase true, but the 4090 was like 60% better than the 3090, so people are now expecting that every generation
@@ryanvasei8412 whatever makes u feel good about a garbage increase, friend. ur right, it's a 33% increase. 28 is still garbage fps
with full RT* thats important to mention
@M1szS Yeah, the massive increases come from AI and frame generation technologies now. They said the word AI a lot on stream.
The 4090 where I live in Europe is still 2.5-3k euros so I wouldn't be surprised if the 5090 will be around 3.5-4k eur... Which is just unreal
True! But it is a beast of a GPU, mostly focused on creators, not video games in my opinion. Therefore, I can justify the price.
In Sweden the starting price is 27990 kr
@@yanis_s500 clown
@Rzor101 holy shit!!!!. Danish here so it should be around the same..that's absolutely insane
@@yanis_s500 the 4090 cost $1,600 at release. the price has been $2,500-2,700 for a while; it went up by $900-1,100. how can you justify the price? you can't. the X090 is focused on gamers, not creators: it's an enthusiast video game GPU, made for 1440p-4K and high fps. for creators there are dedicated workstation GPUs, but they cost like $10,000 and above, so people buy the 4090 cause it's cheaper. do you see AMD's Epyc CPU used by gamers? no. it's a workstation CPU. there are GPUs dedicated to workstations.
5070 has less AI TOPS than 4090, so if that is the case, why can't 4090 have MFG?
Because nvidia wants you to buy more gpus and save money!
Simple, they want you to spend your money so that Jensen can have another leather jacket next time.
see, most people dont want to hear this, but the 50-series has custom silicon specifically designed for this workload. it's simply not feasible to integrate MFG with the 40-series, as response time would go down the shitter.
@@nickwong2525 it won't be leather next time it will be chinchilla fur
@@nickwong2525 I expect Jensen to come out one time with his arms straight out from every jacket he adds to his collection and wears per presentation.
YOULIKEMYJACKETS??
NVIDIA doesn't care about gaming at all; we already knew that. They've focused all their updates on boosting AI performance. I think DLSS 4 was created just to showcase their capabilities with generative AI models.
And gamers are still going to "upgrade" to another Nvidia GPU 😂
Moore's law isn't what it used to be; it's harder and harder to get those performance leaps. That's why AI is the focus now, cuz that's something that is getting a lot better.
@@dante19890 Moore's law has been thoroughly broken for 10+ years now if you think about transistors per chip in the old way (we used to think transistors per CPU core back in the old days of one core per chip). Parallelism has been one of the few things keeping performance increasing: process nodes aren't making chips much denser anymore, chips can't get much bigger, clock speeds stalled years and years ago, and so on.
@@mort996 better than trash amd 😂😂 which keeps downgrading, even intel is better than them now
@@three_wood_sen why are you so insecure lmao
Local retailers here in the Philippines already listed RTX 5070 on their website. And it costs $1000. I love this 3rd world country.
What the actual fuk😂
its not just third world countries; it will probably cost the same in most of europe too. its only really the US and maybe Canada that get these cheap-ish prices
wait or look around more; there's a chance local retail stores have something like an Arc B580 for a fair $290 with tax.
edit: i mean a local store similar to the likes of EasyPC; it could take weeks or a month.
The 5090 is 4k Australian, meaning it's 2.5k USD at release, priced $500 ABOVE msrp. I fucking hate this country
In 2022, Nvidia also claimed that the RTX 4080 would be 2x to 4x faster than the RTX 3080 Ti. After testing, it turned out to be only 33% more powerful....
Bs marketing and fake laggy frames, no info regarding real performance uplift.
4:48 I really don't understand how you would even spend 10k on a PC unless you were filling it up with stupid amounts of RAM and storage and an insane custom water loop.
10k on a ultra entertainment command center for PC gaming.
He is smoking, but there are ways to spend that much on PC gaming. Simracing, flight sim, triple screen oled, highend VR rigs, etc.
To play Indiana Jones? Or Cyberpunk? On a single screen 4K 240? Yeah, no need to break the half mark.
I got close... Threadripper with 128gb ram and 8TB m.2 storage plus an RTX 3070 that I will upgrade to the 5090 as soon as I have seen some reviews. I use it for work and for gaming.
Yeah, madness. I've a 4090, Ryzen 9 7950x, 64GB RAM and it cost £2800. I'm unsure how you could spend 10K without being completely ripped off.
you said it yourself: custom water loop. and buying the most expensive CPU, motherboard, etc.
The 5080 being 1K is crazy... the 80 series used to be $600 to $700
ye but that was on a shitty cheap samsung node. TSMCs nodes are expensive
Why are you surprised? The economy sucks, everything is going up. Your favorite hobby is not an exception.
After VAT it will be 1250
@@dante19890 GTX 1000 was made on 16nm TSMC
Yeah? And fuel and food used to be cheaper too.
What else are you going to moan about. Graphics cards are non essential luxury items. You dont like the price? Dont buy em.
Is it over for AMD? Will Jensen dominate Lisa Su? Who'll be the jobber of this generation? Does Trump being a literal WWE hall of famer has anything to do with this? Find out on the next exciting episode of Dragon ball Z!
Jenson dominate lisa siu 😈💀
AMD can't win in GPUs. Nvidia has brand loyalty. The 4070 at $600+ outsold the 7600 at $300. How did this happen? Brand loyalty. AMD beating Nvidia is like thinking the iPhone will lose in phones. Ain't gonna happen = brand loyalty.
When we voted for that $1400 5080 MSRP we were expecting it to be 4090 level of performance. If it's not 4090 level, if it just beats the 4080 by 5%, then... yeah, $1000 sounds reasonable. They moved the stack down again.
What was supposed to be a 5070 Ti for $800 with 4080 performance, you now pay $1000 for, and it's called a 5080
And so on and so forth
You can look at Far Cry 6 and A Plague Tale: Requiem; it's like 30% with regular DLSS/some ray tracing compared to the 4080, which would put it comparable to the 4090. Not bad for the same price as the 4080 Super rn.
@@Chris67-p9v
It's just RayTracing performance.
Not Raster performance. And yes. It has better RT performance.
What about games that don't have/use RayTracing? What if the game I'm playing has RayTracing, but I don't want my framerate to dip at all, and therefore I'm not turning it on?
+5% Rasterized
That's how we measure actual performance. Until every game out there will have RayTracing that you can't turn off, this will be the way we judge Performance.
@@gruiadevil even Hardware Unboxed is expecting at least a 20% increase gen-on-gen (just raster). Which, with the memory speed increase from GDDR7 and increased bandwidth, as well as the admittedly lackluster core count increase and architectural improvements, is perfectly reasonable. Some new games do have mandatory RT btw, and Far Cry 6 as well as A Plague Tale: Requiem aren't doing full RT.
@@Chris67-p9v
1. They are expecting 20%. Doesn't mean they will actually get it.
Unfortunately we're going to get that only at the 90 class level. I'm expecting 30% at the 5090 vs 4090, judging by the specs alone.
Anything else is looking ridiculous. They are heavily relying on FakeFrames to market as "performance uplift".
2. Check the Boost clocks. Most of these cards have significantly reduced Boost clocks to fit into the power envelope.
The core counts don't look much improved.
3. GDDR7 speeds will help in games where GDDR7 matters. For instance, PlagueTaleRequiem is famous for favoring GPU's with faster memory. No wonder they showcased it.
4. "Some" new games have mandatory RT. Correct. In those specific handful of games, having better RT is nice. In the other Hundreds of Millions of games available on the market it won't matter.
I repeat.
Until ALL the games available on the market have mandatory RT, testing JUST with RT on is deceiving the customer.
Upscaling is where I draw the line, that's fine by me as DLSS produces a really good quality image at 4K and you can hardly tell the difference at a sitting distance, especially when playing on a TV (couch gaming with your PC, anyone?). But I absolutely refuse to use fake frames as fake frames don't equal performance / lower latency nor do they really look as good as the real deal. And if you're into playing older titles at crazy high resolutions such as 8K without those games having DLSS, raw performance is what you need. Aside from the RTX 5090, the rest of the models are a disappointment in that regard.
Yeah, why don't they put their R&D into making DLSS work in older titles, or all titles, or create a branch of DLSS that does? Because it seems AMD is doing that, and that makes me pretty jealous of them even if their implementations aren't as good.
Bro they are selling you a solution via dlss and dlaa to a problem they create which is taa
@@CockpitCrusher dude
It's literally cheaper for them to just double the VRAM and charge us an extra 50 bucks
Will they do it? Nope. Because that doesn't help them strengthen their monopoly.
@@FLMKane My point is that if their form of DLSS doesn't work on older titles and newer titles that weren't tailored to have it, but AMD's AFMF does, then they won't have a true monopoly. Because there are games still coming out now that don't support DLSS, so that is a huge library of games without any support for the feature.
DLSS = Fake frame software. That shows high frame count in software.
DLSS is used not only for framegen ;-)
the big hitters this round are the 5080 and 5070 Ti, but if you already have a 4070 Super/Ti Super or a 4080 Super, there is no reason to upgrade.
I just bought a 4070 Ti Super and didn't know about the 50 series release date, and now I regret it. Or should I? Because the multi frame feature is basically Lossless Scaling
@@Retrogamer-wmmy brother just stick with your card. Its plenty fast
@@jacobthatguy8001 yeah i will. can't sell it for the price i bought it at or trade it for a 5070 ti, so yeah, but it's a beast anyways
@@Retrogamer-wm hah... shoulda waited. Im glad i did. Paying barely $150 more for current gen, high native fps, hell yea. Will see real test results soon. Ill buy in three months' time
@@fatmanslim4592 yeah I needed it for some work but it's okay my gpu works fine it will work for more 5 years easily
The 5090 is 2500+ USD in my country, so the 5090 poll wasn't totally wrong
People are so laser focused on how much they hate frame gen that they're missing a respectable 25% uplift when it's turned off. Considering it comes with a price drop aside from the 90 that's better than anything we got with 40 series. I'm cautiously optimistic, but if Reflex 2 works half as well as they claim maybe you finally _will_ want to enable frame gen.
Exactly my opinion
Wow, 25% uplift for 20% more power consumption after 2 years 🤣 Moore's law is long dead.
Is no one going to mention how a new 70 series card only has 12 GB of VRAM for the 2nd time in a row now? It's ridiculous.
Edit: And the base RX 9070 is going to have 16GB GDDR6/GDDR7 lmao
it's just business
yeah, that bugs me the most. Use case scenarios like VR gaming need ideally 16GB+ VRAM
Essentially, that is Nvidia generating artificial weakness to maintain demand. It's like a supercar but with a 10-minute fuel tank.
12GB of VRAM for a 70-class card isn't the worst thing. Back in the day we would usually get 1/4 less VRAM than the 80 series and sometimes slower VRAM. I would rather have VRAM that is the same speed as the 80's and less of it. Like, if you want to game at 4K, pay 4K prices unfortunately; but with how close people have to sit to monitors it's unnecessary to go past 2K resolution anyways, which the 70 can easily run.
@@Tjtellsthetruth
Factors that Influence VRAM Usage:
Resolution: The higher the resolution (e.g., 1440p or 4K), the more VRAM the game will use, as more pixels need to be processed and stored in memory.
Texture Quality: Setting texture quality to Ultra will significantly increase VRAM usage, as high-quality textures require more memory.
Ray Tracing and Other Effects: Enabling Ray Tracing, DLSS, and other advanced visual effects will also increase VRAM usage, especially on high settings.
Mods: Any mods that increase graphical fidelity can also increase the amount of VRAM used.
The issue with not enough VRAM is that you could reach a point where you're unable to play a game smoothly at any settings, especially big AAA single-player games with big maps like all the AAA RPGs. So in practical terms, the amount of VRAM gatekeeps you from playing newer AAA titles, especially the open-world ones.
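To see why resolution alone moves VRAM usage, a toy calculation for a single uncompressed RGBA8 render target (real games hold many such buffers, plus textures, which dominate, so treat this as an illustration only):

```python
def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    # One RGBA8 color buffer: width x height pixels, 4 bytes each,
    # converted to mebibytes. Engines keep several such targets
    # (G-buffer, depth, post-processing) on top of texture memory.
    return width * height * bytes_per_pixel / 2**20

print(round(framebuffer_mb(1920, 1080), 1))  # 1080p: 7.9 MB per buffer
print(round(framebuffer_mb(3840, 2160), 1))  # 4K: 31.6 MB, 4x the pixels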
These are just the founders edition prices. OEM prices are gonna be insane.
also, tariffs haven't kicked in.
I think the reason we aren't seeing non-DLSS/raytracing/frame generation performance is that their hardware is getting to the point where the only improvements can come from DLSS, raytracing, and frame generation
Also got to keep an eye out for future Ti and Super cards like a 5080 Super, 5070 Ti Super, and 5070 Super if they follow what they did for the 40 series 🤔
The price-to-bandwidth of the 5070 is about 1.22 GB/s per dollar, the 5070 Ti about 1.20, the 5080 about 0.96, and the 5090 about 0.90, at MSRP. So the 5070 and 5070 Ti are the better value in terms of bandwidth per dollar.
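Those ratios can be reproduced with a quick sketch; the bandwidth figures (GB/s) are the ones the comment's arithmetic assumes, paired with the announced MSRPs:

```python
# Assumed (bandwidth GB/s, MSRP USD) pairs implied by the comment above.
cards = {
    "5070":    (672, 549),
    "5070 Ti": (896, 749),
    "5080":    (960, 999),
    "5090":    (1792, 1999),
}

for name, (bandwidth, price) in cards.items():
    print(f"{name}: {bandwidth / price:.3f} GB/s per dollar")
```

Rounded to three places this gives 1.224, 1.196, 0.961, and 0.896, matching the ordering claimed: value per dollar of bandwidth falls as you go up the stack.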
Wish they'd stop focusing on that garbage, including computing power, and just focus on power efficiency, reliability, cooling, and the highest VRAM in that price range. The 8GB and 12GB mindset needs to stop ffs
we need raw performance comparison
😂😂 are u eating raw produce straight from the wild? get a grip amd fanboy , amd native cant even solo nvidia and intel now
@@three_wood_sen I've only bought Nvidia since Max Payne 2 was still a new game and raw performance is the only thing that matters
@@three_wood_sen oh look someone got triggered over simple comment, i didn't even mention amd lol. eat ur fake ai blurry frames nvidia/intel fan boy 😂
you will get it before cards launch.
@@three_wood_sen Nvidia sheep's can't cope that AMD will always be the smartest price/performance choice 😂
They spend so much money just to see other guys having the same performance for way less money 😂😂
ANY game that is older or emulated will ONLY get base card performance, which is about 15% greater raster on the new cards. Not good.
Thats true, but relative to the raster performance of other cards on the market, it will be worth it if you're upgrading from a 30 series or below. Otherwise stick with what you have; it's honestly not worth upgrading for.
But older games are more optimized anyway lol. Man, just realize that and suddenly all that VRAM fearmongering gamers were doing goes away.
More like 35%
Wish it was closer to 60 tho
i also suspect Nvidia is pricing the GPUs a bit cheaper to gain some social credit amongst the gamer community, but they're going to raise the prices as soon as Trump's tariffs kick in and take the opportunity to blame the new presidency for the increase. which im sure is partially true, but i bet you anything Nvidia will raise prices by more than the tariffs cost them, seizing the opportunity to charge what they originally wanted to charge for the new GPUs.
The Trump tariffs never left.
Biden kept them in place
and scalpers (nvidias middlemen) raise the price already
This always happens when you have a monopoly. Intel did it, GM did it, and now NVIDIA. Honestly, we need another GPU-maker shakeup. I am hoping for an entry from one of the big electronics players.
This is exactly what Threat Interactive was pointing out in their recent video about the video game industry now no longer doing proper optimization. The devs are relying more and more on blurry "topping effects" for games instead of actually using optimization techniques, because it fosters an economy in the gaming industry where people are forced to buy the next new GPU that only really gets to tout "amazing frame rates" while using said "topping effects". The hardware is starting to shift prioritization towards enabling these blurry *ss excuses instead of actually bringing out innovation in raw performance and efficiency. Meanwhile the prices just keep ramping up to oblivion.
Getting GPUs at their MSRP is another delusion for me. As someone who bought an RTX 4060 Ti for $455 even after waiting 7 months for a price drop, I can confirm it
A 6800XT is $350 used and absolutely smokes the 4060ti.
i bought the 4080 super online at msrp on the first try around launch. my friend also did at best buy in store first try...im not saying thats the case for everyone all the time but to call it a "delusion" is...delusional
Yeah, that's true. Then you have the few unicorns that pop in here saying they scored it at MSRP. Admittedly, I'm one of them. I scored the 4090 at MSRP tax free. Buuut, I don't expect lightning to strike twice when it comes to the 5090. My nephew is trying to buy one, so maybe it can happen. I guess we'll see once the trackers are spun up and the wait begins. 😅
My 6750xt performs same as 4060ti and I bought it for 300$ 😂
You weren't wrong with the pricing, that's just the msrp, the leaks you were talking about were retail prices, aka scalper store prices kek
Don't forget there's going to be a vast difference between the announced price and what you'll end up paying to get one when stock is going to be near non-existant.
Honestly, they should stop releasing more powerful cards and just make old cards a bit more power efficient and with higher VRAM. Have that lineup be for normal gamers, while all this garbage is for the elitists
That would cut their margin. They have no interest in that @@MGrey-qb5xz
"I know many of you have 4090's" Yeah Jensen, whatever you say..
It's strange (not really) that they didn't give FPS in Native Resolutions with no features...
Because the RTX cards are secretly weak GPUs, using ray/path tracing and DLSS as a marketing excuse to raise the price.
AMD has had superior cards for years now, though not in 2017
Because the % of ACTUAL performance increase is probably single digits.
@@Eldenbruh You're probably the same guy that was saying the 5080 was going to be $1500-$1700 and the 5090 was going to cost $3000.....
@@toututu2993 amd can't even make a competitor to 4090 and now you are saying AMD is superior lol.
No not factual by any metric, i wish amds graphics cards were better, id buy them, but its just not true @toututu2993
i paid $649 for my 3080 new on launch, so no $999 is still a joke.
you were very effing lucky dude. but yeah, the pricing is just ridiculous. I still feel kinda bad for buying a 4070 super, 12gb vram for 550€, because my 2070 super really ran out of steam. Like, yeah, I am happy with the performance, but the pricing for a card with such poor specs is just laughably bad, and amd is not that much better; their prices also went up with nvidia's
Aren't those prices announced by Nvidia for FE cards? Then you add brand markup (Asus, Gigabyte, etc.), store markup, and VAT, and you end up paying a fortune for a mid-range 12gb gpu.
And don’t forget all the prices go up when Trumps 20% tariffs hit. So the 70->670, 70 ti-> 950, 80-> 1200, 90-> 2400
Are people forgetting the 7900xtx had the same msrp?
Wow got everyone there 😂
Same MSRP as what? It was $1k if I remember correctly
@@Pawcio2115 7900xtx has $1000 MSRP same as 5080 $1000 msrp
ye, that is kinda wild. I think AMD is actually worse cuz they price their much inferior cards close to the nvidia ones when they should be A LOT cheaper.
@@dante19890 facts
One of the best takes ive seen. The 5070 will absolutely be about the same or worse than the 4070 super. That explains the price perfectly.
msrp is for FE cards in the US only; outside of the US, tariffs and other local taxes add around a 200-500 margin increase
underrated comment
I think the 5000 series being on basically the same die size as the 4000 series means they can only do so much without lowering efficiency. The biggest difference is definitely GDDR7, but that's not huge.
Putting the raw specs in front of me really put my eagerness in check.
Every feature they announced today will be on other RTX cards aside from MFG which I can live without, I've never really used it before except in CP2077. In most cases I would only get an extra 20fps, like Alan Wake II only went from about 63 to 84 or something like that. I could also notice the input delay; it wasn't enough of a boost to justify the delay for me. The new FG updates might change this going forward as it's more efficient, but that'll largely be something I can test once the updates are out for 40 series.
I'm pretty excited to see the New DLSS Transform model and improved Ray Reconstruction.
Mostly I'm just hoping Monster Hunter Wilds runs well.
There is no game on the market today that would warrant spending 2000 dollars on a 5090. I'd make the case that there is not even a game that would warrant spending 1500 dollars on a 4090. For 2000 dollars you can build a whole PC yourself, and if you are willing to use used parts and scrounge around, you can get a better deal and still have money to spare to buy and play the games you like.
This video just reinforced that skipping the 40 series was the right move. Skipping the 50 series is probably the right move too.
Just dont build a PC lol.
Skip the 6000 series too! ;-)
Im on a 3070 so cant really skip 50 series, 2 more years at least on an 8gb 5 year old card for someone that wants 1440p high settings 80+ fps in modern games 😢
the discount is to give the illusion of a fair price when, in reality, compared to the last 2 generations it's still overpriced. it's done to get people used to insanely expensive mid-range GPUs that cost what high-end GPUs did years ago
Remember how the 3070 was really on par with the 2080 Ti without all of the AI BS?
Good breakdown and really highlights that we're not getting any major upgrades for the price. It's all in the frame generation.
People are getting tired of the AI and upscaling methods. It makes games look disgusting compared to native resolution. I would rather play anything at native 1080p over any upscaling because of how much worse upscaling makes your games look. From someone who's always used native resolution: AI upscaling looks like trash.
That is objectively wrong in damn near all games that it’s supported in
Bus width! Memory cheapness, overpricing based on software like DLSS 2:03
I’m sticking to used cards and aiming for an upgrade from 3080 to 4070tisuper/4080super
Can't wait for actual performance tests without all the obfuscating nonsense. Couldn't give two shits how it performs with DLSS and frame generation
Im new to pc gaming (i dont even have one), so im confused. What exactly is upscaling, and why is it such a big problem? Isn't better performance and graphics a good thing? what am i missing?
It's because game developers use upscaling as a way to avoid optimizations. Basically, it's a way to increase performance and developers see that and go 'Well, we can reach 60 FPS using this, good enough'.
Kinda reasonable? Those are prices before VAT or sales tax, and from Nvidia, not their AIB partners. Living in a country with a 20% VAT where pretty much all cards will come from AIBs: no, those prices aren't reasonable.
I dont care about AI and i dont want fake frames. I bought a vega 64 for like 100 bucks a year ago and overclocked the snot out of it while undervolting it too. It runs fast and cool and all my games at 1080p run well over 60fps including space marine 2. Dont feel like upgrading kinda ever with these prices ngl.
we australians are still fucked. heard rtx 5080 goes for 2019 AUD. ridiculous. just when i wanna upgrade
Here we go again... Nvidia locking new features away from older cards that are more than capable of doing the same as the newer ones.
Maybe the 7090 will be able to run Cyberpunk 2077 at 4K 60fps native with path tracing... Maybe! Oh, and yeah, for $3000 MSRP, I am sure. That means if you want an Asus ROG for example, with 3rd party and country/region taxes, it will be more like $3500. Impressive.
If the 5070 has less AI TOPS than the 4090, why can't the 4090 get DLSS 4 multi frame gen 🤔
Answer: Nvidia would make less money
3:36 ngl i prefer the new card design. 🤷🏾
It's the FE variant so who cares anyway?
Prices will be about what people predicted from OEM’s, the MSRP hasn’t been that way for over 20 years.
The BOM might be less just because it’s physically a smaller PCB, so manufacturing efficiency may be passed on to the consumer for the lower spec cards.
28 fps with no DLSS and frame gen is sad... for that price, can we stop using fake frames and DLSS as a bandage and get more real power?
Great job, Vex. Good analysis. What times, when you can get excited about a "good" RTX 5070 price, but then after some thought you see it's not actually that good in terms of performance. Seems a 5070 Super for $600 will be a good choice, unless it's only +5% more on cores etc. compared to the 4070 Super.
The VRAM is the catch. Don't trust anything lower than 16 GB of VRAM, you're gonna regret it in 1-2 years.
We're really out here being offered fake frames to buy
I'm upgrading from a 6600 XT for 240 Hz 1440p, but it doesn't really look to be an insane generation for people who upgraded recently to Ada lol
AMD makes great GPUs, no doubt. It kind of sickens me that they'll copy Nvidia, because everything RTX brings makes absolutely no sense for gaming. Nvidia's rasterization is bloody weak. What happens if you remove DLSS? That marketing trick isn't gonna look so good.
😂😂 amd is a scam
@10:25 Do add the 1080, 2080 and 3080 to this chart. The price for the 5080 didn't go down, it started to get slightly more normal again.
I just think it's ridiculous that all the qualifiers for GPU performance are tied to video games. There are performance related tasks, such as editing, that cannot make use of AI to create better, fake performance. Native performance is what matters.
Oh, also, thanks for not segueing into a sponsor like Linus does. The entire GPU community appreciates it.
Basically, the 5xxx series has more AI to deal with the shitty performance you got with 4xxx and ray tracing. So for single-player games that look fancy... you have the 5xxx series; for first-person shooters, I wouldn't bother changing if you are happy with your current performance
What does pricing "STARTING FROM" mean? So let's see the real retail prices. My guess:
5090: $2,999
5080: $1,499
5070 Ti: $999
5070: $749
I think I'm good for now with my 4070ti Super. We'll see what the 6000 family brings in the future... 😊
I hope I am not the only one who thinks the focus on using AI for more frames is meh; damn artefacts with DLSS annoy the heck out of me. Also kinda hilarious seeing a guy in what looks like a jacket expensive enough to cover a year of my salary or more, talking about a $10k home multimedia command center. Like most gamers don't use the same monitor for 10+ years.
I had to use your AI counter to bookmark where I was in your video multiple times. It was very useful.
I still think those leaks are closer to the truth. GPU MSRPs on Nvidia cards haven't been the price you actually pay for the past two gens. So the real price is MSRP + $200-500, because there are still going to be tariffs too. When for the longest time you could get a 7900 XTX for less than a regular 4070 Ti cost at the time, it's hard to believe these Nvidia MSRPs mean anything.
A weak $800 GPU is nowhere near cheap for a PC part focused on graphics and display alone 😂
Right? I don't understand why they're calling this cheap. Because it didn't match an absurd price estimate to begin with?
It's still nutty.
If you are one of those gamers who hate the modern smeared look of games, the 50 series has little to offer. Almost all of the performance comes from DLSS 4: more frame generation, more frame smear, more TAA smear on upscaled image output. If you enjoy playing at native resolution and native FPS... do you even need to upgrade? And I really hate the tolerance we have for Nvidia's frame smear technology that they call "frame generation". It's basically the same thing cheap Android TVs have been doing for like 15 years, and you always turn it off. People say you're supposed to only use frame gen if you already have 60 fps, but that's not what Nvidia is showing. They're showing it used to boost sub-30 performance. And it doesn't even help with latency, because these frames are not processed by the games. The games do not process input on the generated frames, so the more you rely on generated frames, the worse your latency will feel relative to the fps counter.
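The latency point can be sketched with a toy model. This is my own illustrative arithmetic, not Nvidia's actual pipeline: it assumes interpolation-style frame generation that samples input only on real rendered frames and must hold back one real frame before it can emit in-betweens.

```python
# Toy latency model for interpolation-style frame generation.
# Assumptions (not official figures): input is sampled only on real
# frames, and the interpolator waits for the *next* real frame,
# adding roughly one real-frame interval of delay.

def effective_latency_ms(real_fps: float, gen_factor: int) -> dict:
    real_interval = 1000.0 / real_fps       # ms between rendered frames
    displayed_fps = real_fps * gen_factor   # what the fps counter shows
    input_latency = real_interval * 2       # sample interval + held-back frame
    return {"displayed_fps": displayed_fps,
            "input_latency_ms": round(input_latency, 1)}

print(effective_latency_ms(30, 4))   # "120 fps" on screen, ~66.7 ms of input lag
print(effective_latency_ms(120, 1))  # real 120 fps, ~16.7 ms
```

Under these assumptions, a generated "120 fps" from a 30 fps base looks smooth but still feels like sub-30 input, which is the commenter's complaint.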
Happy with 90% of the new features and keeping my 4070s. Fake frames upgrade just not for me.
Have GPUs actually gotten better? Like, how much real performance have we gained since the 1080 Ti? DLSS and frame gen are fake performance; they're inherently lossy. There isn't really a good way to bench performance because all the game engines now are "optimized" for this fake, make-believe performance.
Nvidia allocated their resources to improving the AI hardware a bit and much more into AI software. Now why do you think they locked DLSS 4 or DLSS 3 to specific card variants? They're selling bullshit now
Nvidia did leave themselves room for a 5090 Ti/Super, as the 5090 isn't using the full Blackwell die. There is still room for more CUDA cores, AI cores and ray tracing cores, up to the maximum Jensen said was possible for Blackwell
Nah, off a glitch in the website I got my 4090 new for $1,000; sold my old card and paid $400 total. None of this frame generation BS is going to sell me on the 50 series or the 60 series. I need to see pure, native frame rates that justify the upgrade. If they keep using frame gen as a selling tool in comparisons against previous gens, that's just dishonest.
Absolutely agreed.
The days of raw power are over. Ever heard the term "work smarter, not harder"?
@@ironhulk1979 That saying is always in favor of the one working smarter and doing less work, which in this case is Nvidia. This is bad for the consumer, not a win. So unless you work for nvidia or receive a portion of their profits, them working smarter and not harder is not good for you either.
I think you're the first channel to mention something I've been thinking about since the announcement... the graphic showing the GPU exploded, with the PCB in the middle, fails to show how the outputs and PCIe are connected!
The board is either extended towards the output side with a big hole in the middle of it, or it needs to run a PCB or cables along the edges!
I did not spend $10,000 on my PC wtf does he mean? lol
He isn't talking to us - he is talking to those 0.1% ;-)
Over my whole lifespan that number may fit
Nvidia is selling DLSS4, not better raw performance. Huang told us bullshit.
You get a like just for 7:36. Love these vids, and I know you are a gamer, but dive into AI with all your cards and I bet this channel will blow up. Go down the rabbit hole of ComfyUI
Dude, $999 is the US price without taxes. The prices you saw from AUS include tax; $1,350 will be the price for the card, and I believe it will be even more than that. So you were not wrong. What's even worse is that these cards aren't upgrades at all; they will just have the DLSS 4 performance jump, not a real performance jump. They are cheaper because they aren't a real upgrade, just a hardware fix to handle new software.
What's interesting about the 5080 is that with only a 5% increase in RT cores, they managed around a 30% increase in RT performance (Far Cry) without DLSS/FG. Personally I'm gunning for the 5080 to replace my 4070 Ti.
Really not worth it..
Just curious do you upgrade every 2 years? Not judging, I have a 3080 from launch and am tempted at a 5080 myself
I think multi frame gen is gonna be really nice for people with 240 Hz displays, so you can fill them out more in single-player games.
You’re a fool if you think these GPUs are fairly priced.
Based on the 5080 to 4080, it's 'sort of' fairly priced. He never said THEY ALL were. He even mentioned 4080 at $1000 was still not fairly priced before, that's what he means. Based on last gen, it kind of is.
@@Daltonisntabot That's bad logic, because last gen's prices were based on what scalpers were charging during the pandemic. Thus, the pricing for last gen's GPUs was based on pure greed... and the same goes for the 50 series. There's no way to justify how prices doubled from the 30 to the 40 series, and once again prices are going up. Don't be fooled by the prices shown for the Founders Editions. Nvidia is simply manipulating the media with how they announced the 50 series. The 5080 should probably be the 5070 and so forth down the line. For example, it's ridiculous that the 4090 has 24 GB of memory, but the 5080 only has 16 GB. The point is some naive people are saying the 5080 isn't so bad because it's at least cheaper than the 4080. However, I'm sure they'll launch a 5080 Ti with 24 GB of memory (that should've been the 5080) and charge more for it than the 4080.
@@DamienGWilson I didn't say otherwise. I think the 40 and 50 series are overpriced. They undersell VRam and sell purely software, keeping it locked behind their cards. It's absurd.
I love how they put the led right where you should route your cable
First
They can make all the pricing/MSRP claims they want but also play with the supply. And don't forget that tariffs are coming, so these cards will be way more than MSRP, in my opinion.
Says the goofy who makes fake PC news content for a living because you have no real talent in life
I'm pretty sure Goofy is a Disney character.
@@Daltonisntabot Yep, but I am more financially successful, and he would get cooked alive in a roast battle or a live chat being called out on his BS... so I will call him what I want
DLSS still can't escape the fact that FG adds latency, and it still needs the base framerate to be high enough. So how is the 5070 going to beat the 4090 if it can't reach the base framerate? It's all BS
They said the 4070 Super was twice as fast as a 3090. Turns out that was with RT and frame gen in isolated scenarios.
These cards are using 2GB modules, G7 also comes in 3GB modules as seen on the 5090 Laptops. Super refresh inbound?
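The module math behind that speculation is simple enough to sketch. The calculation below is a back-of-the-envelope illustration under the usual assumption that each GDDR7 device sits on a 32-bit slice of the memory bus; the bus widths used in the examples are the commonly reported specs, not anything official.

```python
# Back-of-the-envelope VRAM capacity from bus width and module density.
# Assumption: one GDDR7 module per 32-bit memory channel.

def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    modules = bus_width_bits // 32   # number of 32-bit channels
    return modules * module_gb

print(vram_gb(256, 2))  # 256-bit bus, 2 GB modules -> 16 (5080-class today)
print(vram_gb(256, 3))  # same bus, 3 GB modules  -> 24 (a possible Super bump)
print(vram_gb(192, 2))  # 192-bit bus, 2 GB modules -> 12
```

So swapping 2 GB modules for 3 GB ones lifts capacity 50% with no bus-width change, which is why 3 GB GDDR7 fuels the Super-refresh speculation.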