Did you learn nothing from last year's CPU releases? Or that AMD gave up on the high end? Moore's law is dead. There is no brute-force approach anymore, at least not without going to an insane power consumption nobody would tolerate anymore. (Most already complain about 575W.) These cards already give you what you ask for: the highest rasterization performance ever, with additional upscaling or FG tech.
I don't understand why people are crying about "fake frames". The visual difference isn't even that significant. The smoother visuals from higher fps are significantly better than any compromise made to picture quality by FSR or DLSS. Why are people crying about the technicalities that these are "fake frames" and "the raw power hasn't seen an improvement"? Why does it even matter, if you have to slow down and zoom in to see a noticeable difference between DLSS on and off? FSR and DLSS are only going to get better over the coming years, to the point you won't even be able to tell which is which.
It's analogous to old guys whining about new, modern cars: they're not real, they don't put out smoke and gas fumes... but they actually drive the same or better. I don't care how those fps are generated; if they look good then it's fine. And if the 5070 can give the same frames with the modern stuff, then it can be equal to a 4090, because the 4090 can't use this modern "stuff". Period.
I don't understand this take. For pure gaming, raw power doesn't matter. Nvidia is betting that AI is the future, and the price of the cards is more than just their raw power; it's the R&D of their AI. AMD, despite their best efforts, still can't compete with DLSS 3, and we don't know if FSR4 can compete with DLSS 3, let alone DLSS 4. I have a 4080 Super and I love FG. DLSS is only going to get better. Would you rather have a future where cards draw 800W, or an AI-powered card that draws half the power for similar performance? Me, I don't want to game in a room that is boiling hot because of my GPU.
@aHungiePanda He is in the "fake frames" crowd who thinks DLSS is blurry, unplayable and has major input lag. Not true; it's actually amazing and I love it.
@@aHungiePanda We all have different expectations, to be honest. I personally don't use framegen; the only reason I spent 700 bucks on my GPU was so I wouldn't have to use framegen. And I will add that you can improve power consumption and performance at the same time, at least to some degree (it is definitely not simple, but it can be done and has been done in the past). That's part of hardware innovation.
3:16 The chip is larger and must be less dense in heat producing cores. If the heat is spread out it is much easier to cool. The vapour-chamber and cooling fins looked brilliant.
Until you realize you can't secure a 5090. Never sell your card before you secure the new one first. A very hard lesson I had to learn when I sold my 2080 to buy a 3090 years ago.
@PrestigeRich you don't even know how much he sold it for. Cards like that almost always sell at a loss, unless you're trying to scalp them at release.
@@thamisfit23 I once stupidly sold my 1080 Ti two weeks before the 3080s were released. I ended up going two years without a serious graphics card LOL, because there was a shortage during the pandemic and prices were insane.
I don't, because I recently bought a 4080 Super for 1000, being absolutely sure the 5080 would be 1500. Oh well, I can at least hope availability will suck. 😛
The pricing of the RTX 5080 at $999.99 USD is quite surprising, considering the rumoured price range was $1,299~$1,599 USD. The question is: will the cards actually retail for the announced MSRP of $999.99 USD without the imposed 25% tariff from the incoming Trump administration? Third-party reviews of these new Blackwell-based GPUs are going to be indispensable.
It seems like Nvidia kept the pricing more in line with core count than it did in the 40 series. The 5090 seems expensive, but it also has over 20,000 cores, which is a metric ton.
Starting to enjoy the PhilzTwoCents videos lol. I'm excited about the price point on these; the 5070 Ti isn't a pipe dream anymore for me. Thanks to Jay and the whole team for keeping us up to date.
We all know that the 5080 has 5060 specs!!! Stop ignoring that. They are selling 5060s at €1000 MSRP (€1300 shelf price) under the 5080 name!!! Do not accept this!!!
People expecting the same supply issues this time may be in for a surprise... Nvidia opted to use the same 4nm process as with the 4000 series (slightly updated), which suggests supply constraints won't be as severe. They could have gone to 3nm and competed with Apple for that process, but they stuck with 4nm because it's a mature process, readily available, & they understand that if they don't meet consumer demand, THEY ARE LEAVING MONEY ON THE TABLE FOR SCALPERS! They'd prefer everyone who wants one to buy a 5090 at MSRP, because they don't get one dime of scalper profits. So chances are supply will not be a problem after a week or two, and the scalpers who don't understand this will be left holding the bag.
So in other words, the 5090 in Australia will cost more than a rental deposit and bond, with another month of rent paid in advance. I wish I was joking. The exchange rate of A$1 being US$0.60, plus taxes and retailer markup, makes it over $5K AUD. Thanks, Australian Gov.
Local MSRP of A$4,039 ain't much better, but we won't hit the $5K dystopia until Asus brings out the 5090 Strix. The 90, 80 and 70 are all packing a ~15% markup after GST vs US prices; the 5070 Ti is only set to cost ~5% more.
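For anyone wanting to check those numbers, here is a rough sketch of the conversion math, assuming the US$0.60 exchange rate quoted above and Australia's 10% GST (the retailer markup figure is backed out, not sourced):

```python
# Rough sketch of the AU pricing math from the comments above.
# Assumptions: US MSRP of $1,999, an exchange rate of US$0.60 per A$1,
# and Australia's 10% GST. Any further retailer markup is inferred.

US_MSRP = 1999.00        # RTX 5090 US MSRP, pre-tax, as announced
AUD_PER_USD = 1 / 0.60   # US$0.60 per A$1, per the comment above
GST = 0.10               # Australian Goods and Services Tax

base_aud = US_MSRP * AUD_PER_USD    # ~A$3,332 before tax
with_gst = base_aud * (1 + GST)     # ~A$3,665 after GST

print(f"Converted:     A${base_aud:,.0f}")
print(f"With 10% GST:  A${with_gst:,.0f}")
# The quoted A$4,039 local MSRP implies roughly a further ~10% on top;
# add an AIB (e.g. Strix-class) premium and A$5K stops looking like a joke.
print(f"Implied extra: {4039 / with_gst - 1:.1%}")
```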
RIP 4090 - bad take, fellas. Only the 5090 will surpass it without the nonsense triple frame gen; missed that one, didn't y'all ;-)... The 5080 might, "MIGHT", have faster cores, but it has far fewer of them, and 16 GB vs the 4090's 24 GB; the 4090's memory is faster overall and, importantly, bigger. So... not so much. Like Steve told y'all: slow down, get it right first, then post :)
lol yeah, I think this video will go down as poorly as the original 4060 Ti review, where this was the only channel saying that was a good card. lol, I remember how Jay had to apologize for that xD.
The 5080 Ti will be the only card other than the 5090 that's potentially better than the 4090, and at around the same price ($1500-1600 likely). Gonna just pick up a used 4090 if I do switch.
You can generate future frames by grabbing the current user input. Knowing, for example, that the character is moving forward would give the frame gen enough information to extrapolate three frames into the future where the character and map keep moving forward.
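That's extrapolation rather than interpolation. A toy sketch of the idea is below; to be clear, this is NOT how DLSS 4 actually works (Nvidia's frame gen uses a trained model plus engine motion vectors), just an illustration of guessing future frames from the current input direction:

```python
import numpy as np

# Toy sketch of input-aware frame extrapolation. Idea: if the player is
# holding a direction, near-future frames are mostly the current frame
# shifted along that direction, so they can be guessed cheaply while the
# next real frame renders.

def extrapolate_frames(last_frame: np.ndarray, dx: int, dy: int, count: int = 3):
    """Guess `count` future frames by shifting the last real frame
    by (dx, dy) pixels per frame along the current input direction."""
    guesses = []
    frame = last_frame
    for _ in range(count):
        # np.roll stands in for a real reprojection/warp step.
        frame = np.roll(frame, shift=(dy, dx), axis=(0, 1))
        guesses.append(frame)
    return guesses

# 8x8 grayscale "frame" with one bright pixel; player strafing right.
real_frame = np.zeros((8, 8))
real_frame[4, 2] = 1.0
for i, g in enumerate(extrapolate_frames(real_frame, dx=1, dy=0), 1):
    print(f"guessed frame {i}: bright pixel at column {int(np.argmax(g) % 8)}")
```

The hard part, of course, is everything this toy ignores: disocclusion, rotation, and anything on screen that doesn't follow the camera.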
@bmoney2175w oh, I can afford one, but I don't trust the new 16-pin high-power connector. I prefer my tech to be proven reliable first, which that connector has not been.
Alright my tribe, I need some help. I got some sad medical news - going blind in my right eye, early-onset glaucoma - so I'm building a PC for the first time in 9 years to cheer myself up. X870 motherboard and Ryzen 7 9700X in hand, and a 7900 XT for $650 here by late January. Do I hold and run the 7900 XT (Hell Hound edition), or kill that purchase and do the 5070? Help please.
@@dontmatter4423 you honestly think supply will not be low? Come on dude, live in the real world. They will sell out in seconds, and unless you live near a Micro Center you will have to pay a few hundred dollars more than MSRP.
@@legiox3719 supply always declines at launch for popular products, but they always restock soon enough. I've never seen 40-series or RX-series cards out of stock in my region.
Can't wait for AAA devs to spend even less on optimisation just to rely on upscaling and frame gen, bringing us further into the muddy world of upscale blur hell
People need to lean into it. It's not going away. If they did in fact lower latency by 75%, then frame gen may finally be accepted by people. Also, what people fail to realize is: you know they're fake frames, but the PC doesn't. That's the entire point; it's tricking the PC. That's how algorithms like this work. Regardless, people can be in denial all they want. Four years from now it won't be something that's hated; it will be loved. What do people expect? How can raster keep going, guys? It can't keep getting big gains. We are at a spot now where even massive hardware improvements can't really push raster in huge directions, so this was bound to happen. Also, from what Nvidia is stating, we will even get image enhancements this time, so they may have found a way of doing it without sacrificing details.
This is kinda my stance on it. I think the promises of the tech are great, but given what it's done as of late, what it has given lazy developers license to do (and, most consequentially, how it screws over consumers/gamers who don't have access to such tech), there's of course a lot of pushback. If Nvidia can actually do what they set out to do with these AI tools and make them near indistinguishable from raw game rendering, both in look and feel, and at the price point they're showing, it might be a great time to be in the PC space over the next few years.
On the cooling: going by the direction of the fins, it seems to be a cooler made for the rack, so most likely AIBs will use either a thicker cooler or liquid cooling, due to having to remove 600 watts of heat. Yes, watts can be a measurement of heat output, not just of power draw. Why do I say 600 watts of heat when Nvidia's number is in the 500s? AIBs are known to overclock the graphics cards they sell, thus making more waste heat, as nothing is 100% power efficient. For some models I am thinking the AIB will ask to have two 16-pin connectors so they can OC the card with their company's usual headroom policy. As always, I am unsure if Nvidia will allow it, and having two PCIe 5 power connectors is not a thing on most PSUs, but adapters do exist.
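The "watts are heat" point is easy to put numbers on: essentially every watt a GPU draws ends up as heat in the case and room. A quick sketch of the airflow needed to carry it away, assuming standard air properties and an allowed 10 K temperature rise (both assumptions, not measurements):

```python
# Sketch: airflow needed to remove GPU heat, using Q = P / (rho * cp * dT).
# Essentially all electrical power a GPU draws is dissipated as heat.

P = 600.0      # watts of heat (the AIB-overclock estimate from the comment)
rho = 1.2      # air density, kg/m^3 (sea level, assumed)
cp = 1005.0    # specific heat of air, J/(kg*K)
dT = 10.0      # allowed air temperature rise through the cooler, K

flow_m3s = P / (rho * cp * dT)    # required volumetric airflow, m^3/s
flow_cfm = flow_m3s * 2118.88     # m^3/s -> cubic feet per minute

print(f"Airflow needed: {flow_m3s:.3f} m^3/s (~{flow_cfm:.0f} CFM)")
# ~0.050 m^3/s, roughly 105 CFM at a 10 K rise; halve the allowed rise
# and you need double the airflow, which is why 600 W coolers get huge.
```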
Can someone tell me how to get a GeForce directly from Nvidia? Do I need to preregister or anything, or do you just need to be quick and lucky on release day?
@@TheGamingNorwegian uh huh, when in reality neither you nor anyone else has a clue what its performance is, except that with the new DLSS frame gen it's "4090 performance". Maybe it has 4080 raster, we don't know. Either way you probably paid a hell of a lot more for less, is all. It's ok, it'll be alright, but FOMO sucks.
I know, right? It's so damn cheap it's unbelievable; it should have been 10k at a minimum, given how much value it provides and how long it lasts. Over 4 years you'll make the 2k back and more, so shut up.
You can only "predict" future frames if you are holding back the entire system (2:00). This is what I suspect Activision has been doing on their servers with Warzone and multiplayer Black Ops and Modern Warfare: players are seeing imagery quite a few ms behind real time, allowing the server to hide the lag and smooth out the playthrough.
The 4090 performance from a 5070 claim will go down as one of their most dishonest marketing claims to date. 😂
Yep & make sure you pick up your 🧂🧂🧂🧂 4X salt. The truth will hurt, but really, when you're doing marketing, saying it's only 10%, 20% or 30% faster than the 4070 is not good marketing. & no, I am not counting the 4070 Super.
I cannot wait for all the reviewers to bust that one wide tf open. It was mind-numbing when Jensen said that.
4090 pure raster will be around 5070 Ti/5080, no doubt. The main issue is how long it takes DLSS 4 to start hitting games; there are still many, many titles that don't even have DLSS at all.
What do you mean? The 4070 Super crushed the 3090 already.
@@subjectivereviews you said it yourself, "Super", which came out a year later. The OG 4070 performed about as well as an OCed 3080.
You know it's a lie when the CEO proudly shows his flagship 4090 product at $1599 MSRP, then goes on to say you can buy the same performance (5070) for almost exactly 1/3rd the price. He's either totally distorting reality or a marketing buffoon. Or both.
Well, it's the outdated flagship product, that's the point. We've seen the same with the 3070 vs the 2080 Ti; why are you all even surprised?
@@drchtct EXACTLY the whole marketing ploy is ditch the 4000 series and buy the latest and greatest thing bc it’s “better”!
Jay looks different somehow
Frame compression on
He's just tired.
He looks tall now
DLSS 4 turned on with full ray tracing.
That's JayzOneCent
4090 performance with half the ram... I don't believe it for a second.
Jensen went on and on about how many gazillion gigs of memory his high-end commercial AI machines have, and he wants us to believe 12GB is enough for a 5070 that uses AI, which needs memory, to be "as fast" as a 4090. He thinks we're idiots.
@@rangersmith4652 If you're blowing £2000 on a card, that already makes you an idiot.
@@craigmchugh9978 Unless you use it to make money. If you do, run the numbers and see if the faster speed makes sense. For gaming? Nope.
@@rangersmith4652 The amount of memory that Nvidia's enterprise and commercial GPUs have is insanely high! Consumers seem to get the leftovers. 12GB for anything over $250 is criminal. I know people will say "but the features"; it doesn't matter. Look at what AMD is offering consumers: a 16GB card for $315 in the 7600 XT, which I just found with a quick search. And that is their last gen! You can get a 24GB card for approx. $1000. I'm glad they lowered the prices, but 12GB for $600!? I'm also glad that Intel has entered the market as a competitor.
@@craigmchugh9978 Not everyone is a broke boy
Will the 5070 outrun the 4090 in pure raster when DLSS 4 isn't available? I guarantee not.
The 5070 got 4070Ti performance in pure raster.
No chance lol, the lack of memory alone hinders it, without DLSS 4 the card is shit.
Actual performance to dollar will be interesting to see.
Well, Jensen DID say this performance is not possible without AI.
I fear the 5070 may not even match the 4080, really.
I once saw a truck running on 4 temporary spare donut tires. Those can spin at the same speed as the usual 33's it had on it before so it's basically the same thing as the real ones, right?
If it keeps the truck moving somehow without too many noticeable issues, then yeah, kinda.
@Crazzzycarrot-qo5wr No, not even a little bit. Fake things are not real things.
@@Snerdles Oh, the irony is absolutely brilliant. Shame you don't see it.
Either way, 3D games aren't real either, since it's just a flat screen that tricks your brain into thinking it's actual 3D. If it looks good, it looks good. That's really all I care about. It's a shame UE5 studios (in particular) are abusing their tools, though.
@Crazzzycarrot-qo5wr blatant delusions don't increase frame rates.
@@Snerdles damn, you're right. I'm sure people agree with you and will never use DLSS or FG again, or buy an Nvidia card...
I'm getting tired of these companies shoehorning AI into EVERYTHING!
Sadly, it's automation...get "more" done with less. It's where everything is going.
The AI isn't bad, it's actually really cool. The problem is lazy devs not optimising games properly and relying on these technologies to make their games run as they should without AI. How it should be: the 5070 matches the 4080 (Super/Ti, whatever) in raw performance, then using AI you can experience something roughly similar to a 4090's raw performance while using all the pretty RT etc. I honestly think we should be pressuring devs (or the people who force them to rush unoptimised crap out) to squeeze every bit of performance out of our cards' raw power, and then we use Nvidia's clever tech to enhance the experience.
Just saying "AI bad" is dumb and flat-out wrong.
AI ruined TF2 casual for five whole years.
AI is a big part of why GTA Definitive Edition became known as the Defective Edition.
The last thing I want is AI in the GPU aimbotting against me.
@@heroicsquirrel3195 AI is bad and does not actually exist. "AI" can only do what it has been programmed to do and has ZERO actual intelligence or independence. Which means it is not actually AI.
For the most part, AI is being used unethically, for such things as pushing Western propaganda narratives, controlling "wrongthink", and suppressing free speech and access to information. It also allows companies to sell inferior products at bloated prices.
My 1080 Ti has 11GB of VRAM, and it's from 2017...
But it's GDDR5X, which doesn't compare to GDDR6, which doesn't compare to GDDR6X, which doesn't compare to GDDR7 non-X. Which is all BS, because I bet the 1080 Ti could run DLSS 4 just fine. :D
@@VarriskKhanaar Problem is, RT and DLSS increase VRAM usage by a lot, and once you go over the edge, DLSS won't save it. Once you run out of VRAM, it's game over.
@@VarriskKhanaar I'm not sure, don't you need special hardware in the GPU to run DLSS? But yeah, I meant more that AMD is giving more VRAM while Nvidia refuses to add VRAM, probably to limit you in the future.
@@VarriskKhanaar Except RAM speed is not the issue when texture sizes and frame buffering outpace the size of your RAM. In other words: it doesn't matter if your sports car can run at Mach 2 when the task demands getting from A to B with some additional passengers and luggage in the trunk instead of just you.
And that 2017 1080 Ti with 11GB? I'm running that too, and it still gets me from point A to point B just fine. But increasingly often it already runs into the RAM-size brick wall due to 4K textures, *despite* a distinct lack of DLSSing and RTing. You ain't gonna have much fun with a modern card that does all the extra bits and bobs that much faster when you can't carry the load. (Rough numbers sketched below.)
That's GDDR5X. Also, I run 8GB at 1440p and it's basically perfect for 2025.
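The capacity-vs-speed point in that thread is easy to sanity-check. A rough sketch of why 4K textures blow past an 11GB card no matter how fast the memory is; the sizes here are uncompressed, and real engines use block compression (e.g. BC7, roughly 4:1), but the scaling argument is the same:

```python
# Rough sketch: why texture *size* hits a VRAM capacity wall regardless
# of memory speed. Uncompressed RGBA8 sizes; the 500-texture scene and
# the 4:1 compression ratio are illustrative assumptions.

def texture_bytes(side: int, bytes_per_pixel: int = 4) -> int:
    """Size of a square texture including its full mipmap chain
    (the mip chain adds roughly 1/3 on top of the base level)."""
    base = side * side * bytes_per_pixel
    return base * 4 // 3

MiB = 1024 ** 2
for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_bytes(side) / MiB:6.1f} MiB with mips")

# A scene streaming ~500 unique 4K textures:
total_gib = 500 * texture_bytes(4096) / 1024 ** 3
print(f"500 x 4K textures: ~{total_gib:.0f} GiB uncompressed")
# Even at 4:1 block compression that is ~10 GiB of textures alone;
# right at an 11 GB 1080 Ti's ceiling before framebuffers and geometry.
```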
How about a non ai version of all these for half price
You get what they give you and be happy about it, or you get nothing.
Those are called AMD, and even they do some AI.
B580 :P
@@Fishy-i2g Another Nvidia bot
@@Baconism I think he was being sarcastic
"4090 performance for $549" Yeah, they're totally not comparing raw performance to DLSS 4 AI framegen extreme upscaling hoo hah, no way...
Raster vs dlss comparison right? Lol
Well, I mean, shit, that 4090 performance number was based on DLSS too... it's just stacking DLSS on DLSS at this point.
Yeah, even I caught myself. I'm like WAIT WHAT, CRAZ... OHHH, FG... FG... then the pricing, and I'm like damn, even Nvidia marketing almost caught me. AMD is still AMD, never failing to drop the ball in how they announce stuff/drum up excitement.
Actually 4x frame generation. I'm not mad though. I'm down with framegen, just not the latency.
Thanks Phil and Jay
3:52 "the game tells us to do 1 thing and we just make the other 15 things up." Might look pretty and smooth sometimes but there's going to be so much input lag and artifacting =/
The bottom line here is this: you are NOT buying a GPU for PC gaming any longer, you are buying an AI chip pretending to be a PC gaming GPU. Let's see how these will all be garbage in two months trying to push 575 watts over those garbage 12VHPWR connectors... Not buying...
Truth be told, the 12VHPWR connector is probably what bothers me the most right now. The marketing bullshit and AI-pushing are annoying, but that connector is still not market-ready, despite several aftermarket mods.
Not buying? Boohoo, cry me a river.
Jensen in a snake print jacket, appropriate for the Year of Snakesss...
Its vram jacket
😂😂💯💯
He's actually trolling us now.
I believe the reptile you're looking for is an alligator
@@TrinityRDS1 You sir are correct, I just like my snakesss, apropos
5:22 The 27 fps without DLSS is what is really impressive here. Imagine if games were optimized to hit 60 FPS at native 2K on an average GPU. How awesome and accessible games would be. Instead we have powerful GPUs where all the power is wasted compensating for badly optimized games, with raw power and upscaling technologies just keeping up with the bare minimum acceptable... that's crazy.
Game devs got lazy, GPU devs caught the "AI flu", and the wheel now turns infinitely as we move deeper into the AI swamp.
So very true. Everything's just bloating and to keep up with it, other stuff bloats as well and uses divination magic to compensate, at the cost of... costs.
Yeah, but if games were optimised, and graphics cards weren't as expensive and didn't have so little VRAM on budget models, people wouldn't have to upgrade so often, would they? Just think of the shareholders.
I wish they'd just talk about real performance at real resolutions not imagined performance setting a lower resolution and letting their AI 'imagine' the details of a higher resolution. I'm fed up with their smoke and mirrors.
I love how half the presentation was just "these AI will replace your tech jobs AND physical AI will take your blue collar jobs" and no one was clapping 🎃
Like you should be excited about losing your livelihood...
There's a vast array of jobs where AI is no threat to job security, and likely won't be for many, many years.
Until the machines rise up and overthrow humanity, I see AI as being of more benefit than detriment.
The lamest part was when he said Software Developers are going to become HR - not really why many of us went into dev work! Thankfully I think AI is somewhat further behind than the snake oil salesman is selling!
In a perfect world, we wouldn't have to work jobs that machines can easily do (because why would I want to waste years of my life doing something that a machine could have done? It just wastes my life away) but we don't live in a perfect world. Once AI replaces us in jobs, we won't get anything in return to be able to pursue our actual hobbies and passions. But it's sad, humans wasting away on jobs that we don't need to be doing when we should be free to pursue actual meaningful stuff like working on creative endeavors.
I woke my neighbor laughing when he said our IT professionals will be the new AI HR....and no one clapped. You could tell it threw him off.
Guess we'll have to wait for third-party reviews to see how these really work, i.e. without DLSS.
Pretty soon that won't even be a thing. No more raw performance numbers, only AI performance.
No need to wait, Nvidia made the mistake of using the same Cyberpunk benchmark to promote the 5090 as they did for the 4090. The native performance increase is 30%.
@@CaptToilet If that day comes, it'll be the day I stop buying GPUs entirely and bury my gaming hobby.
You guys cry for no reason. There's no problem with AI; it's not good enough right now for latency, but AI power is a part of the GPU.
@@Knaeckebrotsaege and why
You'd need DLSS 5 to render the massive grain of salt I took these claims with
NVidia doesn't make GPUs anymore. They make AI hardware. They're just leveraging that in the face of zero competition from AMD
You can tell they don't want to make GPUs for gamers; they won't even try to add the expected 16GB of VRAM to their 5070 card, forcing people to get the 5070 Ti at a stupid price, as usual.
My concern with DLSS is accuracy. With the significant number of generated pixels, will the cards maintain the intended art style of each game? And temporal smeared frames in high speed games such as Forza Motorsport should not be an acceptable way to render higher frame rates...
So far it is indeed retaining a lot of the detail; in fact, I've seen it improve on the base image in the Cyberpunk box-of-noodles comparison. Look at the cup lid, for example.
When Jensen says the 5070 = 4090, he means that the 5070 generates more frames with AI than 4090 can per actual rendered frame, which is what is making up for the difference in raw power. Just feel the need to state that clearly for people due to the hype around Jensen's statement.
🧂🧂🧂🧂 4X the Salt
I understand where you're coming from, but this reminds me of when people dismissed CPU branch prediction by saying it was "just guessing." Advanced AI-based frame generation uses sophisticated models and extensive training; it's not just blind guessing. If you're not a fan of new tech, that's totally your choice. But for many of us, these innovations are an exciting evolution that can really enhance performance, even if the raw hardware specs aren't as high.
Price 5070 = Price 4090🤣
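To put rough numbers on the "5070 = 4090" claim in that comment: displayed fps is rendered fps times the frame-gen multiplier, but the game only samples input on rendered frames. A sketch with assumed, illustrative figures (these are not measured benchmarks):

```python
# Sketch of the "5070 = 4090" claim with ASSUMED, illustrative numbers.
# Displayed fps = rendered fps * frame-gen multiplier, but the game only
# samples input on *rendered* frames, so feel tracks the rendered rate.

def displayed_fps(rendered_fps: float, fg_multiplier: int) -> float:
    return rendered_fps * fg_multiplier

# Hypothetical figures, not benchmarks:
cards = [
    ("4090 (DLSS 3, 2x FG)", 60.0, 2),   # 1 generated per rendered frame
    ("5070 (DLSS 4, 4x FG)", 30.0, 4),   # 3 generated per rendered frame
]

for name, fps, mult in cards:
    print(f"{name}: {fps:.0f} rendered -> "
          f"{displayed_fps(fps, mult):.0f} displayed, "
          f"input sampled every {1000 / fps:.1f} ms")
# Both show "120 fps", but the 5070 responds to input half as often.
```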
It's all AI frames. Lmao Blackwell GPUs are primarily optimized for AI, so cutdown Blackwells (50 series cards) are going to be all AI-generated frames
And we will see how that translates to gaming. If it's like ray tracing it probably won't for 2 to 3 years.
These performance gains are only going to matter with games that have DLSS/RT already - it seems to be a boost for those technologies rather than pure “grunt.” Far Cry 6 comparison there on the chart doesn’t show a great improvement over the last-gen counterparts, considering it’s only relying on RT with no form of Frame-Gen/DLSS. I’ll wait for proper benchmarks to draw a conclusion.
The 5070 cannot match the performance of the 4090 without all the AI enhancements. So, is access to this AI free, or are we being set up for an ongoing subscription to 'unlock' GPU performance/functions?
Definitely a required internet connection, at minimum.
Frame gen is free in all games that have it I'm pretty sure, no payments needed.
There is no way we’re gonna have to pay to unlock the performance we paid for already. I don’t think they’re dumb enough to get themselves into that situation.
@@alikhaled2389 20 years ago the idea of renting software was dumb, and yet here we are subscribing to lots of our software. It isn't dumb; it is clever business to get subscribers as a constant income stream.
@@alikhaled2389 and then you remember all the bullshit Nvidia did with GPU naming in the past few years. Yeah, they can't be that big of a dum dum.
I'm guessing low VRAM right off the rip.
The DLSS 4 presentation showed it radically compressing texture sizes, so that's likely how they plan to get away with less VRAM.
5070 coming in with the bare minimum 12GB
@@deuswulf6193 So instead of just more VRAM we get AI smudge and input lag. Just great; games don't look bad enough without frame gen already.
I really hate this direction we are going: relying on AI and frame gen stuff rather than raw rasterization power. I even feel like some developers don't optimize their games very well anymore and just force you to use frame gen to get good FPS numbers.
Bethesda is a perfect example of not bothering to optimise their games.
@@foreveremoatheart I can play Indiana Jones at 4K, DLSS Balanced, everything ultra bar texture cache, at 60fps on my 2080 Ti... Bethesda are literal KINGS of optimisation.
It's like a TV's integrated frame gen (motion smoothing).
Dude s.t.f.u, don’t ever speak that idea. You’re giving Nvidia ideas man. Delete your comment and never mention this again.
New PhilzTwoCents video!
Aaaaahhh beat me to it 😂
Should we celebrate 240 fake frames, or 100 real frames from a 1080 Ti?
So they're basically designing the new hardware and software to make lots and LOTS of frames to make benchmarks look good, in hopes that people won't notice that the latency and response times haven't improved? That's what it initially looks like to me. Maybe the faster memory and bandwidth will offset that, but I look forward to seeing what it looks like when hardware reviewers actually get their hands on the product and put it through some more 'real world' tests to see how it actually performs.
For $1,999, I expect an AC unit in that card to cool it.
Well, you'll see; its cooler is way, way better ;-) It won't need it.
Supposedly it's only a 2-slot card...
If it's double the performance of the 4090 like they say (it won't be), I'll gladly pay that. The 4090 gets 80-100fps at native 4K RT in most games I play; double that would be nuts.
@AndrewB23 with DLSS
@@AndrewB23 With or without DLSS? The games where you'd want the extra frames from DLSS don't play well with artifacting risk, and in games with low artifacting risk you just don't need the extra FPS.
Here come the scalpers!
I saw the 5070 at $549 and immediately flashed back to the 3070 being $500, which I was only able to buy nearly a year later for $650! Thanks, crypto & scalpers!
COVID was 5 years ago, man.
There’s nothing to scalp anymore. Demand isn’t nearly as high and supply is there
@@ryangray600 Little supply means scalping immediately becomes more worth it. If there's a huge amount of supply, scalping won't be worth it, because you'd first have to buy out a lot of the supply before you could sell at scalper prices.
And no, supply won't be through the roof in the first 6 months.
@@ryangray600*tariffs have joined the chat*
This is the same marketing bs they gave when DLSS frame gen came out the first time.
I wish they would show parity settings to show actual generational improvement.
Not what six games can do with a feature only it can do.
oh, you are seeing a parody for sure ...
what you want to see is parity;
It's ok, Jensen got it wrong too; you saw the prices and capabilities lol.
@ fixed it. Not sure if that was spell check or lack of sleep that caused me to do that.
Pretty funny.
@@Abrasive-Heat Oh, you were fine; reality is the one getting out of hand here lol.
I can't even really watch it much; "AI is going to take your jobs, so you can't afford our mid-tier cards that cost more than our Titan cards used to!"
I'm bruised from pinching myself to wake from this nightmare planet... I laugh to heal!
His jacket grows with every billion dollars earned.
The fact that, no matter the inflation, his two cents have held their value very well.
😂😅👍💯💯💯💯💯💯
My guy is good. I love the start of the video, but as a whole a good video and insightful; appreciate it.
I want real frames, not artificial frames. How much less would these cost without DLSS?
And you get more "real" frames than ever before with these cards, plus the ability to use DLSS any way you like. So what is your point?
@@Roland1405 The fact that you don't; the 4090 has way more raw performance than the 5070.
It's called AMD.
@rangagump5591 Amen!
@@Roland1405 it costs time, money and resources to develop and implement additional features (such as DLSS) on a mass-produced consumer product, but it's the consumer who pays for it. I'm not interested in DLSS as it does not generate genuine frames or improve responsiveness. It tricks you into feeling like the game runs smoother but at the cost of graphical fidelity. If I go to a burger restaurant, I want something made from real animal meat. Not the synthetic lab grown proteins.
Basically TLDW:
If you're a competitive gamer and you've got a 4090, stay where you're at. If you've got 2k lying around and a PCIe 5 motherboard, get the 5090 for its raw power.
Raw power = reaction speed.
If you're a casual gamer and you play games like Stellar Blade (insert any other coomer game with physics) and you don't have a 4090, get the 5070, because the fake frames are gonna smooth the booty jiggle out.
Fake frames = smooth motion, but base-FPS response time and feel.
In hindsight, this is why a lot of COD MnK players were complaining about "floatiness" in MWIII while 4090 users were like "Le MAO, what are you talking about?". The people complaining probably had frame generation on, getting "240FPS", but their actual FPS response was 120fps (see the latency sketch below).
Sidenote: I bought a prebuilt Corsair about 2 years ago, with a 4090, 14th-gen i9, and 64GB RAM, for a little under 3k before taxes and delivery. After the 50-series announcement, that PC, or one close to it, was listed at 4999 before taxes. You see the same pattern on Starforge PCs' website, with a lot of the 40-series builds already "out of stock".
If there's an actual TLDR here, that should tell you all you need to know.
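The floatiness arithmetic is worth spelling out. A simplified latency model, sketched with assumed numbers: interpolating frame gen has to hold back the newest rendered frame until the next one arrives, so it adds roughly one rendered-frame interval on top of the base input delay.

```python
# Sketch of why "240 fps" with frame gen can feel like 120 fps or worse.
# Simplified model with assumed numbers: interpolation must hold back the
# newest rendered frame, ADDING ~one rendered-frame interval of delay.

def feel_ms(rendered_fps: float, frame_gen: bool) -> float:
    base = 1000.0 / rendered_fps                 # time between real frames
    return base + (base if frame_gen else 0.0)   # + ~1 frame of hold-back

print(f"native 240 fps : ~{feel_ms(240, False):.1f} ms between real updates")
print(f"native 120 fps : ~{feel_ms(120, False):.1f} ms")
print(f"120 -> '240' FG: ~{feel_ms(120, True):.1f} ms")  # smoother, not faster
# The screen pumps out 240 frames, but your clicks land on a ~120 fps
# (plus hold-back) pipeline; that gap is the "floatiness" people report.
# Nvidia's Reflex claws some of this back, but it can't make generated
# frames respond to input.
```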
Lets wait and see what the real performance numbers look like.
Only twice the price of my car.
$550 for 12GB on a small 192-bit bus. That's ridiculous.
The B580 is around $300
Only needs half of that with their new compression algo
Sadly I paid £800 for that with my 4070 Ti... 😢
@@karlreading3201 Same as me; my prior card was a GTX 980, and I thought finally a good card to buy and... what a fckup. Maybe DLSS 4 can save us from a total buyer's-remorse disaster.
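For reference, the bandwidth math on that 192-bit bus, sketched below. The 28 Gbps GDDR7 per-pin rate is an assumption for illustration; Nvidia hadn't published final memory clocks at this point.

```python
# Sketch: peak memory bandwidth of a 192-bit bus.
# Formula: bandwidth = (bus width in bits / 8) * per-pin data rate in Gbps.
# The 28 Gbps GDDR7 figure is an assumption for illustration.

bus_bits = 192
gbps_per_pin = 28    # assumed GDDR7 data rate

bandwidth_gbs = bus_bits / 8 * gbps_per_pin
print(f"{bus_bits}-bit @ {gbps_per_pin} Gbps -> {bandwidth_gbs:.0f} GB/s")

# Compare: 4070 (192-bit GDDR6X @ 21 Gbps) ~504 GB/s,
# 4090 (384-bit GDDR6X @ 21 Gbps) ~1008 GB/s.
# Faster memory narrows the bus-width gap, but the capacity complaint
# stands: 12 GB stays 12 GB no matter how fast it is.
```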
Is it just me, or has anyone else also noticed that this tiny RTX 5090 PCB Huang showed off cannot properly connect to a PCIe slot because of where it sits in the graphics card (in the middle: too far from the front of the motherboard to reach a PCIe slot), and that it also lacks any kind of PCIe fingers/connector, and most likely a bunch of additional electronic parts as well?
(I don't see even a tiny PCIe x1 connector, let alone an x16 one, and the tiny PCB only seems to contain a GPU chip, memory, the power stages and not much else.)
The PCB that was shown (physically and in the parts animation) looks incomplete, or it somehow connects to a second PCB, which wasn't shown, in order to reach the motherboard. Either way Huang didn't present the entire truth, and @JayzTwoCents' Phil fell for it hook, line and sinker (time stamp 00:00:24-...).
I wonder what else NVIDIA has hidden and/or skewed during this presentation...
So, frame-gen-boosted stats for something that will look sub-par in game? Looks to me like Nvidia is just trying to claw some AI money back.
Nvidia is selling shovels, they aren't digging for gold.
Great job Phil!! Going beyond the keynote and right to the point.
Can't wait until Q4 2025 when they actually come
in stock and aren't immediately scalped.
No doubt the 5080 will not be $999; scalpers will grab them up in seconds and mark them up to about $1300-1500, and people will pay.
If the 5070 is actually in stock at $550, that might be my next GPU. I thought the AMD 9070 XT would be a good deal this gen, but it's going to have to come in at $450 retail now to compete, even if FSR4 is way better than current FSR.
£800 for a 5080? I couldn't be happier with that, considering I'm still rocking the watercooled 2080 Sea Hawk EK X. So close to finally being able to upgrade my PC.
Well, I'mma sit this series out, so it's great to see they're tinkering with some new tech in the shaders etc. It'll be properly ready with the 60 series :D
Watch that $1999.99 turn into $3500+ for the first 2 years of release. I waited on the 4090 over a year to sell at or below MSRP and I ended up getting one for $1500 with a $450 EK water block already installed. Doubt I will ever upgrade again.
It's just so misleading... At least during the 4000-gen keynote they showed both with frame gen on and off; this year they only showed the 4X multi frame gen on and claimed the 5070 had 4090 performance. The only real performance benchmark is the Far Cry 6 one, where the 5070 is a 4070 Ti... a 15-20% generational uplift across the board, probably the worst generational uplift ever from Nvidia.
We're used to the new-gen xx70 being as fast as the last-gen flagship/xx80 Ti/xx90: the 1070 had 980 Ti performance, the 2070 Super had 1080 Ti performance, the 3070 had 2080 Ti performance, the 4070 Super had 3090-3090 Ti performance. That's no longer the case: instead of a 50-60% improvement, we're seeing 15-20% across the board. Extremely bad, and AMD now has a golden opportunity to release something with real 4090 performance at a 5070-5080 price.
It sucks, but I don't see AMD touching the 4090 for at least another generation. Nvidia is just on another level right now, but I'm rooting for AMD or even Intel eventually. We need some high-end competition.
That's impossible. They would have to drop prices on the 7900 XTX by at least $200 to make $1,000-1,200 5080-equivalent cards. From what I heard, AMD isn't even going to try to compete against Nvidia's top-end cards, because they failed to create one. Hopefully they can read the room, not lean so heavily on frame gen stuff, and give us better RT and raw rasterization.
Can't wait to see your review of the 50-series cards, but I'm wondering if DLSS frame generation is good for competitive gaming as well.
For FPS games, only real frames matter.
Frame gen is horrible for competitive gaming... sorry to break the news.
I don't think 4090 owners have anything to worry about for five years, especially given that some of the new DLSS juice will reach it too.
Even the PS6 won't be more powerful than the 4090.
I believe I'll be upgrading from the 4090 in about three years, when the 60 series and a smaller process node arrive. By that time nearly all games will probably have DLSS.
@@hrodgarthevegan Games are getting more and more boring anyway.
@@dieglhix I'm a pretty old gamer, 45+. I understand what you mean, but I have been able to find enjoyable newer games; the problem is that there are so many games these days, one has to wade through so much. That's also part of why I watch a lot of playthroughs, I guess, lol.
@@hrodgarthevegan I am "only 35" and have tried so many newer games. What I find is that they are too slow, have too many RPG elements, are less "bold", and lastly, the physics animations actually look worse than PS3-era games. For example, I just finished Max Payne 3, which was released in 2012, and it looks really great.
I then tried Red Dead Redemption 2, made by the same writer and studio, but it was super slow. I am finding a lot of joy with Just Cause 4 now, though. I still believe the "golden era" ended in 2016, before they added RPG stuff to everything. I also really enjoy my arcade cabinet running the Batocera front end; I have discovered thousands of games through the screensaver videos.
Yeaaah. You're not going to pay $549 if scalpers have anything to say about it 😂
Can they just give us the raw horsepower? DLSS, FSR, ABCD... why are cards stuck relying on software for frame rate? Give us brute force, with frame generation as a bonus if we need it for new games in a year or two. The 5090 is 32GB while the 5080 is 16? They couldn't give it 24? And AMD... not even attempting to give us a raw 4K card? Nope, take the 9070 and use FSR. Get outta here. Intel, it's your time to shine with the next GPU launch.
They want DLSS etc. to be the default. They want to talk about the highest possible numbers, no matter how they were produced.
It's to please shareholders, most of whom sadly only know FPS as the big number that must grow bigger.
A lot of gamers may interpret 15 out of 16 pixels being fake as an insult (with performance upscaling rendering 1/4 of the pixels and 4x frame gen rendering only 1 of every 4 frames, just 1 in 16 displayed pixels is natively rendered; the arithmetic is sketched below). To shareholders, that means 240fps instead of 60, which in their minds means more consumers will buy these products so they can have 240 FakePS. The groaning and the cacophony of silence in the crowd during the keynote don't matter to most of them.
BTW, raw 4K gaming has been a thing for 10 years already, and the RX 6800 XT does a fine job at 4K high. Just not with antialiasing on top of that (why would you use it at that pixel density anyway?).
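For anyone wondering where "15 out of 16" comes from, here's a minimal sketch of the arithmetic. The 4x upscale and 4x frame-gen ratios are the commonly cited ones for DLSS performance mode with multi frame gen; treat them as illustrative assumptions:

```python
# Back-of-the-envelope for the "15 of 16 pixels are fake" figure.
# Ratios below are assumptions based on commonly cited DLSS settings.

upscale_ratio = 4      # performance mode renders ~1/4 of the output pixels
frame_gen_ratio = 4    # 4x MFG: 1 rendered frame, 3 AI-generated frames

rendered_fraction = (1 / upscale_ratio) * (1 / frame_gen_ratio)
print(rendered_fraction)        # 0.0625  -> 1 pixel in 16 natively rendered
print(1 - rendered_fraction)    # 0.9375  -> "15 of 16 pixels are fake"
```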
@@michelvanbriemen3459 I'm not hurting for an upgrade, you know? I just finished my 7900X/7900 XTX build six months ago; I'm set for a while to come. It's just frustrating that AMD tucked tail and said you either get 1440p for cheap or sell your family for 4K with Nvidia.
I also understand that in the grand scheme of business, consumer GPU purchases are just an ever-so-small fraction of their revenue, so they care less. For the past 15 years we felt cared about. This year it's like they both can't stand our peasant smell.
@@EpicValleysStill I mean, yeah, maybe they will release more cards later, but idk. Didn't they say something like they won't be competing in the high-end GPU market anymore? Does anybody know why?
Did you learn nothing from last year's CPU releases? Or that AMD gave up on the high end?
Moore's law is dead.
There is no brute-force approach anymore, at least not without insane power consumption nobody would tolerate. (Most people already complain about 575W.)
These cards already give you what you ask for: the highest rasterization performance ever, with upscaling or FG techniques on top.
I don't understand why people are crying about "fake frames". The visual difference isn't even that significant, and the smoother visuals from higher fps outweigh any compromise FSR or DLSS makes to picture quality. Why cry about the technicality that these are "fake frames" and "the raw power hasn't improved"? Why does it even matter if you have to slow down and zoom in to see a noticeable difference between DLSS on and off? FSR and DLSS are only going to get better over the coming years, to the point you won't even be able to tell which is which.
It's analogous to old guys whining about modern cars: they're not "real" because they don't put out smoke and gas fumes, but they actually drive the same or better. I don't care how those fps are generated; if they look good, then it's fine. And if the 5070 can deliver the same frames with this modern stuff, then it can be equal to a 4090, because the 4090 can't use this modern "stuff". Period.
Another generation to skip, the same bullshit claims as with the 40 series. They're selling hot air again.
Thank you, Phil, for filling in for Jay.
Guess we're gonna have to wait for third-party reviews to see how these really perform, meaning WITHOUT DLSS.
5080 at 4090 performance, RAW DOG, ZERO DLSS ANYTHING, or EPIC FAIL!!
I don't understand this take. For pure gaming, raw power isn't the whole story. Nvidia is betting that AI is the future, and the price of the cards covers more than just raw power; it's the R&D of their AI. AMD, despite their best efforts, still can't compete with DLSS 3, and we don't know if FSR4 can compete with DLSS 3, let alone DLSS 4. I have a 4080 Super and I love FG. DLSS is only going to get better. Would you rather have a future where cards draw 800W, or an AI-powered card that draws half the power for similar performance? Me, I don't want to game in a room that is boiling hot because of my GPU.
@@aHungiePanda "and I love FG." Flaccid Gobbler? WAY too much personal information!!
@aHungiePanda He is in the "fake frames" crowd who thinks DLSS is blurry, unplayable, and has major input lag. Not true; it's actually amazing and I love it.
@@aHungiePanda We all have different expectations, to be honest. I personally don't use frame gen; the only reason I spent 700 bucks on my GPU was to not need frame gen. And I will add that you can, at least to some degree, improve power consumption and performance at the same time (it is definitely not simple, but it can be done and has been done in the past). That's part of hardware innovation.
6:55 What would that thing's production value be (unless it's damaged beyond repair)? Somewhere over $100k, right?
I really wish the 5080 had 20 or 24GB of memory.
I have an RTX 4080, and it's definitely not worth upgrading to the 5080 at this point. Only 16% more raw performance? Nahhh, I'll pass.
3:16 The chip is larger, so the heat-producing cores must be less dense. If the heat is spread out, it is much easier to cool. The vapour chamber and cooling fins looked brilliant.
So glad I sold my 4090 at no loss two weeks ago 😅
Until you realize you can't secure a 5090. Never sell your card before you've secured the new one first. A very hard lesson I had to learn when I sold my 2080 to buy a 3090 years ago.
@@thamisfit23 Lol, that's why I never sell GPUs; trying to get hold of a 5090 is going to be an absolute nightmare.
@@thamisfit23 Yeah, but if he did that then he would have sold it at a HUGE loss. Just wait a few months and get the new one.
@PrestigeRich you don't even know how much he sold it for. Buying cards like that means they'll almost always sell at a loss, unless you're trying to scalp them on release.
@@thamisfit23 I once stupidly sold my 1080 Ti two weeks before the 3080s were released. I ended up going two years without a serious graphics card, LOL, because there was a shortage during the pandemic and the prices were insane.
What's the laptop used in the video? Hard to tell with the skin.
I love that the price of the 5080 has not increased from the 40 series.
I don't, because I recently bought a 4080 Super for $1,000, absolutely sure the 5080 would be $1,500.
Oh well, I can at least hope; availability will suck. 😛
@@daray666 They couldn't sell the 4080 at $1,200, which is why the Super was $999. Of course the 5080 wasn't going to be $1,500.
@daray666 It's all good; that's a great card either way and will be relevant for years.
The pricing of the RTX 5080 at $999.99 USD is quite surprising, considering the rumoured range was $1,299-$1,599 USD. The question is whether the cards will actually retail at the announced $999.99 MSRP without the 25% tariff proposed by the incoming Trump administration. Third-party reviews of these new Blackwell-based GPUs are going to be indispensable.
@@Unamatrix01 Not really. They had to drop the 4080's price, lol. It was not selling at $1,200.
I only care about raster performance. Hope we can see some testing on the cards soon!
It seems like Nvidia kept the pricing more in line with core count than it did in the 40 series. The 5090 seems expensive, but it also has over 20,000 cores, which is a metric ton.
Starting to enjoy the PhilzTwoCents, lol. I'm excited about the price point on these; the 5070 Ti isn't a pipe dream for me anymore. Thanks to Jay and the whole team for keeping us up to date.
We all know that the 5080 has 5060-class specs!!!
Stop ignoring that.
They are selling 5060s at a €1,000 MSRP (€1,300 shelf price tag) under the 5080 name!!!
Do not accept this!!!
Yeah, literally everyone is going to accept this, just letting you know.
@profaa927 I am convinced.
That's why Ngreedia has the courage to take your kidneys for their GPUs.
I was just looking around, and SAM (Smart Access Memory) seems to have come and gone. Or is it still a thing when using a newer AMD GPU and CPU together?
Can't wait to see 50 series build
People expecting the same supply issues this time may be in for a surprise...
Nvidia opted to use the same 4nm process as the 4000 series (slightly updated), which suggests supply constraints won't be as severe. They could have gone to 3nm and competed with Apple for 3nm capacity, but they stuck with 4nm because it's a mature, readily available process, and they understand that if they don't meet consumer demand, THEY ARE LEAVING MONEY ON THE TABLE FOR SCALPERS! They would rather everyone who wants one buy a 5090 at MSRP, because Nvidia doesn't get one dime of scalper profits.
So chances are, supply will not be a problem after a week or two, and the scalpers who don't understand this will be left holding the bag.
When Jensen says the 5070 = 4090, he means the 5070 can create more AI frames per real rendered frame than the 4090 can, which compensates for the raw-power disparity. I just felt the need to spell it out plainly given the buzz surrounding Jensen's statement.
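To put rough numbers on that claim, here's a minimal sketch. The raw render rates are hypothetical assumptions; only the frame-gen ratios (4x multi frame gen on the 50 series vs 2x on the 40 series) come from the announcement:

```python
# Toy arithmetic behind "5070 = 4090". The raw render rates below are
# made-up assumptions; only the frame-gen ratios reflect the keynote.

def displayed_fps(rendered_fps: float, gen_ratio: int) -> float:
    """Frames shown per second when each rendered frame is followed by
    (gen_ratio - 1) AI-generated frames."""
    return rendered_fps * gen_ratio

rtx_5070_raw = 30   # hypothetical real frames per second
rtx_4090_raw = 60   # hypothetical real frames per second

print(displayed_fps(rtx_5070_raw, 4))  # 120 "fps" with 4x multi frame gen
print(displayed_fps(rtx_4090_raw, 2))  # 120 "fps" with 2x frame gen
# Same number on the fps counter, very different raw render rates.
```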
And 4090 users definitely use DLSS 3 whenever possible, so the 5070 is hella impressive no matter how mad you are.
@@filippetrovic845 Just like the 4070 and 4070Ti, the 5070 is a very good card until you consider that 12GB VRAM buffer.
@@filippetrovic845 This is a bot.
Are they PCIe 5.0 x16?
Yea the 5070 is the Wish version of the 4090, 75% fake frames 💀
Still going to wait for the real-world performance benchmarks. Though this might be the generation I'll be upgrading into this year.
So in other words, the 5090 in Australia will cost more than a rental deposit and bond, with another month of rent paid in advance. I wish I was joking. The exchange rate of AU$1 = US$0.60, plus taxes and retailer markup, pushes it over A$5K.
Thanks Australian Gov.
Local MSRP of A$4,039 ain't much better, but we won't hit the $5K dystopia until ASUS brings out the 5090 Strix.
The 90, 80, and 70 are all packing a ~15% markup over US prices after GST, but the 5070 Ti is only set to cost ~5% more than US pricing.
@@YayDanMan In Aus, I have yet to see an Nvidia card sell for MSRP before the next gen launches.
So don't buy it. Stop being a Consumer Slave.
What should I upgrade to from my old 3080 Ti?
Thanks!
Factor in scalpers and the 5070 will be well over $1K to buy.
The 5070 looks like junk; at most it should be called a 5060. The rest aren't bad, but now we just need games.
"RIP 4090" is a bad take, fellas; only the 5090 will surpass it without the nonsense triple frame gen. Missed that one, didn't y'all? ;-) The 5080 might, "MIGHT", have faster cores, but it has far fewer of them, and 16GB vs the 4090's 24GB. The 4090's memory is faster overall and, importantly, there's more of it. So... not so much. Like Steve told y'all: slow down, get it right first, then post :)
Lol yeah, I think this video will go down as poorly as the original 4060 Ti review, when this was the only channel saying that was a good card.
Lol, I remember how Jay had to apologize for that xD
The 5080 Ti will be the only card besides the 5090 with the potential to beat the 4090, likely at the same $1,500-1,600 price.
Gonna just pick up a used 4090 if I do switch.
You can generate future frames by reading the current user input. Knowing, for example, that the character is moving forward gives the frame gen enough information to extrapolate three frames ahead, since the character and map will keep moving forward.
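A toy sketch of that idea, extrapolating camera motion from held input. (Worth noting that current DLSS frame gen interpolates between rendered frames using motion vectors and optical flow rather than extrapolating forward, so this is just the hypothetical the comment describes.)

```python
# Hypothetical input-based frame extrapolation, as described above.
# Real frame generation works differently; this toy version simply
# advances the camera by the player's current velocity.

from dataclasses import dataclass

@dataclass
class CameraState:
    x: float
    z: float

def extrapolate_frames(cam: CameraState, forward_speed: float,
                       dt: float, count: int) -> list[CameraState]:
    """Predict `count` future camera positions assuming the player keeps
    holding 'forward'. A wrong guess (key released) shows up as a
    mispredicted frame until the next real frame corrects it."""
    return [CameraState(cam.x, cam.z + forward_speed * dt * (i + 1))
            for i in range(count)]

# Three generated frames between real frames at a 240 Hz display cadence:
print(extrapolate_frames(CameraState(0.0, 10.0), forward_speed=5.0,
                         dt=1 / 240, count=3))
```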
Jay: Phil needs a raise.
Laying down on the job again, is he? 😂
Well done, Phil! Nice to see you on camera.
RIP to all 4090 resellers 😂
4090 is even more expensive.
Right, the 5080 at 1000 gonna kill it.
They can keep their 4090.
@@ThunderGod9182 We planned on it. Don't be salty because you can't afford it 😂
@bmoney2175w Oh, I can afford one, but I don't trust the new 16-pin high-power connector. I prefer my tech to be proven reliable first, which that connector has not been.
Alright my tribe, I need some help.
I got some sad medical news (going blind in my right eye, early-onset glaucoma), so I'm building a PC for the first time in nine years to cheer myself up.
X870 motherboard and Ryzen 9700X in hand, and a 7900 XT for $650 arriving by late January.
Do I hold and run the 7900 XT (Hell Hound edition), or kill that purchase and go for the 5070?
Help please.
It seems like you want the 5070. Just get the 5070, friendo.
Finally I can purchase a 1080 2:04
Great to see ya on the other side of the camera Phil!!!!
Everybody raving about the 5080 price "leaks" must be embarrassed now.
The prices are still stupid high; now add in scalpers and Nvidia limiting supply to spike prices.
@zack9912000 $1000 for a 5080 is perfectly fine
Cards aren't out yet and you're complaining about nonexistent problems... I smell fanboyism.
@@dontmatter4423 You honestly think supply will not be low? Come on, dude, live in the real world. They will sell out in seconds, and unless you live near a Micro Center you will have to pay a few hundred dollars more than MSRP.
@@legiox3719 Supply is always tight at launch for popular products, but they always restock soon enough. I've never seen 40-series or RX cards out of stock in my region.
Can't wait for AAA devs to spend even less on optimisation and just rely on upscaling and frame gen, bringing us further into the muddy world of upscale-blur hell.
What pin connector is it using this time?? The good one or the bad one??
People need to lean into it; it's not going away. If they did in fact lower latency by 75%, then frame gen may finally be accepted. Also, what people fail to realize is this: you know they're fake frames, but the PC doesn't. That's the entire point; it's tricking the PC. That's how algorithms like this work. People can be in denial all they want, but four years from now it won't be hated, it will be loved. What do people expect? How can raster keep going, guys? It can't keep delivering big gains; we're at a point where even massive hardware improvements can't push raster much further. So this was bound to happen.
Also, from what Nvidia is stating, we will even get image enhancements this time, so they may have found a way of doing it without sacrificing detail.
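One nuance worth separating out: generated frames raise the fps counter but not the rate at which your input affects the world. A minimal sketch with assumed numbers:

```python
# Why multi frame gen smooths motion without making the game more
# responsive. All numbers here are illustrative assumptions.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

raw_fps = 60              # assumed real rendered frames per second
displayed = raw_fps * 4   # 4x multi frame gen

print(f"fps counter: ~{displayed}")                           # ~240
print(f"display cadence: ~{frame_time_ms(displayed):.1f} ms") # ~4.2 ms
# Input only changes the scene when a real frame is rendered:
print(f"input cadence: ~{frame_time_ms(raw_fps):.1f} ms")     # ~16.7 ms
```

So any "75% lower latency" would have to come from something other than the generated frames themselves (e.g. Reflex-style pipeline changes).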
This is kinda my stance on it. I think the promise of the tech is great, but given what it has enabled lately (letting lazy developers lean on it and, most consequentially, screwing over consumers and gamers without access to such tech), there's of course a lot of pushback. If Nvidia can actually do what they set out to do with these AI tools, making them near indistinguishable from raw rendering in both look and feel, and at the price points they're showing, it might be a great time to be in the PC space over the next few years.
@@cr0ss414 I do agree devs need to get their shit together.
My rig finally bit the dust (it had a 980 in it). How do I ensure I get a 5080 at launch??
6:50 Captain Blackwell !
On the cooling: going by the direction of the fins, it seems to be a cooler designed for a rack, so AIBs will most likely use either a thicker cooler or liquid cooling, since they have to remove roughly 600 watts of heat. Yes, watts can measure heat output, not just power draw; virtually all of the electrical power a GPU draws ends up as heat. Why do I say 600 watts when Nvidia quotes a number in the 500s? AIBs are known to overclock the cards they sell, which produces more waste heat, as nothing is 100% efficient. I'm thinking some AIBs will ask for two 16-pin connectors so they can OC the card with their usual headroom policy. As always, I'm unsure whether Nvidia will allow it, and two PCIe 5.0 power connectors aren't a thing on most PSUs, though adapters do exist.
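Rough arithmetic behind that ~600W figure; the overclock margin here is an assumption, while the 575W figure is Nvidia's stated 5090 board power:

```python
# Estimating AIB cooler load from rated board power. The 5% OC margin
# is an assumed example of typical AIB headroom, not a spec.

board_power_w = 575   # Nvidia's stated 5090 board power
oc_headroom = 1.05    # assumed AIB overclocking margin

heat_to_remove_w = board_power_w * oc_headroom
print(round(heat_to_remove_w))  # ~604 W, nearly all dissipated as heat
```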
If anyone believes the 5080 will outperform the 4090, I strongly suggest wearing a helmet for the rest of your life.
Phil, it gets better with time! I could tell you were a bit nervous, but it does get better.
The 5080 at $1,000 must really be only a little better than the 4080; Nvidia isn't that generous.
missing a 0, it's $10,000
Can someone tell me how to get a GeForce card directly from Nvidia? Do I need to preregister or anything, or do you just need to be quick and lucky on release day?
Just bought a 4070 Ti Vulcan last July 😅
Big mistake
The 5070 has 4070 Ti performance; no difference at all.
@@TheGamingNorwegian Uh huh, when in reality neither you nor anyone else has any clue what its performance is, except that with the new DLSS frame gen it's "4090 performance". Maybe it has 4080 raster; we don't know. Either way, you probably paid a hell of a lot more for less, is all. It's OK, it'll be alright, but FOMO sucks.
@@djbpresents9584 Yeah... but what to do... haha, never mind, already bought hahah
@@TheGamingNorwegian It's okay... waiting for Jay or Phil to benchmark the new GPUs... haha
$2000 for a graphics card is insane
I know, right?
It is so damn cheap.
It's unbelievable; it should have been $10K at a minimum given how much value it provides and how long it lasts. Over 4 years you'll make your $2K back and more, so shut up.
You can only predict future frames if you are holding back the entire system 2:00. This is what I suspect Activision has been doing on their servers with Warzone and multiplayer Black Ops and Modern Warfare: players are seeing images quite a few ms behind real time, which lets the server cut out the lag and smooth out the gameplay.
I don't need it... I don't need it... I definitely don't need it. I NEED ITTT!!!!!!