The 5080 is such a huge cutdown from the 5090, it's crazy. It's like it's actually supposed to be a 5070. You could easily fit another tier of graphics cards between the 5080 and 5090, with a 320 to 384-bit bus, more VRAM, and more overall performance.
Well, the thing with getting specs and possible performance for the halo Nvidia product is that it doesn't tell us anything about the cards we would actually buy (5060-5080). The gap between the halo card and the second one has always varied so much. It could be 5% or it could be 25%.
I wonder how much the 4070 and 4080 will come down in price after the release of the 50 series. Current prices are still crazy for me. I come from the land of 1070 prices - which is the current card I have.
512-bit 32GB? I wasn't expecting that. That's insane. I ain't complaining, though. This should provide a massive performance increase at resolutions such as 4K. I think the 5090 will have anywhere between 168-176 SMs.
If they really are going to sell a fully enabled 102 die, it's going to be very expensive with very low supply. I have to assume it's going to be on a 5 nm variant because if it's 3nm this thing is going to be an absolute beast.
5080 is a joke but no one's laughing. At least if those specs are true. And it's probably still going to cost like 1300-1400€ while the 5070 starts from 1100€ with 12GB again 😂 (And these prices include tax.) They won't sell, and then come the Ti and Super models with the same price and more VRAM. History often repeats itself because humans won't ever learn from their mistakes.
Nvidia, you need to increase the VRAM amount by 4GB for the 5080 and every card below that. If you're gonna push technologies like RT and frame gen that increase VRAM usage, then you need to start giving enough VRAM to use them properly. Especially with the current card pricing. 5080/20GB, 5070/16GB, 5060/12GB. Creatives are using these cards as well and they also want more VRAM.
When the 3090 came out and it was $1500 (MSRP, that is...) I looked into my crystal ball and predicted that the 6090 would cost at least $3000. Well, with these specs of the 5090 I don't think it will cost any less than $2000, probably more like $2500, meaning that prediction for the 6090 won't be too far off...
I can already imagine the comments... "I'm a creator so I need the 5090!" Jensen will soon be out to get your money, guys, and I guess he won't be disappointed.
To the people that are pissed about the specs or think the price is too high (which it will be) - no one is forcing you to buy these cards. But yeah, once these release, the 4090 series cards will be way more affordable. And tbh it's still a great card.
My prediction was 168 SMs with a 512-bit bus. This would make a similar ratio of cut SMs as the 4090. So this is definitely going to be 60%-70% faster than the 4090. They're not going to make a Titan, so this will cost at least $1999.99, up to $2499.99. Then the 5080 will be 10% faster than the 4090 and launch at $999.99. A 5080 Ti will probably be 20% faster than a 5080, have 22GB of VRAM, and cost $1399.99 to $1599.99.
Geez seems like they're really making the 5090 double the card of the 5080. Almost everything is double. That'd be a heck of a performance jump over the 4090 if the 5080 is about on par with it
I'm calling it: the 5090 is NOT going to be a gaming card but rather a Titan. Especially if it gets 600W, I don't think it's going to be "normal". I also think it's going to cost ~2.5k.
To be frank, the RTX 3090/3090 Ti and RTX 4090 *are* the Titan cards of their respective generations. Titans by generation (descending order) vs gaming flagship:
20xx (Turing): Titan RTX, $2500 - RTX 2080 Ti, $1000
10xx (Pascal): Titan X Pascal + Titan Xp, $1000 & Titan V, $3000 - GTX 1080 Ti, $700
9xx (Maxwell): Titan X, $1000 - GTX 980 Ti, $650
7xx (Kepler): GTX Titan, $1000, GTX Titan BLACK, $1000, GTX Titan Z*, $3000 - GTX 780 Ti, $700
GTX 690* (Kepler), GTX 590* (Fermi), GTX 490* (Fermi)
*dual-GPU with 2x GPU, 2x VRAM
Of all of these, only the Titan V wasn't really a gaming card. I agree that the 5090 will be a $2500 "Titan"-like product. Note that the 90-class of old (GTX 490, GTX 590, and GTX 690) were all *dual* GPU setups, and no generation since has had both a 90-class *and* a Titan product, though some generations had multiple Titan products (e.g. 7xx had *three* including the Titan Z dual GPU, as well as 10xx Pascal with the Titan X, Titan Xp, and Titan V).
I also just got a monster 4070 Ti last week but can't use it to its full power while waiting on my RMA'd PSU. Should I sell the card when the 5070 comes out next year, or should I keep it?
Glad I’m still on 1440p with a 3090. Runs everything fine. Wouldn’t mind switching to 4k 240hz but I’m not paying $1000 for a 16gb card and that’s probably what the 5080 will cost.
Is this like the 7th time different specs have leaked? And it's the same leaker each time; I don't know how credible he is after leaking so much contradicting info. Not saying it's impossible - Nvidia probably was considering all of those different configurations - but it's annoying.
600 watts is a lot, but it sounds like it is going to be worth it, in terms of power usage. How much it will cost and if it will be worth the price is another matter. Thumbs up!
Yeah, the only way to get an RTX 5080 with more than 16GB of VRAM is if it were a seriously cut-down GB202 die, which I still believe is on the table, as there is a long way down from GB202 to GB203, and I doubt anyone will have high yields for dies meeting the specs claimed here for the RTX 5090, given its size and transistor count. It would be more appropriate for Nvidia to have the heavily cut-down GB202 dies be the RTX 5080 so they have a chance to be close to the RTX 4090D in performance. GB203 would be better as the RTX 5070 tier.
@@mikelay5360 Bruh... games are only going to use more and more VRAM as visual quality improves. It's nonsensical to expect games to forever use under 16GB while improving quality generation after generation. If anything, it's the GPU manufacturer that's being foolish by skimping on the VRAM, especially for high-end and expensive cards.
Have a nice trip! Why are you treating this as close to fact? Kopite is wrong as often as he is correct. I think a 512-bit bus and 600W are too high for a regular 5090.
@@Mayeloski Knowing Nvidia the MSRP can't be less than $2000, it could legitimately just be $2500. The market price will of course be higher at least in the beginning.
RTX 5080 with a full GB203 die.... 😮 YES! That's what I've been waiting for. Hopefully the price is right. I'm probably gonna shoot for the FE edition, because the Nvidia 4090 cooler (if they use the same one, that is) is built well enough to handle it, and it's a slick premium design. From a 3070 to a 5080... good upgrade, I'd say.
The 32GB of memory makes me think a Ti must be in the works. 32GB isn't meaningful for AI models - but 48GB is. Once you hit 48GB of VRAM, it opens a lot of doors for local AI models, etc. So I thought 32GB was weird - there's nothing in AI that needs 32GB, but maybe it's for running GTA 6 😂
I plan to build a new PC in the new year, waiting for the new CPUs/GPUs that fit my 1440p configuration (I don't want a 4K one, way too expensive). A 400W 5080... no thanks! The price will be over €1200 anyway, so there's no way I'm buying a €1000+ video card. I'll certainly go full red on CPU and GPU. 1440p fits my needs, given that I plan to move to an OLED screen and 1440p OLED screens are more affordable than 4K ones... and if you get a 4K OLED you need a really powerful video card (a €1400+ one), otherwise it's pointless.
I have only pcie 4 on my motherboard. Is it possible the 5080 will be pcie 4 or not? Also if it is pcie 5 would I lose a lot of performance with pcie 4?
My guess is the 5080 is cut down so much because of the US government's export block on China and other countries. Presumably the government won't allow it out without a lot of rework.
Wow! A 5080 with only 10,500 CUDA cores, that is hilarious 😂. That is barely HALF of the 5090, and people already pointed out the 60% difference between the 4080 and 4090.
400 watts for the 5080 is way too much; I can't believe the kind of heat it will have to dissipate. I am way more interested in what Intel has been cooking.
Well, I'm expecting the 5090 to be at least $2000 - some are saying $2500, which is possible but insane. We gamers don't matter to Nvidia anyway; the 5090 will sell out like hotcakes. As for the 5080, I don't think we'll be getting the same $1000 price as the 4080 Super - most likely it's going back to $1200 if not more. With no competition on the high end, Nvidia can do whatever they want; in the end they will sell it all... mainly to AI right now.
The 1070 is slower than a 980 Ti; I had both. The 980 Ti is as powerful as the 1070 Ti. A 1080 Ti is as powerful as the 2080, and a 2080 Ti is a little more powerful than a 3070, probably more like a 3070 Ti.
I would not spend $1000+ on a card with only 16GB of memory. I would wait to see if they release a 16GB 5070 Ti Super and go with that; if they don't, just go with a 4070 Ti Super and wait 5 years.
I was looking forward to this, but now I think I will wait until AMD brings out their next top-end GPU to see how it performs, the amount of VRAM, and the price.
Need to see the 5090 take the Cyberpunk path tracing 4k test
I bet it can take it in 8K, but we'll see
30 fps without upscaling lol
@f.iph7291 30 fps without any upscaling or frame generation is still impressive when we're talking native 4K with full-blown path tracing in a relatively modern AAA game. I remember when computer scientists used to say that real-time ray tracing would be forever impossible even for supercomputers, let alone a personal computer, but here we are.
@@03chrisv Would not wood.
1440p native with path tracing 🙏🏾
If the 5080 only has 16gb I wouldn’t even consider it for a second unless it costs $600
Try $1300 instead.
@@BrazenC 😂 like they care.
@@NicholasKoeppel If it's faster than the 4090 at that price it will sell out
@@mikelay5360 It will be faster than the 4090, probably by 10-20%. That's the rule and it's always been that way. I can't imagine they would make it slower than last gen's next-tier card; it would result in poor sales and a bad reputation.
@@sh_chef92 👀😂 NVIDIA you had me at bad reputation. They don't care about that. You will buy NVIDIA regardless.
DLSS 4 for new gen cards is going to be really interesting, wonder what new technologies and techniques will be used this time
I heard it's texture generation
It was in the 5060 mobile leak: DLSS 4 is 3x frame generation, probably exclusive to the 5000 series like DLSS 3's 2x FG was to the 4000s.
A 512-bit memory interface…
Brings back fond memories of my old r9 290X.
5080 should have 20gb VRAM.
Yeah honestly even 24 gb and a 384 bit bus
For what RTX cards cost, it should be even more... 32GB. Make it future-proof; VRAM isn't that expensive to produce either. But ofc Nvidia won't do that... They really like to keep VRAM down as much as possible.
@@Smexbi Yeah, that's true, we don't have enough VRAM at all. The RTX 5060 should have 12GB of GDDR7, the 5070 16GB, the 5080 20-24GB, and the 5090 32GB. I think that's fair, but we all know that's not happening. The RTX 5060 Ti could have 16GB and the 5070 20GB if it were my choice, but that definitely won't happen unless they're 128-bit and 160-bit.
But then you won't buy the 6080 card 2 years later...
A 5080 with 16GB of VRAM and a 5090 with 24GB would not only be plenty but more than enough.
RedGaming “it’s going to be interesting” Tech
The 5080 sounds like it should be labeled the 5070.
Looking at another 12gb 70 series it seems 😂
Which is lame as hell, because the 4070 Ti Super had 16GB of VRAM on a 256-bit bus.
@@redinthesky1 We complain in comments then proceed to buy😂.
Looks very much like it! But if you'd asked me whether Nvidia would still do that for the 2024 generation, I'd have put the likelihood at 70%...
You know Nvidia would do 6GB of GDDR6 on a 5050 if they thought they could get away with it.
70 and 80 are both supposed to be 256-bit cards. If Nvidia releases another 60 class piece of silicon as a 70 card - we need to riot.
5090 MSRP will probably start at $2499 because people will pay that much.
+ 15% tax
I feel that's a bit high out of the gate for MSRP. The Federal Reserve also just cut rates by 0.5%. I'm kinda expecting an MSRP around $1899 or $1999. They are apparently bringing back the Titan as well; I'd certainly expect that to be close to $2499 if not more, depending on the specs.
@@Demonoid1990 MSRP of $1999 would be reasonable as the top end card at the time, but it's Nvidia so you never know.
@@Demonoid1990 A 5090 on TSMC 3nm with 32GB of GDDR7? Absolutely, $2500. And it will dissipate 600W!
They will pay anything for it..
It looks like the 4080 12GB situation all over again, but this time they went with 5080/5080 Ti instead of the same name.
12? I thought it was 16
@@OneTomato They were trying to call the 4060 a 4080 12GB and instead they unlaunched it and released it as 4070 Ti. They also released another 4060 as the 4070 and then the 4050 as the 4060, 4060 Ti and 4060 Ti 16GB. Yeah - it's been a really confusing generation.
@@cracklingice My guy, are you tripping? Since when is a xx60 as powerful as the last flagship?
@@IMajst3RI Since when is a xx70 a crappy 192-bit bus? It's not, it's 256-bit just like the xx80 it's cut down from.
@@cracklingice You decide a class of a card by thy memory bus XDDDDDDDD
Don't even bother replying, this is too stupid to engage with. I just hope nobody even less informed than you buys your bullshit.
RTX 5080 with 16GB vram...nope, hard pass
It's pretty much useless if you're upgrading from a 4090. The 5090 is the only one that makes sense here, and Nvidia is trying hard to upsell it with twice the memory and twice the FP32 performance. Probably also twice the price.
24GB of VRAM would go hand in hand with 32GB of system RAM, which would be better balanced - like 12GB of VRAM is a pretty good match for 16GB of RAM.
Unless they pull 12GB 5070 BS again, this is as it should be. The 70 and 80 class are both 256-bit class cards. With the amount of space between the 5080 and 5090 rumors - there is plenty of space for a 24GB 384-bit 5080 Ti if Nvidia is able/willing to make the cut.
@@pirx9798 They'll probably sell the RTX 5090 at $1100 and the RTX 5080 at $880, and the whole world will explode over the 5000 series.
Agreed, it must be 24GB. I'll skip and wait for the 5080 Ti in 2025.
All I heard was that this is gonna make the 4080 & 4090 look like bargain buys. 🤦♂️
Hmmm, I was jazzed about the 5080, but now I'm wondering if I should wait for the 5080 Ti or RDNA 5.
If you can wait and not in a rush.
Just wait.
I'm waiting till 2028-2030 to upgrade my 7800xt
@@snakeinabox7220 😂😂😂😂 This statement had me on the floor.
If RDNA 4 comes in early 2025, when do you expect RDNA 5 to release? Just wait for the 4090 to hit $1000 😂.
@@mikelay5360 600-700 used.
Possibly less after 50 series launches
@@mikelay5360 It's supposed to be the end of 2025.
Why does this seem like the 5090 is the $3000 AI professional model?
Do not buy the xx90, just buy the xx80 - but the 5080 is not worth it with 16GB, it must be 24GB. Just wait for the 5080 Ti; otherwise it's wasted money.
@@vtg1800 Yeah, or a Super version
Papa Jensen gonna be dropping premium jackets
Those 5080 specs surely look like another case of "4080" 12GB edition. ¬¬
Simply buy AMD then. People need to stop whining about a card that hasn't even launched yet. If that 5080 performance matches the 4090 at a lower wattage and price then it's a no brainer.
@@mikelay5360 Damn, not even paid by Nvidia and you're a bootlicker. Must be very proud of yourself.
Can't wait to pick up a 4090 for cheap when everyone starts dumping their GPUs to buy this.
Really... 16GB on a 5080... what a rip-off if that's true.
Get ready. The 5070 will probably be a 192-bit 12GB card. Ridiculous!!!
I'd be happy with the card if the price were okay and it really were more powerful than the 4090.
@@Bnd002p And you're settling for that?! The 3070 beat the 2080 Ti for 500!
@@FenixROFL Yes, it will be. The specs have already leaked. It'll have a 192-bit bus again, and 12GB again.
Well, the memory configuration on the 70-class cards has been a joke since the 2000 series. It's all just there for upselling, or you have to wait for the Super and Ti stuff...
Weird if they up the 5090 to 32GB but keep the 5080 at 16 🙃
blame china
They'd better price that 5080 very cheap for how cut-down it is. But I bet they won't; they'll just try to trick ignorant gamers with the naming. Personally, I will bite the bullet and get the 5090, as I have delayed buying anything since before the 3090. As long as Nvidia gives me the performance and puts it out sooner rather than later, I'll pay the Nvidia tax. Most people should probably wait for the 5080 Ti, the likely price-to-performance king 9 months later.
Going to cope hard and hope DLSS 4 is texture upscaling, and that's the reason for 16GB on the 5080. Was really planning to put one in my AM4 PC with a 5700X3D 😭
The RTX 5090 is looking solid if those rumored specs hold up. The RTX 5080 however is not. Just like the RTX 4080 that launched, the RTX 5080 is a 70 series card if those rumored specs hold up. It’ll be slower than an RTX 4090 in most games. Especially when the resolution/settings are pushed for demanding titles.
Hopefully, the leaked specs for the 5080 are incorrect. If they are correct, it will be dead on arrival. I don't think Nvidia will skimp this much on the 5080. The 5080 is still an important GPU that Nvidia could make lots of profit from.
@@Gamer-q7v Nah man, they're true. The last good 80-class GPU was the 3080 12GB. The last good 80 Ti was the 1080 Ti. Which is why I have a 4090 and a 4070 Super - they're the best performance and value GPUs of the 40 series. I will be upgrading my 4090 to the 5090.
@@ZackSNetwork The 4080 is good, excluding its price. Even though the 4080 is on AD103, it outperforms the 3090 Ti by a significant margin. My two complaints about the 4080 were the stupidly high launch price and it using a slightly cut-down AD103 chip. If the 4080 had launched for $800 on a full AD103 chip, it would have been a killer GPU for high-end PC gamers.
@@Gamer-q7v But it didn't, and the 5080 won't. Nvidia learnt this generation that if they make the 80 series crap, everyone buys the 4090.
@@ablet85 Making the 80-class GPU crap is a big mistake. Nvidia would waste their own money by doing that. There is no way consumers will be willing to pay for an overpriced, nerfed GPU. It would just sit on shelves and gather dust. If Nvidia were instead to give the 80-class GPU considerable hardware upgrades and price it accordingly, it would likely sell very well. That means more profit for them. This wouldn't cannibalise sales of the 5090, because that would be a much higher-spec GPU with a higher price aimed at a different audience.
Rather than giving us bigger chips/memory Nvidia seems to be pushing wattage up. Running GPUs hard and hot. Hopefully we can tune the power allocation and cut power draw with less impact on performance.
Yeah, 1000 megawatts so you can spend all your money on the utility bill!
600W is really worrisome; there could be a risk of degradation, like Intel CPUs pushed to such high wattage. Also, it's 33% more wattage for around 40-50% performance gains. That means the architectural improvements aren't anything too great. Seems like Nvidia is trying to brute force it, increasing wattage and cooler size to hit performance targets generation after generation.
@@skychaos87 I mean, I feel like Nvidia would play it safe after seeing the Intel shit, considering how much they have to lose in the data center market, but I agree that it sounds like they're starting to hit problems internally.
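The efficiency point in this sub-thread is easy to sanity-check with a couple of lines (the 33% power and 40-50% performance figures are the rumored ones from the comments above, not confirmed specs):

```python
def perf_per_watt_gain(perf_gain: float, power_gain: float) -> float:
    """Relative perf/W change given relative performance and power changes."""
    return (1 + perf_gain) / (1 + power_gain) - 1

# Rumored: ~33% more power (450W -> 600W) for 40-50% more performance.
low = perf_per_watt_gain(0.40, 0.33)
high = perf_per_watt_gain(0.50, 0.33)
print(f"{low:.0%} to {high:.0%}")  # 5% to 13%
```

So even at the optimistic end, the rumored numbers imply only a low-double-digit efficiency improvement, which is what makes the gain look like brute force rather than architecture.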
If the 5070 isn’t a 16GB card I’m going to be pissed
The 5080 will have 16GB... so... well, bad news.
Prepare for a looong shower.
Don't be shocked it's turning out to be like this:
5090 32GB
5080 16GB
5070 12GB
5060 8GB
5050 6GB
If the 6000 gen got a 90 series with 48GB, an 80 series with 32GB, a 70 series with 24GB, and everything below with 16GB,
that would be a peak period for consumers, which also means peak sales... NVIDIA just has to listen... Or AMD...
im expecting £2k for the 5090, because they fucking can
Why not, YOLO, milk the cows!
Probably pretty close
@@TheRimBrakeGuyDaddy Jensen can milk me 😂
If they can, why wouldn't they?
You wish, gonna be 2.5K.
It would be seriously interesting if the 5080 and 5090 were actually the same die but fused together. I'm having a heck of a time trying to figure out how to get both a 5080 Ti and a 5070 on the same silicon, though. I suppose they could make a second 256-bit design for laptops and the 5070 that is not fused together, and then a fused-together part with a 75% cut to the memory system plus a CUDA core cut for the 5080 Ti.
600 watts is insane, dude. If you add a 9950X3D, the rest of the PC, and a high-end monitor, you're looking at a setup approaching 1000W of constant power draw under max load.
I would need a dedicated AC in my room if I were to get that lmao
The 5080 is a joke; I'll wait for the 5080 Ti with 20-24GB of VRAM and at least 14-16000 shaders at $1000-1200... if that never materializes, then I'll just buy a 4070 Ti Super equivalent for $499 and wait another 4 years...
There will not be a 5080 Ti, so keep waiting - same reason there was no 4080 Ti. It's either the 5090 or nothing.
@@ZackSNetwork I think that depends on how well AMD can compete. Nvidia has always had a xx80 Ti ready just in case AMD is able to compete.
😂 seriously wait for 4 years. Thank me later.
@@mikelay5360 Can't, I've already waited 4 years... I'm actually surprised that my 1070 Ti survived 7 years of use; my 6600/8800 (and warranty 9800) all died or were dying after 2-6 years.
The 80 class is historically a 256-bit product, and that makes it 16GB until the larger-capacity modules are released, so it makes perfect sense. IMO it depends more on the price than the memory capacity, as 16GB is fine. If the 90 class is indeed a 512-bit product, that leaves space for a 384-bit 24GB 80 Ti - if Nvidia is able or willing to make the cut.
Still saying Kopitite, I hear.
5080 with 16gb vram is a greedy slap in the face and makes no sense at all🤡
80 class is a 256-bit product as is 70 class. 16GB makes perfect sense because it is either that or 32GB if it's available yet and that would make the 5080 likely cost more than the 4090. 16GB is fine.
I'd also point out that there is a massive space between the rumored 5090 and 5080 - enough for a cut down 24GB 384-bit product. The real problem will be if they try to release a 192-bit 60 class configuration as a 70 class product again. Then we should riot.
@@cracklingice It will also be too expensive, so 20-24GB would be fair, and if a Super or Ti follows it will only be more expensive. Anyone who sees any sense in something like this only reinforces this greedy nonsense!
@@-Biber As I have already said, the 80 class is historically a 256-bit card. That leaves only 16GB or 32GB as options, and we both know Nvidia would not go 32GB. Whenever they have gone above 256-bit it has usually meant lower memory amounts than it probably should be, or a transitionary card between doublings of memory IC capacity. Rumor is there are 24Gb modules (Gb is gigabits while GB is gigabytes, i.e. 8 Gb, so eight 24Gb modules is 24GB of VRAM). I would expect next gen to be 256-bit 24GB if Nvidia doesn't have to pay more than 32Gb ICs would cost. It's a weird volume-discount thing: basically, if no one uses 24Gb then it'll cost too much.
@@cracklingice I have a 16Gb card and in some games it goes up to 13.5 with max settings, sometimes almost 14. That won't be enough for much longer. And about your “history” with the 256 bits, I call that conscious stagnation instead of further development. But hey, a 24Gb card with a little more than 256 bits would hurt the strategy of "gently" getting people who always buy 80s to spend more on a 90s. A smart move, albeit a shabby one!
@@-Biber 16Gb card? Sheesh. GTX 1050 2GB?
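For anyone lost in the Gb/GB back-and-forth above, the bus-width arithmetic can be sketched like this. The bus widths and module sizes below are the rumored/hypothetical configs from this thread, nothing confirmed:

```python
# A GPU memory bus is built from 32-bit channels, one GDDR module each.
# Capacity = (bus bits / 32) modules * module size. Module sizes are
# quoted in gigabits (Gb); divide by 8 to get gigabytes (GB).
def vram_gb(bus_bits: int, module_gbit: int) -> float:
    """Total VRAM in GB for a given bus width and per-module density."""
    modules = bus_bits // 32
    return modules * module_gbit / 8

print(vram_gb(256, 16))  # 16.0 GB -> rumored 5080 (eight 16Gb ICs)
print(vram_gb(256, 24))  # 24.0 GB -> would need 24Gb ICs
print(vram_gb(384, 16))  # 24.0 GB -> hypothetical 384-bit 5080 Ti
print(vram_gb(512, 16))  # 32.0 GB -> rumored 5090
```

This is why a 256-bit card jumps straight from 16GB to 32GB unless the in-between 24Gb density is actually used.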
If 600w is the "standard" max power draw of the 5090, that would explain why PSU makers are starting to add a second high power connector to support these cards. The partner cards will inevitably be pushed to even higher power consumption, so they will need another power connector. I kinda want to get one, but I don't want to have to upgrade my air conditioning to offset the heat this thing will pump out.
The 5080 is such a huge cutdown from the 5090, it's crazy. It's like it's actually supposed to be a 5070. You could easily fit another tier of graphics cards between the 5080 and 5090, with a 320- to 384-bit bus, more VRAM, and overall more performance.
They will. It will be called a 5080 ti.
Well, the thing with getting specs and possible performance for the halo Nvidia product is that it doesn't tell us anything about the cards we would actually buy (5060-5080). The gap between the halo card and the second one has always varied so much. It could be 5% or it could be 25%.
I wonder how much the 4070 and 4080 will come down in price after the release of the 50 series.
Current prices are still crazy for me. I come from the land of 1070 prices, which is the card I currently have.
512-bit 32GB? I wasn't expecting that. That's insane. I ain't complaining, though. This should provide a massive performance increase at resolutions such as 4K. I think the 5090 will have anywhere between 168-176 SMs.
If true, it is probably the first ever 8K-capable GPU.
@@daniil3815 Maybe it could do native 8K high settings at 60+ FPS in quite a few games without RT.
@@Gamer-q7v True, it's the only 50-series card that will be good, unfortunately.
@@ZackSNetwork We will have to wait and see. These leaked specs for the 5080 aren't official. There is still hope it could be much better.
Would it be able to run path tracing Alan Wake 2 at native 4k 60?
If they really are going to sell a fully enabled 102 die, it's going to be very expensive with very low supply. I have to assume it's going to be on a 5 nm variant because if it's 3nm this thing is going to be an absolute beast.
RTX 5080 = RTX 4080 12 GB
Damn 😂. Funny but no way.
5080 is a joke but no one's laughing. At least if those specs are true. And it's probably still going to cost like 1300-1400€ while 5070 start from 1100€ with 12GB again 😂 (And these prices include tax.) They won't sell and then comes Ti and Super models with the same price and more VRAM. History often repeats itself because humans won't ever learn from their mistakes.
Nvidia, you need to increase the VRAM amount by 4GB for the 5080 and every card below it. If you're going to push technologies like RT and frame gen that increase VRAM usage, then you need to start giving enough VRAM to use them properly. Especially with current card pricing: 5080/20GB, 5070/16GB, 5060/12GB. Creatives are using these cards as well, and they also want more VRAM.
The only way this 50 series could surprise me is the release price; anything less than 2k for the 5090 would be surprising 😥
When the 3090 came out and it was 1500$ (MSRP, that is...) I looked into my crystal ball and predicted that the 6090 will cost at least 3000$. Well, with these specs of the 5090 I don't think it will cost any less than 2000$, but probably more like 2500$, meaning that prediction for the 6090 won't be too far off...
I can already imagine the comments... "I'm a creator so I need the 5090!"
Jensen is soon coming for your money, guys, and I guess he won't be disappointed.
To the people that are pissed about the specs or think the price is too high (which they will be) - no one is forcing you to buy these cards.
But yeah, once they release these, the price of 4090-series cards will be way more affordable. And tbh it's still a great card.
My prediction was 168 SMs with a 512-bit bus. This would be a similar ratio of cut SMs as the 4090. So this is definitely going to be 60-70% faster than the 4090. They're not going to make a Titan, so this will cost at least $1999.99, up to $2499.99. Then the 5080 will be 10% faster than the 4090 and launch at $999.99. A 5080 Ti will probably be 20% faster than a 5080, have 22GB of VRAM, and be $1399.99 to $1599.99.
I'm the type who will be going for the 5090. I was one of those waiting in the Best Buy queue back in 2020 for the 3090, which I was able to get.
32 gigabytes is nice... but 600 watts... I've owned space heaters with lower wattage.
What I am looking for is slightly more performance than my 4080 but less power. If these are just super power-heavy, then I may skip.
Geez seems like they're really making the 5090 double the card of the 5080. Almost everything is double. That'd be a heck of a performance jump over the 4090 if the 5080 is about on par with it
This is great news I’m ready for the RTX 5090.
Huge diff between 5080 and 5090, the 5080 Ti is certainly in there somewhere!
I'm calling it: the 5090 is NOT going to be a gaming card but rather a Titan. Especially if it receives 600W, I don't think it's going to be "normal". I also think it's going to cost ~2.5k.
To be frank, the RTX 3090/3090 Ti and RTX 4090 *are* the Titan cards of their respective generations.
Titans by generation (descending order) vs gaming flagship:
20xx (Turing): Titan RTX, $2500 - RTX 2080 Ti, $1000
10xx (Pascal): Titan X Pascal + Titan Xp, $1200; Titan V (Volta), $3000 - GTX 1080 Ti, $700
9xx (Maxwell): Titan X, $1000 - GTX 980ti, $650
---
7xx (Kepler): GTX Titan, $1000; GTX Titan Black, $1000; GTX Titan Z*, $3000 - GTX 780 Ti, $700
GTX 690* (Kepler)
GTX 590* (Fermi)
*dual-GPU with 2x GPU, 2x VRAM
Of all of these, only the Titan V wasn't really a gaming card. I agree that the 5090 will be a $2500 "Titan"-like product.
Note that the 90-class of old (GTX 590 and GTX 690) were *dual*-GPU setups, and no generation since has had both a 90-class *and* a Titan product, though some generations had multiple Titans (e.g. 7xx had *three*, including the dual-GPU Titan Z; the Pascal era had the Titan X and Titan Xp, with the Volta Titan V alongside).
I also just got a monster 4070 Ti last week but can't use it to its full power. Waiting on my RMA PSU. Should I sell the card when the 5070 comes out next year, or should I just keep it?
400W for the 5080? That's insane.
8800XT it is.
The 8800XT would be good only if priced at $500; if it's $600 or more, then maybe just get a 5070.
@@letsplay1097 Nah man, a $600+ 5070 with 12GB of VRAM is not even usable.
Nshitia has become just like Shintel was before Ryzen appeared; they deserve the same treatment!
If the leak is correct, the 5090 will have to be a dual-12VHPWR card for third-party OC versions. This is just nuts.
I still have not ever used all of the 24GB of VRAM on my 4090, 32GB is crazy but why not I guess!
Glad I’m still on 1440p with a 3090. Runs everything fine. Wouldn’t mind switching to 4k 240hz but I’m not paying $1000 for a 16gb card and that’s probably what the 5080 will cost.
And when should the release window be?
Is this like the 7th time they've leaked different specs? And it's from the same leaker; I don't know how credible he is after leaking so much contradicting info. Not saying it's not possible; Nvidia probably was indeed considering all of those different specs, but it is annoying.
Titan Series will be 48gb with a dual power connector.
600 watts is a lot, but it sounds like it is going to be worth it, in terms of power usage. How much it will cost and if it will be worth the price is another matter. Thumbs up!
16GB for the 80 class while the 90 class gets 32GB makes no sense to me. There will definitely be a 5080 Super or Ti with 20GB coming later.
Yeah, the only way to get an RTX 5080 with more than 16GB of VRAM is if it were a seriously cut-down GB202 die, which I still believe is on the table, as there is a long way down from GB202 to GB203, and I doubt anyone will have high yields for dies meeting the specs claimed here for the RTX 5090, given its size and transistor count. It would make more sense for Nvidia to use the heavily cut-down GB202 dies as the RTX 5080 so it has a chance to be close to the RTX 4090D in performance. GB203 would be better as the RTX 5070 tier.
A $1000 card with 16GB of memory. God bless Nvidia 😂
If the performance checks out then why not. If a game takes more than 16GB vram we need to cancel the devs to oblivion.
🤢
@@mikelay5360 Bruh... games are only going to use more and more rams as visual quality improves. Its nonsensical to expect games to forever use under 16gb and have improved quality generation after generation. If anything its the GPU manufacturer that is retarded to gimp on the VRAM, especially for high end and expensive cards.
@@skychaos87 Total BS and laziness for devs. We will cancel them to oblivion.
@@mikelay5360 I remember when people said that about 512mb.
Poor 5080 with 84 SMs. Wait for the 5080 Ti; there were leaks about a 108-SM xx80-class card.
Have a nice trip! Why are you treating this as close to fact? Kopite is wrong as often as he is right. I think a 512-bit bus and 600W is too high for a regular 5090.
How much is the expected market price of the 5090, not the MSRP?
$2500 to $3000 is my guess
So now for the price of a used car you can play Fortnite in 4K 120... insane.
$2000 at least, $2500 at most
@@mmmidnight1812 Get a PS5 and stop bitching. You don't have to get a 5090.
@@Mayeloski Knowing Nvidia the MSRP can't be less than $2000, it could legitimately just be $2500. The market price will of course be higher at least in the beginning.
RTX 5080 with a full GB203 die.... 😮 YES! That's what I've been waiting for. Hopefully the price is right. I'm probably gonna shoot for the FE edition, because the Nvidia 4090 cooler (if they use the same one, that is) is built well enough to handle it, and it's a slick premium design. From a 3070 to a 5080... a good upgrade, I'd say.
Watching the footage of RTX racer makes me sad we never got to play the tech demo…
Lol, that's a pre-rendered scene, like an animated movie. SMH!
Perhaps so, but they said multiple times that a playable tech demo of it (as pretty as the trailer or not) was going to be available... yet it never was.
Could the 5090 be two 5080 dies glued together, like their data-centre cards?
Why did he sound so surprised at the gap between the 5080 and 5090? The current AD103 4080 die is almost half the size of the AD102 4090 die.
60%, not 100%, it's an entirely new kind of mental
The 32GB of memory makes me think a Ti must be in the works. 32GB isn't meaningful for AI models, but 48GB is. Once you hit 48GB of VRAM, it opens a lot of doors for local AI models, etc. So 32GB seemed weird to me; there's nothing in AI that needs 32GB, but maybe it's for running GTA6 😂
I def will be waiting on the 5080 Ti
Will they have 40%+ IPC increase? Or maybe 140% IPC increase?
I plan to build a new PC in the new year, waiting for the new CPUs/GPUs that will fit my 1440p configuration (I don't want a 4K one, way too expensive).
5080 at 400W... no thanks! The price will be over €1200 anyway, so no way I'm buying a +€1000 video card.
I'll certainly go full red on CPU and GPU. 1440p fits my needs, knowing that I plan to move to an OLED screen, and 1440p OLED screens are more affordable than 4K ones... and if you get a 4K OLED you have to have a really powerful video card (a +€1400 one), otherwise it's pointless.
Get 7900XTX on blackfriday deals or wait for the 8900XTX.
It's a 1440p beast of a GPU with the best FPS-to-price ratio.
You think it'll be able to handle Portal RTX at 4K native? It'd be really impressive if it can do 60 fps or more, since the 4090 can't even do 30.
My 3080 OC2 is set up with a 400W BIOS, so eh. 16GB of VRAM is fine, and 256-bit should be fine with GDDR7; it should still be over 1000 GB/s.
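That bandwidth claim roughly checks out arithmetically. A quick sketch, assuming 28-32 Gbps GDDR7 pin speeds (plausible early-GDDR7 rates, not confirmed for any specific card):

```python
# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin rate (Gbps).
# Pin speeds below are assumptions about launch GDDR7, not confirmed specs.
def peak_bandwidth_gbs(bus_bits: int, pin_gbps: float) -> float:
    """Return theoretical peak bandwidth in GB/s."""
    return bus_bits / 8 * pin_gbps

print(peak_bandwidth_gbs(256, 28))  # 896.0 GB/s
print(peak_bandwidth_gbs(256, 32))  # 1024.0 GB/s -> over 1000 GB/s
print(peak_bandwidth_gbs(512, 28))  # 1792.0 GB/s (rumored 5090 config)
```

So a 256-bit bus only clears 1000 GB/s if the modules run at 32 Gbps or faster.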
I have only pcie 4 on my motherboard. Is it possible the 5080 will be pcie 4 or not? Also if it is pcie 5 would I lose a lot of performance with pcie 4?
My PSU could run a 5090 if I didn't use the CPU, Mobo, fans...
My guess is the 5080 is cut down so much because of the US government's export restrictions on China and other countries, assuming the government won't allow it out without a lot of rework.
Wow! The 5080 with only 10,500 CUDA cores, that is hilarious 😂. That is barely HALF of the 5090, and people complained about the 60% difference between the 4080 and 4090.
400 watts for the 5080 is way too much; I can't believe the amount of heat it will have to dissipate.
I am way more interested in what Intel has been cooking.
well I'm expecting the 5090 to be at least $2000 some are saying 2500, its possible but it's insane, but we gamers don't matter to NVIDIA anyway, the 5090 will sell out like hotcakes... as for the 5080 I don't think we will be getting the same price as the 4080 super at $1000, most likely gonna go back to $1200 if not more... no competition on the high end NVIDIA can do whatever they want, in the end they will sell it all... mainly to AI right now
Will PCIe 4.0 be a bottleneck for the next RTX GPUs?
Only 21,000 CUDA cores 😭 why not the full 24,000?
RTX 5080 at 16GB. Bhahahahahahahahahahaha!!!!
5080 will still be significantly faster than anything AMD or Intel will have. Cope.
@@johnc8327 Doesn't matter how fast it is when you spill over 16GB of VRAM. Lol
In what games do you need 16GB VRAM? Most games seem to use 10GB VRAM.
@@Shadowsmoke11 4K Cyberpunk with ray tracing. Easy...
Since I've been gaming I've upgraded every series. Not this year; I'm going to stick with the 4090 and wait for the 60 series.
I hope the 5070 has 16 gb of vram
Hopefully the 5080 is better than what the leaks are saying; if not, honestly I'm just gonna spend the money on the 5090.
80 class is garbage go 5090 or nothing.
Fact check:
1070 > 980 ti
2070S > 1080 ti
3070 > 2080 ti
4070 TI S < 3090 ti
5080 < 4090 (guess)
@@Uthleber Maybe that's a reason Nvidia is taking the 4090 off the market: the 5080 is half of a 5090, so it won't outperform it.
In what world is a 3090 Ti faster than a 4070 Ti Super?
@@f.iph7291 It's probably only faster in certain games at 4K.
Your info is wrong, LOL
The 1070 is slower than a 980 Ti; I had both. The 980 Ti is as powerful as the 1070 Ti. A 1080 Ti is as powerful as the 2080, and a 2080 Ti is a little more powerful than a 3070, probably more like a 3070 Ti.
I would not spend $1000+ on a card with only 16GB of memory. I would wait to see if they release a 16GB 5070 Ti Super and go with that; if they don't, just go with a 4070 Ti Super and wait 5 years.
Are you sure it's a 5090 and a 5080? It sounds more like a Titan and a 5090 at this point.
Honestly, for the price and this period in time, that card should be 64GB. Just like my 4090 should have been 48GB.
5090 is going to be an absolute AI beast.
Nvidia should bump the 5080 to 24GB, more so because the 5070 should get 16GB and the 5060 12GB. I hope we don't see 8GB Nvidia cards this gen.
400 watts? Ridiculous.
I was looking forward to this, but now I think I will wait until AMD brings out their next top-end GPU to see how it performs, how much VRAM it has, and the price.
It should be $10k... f'ing unbelievable.
In my country it will be 50k.
@@Magovit Honestly, f Nvidia and their prices.