People really need to stop giving nvidia money.
They definitely won't
@@BlueMax109 True, unfortunately.
But we need faster cards.
Starting to reach a point where I can't even if I want to. Prices are getting so ridiculous that instead of upgrading every generation like I used to in the pre-crypto days, it's now skip a gen, and if they keep bumping prices up it will be skip two gens.
Not paying for a GPU that costs as much as the rest of the PC.
People really need to stop bitching about the price of every new tech product. The GPU market has changed. There is a ton of competition for cutting edge process nodes. Nvidia can sell GPUs to data centers and enterprise clients if you don’t want to pay for it.
$800 for 60 class silicon. Man these guys are insane.
People will stupidly pay the markup with a smile, as they’ve been doing since 2020.
Since 2016 you mean. Probably even longer than that 😂
@@xthesayuri5756 I know for a fact since 2011. Nvidia didn't even have the best card then.
@@xthesayuri5756 The 980 Ti was the beginning of the steady markups; before that it was up and down with halo GPUs.
What markup? 4090 was actually quite good value, at least if used for content creation. For gaming, well, less so.
@@zapador Content creation? You mean making dogshit 4K videos no one needs, or VR content 7 people are engaging with?
People seem not to understand that these GPUs are scraps from the big AI chips that didn't pass the QC.
Would you sell a chip at 2k when it can be sold for over 10k and having people knife battle over it?
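For anyone who wants that trade-off spelled out, here's a tiny sketch of the opportunity-cost arithmetic; the $2k and $10k figures are the comment's round numbers, not actual Nvidia pricing.

```python
# Round numbers taken from the comment above; not actual Nvidia pricing.
gaming_price = 2_000        # what a flagship gaming card might sell for
datacenter_price = 10_000   # what the same class of silicon could fetch as an AI part

forgone_revenue = datacenter_price - gaming_price
print(f"Each die sold as a gaming card forgoes roughly ${forgone_revenue:,}")
print(f"The AI part sells for {datacenter_price / gaming_price:.0f}x the gaming price")
```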
That's fine, but if they are scraps then they can give us cheap prices on the leftovers.
People normalized BS from companies.
Guess I'll wait for the 6000 series then to get my 5080.
Considering how cut down the 5080 is compared to the 5090, I cannot see it being faster than the 4090 if the 5090 is just 30-35% faster.
The 4090 is up to 35% faster than the 4080 in heavy games, and around 25% on average.
Either the 5090 is 35% faster than the 4090 and the 5080 is at parity with the 4090, or the 5090 is 45-50% faster than the 4090 and the 5080 is 10% faster.
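If it helps to see that arithmetic worked out, here's a minimal sketch of the two scenarios above; the percentages are the thread's rumored figures, not confirmed benchmarks.

```python
# Rough sanity check of the two scenarios in the comment above.
# All percentages are rumors/estimates from the thread, not confirmed specs.

def relative(base: float, pct_faster: float) -> float:
    """Return performance relative to `base` when a card is `pct_faster` percent faster."""
    return base * (1 + pct_faster / 100)

baseline_4090 = 1.00  # normalize the 4090 to 1.0

# Scenario A: 5090 ~35% faster than the 4090, 5080 roughly at 4090 parity.
scenario_a = {"5090": relative(baseline_4090, 35), "5080": relative(baseline_4090, 0)}

# Scenario B: 5090 ~45-50% faster, 5080 ~10% faster than the 4090.
scenario_b = {"5090": relative(baseline_4090, 47.5), "5080": relative(baseline_4090, 10)}

for name, cards in [("A", scenario_a), ("B", scenario_b)]:
    gap = (cards["5090"] / cards["5080"] - 1) * 100
    print(f"Scenario {name}: 5090 is ~{gap:.0f}% ahead of the 5080")

# Either way the 5090/5080 gap comes out around 34-35%, far smaller than the
# roughly 2x spec difference would naively suggest, which is the tension the
# comment is pointing at.
```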
Just buy an RTX 4090 and cruise for the next 5 years.
Sell your 4090 now for more than you paid for it while you still can and get the 5090 when it releases
Everyone and their mom are going to be trying to buy a 5000 series GPU at launch before any tariffs are implemented, so unless you are lucky and able to get one early, you could easily end up with no GPU until mid 2025 when you are finally able to buy a 5090 for $2500-3000 after tariffs.
Good luck with that. I'll hold onto mine, thanks.
Remember how that went last time when people sold their 2080 Tis and couldn't buy a 3090?
No way am I gonna spend 2-3k to get 80fps instead of 60
how about 60 instead of 40
But many others will
Hoping for more competition from Intel to bring down the prices of previous-gen products. But I have been disappointed for the last 7 to 8 years.
None of these companies will have a GPU faster than a 5070ti.... So maybe you can get a 5070ti for a bit cheaper than what Nvidia sells it for, but with worse raytracing and upscaling. If the 5070ti is roughly the performance of a 4080 super, then that's probably all you will need even for 4k gaming. If you are looking for cheap, you can probably get a used 30 series card cheap right now.
@johnc8327 There are no used 30 series here; if anything is available it will be as costly as a new one.
I'm gonna go against the grain and say that I don't think Nvidia is going to have big price increases for most of Blackwell. They'd be dumb to put the 5080 at over $1200, and even that seems excessive. The 5070 Ti is gonna be a little faster than a 4080 Super for $800 or $900. This isn't great, but it's technically slightly better price-performance than Ada.
The 5080 price looks compelling until you notice it's really a 5070 (Ti); same goes for the 5070 -> really a 5060.
@@AmUnRA256 5070 and 4070 definitely look like older 60 class GPUs. 5080 looks a lot like an 80 class to me. The 5090 is weird because it's their first 512b bus GPU in almost 20 years, so it's beyond the 4090, 3090 and old Titan cards.
There's a big 4090 shaped hole in this lineup where they could have another GPU with 128SM and 24GB on a 384b bus.
@@coryray8436 yeah the hole is missing a true 5080 with 24 gb
@@coryray8436 Did you see the difference between the 5090 and 5080?!?!? If the leak is to be believed, which part of 50% less VRAM and Cores says 80 class GPU to you?
Why would they? They don't need to win any popularity contests and they don't have any competition. Jensen is gonna rob PC gamers blind this generation... again!
Can't wait to see the AI bubble burst.
Gonna be a long wait.
It probably won't burst. It's so fresh that there are still oodles of things to discover, and when all of that is discovered we'll use it as a stepping stone into the next level and keep innovating.
AI is overvalued by almost 500 percent. It has no monetary value in the short, mid, or long term. The only one making money is Nvidia.
@@yomatokamari You are absolutely mistaken. AI is already being used to increase productivity in multiple areas. AI is already being used to review police body cam footage and write police reports. That's just one example, but there are a lot of other ways it's being used right now.
At some point… the entities that endlessly buy up Nvidia GPUs (Amazon, Microsoft, Google, Meta, and every cloud provider of GPUs under the sun) will have to actually make a return on that investment, otherwise they won't be able to keep buying more and more cards.
Simply put, the AI products need to make a return to justify the investment in hardware.
The most obvious example of this would be OpenAI… at some point THEY NEED to make money and actual PROFIT from their AI products, otherwise they will go bust from their electricity bill alone.
You deserve what you tolerate.
It won't be 600W.
Holy ****! 600W gpu!! Defecting to Team Red ASAP!
I wonder if the 2060-->5060 will have a 50% uplift for 50% more money. 7 years of upgrades :/
2060->5060 is much more than 50% lol. More than 100% even.
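For anyone wanting to check the math on that jump, here's a tiny sketch comparing raw uplift against price increase; the performance index and the 5060 price are illustrative placeholders, not measured figures (only the 2060's $349 launch MSRP is a real number).

```python
# Illustrative only: the performance index values and the 5060 price below are
# placeholders, not measured benchmark data.
old = {"name": "RTX 2060", "perf_index": 100, "price_usd": 349}             # launch MSRP
new = {"name": "hypothetical RTX 5060", "perf_index": 220, "price_usd": 520}

uplift_pct = (new["perf_index"] / old["perf_index"] - 1) * 100
price_increase_pct = (new["price_usd"] / old["price_usd"] - 1) * 100
perf_per_dollar_gain = (new["perf_index"] / new["price_usd"]) / (old["perf_index"] / old["price_usd"])

print(f"Raw uplift: {uplift_pct:.0f}%")                      # 120% with these placeholders
print(f"Price increase: {price_increase_pct:.0f}%")          # 49%
print(f"Perf-per-dollar ratio: {perf_per_dollar_gain:.2f}x") # 1.48x
```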
It's definitely the 60-class buyers who have gotten screwed the hardest in the last 5 or so years. It used to be that a new 60-class card would game at Ultra settings at around 80 to 100 fps or more in most new games. Mind you, this was at 1080p, but more people were on 1080p monitors back then, and you were solid even at 1440p for at least a couple of years.
Now... lol, it's so much worse, and AMD is, well... a decent alternative, but missing out on stuff like DLSS doesn't feel great!
According to Coreteks, the RTX 5090 is 30-35% faster than the RTX 4090, and the RTX 5080 would be 10% faster than the RTX 4090? LOL, that's BS; the RTX 5090 has double the specs of an RTX 5080, so the difference will be way higher than 20%.
Not sure about the lower cards, but the 5090 is surely over 40% faster than a 4090 with those specs. They are going back to the good old days of the 512-bit bus; I miss those days.
Most people should stick to buying old and/or used cards. I'm very happy with the $200 RTX 3060 I got 2 months ago.
It's sad, no, maddening actually, that it has come to this, but yeah, old CPUs and now even older GPUs are fine for many people's needs, unless they want to be $#tards and drive 4K at Ultra with 100 fps! I say downgrade to 1080p 27-inch monitors and stick to older GPUs!
We have witnessed a monopoly; we are literally giving them money for their cheap decisions...
I don't know much about business, but having a 60% net profit margin is actually crazy as far as I know.
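As a quick illustration of what a 60% net margin means; the dollar amounts here are made up for the example, not Nvidia's actual financials.

```python
# Made-up numbers purely to illustrate what a 60% net profit margin implies.
revenue = 30_000_000_000       # $30B of revenue in a quarter (hypothetical)
net_income = 18_000_000_000    # $18B kept after ALL costs (hypothetical)

net_margin = net_income / revenue
print(f"Net margin: {net_margin:.0%}")
# 60% -> of every dollar of revenue, 60 cents is profit after every cost is paid.
```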
What, no 5050 ti super?
$1000 for a 5070 ti is INSANE!
Most studios are changing to UE5, so... StutterEngine 5 is going to ruin any upgrade you planned to make anyway. Save your money and see how it all plays out.
power here is a bit expensive
I'll keep on buying AMD or perhaps even an Intel GPU, depending on how RDNA4 performs vs Battlemage.
NGreedia is an absolute joke.
If the 5080 were 20 or 24GB at that price I would probably buy it. Guess I'll wait for a theoretical 5080 Ti.
600W for a 5090. Damn. That's going to heat up a room.
A 13900K and 4090 setup is uncomfortably hot in the summer. I put my system in a different room because of that; it can heat up the other room as much as it wants.
A 5090 at 2400 USD costs the same as a whole 4080 laptop with a full warranty. You will get what? 50 more FPS?
You are wrong; the 5080 is the "5080" that won't get unlaunched, because in reality that die should go into a 5070 Ti at best.
This is why we need serious competition in the GPU segment. Somehow the CPU market is in decent shape for what you get with your money, such as a 7800X3D or 9800X3D for almost $500 that feels premium. GPU pricing, on the other hand, is getting more and more ridiculous: you spend $500 on a GPU and get an RTX 5060 at best, which is laughable. I really don't care if the Chinese makers somehow enter the competition, just like smartphones back then, where the market changed significantly after they entered it.
All the industry surrounding chip development that China now controls will likely go to Mexico's NE region, bordering Texas. And the low-level stuff will be the easiest to migrate.
At this point just waiting for the Super cards is the way to go. Though I do wish AMD would do some real undercutting with their MSRPs. Some countries base all their pricing off MSRP, so AMD always ends up barely undercutting Nvidia.
Thank goodness I already have the 3090 and 4070.
I hope these cards are as expensive as possible, gamers deserve it 🤣🤣
Nvidia is about to leave a gap in the market with their expensive pricing. This is the perfect time for AMD to come in with the RX8000 Series to fill that gap.
Uhm guys? Not everyone is in the States. I wonder if tariffs will actually make them cheaper in Europe.
Ahahahahahah, you're hilarious. Of course the cost will be passed to the consumer.
Nope, what usually happens is they charge everywhere else more so they can discount the cards for the USA.
Companies, being the baskets they are, just increase the price worldwide, as they did last time during Trump's presidency.
Your next build will be epic!!! I cannot wait to see what cpu you use lol
You were probably not going to buy an Nvidia anyway. And there is no way AMD can benefit out of it with its current tactics because they don't want to compete.
Nvidia will probably be the way to go this gen too.
Intel is the secret future video.
Looking forward to hearing your thoughts on whether they will get their shit together and release a compelling product.
Maybe things get SO bad that the Arc SKUs, mostly as-is, become a compelling buy?
My 1080 desperately needs an upgrade. Got it for $300 second hand, only 1.5 years after release. The guy paid $750 for it new; I still have his receipt.
If I find a good deal like that again I'm taking it. Otherwise the only thing left is praying for AMD to deliver on the value side.
They are going to be too expensive; I won't be buying.
Question from a non-gamer: is gaming on AMD Radeon cards so terribly bad that people actually consider these stupid Nvidia cards? Also, would a person benefit from using both an AMD CPU and GPU compared to using Nvidia? I'm curious because I don't play games, but I was considering building myself a nice Linux TV computer with an all-AMD CPU & GPU build.
amd it is
People say that every generation only to realize that AMD fucks them over too. Companies exist to make money.
not 3nm ?
AMD and Nvidia most likely have a non-compete agreement to allow Nvidia to occupy the halo slot. That way, they set the prices and AMD gets the profit margin benefits while maintaining the status quo. When they had a node advantage, they deliberately stayed off the top performance tier. I gave up on this hobby, it's full of useful idiots that reward this kind of anti-consumer behavior. Still like to support this channel, though.
Nothing new here, Nvidia doing the minimum necessary. AMD won't compete this generation (nothing but a low-to-mid-range GPU), so Nvidia doesn't do much. Maximizing profits.
We have the rumours, we don't have the GPUs.
PS5 Pro looks very attractive if it's true though.
Would a 4060 Ti work with an AMD Ryzen 7 5700X CPU?
Why wouldn't it? Of course it'll work.
300w seems doable
Don't care! *Nvidia can go to hell! Will never buy one!* I'll stick to Radeon, even if they are slower! *My RX 6800 XT is still rocking, and I will see what RDNA 4 will do at CES 2025!*
So I'm NOT in any hurry to "upgrade" my perfect RX 6800 XT! But I will upgrade to an ASRock B650 Riptide, 64GB of fast DDR5 with tight CL timings, and a Ryzen 9800X3D!
Your price and performance predictions are normal from one generation to the next.
But I disagree that Nvidia doesn't have competition. The people suckered by Nvidia marketing tactics are the real problem.
The 7900 XTX is 80% to 110% of a 4090 in rasterization performance but at less than half the price... that's a deal/steal.
Nobody really cares about RT, dlss or any of those other gimmicks.
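If you want to put rough numbers on that value argument, here's a minimal perf-per-dollar sketch; the raster index and street prices are illustrative assumptions, not benchmark results.

```python
# Quick value comparison sketch; the raster index values and street prices
# are illustrative assumptions, not measured data.
cards = {
    "RTX 4090":    {"raster_index": 100, "price_usd": 1800},
    "RX 7900 XTX": {"raster_index": 85,  "price_usd": 900},
}

for name, c in cards.items():
    value = c["raster_index"] / c["price_usd"] * 1000  # raster points per $1000
    print(f"{name}: {value:.0f} raster points per $1000")

# With these placeholder numbers the 7900 XTX delivers roughly 1.7x the raster
# performance per dollar; whether RT and DLSS justify the premium is the
# actual argument in the thread.
```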
This now, after AMD has conceded the high end. I hope you all have fat wallets.
You're underrated.
Well, all of you decided that only Ngreedia GPUs exist, so enjoy your monopoly and prices.
😂😂
April is Dogecoin lols
The 4090 leverages GDDR6X, not GDDR6.
And again, the units are "square millimetres", not "millimetres squared" - houses aren't sized in "feet squared" or "metres squared" - it's really not that hard. Why do you keep making this mistake?
What a huge problem, I hope you can correct these mistakes so the literally zero people who expressed problems understanding the term can finally relax. Keep nagging!
@nimrodery correctness is important, and it's really not that hard in this case.
@@theevilmuppet Nagging is really important, it lets people know you're annoying!
@nimrodery Ah yes - providing constructive advice is annoying.
Good luck with that.
@@theevilmuppet It's not constructive. You didn't even have a problem translating "millimetres squared." It literally means exactly what it says. Solving problems nobody has is cringe.
17 minutes of worthless crap
This guy can moan all he wants, GAMERS WANT NVIDIA. Look at the hardware stats! Not even close.
@@bmwofboganville456 A 12 GB 5070, yeah I don't think you've been listening to gamers.
Six months ago I called GB202 at 1600 mm² / 2 and got the full-reticle die area correct, less dark silicon; the leaker's 744 mm² is correct, which follows my primary research looking at the die shots (yawn), and I got the SM count correct too. Coreteks, you missed that. Recall I commented here: 4 compute arrays in ~800 mm². You're not following along on my reports the way I follow yours, and you missed the best calculation possible 6 months ago with all the cost : price : margin data then and now. My current cost : price / margin assessment is on my Disqus profile. It's not much different from calculating GB202 at 1600 / 2 and GB203 at 1600 / 4, close enough then and way ahead of everyone else now. Yes, I am tired of Nvidia's segmentation on bus width and VRAM, and Nvidia knows that from my regular reports, where I begged Nvidia to correct course on features and price, pointing out that the gamer whining would be intolerable for everyone in the industry, causing a major distraction to day-to-day business that was not worth all the time spent addressing the whining. mb
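For readers trying to follow that comment, here is a minimal sketch of the divide-down arithmetic it describes; the 1600 mm² starting point and the /2 and /4 divisors are the commenter's own figures, and the 744 mm² number is the leak it references, not any confirmed Nvidia spec.

```python
# Sketch of the divide-down heuristic the comment describes: take its claimed
# ~1600 mm^2 "BW 200" figure and split by 2 and 4 for the consumer dies.
# These inputs are the commenter's own numbers, not confirmed specs.
bw200_total_mm2 = 1600

candidates = {
    "GB202 (1600 / 2)": bw200_total_mm2 / 2,   # 800 mm^2 before trimming dark silicon
    "GB203 (1600 / 4)": bw200_total_mm2 / 4,   # 400 mm^2
}

leaked_gb202_mm2 = 744  # the leaked die size the comment says lines up

for name, area in candidates.items():
    print(f"{name}: {area:.0f} mm^2")

delta = candidates["GB202 (1600 / 2)"] - leaked_gb202_mm2
print(f"Leaked GB202: {leaked_gb202_mm2} mm^2 "
      f"({delta:.0f} mm^2 smaller than the naive 800 mm^2 estimate)")
```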
Let's hope Intel foundries ramp up well enough to save us... I don't game in the high end much these days, so I'd gladly buy their midrange GPUs. But AMD otherwise. I own Nvidia stock and I still wouldn't buy their GPUs. Vote with your wallet, people. Unless your daddy is rich, of course. ;)
I hope nobody buys this bullshit
Epyc it shall be👏👏
All this for some CUDA support? Forget that; I'm going with AMD, something I can afford that keeps up with Nvidia's lower tier.
Nvidia will outsell AMD 10 to 1: it IS good for gamers. Objective hard facts.
only because nvidia has the best driver
No, monopolies help nobody, including Nvidia, considering they don't have to make a compelling product.
You are delusional 12:33. Who in their right mind would pay more money for less VRAM?