I really want them to release the 5050, because then when someone asks how the performance is, they can say it's 50/50
unfortunately there was no RTX 2020 for that: "what can you see?? everything, it's 20/20"
If they release a 5050, they will have no choice but to release a true 5060 (not the fake one, like it was with the 40(5)60).
@@stangamer1151 Why won't they have a choice? What's stopping them from doing the exact same thing over again since so many Nvidia sheep bought it the first time?
How'd you get like that?
Paul: Every time Nvidia raises prices I do 1 push up.
Ladies and gentlemen I’m Paul’s traps
😂
💀💀💀
paulo dianabolo
Darth Jensen shall gain unlimited power
And it will be interesting to see
The more buy, the more you save
I would buy 2 but since they KILLED sli I'll just have to buy 1
Bigger question about the 5090! Are they changing the power cable???????
He already answered that in this very video. Weren't you paying attention?
2:19 "oh by the way, all of these are using a 1 x 16 pin power connector".
@@Aspire705 Looks like you weren't paying attention over the past 18 months. I meant (and I'm sure you know) changing the connector to stop them melting.
My RTX 4090 already uses a power limit of 500W, so the 500W 5090 will be fine.
Ok, Imma hit the gym now, see you when I get back.
Damn son, Paul getting ready to lift full pallets of RTX 5090s
7990 XTX would probably make sense given how AMD is not gonna make a high end RDNA 4 GPU and to stay competitive they can just re-release an overclocked version of the RX 7900 XTX and call it a 7990 XTX.
RDNA 3 can overclock really well, so it's possible for AMD to push out an extra 15-20% performance by overclocking a 7900 XTX. The power consumption and heat would be crazy, but it's AMD's only chance to compete against NVIDIA. Besides, the 5070 is definitely gonna be as fast as a 7900 XTX or faster.
If there is a refresh of RDNA 3, it will likely be for the flagship Navi 31 chip that powers the 7900 XTX. This might be the 7990 XTX with faster memory, higher clocks, and more cache, which could be a possibility. If you take all those upgrades into account, then you could end up with a 7990 XTX that comfortably outperforms the 7900 XTX by around 25% in gaming, possibly slightly more. I do think a refresh for the 7900 XTX would make sense.
@@Gamer-q7v A refresh is not going to gain that much more performance. It will be like the 6900 XT to 6950 XT.
@@ZackSNetwork RDNA 3 didn't meet AMD's performance targets. If AMD can fix that with a refreshed RDNA 3 GPU and then significantly increase the clock speeds, memory speed, and amount of Infinity Cache, the 7990 XTX could definitely be a robust upgrade over the 7900 XTX. A 25% performance increase isn't out of the question. It's actually fairly doable if AMD decides to create a 7990 XTX with all those upgrades I just mentioned. Clock speeds and memory speeds tend to scale well in terms of performance. The 6950 XT is only around 10% faster than the 6900 XT. A potential 7990 XTX could easily exceed those sorts of minor performance gains.
Holy shit, Red Gaming Tech turned into Hulk, wtf. You look crazy good, man, honestly!!!
I just got a 4070, so I'm set until the 6070.
Will it, though?
I think I've watched videos on this channel going back like 7-8 years and I didn't know you were diesel until today bro.
That GPU from AMD is not coming out, just like the 7950 XTX didn't, and that was on the list earlier too. It happens.
A 7990 XTX won't be a new GPU; we will see the 8000 series next.
Imo it's ridiculous to see a 256-bit bus in the 5080 and then 512-bit in the 5090. They could have done over 300 for the 5080.
I mean, why not make it 384-bit like the 4090? You can tell anything under the 4090 is going to be super watered down and give smaller boosts over last gen than any previous generation, going by the specs. Yes, we know they can perform better than the specs show, but damn, it's still not gonna be massive or anything. The 5090 will be something insane that will change gaming, probably due to its 4K and 8K gaming experiences. The 4090 can game in 8K at times, OK, so yeah. I just can't imagine us getting 120fps in 4K in any game; that would be game changing.
the 4090 is not a 4k gpu at all, more like 1440p
people still act like 4k60fps should be the minimum, why 60? are you stupid
Yeah the 1% are just DYING to get their hands on 8K gaming.
@johndoh5182 Regardless, it should have been 384-bit, idc what anybody says. It should match last year's flagship, and it would help memory; we know it will be 16GB again also.
Also, the remark was just speaking on how advanced we are getting, that these things are that good. As you see, I ended it with 4K 120fps because that's what we want. A stable full 4K 120fps on almost all games, even without ray tracing, is fine also. Can't expect too much lol
I wonder what the slot widths will be. If I go 4090 to 5080, will I get the same performance with less slot width?... curious
Fun thing is that the RTX 5090 will be capable of running the maxed-out CP2077 sequel only at 1080p/30 without FG. I suppose it's gonna be released at the end of the RTX 70 cycle, closer to the PS6 Pro announcement. The future looks bright if we survive.
Personally, all I honestly care about is the 5060 or even a 5050 when it comes to possible future GPU purchases.
If it beats current gen or 30 series 😏
Then you are wasting your money and time.
@@theanglerfish5000 yes, probably, but not by much; 6000 wins 😂
Path Tracing right now is sorta ghosty, has blurry reflections, lights have lag etc. I assume, they're pumping every ounce out of the 5090 to bump up raycounts, cut back on denoising etc to finally get PT to produce the fine shadows and details to back up the tonal balance that PT naturally brings.
Let me get this straight... we still have connectors melting on the 4090 and jensen's answer to that is to draw more power through that inferior component? Sounds like a nice fire hazard. Besides melting plastic i smell a lawsuit waiting to happen.
Most 3090 Tis drew 480 watts, and a lot of them went up to 520 watts on this connector too. So 500 watts is the same, although I assume this is just the base.
@@user-st7jk2ze1n First, leaks like this are often wrong.
Next, if you want to compare, you have to use numbers, not just say "most".
TDP is TDP so if a 4090 and 3090 Ti TDP is 450W and the 5090 is 500W, then 500W > 450W by 50W regardless of what some AIB will do with them. If an AIB pushed a 3090 Ti up to 500W then you could assume they'd push a 5090 up to about 550W.
The connector sucks.
Engineers even said the connector can deliver max 600 watts; after that it's gonna burn.
@c.s127 The problem is the connector is pushed a lot closer to its limits than the 8-pin is. The 8-pin is safe up to 288W despite being rated for 150 watts each, and high-end cards have 3 of them even for 350-watt TDPs. The PCIe 5 12-pin connector is rated up to 600W and cards are pulling close to that. We need 2 of the 12-pin connectors on 90-tier cards.
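A rough sketch of the headroom argument above, using the commonly cited figures from the comment (150 W per 8-pin, 600 W for the 12VHPWR connector; these are informal community numbers, not official spec quotes):

```python
def utilization(draw_w, rating_w):
    """Fraction of a connector setup's rated capacity a given draw uses."""
    return draw_w / rating_w

# Three 8-pin connectors (3 x 150 W rating) feeding a 350 W card.
eight_pin = utilization(350, 3 * 150)

# One 12VHPWR connector (600 W rating) feeding a hypothetical 500 W card.
twelve_pin = utilization(500, 600)

print(f"3x 8-pin at 350 W:   {eight_pin:.0%} of rating")
print(f"1x 12VHPWR at 500 W: {twelve_pin:.0%} of rating")
```

The point being made: the new connector runs at a higher fraction of its rating with no second connector to split the load across.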
Either that tank top is smelling ripe, or it gets washed daily, or there are 10 of those black tank tops in the tank top closet.
so the 5070 = 4070ti super in performance but with less power and only 12GB.... Why can't they just put 16 in it?
The VRAM options depend on the bus size. 16 GB doesn't work with the smaller bus.
For that the 5070 would need to have a 256-bit bus and the 256-bit bus is gonna be taken by the 5080. VRAM depends on the bus size.
128-bit can accommodate 8GB, or 16GB clamshelled
192-bit can accommodate 12GB, or 24GB clamshelled
256-bit can accommodate 16GB, or 32GB clamshelled
and so on...
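The pairings above follow from simple arithmetic: one 32-bit channel per GDDR chip, roughly 2 GB per chip at current densities, doubled when two chips share a channel in clamshell mode. A quick sketch (the 2 GB chip capacity is an assumption; other densities exist):

```python
def vram_options(bus_bits, chip_gb=2):
    """Possible VRAM sizes for a given memory bus width.

    Assumes 32-bit GDDR chips of chip_gb gigabytes each; clamshell
    mode puts two chips on each 32-bit channel, doubling capacity.
    """
    chips = bus_bits // 32          # one chip per 32-bit channel
    normal = chips * chip_gb
    clamshell = 2 * normal
    return normal, clamshell

for bus in (128, 192, 256, 384, 512):
    n, c = vram_options(bus)
    print(f"{bus}-bit: {n} GB or {c} GB clamshelled")
```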
Because they want people to buy something else. Right or wrong, I suspect many, many people will be skipping 12GB (at that price point) just in case it turns out like the 8GB cards this gen.
Given that both Nvidia and AMD are working on a new texture compression standard, we might actually see a decrease in VRAM consumption in the next 2-3 years. If it's to be believed, it can reduce VRAM consumption by up to 60% from what it is now :)
@@Integroabysal Compression algorithms have been around for decades; they are at diminishing returns.
Without competition, and with all available silicon going to AI accelerators, they're going to be charging $2000-3000 for flagship GeForce GPUs.
Paul is bringing the heat to Derrick from MPMD.
The muscle show-off is starting to become an issue.
All of us nerds are happy about your achievements as a great voice, but why rub it in?
These cards are getting heavier and heavier. 👀🔥😉
Wait, when you said "My sources revealed the potential gains" were you talking about AIB insiders or Gym Bros?!
A 7990 XTX seems like a legit potential refresh for AMD. Maybe they got some high-quality Navi 31 chips, so they can throw in more SMs with higher frequency and maybe GDDR6X RAM; that could actually bump up the performance by up to 20% compared to the 7900 XTX. This seems very much like an AMD move to me, since the rumors say they don't intend to continue the chiplet design for GPUs.
Paul, dude, never apologize for wearing a muscle shirt - in fact - just make the videos while shirtless, and stand in a T-pose to assert dominance. OOH RAH!
The 5090 is looking like the fire-breathing monster we thought it would be. How will that little power connector hold up? We might find out, with it using 500 watts of the 600-watt capacity that many of those cables have. We may need heatsinks on those power connectors. Thumbs up!
damn. bro got big 😮😮😮
Don't forget that there are more people using NVIDIA GPUs for CUDA than gaming. CUDA is NVIDIA's bread and butter. I'm hoping TechPowerUp's 100 FP32 TFLOPs isn't the actual performance to expect from the 5090, since it's only marginally better than an overclocked RTX 4090. Given that it's 3nm instead of 5nm, I'd hope to see power efficiency improvements.
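For what it's worth, that 100 FP32 TFLOPs figure can be sanity-checked with the usual peak-throughput formula: cores × 2 FLOPs per FMA × clock. The SM count and boost clock below are leaked/assumed values matching the thread's speculation, not confirmed specs:

```python
def fp32_tflops(sm_count, cores_per_sm, boost_ghz):
    """Peak FP32 throughput: each core retires one FMA (2 FLOPs) per cycle."""
    cores = sm_count * cores_per_sm
    return cores * 2 * boost_ghz / 1000  # GFLOPs -> TFLOPs

# Hypothetical figures: ~170 SMs at 128 FP32 cores per SM (Ada-style
# layout) with a ~2.3 GHz boost clock lands right around 100 TFLOPs.
print(f"{fp32_tflops(170, 128, 2.3):.1f} TFLOPs")
```

So the TechPowerUp number is internally consistent with those leaked specs; whether the specs themselves are right is another question.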
A 7500 XT makes sense as likely to arrive, as there HAS to be some cut-down yields from the 7600.
Ladies and gentlemen, I'm swole.
Yes, yes you are. Nice gains
Again... if their meltnector can't handle the 4090, how dare they use it in the 5090?? Green brains = rotten brains?!
@@theanglerfish They need to give it 2 power connectors and not cheap out. Especially given the MSRP we know it will have.
@@user-st7jk2ze1n yep but i guess they don't x)
7990 XTX would be fun. "Oh hey guys, we finally figured out multi-chip RDNA3 😈"
I wonder what the range is for power draw from running notepad to running cyberpunk.
For me it's crazy with the 12-pin connector. People already said it can deliver max 600 watts and after that it can burn itself, but Nvidia doesn't care at all.
Meanwhile my undervolted 4090 maxes out at 250W.
Some people like to overclock as a hobby. I like to undervolt. I enjoy pulling the power draw down as far as possible while keeping the performance as close to stock as possible. :)
If the 5090 is 500W at stock I will do the same) 350W is the maximum I want in my GPU; otherwise I will need to run air conditioning in my apartment even in winter 😅
Ever heard of capping your fps, smart person?
Just give us the best setup for the card and the cheapest it can get. They're going to go more into AI anyway so just let the gaming consumers have their fun with the consumer gpus.
Still curious to see the tflops of the 5050 at just 100 watt tdp 🤔
My Gigabyte Eagle OC 4080 combined with the Intel 13600K can heat my room during gaming. Anything more than, say, a combined +80 watts would be intolerable. I was hoping for the 5080s to be more efficient. But no. Sad.
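The room-heater effect is easy to estimate: essentially every watt a PC draws ends up as heat in the room, and 1 W ≈ 3.412 BTU/hr. The wattage figures below are rough assumptions for a gaming load on that kind of setup, not measurements:

```python
def btu_per_hour(watts):
    """Heat output of electrical power dissipation (1 W ~ 3.412 BTU/hr)."""
    return watts * 3.412

# Assumed gaming-load draws: ~320 W for a 4080-class GPU, ~180 W for the CPU.
gpu_w, cpu_w = 320, 180
total_w = gpu_w + cpu_w
print(f"{total_w} W ~= {btu_per_hour(total_w):.0f} BTU/hr of room heating")
```

Around 1700 BTU/hr is in the territory of a small space heater, which is why an extra +80 W generation-over-generation is very noticeable in a small room.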
Looking great man! Keep up the good work!
Omg, the 5090 will be at least 30% stronger than the 4090, what a beast
I think we all need to chip in so he can buy a whole shirt. I know times are hard but we could help.
I could see Nvidia doing the 5090 at a 448-bit bus; that way they can do a 512-bit 5090 Ti if they need to respond to AMD releasing something very powerful.
Rumor is the high end of RDNA 4 will compete with the RTX 5070 but be much cheaper.
A 5090 with 172 SMs is what I gather.
what is this channel now - redgaming tech GRINDER?
5090 performance is a difficult decision for Nvidia.
On the one hand: if the performance uplift is not high enough, people might not upgrade from the 4090, and the 3090 owners who thought $1600 for a 4090 was not worth it will probably not pay $2K+ for any GPU.
On the other hand: Moore's Law is pretty dead. So if they go all out now, they will have a hard time convincing 5090 owners to upgrade again.
10:09 >5090TI or TITAN.
Yes, but when? Because if they release it 15 months later, and then it's sold out everywhere for 3-4 months, you might as well wait for the next gen, which is only a few months away at that point...
A TITAN at launch would be pretty nice, but it depends on the VRAM (they usually have double), so 64GB of VRAM. If they price it well at around $3.5k-4k it would be nice... still less than the pro version, which costs $6k+, but also more expensive (thus profitable for them) than the original RTX Titan, which cost $2.5k.
I really don't think they'll build a 5090 Ti... No competition, no need to use the top chips for gamers.
@MarcoGiacon The 5090 Ti will release like 18 months later, like the 3090 Ti did, and compete vs the RDNA 5 top end.
@@user-st7jk2ze1n like the 4090ti?
Damnnnn when did you get so buff?????
How many grams of protein do you think Paul eats per day?
Who are they competing with?
It's 5090S, not D; 2025 is the year of the Snake, not the Dragon... such a fitting name for a $2500+ GPU
While that is true, Nvidia will still use the D moniker because it will sell more in China.
I'm not up on Nvidia since their prices are outrageous, etc. Is their 5090 launching this year?
Regardless, as mentioned above, they'll still use it.
@@bear5016 Some say the 5090 is a paper launch, but the 5080 goes first so it can be exported to China. So it could be either way really, but it should be Q4/Q1.
Good. Big Power = Big Performance! 💝💪👍
Dam Paul you getting swole up huh lmao💪
A 5090D doesn't make any sense. Wouldn't a 5090D be more powerful than a 4090? So they're only banning the most powerful GPU of a generation, but anything else is fine even if it's better than the best of the previous gen?
Am i the only person looking forward to the 5080?
Burnt Connectors will be aplenty.
Isn't this 5090 on a smaller node, going from 5nm to 4nm? Why would it use so much more juice?
'Cause it will be a lot faster
@@maramark my 4080 uses a lot less power than my 3080ti did.
Higher clock
@@Thrawn655 My 4080 has a much higher clock than my 3080 Ti. I have a great chip; my 4080 gets up to 2850MHz and the 3080 Ti only got up to 1950MHz. That's not it. I think the leaks are wrong.
It looks like Nvidia will make the 5090 an absolute waste if they come out with another 20 percent cut in SMs from the already cut full chip. That would explain higher clock speeds over the previous 2.5GHz (TechPowerUp) to hit the previous performance targets.
The more you buy, the less you get.
But I don't worry about the performance uplift being small; Nvidia is finally a software company 😅.
Is it going to be a PCIe 5.0 video card?
500W lol....
In the navy!
Sup Swole 👊🏻
It's time to say goodbye.
Fed up with these clickbait titles over and over... and the Windows key selling scheme.
PEOPLE! You do not need a Windows key. And even if you need one, there are plenty of Google tutorials on how to do it for FREE.
That's called theft.
@@jklappenbach Microsoft literally removed people's access to Minecraft if they didn't move from a Mojang to a Microsoft account in time. A game you legally paid for no longer belongs to you. So I say they can handle some lost key sales. Would be a shame if someone could google "windows key activation GitHub PowerShell" and hurt the billion-dollar profits.
Uh. Being paid by advertising is the way media has worked since the 1800s. Also, the title was the only topic discussed in this video. You need medication. 💊
❤😂🎉 first like good luck