That is because the cost to manufacture is minimal for Nvidia, but the profit from you buying the XX90 series is HUGE compared to the lower cards. What do you think they want you to do?
And then the 5090 stability/deterioration-over-time complaints come in... Yes, you can push chips really FAR, but always at a cost. See Intel's i9 and i7 13th & 14th gen stability issues.
@@ZackSNetwork Not talking about Nvidia here, but it's worth watching Gamers Nexus and Level1Techs for related info on the CPUs. Even server CPUs run at typical conservative values have ended up having major problems, and I've read numerous comments on related videos from people stating they made those settings changes on their 13th/14th gen i7s/i9s and the problems still worsened over time.
@@ZackSNetwork I didn't cancel my 4080 Super/14900KF prebuilt system order, which is $600 cheaper than it would have cost me to build myself, because I read about the BIOS fix. Maybe people should try to gain some knowledge in their life instead of being total morons.
*If Intel manages to SOMEHOW do well in real-world benchmarks with the Ultra 200 series (vs the 14th gen), then Ultra 300(?) should finally be a good leap forward. Maybe 6.0+ GHz p-core frequency.*
I knew Nvidia would double down on those cables. But to increase the wattage without even making sure those cables have been proven to work, or lessening the failure rate? I would imagine at least 10-20%, 2 out of every 10 5090s, might burn this time. I normally skip one or two gens before upgrading, and it seems I'll be skipping two this time, since it seems AMD won't be releasing anything compelling on their end.
One would hope, given the failure of the connector used on the 4090 at 450 watts (it simply isn't secure enough for the tiny safety margin), that they would use either a new connector or two connectors on a 500-watt card. The 40-series connector really isn't fit for over 300 watts IMHO.
How can you recommend people the 13900 when it's been proven to have a degradation problem? You should be protecting new buyers by informing them, not promoting crap to them.
I just bought this 13900 even though I heard it has problems. I don't know much about it and it hasn't arrived yet, BUT I can return it no problem. But can you explain in detail why it is a bad deal? Would appreciate it.
@@sherdox6672 The newest data shows rapid degradation on the 13900 and 14900, regardless of power settings used, at an alarming rate. Even lower tiers of CPUs from those generations are affected. See Wendell's video on it and the Gamers Nexus one. If you like to gamble, keep it.
@@sherdox6672 It's not a bad deal; the 13900/700 and 14900/700 are literally defective... just look it up. I hate fanboy behavior, but there's no way I'd buy an Intel platform right now.
You can change some BIOS settings that will prevent the chips from degrading, and you will lose less than 5% total performance. Yeah, this shouldn't be a thing and Intel effed up big time, but people who say there is nothing you can do are also spreading lies.
I stopped buying higher-tier CPUs and GPUs because it's just become stupid, with 500W GPUs and 300W CPUs (if you get Intel at least). I don't care for sitting next to a space heater.
Man, I know there are issues to address with the 13th and 14th gen CPUs, like how they get really hot so you need at minimum an AIO liquid cooler, but what else is worrying about these CPUs?
I'd speculate those TDP increases are more for maximum power draw, not typical averages. Plus, it makes sense since the memory has been upgraded from GDDR6X to GDDR7. It appears Nvidia is going through an alternating cycle of performance and then efficiency, i.e. 10 series: performance, 20 series: efficiency, 30 series: performance, 40 series: efficiency, 50 series: performance 🤷♂️
This next generation of GPUs has no high end other than the 5090... Rumor is cheap RX 7950 XTX prices etc., plus a cheap next-generation 8800 XT that beats anything in that price zone. Sounds good for gamers.
Nvidia and Intel: we are going to throw more watts at the problem. AMD: throws more VRAM at the problem, leaves the high-end GPU market, and focuses on more efficiency for their CPUs.
I used to always get the latest and greatest GPU despite the power draw... Not anymore! I don't mind a little performance downgrade if NVIDIA can target a TDP in the 200-350W range for their mid to top GPUs. 400-500W+ is just crazy... I know 'coz my third-party RTX 3090 can draw up to 430W+ running stock, just the GPU alone. Total draw from the wall is about 900W+ 'coz of all the other stuff I have running. I hope they focus more on power efficiency now on both the CPU & GPU... in fact, on all PC components for that matter. 😩
Well yeah, of course there is going to be a massive gap. If consumers are willing to spend $2,000 on a 90-class card because it performs better than the 80 class by a large margin, that's more money for NVIDIA. No competition = higher price ceiling.
People are willing to give Nvidia $2000 for a 4090. Until that changes, this environment we're in will only continue and likely worsen for the consumer.
1x 16-pin for 170W? 100W? Anyway, I don't understand, but I'm excited and worried at the same time. The 5060 is 55W higher than the 4060, which should mean at least a 15% performance boost over the 4060 Ti... right?
The 5050 is basically the 4060 power-draw wise, but the 4060 had basically zero generational uplift in performance over the 3060, so I think the 5060 won't have the same performance as the 4060.
Just because the gap between the 4090 and 4080 is bigger doesn't mean the performance difference is much bigger. The RTX 4080 Super has 71.1% of the 4090's TDP; the 5080 has 70% of the 5090's. That's basically the same gap. Also, if the 5090 is 500W TDP, how are they going to pack it under a 2-slot cooling solution? I call BS unless NV goes with an AIO water cooler this time.
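For what it's worth, the two ratios quoted here check out with quick arithmetic. A minimal sketch, assuming the official 40-series TDPs (320 W and 450 W) and the rumored 50-series figures (350 W and 500 W):

```python
# TDP ratios between the 80-class and 90-class cards.
# 40-series figures are official; 50-series figures are the rumored ones.
tdp_watts = {
    "4080 Super": 320,
    "4090": 450,
    "5080": 350,  # rumored
    "5090": 500,  # rumored
}

ratio_40 = tdp_watts["4080 Super"] / tdp_watts["4090"]
ratio_50 = tdp_watts["5080"] / tdp_watts["5090"]

print(f"4080 Super / 4090: {ratio_40:.1%}")  # 71.1%
print(f"5080 / 5090:       {ratio_50:.1%}")  # 70.0%
```

So in relative terms the rumored lineup keeps almost exactly the same 80-to-90 TDP spacing as the current one.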
I'm looking forward to the 5090, as the 4090 is starting to show its age. Won't be long till the ole 4090 is collecting dust on the shelf with the 3090. No melted connectors either, as it's so simple to plug them in correctly.
Bro, if you plan to cover your sources, next time blur the time and date of the mail as well lol. Unless it was a mass mailing, I doubt they sent many emails at 9:47 AM precisely.
3:10 ~ Next gen Intel has leaked its name: Super-cali-fragile-istic-expo-ali-docious (if you say it loud enough, it does sound quite atrocious). The major question: does this one also degrade over time and crash? The performance is great, sure; will it still be great in a year? If it becomes a problem, will Intel RMA it? Can they afford to RMA every 13th and 14th gen high-end system they've made and sold so far? "Intel are rich." No ~ Intel were rich seven or eight years ago. Intel today are smaller than Nvidia, and AMD are gaining ground on them fast. Intel are smaller than Apple. Apple and AMD are slightly vulnerable to side-channel exploits; Intel are vulnerable to Spectre and Meltdown and about 200 others that have been found. Their whole speculative out-of-order execution system is wide open. But they fixed that! No, they're fixing it, and that fixing is likely what's behind the crashes and degradation of their 13th and 14th gen chips. Intel are in a hole, and they're frantically digging themselves deeper. If AMD tell me the 9000 series has an average IPC improvement of 16% ~ I believe them. If Intel tell me they're purple and beautiful and nobody listens to benchmarks anyway, and what matters is real-world bullshit ~ I'd take that on board as well. It used to be that you could depend on Intel and AMD were a bit flaky. That's not the way things are today.
Wattage of 5090: "raising it by a ton"? 50W is *NOT* a ton, lol. In fact, that's the same TDP increase as from the 3090 to the 4090. Further, given this is a power supply company supplying these numbers in the context of choosing a power supply, it's EXTREMELY likely that either Nvidia didn't raise the TDP at all and Seasonic is just suggesting 100W of overhead to handle transient power spikes (which is CERTAINLY a thing that happens; even on the 3090 and 4090 you can see current spikes that push the card over its rated 450W for very short periods), OR Seasonic doesn't know the rated TDPs of these new cards and is simply making an educated guess with a bit of overhead (as you would want to do for such a calculator).
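That headroom theory is easy to illustrate. Here is a minimal sketch of how a PSU calculator might pad a GPU's TDP; the 25% spike factor, the 80% sustained-load target, and the 100 W for the rest of the system are all illustrative assumptions, not Seasonic's actual method:

```python
# Rough PSU sizing: pad the GPU TDP for transient spikes, add the CPU and
# the rest of the system, then keep sustained draw at ~80% of the rating.
def recommended_psu_watts(gpu_tdp, cpu_tdp, other=100,
                          spike_factor=1.25, max_sustained_load=0.8):
    peak = gpu_tdp * spike_factor + cpu_tdp + other
    return peak / max_sustained_load

# A 450 W GPU with a 150 W CPU lands around a 1000 W recommendation.
print(round(recommended_psu_watts(450, 150)))  # 1016
```

With padding like this, a calculator can quote a "GPU wattage" well above the card's actual TDP without the vendor ever having changed anything.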
Blurs out the names and signature block of an internal AMD email, but leaves the timestamp, which would take anyone in their IT department 15 seconds to use to find and identify the outgoing mail.
@@AssassinIsAfk I see what you mean, but it's rumored to have 12 GB from some sources. Also, the 60-class cards are usually for 1080p, so we don't need an extra amount of VRAM.
Misleading title (again). Recommending a product you KNOW is failing upwards of 50% of the time (13900K/14900K), which you've mentioned on your channel multiple times is failing en masse, is just tone deaf and says you don't care about your viewers making good PC hardware decisions.
You can prevent the CPUs from degrading by making BIOS changes. How much it affects the chips' performance, I don't know for sure, but if I were to believe people, it would be under 5%.
Wonder if Intel has fixed the crashing bugs in 13th and 14th gen. Also, 500W just for a graphics card is ridiculous. You'd need a 1200+W power supply for a decent PC. All for a few extra FPS.
Stopped watching at Core i9-13900K... I mean, after everything that is going on with Raptor Lake CPUs, and especially the 13900K and 14900K, imagine having the audacity to recommend that e-waste.
Nvidia: We have a new 3/4nm process and can finally lower power consumption. Also Nvidia: Nah, we'll pump power consumption even higher. Let's melt those 16-pins completely.
The 4090 connector at 450 watts has melting issues; how many more issues will consumers encounter with the 500-watt 5090? No one should be recommending any Intel 13th or 14th gen processors because of the issues.
So long as there is a massive price gap, that's fine. But there wasn't for the 4090 vs 4080. I'm honestly done with Nvidia. I've said it before and I'll say it again: I don't think I'll ever buy another Nvidia product.
Keep in mind that my story on AMD's X870 boards is officially from AMD. It came from my rep with them. I'm not even sure if I needed to blur out his name or anything, but I went ahead just in case so he wouldn't get a bunch of spam. But yeah. That's official word from AMD.
The new power cable on Nvidia didn't fix anything. If you check repair centers, they are swamped with burned connectors to this day. It's just a bad move. Basic electrical principles: pushing high current through small wires is a recipe for disaster, any time.
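To put rough numbers on those "basic electrical principles": the 12VHPWR / 12V-2x6 connector delivers all of its power over six small 12 V pins, commonly rated at 9.5 A each. A minimal sketch (pin count and rating are the commonly cited spec values; the margin math is mine):

```python
# Per-pin current on a 12VHPWR / 12V-2x6 connector at various card wattages,
# assuming 6 current-carrying 12 V pins rated at 9.5 A each.
VOLTAGE_V = 12.0
CURRENT_PINS = 6
PIN_RATING_A = 9.5

def amps_per_pin(watts):
    return watts / VOLTAGE_V / CURRENT_PINS

for watts in (450, 500, 600):
    amps = amps_per_pin(watts)
    print(f"{watts} W -> {amps:.2f} A/pin ({amps / PIN_RATING_A:.0%} of pin rating)")
```

At 500 W that is about 6.9 A per pin, roughly 73% of the rating, which is why the safety margin gets called thin: one poorly seated pin forces the remaining pins even closer to (or past) their limit.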
I have heard that AMD is claiming behind the scenes that they are going to release the X870 boards at the end of September. Do you know if there's any truth to this? I honestly don't know why they didn't release them at the same time. I want to build a 9900X system and have to wait for the X870E to be released.
Amazon Prime Days are a straight-up scam: Amazon raises prices weeks or days prior and then drops them back to the normal baseline.
That's why you watch the prices throughout the year as new tech releases so you can find a good deal when it hits!
My Fidelio X2HR was a good deal tho; they don't make them anymore, and they're goated as a gaming headset with a V-MODA microphone. Tip for all competitive players.
I was just telling someone this a couple hours ago
Can you give an example of the price?
I've noticed this for years now, as I'm someone who keeps up with prices often, and I use an extension like Honey to see price changes over a period of time. Yes, on a lot of items they seem to artificially inflate the list price to make it look like you're getting a way better deal than you really are. Some items are actually a decent deal, but many others are about the same price, or maybe slightly less than they were before the sale.
4090: "I almost burned down this kid's house... heheheehe" 5090: "Hold my beer"
Wish it could do SLI; 4K gaming without frame gen would become reality.
That 1x 16pin power connector is sweating already... 500w...
That is true!
@@iikatinggangsengii2471 3060ti can already run 4k fine, you're just running extremely unoptimized settings.
Maybe 24pin?
4090s are still melting connectors. Bumping the wattage by another 50 is just going to make the problem worse.
Just gotta plug it in properly, aka shove that shit in there and hope it doesn’t fry itself.
@@pearce_nz With the number of 4090s melting connectors, I'm not going to buy the company narrative that it's just customers not knowing how to plug in a connector.
@@KurNorock I mean, that's pretty much the only and main narrative on all the help forums. I do admit a fix for it can't be all that hard, and I would be surprised if Nvidia doesn't address the issue on the 5090.
There is a reason I don't like GPUs that need more than 300 watts.
@@pearce_nz They probably have a game plan this time, but I'm glad I never paid for a used card only to be concerned about, or possibly actually experiencing, such a cable meltdown on top of paying so much for a single component. Even if there weren't problems, unless I was using such a card as part of my work, that's a crazy price to pay just to play video games. One may as well pay extreme prices for TP, which is actually more useful for most people. :D
Recommending the 13900K CPU is almost criminal given the issues
How?
Which issues?
@@itayonplay352 Sadly they get really hot and blue screen after a year of use or so. Intel has to do a recall on the CPUs. I'm using a 14900K, so I might be in that basket of people who'll need to get my CPU replaced.
@@hudatcha4077 Most motherboard manufacturers already launched a BIOS that fixed all those issues 3 months ago.
@@hudatcha4077 It's so funny; my friend used to make fun of my budget CPU. Well, now I'm glad I have it 😅😂
500W with the same 12VHPWR connector????
@@theMedicatedCitizen Translation: Cheap greedy bastards.🤣
No. It's the improved 12V-2x6 cable.
@@arenzricodexd4409 Oh I completely forgot about the new version
@@genejones7902 While I'm glad it hasn't happened to you, that doesn't discount that plenty of others have experienced it, to the point it's been covered by most major outlets.
@@genejones7902 That's like saying "I don't have any problems with AMD drivers, so they don't exist."
I would NOT call the 13900K a value. The 13900K and 14900K have been having serious issues. We may be talking about a small percentage of chips and not much to worry about, OR it MAY be a very widespread issue; the severity of the situation is way up in the air at the moment.
From my understanding after watching some related videos, it's a widespread issue and one that will likely affect many more down the road. I'm very glad I got a good deal on a 12700K right when I was about to click buy on a 13700K or 14700K. Still, it's bad enough that there is no upgrade path without problems past the first gen on this architecture.
Likely to affect 100% of 13900K/14900Ks eventually, in major ways, as reported by server owners. Gamers might only crash in a game a couple of times a week to begin with, but it gets worse, and adjusting power usage doesn't seem to help either.
@Joegengstah I have no proof that is the case, but my gut says you nailed it. This processor situation reminds me of the Xbox 360 red ring of death. By all accounts, it looked like a very small number of consoles were failing. Then the numbers just kept growing. Things got bad enough that Microsoft actually extended warranty coverage for the issue by a couple of years. Sure, it took a LOT of arm twisting, but they did. At this point it is pretty much accepted that it is a matter of when, NOT if, the original fat consoles fail. Microsoft never released numbers, but they are clearly up there.
@@brianwalker7771 Intel should address it, but they are being tight-lipped for some reason, which is costing a lot of goodwill. It's very obvious how widespread this is.
@Joegengstah Most likely the PR AND legal teams said SHUT UP until we say otherwise! I agree they should make some statement on the situation; it just seems logical to get ahead of it. However, corporate logic is often anything but logical to those of us outside the corporate world.
The RTX 5090 will melt the 16-pin connector very easily. It won't fix the problem of the tiny pins in the connector and the small contact area where the pins clamp.
Nvidia : GPUs :: Intel : CPUs
All they know how to do is throw more watts at the problem.
Nah. NVIDIA GPUs are far more efficient than AMD's.
Difference here is efficiency plus total power.
AMD needs to start trying
The 4060's TDP is 115 watts; the RX 7600's is 190. Make of that what you want.
@@ehenningsen AMD was ahead with RX 6000; they just didn't improve power efficiency much while Nvidia did. Next gen will most likely be the other way around again.
Also, AMD's entire Radeon team is smaller than the driver development team at Nvidia, so it's impressive AMD can compete at all.
@@bal7ha2ar Which is fine. I hope to see it. I am a tech enthusiast and pro-consumer at all levels.
My point was to refute claims that AMD is the prima-bona-fide efficiency king of GPUs.
@@bal7ha2ar Ampere was behind in efficiency because Nvidia was using Samsung 8nm. But if AMD and Nvidia are using the same node, Nvidia usually comes out on top in power efficiency.
The next thing in the market are liquid cooled power connectors for nvidia’s rtx
Man, this reminds me of the times when Intel fanboys were making fun of AMD, but now they have to spend half their salary paying for their Intel hardware in energy costs.
If it works then
@@W-meme Apparently 13th and 14th gen eventually don't, so...
Someone needs to remind Nvidia and Intel that the standard US wall outlet can only supply 1500W total.
Didn't know about this.
1800 actually, on the average 15-amp breaker. New houses are putting in 20-amp breakers, allowing 2400 watts per circuit, but yeah, most houses are only 1800 watts.
Why, do you think they're not aware?
The first thing I did when I bought my house last year was upgrade from 15 amps to 30... Who cares about poor people?
@@sethobrien8523 It's actually 1440 W continuous on 15 amps, but you'd have to pull over 1800 W without the breaker tripping for there to be a fire risk.
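The numbers in this sub-thread all come from the same two formulas: watts = volts × amps, and the NEC's 80% rule for continuous loads. A quick sketch using the standard US 120 V outlet:

```python
# US branch-circuit limits: peak watts from the breaker rating,
# and the NEC 80% cap for continuous loads (like a gaming PC under load).
VOLTS = 120

def circuit_limits(breaker_amps):
    peak_w = VOLTS * breaker_amps
    continuous_w = peak_w * 0.8
    return peak_w, continuous_w

for amps in (15, 20):
    peak, continuous = circuit_limits(amps)
    print(f"{amps} A breaker: {peak} W peak, {continuous:.0f} W continuous")
```

That is where all three figures in the thread come from: 1800 W peak and 1440 W continuous on a 15 A circuit, 2400 W peak on a 20 A one.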
Bro that wattage looks like my power bill
Maybe don't promote Intel 13th & 14th Gen right now, given the quickly growing instability reports.
Agree, but not every 13th and 14th gen CPU is Raptor Lake.
So for people, I'll post which ones are Alder Lake, as they are not affected, i.e. they're a safe buy:
13600 (non-K only)
13500/T
13400/F/T (C0 revision only / B0 revision is Raptor Lake)
13100/F/T
14500/T
14400/F/T
14100/F/T
300/T
@@DukeofSerbia Alder Lake actually has been reported to be affected…
OMG, I need the same wattage for my power supply as a DeLorean flux capacitor... ROFL
So Nvidia is getting more performance by upping the TDP...
If you look deep at the 7900X3D, it's quite impressive that it's not too far behind despite having both 6 cores with 3D V-Cache and 6 without. I would expect a lot worse, actually, but unfortunately it's expensive, and that's its main problem.
It sucks that you're paying a lot of money for 6 cores, because it cuts itself in half. Either go 7800X3D or 7950X.
@@ZackSNetwork Yes, that's what I said, but aside from that it's a great CPU, and it's impressive it's not far behind the 7800X3D. Luckily it's not the worst-priced of the X3D chips; the 7950X3D is, despite having the power to handle gaming very similarly to the 7800X3D (while sometimes dropping behind it in performance) and still providing good enough productivity, even though it's slower than the 7950X, which may suck for some. Overall that CPU could also be very good, but pricing is a major problem for it as well.
I'm predicting that the 800 series boards will release alongside the X3D chips.
This card going to cost a kidney
And a lung. 😂
Save a kidney, buy AMD
Or sell 2 kidneys and get a 5090.
@@mar504 Nah blud, gotta get that PiSS upscaler cuz FSR looks 0.3% worse if you use a microscope to analyse every pixel of your screen.
And a right ball
The 7800X3D is actually a bargain at Micro Center. You can get a good mobo and 32GB of 6000MT/s RAM included for the price of the CPU.
Micro Center is the best! 👍 Also, for 10 dollars more you can upgrade the bundle to 48GB of RAM.
The 5090 will need its own PSU wtf
If the RTX 5060 at least performs like an RTX 4070 Super / RTX 4070 Ti with that TGP increase, it will be decent. But if it comes with just a 5-15% increase, which is probably what will happen, then it will suck just like the RTX 4060.
Hard to consider spending so much money on high-end GFX cards until they get their quality under control.
Personally, I think it was stupid to allow them to raise the limit for GPUs to over 300W. People are saving electricity with LED bulbs and looking for efficient electronics, but when it comes to GPUs no one seems to care. I'm a bit afraid that games will take GPU performance over 300W as a requirement in the future. Good thing I don't live in a warmer climate, but I really think a 500W GPU + 170W CPU for entertainment purposes is far too much, especially when the industry wants to push required performance more and more every year.
It's gaming-only wattage, and it's considered low for high-end gaming; we used to use 1200W-ish PSUs for high-end SLI/CrossFire.
Video watching would be like 50W or something, and really, if you can afford such a PC, you'll most likely have no problem with a $20 increase in your monthly electricity bill.
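The "$20" figure depends entirely on hours played and local rates. A minimal sketch, where the 4 hours/day and $0.15/kWh are assumptions to plug your own numbers into:

```python
# Monthly electricity cost of a given sustained draw.
def monthly_cost_usd(watts, hours_per_day, usd_per_kwh=0.15, days=30):
    kwh = watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

# 500 W of GPU draw, 4 hours of gaming a day:
print(f"${monthly_cost_usd(500, 4):.2f}")  # $9.00
```

So at typical US rates and casual hours, even a 500 W card adds single-digit dollars a month; long daily sessions or expensive European rates are what push it toward $20 and beyond.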
There's 5 different models, just buy a cheaper card.
@@guyza123 for real
The 5050's gonna use a 16-pin connector anyway, even though the card is supposed to be 75 watts over PCIe! WHY!!! Lower the TDP to PCIe levels and skip a step, IT'S NOT HARD.
The gap is obviously for 5080 Ti, 5080 Super, 5080 Ti Super, 5080 Ti Super Ultra, 5080 Ti Super Ultra Pro, 5080 Ti Super Ultra Pro X, and of course 5080 Ti Super Ultra Pro X Ultimate. All the while still being nowhere near 5090.
The 5090 with one 16 pin is really sketchy tbh
Why on earth would you buy an Intel CPU that is going to fail without a shadow of a doubt? DO NOT BUY 13th or 14th gen Intel Core CPUs.
Fail without a doubt? All you have to do is change some BIOS settings; that will limit the performance by less than 5%, and the CPUs will never degrade. Of course, the average idiot like yourself wouldn't think of doing this.
These graphics card prices are rigoddamndiculous and a huge risk to every buyer.
I am just waiting for the 5050 card for the name lmao
lol
I'm waiting for the 6090......
There's a 5050 chance that you get it
Still, nothing here says whether the RTX 5080 will be a seriously cut-down GB202 or the GB203. I think Nvidia can deliver near-RTX 4090 performance on a heavily cut-down GB202 with only 350W. I don't see any way for the GB203 to get close to RTX 4090D performance.
Also, I would love to find PC hardware deals on Amazon, but Amazon's browsing ability, especially for PC hardware, is trash.
Nvidia has to try to make sense of the price disparity between the 5080 and 5090, so they make the lower-tier card perform massively worse than the flagship. I wonder if there is a difference in performance between the 7800X3D and 7900X3D if you park the extra non-3D-V-Cache cores on the 7900X3D, or if the 7800X3D still performs better in games.
The 7800X3D performs better in games, confirmed.
No. At some point reasonable people should ask themselves if it's worth that much money and hassle just to play video games, assuming they're not using the same gear as part of their jobs.
Yup, when you sit down and think about it, its mental to be spending that much just for a gaming card. Cool if you are going to use it for rendering or ai, but games? no thanks.
For gamers the 70 class is the go-to option, and the 80 class is if you want a little more performance for more money; the 90 class is if you don't want to buy another GPU for 6-8 years or you're insanely rich.
any hobby can get unreasonably expensive really fast
Stay at 1440 and use a 3080, 4070 Super or 6900xt or even a 7900GRE. Save yourself thousands of dollars.
People who want the absolute best looking picture and biggest tv they can get spend more money on the tv than the pc to play games on
A 50-watt increase is not massive. If it was 100 watts+ then yeah. However, undervolting is a thing regardless.
Exciting performance, it seems; who knows, maybe even 2x the 4090 with something like a new DLSS.
Huge yay for 4K players.
The high ass tax of my country mostly takes out the savings from the Amazon deal
RIP that 16pin power connector that has to run 500W to the 5090...
That is because the cost to manufacture for Nvidia is minimal, but the profit of you buying the XX90 series is HUGE compared to the lower cards. What do you think they want you to do?
and then the 5090 stability/deterioration over time complaints come in ....
Yes you can push chips really FAR, always at a cost ...see Intel´s I9 and I7 13 &14´s series stability issues
Stop lying, that's not been a thing from Nvidia. You can fix the Intel situation in the BIOS by stopping the insane 1- and 2-core clock speed boosts.
@@ZackSNetwork Not talking about Nvidia here, but it's worth watching Gamers Nexus and Level1Techs for related info on the CPUs. Even server CPUs run at typically conservative values have ended up having major problems, and I've read numerous comments on related videos from people saying they made those settings changes on their 13th/14th gen i7s/i9s and the problems still worsened over time.
@@ZackSNetwork I didn't cancel my 4080 Super/14900KF prebuilt system order, which was $600 cheaper than building it myself, because I read about the BIOS fix. Maybe people should try to gain some knowledge in their life instead of being total morons.
I can not wait to get my 5090 and it MELTS THE CONNECTOR...
It will melt the entire pc 😭
@@RajwarsonuL
*If Intel manages to SOMEHOW do well in real-world benchmarks with the Ultra 200 series (vs the 14th gen), then Ultra 300(?) should finally be a good leap forward. Maybe 6.0+ GHz p-core frequency.*
Im looking forward to buying the 5090
Same
Hope your home is insured 😂🎉
@@Oliver-sn4be There are no other legit GPU manufacturers, so I'll take my chances
The SN850X is a beast!
5090 plus 11% TDP
"TDP raised a TON." 😂
The Intel 13900K is on such a deal because nobody wants an unstable chip that explodes after a year.
$3000-$5000 msrp maybe?
The more you buy…
No
Dumb comment
The more you save
paid 280 for the 7900x3d 2-3 weeks ago sold by amazon
I knew Nvidia would double down on those cables. But to increase the wattage without even making sure those cables have been proven to work, or lessening the failure rate... I'd imagine at least a 10-20% failure rate; up to 2 in every 10 5090s might burn this time. I normally skip one or two gens before upgrading, and it seems I'll be skipping two this time, as it looks like AMD won't be releasing anything compelling on their end.
One would hope, given the failure of the connector used on the 4090 at 450 watts (it simply doesn't have enough safety margin), they would either use a new connector or two connectors on a 500-watt card. The 40-series connector really isn't fit for over 300 watts IMHO.
How can you recommend the 13900 to people when it's been proven to have a degradation problem? You should be protecting new buyers by informing them, not promoting crap to them.
Because these guys would rather earn a quick buck than help their viewers. Clickbait titles and crap promotions.
I just bought this 13900 even though I heard it has problems. I don't know much about it and it hasn't arrived yet, BUT I can return it no problem. Can you explain in detail why it is a bad deal?
Would appreciate it.
@@sherdox6672 The newest data shows rapid degradation on the 13900 and 14900, regardless of power settings used etc., at an alarming percentage. Even lower tiers of CPU from those generations are affected. See Wendell's video on it and Gamers Nexus's. If you like to gamble, keep it.
@@sherdox6672
It's not a bad deal price-wise, but the 13900/13700 and 14900/14700 are literally defective... just look it up.
I hate fan boy behavior but there's no way I'd buy an Intel platform right now.
You can change some BIOS settings that will prevent the chips from degrading, and you will lose less than 5% total performance. Yeah, this shouldn't be a thing and Intel effed up majorly, but people who say there is 0 you can do are also spreading lies.
Sad thing is, it's going to sell faster than hot cakes.
I stopped buying higher-tier CPUs and GPUs because it's just become stupid, with 500W GPUs and 300W CPUs (if you get Intel, at least). I don't care for sitting next to a space heater.
People won't be happy until their house just explodes when the 6090 is introduced. I'm not signing up for the "still a homeowner" beta test!?!?!?😂😂
Why would you be selling advertising for the Intel 13900K, with all the problems in the 13th and 14th series?
Man, I know there are issues to address with the 13th and 14th gen CPUs, like how they get really hot so you need at minimum an AIO liquid cooler, but what else is worrying about these CPUs?
Later this year, SIGH! How can you even launch a CPU without the new chipset?!
How many watts does a single 16-pin power rail typically deliver?
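To the question above: a rough answer can be sketched from pin ratings. This is a back-of-the-envelope sketch using commonly cited figures (6 live 12 V pins, ~9.2 A per pin terminal rating), not official datasheet values:

```python
# Back-of-the-envelope check on 12VHPWR (16-pin) capacity.
# Assumed figures (commonly cited, not official datasheet values):
# 6 live 12 V pins, ~9.2 A terminal rating per pin.
PINS = 6
AMPS_PER_PIN = 9.2
VOLTS = 12.0

theoretical_w = PINS * AMPS_PER_PIN * VOLTS
print(f"theoretical ceiling: ~{theoretical_w:.0f} W")  # ~662 W
# The PCIe 5.0 / ATX 3.0 spec rates the connector at 600 W sustained,
# leaving only a slim margin when a card pulls 500-600 W.
```

That slim margin between the sustained rating and a 500 W card is exactly why commenters here are nervous about a single connector.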
Go figure, the 990 Pro is still $30 more than it was last holiday/January. Crazy, it used to be $120 or less for 2TB.
I'd speculate those TDP increases are more so for maximum power draw, not typical averages.
Plus, it makes sense since the memory has upgraded from GDDR6X to GDDR7.
It appears Nvidia is going through an alternating cycle of performance and then efficiency.
Ie, 10 series, performance
20 series, efficiency
30 series, performance
40 series, efficiency
50 series, performance 🤷♂️
Finally someone gets it, unlike these idiots in the other comments.
@@YoureBreathtaking how is the 4000 series "efficiency" 😂
Increase in TDP is worrisome as it might be the same thing as Intel 14th gen
Oh well, AMD coming in hot for me. No way I'm giving these bandits $2k+ for a GPU, nope, not gonna do it, and it's messed up that they gimp the 5080.
This next generation of GPUs has no high end other than the 5090...
Rumor is a cheap RX 7950 XTX etc., plus a cheap next-generation 8800 XT that goes up against and kills anything in that price zone. Sounds good for gamers.
I can believe the product but doubt it'll be very cheap.
@@Rushtallica It had best be 16GB of GDDR6 on a 256-bit bus. That is poor and needs to be £400-ish.
Nvidia and Intel: we are going to throw more watts at the problem.
AMD: throws more vram at the problem and leaves the high end gpu market and focuses on more efficiency for their cpus
I picked up that 4TB sn850x earlier, been waiting for a price like that
I used to always get the latest and greatest GPU despite the power draw... Not anymore! I don't mind a little performance downgrade if NVIDIA can target a TDP of at least 200-350w for their MID to TOP GPU. 400-500w+ is just crazy... I know 'coz my 3rd party RTX 3090 can draw up to 430w+ running stock just the GPU alone. Total draw from the wall is about 900w+ 'coz of all the other stuff I have running. I hope they focus more on power efficiency now on both the CPU & GPU... in fact, on all PC components for that matter. 😩
Really looking forward to building a new machine, been without a desktop for years now )
What I heard for the 800 series boards was a September release.
Well yeah Ofc there is going to be a massive gap. If consumers are willing to spend $2000.00 on a 90 card because it performs better than the 80 class by a large margin that’s more money for NVIDIA. No competition = higher price ceiling
500w on a single 16-pin connector is risky af. They should've gone with 2 connectors. Maybe they will..
I'd be much more impressed with slightly faster cards slightly more ram and less power consumption
People are willing to give Nvidia $2000 for a 4090. Until that changes this environment we’re in will only continue and likely worsen for the consumer.
1x16 pin for 170w? 100w?
Anyway, I don't understand, but I'm now excited and worried at the same time.
The 5060's TDP is 55W higher than the 4060's, which should mean at least a 15% performance boost over the 4060 Ti... right?
The 5050 is basically the 4060 power-draw-wise, but the 4060 had basically zero generational uplift over the 3060, so I think the 5060 won't have the same performance as the 4060.
So you say it's a performance gap, then proceed to show us power increases? That is a misleading title.
Just because the gap between the 4090 and 4080 is bigger doesn't mean the performance difference is much bigger. The RTX 4080 Super has 71.1% of the 4090's TDP; on the other hand, the 5080 has 70% of the 5090's TDP. They basically have the same gap.
Also, if the 5090 is 500W TDP, how are they going to pack it under a 2-slot cooling solution? I call BS unless NV is going with a watercooling AIO this time.
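The ratio claim above checks out arithmetically. A quick sketch (the 40-series numbers are official board power; the 50-series numbers are the rumoured figures discussed here, not confirmed specs):

```python
# 80-class TDP as a fraction of 90-class TDP, per generation.
tdp_w = {
    "RTX 4080 Super": 320, "RTX 4090": 450,  # official board power
    "RTX 5080": 350, "RTX 5090": 500,        # rumoured, unconfirmed
}
ratio_40 = tdp_w["RTX 4080 Super"] / tdp_w["RTX 4090"]
ratio_50 = tdp_w["RTX 5080"] / tdp_w["RTX 5090"]
print(f"40-series: {ratio_40:.1%}")  # 71.1%
print(f"50-series: {ratio_50:.1%}")  # 70.0%
```

So by TDP ratio alone, the 80-to-90 gap is essentially unchanged between generations.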
I'm looking forward to the 5090 as the 4090 is starting to show its age. Won't be long till the ole 4090 be collecting dust on the shelf with the 3090. No melted connectors either as it's so simple to plug them in correctly.
Here's hoping AMD can pull it off: a TDP around 350 watts for the top tier, yet a much cheaper price and equal in-game performance.
True, generally AMD products are greener than Nvidia's.
@5:18 stealth Steve from Gamers Nexus cameo! ;-)
Bro, if you plan to protect your sources, next time blur the time and date of the mail as well lol.
Unless it was a mass mailing, I doubt they sent many emails at precisely 9:47 AM.
Microcenter bundle.
7800x3d.
Msi x670 open box.
32 gigs of ram Cl32 6000.
$520.
Amazon is a rip off
3:10 ~ Next gen Intel has leaked it's name. Super-cali-fragile-istic-expo-ali-docius
(If you say it loud enough, it does sound quite atrocious.)
The major question, does this one also degrade over time and crash?
The performance is great ~ sure. Will it still be great in a year?
If it becomes a problem, will Intel RMA it? Can they afford to RMA every 13th and 14th gen high end system they've made and sold so far?
"Intel are rich."
No ~ Intel were rich seven or eight years ago. Intel today are smaller than Nvidia, and AMD are making up ground on them fast. Intel are smaller than Apple.
Apple and AMD are vulnerable to side channel exploits ~ slightly.
Intel are vulnerable to Spectre and Meltdown and about 200 others that have been found. Their whole speculative out of order execution system is wide open.
But they fixed that!
No, they're fixing it, and their fixes are likely what's behind the crashes and degradation of their 13th and 14th gen chips.
Intel are in a hole, and they're frantically digging themselves deeper.
If AMD tell me the 9000 series has an avg IPC improvement of 16% ~ I believe them.
If Intel tell me they're purple and beautiful and nobody listens to benchmarks anyway, what matters is real-world bullshit ~ I'd take that on board as well.
It used to be you could depend on Intel ~ AMD were a bit flaky. That's not the way things are today.
500 watts for gpu alone 😮
Why are we going backward in technology
Wattage of 5090: "raising it by a ton"? 50W is *NOT* a ton, lol. In fact, that's the same TDP increase as from the 3090 to the 4090. Further, given this is a power supply company supplying these numbers in the context of choosing a power supply, it's EXTREMELY likely that either Nvidia didn't raise the TDP at all and Seasonic is just suggesting 100W of overhead to handle transient power spikes (which is CERTAINLY a thing that happens; even on the 3090 and 4090 you can see current spikes that push the card over its rated 450W for very short periods), OR Seasonic doesn't know the rated TDPs of these new cards and is simply making an educated guess with a bit of overhead (as you would want to do for such a calculator).
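The transient-headroom point can be put in numbers. This is a sketch only; the spike factor below is an illustrative assumption, not a measured figure:

```python
# Why a PSU calculator may quote more than a card's rated board power:
# short transient spikes can exceed TDP, so vendors pad the number.
rated_tdp_w = 450     # e.g. a 4090-class card's rated board power
spike_factor = 1.5    # assumed transient multiplier (illustrative only)
transient_w = rated_tdp_w * spike_factor
print(f"budget for transients: ~{transient_w:.0f} W")  # ~675 W
```

Under that assumption, quoting a card 50-100W above its rated TDP is conservative sizing rather than evidence of a TDP hike.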
i just want the 5050 to be a good deal, there are no good deals for new GPUs in the indian market rn
Blurs out the names and signature block of an internal AMD email, but leaves the time stamp, which would take anyone in their IT department 15 seconds to find and identify the outgoing mail.
That 5060 will be a beast.
Not if it has 8GB of VRAM, which is what it's rumoured to have.
@@AssassinIsAfk Yeah, then it would suck; since the 3060 had 12GB, I could see it having that at least as an option.
@@AssassinIsAfk I see what you mean, but it's rumored by some sources to have 12GB, and also the 60-class cards are usually for 1080p, so we don't need an extra amount of VRAM.
Misleading title (again), as you say in it, and recommending a product you KNOW is failing upwards of 50% (13900K/14900K), which you've mentioned on your channel multiple times is failing en masse, is just tone deaf and says you don't care about your viewers making good PC hardware decisions.
You can prevent the CPUs from degrading by making BIOS changes. How much it affects the chips' performance I don't know for sure, but if I were to believe people, it would be under 5%.
The 5060 might actually be a good card unlike the last gen.
Wonder if Intel have fixed the crashing bugs in 13th and 14th gen.
Also, 500W just for a graphics card is ridiculous. You'd need a 1200+ W power supply for a decent PC. All for a few extra FPS.
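A rough sketch of where a "1200+ W" figure could come from; every number here is an illustrative assumption, not a measured spec:

```python
# Hypothetical system draw built around a rumoured 500 W flagship GPU.
draw_w = {
    "GPU (rumoured 500 W flagship)": 500,
    "CPU (high-end, under boost)": 250,
    "board, RAM, SSDs, fans, USB": 100,
}
total_w = sum(draw_w.values())
# Common sizing rule of thumb: keep sustained load near ~70% of PSU rating.
psu_w = total_w / 0.7
print(f"estimated load: {total_w} W, suggested PSU: ~{psu_w:.0f} W")
```

With those assumed draws and the usual ~70% loading rule of thumb, a 1200W-class PSU is exactly what falls out.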
The softest version of "yet again" ever 😊😊
R.I.P kidney 😢
Will someone finally sue Nvidia for this power connector? It really seems to be a timed-failure solution (+ fire hazard) more than anything else.
Stopped watching at Core i9-13900K...
I mean, after everything that is going on with Raptor Lake CPUs, especially the 13900K and 14900K, imagine having the audacity to recommend that e-waste.
Anyone buying a 13th or 14th gen right now is crazy.
Nvidia: We have a new 3/4nm process and can finally lower power consumption. Also Nvidia: Nah, we'll pump power consumption even higher. Let's melt those 16-pins completely.
Good thing I just got 1600w psu 💀
Yeah, that should handle an RTX 6090 at 600 watts as well.
The 4090 connector at 450 watts has melting issues; how many more issues will consumers encounter at the 5090's 500 watts? No one should be recommending any Intel 13th and 14th gen processors because of the issues.
If only they would reactivate SLI. I would buy two 5090s.
That would be the dream, 3-4-way 7900 XTX too.
2x 5090 would surely play anything at 4K 144 without frame gen smoothly, something 4K 144 LED owners have been wishing for.
I predict the rtx 5050 will be a decent card, I have a gut feeling
Massive power draw too
For the price difference, isn't this obvious?
So long as there is a massive price gap that's fine.
But there wasn't for 4090 vs 4080.
I'm honestly done with Nvidia. i've said it before and I'll say it again. I don't think I'll ever buy another Nvidia product.