Click this link sponsr.is/bootdev_GamerMeld and use my code GAMERMELD to get 25% off your first payment for boot.dev. That’s 25% off your first month or your first year, depending on the subscription you choose.
Thanks for your love of AMD. Show people AMD is the No. 1 value CPU and GPU, at a lower price.
Calm down @GamerMeld. The reason for two 12V connectors is to mitigate the power-connector burning and melting issues: they know the 12V connectors can only sustain 300 to 400 watts at best, and they just can't admit they were wrong.
So why not go back to the good old reliable PCIe 8-pin connector if they're going to split the load anyway? They're inventing a less reliable solution when a proven one has existed for decades...
Then why did MSI design each of the 12V power connectors to deliver 600W? Would MSI just leave money on the table and give gamers a freebie? I don't think so.
@@tringuyen7519 I would think that this PSU is not aimed at gamers but at workstations that use multiple GPUs. Meanwhile, Seasonic announced a new PSU that could power four RTX 5090s.
You're so wrong; my 4090 has regularly done 550 watts for the past year and a half. Let's not lie and spread misinformation trying to act hard and pressure the YouTuber, lmao.
You're never supposed to run a connector at 100% of its power rating for a sustained duration. You should always design a system to use 80% of its rating, to allow for aging and oxidation of the connectors. So if it's a 600-watt connector, you should use two connectors for anything over 480 watts sustained.
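The 80% derating rule in the comment above can be sketched as a quick calculation. This is an illustrative helper under the comment's assumptions (a 600 W connector rating and a 0.8 derating factor), not an official connector spec:

```python
import math

def connectors_needed(sustained_watts: float,
                      rating_watts: float = 600.0,
                      derate: float = 0.8) -> int:
    """Connectors required to stay at or under `derate` of each one's rating."""
    usable_per_connector = rating_watts * derate  # 600 W * 0.8 = 480 W usable
    return math.ceil(sustained_watts / usable_per_connector)

# A 600 W-rated connector is "safe" up to 480 W sustained...
print(connectors_needed(450))  # 1
# ...so anything over 480 W sustained calls for a second connector.
print(connectors_needed(600))  # 2
```

By this reasoning, a card sustaining the full 600 W is exactly the case where a single 12V connector is out of spec and two are warranted.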
At this point, no one really cares about what NVIDIA is doing unless you're an enthusiast with tons of money, or it's something the market is looking forward to.
Nobody should care or buy, because they don't make them for consumers.
Also, on the high end, the power plugs were catching fire and the silicon was cracking.
The "mid-end" has crippled performance and is very short-lived when playing relatively new games maxed out.
The low end works decently, unless they lie and give a worse-performing card the same name, or gimp it with a narrow bus or less VRAM.
@@bionicseaserpent yep. It's a prosumer, enthusiast, and enterprise-only brand now.
For me, as an SFF PC enjoyer, Nvidia and undervolting go together like bread and butter, because Nvidia overvolts way too much.
Their power efficiency is insane, and they're the only viable higher-end option for an ITX case. Props to whoever engineered the Founders Edition cooler: not only is it smaller, quieter, and cooler, it's also an MSRP card, unlike the similar MSI Expert.
As for AMD, their cards are way harder to find, and RDNA 3 is known for being inefficient and scaling very badly.
You sound broke AF
Looks like I am skipping 7800x3d, I will wait for 9900x3d/9950x3d, my 5800X3D will be enough until then.
As a 7950X3D owner, I agree with this statement.
Same here. When the 7800x3d went up in price and Microcenter ended the bundle, I knew I would either cave for the 7600x3d or wait. Glad I waited.
Got a 5800X3D for like 120 bucks. I'll now wait for the 9900X3D/9950X3D as well.
Upgrading from a 5800X3D to a 7800X3D isn't even worth it, and tbh I'd guess the 9000-series X3Ds won't bring a worthy upgrade either; the 5800X3D is just too amazing a value.
@@lassebrustad Going from a 5600X to a 7700X brought me a nice bit of breathing room.
I will upgrade to the 9900X3D once it comes down in price. A few more threads should help a great deal.
The battery test comparison sucks. The Mac has the smallest screen at 13 inches, and as we should all know, screen size matters: the display is the main variable in power usage. They should all be tested with the same external monitor.
Copium when intel starts waking up 😂
I get what you mean, but ARM chips are not power hungry, and x86 chips have a bit of improving to do in that area.
I’m still using the 5900x and it’s still a very solid CPU.
Same, but on the 5600X. Definitely upgrading to AM5 when they have a good board/RAM/CPU combo deal.
@@Dr.RichardBanks same, AMD does what I need it to do and 9000 series looks interesting.
Newer CPUs don't make your old CPU obsolete. Remember, it's obsolete when it's cooked. The whole upgrade-path hype train is an absolute waste of money if we're making clever consumer decisions.
My husband still games on a 5900x and it works perfectly. I have the 7950x3d because I leapfrogged from an ancient Intel and I'm perfectly happy with mine as well.
I'm still on a 4790K. Upgraded from an A10-7850K, my board was limited to 2 RAM slots, so I took the 4790 when someone gifted it to me since RAM was the limiting factor and those DDR3 modules are so expensive, if available at all. My newest part is still the GTX 970, though. Still works for most games. Will upgrade soonish, though. Time for another 10 year investment.
Not touching the 90 series cards because of that stupid connector.
That's why there are 2, so the power can be split.
Good. More stock for the rest of us then.
The only reason I'm not touching the 90-series cards: that's double my rent.
@@OfficialDachia 4090 is 4x my rent if i think about it that way😂
The problem is money😂 be honest.
The battery usage benchmarks are bogus, or at least very misleading. The biggest power draw on light workloads is the display, and no specs were given. However, the size of the display is telling: the AMD chip, with the largest display (16" vs 13"/14"), has the worst battery life. Go figure.
On top of that, the Strix AMD laptop uses an OLED panel, which consumes more power than IPS.
Bro, your favorite word is "BUUUT!"😂
Everyone love a good but
Especially when copium is involved 😂
buuuette
To be honest, I would approve the new AMD CPU boxes if they were like the thumbnail of this video. That black and white is pretty slick.
I like how the sponsor aims to battle boredom, but programming was the absolute most boring and mind-numbing thing I've ever tried to learn in my life.
It's crazy that graphics cards are using so much more power, yet still their performance improvements aren't all that impressive.
future proof?
@@narachi- No, it is exactly the other way around. There is no more power headroom in the current architecture; they are simply raising voltages and making everything a bigger chip.
The real performance advance now lies in further improving the architecture's design, or moving to an even smaller process. Then you will see performance per watt take another little leap. Then total power gradually increases again and the cycle repeats.
This used not to be obvious because the best-performing GPUs in the mid-2000s only drew about 50W.
@@juliusfucik4011 that’s what I mean by future proof, kinda hard to design new architecture that will cost more power if you don’t already have the power.
They've hit the limits of the silicon. Die shrinks and pumping absurd voltage into them doesn't work any more. More power just means more waste heat these days.
@@patrickweaver1105 Quantum 7000-series GPUs.
I couldn't care less about overclocking the X3D chips; they'd better make those two dual-CCD X3D or I'm not interested. I'll take the clock hit if it means not having to rely on software to figure out whether I'm playing a game or not.
Yup, the 7950X3D is great only when the software uses the right cores, or when you set it up manually.
LOL 5090, draws more power than your washer and dryers.
Pulling that much power isn't impressive at all... Efficiency is the next true technological breakthrough we need...
AMD's 9800X3D, 9900X3D, and 9950X3D will kill the market if they show the same performance increase not just in games but also in productivity.
The two connectors are for a total of 600W (which is exactly what RTX 5090 will need) and not 1200W.
Doubt it. 5090 will dissipate more than 600W for 8K gaming. Nvidia fandom is stricken by FOMO.
@@tringuyen7519 good luck with 8k gaming loool ngreedia fanboys are hilarious.....
@@tringuyen7519 The RTX 5090 (or whatever it's going to be called) isn't going to hit that advertised 600W TDP, due to being CPU-limited in most games.
@@tringuyen7519 Do 8K gaming monitors even exist? What would be the point of running it over a 4K monitor? You won't be able to tell the difference, since the screen will be too small.
@@Fr00stee 8K monitors exist.
I have had many screens, 1/2/4/5K and different sizes.
4K at 28 inch is perfection at 100% scaling. Main game screen is 3440×1440 at 34 inches which is great.
Having a 5K ultra wide at 29-30 inch with 125% scaling would be the ultimate but for now they are not going over 60Hz.
I will not install anything with a 12VHPWR connector, much less multiple. The risk is too big regardless of the performance.
Wow a GPU that consumes triple the power of my entire system.
And 10x FPS of your system.
FORGET Ryzen 5000! Ryzen 7000 is coming on an entire new socket!
FORGET Ryzen 7000! Ryzen 9000 is going to have a huge architecture leap!
FORGET Ryzen 9000! Ryzen 9000X3D is going to have a completely new 3D-VCache system!
Prices are just nuts. I'm sticking with my 5800x3d and 6900 toxic until the rig falls apart!
This guy is a joke, reading the benchmarks without mentioning the power wattage.
The macbook will throttle during heavy load, making it more power efficient
GM, You are blocking the graph that you are trying to show on screen. It's very hard to follow along.
Let's not forget that the Ryzen AI Asus S16 has a 16" screen. That's got to drain more power.
I hope the Z2 extreme chip would be enough to convince Valve to make the Steam Deck 2.
You’re blocking the graphs my guy
Screen size makes a big difference in power draw. I would like to see some test with the exact same hardware.
Maybe they'll bring back SLI? Or ray tracing on a second card?
the 5090 will not be a consumer product for sure.
Constant name changes on videos, smh. At least this one isn't as clickbaity.
Oh, there's plenty of copium in this one. When it's not clickbait, it's copium.
The 9800X3D having no overclocking support is very disappointing if true. I hope it has more performance and 3D V-Cache than the 7800X3D, or else the 9950X, 9950X3D, and 9900X3D look like the better choices in 2025.
Very disappointing. It's common knowledge that the 6 core parts are inferior to the 8 core parts. If the 9900x3d can be overclocked on the vcache chip, then there is no reason that the 9800x3d could not overclock. No reason except profit.
I'm an enthusiast with loads of money; even my toilet paper is made of gently massaged, scented, perfectly fluffed and creased, freshly minted hundreds and fittys (misspelled on purpose). But jokes aside, I'm staying on the 4090, lol. It's powerful enough that I only run it at 2/3 of the power and it still puts out about 90% of the performance.
So I really don't care if the 5090 is better. It could be two, three, or four times better. It's not even the money; as an enthusiast who buys crazy GPUs like a 4090, I feel content.
In fact, I'm so content I bought a weaker 4080 laptop so I can kick back whenever I feel like it. Mind you, the 4080 laptop GPU is closer in performance to a 4070 desktop part, or a 3080 from the previous gen. It's enough for me... I still have my old 3080, actually.
Nobody "needs" a 5090. No, not even you. Yeah, you, the one using it for renders and work. You'll be fine with a 4090.
It's also simpler to run the 4090 cable- and PSU-wise; less complicated, IMO.
Now, some may disagree with me, and that's fine. I'm happy as a pig in sh**. I'm the guy who buys the new ARK Survival Ascended and then still plays the older ARK Survival Evolved because I want to, despite having the gear to run the newer one. Lol.
Stop the copium for a device that hasn't even been released.
Power requirements for GPUs are getting out of hand. They really need to focus on power efficiency; we're starting to run into the scenario where a typical North American 115V plug is not enough!
The 8 core chips are higher quality than the 6 core chips. If the 6 core chip with 3d cache on the 9900x3d is overclockable, but the 8 core 9800x3d is not overclockable, I would be very disappointed. Very scummy AMD.
Heavy load... the M3 scores 2,000 points and "wins". The others score 8,000 points and "lose"...
Well, I figured AMD would want to get the most popular X3D chip out there quickly, which would be the 8-core version. As for new features, honestly, my gut was and still is telling me it's likely to have some form of NPU/AI acceleration feature set that the regular 9000 series doesn't have (especially given the over-hype around the buzzword, *YAWN*, AI).
On the next Max Tech video: "AMD and Intel got a lot more points than Apple, but Apple uses 1/3 the power, so let's multiply Apple's results by 3 aaaand, wow!!! Apple has the fastest chip ever!!"
I've been talking about the 9800x3d paired with the 5090 all year
What if AMD lets you OC the non-X3D chiplet on a Ryzen 9900X3D/9950X3D? Could that be the new feature? or is that possible already?
I predict the 9800X3D is going to be very expensive and disappointing. Hopefully I'm wrong.
That’s a bit ridiculous…….no……..rigoddamndiculous!!!
I will buy the 5090 no matter what, and definitely not for gaming. If it launches with two 16-pin connectors it will be safer (550/2 = 275W per connector = safe).
The 9800X3D can't be overclocked.
After living with an i7-6900K that can't overclock beyond stock boost, I don't care much about OC anymore.
Current tech is already pushed to the limit by manufacturers anyway.
I don't need more than 16 threads, so the 9800X3D is still my next PC build.
Don't worry, 9800x3d can be undervolted😂
Battery life chart looks a bit off. It should be normalized, at least with battery capacity (e.g.: time per Wh). Ryzen has larger screen ... Apple is still the winner here ...
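Normalizing as the comment above suggests is a one-liner per laptop. The runtimes and battery capacities below are made-up placeholders for illustration, not the video's actual numbers:

```python
# Hypothetical (runtime_hours, battery_Wh) pairs -- illustrative only.
laptops = {
    "MacBook 13-inch": (18.0, 70),
    "Ryzen AI 16-inch": (14.0, 78),
    "Intel 14-inch": (15.0, 75),
}

# Normalize to minutes of runtime per watt-hour of battery capacity.
efficiency = {
    name: round(hours * 60 / wh, 1)
    for name, (hours, wh) in laptops.items()
}

# Ranking by min/Wh compares platform efficiency, not battery size.
for name, mins_per_wh in sorted(efficiency.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {mins_per_wh} min/Wh")
```

Even this normalization only corrects for battery capacity; screen size and panel type still skew the raw runtimes, as other comments in the thread point out.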
If the new X3D CPUs are only marginally better than the 7800X3D, AMD will need a serious kick in the butt!!
I really doubt the 5090 will need two 600W connections... I think it's probably more for AI, dual-GPU setups, or even a Titan card.
LOL. I don't want to defend lying Intel, but you yourself are not honest: you compare the processors' results without saying a word about the number of cores in each.
Love the point before the crazy hands. Let's do the point at the beginning first; it's a really great opener.
5090 using that much power is ridiculous. I hope the 5080 is not like that.
Not pumped, as I was waiting to get a 9800X3D to replace Intel entirely, but now I may just have to stay with Intel's instability and GPU spiking to 100%.
I'd love to see one of these hardware reviewers do a show on what you realistically need to game. I know everyone is waiting for the next "most powerful" hardware to drop, but let's get real here... do you really need the latest CPU, GPU, NVMe, RAM, motherboard, etc. to game?
1200W has the potential for EPIC gaming
Thank you for confirming that you are an AMD Fanboy😂
Not long now until I can connect a 400V outlet to my PSU to power a single GPU.
If the 9800X3D is too expensive, I'll take the 7600X3D.
Kids these days lack the attention and memory of a simple goldfish, ffs. It's always been like this, generation after generation: one gen of power hogs, then a power-efficient gen. The 4000 series was power efficient; now the 5000 series is going to be power hogs. They just refine the product every second gen and sell it to you as "new" cards. Wait for the 6000 series for power-efficient cards.
Two 12VHPWR connectors because a single connector just wasn't cutting it... all those 600W on tiny wires put unnecessary strain on them. Two connectors seem enough to split the load 300W/300W and prevent melting.
So a GPU now has the real potential to be more power-hungry than an entire rig just one gen back? Yeah, they've lost their minds. How is that better? Imagine your CPU was set at a TDP of 600 watts; would you even be able to cool that? A potential 1200 watts to the GPU, and how is that going to be tamed? A quad-slot cooler with no option but water cooling, plus some fans for the RAM and VRMs? Jokers. Nvidia is just out there on crack at this point.
My PSU already has 2 12vhpwr outs?
Maybe Apple would have won with a bigger battery, but do they offer one? No, they don't.
Hard to believe the 9800X3D will release in one month.
I reckon the new features are that both CCDs have the 3D cache so you don't have to worry about core parking nightmares.
I think the dual connector could simply be so that it isn't trying to pull all of the power through one cable, to help eliminate the burning we had when the 40 series launched.
Say, 300W on each cable rather than 600W on one.
On GPUs: to extrapolate and interpret the info in the video, that is why AMD has stated it is not going to compete with Nvidia at the top end and will concentrate on the middle of the market. A sensible move, unless one hooks the GPU up to the mains or has a dedicated solar battery. AMD's move seems justified by the values of second-hand GPUs, the true market-position indicator, not the loudness of those paid to boost sales by whatever means they choose, like pretending the cards dropped off a god's workbench.
I expect that the dies with 3D cache will not be OC'able, but the other 6/8C will be
I got a 7600X fairly cheap; gonna pair it with a 5080 soon and see how much performance is left on the table before I upgrade.
Those small laptops are difficult to compare; basically, they're all effectively the same. The criterion for choosing one over another comes down to aesthetics.
5:00
Have fun upgrading your apple battery.
that's like a fuckin AC unit power draw
With 1200W I could heat my apartment :) That's just nuts...
It'll be to split the power between two connectors and stop them from getting overloaded.
I personally feel like they went the route of separating the power draw across two connectors so that the whole load isn't on just one of them. This is the same route the overclocker Kingpin would have taken with his version of the 4090, I believe, as he states in the Gamers Nexus video. Seems like a great idea to me!
The only reason I'd ever get a 5080 is for the new VR headsets with 8K resolution.
It would’ve been nice if you could put yourself on the bottom right and not bottom left where you block the test data
YouTube does not stream in true 4K; their compression won't allow it.
I just got an 1100-watt PCIe 5.0 PSU for my 4090, so... yes and no.
Can't wait to upgrade my i7 8700k
This guy has peak youtuber voice.
5090 and 5080 will be out in November not 2025
Either that, or they are going to give SLI another chance?
Man, at this point I'm gonna be waiting till my parts die before upgrading.
I forgot Ryzen 9000 last month.
What happened to your voice?? 🥺🥺
YouTube unsubbed me from this channel.
Garbage; it won't need two 12-pins.
AMD is going to try to rain on the 285K launch.
The 5090 might be two chips...
All I want is a 5900x3d
Hahaha 1200 Watts!!
Nice, so I can spend even more money.
Finally, no forced overclock.
So... we went full circle??
Game over for Intel; go home.
I already have
If you are surprised by the RTX 50xx taking 600W, then you have not been paying attention. The Nvidia H100 was already an over-700W part, and the new H200 will take more; the trend is the same on the consumer side, so no surprises there. This will be a problem in NA, or anywhere else that does not natively have 220V/240V power at the outlet, since about 1,800W is the theoretical max from a 120V, 15A circuit (roughly 1,440W for continuous loads). Let's do some math: 600W for the GPU, 300W for the CPU, 50W for RAM, 25W for the NIC, 25W for a sound card, 50W for storage = 1,050W. Don't plug anything else in there; at least you won't need that spare space heater in the winter.
The H100 and H200 GPU dies are humongous, roughly 35% bigger in die area than the 4090, and they also have way more VRAM. Their high power consumption is absolutely justified.
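The back-of-the-envelope budget a few comments up can be checked against the wall-circuit limit like this. The component wattages are the commenter's estimates, 1,800 W is 120 V x 15 A, and the 0.8 factor is the usual continuous-load derating; PSU conversion losses are ignored, so actual wall draw would be somewhat higher:

```python
# Commenter's estimated DC power budget per component, in watts.
components = {
    "GPU": 600, "CPU": 300, "RAM": 50,
    "NIC": 25, "Sound card": 25, "Storage": 50,
}

total = sum(components.values())      # 1050 W at the DC rails
circuit_va = 120 * 15                 # 1800 VA theoretical from a 15 A NA outlet
continuous_limit = circuit_va * 0.8   # 1440 W for sustained loads

print(f"System draw: {total} W")
print(f"Headroom on the circuit: {continuous_limit - total:.0f} W")
```

So the estimated system squeaks in under a single 15 A circuit's continuous rating, but not by much once PSU efficiency and anything else on the same breaker are counted.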
👍✌
I forgot Gamer Meld instead
Can anyone get more clickbaity than this chan?
The Graphically Challenged channel is also clickbaity, probably even more than this one.
Nope
I won't be needing a PSU. I'll be needing a new grid in my house...