The 7900 XTX is better than 90% of Nvidia's lineup. It's beaten by only two or so of Nvidia's ten-odd cards. That's competitive. Y'all keep calling for them to beat a card y'all aren't going to buy anyway.
@@djnes2k7 That's exactly what I said, but my comment went bye-bye.. What does competition mean? The XTX is $500 cheaper than the 4080S here, so if that's not competing I don't know what is. It's not like these same people will buy Radeon anyway.
It is highly unlikely that is going to happen, BUT making the GPU from smaller chiplets may increase yield on the wafer and may actually reduce the overall cost, which COULD be passed on to users as cheaper GPUs. I won't hold my breath though.
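The yield point above can be made concrete with the classic Poisson defect-density model, where the fraction of good dies falls off exponentially with die area. The defect density and die sizes below are made-up illustrative numbers, not TSMC figures:

```python
import math

def die_yield(area_cm2: float, defects_per_cm2: float) -> float:
    """Poisson model: fraction of dies that come out defect-free."""
    return math.exp(-defects_per_cm2 * area_cm2)

D = 0.1  # assumed defects per cm^2 (illustrative only)

# One big 6 cm^2 monolithic die vs. three 2 cm^2 chiplets.
mono = die_yield(6.0, D)
chiplet = die_yield(2.0, D)

# A 3-chiplet GPU still needs three good dies, but bad dies are
# discarded *before* packaging, so far less silicon is wasted.
print(f"monolithic yield:  {mono:.1%}")
print(f"per-chiplet yield: {chiplet:.1%}")
```

The catch, as the replies below note, is that advanced packaging and the interconnect eat into those savings, so better yield doesn't automatically mean a cheaper card.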
LOL, you think any company is going to pass its savings on to the customer instead of taking a higher profit margin... LMFAO!!!
Chiplets won't necessarily make it cheaper. In the end the more complicated packaging might end up making the GPU more expensive. It will increase power consumption as well.
They could bake multi-GPU support into Vulkan at the software level in order to drive multiple GPU compute dies without having to rely on Infinity Fabric to make it work.
I'm sorry, but I'm tired of hearing AMD will blow everything out of the water... I've been hearing this for more than 10 years with nothing to show for it... It always falls short one way or another: in performance, features, price, or some combination of all three. Zen 1 and Zen+ showed potential; Zen 2 and later actually delivered. Until they have one or two generations that actually steal market share, I'm not going to wait for AMD GPUs. You can't just offer slightly better performance per dollar if the features aren't up to par. You either undercut a lot, or you match/exceed performance and features if you're going to charge for similar performance per dollar. When Nvidia ever drops to 60% market share or below, then you can price-match. The features are usually playing catch-up, coming out months or even years after Nvidia releases theirs. I wanna support AMD, but man am I tired of the BS every time...
@@broose5240 It barely outpaced Hopper in selected benchmarks while coming out almost a year later, lmao, and got demolished in real usage, not to mention it lacks all the libraries and support Nvidia has.
@@broose5240 I should've specified the consumer market, but overall they are lagging behind. The server market works a bit differently because of contracts and whatnot, but hopefully they can bring some competition there too. It'll take a few generations for contracts and projects to catch up, so it'll be harder for AMD's sales to properly reflect that.
Take your meds. AMD graphics cards are very proficient at gaming; the 6000 series held up very well against the RTX 3000 series, and the current gen is very good too (as long as you don't turn on the ray tracing nonsense).
The MI300X units for sale so far run over $15K. Isn't it an apples-to-oranges comparison to say it beats the 4090? It's not like you can get one to use for games or anything like you can a 4090.
The X3D is the gaming chip, but the 9000 series will stomp through productivity apps better than the 7000X3D does. Not everybody games; some people need to do work.
Today on Gamer Meld: More BS that proves the last BS wrong, but will still be BS somehow, and thus proven wrong yet again on the next episode of Gamer Meld.
Can we all agree his trucker hat is not working for his viewership and likes... his likes are borderline bottom; he has more dislikes than likes right now. He needs help saving his underperforming channel. He's not making what others are making in this niche.
9:16 This is the reason we don't have high-end RDNA4; we may still have a chance to see this monster in RDNA5. They cancelled the release because the price would have been a bit much and their software stack could use some love.
For AMD to be at the top of the stack, it will also need to beef up its software tools so that developers can start creating applications that drive people to their platform. This is where Nvidia has the upper hand, as many applications are developed with Nvidia's tools over AMD's.
Not just that. AMD actually needs to spend money to ensure their ecosystem gets used. In Nvidia's case they often have their own engineers working at various other companies (fully paid by Nvidia) to make sure integration happens. AMD goes the open-source route so software developers can do it themselves. That way AMD saves cost rather than sending their paid engineers to those companies like Nvidia does.
10:30 YES! With competition everyone wins. This is the stuff I've been waiting for. Some seriously good stuff coming from AMD. Fingers crossed indeed that they implement this so well that it absolutely crushes Nvidia's monolithic die approach and is way better than Nvidia's best GPU. 🤞 Now THAT would ramp up the competition RPMs to another dimension and customers would win.
@@justgoodvibes1500 Delete is code for kill on platforms that censor certain words or phrases. Like here on RUclips, where creators have to tiptoe around certain topics and make up words in order to say what they want to, even if they're having a positive discussion about topics that need addressing in society.
I think I see why they used a 6600 XT: the new 5800XT is now an entry-level CPU, so I don't think it's as big a deal as everyone is making it out to be.
I will say Hardware Unboxed is silly to make a big fuss about the 5800XT and 5900XT performance figures. If you are choosing these processors you are likely on an older AM4 platform and on a tight budget. It's highly unlikely you'd pair them with a 4090. Yes, from a technical perspective the setup AMD used was flawed if you are comparing performance purely. But what AMD showed is the most likely configuration these chips will find themselves in. If you pair one with a mid-range GPU, there's no difference versus Intel and you get to remain on AM4.
I watch both Steves, but I have to agree with you: if the 5800XT is cheaper, people will buy them, and the same with the 5900XT; if it's cheaper, it's actually a decent option for people who need 16 cores for a NAS/Docker machine. Tubers are running out of content since interest in hardware has dropped significantly due to the current market situation, hence all the clickbaity crapola. Nobody buying these chips would pair them with a 4090, you're correct; people who buy a 4090 are the top 1%, yet they act like everyone buys them.
I made the same point in a comment on their video about it. Somebody commented back about the "true power" of the CPU... at 4K most CPUs are bottlenecked by the GPU; that's why you see things like the 8700K being no faster than an 8400 at 4K. The rule of thumb has always been: get a midrange CPU and pump more money into your GPU than your CPU. If you have a 4090 you don't really need anything more than a 5700X or a 7700X; for just gaming, anything higher in your build is wasted money. Midrange is quite a bit different: you want a more balanced system for a midrange gaming computer. A 5800X3D with an RTX 3070/4070/Ti or RX 6700 XT/7700 XT/7800 XT is 1440p-level hardware. With an RTX 4080 or higher, or an RX 7900 GRE or higher, you don't need any CPU beyond a 5700X or 7700X for 4K. The 7800X3D is a high-end CPU, but at the lower end of the high-end spectrum, with the 14900K and 7950X at the top. Now, AMD did not lie in their graphs, but they did mislead: you can get better performance in many games with DDR5 vs DDR4 (the RAM they paired the 13700K with).
@@pituguli5816 I've left that comment about content on a few H.U. videos lately. They are desperate for new product. Both Steves in Taipei did joint videos about that and sounded desperate. Telling people that CPU A is 10% better than CPU B ONLY when hooked up to a 4090 is meaningless. "How about with mid range graphics?" Oh, about the same.........
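The GPU-bound argument in this thread can be sketched with a toy model where the frame ships when the slower of the CPU and GPU stage finishes (a simplification that ignores pipelining and frame-time spikes); every millisecond figure here is invented purely for illustration:

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame rate is limited by the slower stage."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 4.0, 5.0    # hypothetical CPU costs per frame (ms)
gpu_1080p, gpu_4k = 3.0, 12.0    # hypothetical GPU costs per frame (ms)

# At 1080p the GPU is cheap, so the 25% CPU gap shows up in full:
print(fps(cpu_fast, gpu_1080p), fps(cpu_slow, gpu_1080p))  # 250.0 200.0
# At 4K the GPU dominates and both CPUs land on identical fps:
print(fps(cpu_fast, gpu_4k), fps(cpu_slow, gpu_4k))
```

That is the whole "CPU A is 10% faster, but only with a 4090 at 1080p" phenomenon in four lines: shrink the GPU cost and the CPU gap reappears, grow it and the gap vanishes.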
7:29 Vendor scores don't matter and have no value, as they aren't produced under real-world conditions; AMD has been caught saying as much on Gamers Nexus streams/vids. Vendors in fact use extreme cooling setups that aren't available to the end consumer, on top of having their pick of silicon (meaning they never pull bad tickets from the chip lottery, because they own the lottery).
I’ve had plenty of AMD cards. But they will never catch up to Nvidia, and that’s OK; their products are still amazing. My HTPC is running a 4090, and my kids’ machines a 6900XT and 6700XT.
Multiple GPUs will probably generate a lot of heat. Good luck cooling that down. What kind of power supply will it need? And will anyone really notice more fps than a 4090?
Nvidia will also move to a multi-chip GPU at some point, so AMD really needs to win quickly and stay on top long enough for it to help their brand.
Nvidia only cares about 'AI' whilst AMD still makes products for us gamers, at reasonable prices. We need AMD more than ever in these times when Intel also pulls a Boeing.
AMD needs to get ahead of Nvidia instead of just catching up lol, while making something more efficient too. They need a 4090-like card tbh, instead of just being an afterthought for most people.
Honestly I disagree. I'm running a 5900X with a 6750 XT; I encode my stream on the CPU and run games maxed out on the GPU. These benchmark charts look fine to me.
AMD making multi-chiplets in consumer GPUs will finally close the gap in performance. Remember when they did it with the Ryzen CPU? Now, it's a matter of TDP, firmware, and DRIVERS!!!!!
The issue is that gaming workloads don't like complicated designs like chiplets. It's the reason AMD still hasn't been able to make it work in games. The easiest way to make chiplets work is to change how games are developed. But that means such a design won't work properly on existing games (or won't work at all), and even if it can, it may only work on a single die of the chiplet. That's a no-no for gamers, because they want their GPU to work even with games that came out back in the '90s.
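The scaling concern above can be sketched with an Amdahl-style toy model: only the splittable part of a frame speeds up with extra dies, and every extra die adds cross-die synchronization cost. The 80% parallel fraction and the 0.2 ms per-die sync penalty are invented numbers chosen only to show the shape of the curve:

```python
def frame_time(base_ms: float, dies: int,
               parallel_frac: float = 0.8,
               sync_ms_per_die: float = 0.2) -> float:
    """Toy frame time for a chiplet GPU with `dies` compute dies."""
    serial = base_ms * (1 - parallel_frac)      # work that can't be split
    parallel = base_ms * parallel_frac / dies   # work that scales with dies
    sync = sync_ms_per_die * (dies - 1)         # cross-die overhead
    return serial + parallel + sync

for n in (1, 2, 3, 4):
    print(n, "dies:", round(frame_time(10.0, n), 2), "ms")
```

Returns diminish fast: the second die helps a lot, the fourth barely moves the needle, which is roughly why a design that is great for throughput compute (MI300X) can still disappoint in latency-sensitive games.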
The GPU division of AMD needs to rebirth itself when it comes to software and invest in developer outreach programs. They've had this problem since way back when they were still ATI, and they never learned their lesson. They're never gonna touch Nvidia until they start doing what Nvidia's been doing for decades. Nvidia spends upwards of 200 million dollars a year JUST on developer outreach alone. They ask "hey, what do you guys want?" and then they do something absolutely wild: they give us precisely what we ask for. There's no incentive to use ROCm when Nvidia's software stack is SO stout. Horsepower isn't everything. If gamers care about AMD GPUs, they'll stop recoiling in denial about this and start pushing AMD to fix this habitual oversight so they can actually compete, because if that never happens, Nvidia is going to settle into their throne like Intel did before Ryzen, and that's just bad for everybody. If tomorrow AMD dropped a card AND a software stack that beat Nvidia, I'd sell off my 90s and replace them with AMD cards quicker than you can blink.
I'm actually really bothered by CPU benchmarks only benchmarking in 1080p and 1440p with an RTX4090 or AMD equivalent. Hardly anyone is buying a Core i9 or Ryzen 9 CPU to play at those resolutions. While I get the idea of removing the CPU bottleneck, I'm still very much interested in how it can do at 4k and 8k. Is it really worth getting a 14900K? or at that resolution, is a 14700K just as good?
Also, about the last one with AMD having 3 GPUs in one: Nvidia also has that technology, and it's way more advanced, too. While consumer SLI is dead, Nvidia has been keeping it alive for industry applications and expanding on it, making it faster. So while AMD's Infinity Fabric may be slow, Nvidia's interconnect isn't. Nvidia has been forcing its own standard as well, instead of using open industry standards, so it can force companies to use the same hardware and not be compatible with other vendors.
It will still suck if it doesn't support VR mode out of the box without tweaking... I know VR is still a niche today, just because people are freaking clueless...
Wow, just when I thought the Radeon team couldn't be in a worse spot than they already are, they've outdone themselves and gotten worse: they're now barely winning by using a server-class GPU that's ten times more expensive, and only just edging out Nvidia's consumer GPU lineup. If I hadn't seen the benchmarks myself, I wouldn't have believed it. Looks like Radeon has been completely melted by Nvidia, more than the Nvidia 4090 melts itself. It's official, folks, Radeon is done. No hope left. Goodbye, Radeon!
Michigan here... AMD is going to come out with a dual PCIe-connected CrossFire GPU. I thought it would be here by now, but soon will have to do. As for the interconnect, they are going to be doing an "over/under" dual connect to push the bandwidth needed for the GPUs to communicate with each other. It will also be AI-chip assisted, to more or less guess at the rendering of static or settled objects to get the speed needed... exciting times are on the way.
AMD is hard pressed to come up with new tech and solutions to compete with Nvidia, being able to have several smaller GPU chips work together will give them the win by brute forcing it.
AMD is pulling an Apple. You could say AMD did it first with their G-series and X3D-series chips, but that's their hardware at its strongest. Apple pulled its biggest trump card in software, letting hardware and software speak to one another through Apple Silicon. That all started with Apple's A17 Pro chip on iPhones talking to all the M-series chips on iPads and Macs. And I think AMD is going to push more software into Windows, catching up with Nvidia but also catching Windows up to Apple in software. Intel is in a lot of trouble. They need to push more GPUs that speak to their ARM-chip CPUs; pretty much having ARM chips for both CPU and GPU for better performance with much less power consumption.
I think AMD just set up a joint group with all the major manufacturers (minus Nvidia) to develop an ultra-high-speed, open interconnect protocol called UALink.
Nvidia always seems to have detailed insight into what AMD is producing under its Radeon graphics division. They must be paying extraordinary bribes to some staff members to accomplish this. Every release AMD has, Nvidia just slams something over the top, making Radeon video cards look impotent even though they are very powerful units. Jensen has a net worth in excess of US$100 billion and he won't be happy until he becomes the richest man on earth. There's nothing his money can't buy, yet he'll never be able to spend even 1% of it.
To say "Nvidia will win again" is to imply that AMD tried to take the top spot, when they didn't even release a competitor to the 4090. It's misleading at best, and the 7900 XTX beat the 4080 Super.
Do you (or did you) practice the way you talk on camera? I can't imagine that's how you really talk normally. It's very unique, so don't think I'm being rude; I'm just curious whether you planned it and intend to speak in that fashion.
He talks like that to make it sound like groundbreaking news, when the reality is AMD has been losing market share to Nvidia year on year, for years, because sensible people invest in Nvidia, unlike this AMD fanboy.
7900 XTX user currently... gotta say I'm not impressed by AMD's claims. Gotta pay the big bucks for stability, and right now Nvidia for the most part is where it's at. Don't even want to mention Intel GPUs.
5:18 Yes... but it's a wrong and unrealistic approach most of the time, like when you pair a low-end CPU with a top-of-the-line GPU. You are forcing the CPU to churn out frames at rates that would never happen in the real world, risking inconsistent framerates when a CPU spike occurs (which is worse). You are benchmarking the CPU, but it's a synthetic benchmark with no real-world counterpart. It has been proven time and time again that "futureproofing" the CPU or GPU doesn't work either. Granted, in this case I think it's wrong in the other direction: it's more plausible for someone who wants to extend the life of an existing AM4 platform that already had a high-end graphics card. So they should have paired it with an old high-end GPU, two or even three generations back; these chips don't scale with a one-generation-old low-end card.
Man, I hope AMD does release this, because the 5080 is gonna be a severely castrated 5090. Severely: they're not only gonna remove its balls but the shaft too, lol. I'm a HUGE AMD CPU fan, and I've been really tempted to switch to AMD GPUs too. I got a taste of frame gen with the Legion Go and Z1 Extreme, and color me impressed; pair that with RSR and integer scaling and you've got something truly special, and they didn't lock it all behind a paywall like Ngreedia.
I wouldn't call that a lie. The point is, how are AMD CPUs outperforming Intel ones on the same graphics card and resolution? If Intel CPUs are better, wouldn't they dominate under the same settings and equivalent hardware? AMD didn't lie. Third-party benchmarks are essential to uncover the actual truth, but AMD isn't foolish enough to make such a statement without some basis in reality. I challenge any third-party reviewers to replicate the setups and test for themselves before dismissing it as a marketing lie.
Told you so... AMD is forced to use the 600W connector, what choice do they have? It's why they made the right call to cancel high-end RDNA4 and go forward with Nvidia's 600W connector for high-end RDNA5... it's the only way to climb out of the ditch AMD is in, but will it be too late for them?? As always... time will tell.
This new patent doesn't really show much. I hope it's RDNA4. Splitting the compute die further just adds more latency, similar to the Zen 3 CCD situation, where a workload split across CCDs/CCXs performs worse and only the 3D V-Cache can recover some of that performance. With TSMC's CoWoS fabs maxed out, I doubt AMD can book any of the latest-gen CoWoS and will have to rely on older-gen CoWoS tech. My guess is it's the same Infinity Fabric as RDNA3, and only the more lucrative EPYC and MI300 will be fabbed using the latest-gen CoWoS, which AMD is branding as Infinity Fabric. And that's why RDNA4 is not expected to challenge the 5090, even though the 5090 is still a monolithic die.
we REALLY need a "Ryzen moment" for Radeon GPUs, a total rework to challenge Ngreedia!
that might come with RDNA5
@@Dozik1403 I think I heard that RDNA5 is a ground-up redesign. I think it comes out pretty soon after RDNA4 and covers the high end of the market.
Even better if AMD takes Nvidia's spot and Intel then starts fighting them like AMD does now.
But that would make AMD the new monopoly, though, winning on both the CPU and GPU fronts. I also want to see Nvidia get cut down a peg or two, and I'm hoping Intel's Battlemage does something. I understand Arc performing how it performed was expected; it's literally Intel's beta test.
AMD is a bread-and-butter company; their markets are CPUs, APUs, consoles, handhelds, etc., and they are trying to break into the laptop market, which makes sense considering the market size.
AMD always trying to catch up to Nvidia. It's a good thing. We wouldn't want a monopoly.
"Trying to catch up"? Bro, their price to performance in cards below $800 is literally better 😂
@@KianFloppa And more Vram.
I tried to believe in AMD's GPU segment once, in 2021. They never beat the 3090, and when Nvidia released the 4090, AMD was blown away. Driver issues are still a problem too. Having no CUDA cores means falling behind in AI, GPU mining, and features like DLSS. And you pay a similar price for worse? Well, as long as people believing in AMD makes Nvidia cheaper, I'm all in.
If it doesn’t touch the top end nvidia, it may as well not exist to me.
@@KianFloppa Agreed. I'd personally rather not pay more than $500 for a graphics card (in the past, the number was $350), just because of how price to performance in the next generation usually ends up. Graphics cards are one of the easier parts of a PC to upgrade anyway. I'd almost be more impressed if they came up with a great $200 card than a $1200 one.
I'm waiting for the Title: AMD's New GPU EATS the 5090!
That would be great for the market..
Won't happen. After the 5090 is done, Nvidia will destroy the next iteration of AMD...
Check back in 4 years.
I do want this to happen, and if they do it, it'll cost half what a 5090 does. Sad but true. Screw Ngreedia.
Great headline only if the card is affordable. The high margins Nvidia is getting away with are ridiculous.
Gamer meld uploads be like:
THE RX 8000 IS EVEN BETTER THAN WE TOUGTH!
Then:
THE RX 8000 IS NERFED!
Yup, one of his latest ones: "RX 8000 dead on arrival." But youtubers need content even if there is none.
@@truthhitshard There is content. It's something that people create with their brains. It never runs out. This dude does not create content.
Thought
@@ProfileUserNumber He's got a hater, he must be doing something right.
Gamer Meld: I am the content!
This is the most clickbait title I have ever seen.
All of his videos have click bait titles.
It wouldn't be if it was for ngreedia, right?
@@alfredlordtennyson5464 I hate Ngreedia for many reasons, but this comment is correct. GamerMeld, Graphically Challenged and other youtubers like these two mostly just make clickbait videos with news and rumors gathered from all around.
The guy's an actual duckhead. I'd listen to much better youtubers out there.
yet you're still here with us
Friday: AMD blows EVERYTHING out of the water!
Monday: AMD is dead before they even launch!
Lmao this dude with his video titles haha
This is the problem with being a news channel. Things will pop up suddenly and you don’t know when. You kinda have to go along with the predictions in order to keep the flow going. Nobody is going to click on a video because he said: “new amd news!”. It’s just how it is tracking leaks and rumors.
How true.
I think this AMD Radeon triple-GPU setup is why you're seeing Nvidia rush to release this cycle: they can get far enough ahead that AMD ends up catching up only to the newly released Nvidia cards in the October timeframe.
It's always felt like Nvidia has a line into what AMD is doing on the manufacturing side of things.
I understand AMD on the issue of your reporting on the XT vs 13700K and such. They know, like all of us, that more of their users have an RX 6600 or a low-end card of any brand, because money is tight and they can't afford a high-end GPU. So they are actually showing real-life data, not lying.
I understand why they test with the best GPUs possible, but those end up being no better than synthetic numbers with very little real-world application. He mentions Hardware Unboxed, and they are the worst for that. BFD that a particular CPU can grind out 5% more performance when coupled with a 4090. Hardware Unboxed is a joke of a channel: their tests are 100% correct and accurate, and absolutely meaningless to most people. They are also completely self-centered with the "NVIDIA doesn't care about gamers" stuff. Yeah, so what? NVIDIA exists to make money. These channels are crying because they've talked the hardware to death and they want new hardware so they can create content. It's not the manufacturers' job to help youtubers and websites out.
Probably should note that the Instinct cards run the CDNA architecture, which is AMD's compute-optimized GPGPU architecture. Unlike RDNA it doesn't have ROPs and therefore isn't capable of graphical rendering like RDNA is. AMD also doesn't provide DirectX/OpenGL-compatible driver stacks for these cards, as their main use case doesn't necessitate DirectX, OpenGL, or Vulkan.
Due to this, I would take the datacenter results with a grain of salt when trying to equate Instinct performance to any potential future RX 8000 or 9000 GPUs, especially as CDNA is a massive multi-chip module. Its TDP is rated at 750 watts, where most consumer-class RDNA chips don't even crack 350W, because they're much smaller and geared toward being "affordable" (lolnope) for the average consumer. Expect RX 8000 and RX 9000 to be smaller modules, with performance below that of the MI300X or its future equivalents.
Dude, it's fucking 2024, there's about to be a whole-ass new lineup; at this point nobody is buying 4090s anymore lol
You wish; the 5090 is expected to be 30% faster than the 4090. The 5080 is gonna be slightly slower than the 4090, which means the 4090 will be the second-fastest card after the 5090. People who can't afford a 5090 will buy a 4090 (new or used). AMD needs more work to catch up with Nvidia.
It's more than 3 GPUs. They patented the MI300X architecture, which can split up workloads and also act in unison. That means in the future they could easily wipe the floor with Nvidia by adding an extra GPU die whenever they need to.
tbh I am excited for what new tech the RTX 50 series will bring to the table with DLSS but I am even more excited for AMD this time
The thing is, by the time they've crushed the 4090, Nvidia will be working on a 6090. AMD needs to catch up by thinking outside the box; otherwise I'd say this is controlled competition: artificially making us think they're competitors, and that's a lie.
People who buy a 5800XT or 5900XT are not gonna buy a high-end GPU, so it actually makes sense..
If they buy a high-end GPU, they get the 5800X3D.
Smarttt
Nah, they'll wait for the 9800X3D.
very true!
That's not true. I just got a 4070 Ti Super and will upgrade to the best gaming AM4 processor. These benchmarks literally delayed me from getting a 5800X3D, but I'm kinda glad, because now it's $40 cheaper than when I was going to buy it.
@@Eskoxo That means nothing to the people these chips were for: AM4 motherboard owners who are waiting until AM6 to upgrade their motherboard.
Fair play, but everyone is in the wrong here. The XT refresh was just a refresh for mid-range systems, so the RX 6600 is still good. They didn't do anything wrong, they just didn't say ANYTHING about high-end systems.
If they're gonna make a genuine 3x GPU it might be wild... I can't imagine how much performance you could get out of 3 x 7800 XT or 3 x 7900 XTX...
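Worth noting that three dies rarely means three times the frames, though. A quick Amdahl's-law sketch shows why (the 5% serial fraction here is a made-up illustrative number, not anything from AMD — real multi-die GPUs also pay interconnect latency on top of this):

```python
# Hypothetical scaling of a multi-die GPU under Amdahl's law.
# serial_frac is an illustrative assumption, not a measured figure
# for any real card: it models work that cannot be split across dies.

def speedup(n_dies, serial_frac=0.05):
    """Amdahl's law: only the parallel fraction scales with die count."""
    return 1.0 / (serial_frac + (1.0 - serial_frac) / n_dies)

for n in (1, 2, 3):
    print(f"{n} die(s): {speedup(n):.2f}x")
```

Even with only 5% of the frame being un-splittable, three dies land around 2.7x, not 3x, and that's before any chiplet-to-chiplet latency penalty.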
It's not a lie; it's just misleading since all the details are in the footnote. Omitting the footnote, however, would have constituted a blatant lie.
I thought they weren't releasing high end gpus
Next, next gen bud
That's for 8k series this is for 9k series.
they aren't releasing high end for 8000
To be fair, there were a lot of clickbait titles that would make people think AMD is completely giving up on high end.
RDNA 5 will be high end gpus.
I have a different imagination of what the dude behind this voice looked like. I was way oooffff 😂
Nice video. I'm gonna wait to make any judgements on the next gen hardware until release.
MCM video cards WON'T HAPPEN, STOP. MCM = latency; it's not going to happen on gaming video cards. Please stop this crap.
RX6600 is roughly similar performance to RTX2070 / RTX3060. I never thought I'd live to see the day when this was considered "low end."
The 3060 is now struggling to play modern games at 1080p 60 fps. I have it, and I have to turn down settings to get 60 fps.
I don't know what these AAA game companies are doing, because I don't see graphics change drastically, but somehow these new games with old graphics don't run well.
@@MMC619 Modern games have really bad optimization, and a lot of devs seem to not even know what they are doing. So the game looks and runs badly. It should be unacceptable, and they just seem to solve bad performance with DLSS/FSR...
PLEASE AMD, give us some true competition against Ngreedia, please!!!
They already have, but gamers are the most idiotic demographic out there... I can get a 7900 XTX for $1389 here while the cheapest 4080S is $1900; if that's not competition, what is? Seriously, what's considered competition to you? I really want to know.
Every company is greedy, people are greedy; this world runs on greed.
The 7900 XTX is better than 90% of Nvidia's lineup. It's literally behind only 2 or so of Nvidia's 10-odd cards. That's competitive. Y'all calling for them to beat a card y'all aren't going to buy anyway.
@@djnes2k7 That's exactly what I said, but my comment went bye-bye.. What does competition mean? The XTX is $500 cheaper than the 4080S here, so if that's not competing I don't know what is. It's not like these same people will buy Radeon anyways.
@@yMMahmood0990 Agreed.
It is highly unlikely that is going to happen, BUT making the GPU from smaller chiplets may increase yield on the wafer and may actually help reduce the overall cost, which COULD be passed on to users as cheaper GPUs. I won't hold my breath though.
LOL, you think any company is going to pass its savings on to the customer instead of taking a higher profit margin... LMFAO!
Chiplets won't necessarily make it cheaper. In the end, the more complicated packaging might make the GPU more expensive. It will increase power consumption as well.
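The yield side of this trade-off can be sketched with the classic Poisson die-yield model, Y = exp(-D0 * A). The defect density and die areas below are illustrative assumptions, not TSMC figures:

```python
import math

# Poisson die-yield model: Y = exp(-D0 * A).
# D0 (defects per cm^2) and die areas are illustrative assumptions.

def die_yield(area_cm2, d0=0.2):
    return math.exp(-d0 * area_cm2)

mono = die_yield(6.0)   # one 600 mm^2 monolithic die
chip = die_yield(2.0)   # one 200 mm^2 chiplet

print(f"monolithic die yield: {mono:.1%}")
print(f"single chiplet yield: {chip:.1%}")

# Key point: a defect scraps 600 mm^2 in the monolithic case but only
# 200 mm^2 in the chiplet case, since good chiplets are binned and
# assembled. Usable silicon per wafer tracks the per-die yield:
print(f"wafer area wasted, monolithic: {1 - mono:.1%}")
print(f"wafer area wasted, chiplets:   {1 - chip:.1%}")
```

Under these made-up numbers roughly 70% of monolithic silicon is scrapped versus about a third of chiplet silicon, which is the yield-savings argument; whether packaging costs eat those savings, as the comment above suggests, is a separate question.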
I'll be looking at an AM4 CPU shuffle instead. Wake me when September ends.
They could bake in multi-gpu support onto vulkan at the software level in order to support multiple gpu compute dies without having to rely on infinity fabric to make it work.
if rx8000 has similar preformance at a decently reduced price with large power efficiency improvements it might fully capture the mid and low end
I'm sorry, but I'm tired of hearing AMD will blow everything out of the water... I've been hearing this for more than 10 years with nothing to show for it... It always falls short one way or another: in performance, features, price, or some combination of all 3. Zen 1 and Zen+ showed promise; Zen 2 and higher actually had potential. Until they have 1 to 2 generations of actual market-share stealing, I'm not gonna wait for AMD GPUs. You can't just give slightly better performance per dollar if the features aren't up to par. You either undercut a lot, or match/exceed performance and features if you're going to offer similar performance per dollar. Once Nvidia ever drops to 60% or sub-60% market share, then you can price match. The features are usually playing catch-up, coming out months or even years after Nvidia releases them. I wanna support AMD, but man am I tired of the BS every time...
did you miss that the AMD MI300 is faster than anything Nvidia has?
@@broose5240 it barely outpaced hopper in selected benchmarks, while coming out almost a year later than it lmao, and got demolished on real usage, let alone that it lacks all the libraries and support nvidia has.
@@broose5240 I should've specified the consumer market, but overall they are lagging behind. The server market works a bit differently because of contracts and whatnot, but hopefully they can bring some competition there too. It'll take a few generations for contracts and projects to catch up, so it'll be harder for AMD's sales to properly reflect that.
Well, I'm still on my old RX 480 (8 GB).
Take your meds. AMD graphics cards are very proficient in gaming; the 6000 series held up very well against the RTX 3000 series, and the current gen is very good also (as long as you don't turn on the ray tracing nonsense).
The MI300X's for sale so far run over 15K. Isn't it an apples to oranges comparison when saying it beats the 4090? It's not like you can get one to use for games or anything like the 4090.
It absolutely is. However, just the fact that they benchmarked it put a smile on my face. Being ridiculous for its own sake.
The X3D is the gaming chip, but the 9000 series will stomp through productivity apps better than the 7000X3D does. Not everybody games; some people need to do work.
Today on Gamer Meld: More BS that proves the last BS wrong, but will still be BS somehow, and thus proven wrong yet again on the next episode of Gamer Meld.
Can we all agree his trucker hat is not working for his viewership and likes... his likes are borderline bottom; he's got more dislikes than likes right now. He needs help saving his underperforming channel. He's not making what others are making in this niche.
9:16 This is the reason why we don't have high-end RDNA4; we may still have a chance to see this monster in RDNA5. They cancelled the release as the price would have been a bit much, and their software stack could use some love.
For AMD to be at the top of the stack, it will also need to beef up its software tools, so that developers can start creating applications that will drive people to their platform. This is where Nvidia has the upper hand, as many applications are developed with their tools over AMD's tools.
Not just that. AMD actually needs to spend money to ensure their ecosystem gets used. In Nvidia's case, they often have their own engineers working at various other companies (fully paid by Nvidia) to ensure integration happens. AMD goes the open source route so software developers can do it themselves. This way AMD saves cost rather than sending paid engineers to those companies like Nvidia does.
I'll be buying NON X3D anyways because they will be cheaper and faster in workloads
What is the point of Radeons next gen? It seems like placeholder so they can say they released something.
The title is misleading. It insinuates this GPU doubles as a depth charge. Very disappointed.
I'll believe it when I see it.
AMD, FTW!
Being a fanboy of anything is weak.
@@dauntae24 It's why nvidia is still king
Did you even watch the video before making this comment?
@@einstien2409 Fanboys don't care they wanna be supportive even if the products turn out to be more mediocre than a box of sand
@@jasonvors1922 Yeah, most companies base their IT spend on being a fanboy. /s
I'm wondering if the true MCM design is close, and that's why they decided to apparently only focus on midrange with the 8000 series.
10:30 YES! With competition everyone wins. This is the stuff I've been waiting for. Some seriously good stuff coming from AMD. Fingers crossed indeed that they implement this so well that it absolutely crushes Nvidia's monolithic die approach and is way better than Nvidia's best GPU. 🤞 Now THAT would ramp up the competition RPMs to another dimension and customers would win.
Yall FW the 9950XTX3D? It's the 9950XTX, but with 3D Cache!
That's crazy! I would definitely buy that, it's awesome
Still waiting for the 9950XTX Ti FTW
More X is better
“Delete me” doesn’t play well with its current meaning in internet lingo..
What does it mean?
@@justgoodvibes1500 Delete is code for kill on platforms that censor certain words or phrases. Like here on YouTube, where creators have to tiptoe around certain topics and make up words in order to say what they want to, even if they are having a positive discussion about topics that need addressing in society.
It's been 2 decades of hearing this.. wake me up when they actually do it
Ryzen 7600 actually beats 5800X3D in many games, so it wouldn't be that far fetched to expect for 9000-series to do the same.
It really reminds me of those old Nvidia cards with multiple chips from the 2010s.
I think I see why they used a 6600 XT: the new 5800XT is now an entry-level CPU, so I don't think it's as big a deal as everyone is making it out to be.
MI300X has the crown of compute and VRAM. I don't see any surprise.
I will say Hardware Unboxed is stupid to make a big fuss about the 5800XT and 5900XT performance figures. If you are choosing these processors, you are likely on an older AM4 platform and on a tight budget. It's highly unlikely that you will pair them with a 4090. Yes, from a technical perspective the setup AMD used was flawed if you are comparing performance purely, but what AMD showed is the most likely configurations these chips will find themselves in. If you pair one with a mid-range GPU, there is no difference versus Intel, and you get to remain on AM4.
I watch both Steve's but I have to agree with you, if the 5800XT is cheaper people will buy them same with the 5900XT if its cheaper its actually a decent option for people who require 16 cores for a NAS/Docker machine. Tubers are running out of content since the interest level for hardware has dropped significantly due to the current market situation, hence all the clickbaity crapola. Nobody buying any of these chips would pair them with a 4090 you are correct, people who buy 4090 are the top 1%, they go on like everyone buys them.
I made the same point in a comment on their video about it. Somebody commented back about the "true power" of the CPU... at 4K most CPUs are bottlenecked by the GPU; that's why you see things like the 8700K being no faster than an 8400 at 4K. The rule of thumb has always been: get a midrange CPU and pump more money into your GPU than your CPU. If you have a 4090 you don't really need anything more than a 5700X or a 7700X; for just gaming, anything higher than that in your build is wasted money. Midrange is quite a bit different, as you want a more balanced system for a midrange gaming computer: a 5800X3D with an RTX 3070/4070/Ti or RX 6700XT/7700XT/7800XT is 1440p-level hardware. With an RTX 4080 or higher, or an RX 7900 GRE or higher, you don't need any CPU above a 5700X or 7700X for 4K. The 7800X3D is a high-end CPU, but at the lower end of the high-end spectrum, with the 14900K and 7950X being the top-end high-end CPUs.
Now, AMD did not lie in their graphs, but they did mislead... you can get better performance in many games with DDR5 vs DDR4 (the RAM they paired the 13700K with).
@@VoldoronGaming For 4k yes I agree with you.
You could have stopped after the word stupid. H.U. tests are based on a reality that not many people can afford.
@@pituguli5816 I've left that comment about content on a few H.U. videos lately. They are desperate for new product. Both Steves in Taipei did joint videos about that and sounded desperate. Telling people that CPU A is 10% better than CPU B ONLY when hooked up to a 4090 is meaningless. "How about with mid range graphics?" Oh, about the same.........
7:29 Vendor scores do not matter, nor do they have value, as they are not set under real-world conditions; AMD has often been caught saying as much on Gamers Nexus streams/vids.
They in fact use highly extreme cooling setups that are not available to the end consumer, on top of having their choice of silicon (meaning they never pull bad tickets from the chip lottery, as they own the lottery).
AMD: Our Toyota Corolla is faster than a Lamborghini
*Lamborghini speed locked @ 60 Kph.🤫🤫🤫
Or it's just a GR vs a Miura
IS THE 8950 XTX COMING OUT OR NOT???
@Gamer Meld Motorcycle riding video when?
😂
@@hotsytotsy80s I wanna see him do something besides rumor videos
Motorcycle would break because of his weight
@@THE_BOSS_THE1. 🤣🤣
@@jasonvors1922 i want him to release his real voice.
I know he's holding it.
With Nvidia the leaks usually mean something. With AMD, I am not sure..
Can't wait for Gamer meld to upload a video with the title RX 10900XT beats RTX 6090 🙏🔥
As much as we all want at least ONE dominant AMD GPU generation, we ain't even got RX 8000 yet. I refuse to hold my breath.
Yeah Ok.
I’ve had plenty of AMD cards. But, they will never catch up to Nvidia. And that’s OK. Their products are still amazing. My HTPC is running a 4090, and my kids 6900XT and 6700XT.
Multiple GPUs will probably generate a lot of heat. Good luck cooling that down. What kind of power supply will that need? Is anyone really going to notice more fps than a 4090?
man, the price of that MI300X is gonna be sky-high, right?
Nvidia will also move to a multi chip GPU at some point so AMD really needs to win quick and stay at the top for long enough for it to help their brand
Nvidia only cares about 'AI' whilst AMD still makes products for us gamers, at reasonable prices. We need AMD more than ever in these times when Intel also pulls a Boeing.
AMD needs to be ahead of nvidia instead of just catching up lol. While making something that’s more efficient too. They need a 4090 like card tbh, instead of just being an afterthought for most people
I hope it can run 4K content with fully maxed-out settings at at least 240 fps. It should, if it's got 3 GPU dies.
Honestly I disagree. I am running a 5900X with a 6750 XT; I encode my stream via CPU and run games at max with the GPU.
These benchmark charts look fine to me.
I am going to be upset if Zen 5 X3D is a minor performance uplift in gaming workloads, given it's been delayed until CES 2025.
AMD making multi-chiplets in consumer GPUs will finally close the gap in performance. Remember when they did it with the Ryzen CPU? Now, it's a matter of TDP, firmware, and DRIVERS!!!!!
The issue is that gaming workloads don't like complicated designs like chiplets. It's the reason AMD is still unable to make it work in games. The easiest way to make chiplets work is by changing how games are developed, but that means such a design would not work properly on existing games (or not work at all). And even if it could, those games would only run on a single die of the chiplet. That's a no-no for gamers, because they want their GPU to work even with games that came out back in the 90s.
The GPU division of AMD needs to rebirth themselves when it comes to software and invest in developer outreach programs. They've had this problem since way back when they were still ATI, and they never learned their lesson. They're never gonna touch Nvidia until they start doing what Nvidia's been doing for decades. Nvidia spends upwards of 200 million dollars a year JUST in developer outreach alone. They ask "hey what do you guys want?" and then they do something absolutely wild: they give us precisely what we ask for. There's no incentive to use ROCm when Nvidia's software stack is SO stout. Horsepower isn't everything.
If gamers care about AMD GPUs, they'll stop recoiling in denial about this and start pushing AMD to fix this habitual oversight so they can actually compete, because if that never happens, Nvidia is going to settle into their throne like Intel did before Ryzen, and that's just bad for everybody. If tomorrow AMD dropped a card AND software stack that beat Nvidia, I'd sell off my 90s and replace them with AMD cards quicker than you can blink.
I'm actually really bothered by CPU benchmarks only benchmarking in 1080p and 1440p with an RTX4090 or AMD equivalent. Hardly anyone is buying a Core i9 or Ryzen 9 CPU to play at those resolutions. While I get the idea of removing the CPU bottleneck, I'm still very much interested in how it can do at 4k and 8k. Is it really worth getting a 14900K? or at that resolution, is a 14700K just as good?
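The bottleneck question above is really just frame-time arithmetic: frame rate is set by whichever of the CPU or GPU takes longer per frame, which is why CPU differences shrink at 4K. A sketch with made-up millisecond figures (illustrative assumptions, not benchmarks of any real parts):

```python
# Frame rate is limited by whichever component takes longer per frame.
# All frame times (ms) are illustrative assumptions, not benchmarks.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu, slow_cpu = 4.0, 6.0    # per-frame CPU time, ms
gpu_1080p, gpu_4k = 5.0, 16.0    # per-frame GPU time, ms

print(f"1080p: fast CPU {fps(fast_cpu, gpu_1080p):.1f} fps, "
      f"slow CPU {fps(slow_cpu, gpu_1080p):.1f} fps")
print(f"4K:    fast CPU {fps(fast_cpu, gpu_4k):.1f} fps, "
      f"slow CPU {fps(slow_cpu, gpu_4k):.1f} fps")
```

Under these numbers the two CPUs differ at 1080p but produce identical fps at 4K, since the GPU's 16 ms dominates either CPU time; that's the case for also publishing 4K results, so buyers can see when the pricier CPU stops mattering.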
Also, about the last one with AMD having 3 GPUs in one: Nvidia also has that technology, and a way more advanced version too. While consumer SLI is dead, Nvidia has been keeping it alive for industry applications and expanding on it, making it faster. So AMD's Infinity Fabric may be slow, but Nvidia's isn't. Nvidia has been pushing this as a proprietary standard as well, instead of using open industry standards, so they can force companies to use the same hardware and not be compatible with other vendors.
It will still suck if it does not support VR mode out of the box without tweaking... I know VR is still a niche today, just because people are freaking clueless...
Wow, just when I thought the Radeon team couldn't be in a worse spot than they already are, they outdo themselves and get worse. They're now barely winning by using a server-based GPU that's ten times more expensive, and only managing to edge out Nvidia's personal computer GPU lineup by a little. If I hadn't seen the benchmarks myself, I wouldn't have believed it. Looks like Radeon has been completely melted by Nvidia, more than the Nvidia 4090 melts itself. It's official, folks, Radeon is done. No hope left anymore. Goodbye, Radeon!
Michigan here... AMD is going to come out with a dual PCIe-connected Crossfire GPU. I thought it would be here by now, but soon is going to have to do. As for the interconnect, they are going to be doing an "over/under" dual connect to push the bandwidth needed for each GPU to communicate. It will also be AI-chip assisted, to more or less guess at the rendering for some set or settled objects to get the speed needed... exciting times are on the way.
Dont forget, 128GB of VRAM
Every single time since Polaris, it's been "the generation after the next one."
AMD is hard pressed to come up with new tech and solutions to compete with Nvidia, being able to have several smaller GPU chips work together will give them the win by brute forcing it.
Turn RT and 4K on, and AMD's frame rate catches fire.
AMD is pulling an Apple. You can say that AMD did it first with their G-series and X3D-series chips, but that is their hardware at its strongest.
Apple pulled their biggest trump card in software, allowing hardware and software to merge and speak to one another through Apple Silicon. That all started with Apple's A17 Pro chip on iPhones talking to all M-series chips on iPads and Macs.
And I think AMD is going to push more software into Windows, catching up with Nvidia, but also catching Windows up to Apple in software.
Intel is in a lot of trouble. They need to push more GPUs to speak to their ARM-chip CPUs, pretty much having ARM chips for both CPU and GPU for better performance with much lower power consumption.
I think AMD just set up a joint group with all the major manufacturers (minus Nvidia) to develop an ultra high speed, open source interconnect protocol called UAlink.
Yeah they did. I recently discussed that. I wonder if it could come into play here.
@@GamerMeld I don't want to add 2+2 and come up with 5, but seems like it's too much of a coincidence.
@@GamerMeld this is tech for data centers, i think. With all this new tech its pushing the market fast for higher end gpu's.
Easy: multi-GPU mode is for the datacenter... nothing more to see here.
Nvidia always seems to have detailed insight into what AMD is producing under their Radeon graphics division. They must be paying extraordinary bribes to some staff members in order to accomplish this. Every release AMD has, Nvidia just slams itself over the top, making Radeon video cards look impotent even though they are very powerful units. Jensen has a net worth in excess of USD $100 billion, and he won't be happy until he becomes the richest man on earth. There is nothing his excess money can't buy, yet he will never be able to spend even 1% of it.
12 yrs ago maybe , today nope not even close.
Thanks again for the news.
To say Nvidia will win again is to imply that AMD tried to take the top spot, when they didn't even release a competitor card to the 4090. It's misleading at best, and the 7900 XTX beat the 4080 Super.
This just tells me that AMD could easily beat their current specs they just don't want to yet
I keep hoping AMD will someday put up stiff competition, but their endless over-promising while under-delivering makes me never trust that they will.
Do you (or did you) practice the way you talk on camera? I can't imagine that is how you really talk normally; it's very unique, so don't think I am being rude, I'm just curious if you planned it and intend to speak in that fashion.
he talks like that to make it sound like groundbreaking news, when the reality is AMD has been losing market share to Nvidia year on year, for years, because sensible people invest in Nvidia, unlike this AMD fanboy
7900 XTX user currently.. gotta say I'm not impressed by AMD's claims. Gotta pay the big bucks for stability, and right now Nvidia for the most part is where it's at. Don't even want to mention Intel GPUs.
Uh huh and this has been the pattern for RDNA GPU's right?.. Right?
5:18 yes... but it's a wrong and unrealistic approach most of the time.
Like when you pair up a low-end CPU with a top-of-the-line GPU. You are forcing the CPU to churn out frames that in the real world would never happen, risking inconsistent framerate when a CPU spike occurs (which is worse).
You are benchmarking the CPU, but it's a synthetic benchmark with no real world counterpart.
It has been proven time and time again that "futureproofing" the CPU or GPU doesn't work either.
Granted, in this case I think it's wrong; it's more plausible for someone who wants to extend the life of an existing AM4 platform that had a previous high-end graphics card. So they should have paired it with an old high-end GPU, 2 or even 3 generations old, not a 1-generation-old low-end card.
Hilarious, I was just thinking: why don't they make 2-4-chip GPU units and run them in series? It wouldn't be hard.
Man, I hope AMD does release this, because the 5080 is gonna be a severely castrated 5090; they're not only gonna remove its balls but the shaft too lol. I'm a HUGE AMD CPU fan and I've been really tempted to switch to AMD GPUs too. I got a taste of frame gen with the Legion Go and Z1 Extreme, and color me impressed; that paired with RSR and integer scaling, and you've got something truly special, and they didn't lock it all behind a paywall like Ngreedia.
AMD already beats Nvidia, in my opinion, when it comes to core performance.
I've never seen you so passionate making your point about how AMD was cheating using that 6600. You got my attention! Thanks!
I wouldn't call that a lie. The point is, how are AMD CPUs outperforming Intel ones on the same graphics card and resolution? If Intel CPUs are better, wouldn't they dominate under the same settings and equivalent hardware?
AMD didn't lie. Third-party benchmarks are essential to uncover the actual truth, but AMD isn't foolish enough to make such a statement without some basis in reality.
I challenge any third-party reviewers to replicate the setups and test for themselves before dismissing it as a marketing lie.
AMD is not beating Nvidia. Nvidia is just too powerful in gpu making.
Told you so... AMD is forced to use the 600W connector, what choice do they have? It's why they made the good call to cancel RDNA4 high end and go forward with Nvidia's 600W connector for RDNA5 high end. It's the only way to get out of the ditch AMD is in, but will it be too late for them? As always... time will tell.
my water still in my cup
They always say this, any amd card I’ve used is so buggy and annoying, Nvidia never fails.
I always support AMD, I really would hate Nvidia having all the market, give amd some love.
If Nvidia turns straight towards AI, then there will be no better opportunity for AMD to exploit the gap it leaves in GPUs. Nvidia has already dropped gamers.
This new patent doesn't really show much. I hope it's RDNA4. Splitting the compute die further just adds more latency, similar to the Zen 3 CCD situation, where a workload split across CCDs/CCXs has poorer performance and only the 3D V-Cache can recover some of it.
With TSMC's CoWoS fabs maxed out, I doubt AMD can book any of the latest-gen CoWoS and will have to rely on older-gen CoWoS tech. My guess is it's the same Infinity Fabric as RDNA3, and only the more lucrative EPYC & MI300 will be fabbed using the latest-gen CoWoS, which AMD is branding as Infinity Fabric.
And that's why RDNA4 is not expected to challenge 5090 because 5090 is still a monolithic die.