RDNA 5: Can AMD Finally Dominate GPUs? HUGE Architecture Changes
- Published: 15 Jul 2024
Today we have some VERY exciting updates for AMD's RDNA 5, which according to recent leaks will bring HUGE architecture changes following RDNA 4's "bug fix" generation targeting the budget and mid range. Join us as we examine the latest leaks on RDNA 5, what these big changes in the Radeon architecture mean for the GPU, and of course the interesting question of how Nvidia will respond. While bigger numbers in the hardware specs matter, it's the big improvements in architecture that will allow for truly huge leaps in performance in gaming, ray tracing, and even things like path tracing.
RTX 50 Blackwell is shaping up to be a beast, and by the time RDNA 5 releases it will have been out for some time, so we could see some interesting moves from Nvidia to stay competitive if the rumours about RDNA 5's jumps in architecture and performance are true. But just who will win? Since RDNA 4 won't be targeting the high-end / flagship graphics cards at all, it's going to be crucial for AMD to come out swinging with RDNA 5.
LINKS
benchlife.info/geforce-rtx-50...
www.chiphell.com/thread-26041...
/ 1784561456694046744
Subscribe - ruclips.net/user/RedGamin...
Join our Discord - / discord
Tip on PayPal - paypal.me/RedGamingTech
Whokeys 25% off code: RGT
Windows 11 Pro $23.20: biitt.ly/581eD
Windows 10 Pro $17.60: biitt.ly/9f0ie
Windows 10 Home $15: biitt.ly/XYm9o
Office 2016 $28.80: biitt.ly/cy7Vb
Office 2019 $48.60: biitt.ly/YInvw
www.whokeys.com/
AMAZON AFFILIATE LINKS
US - amzn.to/2CMrqG6
Canada - amzn.to/2roF4XO
UK - amzn.to/2Qi6LMS
GreenManGaming Affiliate link -
www.greenmangaming.com/?tap_a=...
SOCIAL MEDIA
www.redgamingtech.com for more gaming news, reviews & tech
/ redgamingtech - if you want to help us out!
/ redgamingtech - Follow us on Facebook!
/ rgtcrimsonrayne - Paul's Twitter
/ redgamingtech - Official Twitter
0:00 Start
0:01 Today's topic
0:16 RDNA 3 fell short... what does this mean for RDNA 4?
3:05 RDNA 5 update
5:13 How will Nvidia respond?
Royalty-Free Music - www.audiomicro.com/royalty-free-music - Games
Hi guys, we've tried to change the presentation style based on all your feedback - please let us know your thoughts!
Less "basically", please; I'm close to dying from drinking every time you say "basically".
@@Joker-no1fz I thought after the first 3 or 4 victims of alcohol poisoning we decided to stop this game. At least with "basically".
Good for me. The only thing I would change would be to have a small rectangle box overlay in one of the corners of the screen with you on film talking through the points as I feel the audience is a bit disconnected from you if they only see the b roll and screenshots.
I think that would strike a better balance as the audience doesn’t need to see you on film in full screen but it would be good to have a smaller version of you there.
Kirk Kreifels has a RUclips channel about cars, and he does certain videos dedicated to cars rumoured to be coming out in the future and their potential specs. I think he has a good format for going through it (e.g. internet sources) with the audience; might be worth a look for inspiration ("Revival the new Toyota Celica returns with massive power and AWD" is a video in this style).
👌
Thanks for the funny video
From dead to Dominate in 24h 😅
From dead to Dominate in 24h, that escalated quickly.
This is how the rumour mill goes
When did we say it was dead?
Yeah, even if AMD were only making mid-range GPUs from now until eternity, that wouldn't mean they're dead. To be fair, I am very doubtful about the RDNA 5 leaks; it seems to go this way every single time. RDNA 1, Vega, Polaris, Fiji were all originally supposed to go head to head with NVIDIA. Every single time the story was "the next arch is really good" just as it became clear the current arch wasn't going to go head to head with NVIDIA. So I will wait until we are significantly closer, until AMD has final silicon in hand, before I even entertain any leaks.
RDNA 2 was the only exception to the rule; in raster it truly was pretty competitive with Ampere. But 1 for 5 is not a good track record. I suppose you also had the 200 series, but I wasn't following tech news that closely back then, so I don't know what the rumour mill was like.
@@RedGamingTech He was probably referring to an article in a "serious" online tech outlet saying that the Radeon GPU division was in decline and would probably be shut down in the next few years lol
Bryan from TechYesCity made a video about it yesterday :)
I see. Thanks
Don't care who makes the very fastest $2000 GPU! A $550-or-less GPU with a good price-to-performance uplift is what most people care about!
I hope this is where AMD is going. Plus it makes sense for the PS6 or a new Xbox. Sure, there are those who can't wait to throw two RTX 5090s into a PC, but I don't want my PC builds to be based around some stupid power-hungry card. Turn down some settings that make F-all difference, drop to 1440p if necessary. I want a nice quiet build; the latest AAA title is only a small part of the gaming out there. Plus I buy GOTY editions at big discounts, so a new high-mid-tier card is enough.
@@nimblegoat The PS6 is not coming anytime soon; pointless comment.
@@ZackSNetwork Thanks, can you tell me the lead time for developing a new design? Or do AMD and Sony get together 1 month before launch to work it all out? Really mate, think about it: AMD works with Sony and Microsoft to find new solutions. If you think AMD does not consider new consoles when designing new cards, please let me know why. All the big companies, including Google, Amazon and Meta, are working together to be rid of Nvidia dependence for RT and CUDA cores; that's why AMD is going open on some stuff. Microsoft, Sony and AMD are all doing lots of research, patents etc., if you follow the news. New consoles will need to stay under a certain power draw. Yes, the details and specs are way off, but maybe you know Sony and MS are not seeking general guidelines. Microsoft is demanding so much AI power in coming chips, and we still haven't seen the use case.
Most definitely AMD knows it has to aim for X capability, at Y power draw, Z heat output and U size.
Plus they will let AMD know what extra stuff it will need, e.g. AI space, codecs etc.
What do all these Zen 5, 6, 7 roadmaps mean? Why bother that far out?
I am just dying to move over to AMD for graphics and give the finger to Nvidia. It could not happen soon enough.
I feel like the rumors are always AMD, in 2 years, will beat what Nvidia can do today.
Hey, at least with the 6000 series AMD tried at the high end; with the 7000 series AMD didn't GAF about competing with the 4090. 300mm² for the GCD while charging $1000 was a crime, and no, power wasn't an issue: they could have adjusted shader clock speeds and increased the TDP to 400W, because why tf not when AIBs and 4090s already go beyond that.
LOL, they say this with EVERY RDNA release leak 😂😂😂
I think we should all be concerned about energy efficiency. Yes, it's all rumor at this point, but it's better to be prepared than otherwise.
It's been widely spoken about that RDNA 3's MCM structure didn't quite live up to expectations or design targets.
You know it’s clickbait when the word AMD and dominate is used in the same sentence.
It's the title of a RUclips video; you don't need to read it to know it's clickbait.
Tbh every company has a product that dominates in certain scenario / price segment.
There's more to hardware than just pure computational power
This is so true...LOL!!!
You mean AMD Radeon, because Ryzen is selling like hotcakes
The only place where Nvidia dominates is in features to hide the fact that they are ripping off their customers! AMD cards are 15% faster, so Nvidia has 5 features to support their "pretend-fast" cards...
Okay. I think everyone knows that the moment AMD irons out the GPU chiplet problem, that's that: they get multi-die GPUs, and even if it doesn't scale linearly they can throw chiplets at the problem and it will still cost less than a monolithic die. Without chiplets they cannot build a die that directly competes with Nvidia's high end. With them, they can blow any monolithic Nvidia design out of the water for far less $$$ than Nvidia can build their monolithic die for. So why has everyone said "RDNA X is going to be the architecture that blows Nvidia out of the water!"? The answer is obvious: since RDNA 2, AMD has been *trying* to do chiplet GPUs. They have poured resources and effort into it every single generation, and every single generation it hasn't panned out. It really is *that simple* when you think about it. Designs and prototypes have existed; people have worked *heavily* on these projects, and people have talked about it because it is being worked on. And after the success of Zen - the flexibility chiplet compute gets them, the absolutely bonkers cost savings, the reduction in defect-induced silicon loss, the wafer efficiency, etc. - how can people NOT believe that when this comes to GPUs the exact same principles will apply? Even Nvidia knows this, as they're working on their own GPU chiplet architecture. Hell, Intel has been working on chiplet designs since the Pentium days, and AMD had chiplet dreams back when they developed the Athlon. It was not a trivial issue; it is very difficult and has taken a very long time. And if AMD gets this out by 2027, then they'll probably be a step or two *ahead* of Nvidia for a bit until Nvidia gets its own footing in the race. They haven't had to compete in so long that it's going to be a *shock* for them.
You said AMD was going to have a zen moment for RDNA3... and RDNA4. Given up on RDNA4 already? It's not even out...
"AMD will destroy Nvidia in the next gen".
People have been repeating this same joke over a decade lol.
The top RDNA 4 = 7900 XT = 4070 Ti = 5070 (or maybe even 5060 Ti), so yeah...
@@user-hr4hu8xb5f I went with the 6950 XT, was super unhappy, and sold it for a 4080S... AMD is a joke for the high end. Maybe they are good for the lower range, but like...
@@gorgono1 For esports-title gamers, Radeon is fine imo.
It's just the lack of features that makes it weird for the high end.
What segment? Low end, mid tier, or high end?
You already know the answer for the high end, which is that AMD isn't going to bother making high-end products.
With RDNA 4, AMD scrapping MCM, AMD improving RT performance, AND Microsoft, Sony and about every game developer telling AMD they want temporal upscaling because FSR ain't cutting it, we'll see how it does in the tiers AMD is going to compete in.
AMD was going down a path they had to step off of because of latency, which KILLS GPU performance.
I'll be interested to see the difference between the PS5P in RT and AI upscaling vs. RDNA 4. I'll assume AMD knows they will cease to be in the graphics business if Intel is beating them with AI and RT on desktop GPUs.
So RDNA 5 had BETTER be a Zen moment, but Ryzen didn't beat its competition until a few years after coming into existence. AMD isn't going to beat Nvidia again, though. RDNA 2 DID beat its competition in raster performance (you can watch all the 6950 XT vs. 3090 Ti videos done within the last 1.5 years and see it), but that wasn't because Nvidia had a bad design; it was because they chose to use Samsung instead of TSMC.
Remember that ray-tracing compute IS very parallelizable and not latency-sensitive compared to raster. Chiplets make more sense to implement for ray tracing than for raster (due to the latency drawbacks of chiplets versus monolithic chips).
"Chiplet-level Crossfire" would probably work better than a classical multi-GPU setup; however, there are many issues and challenges in dividing raster over several chips, like frame pacing, or that you would need the same amount of memory per chip (i.e. with 2 compute chiplets you would need 2x total memory) unless there were some unified memory architecture... which has its own challenges for raster.
Maybe RDNA 5 will have some interesting tech that can solve some of the issues of a chiplet-based architecture for raster compute; AMD does have clever engineers.
Still, I believe ray tracing will be the big winner for chiplets, because it can scale to near 100% without many of the issues raster faces on multi-chip/multi-GPU setups.
RDNA 5 is gonna be so good, but I heard RDNA 4 is just gonna be a refined version of RDNA 3.
AMD is going to give Intel a real run for their money in the GPU segment.
I promise you the melting power connector problem will be more widespread on the RTX 50 series than the RTX 40 series.
RDNA 2.0 was already Radeon's Ryzen moment; the difference is that Nvidia isn't stuck on 14+++++ nm, and Nvidia fanboys are willing to pay $1000 for a 60-class GPU.
Based Kyou avi is correct
384 is obviously the memory bus width, not the number of CUs.
All I want is a nice 100W-range GPU that comes in an ITX sub-17 cm form factor, with more than 12 GB of VRAM.
As all the people in the comments have pointed out, we've been waiting for that Zen moment for Radeon now since... RDNA 2? If it ever comes, that will be great and benefit all of us; if not... well, at least it seems like Intel also still has not given up on Arc, so that could be another hope 😀.
It will be hard to beat team green; team red has only around 10% of the budget team green has... but hence the chances that Schrödinger's cat is still alive are 50%.
Bring back Crossfire.
It'll be good I think, cause the last time they skipped the high end (RDNA 1) they made good consoles and then matched Nvidia's high end.
nothing wrong with a quick video m8
So a hypothetical 144 WGP GPU would have 288 CUs, 3x what the 7900 XTX has, but sticking with a 384-bit memory bus would mean only ~50% more memory bandwidth. I don't see how they can keep a chip that large fed without a 512+ bit bus or switching to HBM. That's assuming there aren't any major architectural changes (like 1 WGP = 1 CU or something). I can't even imagine how much power that would end up using either...
Probably 144 CUs and not WGPs.
@@OrjonZ Well, all the other chips in that chart were listed with WGP counts, not CUs. There have been plenty of people suggesting CU counts near 300 too. Who knows, maybe it'll be feasible if they go for 192+ MB of Infinity Cache too.
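The compute-versus-bandwidth mismatch in this sub-thread can be sanity-checked with quick arithmetic. Every input here is an assumption pulled from the rumour discussion (144 WGPs, 2 CUs per WGP as on RDNA 3, a 384-bit bus, GDDR7 at roughly 32 Gbps per pin versus 20 Gbps GDDR6 on the 7900 XTX); none of it is confirmed spec:

```python
# Back-of-the-envelope check of the thread's scaling claim.
# All figures below are rumour-thread assumptions, not confirmed specs.

def bandwidth_gbps(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s for a given bus width and per-pin data rate."""
    return bus_width_bits * gbps_per_pin / 8

# 7900 XTX today: 96 CUs, 384-bit bus, 20 Gbps GDDR6
current_cus = 96
current_bw = bandwidth_gbps(384, 20.0)   # 960 GB/s

# Rumoured top RDNA 5 die: 144 WGPs, assuming 2 CUs per WGP
rumoured_cus = 144 * 2                   # 288 CUs
# Same 384-bit bus, but ~32 Gbps GDDR7 (assumed)
rumoured_bw = bandwidth_gbps(384, 32.0)  # 1536 GB/s

print(rumoured_cus / current_cus)        # 3.0x the compute units
print(rumoured_bw / current_bw)          # only 1.6x the bandwidth
```

That is the commenter's point in numbers: roughly 3x the compute but only ~1.6x the bandwidth on the same bus width, which is why a 512-bit bus, HBM, or a much larger Infinity Cache keeps coming up.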
Hopefully me being critical on their lack of competition will make them competitive. 🙏
Massage Parlours are diversifying their business into Chips because that's where the men go
Answers: nope 😂
A Zen moment with ray tracing on? I don't think so, Paul. AMD has been trying to keep up with Nvidia for years; I don't think it's the time, and I don't think they want such a big hassle all of a sudden. But yes, we can try to dream :)). I would be happy if it were at most 15% below the 6090 with RT on, because then prices would drop more.
So it looks like chiplet GPUs had issues for AMD, and Nvidia has also decided to take longer. The 5080 seems to be monolithic too, and the 5090 looks postponed, as there is no need for it until AMD brings RDNA 5's new high end. That could be the reason the 5090 is delayed, as it's planned to be chiplet-based and 512-bit.
We will see, but both companies are trying to figure out how to make chiplet GPUs work efficiently AND scale better.
They should just go crazy and make one dual power-hungry GPU (something like 2x 7900 XT, which would outperform the 4090) and some mid-tier reasonably priced ones. Win-win.
But that won't happen, because reasons.
It's been the same damned discussions for as long as I can remember.
The problem is that all these speculations never take into account that Nvidia is also planning ahead... and with 10x the funds, if not 100x nowadays.
We’ve been hearing this since Radeon “era”…
Some companies stay stagnant, other companies are always chasing others, and some companies have a vision and chase it with everything they have, inventing and pushing new boundaries! I see this happening with the 3 companies mentioned! Lol, at this point the two are chasing the one; I will stay with the one for now, and if you can figure out which company is which, then you already know the best one!
It's sad that RDNA 4 (which hasn't launched) isn't even in the conversation. RDNA 5 is meant to beat Blackwell? OK, but by the time it launches, nVidia will be ready to launch its successor. AMD always seems to be half a step behind nVidia.
The only talking point about RDNA 4 is that the 8800 XT would be another 480/580/5700 XT moment, offering great value at the mid range.
P.S. I'm considering upgrading to it from a 5700 XT, depending on benchmarks.
More than half a step.
RDNA 5 wasn't intended to go against Nvidia's 50 series; it was intended to go against the 60 series. From the trend of the rumors, AMD is pushing the 8700 XT out as a stopgap and pulling the RDNA 5 launch forward by a year. The last time they did this it was the HD 7970, which was well out ahead of Nvidia.
@@PineyJustice What you're stating is false, because AMD is behind. How are they competing against two Nvidia GPU architectures that aren't on the market yet?
@@ZackSNetwork The 5090 isn't on the market yet either. That's the point: Nvidia is already working on the 6090 and probably the 7090 too, just as AMD works generations ahead of what's out. They dumped the high-end RDNA 4 cards that were supposed to go against Nvidia's 50 series so they can get the ones that were supposed to compete with the 60 series out sooner. What don't you get about this?
It seems like all of these rumor mills about possible chips coming out in late 2025 / early 2026 are just confusing people who want to buy something now.
No question that Intel, AMD and Nvidia will keep adding products and performance each year, but it is not realistic to expect that the price will be cheaper.
People can buy current hardware now, and especially later in Q3 all the current-gen hardware will be steadily approaching peak-value prices until Black Friday and Christmas; then hopefully between Q4 '24 and Q1 '25 new GPUs will be released and available from both companies.
@@DakuHonoo Except the presentation said release in late 2025, not 2024
The RTX 5090 is also going to be MCM in my opinion, considering it is said to use a 512-bit bus and is twice the size of GB203, meaning they will combine two GB203 dies to make the 5090.
The data-centre Blackwell GPUs are also based on combining GPUs, so it makes sense to try the same thing with the 5090.
And since the Blackwell architecture is not using a 3nm process node, it further makes sense that they will combine two GB203s (96 SMs) to make the RTX 5090 on GB202 (192 SMs, exactly double GB203).
There is no way Nvidia will let AMD take the performance crown in any aspect, whether it is AI, data centre or gaming, even if Nvidia GPUs cost much more than AMD's.
Also, if AMD, a company 10 times smaller than Nvidia, can make MCM-based gaming GPUs, Nvidia can definitely do it quite easily.
IMHO, the 5080 and 5090 are going to be purely AI cards, completely irrelevant for gaming since you won't be able to get them. Nvidia will try to push all of those into China, much the same as they do the 4090. The 5080 is more than likely designed around the limits imposed by the shipping restrictions, so if 5090s get stopped they still have the 5080 to cover it. I'd expect them to fight to keep the 5080 at $1600 too, but ultimately settle at something like $1400 or so.
@@bionicseaserpent Reading from your cereal bowl, ain't it?
@@darkfire3691 Those are pretty sound expectations; production of silicon is limited, and the new fabs in the US (and the EU, which lags behind even more) aren't online or even built yet.
NVIDIA isn't 10x bigger than AMD. They actually have about the same amount of annual revenue. Intel actually has about the same revenue as NVIDIA and AMD combined. Although, NVIDIA does have almost double the number of employees of AMD. But Intel has 4x more employees than AMD and NVIDIA combined. The main difference is NVIDIA manages to be much more profitable than the other two companies, which is why the market cap is so much higher.
Yeah the rumors about the die sizes are strongly suggesting to me that GB202 is just two GB203 dies. The die connection tech they revealed with GB100 sounds like it should work just fine for gaming too, and it would reduce manufacturing costs.
Thanks for Funny video
From dead to Dominate in 24h 🤣🤣🤣
When AMD graphics can handle dual or triple screen setups with each monitor at a different resolution, then let the mass PC market know. As of now, just concentrating on game performance will not sell their cards to the general public.
Pretty sure Windows won't allow that, not the graphics cards.
@@robertmajors1737 Well, it works with Nvidia, but not reliably with AMD. Windows 11 is quite happy to oblige.
@@tomquimby8669 I'm thinking about refresh rates, not resolution, my mistake. Apparently that has been updated for Windows 11 also, I'm still using Windows 10.
It's so far off though - RDNA 5 is like 3 years in the future. The market will be quite different by then. Hopefully Intel will also still be around in GPUs by then; that could cause some real fireworks if all 3 have good GPUs out. Still... we'll see. Much can happen.
That is a really good question, I can already answer it : NO !
Nvidia will be the first to bring MCM GPU to PC gaming with Blackwell, 10 years in the making.
I mean, RDNA 4 is next, and we may as well write that off; AMD's marketing department is still "developing", aren't they?
RDNA 4 is definitely not going to be a write-off; there is every chance the cards could be very cheap, likely between $400 and $500 for just under 4080 performance. RDNA 5 is being released early because of this, almost like a delayed next gen with a refresh when the next launch should happen. A 240mm² or smaller chip could end up being really cheap; that's the same die size as the 6600 XT, which launched at $379, and the 7600 XT is $329 MSRP with the same die size. If they can bring 7900 XT performance into that price bracket, it will be a huge winner.
So Blackwell won't be competitive again? Disappointing.
🗿Gotta have low expectations when it comes to AMD GPUs
By the time this GPU comes out hopefully the AI bubble will have burst (much like crypto did), and the average consumer will be able to pick them up much cheaper than the outrageous $2000 they are currently asking.
It matters not what Radeon comes up with. Somehow Nvidia has inside knowledge of what AMD is building. You see, Nvidia not only wants to dominate this space; they are out-and-out attempting to put Radeon out of business. Nvidia does not want competition, as that would allow them to demand an even more insane pricing structure than the one they already have out there.
Only if they are at least on par with NVIDIA
And that will be very hard to achieve.
That shouldn't be too difficult, because every card under the 5090 will be equal to or 20% stronger than the 40 series. AMD needs to work on their software suite to be competitive with Nvidia next gen: not only raster but FSR, frame gen and ray tracing.
They were on par with RDNA 2. Delusional take, buddy.
@@Joker-no1fz Dude, the only reason AMD was remotely close to the 30 series was that the Samsung 8nm process was trash; that's why Nvidia switched to TSMC and smoked AMD.
@@ISeeQuality-ox8ku You're forgetting that AMD also had, as usual, smaller die-sizes than Nvidia - the larger die-size makes up for some of the short-comings of the Samsung node, you know - still more transistors. As such, I do feel AMD reached parity with Nvidia, *for real* that generation. RDNA 3's downfall, was, sadly, the chiplet-technology - it wasn't quite ready yet. Had it been monolithic, I'm convinced AMD would still have had parity with Nvidia - except for the 4090, since AMD wouldn't make such a large die, but everything else, yes.
@@ISeeQuality-ox8ku Doesn't change the fact they were on par. Stay salty; go buy Nvidia every gen.
just give us RX 480 but better
JustWait™
You wish, I wish, we all wish it. But once again it won't happen.
RDNA 3 is built with a bottleneck: it had a theoretical 100% increase in IPC going from single-issue per clock to dual-issue per clock, but only shows a ~7% increase in IPC and an 8-10% increase per RT accelerator. The tiny front end RDNA 2 had was ported over to RDNA 3, causing a massive bottleneck. Until the front end gets doubled in size, RDNA 4 and RDNA 5 will never come close to Nvidia in RT or shader efficiency. Just look at the tiny increase in RT cores and shaders: they knew it was starved for instructions, so they limited the increase in shaders and RT cores, and the only thing they really increased was ROPs compared to RDNA 2.
Dude, there are like 1 or 2 instruction types that AMD managed to get working in dual-issue mode, and even those are very sparsely used; the compiler is simply not smart enough to take advantage of them.
The dual-issue capability failed miserably. It simply doesn't work for most gaming workloads.
That 60 TFLOPs number is misleading for the general use case.
@@harshivpatel6238 Of course it barely uses it; there isn't enough room in the front end or the L0/L1 caches to feed any more than that.
@@kevinerbs2778 What's the front end that you refer to?
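One intuition for the dual-issue complaints in this sub-thread: two operations can only co-issue if they are independent, and shader code is full of dependency chains. This toy model is purely illustrative (it ignores RDNA 3's actual VOPD opcode and operand-bank restrictions, which are stricter still), but it shows why the theoretical 2x rarely materialises:

```python
# Toy dual-issue model: a pair of adjacent instructions issues in one cycle
# only when the second doesn't depend on the first. This is a simplification;
# real RDNA 3 VOPD pairing has far more constraints (opcodes, register banks).

def dual_issue_cycles(deps):
    """deps[i] is True if instruction i depends on instruction i-1."""
    cycles = 0
    i = 0
    while i < len(deps):
        if i + 1 < len(deps) and not deps[i + 1]:
            i += 2          # pair two independent instructions in one cycle
        else:
            i += 1          # dependent (or last) instruction issues alone
        cycles += 1
    return cycles

# A dependency-heavy stream: almost every op feeds the next one.
chain = [False] + [True] * 9            # 10 instructions, 9 dependent
print(dual_issue_cycles(chain))         # 10 cycles: no pairing possible

# A fully independent stream pairs perfectly.
independent = [False] * 10
print(dual_issue_cycles(independent))   # 5 cycles: full 2x throughput
```

Under this (simplified) model, realised IPC sits between 1x and 2x depending entirely on how much independence the compiler can find, which matches the "only ~7% in practice" complaint above.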
Srsly, is RDNA 5 that much more powerful?? If we assume 144 WGPs, the top RDNA 5 die is at least gonna have 200 teraflops of FP32 😳😳😳
Be ready to be disappointed emoji man.
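For what it's worth, the ~200 TFLOPs guess above is just peak-rate arithmetic. A quick sketch, where every input (2 CUs per WGP, 64 stream processors per CU, a ~2.7 GHz clock, and RDNA 3-style dual-issue FP32) is an assumption for illustration, not a leaked spec:

```python
# Hypothetical peak FP32 throughput for the rumoured 144-WGP die.
# All inputs below are assumptions for illustration, not confirmed specs.
wgps = 144
cus = wgps * 2               # assuming 2 CUs per WGP, as in RDNA 3
shaders = cus * 64           # 64 stream processors per CU -> 18,432
clock_ghz = 2.7              # assumed boost clock
flops_per_cycle = 2          # one fused multiply-add counts as 2 FLOPs
dual_issue = 2               # dual-issue doubles the theoretical FP32 rate

tflops = shaders * clock_ghz * flops_per_cycle * dual_issue / 1000
print(f"~{tflops:.0f} TFLOPs peak FP32")  # ~199 TFLOPs
```

As the dual-issue discussion elsewhere in the thread notes, that doubling rarely shows up in real games, so the headline number would overstate practical performance by roughly 2x.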
Kind of too late; when Rubin releases it's supposed to target RDNA 6, and RDNA 5 is gonna get squashed by Blackwell. And yes, the rumor has come true, like I've been saying: AMD is dropping the Radeon name entirely and making huge changes to the group name, so it will no longer be called Radeon. It's going to be based on AMD GRAPHICS AI, and the architecture is no longer Navi; it died with Raja Koduri. Finally, farewell and good riddance to the trouble of AMD keeping the flawed Navi architecture updated, with its power-curve, clock-stability and temperature issues that haven't gone away since Navi 1.
Rest assured that AMD will either not release on time, not deliver a commanding lead in performance, or Nvidia will make whatever refresh is necessary to stay on top. Nvidia will even rush whatever follows Blackwell if need be. Best case for AMD, they get the crown for six months. I say all this as an EXCLUSIVE AMD user who will never buy anything Nvidia makes if I can help it. (Don't really have a choice in laptops.)
Too little, too late; RDNA 4 should be a new architecture.
Radeon: always overhyped, always under-delivering.
The only dominating done in reference to AMD is AMD getting dominated by NVidia.
Radeon won't ever have a "Zen moment" because Nvidia is not complacent like Intel.
Paul... Vlad.... Dracula AMD will never EVER beat Nvidia again. But good vid.
Well, better than most fanboys, who cannot even admit the times AMD spanked Nvidia. Just remember, people have said this before and were wrong, just like they have on CPUs vs Intel. Plus Nvidia needs to stop using bad connector designs and wrecking people's $2k GPUs.
@@kaseyboles30 We are not fanboys we are just living in reality.
I want AMD to be able to compete.
And I hope they stop holding back to do so.
But they are not given the resources to even try and you know it.
NVIDIA is raking in cash on such a ludicrous level that they can outspend AMD to stay ahead whenever they need to. That is why every new substantial feature comes from them first. And then AMD comes along as just a cheaper also-ran with lower performance.
The past AMD wins does not matter in the slightest right now.
AMD is falling further and further behind because they are not even trying to stay ahead anymore.
@@cajampa admittedly they do have to be careful. If a price war gets sparked Nvidia has a much larger war chest to outlast AMD, and has fewer worries than Intel does if AMD goes away. Still AMD has gone from 2.5b to 250b in ten years. In a few years Also right now AMD is getting much better financial results in other markets.
As they grow however they will be able to put more effort into gpu tech.
@@kaseyboles30 I wish you were correct and that they would start heavily investing in this space again.
But I am realistic; I don't think they will. I think the money they can earn from selling AI datacenter chips, and Zen to datacenters as well, is so much more than the low margins they can get as an underdog to NVIDIA in the consumer GPU market.
It probably means they will be satisfied to just hold on and take care of the console business, and all we'll get is an afterthought and scraps from the table of Lisa's attention and the money allocated to the GPU division.
I hope I am wrong and they will start pushing like they have been doing with Zen, but so far I see zero such ambition.
They would need several generations of really pushing features, performance and price for me to start believing they are actually in the game again.
And they are so far behind on just one massive thing like CUDA that I don't see how they can get to par with, or ahead of, NVIDIA.
@@cajampa Thing is, there is a huge overlap between GPUs and AI accelerators; research on one benefits the other. The other thing is the AI boom will slow down. Also, ROCm may be a huge step forward versus the proprietary CUDA crap. That said, they really do need to push compatibility for their GPUs into more software outside gaming.
AMD's Zen moment for GPUs? No. Their Zen moment in CPUs required Intel to slack for several years, in combination with AMD's technological progress. Nvidia never slacks; thus AMD will never have a Zen moment on the GPU front.
Most likely not. RDNA 4 is not even out and RDNA 5 will be faster? I call BS; Nvidia has a grip on performance.
Yeah, how is RDNA 5 going to compete against the 60 series? AMD lost against the 40 series and has been losing for the past 10 years.
Nvidia: look at this, the most powerful GPU.
AMD: just you wait.
lmao, are we just going to skip 1 whole generation on the AMD hype train?
Oh no, I think Nvidia should run for the hills AGAIN because RDNA 5 is going to kill everything. 😂
That's not the question to ask. The question is "Can AMD finally compete?"
At the very high end last gen, AMD's 6950 XT 16 GB competed fine with the 3090 Ti 24 GB at half Nvidia's price.
At the higher end of the current gen, the 7900 XT 20 GB and 7900 XTX 24 GB also compete well with the 4070 Ti 12 GB / 4080 16 GB at a lower price, with more VRAM but inferior RT.
RDNA 6 will be the Ryzen moment.
Because of the PS6 and RDNA 6.
Fanboys are already giving up on RDNA 4 and going straight to hyping RDNA 5, rofl.
I expect AMD to provide competition to Nvidia to some degree so it doesn't become a monopoly, but I don't expect AMD to win. They just can't win against Nvidia. Yes, I stick with Nvidia all the time, but AMD, please stay in business as long as you can. :(
If you stick with Nvidia all the time, that is the problem. "Please AMD, stay in business so I can buy Nvidia at only very expensive prices." If 100% of people were like you, we'd have a 4060 at $1000 USD...
@@isanvicente1974 meanie. 1 more customer won't change the outcome. 🥲
RDNA 5: Can AMD Finally Dominate GPUs?
NO
Even with the rumors. I still think AMD will drop a high-end card
No.
I know things change, but x years ago I tried AMD CPUs, and both times I ran into issues with them not working, or not working well, with certain apps, so I gave up trying to use them.
When they give me a 100% money-back guarantee that there are zero app issues, I'll consider them again, but I don't see that guarantee.
The whole comment section is people who would never even consider buying an AMD GPU calling them garbage. It's delusional.
It's not that AMD GPUs are garbage; it's that every year they get hyped up to the moon, and it always ends in disappointment when they fall short.
@@03chrisv *shrugs* I actually didn't think RDNA2 fell short -- it did what we wanted, hence why I bought a used RX 6800.
@predabot__6778 You might not have, but every generation people think AMD will demolish nVidia and it never happens. AMD makes good GPUs, but I don't foresee them overtaking nVidia anytime soon, especially as path tracing and more machine learning become the norm.
@@03chrisv Well, I have to admit that I thought RDNA 3 would be AMD's Zen 2 moment, since AMD themselves seemed to think so, and the idea of bigger GPUs with chiplets seemed like a no-brainer - before I knew about the issues the architecture apparently has. You seem to be correct that AMD probably can't overtake Nvidia now that they're on to them, but I certainly think they can reach complete parity fairly soon.
We all know that the answer is sadly NO.
AMD can never ever "dominate", not while Jensen, a laser-focused man, is the CEO.
?
People said this about their CPUs once. Never say never.
@@AshtonCoolman Doubt it. I hate Nvidia for the prices, but they aren't resting on their laurels. Scott Herkelman directly said Radeon even waits for Nvidia to decide their pricing, which means they don't even have confidence in RDNA.
@@AshtonCoolman You're delusional. People have been saying this for 10 years now. AMD always fails to compete against high-end Nvidia GPUs.
Time for AMD to release a GPU for 200 USD with 2080 performance. Who cares about halo products? They should emphasize entry level and mainstream. And no, APUs are not a good solution for AAA gaming.
AMD already had its Zen moment with RDNA 2. But it's no use if you can't play at the top in the long term. An answer to Nvidia every few generations is not enough if there is a drought for years afterwards.
AMD is like Microsoft. They are trying too hard to beat Apple, but it's inevitable that Apple will always win.
As with AMD, they are a very small company compared to Nvidia. They can try with whatever they have, but they will NEVER beat Nvidia. AMD doesn't want to invest in their gaming GPUs like Nvidia does. If one day they decide to invest in their GPUs, they might be on par with Nvidia.
AMD is just a BIG disappointment 😞
I bet AMD is going to replace the Radeon name with something like "AMD Instinct Gaming AI 900XT" to save face over investor anger at losing $3 billion on the Radeon group... It's very possible the Radeon group merges with the AMD Instinct group and the entire Radeon brand name gets dropped, unified under Instinct with a sub-division for the gaming dept making Instinct gaming cards, since RDNA 5 is supposedly based on CDNA 3... not Navi anymore... AMD might just end it, given the many failures by now.
lol, AMD is dominating every time, but only in rumors.
Realistically, they should be like 40% cheaper than Nvidia counterparts in the same class, and then we can talk. AMD cards simply lack the software support to be considered for anything other than casual gaming.
Short Answer: No
Long Answer: Nooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooo
BS, AMD is not even trying.
And you all know it.
They just follow.
And Nvidia has WAY more in the tank if AMD ever came close.
AMD needs to start focusing on the GPU side of things like they did with Zen, many times over. But they don't, so it will not happen.
All these hype BS rumors happen before every release, and it is ALWAYS a disappointment when the hype falls way short of reality.
They are focusing on the GPU side, just not dedicated cards, since they are moving towards handheld and embedded solutions.
@@sleepingvalley8340 I agree that they are moving their focus away from the client market and towards embedded, custom and, I will add, data center chips; it has been very obvious for a long time.
That is why it is frustrating every time they try to start up another hype train that always ends up falling short for the consumer GPU cards.
@@cajampa I am extremely hyped for their Strix Halo APU. 40 CUs in such a small package will be awesome; I can't wait to see what Minisforum will do with it. But I have to agree, it is frustrating. I wish they stopped hyping the performance and started hyping their SFF handheld products, which are far more interesting and affordable than the crap out on the market today.
@@sleepingvalley8340 Agreed again. It looks very very interesting.
Why it looks so good to me is that we will possibly have a decent system with at least 64 GB, and hopefully up to many times that, of GPU-addressable VRAM, able to run massive gen-AI models at decent speeds.
I know a lot of people will try them out for such use cases, and I really look forward to seeing how they will perform. But... then again, AMD can screw us on the software side again with weak support.
And it is dangerous to be hyped about an AMD GPU product... they tend to disappoint. I hope we get a good one this time.
At least it is a promising path forward. The GPU and CPU plus a capable tensor/matrix/NPU unit, or whatever they will call it when it is time, really need to be combined sooner or later.
With lots of memory on the package.
It has always been where we are headed; it just takes so damn long to get there.
AMD doesn't have the software stack to "dominate".
Have you ever worked with ROCm or HIP?
@@TheTaurus104 LOL and that is why AMD can't compete.
And CUDA is king.
Bingo. AMD falls short on software when you compare them with Nvidia, and that's the biggest reason gamers choose team Green over Red. The price-to-performance argument means nothing if you have inferior software; people are willing to pay extra for a seamless experience over trying to play doctor with temperamental AMD.
@@cajampa that's what ROCm aims to fix
@@russellwilson619 Of course it is.
But like so much with AMD GPUs, it is pretty much a failure.
It is very badly supported; they keep dropping support for not-very-old hardware in newer versions, and they don't even have full support for all new hardware.
As I said, Nvidia's CUDA support is king; it is so much better supported that it is not even in the same league. ROCm is a joke compared to how well CUDA works on Nvidia hardware.
I wish AMD would fix their stuff; I want to be able to use their hardware for such things.
But it is such a mess that most just don't want to bother with them.
"RDNA 5: Can AMD Finally Dominate GPUs?" Seriously? It's not even on AMD's plate to deliver what this video is hyping, cringing about "OMG, RDNA 5 will wipe Nvidia's ###."
AMD lost the current generation and has already given up on the next generation, but trust me, the next next generation is going to be a real winner!
🤣
"RDNA X is the Zen moment for AMD GPUs"... I think I have heard that one before.
NV is not Intel... they don't go to sleep, do a 0-5% bump, call it a new gen, *and* keep doing it for 6 years.
** No, they didn't do it with the 4060, because it's not actually a 4060; it's a 4050, and they could get away with it because of poor competition.
Not poor competition but a lack of competition. If you look at AMD's current products, they are slowly moving away from dedicated GPUs and looking toward embedded solutions. Why compete when you can just go after an entirely different market? I am currently just waiting for their APUs to come out so that I can grab a 40 CU mini PC.
Nvidia has like 85% of the market. AMD is on life support
Ah yes... "poor Volta" all over again. AMD is non-existent in the RT/PT space. I've seen nothing yet to prove AMD ever will be, compared to Nvidia. With RDNA 5 being 2026 or so, we are looking at Nvidia's next gen after Blackwell as the competition. By then Nvidia will do multi-chiplet as well. So what exactly does AMD have to compete with?
You are a bad orator.
This guy is a phony. Shut up about AMD Radeon gpu already. It sucks and it will continue to suck.
AMD will never dominate the GPU market. Nvidia is the Apple of the GPU market. Much like the iPhone, Nvidia's GPUs aren't really any better than AMD's, but Nvidia spends billions on advertising. While the 4080 is a fantastic card, it's $1000 and only beats the 7900 XT in some RT games. And the 4090 is useless unless you make a living editing professional videos (not as a youtuber).
So that leaves 95% of the GPU market, and nothing below the 4080 can beat its cheaper AMD competitor.
Again, like the iPhone: Samsung has had the best devices on the market for 15+ years, yet people still waste thousands.
First 👍
One. 2nd
You AMD folks are adorable. Gluttons for disappointment.
The 5080 will be a 750W card with basically 4090 performance. AMD is way too far behind. DLSS is waaaay ahead.
If the 5080 is going to be 750W, then I'll literally change my name to "cuckman".
Bro, you never even turned on FSR in any game, stfu. They are almost the same.
Nvidia has too strong a base to build from. The 40 series was so efficient compared to RDNA 3. I just believe the 5080 will be 85-90% of a 4090's performance but at 65-75% of the power draw. No way in hell is AMD touching that.
This dude is all clickbait. Idk how you can have a channel of BS and rumors that aren't ever accurate or even enticing.
Nope. They can't.
RDNA1? Garbage.
RDNA2? Garbage.
RDNA3? Garbage.
RDNA4? AMD have already admitted it's Garbage.
RDNA5? I'm predicting ground up Garbage.
Garbage, lol. So delusional. You would never even use an AMD GPU, so your opinion does not matter.