AMD will absolutely blow it. I've never been disappointed by a product I've bought from them, but their greed keeps landing them in second place. They don't have the same install base as Nvidia, and never will if they don't take a full generation and undercut the shit out of Nvidia.
Seems like every time they're about to have a shot at getting ahead it turns out they were aiming for their foot. Glad the leak is good news, but I am tempering expectations until we see reviews
The only thing I care about for RDNA 4 is a return to sane GPU pricing from at least one company. RDNA 3 was a hot mess at MSRP, and I hope that mistake isn't repeated.
AMD never fails to be a disappointment. Given their shitty Zen 5 launch, I'm not gonna bet on AMD self-correcting; they're still going to chase Nvidia-level pricing at launch.
@@lyserberg Even the RTX 40 series eventually adjusted. In my region AMD is still a hard sell, since their price-to-performance ratio is only slightly better than Nvidia's while not providing better premium features, plus drawing more power.
Now they really need to score with this one, because their GPU sales have decreased this year. Also, they are most likely targeting Intel Battlemage, which is predicted to release close to this date.
I recommend High Yield's video about RDNA3 expectations vs reality. I think they aim to at least reach what was promised then, not more, which would give us about 25% better performance than the 7800 XT at the same CU count and bring it to 7900 XT level.
I hope Lisa has had a stern talking-to after RDNA3 and Zen5. If they just revert to RDNA2 and Zen4 marketing strategies nobody will complain about anything.
@@Loundsify That's generous. More like 300 more; then after it bombs for a few weeks they will lower it to 200 more, then after 3 months of awful sales they will keep it 100 more than it should be.
@@martineyles It looks like 2 to 3 years, actually. I'm not saying it's right, but that sure looks like it. In the last 3 years, $500 has been budget. I don't make the rules.
Everyone who was going to pounce on this generation already has. It's the same every single cycle. I've seen this play out over several decades. They can lower prices all they want by now, and only improve sales a little bit. The value crowd is waiting for next gen to come out, so current gen gets heavily discounted, and the enthusiast crowd isn't going to buy nearly 2 year old products now.
@@andersjjensen I disagree. People love bargain sales, let's say 25-50% off, but these products never get close to that. So if they're stuck with them, they get what they deserve; performance electronics don't age like wine.
@@hotdog9262 Yeah, you're in the value crowd. That's what I just said. You'll take a 2 years old card at enough of a discount because you expect that to be better perf/$ than next gen. Or you'll wait, because the discount doesn't happen until next gen is actually out, and do your calculations then. But it will never be 50% off. The 6750XT launched at $550 and is now at $320, which is probably at break-even for AMD with only a few dollars for the AIBs.
So they’re trying for a Ryzen 9000X3D + RX8000 double whammy, maybe this time around the Radeon Group actually delivers and doesn’t just send a douche in a sports jersey on stage.
I'm currently using a 5000 series chip and was considering an upgrade to a 9000 series chip but oh boy did they lose some of my trust with those overblown marketing numbers.
@@crassusmaximus5879 Agreed. Unless you're aiming for 7900 XTX or 4090 levels of performance, you should hold out for AM6; the 5000 series will be more than sufficient until that arrives, a long time from now.
I have a 13600kf fml, and RTX 4080 so unfortunately this doesn't have the horsepower. However, now I'm very excited for a RDNA 5 and my next PC will be the 10000x3d? and 9900xtxt? Whatever dumb naming scheme AMD decides lol.
I just hope RDNA 4 goes back to power efficiency again. RX 6600 consumed about 100 watts, but then for some reason the 7600 consumes more than 160 watts.
The RX7600 uses the full die, it's the successor of the 6650XT but they didn't want to call it as such because it was just ~2% faster. They didn't release a cut down lower power version because they still had so many unsold last gen cards. The RX7600 is probably the biggest failure of this GPU generation.
@@hyperixz You know you can always power limit it, right? Of course, you'd have to accept the loss in performance. It's not like AMD could have just decided to release the card at the same price and performance but using 60 watts less energy. Something has to give. If you want the card to use less energy just tell it to in settings. But I suspect like most people you don't want to give up the performance... So that really means AMD made the right choice to boost the power, their product overall just isn't as good as people wanted.
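For what it's worth, "just tell it to in settings" really is that simple on Linux: the amdgpu driver exposes the board power limit through hwmon, so capping the card is a one-file write. A minimal sketch, with the caveat that the hwmon path is an assumption that varies per system and writing it requires root (`power1_cap` takes microwatts):

```python
from pathlib import Path

def watts_to_microwatts(watts: int) -> int:
    # The amdgpu hwmon interface expresses power limits in microwatts.
    return watts * 1_000_000

def set_power_cap(hwmon_dir: Path, watts: int) -> None:
    # Writing power1_cap (as root) asks the driver to cap board power.
    # hwmon_dir is e.g. /sys/class/drm/card0/device/hwmon/hwmon3 -- find yours.
    (hwmon_dir / "power1_cap").write_text(str(watts_to_microwatts(watts)))

# Example: cap an RX 7600 from its stock ~165 W down to 100 W:
# set_power_cap(Path("/sys/class/drm/card0/device/hwmon/hwmon3"), 100)
```

On Windows the equivalent is the power limit slider in Adrenalin's tuning tab, which is presumably the "settings" this reply is referring to.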
@@Loundsify It will get replaced by the smaller RDNA4 chip. The 7600 is a very silly card, due to the high power draw the board and the cooler costs more than the ones used for the RTX4060.
Now the production stop of the 7900xtx makes more sense too. That 6-9 months ahead of release sounded a bit long. I gotta wonder though how much fab capacity they will be willing to give Radeon while also trying to get ahead vs Intel in Laptop. Addendum: announcing at Gamescom would give them some positive news to distract from the clown show that is Zen 5.
@@andersjjensen Heh heh... that's EXTREMELY good news! ^^ That means the silicon manufacturing industry may have done the GOODY of building over-capacity after the chip-shortage and we can now all enjoy lower chip-prices and GOOD availability in the years to come.
@@predabot__6778 I don't think prices will improve much. Wafer cost is going up with each node for entirely technical reasons. Basically EUV arrived a decade late, and High-NA EUV is still not ready, while also being a bit of a crutch.
The RX 6600/6600 XT were the replacement for that: 2x the performance at +50% the price, and most recently at the same price, not to mention a massive efficiency improvement. You can also get the 7600M eGPU, which I would wait to get cheaper. Hopefully there's a greater OCuLink push in the future.
@@martineyles The 480 was low tier; its natural replacement was the 6600 or 6600 XT, not an 80-class card. Of course it's going to be more expensive, but it's also going to be like 8 or 9 times faster...
They NEED to leverage their extra "ai" type cores for an FSR update. It's already in the 7000 series, and will likely be in 8000 series. If the ray tracing is around a 4080 with less power draw I'm already interested in it, and if they did that I'd pick it up in a heartbeat.
honestly the radeon 7000 series was also them being sure it would perform insanely well, but had problems they couldn't fix at the last minute. the ryzen 9000 story sounded too familiar in that context.
Usually AMD products need at least 3-6 months in the oven post-release before they're actually somewhere remotely close to what was advertised. Let's see how their marketing manages to fuck up another unfuckupable situation :D
From what I've seen Zen 5 is actually a pretty big uplift - if the OS is optimized for it. Ironically on Intel optimized Linux distro Zen 5 is 20%+ stronger than Zen 4.
If they called it an 8900XTX, people would go mad with joy that "flagship cards" are back down to 500-600 dollars, but at the same time hate it because it's not a 5080 competitor.
AMD would be crazy to brand it an '8900XTX', because that would mean their new "flagship" getting absolutely dumpstered by both the RTX 5080 and 5090. Technically it'd also be beaten by Nvidia's last-gen 4090, which would make the perception even worse.
Whatever they call it, it will be their flagship and it will be pathetic performance compared to nvidia. They don't seem to even be trying to compete at the high end this generation.
@@levigoldson It’s been leaked before and talked about publicly afaik that they are in fact not competing in the high end because the cost of the gpu would be “too expensive”.
@@levigoldson The vast majority of people don't feel like dropping $2K on a GPU. Those top-end cards may as well not exist. The only thing that matters is price-to-performance.
AMD giving us 4070 Ti Super level performance (a $799 card) for $549 would be pretty amazing. I'm not in love with 16GB of RAM for my AI stuff, but that is enough to handle current games in 4K, which is good. If this does happen, it's a huge shift in the midrange that will help a lot of gamers. Thumbs up!
I wouldn't doubt the 4070 Ti Super dropping to that price after next gen launches. If they'd done that during last gen, sure, but being a gen later makes it not as big of a deal.
@@TopNotchPanch looking at what happened with RDNA 3 and Ada Lovelace, with close to no price to performance improvements apart from the extremely high end models, I'd honestly take a 4070 ti super performing graphics card any day of the week
You can't make a higher-VRAM card with this die without resorting to clamshell, which for a mass-market midrange GPU is not cost-effective. AMD will happily sell the W8800 with 32GB of RAM and Pro driver support for more money, though.
@@lucazani2730 I think if you’re a few gens behind or lower tier sure, I have a 6800xt I bought over a year ago (used for ~$400) and this isn’t a big enough jump for me to justify buying although it would depend on RT performance to really say. GDDR7 would’ve definitely helped sway me
Let's keep our expectations extra tempered. The Radeon team might not be the Zen 2-5 team but they have the same marketing department and the duopoly to maintain.
By the time this happens the 4080 will be reduced in price to $800 or less and will still be a superior card due to better upscaling and ray tracing. Y'all are delusional if you think a large percentage of NVIDIA buyers would switch over.
@@Mark_Williams They haven't had a good launch in years. We always get decent pricing in the end because nobody was buying them and the market adjusted the price by itself.
Couldn’t be happier with my 7900 XTX that I got on a sale for 999€. If you use FSR3 Ray Tracing is no problem for it. If they improve Ray Tracing further in RDNA4, this one is gonna be a winner.
@@imlegend8108 Early/mid 2023 when I was looking at prices for the XTX, the best versions were going for $1200-$1300. The cheaper ones were still around $1100. Edit: I'll add that since the XTX production is stopping, the fact you can still get it for MSRP isn't too bad. It went down to around $899 at one point I think and maybe $849 for open box. Just checked PC Part Picker and there are only 3 or 4 models below $999. Again the best versions, like Sapphire Nitro+, go for $999 or more such as AsRock Taichi OC and all of Power Color's ones. The Nitro+ was the one I had my eye on for months but I decided against spending that much on a card right now when I can't guarantee I can make money with it for a while. Have a 6700XT as a filler until maybe this RDNA 4 release.
AMD's marketing department, proverbially on a bike, holding the stick, leaning over and just so desperate to place it right in the spokes hard and fast. Good luck AMD, good, luck. Imagine if competent people there were allowed to make the big decisions, I'm sure there are plenty there, they just aren't at the controls.
A 4080 is around 45-50% faster than a 7800 XT. The 7800 XT has 60 CUs and the 8800 XT 64 CUs. RDNA4 would have to be a gem of an architecture and a huge upgrade over RDNA3 for it to trade blows with a 4080 at those specs... Or is it the next-gen IC that is doing all the work? Come on now, let's be serious...
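The skepticism in that comment can be put in numbers. Taking its own figures (a 4080 roughly 45% faster than a 7800 XT; 60 vs 64 CUs, all rumored rather than confirmed specs), here is a back-of-the-envelope split of where the uplift would have to come from:

```python
# All figures are the commenter's rumored/approximate numbers, not confirmed specs.
cu_ratio = 64 / 60          # the extra CUs alone buy only ~6.7%
target_uplift = 1.45        # low end of the "45-50% faster than a 7800 XT" claim

# Whatever the CU count doesn't explain must come from clocks, architecture,
# and/or the next-gen Infinity Cache the comment mentions.
per_cu_gain_needed = target_uplift / cu_ratio
print(f"arch/clocks must deliver ~{(per_cu_gain_needed - 1) * 100:.0f}% per CU")
```

That works out to roughly 36% more performance per CU at the low end of the claim, which is why the comment ends with "let's be serious".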
4080 is TWO YEARS OLD. It should NOT blow your mind that a new card is "trading blows" with that. GPU progress is ridiculous... If they can't increase the performance, they should focus on price and drivers... There is enough competition going around so it looks a bit like all 3 designers are holding back!
@@nonameentered1918 the overall point is that it should already be at that price.....AMD slow rolls price cuts before the announcement of a new product to anchor the price
Dude they ARE focusing on price and drivers. AMD drivers have been really solid for a long time now and a 30%+ price drop for the same performance is a solid generational increase. Of course, we won't know anything for sure until it comes out, but if the leaks here are true this is definitely exciting news for people not trying to spend $3k on a gaming PC.
So a bit faster than the 7900 XT with less RAM, but RT is usable, priced at £100 off the 7900 XT. The price is high; I have seen the 7900 XT go as low as £620 in the UK, and I think it's about £680ish at the lower end now. In the UK the 8800 XT will start off over £600 if the US price is $500-600. Sounds good for RT lovers; shame about the price.
@MooresLawIsDead That's actually the thing that could be exciting for me, but only if it's 200-220W max. Maybe the 2nd card in the lineup, if it doesn't use more power than an RX 6800. As for the supposed RT uplift... sadly, we heard the same things the last 2 generations and it wasn't true at all.
Nobody in their right mind would be happy to pay anything above $500 after 2 years of similar performance being available with more RAM, at a launch price within a hundred euros/dollars/pounds of what is already out there. Especially as they are trying to win over AMD buyers, who care a lot about RAM and less about a small efficiency improvement. The argument that they will launch it higher to empty out RDNA3 stock is exactly why their market share keeps shrinking. They should eat the loss and try to double their market share by starting out with aggressive pricing.
@@maxs9894 You still have loads of overpriced stock from two generations ago and so many SKUs of the 7000 series. Can they have a big fire sale and clear cards like the 7800 XT and 7900 GRE/XT? If those cards are still around, then the 8800 XT will be overpriced and people will not touch it.
I really, really hope it's not called the 8800 XT unless it beats the 7900 XTX. AMD needs a marketing win, even if the performance means it could meet/exceed an "8700 XT". Calling it the RX 8700 XT solidifies its role as a midrange card at a midrange price. $500, edging out the 7900 XT while using 250 watts? That would look so good for AMD, and reviewers would test it against the 7700 XT, which it would trounce completely. The 7800 XT should have been named the 7800, as it was $150 cheaper than the 6800 XT and even $70 cheaper than the 6800, but it got compared to the 6800 XT because of its name, leading to a very good GPU getting very bad reviews.
All those are pipe dreams. RDNA4 will be like Polaris so expect a 5% better performance from the 8800 XT. Also AMD wouldn't cannibalize their 7900 XT/XTX sales when they decide to not have a high end competitor.
@@laszlozsurka8991 "RDNA4 will be like Polaris so expect a 5% better performance from the 8800 XT" 5% better than what, the 7900 XT? The RX 480 was slightly slower than the R9 Fury X, but way more than 5% faster than its predecessor, the R9 380. We're expecting the RX 8800 XT to compare similarly to the 7900 XTX and 7800 XT, respectively. "Also AMD wouldn't cannibalize their 7900 XT/XTX sales" The sales of a GPU that they aren't making any more because it costs a ton more than Navi 48 to manufacture (another interesting parallel with the Fury X and Polaris), and which they intend to sell out of soon? Why would this be an issue?
@@nathangamble125 5% faster compared to the 7800 XT. As I said, RDNA4 will be like Polaris, and if you compare the RX 480 to its successor, the RX 580, it's a 5% difference in performance. The 8800 XT will be a 7900 GRE in performance. AMD still makes money from selling the 7900 XT and 7900 XTX, so if they made an 8800 XT that matched a 7900 XTX, they couldn't sell the 7900 XTX, because people would buy the cheaper 8800 XT, and AMD also can't charge $800 or more for an 8800 XT. If you don't have a successor on the high end, you don't want the lower-tier GPU to perform too well, otherwise you're going to lose sales on your current high-end product.
@@Etemuss I am playing a lot of new titles as well and have no problems. Use upscaling and can run everything at high/ultra settings. I think you are being unrealistic. You bought a midrange card and expect high/enthusiast level performance.
@@puffyips I can promise you the performance hit going from 1080p to 1440p is marginal at best with this card. In most games it gets the exact same performance; 10 fps difference in the worst scenarios.
"So many times" is just RDNA3. RDNA1 and RDNA2 were right on the money (unless you follow Red Gaming Tech, in which case everything is a major disappointment because he lies through his teeth to get views from people who hope for miracles).
@@andersjjensen RDNA1 was a mess. The performance was there, and for the money it wasn't bad, but it was let down by its drivers; my wife had the 5700 XT Nitro+ SE and it was a disaster. I had a Radeon VII, which was also a mess. The Fury X was disappointing, and so was Vega 64, which I also owned. I now have a 7900 XTX, which I do like; actually I have two, because my wife has one as well. Hers runs fucking great, but mine does have issues here and there. If AMD doesn't get a grip, I might move to nVidia for good.
I don't see myself upgrading from my 7900 XT. I've had it for a year and love it. I could use more ray tracing, but it's not worth an extra $500; a few more frames in ray tracing won't be enough to be a big change.
Not too surprised about the specs; kinda what I expected. The only thing I didn't expect was the October/November release date, so if true, AMD really did hoodwink people about when they were going to release it.
People thinking this is a revolution are insane, the 7900GRE already exists, setting up the memory to match the speeds of the 7900XT, makes it perform virtually the same. This card will just be better at RT. It needs to be $499, can't be more than that.
Techpowerup has the XTX at 37% stronger in raster than the GRE. That is not chump change. Even if the 8800XT is only 90-95% the raster of the XTX, but delivers on the 4070 Ti Super level RT that is still a sizable generation uplift at $599. The GRE's more direct competitor/replacement will be a cut down variant.
@@andersjjensen Even with proper VRAM speeds, matching the 7900XT? That puts it only 15% away from the XTX. 15% more performance plus matching the RT performance of a 4070Ti, can really make a $600 8800XT look that much better than a currently priced $520 7900GRE?
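Putting this exchange's numbers side by side (all third-party or rumored figures quoted from the thread, not mine): TechPowerUp's ~37% XTX-over-GRE raster gap, combined with the 90-95%-of-XTX estimate, implies the 8800 XT's lead over a 7900 GRE would be:

```python
xtx_over_gre = 1.37   # TechPowerUp raster gap cited above (approximate)

for fraction_of_xtx in (0.90, 0.95):
    # Scale the XTX's lead by how much of an XTX the 8800 XT is rumored to be.
    lead_over_gre = fraction_of_xtx * xtx_over_gre
    print(f"at {fraction_of_xtx:.0%} of an XTX: "
          f"~{(lead_over_gre - 1) * 100:.0f}% ahead of a GRE")
```

So roughly 23-30% over the GRE in raster before any RT gains; whether that justifies $600 against a $520 GRE is the actual argument between these two comments.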
That's been fixed ages ago. My XTX is perfectly fine, running anywhere between 10-15W @ 4K 144Hz when idling. Memory downclocks almost instantaneously when the load is gone. And that's on W11, some Linux distros can indeed perform even better.
Unfortunately *not* fixed with multiple monitors with some types of freesync hardware - mine still idles at 90W no matter what the OS is. Actually bothered sending reports in to AMD with it, but I guess some cases count as no cases.
AMD needs to make a 32GB model in order to compete in the AI market. There are already a lot of cheap 16GB GPUs, but anything more than that is extremely expensive. I think that if they launch a 32GB model at $600 or $700, they will steal 5090 sales from people trying to build high-end AI servers on the cheap. At least I would buy a few if the 8800 XT had 32GB.
I suppose its impressive for AMD to come so close to NV in RT, but RT perf as a whole is just so underwhelming that it doesn't excite me when it's supposed to "come close to a 4080 in some games"
Exactly as strong as I'd hoped. I was betting on 4080 at half the MSRP. Hopefully some last minute bug won't show up cutting perf down! I got a good feeling about this one.
Now, are absolutely 100% of the AIB vendors going to throw away the very useful USB-C port for absolutely no reason, as they did with the 7900 series? I sure hope not...
They aren't. All those dies can go to MI300X, which is in SUPER SHORT supply, and they're making money hand over fist even without 10% AI market share. They'd rather make 1000% profit than under 30% with RDNA4.
This is the worst case scenario for RDNA4, they are blowing the launch ALREADY. They don't want marketshare, they want margin and profits. They don't get enough chip supply on the latest node to justify good products at good prices. The whole trick is for AMD to convince everybody they have enough supply.
AMD is dumb if they launch the thing at 549-599. They _need_ to gain market share. No one currently believes in them, period. Whatever they do, they need to put graphics cards in people's hands, to show them that there is a legitimate alternative to NVIDIA, and fast, and the only way to do this, is through _very_ competitive pricing. It's been shown time and time again that if the pricing is not _very_ competitive, people just don't care. "Thanks for bringing Nvidia's prices down, now I'm gonna go buy their stuff" - that's what AMD gets every, single, time, and it's sickening
I don't think that AMD can reduce Nvidia prices either way. Nvidia is more popular & compatible for AI applications and AI makes them MUCH more money right now than gaming ever could. AMD could sell their top GPU for $100 and Nvidia would just wait until AMD can't keep that pricing up anymore.
Sadly both companies have calculated that they can make far more money by selling as many chips as TSMC will allow them to make as 'AI' whatevers. Until that bubble pops, GPUs are a drag on their profitability.
@arenzricodexd4409 Nvidia only gets away with that because their brand is so strong. AMD can't do the same if they want to be competitive. It's like trying to compare Fila to Nike. Both shoes are made from the same materials in the same sweat shops, yet Nike can charge top dollar because of their strong brand.
AMD really can't ask more than $499 for this if it turns out to be somewhere around 30% faster than the 7800 XT; at $600 it's almost stagnation again. So unless it's priced aggressively, AMD will fumble the bag again, as they usually do.
Agreed. they have to be disruptive to the market to matter especially in the wake of the Nvidia AI Tsunami. They practically have to give it away to make people care about them and yet again, like zen 5 vs 4, the speed isn't significantly increased outside of rumored Ray tracing. So price to perf and power/thermals need to shine along with price.
Great breakdown Tom! Totally what I was expecting... A 2024 release with better performance than a 7900 XT, and I couldn't help but think when we heard that they were cutting production of the whole RDNA3 lineup, that the performance might be closer to 7900 XTX performance, which is awesome. That $500 price point is still critical though!
So anywhere that isn't the USA, this card will probably cost the same as the 4070 Ti Super. Not great. People have shown time and again that they don't want AMD GPUs if Nvidia GPUs are anywhere near the price of the AMD card. Either that, or Nvidia will cut the price of the 4070 Ti just enough to screw over AMD's launch.
Not exactly. Surely, mid and upper mid cards like x60/ti and x70/s/ti will drop price, but the top and ultra top x80/s/ti and x90 will be completely at ngreedia's mercy. 5090 will be over $1500, maybe even closing in on $2000 msrp tag.
Return it. Better RT, on-par raster, the option of overclocking (which will push it past the 7900 XTX in raster, if it isn't already), less power consumption, and $400 less. You could use that $400 to upgrade your CPU or something else.
@@marktackman2886 Yeah. The 7000 series overall was a massive missed opportunity, even though I really like my 7800 XT, which I think is the best bang for the buck currently. If the performance of the 8800 XT is true, then it's a day-one buy for me. I'm a serial overclocker, so getting a GPU faster than a 4080 at $500-600 with 16GB on a 256-bit bus is a crazy deal.
Just want to remind people that he said RDNA 3 will decimate Nvidia in efficiency and 7900xtx will roughly be in the ball park of 4090...Guess how that turned out.
@@Thirty-Two Zen 5's small gaming uplift was because of how the arch was designed: it's a server-first product. Gaming will not benefit much from the multicore/thread enhancements, unless you play games that use more than 4 cores; then you will see the improvement. It will be better in Cities: Skylines 2 or X4: Foundations (on Linux; I've never seen Windows using more than 6 cores, in my experience).
RDNA 3 caught most people off guard because of the 96 dual-issue CUs. I saw people hyping up AMD as actually going to have a very big design, and only a few people suggesting (I'm one of them, during an NAAF livestream a few weeks before the event) that it could follow Nvidia's dual-CUDA approach: instead of an actual 12288 SPs, you have 6144 that can act like up to 12288 if a program/engine knows how to use them. With RDNA 3, the dual-issue CUs need to be addressed by the game engine rather than the driver, so in most cases RDNA3 is just an RDNA2 refresh. And with AMD's market share, no game dev wants to put in the effort, since it doesn't benefit them at all; even the PS5 is still RDNA2. Maybe with the PS5 Pro we will see games get some boost on RDNA3, since they'll finally be utilizing RDNA 3's dual-issue CUs.
I cannot say how much of an influence this might have on AMD's potential sales numbers. Yet ray tracing performance is obviously very important in the productivity segment, specifically ... ray tracing in Blender 3D and similar programs. Here nVidia has ruled supreme for a long time now, due to AMD's less than stellar RT figures. I would suspect that having an actually competitive AMD card with 16GB of VRAM, reasonable power efficiency and _not_ using that blasted 12V connector would be very interesting to a lot of people.
That's a big if, though. The price is similar to what the 6800 XT's was, and that one was pretty much neck and neck with the 3080 of the same gen, not the previous gen.
@@Tiasung If this gets to the same level or slightly better performance compared to the RTX 5070, like the 5700 XT did with the 2070, it's possible for the 8800 XT to sell really well. I don't expect the 5080 to be less than $900, so the 8800 XT trading blows in RT and being better in raster than the 5070 while costing the same or less ($500-600) is kinda the same story as the 5700 XT. Let's see if the Radeon executives don't miss the opportunity to miss the opportunity, as always.
If they give us a generational leap as good as RDNA2's and, more importantly, decent pricing, I'm very interested in what RDNA4 will bring to the table...
@@NadeemAhmed-nv2br Yes, slightly below the last-gen flagship overall while being released two models below it, for $350 less. You are delusional if you think this isn't a decent leap for the midrange market. Essentially 4080 performance for just over half the price sounds fantastic right now.
Well I was preparing to buy a 7900gre or 7900xt in December, but it looks like I might be considering a new option. Even if it does come out next year, I think it would easily be worth the wait for a newer generation with better overall performance.
Around 4070Ti Super performance with comparable RT performance and 16GB memory sounds pretty nice, but this still needs to be priced right. At $600 this would've been competitive around the same time the 4070Ti Super launched, but with Blackwell coming soon I don't think $600 will be as impressive as it sounds. Nvidia themselves will likely bring out something that performs similarly at $600-650, at which point I'm getting that instead even if I end up with 12GB memory. If AMD thinks they can get away with comparable raster price/perf as Nvidia just because they have parity in RT applications, they're just being delusional, as usual. 64CUs on 256-bit sounds like a pretty small die, and on a relatively cheaper node too. This needs to be $500 at max, maybe $480 if they wish to be competitive. And if they want to be aggressive, $450. We mustn't let Nvidia's outlandish pricing structure get to our heads, people.
Yea comparing performance and price to the outgoing gen is really only a half battle won. And thus far Nvidia has been very silent about their overall lineup and performance estimates so we really won't know until much closer to those launch dates, but in any case we can expect them to offer better price/performance than Lovelace did at minimum so already the gap is being closed to RDNA4 in terms of pure price/performance at the baseline without considering added benefits like DLSS 4.0, Ray Reconstruction, lower TDP for equal performance level, etc. I think even if Nvidia has a comparable card that costs more, they'll sell more by default. That has been the case forever, unless they do a major blunder like the 4080 launch which was just way over the line for everyone and only served to either drive people up to the 4090, or over to the 7900 XTX.
@@Real_MisterSir the 4070 Super has an msrp of $600. An RTX 5000 $600 card has to be at least 15% faster to match the 4070Ti Super, 20% to match the 7900XT. There isn't much room for doubt, tbh. Even with RT parity AMD will still have the much worse overall package. Previously I'd have allowed a 70% price differential, but even if I allow 80% this time the 8800XT cannot be more than $480. You see the issue? Back when rumors were that we're getting a 7900XTX for $500, it was much more acceptable. This isn't.
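The pricing logic in that reply, spelled out (the 80% "allowed differential" and the $600 4070 Super MSRP are the commenter's own assumptions, not official figures):

```python
nvidia_msrp = 600          # 4070 Super, the card a $600 RTX 5000 part would replace
amd_price_discount = 0.80  # the share of Nvidia's price the commenter thinks
                           # AMD's weaker overall package can justify

# The ceiling on what the 8800 XT could charge under these assumptions.
max_8800xt_price = nvidia_msrp * amd_price_discount
print(f"8800 XT ceiling under these assumptions: ${max_8800xt_price:.0f}")
```

That lands on $480; under the commenter's older 70% differential it would be $420, which is why a $550-600 launch price reads as a non-starter to them.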
Even if, and it's a big if, Nvidia launches something at close to the same price as the 8800 XT's MSRP, and it performs similarly, at best it will be MANY months till that releases. This 8800 XT is coming out before the 5080/5090, which will both be drastically more expensive, so one would need to wait for, say, a 5070 to launch to even get close to the price/performance the 8800 XT is rumored to bring. I doubt that will happen within 6 months of the 8800 XT's launch, which means it will have all that time to run effectively unopposed as the indisputable upper-midrange mainstream king.
@@slickrounder6045 We don't know the launch dates for anything. The only thing we're mostly sure of is that the 5090 is coming after the 5080, which is coming out some time within the next 4 months. So far there has been no indication the 8800XT is coming out before the 5080. Therefore I highly doubt there will be a significant gap between the 8800XT and 5070's launch, and even if there was I'm sure Nvidia will reduce 4070Ti Super to $600-650 to remain competitive. No matter how you look at it, this is Nvidia's game to lose. More importantly, we as the customer lose either way because we'd be paying overinflated prices regardless of the GPU vendor we go for. Now if AMD prices it at $500 or below, Nvidia won't be able to price match them unless they bring a $800 card down to $550, which ain't happening. AMD will be able to enjoy a market that Nvidia simply cannot compete in, possibly even with the 5070. $500 or below is what we should be paying for this class of GPU at this point, whether it's Nvidia or AMD. For me this isn't a 3070Ti vs 6800 situation anymore, the only reason why I went with the latter is because I knew 8GB would bite me in the arse for the type of games I play. If I'm buying a new card in the $550-600 range I'm never getting the 8800XT when a similar option exists on Nvidia, even if the value is bad - that's the reality.
If AMD can pull this off I'll be very impressed! And definitely want to purchase one, that said, if they're trying to clear out stock of the 7900 series, the prices are still way too high. Also, I hope one of the AIBs makes an 8800 series in white :)
Ray tracing is honestly the most trash tech invented; give me 3D display technology back! I wish stereoscopic 3D gaming were pushed on the same level as RT is. Most 120Hz displays are capable of 3D; all Nvidia and AMD had to do was sell some cheap glasses, build the tech into their drivers, and get game developers to implement it into their games. 50 bucks for some quality glasses is nothing when gamers already spend 300+ on their super-light "gaming" mice. 3D gaming is so good; a good implementation of it is unreal. It's like looking through a window and seeing little plastic figures move around, and it's mind-blowing. RT can never match this level of immersion.
I'd rather see more VR than RT (I guess they could do both if we had enough oomph), but VR is near dead despite being the only real tangible change in the visual presentation of games
@@defeqel6537 VR does not succeed because of price, and most people do not care. 3D displays are the same as conventional displays, but you get the 3D experience of a VR headset; there is nothing inherently bad about it other than needing software to support 3D games or convert 2D games into 3D ones.
It's only because the performance isn't there to back it up. We've seen this before, pre-tessellation and with other graphics features we take for granted today. That's why I'm glad that Nvidia is pushing the technology. RT is a far more correct way of rendering 3D space; it adds a lot more realism and makes games more grounded. AMD just has to stop playing catch-up if they want to be relevant in the GPU market. It also helps everyone, because like it or not, their hardware is what drives the games industry, since it's what's in the consoles.
Yep. When you add in that the 4080 is still superior due to ray tracing, that DLSS 2 and 3 are better than what AMD has to offer, and that it arguably has better drivers, it's not the same thing.
Yeah, raster performance is already in a good place in the midrange. Current-generation GPUs have dropped in price a lot in many regions, and this kind of RT performance can be had now at a similar price point, not in a year's time. Nvidia also offers better image quality through DLSS. At $499 this would take the market by storm; any higher, and people would gravitate towards a 5070.
This is good news for me; I want to hold off on a full system rebuild for another couple of years. I have an AM4 system with a 3600X chugging along with a 3070; I just want to swap my GPU and CPU, and then get an OLED monitor.
One thing I like a lot about your channel is getting bleeding-edge news; it's fun to get something early, you know? You're mostly right, but you speculate and talk technology, and it's always fun. Just like you, I feel like all the news the last couple of months has been dark, and that gets a touch tiring; a lot of us really love this stuff and just want everything to go well. Anyway, I know what you mean; I certainly want to hear some good news. I'd honestly like to have Intel around too; it would be nice to hear anything good from over there as well. Lately I've noticed people getting angry more easily; maybe that's a sign it's having an impact on the community.
well said, I agree, I am one of the EXTREMELY angered people of the community. In my opinion AMD needs to cancel RDNA4 and clear house in the graphics division. They are making products with unacceptable price/performance ratios.
If it's close to a 4080 and is $499, that would be tempting, but they're still lacking a good AI scaler, which is the main thing that puts me off using their cards. Also, the next-gen Nvidia cards probably won't be that far behind, so they really need to price it as aggressively as they can, and I don't think AMD will; they'll go for $599 lol
@@haukionkannel Yeah, maybe if Nvidia's next gen wasn't around the corner. AMD makes bizarre choices, but I don't think they'll completely nuke themselves; the whole point of this gen for AMD is to be competitive in the midrange. A 5070 would likely beat this card, and I'd imagine that being around $599. So if AMD prices it $50 lower like you say, it would more likely be $549, which won't be appealing next to the 5070. Another fail for AMD lol. It needs to be under $500, but they're not aggressive enough.
AMD might have a winner on their hands if they can ship this for $499. More importantly, they should make this a true successor to the 5700XT and call it the 8700XT instead. We all know what the true x800 and x900 class would be like: the chiplet-based cards coming with the release of RDNA 5 and the RX 9000 series.
If AMD actually releases an 8800XT in the coming months at the $500-$600 range that equals the 4080, with equal or better ray tracing than the RTX 4070 Ti Super, it's hard to imagine that won't be one of the most successful mainstream upper-midrange graphics cards in recent memory. I highly doubt Nvidia will have anything even remotely competitive at that price range any time in the near future. By the time a 5070 even comes out, it could be many, many months after the 8800XT was already out, and I'd be skeptical that it would be able to match the 8800XT in performance, price, or VRAM.
We can pretty much bank on the 5080 being exactly the performance of the 4090D. Nvidia is not going to let sanctions prohibit them from marketing the 5080 in China. The 4090D is only ~10% stronger than the 4080 Super. So for there to be any meaningful segmentation between the 5080 and the 5070, the 5070 can't be much stronger than a 4070 Ti Super, or there will be no room for a 5070 Ti, and even then it's a bit cramped.
@@andersjjensen Yeah, I doubt the 5070 will wind up stronger than the 4070 Ti Super at all, and thus it may not even beat the 8800XT (we could have a repeat of the 7800XT vs 4070 situation, except this time with the 8800XT having a many-month lead in release date, giving it a huge advantage). It's not even a certainty that the 5070 will have 16GB of VRAM, from what the rumors have stated so far. The 5070 Ti should reach the 16GB VRAM threshold, but it will likely just be a replacement for the 4070 Ti Super, and it will likely be close to that price (no reason to expect it to be less than $800, like the launch price of the 4070 Ti).
@@pdmerritt Well, if you had watched this video and the many rumors/leaks from the past months, you would know that the rumored MSRP range of the "8800xt" is $499-$599; over $600 has never been mentioned as a possibility in any leak. Maybe AMD will get greedy like they did with Zen 5 and overprice things, though. I personally think that if it ends up beating the 7900XT but not quite equaling the 7900XTX in raster (in ray tracing it should), then $600 is really the most AMD can charge if it wants it to succeed and actually provide a performance-per-dollar improvement gen on gen. At say $650 it would end up more expensive than the 7900XT with its 20GB of VRAM, which would be a bad look; it would already not be particularly impressive going into 2025, it would age even worse by mid/late 2025, and inevitably it would have to fall in price to be competitive. So no, it's unlikely even AMD will bungle things by charging more than $599 MSRP for the top RDNA 4 model.
If the 8800XT can pull off the RT/rasterization rumors, I'll totally buy it over the 5080 regardless of whatever tech it may have. 4070 Ti SUPER level RT for $600? I'm cool with that.
64 CUs, on a 256-bit bus, with the same memory, and it will beat a 7900XT? I'm doubtful on the raster; I buy the RT. I'll have to go back and check the performance difference between the 6800XT and 7800XT.
Then you might want to check the specs on the 7800XT vs the 6800XT. I've been seeing comments like yours, and it seems like no one even bothers to check the specs of each card.
@@noobgamer4709 I meant it as a general reference point for investigation. I just watched Ancient Gameplays' video comparing the 6800 to the 7800XT and the 6950XT. The 7800XT was 25% faster than the 6800 on average, and 5-7 percent slower than the 6950XT. The 6800 ran at much slower core clocks and VRAM frequency than the 7800XT: 2900/2600 vs. 2400/2100. That's likely where most of the difference comes from. The 7900XT can be run at 3000/2700 and was 16% faster than the 7800XT at 1080p, 22% at 1440p, and 30% at 4K, according to Ancient Gameplays. I don't see an improvement in memory speeds on the same memory, so that likely won't add performance. I can see them pushing the clocks above 3000 more consistently than the 7900XT, but not by much. It would have a latency advantage over the 7900XT, so I could see it coming close or better at 1080p, but I doubt it would close the 30 percent gap at 4K. It should do much better in RT with that many more cores, and the power should be much better.
The overclocked 7800XT models caught up to the 6900XT, an 80 CU card, so why not? Besides, this is a new architecture, supposed to be improved as well compared to the buggy RDNA 3.
@nicane-9966 The 7800XT ran both the memory and core 500 MHz higher than the 6950XT. Even if they ran a hundred MHz more in both than the 7900XT, I doubt they shrink the performance gap by the amount being implied: 30% at 4K.
@@robertmyers6488 RDNA3 significantly underperformed AMD's initial performance claims, and there were leaks of a stability bug which (or its mitigations) resulted in a loss of performance. If this is true (the alternative is that AMD lied during RDNA3's announcement - not impossible, but I doubt it, as it hurt AMD's reputation and opens them up to being sued by shareholders) then fixing that bug, the ability to clock slightly higher on N4P compared to N5, the cache/memory latency reduction from the monolithic design, and adding an extra 5-10% performance on top of that from architectural improvements, give you your 30% improvement. This is admittedly a lot of assumptions, but it's not unrealistic. Previous GPU generations (e.g. Maxwell; the GTX 980 is 30% smaller than the GTX 780 Ti, and built on the same 28nm node, but about 10% faster while using much less power) have given similar or greater improvements.
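To make that stacking argument concrete, here's a minimal sketch of how several modest, independent gains could compound to roughly 30%. Every percentage below is my own illustrative assumption, not a leaked or confirmed figure:

```python
# Hypothetical uplift stacking: each factor is an assumed, independent multiplier.
# bug fix ~12%, N4P clock bump ~5%, monolithic latency win ~3%, architecture ~7%
factors = [1.12, 1.05, 1.03, 1.07]

uplift = 1.0
for f in factors:
    uplift *= f  # independent gains multiply; they don't add

print(f"combined: {(uplift - 1) * 100:.0f}%")  # roughly 30%
```

The point is just that no single heroic improvement is needed; four small ones multiply out to the rumored gap.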
See, this leak might be real... but we all know the saying: "AMD never misses an opportunity to miss an opportunity." After the last 2+ years of launches, I'm not holding my breath.
@@MooresLawIsDead Marketing isn't really their only failure. The price/performance of the 9600X wasn't an improvement over the 7600; the best marketing in the world wouldn't change that.
@@martineyles AMD's marketing should have been honest about what tasks Zen 5's performance gains apply to, and emphasised the efficiency, rather than presenting misleading or inaccurate performance claims which reviewers couldn't replicate. If AMD hadn't overhyped Zen 5, the reviews wouldn't have been so negative. The engineers and firmware engineers obviously deserve most of the blame for the underwhelming and inconsistent performance gains, but it would be much better for AMD's reputation if their marketing didn't lie about it, even if that means admitting that Zen 5 isn't a lot faster than Zen 4. Zen+ wasn't a lot faster than Zen 1, but was still a good generation which got mostly positive reviews, because AMD didn't overhype it.
@@nathangamble125 The 'efficiency' is a moot point, Zen 5 isn't more efficient compared to Zen 4 non-X/X3D (7900 destroys 9700X in MT including efficiency and 7800X3D in gaming including efficiency).
MLID had a video where he actually said that Zen 5 was not very impressive before the sandbagging video, I hope whoever that source is is the source for this video.
I didn't say it wasn't impressive, I just said the 40% people were insane...and they were. Look, Zen 5 got the IPC they claimed, it just isn't translating into gaming on Windows right now (but it IS on Linux). Still, that's AMD's fault for over-hyping gaming and fumbling the launch.
@@MooresLawIsDead I bashed on you a bit over some of your recent takes, but I swear you had that one video where you actually put a percentage on how fast it would be, and you correctly pointed out that Zen 5 won't be faster than the 7800X3D and that you'd need the X3D version of Zen 5 to match Arrow Lake. Although you didn't put a figure on how it would perform against Zen 4, that one actually looked a bit spot-on, ngl. I commented on the video, though I can't remember which.
RDNA2 was a smash hit technologically; obviously the 'rona and the crypto crap made the situation suck. If I understand things correctly, RDNA1 and RDNA3 were from the same team, and RDNA2 and RDNA4 are from the same team... so there is a sliver of hope, provided Lisa has had a stern talking-to with the marketing team.
The 5700XT was super good! The 6800 was good... so I have no doubt that they can make good GPUs. But if this is near 4070 Ti level... it will also be close in price, most likely $50 cheaper. So maybe $750 for the top model, $650 for the cut-down?
64 CUs is kind of disappointing. I mean, the 6800XT had 72 CUs as a monolithic 800-tier card. I know the performance per core might be better, but it almost wasn't with the 7000 series. I'll just keep my 3080 for now...
Not just CUs: ROPs/TMUs/stream processors/Infinity Cache were also higher compared to the 7800 XT, while only using 40W more. All the more underwhelming when the 7800 XT would lose 1440p/4K benchmarks, or only draw even with the 6800 XT, in games like Dying Light 2, FFXIV, RE4, and the Tomb Raider games.
My current consideration: Nvidia hasn't started, or has barely started, testing the GB203 and lower-tier dies, as they are still focused on GB202 and its list of video cards. This allows AMD to fully launch an RX 8000 series product before Nvidia can launch a single RTX 50 series card. This could make Nvidia look weak. But even if AMD comes out with a decent competitor in the RX 8800 XT, will people make the switch?
Nvidia doesn't really give a damn about AMD Radeon now; they have so much money from selling AI accelerators that it's basically irrelevant to them whether AMD Radeon releases sooner or later than the new RTX cards.
@@damara2268 This. The entire RTX product line is a side-project to nVidia at this point, all their money is in AI and encode. If AMD somehow became too difficult to compete with, they would just exit the segment, which just feels like a very funny position for the industry leader to be in.
The 7900XTX has 64 shaders per compute unit, similar to the 2000 series cards, which had 64 shaders per core. 3000/4000 series Nvidia cards have 128 shaders per core. The kick in the pants is AMD getting midrange 3000-series ray tracing performance with 64 shaders per core. That's insane. So if/when they double shaders to 128, they will perform extremely competitively in ray tracing. But that also means higher gaming performance, as modern "shaders" are just generic compute shaders; hell, even physics runs inside shaders. Dedicated cores don't exist on GPUs anymore; they stopped that when generic compute shaders took over... Ray tracing cores don't exist. Read the Microsoft DirectX 12 ray tracing white paper. If you are gaming and using ray tracing, you are using DirectX 12... and they literally tell you that ALL ray tracing functions are shader functions. How are you a leaker when you don't even know the basics of how GPUs work? The 7900XTX has 96 compute units, and also 96 "RT cores". Why? Because when a GPU's generic shader runs an RT workload, technically it's then an RT core. Same on Nvidia: the 4090 has 128 SMs (streaming multiprocessors, their name for graphics cores as a whole, like how AMD calls them compute units) and magically has 128 RT cores. RT cores, in the sense of dedicated cores, do not exist. No, you don't know someone from Nvidia who said they exist; you read marketing which told you they exist. Then people will "reeee", saying "what about Nvidia's new BVH traversal engine". Oh, you mean the new engine DirectX 12 implemented, that's specifically stated in their white papers? That? Because yeah, they are running a new algorithm in the shaders, via DX12 code. It's all there, black and white, in English, for anyone to read...
Hey Tom! Did you get confirmation that the WGP did not get changed? 64 CUs would be just +4 CUs compared to N32. Sure, if it clocks to ~3GHz (almost 50% higher than the 7800XT), that'll do, but it seems a bit unlikely that the 4nm node will make that possible. What if RDNA4 has 3 CUs per WGP?
The only thing I personally care about with RDNA4 is: will there be an _at least_ 6GB card that either doesn't need an additional power connector, or at most needs only a six-pin connector? *_That's It_* . . . . something to pair with an 8600G/8700G in a tiny build that won't generate unnecessary amounts of heat. It's not possible for me to express how _Little I Care_ about ray-tracing performance.
Why TF would anyone buy an 8600G or 8700G just to pair it with an entry-level graphics card??? Get a 7500F or 8700F instead, or wait for MiniPCs with Strix Halo. It might make sense to get a low-end graphics card to upgrade an ITX PC with an 8600G or 8700G in 2+ years, but pairing these APUs with a graphics card is a waste of money right now.
While I expect N48 to be the only interesting gaming GPU in 2024/2025, I find it hard to believe that a roughly 240mm² die is going to trade blows (sometimes in RT) with Nvidia's ~379mm² on the same node.
Yeah, it wouldn't surprise me if it's very selective, as not all RT is equal. On one hand you have a game like CP77, where you have RT enabled to the max and even path tracing on top; on the other, a game like Elden Ring, where it's minimally implemented but still advertised as "having RT enabled". You can find plenty of games already where AMD cards perform "better" than Nvidia cards, simply because the RT is so minimally implemented that raster still plays the biggest role and the RT gains don't outweigh it. Nothing in this regard should be taken at face value until we have actual independent benchmarks from third-party testers. If AMD had leapfrogged Nvidia in RT to that degree, they wouldn't be able to shut up about it; we all know how AMD's marketing department works lol.
The 240mm^2 size was a claim on Twitter by All The Watts which they didn't provide a source for, but was apparently an estimate based on assuming that all parts of the die can be shrunk at the same ratio as logic (which is untrue, PHYs and cache can't be made as dense as logic). My own calculation, which accounts for the relative area of cache, PHYs, and logic (based on die shots of RDNA3), gives an absolute minimum size for N48 of 263mm^2 if it's built on N4P. Realistically it would be slightly larger than this, considering the additional RT cores and added features, probably about 280-300mm^2. Hypothetically it could be smaller if there are major improvements to circuit routing, but I don't think that's realistic unless RDNA3 is just a fundamentally terrible architecture, which I don't see any evidence for (it underperforms, but this seems to be due to a minor bug rather than a conceptual flaw in the architecture's layout, and RDNA3 is already much denser than RDNA2, so it's not likely that there are many more architecture-level density improvements that can be made). A 300mm^2 die competing with a 379mm^2 die on a similar node is still a significant discrepancy, but reasonable. Remember that Nvidia GPUs use a much larger proportion of their area for AI and RT cores than AMD GPUs, and also that Nvidia architectures since Ampere are fairly bloated due to having twice as many CUDA cores per SM (the extra compute helps a lot with AI and workstation tasks, but doesn't do much for gaming). N4P also seems to be slightly denser than 4N, though by less than 6%, so it isn't a major factor.
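The block-by-block scaling argument above can be sketched like this. All the block areas and scaling factors here are placeholder assumptions for illustration, not the measured die-shot figures the comment refers to:

```python
# Illustrative die-shrink estimate: logic scales with the node, SRAM and PHYs barely do.
# All numbers are placeholder assumptions; real estimates come from measured die shots.
def scaled_area(logic_mm2, sram_mm2, phy_mm2,
                logic_scale=0.94,   # assumed N5 -> N4P logic density gain (~6%)
                analog_scale=1.0):  # assumed: cache/PHYs don't shrink at all
    return logic_mm2 * logic_scale + (sram_mm2 + phy_mm2) * analog_scale

# Naive "everything shrinks like logic" vs. block-aware estimate
naive = (180 + 50 + 50) * 0.94
aware = scaled_area(180, 50, 50)
print(round(naive, 1), round(aware, 1))  # block-aware estimate comes out larger
```

This is why "shrink the whole die by the logic ratio" systematically underestimates the real die size: the cache and PHY area simply doesn't come along for the ride.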
I'm officially waiting till Zen6. I refuse to reward bad behavior such as the chaotic launch and the lack of desire to strike at top nvidia cards. (I'm still on 3600x and a 2070 btw.)
@@marktackman2886 It doesn't really matter what they're supposed to do; reality matters. And when they are able to launch basically a 4080 at $600 that sells like hotcakes, Nvidia can't charge >$1000 even for their 5080, even if that thing is 30% faster (600 * 1.3 + random Nvidia tax < $900, AT MOST). Anyway, I have hope that this brings GPU prices back to where they belong over the course of the next 2.5 years.
Only 16GB of VRAM :(
Mate, further tanking of Intel is GOOD news not bad. Roll on Ngreedia imploding in the same fashion.
Buying engineering sample CPUs on Ali!
So far as I can tell, Ali Express is a decent place to buy CPUs.
But the biggest question is, will it support Linux beyond just vanilla drivers?
I hope AMD marketing doesn't screw up another great opportunity
I keep getting nvidia ads but never got amd ads
they will
They always deliver 😂
The best part is they will lower the MSRP a week later after getting bad reviews
AMD Marketing never fails to disappoint, especially the Radeon division.
AMD will absolutely blow it. I've never been disappointed by a product I've bought from them; however, their greed keeps landing them in second place. They don't have the same install base as Nvidia, and never will if they don't take a full generation and undercut the shit out of Nvidia.
All we know is: AMD will never lose the opportunity to lose the opportunity
The never-ending story.
too comfortable with its 20% market share of redditors and contrarians lol
And if they *didn't* lose, that means they just lost the opportunity to lose the opportunity
@@fattestallenalive7148 Damn, they can't get it right
Seems like every time they're about to have a shot at getting ahead it turns out they were aiming for their foot.
Glad the leak is good news, but I am tempering expectations until we see reviews
The only thing I care about for RDNA 4 is a return to sane GPU pricing from at least one company. RDNA 3 was a hot mess at MSRP and I hope that mistake isn’t repeated again
The one silver lining of RDNA 3 is that prices DID eventually adjust. Ada Lovelace cards by comparison stubbornly clung close to their MSRP.
AMD never fails to be a disappointment. Given their shitty Zen 5 launch, I'm not gonna bet on AMD self-correcting; they're still going to chase Nvidia-level pricing at launch.
@@alexmills1329 Bet AMD releases at $649 or $699.
@@lyserberg Not in my country; the 7700XT is still priced above the 4070. That's why AMD never sells well here.
@@lyserberg Even the RTX 40 series eventually adjusted. In my region AMD is still a hard sell, since their price-to-performance ratio is just slightly better than Nvidia's while not providing better premium features, plus drawing more power.
For the love of God AMD, a next generation GPU win is there for the taking. Don't f**k it up.
They're probably sitting around the boardroom table right now, brainstorming ideas on how to fck this up proper 😂
Now seems like the perfect time to overhaul AMD Adrenalin and the driver architecture.
-AMD probably
AMD: we are not charity organization.
Now they really need to score with this one, because their GPU sales have decreased this year. Also, they are most likely targeting Intel Battlemage, which is predicted to release close to this date.
They will.. all the CEOs are way out of touch these days..
50% faster than the 7800XT at the same power draw? I'll believe it when I see it.
I expect like 20-30%. 50% is just a bullshit big number.
I recommend High Yield's video about RDNA3 expectations vs reality; I think they aim to at least reach what was promised then, not more.
Which would give us about 25% better performance than the 7800XT at the same CU count, bringing it to 7900XT level.
Same, I'm thinking more 4070ti Super at best
@@JamesRussoMillas With 16GB VRAM, but for what price?
I say if it ends up at 4070TiS performance, they will charge $499, if it ends up somehow 4080 performance they will charge $599.
AMD marketing department scrambling like crazy to find ways to fuck this up
Knowing AMD they'll price it a $100 more expensive than it should be.
I hope Lisa has had a stern talking-to after RDNA3 and Zen5. If they just revert to RDNA2 and Zen4 marketing strategies nobody will complain about anything.
@@Loundsify And then 3 months later there will be a $100 price cut to place it where it should've been on day 1 🤦
@@Loundsify That's generous. More like $300 more; then after it bombs for a few weeks they'll lower it to $200 more, and after 3 months of awful sales they'll keep it $100 more than it should be.
@@Loundsify That's my bet. Tom said 499-599 so they'll launch at 650.
If AMD launches a $500ish GPU that competes with a 4080 with 16GB of vram, i'm buying day one once third party reviews verify the performance.
Not gonna happen, it's going to be rdna4%😂
@@dagnisnierlins188 C'mon man, let us budget ballers dream. At least that.
Not gonna happen bro... AMD is gonna milk AI and add 50 to 100 dollars 😂
@@lucazani2730 Since when has $500 been budget?
@@martineyles It looks like 2 to 3 years, actually. I'm not saying it is right, but that sure looks like it. For the last 3 years, $500 has been budget. I don't make the rules.
Oh no... retailers complain that they can't move hugely overpriced GPUs in volume? I wonder why...
Everyone who was going to pounce on this generation already has. It's the same every single cycle. I've seen this play out over several decades. They can lower prices all they want by now, and only improve sales a little bit. The value crowd is waiting for next gen to come out, so current gen gets heavily discounted, and the enthusiast crowd isn't going to buy nearly 2 year old products now.
Better make fewer GPUs and increase prices so fewer will be sitting in storage!
😂
@@andersjjensen I disagree; people love bargain sales, let's say 25-50% off, but these products never get close to that. So if they are stuck with them, they get what they deserve. Performance electronics do not age like wine.
@@hotdog9262 Yeah, you're in the value crowd. That's what I just said. You'll take a 2 years old card at enough of a discount because you expect that to be better perf/$ than next gen. Or you'll wait, because the discount doesn't happen until next gen is actually out, and do your calculations then. But it will never be 50% off. The 6750XT launched at $550 and is now at $320, which is probably at break-even for AMD with only a few dollars for the AIBs.
@@andersjjensen Maybe I'm mistaken, but personally I've never seen any graphics card heavily discounted due to the next gen coming out.
So they’re trying for a Ryzen 9000X3D + RX8000 double whammy, maybe this time around the Radeon Group actually delivers and doesn’t just send a douche in a sports jersey on stage.
I'm currently using a 5000 series chip and was considering an upgrade to a 9000 series chip but oh boy did they lose some of my trust with those overblown marketing numbers.
@@RN1441 If you have the 5800X3D, don't upgrade to 9000, in my opinion.
@@crassusmaximus5879 Agreed. Unless you aim for 7900XTX or 4090 levels of performance, you should stick it out until AM6; the 5000 series will be more than sufficient until that comes, a long time from now.
@@RN1441 yeah go intel
I have a 13600KF, fml, and an RTX 4080, so unfortunately this doesn't have the horsepower. However, now I'm very excited for RDNA 5, and my next PC will be the 10000X3D? and 9900XTXT? Whatever dumb naming scheme AMD decides lol.
I just hope RDNA 4 goes back to power efficiency again. RX 6600 consumed about 100 watts, but then for some reason the 7600 consumes more than 160 watts.
The RX7600 uses the full die, it's the successor of the 6650XT but they didn't want to call it as such because it was just ~2% faster. They didn't release a cut down lower power version because they still had so many unsold last gen cards. The RX7600 is probably the biggest failure of this GPU generation.
@@lharsay The RX 7600 is a smaller die than the 6600/6650 XT, at least, but still underwhelming in terms of price/performance.
@@hyperixz You know you can always power limit it, right? Of course, you'd have to accept the loss in performance.
It's not like AMD could have just decided to release the card at the same price and performance but using 60 watts less energy. Something has to give.
If you want the card to use less energy just tell it to in settings. But I suspect like most people you don't want to give up the performance... So that really means AMD made the right choice to boost the power, their product overall just isn't as good as people wanted.
The RX 7600 will likely be their $200 budget card for the next 2 years would be my guess.
@@Loundsify It will get replaced by the smaller RDNA4 chip. The 7600 is a very silly card, due to the high power draw the board and the cooler costs more than the ones used for the RTX4060.
Now the production stop of the 7900xtx makes more sense too. That 6-9 months ahead of release sounded a bit long.
I gotta wonder though how much fab capacity they will be willing to give Radeon while also trying to get ahead vs Intel in Laptop.
Addendum: announcing at Gamescom would give them some positive news to distract from the clown show that is Zen 5.
Agree on that last point!
TSMC is reporting underutilization on all nodes. AMD can get exactly what they want when it comes to 6, 5, 4, and 3 nanometer.
@@andersjjensen Heh heh... that's EXTREMELY good news! ^^ That means the silicon manufacturing industry may have actually done the good thing and built overcapacity after the chip shortage, and we can now all enjoy lower chip prices and GOOD availability in the years to come.
@@predabot__6778 I don't think prices will improve much. Wafer cost is going up with each node for entirely technical reasons; basically, EUV arrived a decade late, and High-NA EUV is still not ready while also being a bit of a crutch.
Finally something to replace my rx 480 with
At twice the price of the RX 480 8GB.
The RX 6600/6600XT were the replacement for that: 2x the performance at +50% of the price, and most recently at the same price, not to mention a massive efficiency improvement.
There's also the 7600M eGPU, which I would wait to get cheaper. Hopefully there's a greater OCuLink push in the future.
I was just looking at a 480 to replace my HD 5850.
@@martineyles The 480 was low-tier; its natural replacement was the 6600 or 6600XT, not an x80 card... Of course it's gonna be more expensive; meanwhile, it's gonna be like 8 or 9 times faster...
I had the exact same thoughts! But I'm also gonna build a new AM5 platform anyway.
They NEED to leverage their extra "AI"-type cores for an FSR update. It's already in the 7000 series and will likely be in the 8000 series. If the ray tracing is around a 4080 with less power draw, I'm already interested; and if they did that, I'd pick it up in a heartbeat.
He said 4070 ti, not 4080 RT performance.
Agreed, they absolutely need a comparable feature set
@@NadeemAhmed-nv2br He said both. For AMD optimized titles specifically could get close to 4080 ray tracing performance.
@@NadeemAhmed-nv2br 4070ti and 4080 aren't THAT far apart tbf
I bet that for Nvidia-optimized games with path tracing it will be a mess, unfortunately.
After the zen5 leaks, I'd honestly take this with a large grain of salt to avoid further disappointment.
Honestly, the Radeon 7000 series was also them being sure it would perform insanely well, but it had problems they couldn't fix at the last minute. The Ryzen 9000 story sounded too familiar in that context.
Usually AMD products need at least 3-6 months in the oven post-release before they're actually somewhere remotely close to what was advertised. Let's see how their marketing manages to fuck up another unfuckupable situation :D
From what I've seen, Zen 5 is actually a pretty big uplift, if the OS is optimized for it. Ironically, on an Intel-optimized Linux distro, Zen 5 is 20%+ stronger than Zen 4.
@@MrSolvalou Yeah but who tf cares about Linux tbh. Not the average zen5 buyer, I'll tell you that for a fact.
@@kurgo_ There is a scheduling bug in Windows, so don't write off Zen 5 yet.
If they called it an 8900XTX, people would go mad with joy that "flagship cards" are back down to 500-600 dollars, but at the same time hate it because it's not a 5080 competitor.
AMD would be crazy to brand it an '8900XTX', because that would mean their new "flagship" getting absolutely dumpstered by both the RTX 5080 & 5090. Technically it'd also be beaten by Nvidia's last-gen 4090, which would make the perception even worse.
Whatever they call it, it will be their flagship and it will be pathetic performance compared to nvidia. They don't seem to even be trying to compete at the high end this generation.
@@levigoldson It’s been leaked before and talked about publicly afaik that they are in fact not competing in the high end because the cost of the gpu would be “too expensive”.
An 8900XTX 32GB for $749-$799 is good enough for me!
@@levigoldson The vast majority of people don't feel like dropping $2K on a GPU. Those top-end cards may as well not exist. The only thing that matters is price-to-performance.
Fine Tom, I can stay up for another 15 minutes I guessss.
I really shouldn't, but I did anyway.
Same but who am I kidding, I'll likely stay up for longer even after this -_-
Hopefully it won’t end up like Zen 5 launch.
Rdna4% 🤪.
I mean, look at the difference between 6800XT and 7800XT
@@daqtheduck6296 Don't just look at the naming. Spec-wise, the 7800 XT is even lower than the 6800.
It will
It will. AMD will optimize RDNA4 for servers and at gaming it will be mocked as RDNA4%
RDNA4 will be like Polaris so expect a 4% improvement.
AMD giving us 4070 Ti Super level performance (a $799 card) for $549 would be pretty amazing. I'm not in love with 16GB of RAM for my AI stuff, but that is enough to handle current games in 4K, which is good. If this does happen, it's a huge shift in the midrange that will help a lot of gamers. Thumbs up!
I wouldn't doubt the 4070 Ti Super dropping to that price after next gen launches. If they'd done that last gen, sure, but being a gen later makes it not as big of a deal.
@@TopNotchPanch looking at what happened with RDNA 3 and Ada Lovelace, with close to no price to performance improvements apart from the extremely high end models, I'd honestly take a 4070 ti super performing graphics card any day of the week
You can't make a higher-VRAM card with this die without resorting to clamshell, which for a mass-market midrange GPU is not cost effective.
Although AMD will happily sell the W8800 with 32GB of RAM and Pro driver support for more money, though.
@@lucazani2730 I think if you’re a few gens behind or lower tier sure, I have a 6800xt I bought over a year ago (used for ~$400) and this isn’t a big enough jump for me to justify buying although it would depend on RT performance to really say. GDDR7 would’ve definitely helped sway me
The 3070 gave you 2080 Ti performance (like $1,100 retail) for $500.
Let's keep our expectations extra tempered. The Radeon team might not be the Zen 2-5 team but they have the same marketing department and the duopoly to maintain.
What duopoly? Nvidia has 88% marketshare.
@@GreyDeathVaccine No way is that Nvidia's real market share. Are you joking?
@@vigilant_1934 I am pretty sure It's actually even worse, more like 90%+
By the time this happens, the 4080 will be reduced in price to $800 or less and will still be a superior card due to better upscaling and ray tracing. Y'all are delusional if you think a large percentage of NVIDIA buyers would switch over.
@@sudato3502 it's 70-30
We got baited by "RDNA4" in the title of the podcast. I get it, Tom. At least we didn't have to wait long.
Haha, at least he delivered...
He did say in the podcast that he would have a video this week with new details on RDNA4.
He puts RDNA 4 in the thumbnail of every podcast lmao
After Zen 5, it is a lot harder to get excited about AMD.
xD
One mediocre launch and you give up?
@@Mark_Williams. RDNA3 was a shit launch as well. Remember the 7900 XT going for $900?
On the contrary, this is sounding like AMD is finally waking up a little bit.
@@Mark_Williams. "One"? They haven't had a good launch in years. We always get decent pricing in the end because nobody was buying them and the market adjusted the price by itself.
AMD: We have another great product in the works
AMD Marketing: muahahahahahahaha
Let me say that Adrenalin is so much better than GeForce Experience.
this
When it actually decides to launch when you hit the keys*****
Wish I wasn't reliant on Nvidia for work. ROCm just isn't there yet.
Debatable, but I love AMD Adrenaline.
I agree, but I prefer to use neither. Drivers only please and thank you.
Couldn’t be happier with my 7900 XTX that I got on a sale for 999€. If you use FSR3 Ray Tracing is no problem for it. If they improve Ray Tracing further in RDNA4, this one is gonna be a winner.
You're forced to be happy with your purchase because you overpaid, especially "on sale". I wouldn't recommend bragging about that purchase.
Did you buy it very early in the generation? Because by today's standards 999€ is anything but cheap for the 7900 XTX. It's actually expensive.
@@imlegend8108 Early/mid 2023 when I was looking at prices for the XTX, the best versions were going for $1200-$1300. The cheaper ones were still around $1100.
Edit: I'll add that since XTX production is stopping, the fact you can still get it for MSRP isn't too bad. It went down to around $899 at one point, I think, and maybe $849 for open box. Just checked PCPartPicker and there are only 3 or 4 models below $999. Again, the best versions, like the Sapphire Nitro+, go for $999 or more, such as the ASRock Taichi OC and all of PowerColor's. The Nitro+ was the one I had my eye on for months, but I decided against spending that much on a card right now when I can't guarantee I can make money with it for a while. Have a 6700 XT as a filler until maybe this RDNA4 release.
...for sale? This was the MSRP at launch.
6900 XT owner from 2021 here, and I really wanted to upgrade this year, but I don't feel that the 8800 XT is going to be enough of a punch up for me :-(
AMD's marketing department: proverbially on a bike, holding the stick, leaning over, just so desperate to place it right in the spokes, hard and fast. Good luck AMD, good luck. Imagine if competent people there were allowed to make the big decisions; I'm sure there are plenty, they just aren't at the controls.
A 4080 is around 45-50% faster than a 7800 XT
7800 XT has 60 CUs and 8800 XT 64 CUs
RDNA4 is a gem of an architecture and a huge upgrade over RDNA3 for it to trade blows with a 4080 at those specs... Or is it the next gen IC that is doing all the work?
Come on now, let's be serious...
You're saying it's too good to be true?
My 7800XT has been good to me so far, anything above that price is unreasonable for me and I hope they keep that target.
If it's $499 and is actually able to compete with the 4070 ti super and sometimes 4080 I'll buy it.
4080 is TWO YEARS OLD. It should NOT blow your mind that a new card is "trading blows" with that. GPU progress is ridiculous... If they can't increase the performance, they should focus on price and drivers... There is enough competition going around so it looks a bit like all 3 designers are holding back!
100%... My conclusion is most of the commenters don't really follow the market; they are just thankful for crumbs of knowledge.
It's the PRICE for the same performance that has improved
@@nonameentered1918 the overall point is that it should already be at that price.....AMD slow rolls price cuts before the announcement of a new product to anchor the price
Dude they ARE focusing on price and drivers. AMD drivers have been really solid for a long time now and a 30%+ price drop for the same performance is a solid generational increase.
Of course, we won't know anything for sure until it comes out, but if the leaks here are true this is definitely exciting news for people not trying to spend $3k on a gaming PC.
Would be fine if this was priced right. But knowing AMD they'll go $549. Which is still absurd for a mid range card.
You're welcome for the news on the 8800xt guys. I just bought a 7800xt yesterday, so that's why we have it. 🤣🤣🤣
Can't wait for this to underperform massively and be sold at a massively higher price than anticipated lmao
As always with AMD
So, a bit faster than the 7900 XT with less RAM, but RT is usable.
Priced at £100 off the 7900 XT.
The price is high; I have seen the 7900 XT go as low as £620 in the UK. I think it's about £680ish at the lower end now.
In the UK the 8800 XT will start off over £600; in the US, $500-600.
Sounds good for RT lovers.
Shame about the price
And with substantially less power usage.
@MooresLawIsDead That's actually the thing that could be exciting for me, but only if it's 200-220W max.
Maybe the 2nd card in the lineup, if it doesn't use more power than the RX 6800.
As for the supposed RT uplift... sadly, we heard the same things the last 2 generations and it wasn't true at all.
Nobody in their right mind would be happy to pay anything above $500 after 2 years of similar performance with more RAM being available within a hundred euro/dollar/pound of the launch price. Especially as they are looking to win over AMD buyers, who care a lot about RAM and less about a small efficiency improvement. The argument that they will launch it higher to empty out RDNA3 stock is exactly why their market share keeps shrinking. They should eat the loss and try to double market share by starting out with aggressive pricing.
@@maxs9894 You still have loads of overpriced stock from 2 generations ago, and so many SKUs of the 7000 series.
Can they have a big fire sale and clear out cards like the 7800 XT and 7900 GRE/XT? If those cards are still about, then the 8800 XT will be overpriced and people will not touch it.
@@MooresLawIsDead I like the sound of one of these cards myself :)
I really, really hope it's not called the 8800 XT unless it beats the 7900 XTX.
AMD needs a marketing win
Even if the performance means it could meet/exceed a “8700xt”
Calling it the Rx 8700xt, solidifies its role as a mid range card, at a mid range price
$500 and edging out the 7900 XT, while using 250 watts?
That would look so good for AMD, and the reviewers will test it against the 7700xt, which it will trounce completely
The 7800 XT should have been named the 7800, as it was $150 cheaper than the 6800 XT and even $70 cheaper than the 6800, but it was compared to the 6800 XT because of its name, leading to a very good GPU getting very bad reviews.
All those are pipe dreams. RDNA4 will be like Polaris, so expect 5% better performance from the 8800 XT. Also, AMD wouldn't cannibalize their 7900 XT/XTX sales when they've decided not to have a high-end competitor.
AMD doesn't need marketing wins. Stop conflating wishful thinking with any needs
@@laszlozsurka8991 "RDNA4 will be like Polaris so expect a 5% better performance from the 8800 XT"
5% better than what, the 7900 XT? The RX 480 was slightly slower than the R9 Fury X, but way more than 5% faster than its predecessor, the R9 380. We're expecting the RX 8800 XT to compare similarly to the 7900 XTX and 7800 XT, respectively.
"Also AMD wouldn't cannibalize their 7900 XT/XTX sales"
The sales of a GPU that they aren't making any more because it costs a ton more than Navi 48 to manufacture (another interesting parallel with the Fury X and Polaris), and which they intend to sell out of soon? Why would this be an issue?
@@nathangamble125 5% faster compared to 7800 XT. As I said, RDNA4 will be like Polaris and if you compare the RX 480 to it's successor the RX 580 it's a 5% difference in performance. 8800 XT will be a 7900 GRE in performance.
AMD still makes money from selling the 7900 XT and 7900 XTX, so if they made an 8800 XT that matched a 7900 XTX, then AMD couldn't sell the 7900 XTX, because people would buy the cheaper 8800 XT at matching performance, and AMD also can't charge $800 or more for an 8800 XT.
If you don't have a successor at the high end, then you don't want to make the lower-tier GPU perform too well, otherwise you're gonna lose sales on your current high-end product.
@@laszlozsurka8991 It'll be more than 5% faster than a 7800 XT if it's even close to 7900 XT performance, which is 15% faster than the 7900 GRE.
My second hand 6700XT is still perfectly adequate.
Is it though? I have one too, and I can barely play the newest titles at 1440p on more than medium with a 5700X3D.
@@Etemuss Perfectly adequate at 1080p is what he meant, I think.
@@Etemuss I am playing a lot of new titles as well and have no problems. Use upscaling and can run everything at high/ultra settings. I think you are being unrealistic. You bought a midrange card and expect high/enthusiast level performance.
@@puffyips I can promise you the performance hit going from 1080p to 1440p is marginal at best with this card. Most games get the exact same performance; 10 fps difference in the worst scenarios.
@@Etemuss Ultra-settings slave, I see. Get out of here, NVIDIA fanboy.
I honestly don't believe any AMD GPU leaks anymore; we've been burnt so many times. But I do hope it's a good generation.
"So many times" is just RDNA3. RDNA1 and RDNA2 were right on the money (unless you follow Red Gaming Tech, in which case everything is a major disappointment because he lies through his teeth to get views from people who hope for miracles).
@@andersjjensen RDNA1 was a mess. The performance was there, and for the money it wasn't bad, but it was let down by its drivers; my wife had the 5700 XT Nitro+ SE and it was a disaster. I had a Radeon VII, and it was also a mess. The Fury X was disappointing, and so was Vega 64, which I also owned. I now have the 7900 XTX, which I do actually like; in fact I have two, because my wife has one also. Hers runs fucking great, but mine does have issues here and there. If AMD doesn't get a grip, I might move to Nvidia for good.
I don't see myself upgrading from my 7900 XT. I've had it for a year and love it. I could use more ray tracing, but it's not worth an extra $500; a few more frames in ray tracing won't be enough to be a big change.
Not too surprised about the specs; kinda what I expected. The only thing I didn't expect was the October/November release date, so if true, AMD really did hoodwink people about when they were going to release it.
People thinking this is a revolution are insane; the 7900 GRE already exists, and setting up its memory to match the speeds of the 7900 XT makes it perform virtually the same.
This card will just be better at RT.
It needs to be $499, can't be more than that.
How does the 7900GRE beat or equal the 4080?
TechPowerUp has the XTX at 37% stronger in raster than the GRE. That is not chump change. Even if the 8800 XT is only 90-95% the raster of the XTX but delivers on the 4070 Ti Super level RT, that is still a sizable generational uplift at $599. The GRE's more direct competitor/replacement will be a cut-down variant.
@@andersjjensen Even with proper VRAM speeds, matching the 7900 XT? That puts it only 15% away from the XTX. Can 15% more performance, plus matching the RT performance of a 4070 Ti, really make a $600 8800 XT look that much better than a currently priced $520 7900 GRE?
@@peterfischer2039 It's 15% weaker than the XTX with proper VRAM speed.
@@MaxIronsThird The 7900 GRE does have stupid amounts of headroom for overclocking, though!
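The percentage juggling in this sub-thread is easier to follow written out. A quick sketch, using the figures the commenters themselves cite (the ~37% XTX-over-GRE raster gap attributed to TechPowerUp, and the speculated 90-95% of XTX raster for the 8800 XT); these are forum claims, not measurements:

```python
# Relative-raster arithmetic from the thread above. All inputs are
# commenter-quoted figures, not benchmarks.
xtx_vs_gre = 1.37  # 7900 XTX raster relative to 7900 GRE (claimed ~37% faster)

# If the 8800 XT lands at 90-95% of the XTX's raster, how far above
# the GRE does that put it?
for fraction in (0.90, 0.95):
    vs_gre = fraction * xtx_vs_gre
    print(f"8800 XT at {fraction:.0%} of XTX -> {vs_gre - 1:.0%} over the GRE")
```

Which works out to roughly a 23-30% raster gap over the GRE, matching the "only 15% away from the XTX" framing in the replies.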
I hope rdna 4 has better idle power consumption than rdna3 ...
I found that idle consumption depends on your OS and power settings. My (RDNA2) 6950XT consumes 50W at idle in Windows 10, and 18W in Linux Mint.
That's been fixed ages ago. My XTX is perfectly fine, running anywhere between 10-15W @ 4K 144Hz when idling. Memory downclocks almost instantaneously when the load is gone. And that's on W11, some Linux distros can indeed perform even better.
@@danieloberhofer9035 That's great to hear. I think I only remembered the launch situation.
@@SciFiFactory Well, that's most people on here, who don't currently have a card to know the situation.
Unfortunately *not* fixed with multiple monitors with some types of freesync hardware - mine still idles at 90W no matter what the OS is. Actually bothered sending reports in to AMD with it, but I guess some cases count as no cases.
I'm running Rx 6800 but I'll wait for the RX 8800XT and make sure it has 2x 8-pin connectors.
It shouldn't need dual 8-pins at sub-300W power draw.
@@superscuba73 My current RX 6800 uses two 8-pins
@@superscuba73 And why is that? A PCIe 8-pin connector is certified for 150W, and the PCIe slot for 75W. How would you do it? 150+75=225, last I checked.
@@superscuba73 ??? Even the 7600 XT, which is a sub-200W card, has two 8-pins...
@@nicane-9966 I have the only model of 7600 XT that does not have that, the Asus Dual.
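The connector math in this thread is just the PCIe spec's continuous power ratings added up. A tiny sketch (the wattages are the spec figures; the function name is mine):

```python
# PCIe power-delivery budget, spec-rated continuous figures:
PCIE_SLOT_W = 75   # power drawn through the PCIe slot itself
PIN6_W = 75        # one auxiliary 6-pin connector
PIN8_W = 150       # one auxiliary 8-pin connector

def board_power_budget(num_8pin: int, num_6pin: int = 0) -> int:
    """Max spec-rated board power for a given connector layout."""
    return PCIE_SLOT_W + num_8pin * PIN8_W + num_6pin * PIN6_W

print(board_power_budget(1))  # single 8-pin: 225 W, the "150+75=225" above
print(board_power_budget(2))  # dual 8-pin: 375 W
```

So a sub-300W card fits comfortably within a dual 8-pin layout, and a sub-225W card technically fits a single 8-pin; in practice, vendors often add a second connector for headroom and transient spikes.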
Babe, Tom just fully leaked RDNA4.
I hope it regains 6 display engines, the 7000 series was a disappointment for going down to 4.
AMD needs to make a 32GB model in order to compete in the AI market. There are already a lot of cheap 16GB GPUs, but anything more than that is extremely expensive. I think that if they launch a 32GB model at $600 or $700, they will steal 5090 sales from people trying to build high-end AI servers on the cheap. At least, I would buy a few if the 8800 XT had 32GB.
AMD needs to announce which models will receive ROCm support
I suppose it's impressive for AMD to come so close to NV in RT, but RT perf as a whole is just so underwhelming that it doesn't excite me when it's supposed to "come close to a 4080 in some games".
Exactly as strong as I'd hoped. I was betting on 4080 at half the MSRP. Hopefully some last minute bug won't show up cutting perf down! I got a good feeling about this one.
He said 4070 ti super, Not 4080
Thanks Tom, MLID is NOT dead!
I'm full on hopium with this one because that would be the first upgrade I'd consider in ages (moving on from my og 3060)
I fear it may be more false hopium.
Same. Huffin that shit
>in ages
>og 3060
Imagine people that are still on Pascal.
Didn't the 3060 launch like a year or two ago? It's still being produced, no?
I have yet to upgrade my R9 280 and GTX 980 😅
Me with a 1660 super
Now, are absolutely 100% of the AIB vendors going to throw away the very useful USB-C port for absolutely no reason, as they did with the 7900 series? I sure hope not...
Hope that it's priced reasonably and that they're serious about taking a chunk out of market share this generation.
They aren't. All those dies can go to the MI300X, which is in SUPER short supply, and they're making money hand over fist even without 10% AI market share.
They'd rather make 1000% profit than under 30% with RDNA4.
This is the worst case scenario for RDNA4, they are blowing the launch ALREADY. They don't want marketshare, they want margin and profits. They don't get enough chip supply on the latest node to justify good products at good prices. The whole trick is for AMD to convince everybody they have enough supply.
AMD is dumb if they launch the thing at 549-599. They _need_ to gain market share. No one currently believes in them, period. Whatever they do, they need to put graphics cards in people's hands, to show them that there is a legitimate alternative to NVIDIA, and fast, and the only way to do this, is through _very_ competitive pricing. It's been shown time and time again that if the pricing is not _very_ competitive, people just don't care. "Thanks for bringing Nvidia's prices down, now I'm gonna go buy their stuff" - that's what AMD gets every, single, time, and it's sickening
I don't think that AMD can reduce Nvidia prices either way. Nvidia is more popular & compatible for AI applications and AI makes them MUCH more money right now than gaming ever could. AMD could sell their top GPU for $100 and Nvidia would just wait until AMD can't keep that pricing up anymore.
Nvidia proves that you can still get tons of buyers even at super expensive prices.
Sadly both companies have calculated that they can make far more money by selling as many chips as TSMC will allow them to make as 'AI' whatevers. Until that bubble pops, GPUs are a drag on their profitability.
@arenzricodexd4409 Nvidia only gets away with that because their brand is so strong. AMD can't do the same if they want to be competitive. It's like trying to compare Fila to Nike. Both shoes are made from the same materials in the same sweat shops, yet Nike can charge top dollar because of their strong brand.
They don't need to gain market share. Stop your wishful thinking.
AMD really can't ask more than $499 for this if it turns out to be only around 30% faster than the 7800 XT; at $600 it's almost stagnation again, so unless it's priced aggressively, AMD will fumble the bag again, as they usually do.
Agreed. They have to be disruptive to the market to matter, especially in the wake of the Nvidia AI tsunami. They practically have to give it away to make people care about them, and yet again, like Zen 5 vs 4, the speed isn't significantly increased outside of the rumored ray tracing. So price-to-perf and power/thermals need to shine along with price.
Is this going to make the RX 6700 XT cheaper?
This channel is like super market tabloids.
Great breakdown Tom! Totally what I was expecting... A 2024 release with better performance than a 7900 XT, and I couldn't help but think when we heard that they were cutting production of the whole RDNA3 lineup, that the performance might be closer to 7900 XTX performance, which is awesome. That $500 price point is still critical though!
So anywhere that isn't the USA, this card will probably cost the same as the 4070 Ti Super. Not great. People have shown time and again they don't want AMD GPUs if Nvidia GPUs are anywhere near the price of an AMD card. Either that, or Nvidia will cut the price of the 4070 Ti just enough to screw over AMD's launch.
If they both stay playing a price cutting game that's good news for us!
Not exactly. Surely, mid and upper mid cards like x60/ti and x70/s/ti will drop price, but the top and ultra top x80/s/ti and x90 will be completely at ngreedia's mercy. 5090 will be over $1500, maybe even closing in on $2000 msrp tag.
3 Months ago we were all hyped for Zen5 and had low expectations for RDNA4.
Oh boy how things have changed.
It's been a very weird year for sure.
Remember, don’t trust AMD benchmarks.
Why... it's always the same power as the competition and $200 cheaper. Hush.
I bought a 7900 XTX a week ago and I have until 9/4/24 to return it. Now I’m very tempted to do so and wait out this RDNA 4 GPU.
Better RT, worse Raster, Less vram, less power consumption
I would return it if you paid over 600
Return it. Better RT, on-par raster, the option of overclocking (which would push it past the 7900 XTX in raster, if it isn't already), less power consumption, $400 less (you could use that $400 to upgrade your CPU if you want, or something else).
@@RX7800XTBenchmarks Plus, the 7900 is an admittedly failed design; they are reverting back to monolithic with RDNA4.
@@marktackman2886 Yeah. The 7000 series overall was a massive missed opportunity, even though I really like my 7800 XT, which I think is the best bang for the buck currently.
If the performance of the 8800xt is true, then it is a day one buy for me.
I am a serial overclocker, so getting a GPU faster than a 4080, with 16GB on a 256-bit bus, at $500-600 is a crazy deal.
Just want to remind people that he said RDNA3 would decimate Nvidia in efficiency and the 7900 XTX would roughly be in the ballpark of the 4090... Guess how that turned out.
^ this
MLID hasn't been a particularly reliable source all things considered. Zen 5 produced 5% rather than the supposed 15% as well
@@Thirty-Two It seems to be mainly with AMD that the leaks are inaccurate. Or at least in the recent generations.
@@Thirty-Two Zen 5% was because of how the arch was designed; it's a server-first product. Gaming will not benefit much from the multicore/thread enhancements unless you play games that use more than 4 cores, and then you will see the improvement. It will be better in Cities: Skylines 2 or X4: Foundations (on Linux; I've never seen Windows use more than 6 cores), in my experience.
RDNA3 caught most people off guard because of the 96 dual-issue CUs. I saw people hyping up AMD as actually going for a very big design, and only a few people suggesting (I'm one of them, during an NAAF livestream a few weeks before the event) that it could follow Nvidia's dual-CUDA approach: instead of having an actual 12288 SPs, you have 6144 that can issue as up to 12288 if a program/engine knows how to use it. With RDNA3, the dual-issue CUs need to be addressed in the game engine rather than the driver, so in most cases RDNA3 is just an RDNA2 refresh. And with AMD's market share, no game dev wants to put in the effort, since it doesn't benefit them at all; even the PS5 is still RDNA2. Maybe with the PS5 Pro we will see games get some boost on RDNA3, since they'll finally be utilizing RDNA3's dual-issue CUs.
@@Thirty-Two With overclocking it gets to that point, though. Those CPUs are starved for power.
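The dual-issue arithmetic in the RDNA3 comment above works out as follows (a sketch of the commenter's description of Navi 31, using the shader counts they quote, not AMD documentation):

```python
# RDNA3 (Navi 31) dual-issue arithmetic, per the comment above:
# 96 CUs, each with 64 "real" stream processors that can dual-issue
# a second FP32 op under the right instruction mix.
cus = 96
sps_per_cu = 64

base_sps = cus * sps_per_cu    # physical SPs: 6144
dual_issue_sps = base_sps * 2  # best-case effective SPs: 12288

print(base_sps, dual_issue_sps)
```

This is why the same chip gets quoted as both 6144 and 12288 shaders; the doubled figure only materializes when the compiler/engine can actually pair instructions.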
I cannot say how much of an influence this might have on AMD's potential sales numbers. Yet ray tracing performance is obviously very important in the productivity segment, specifically ... ray tracing in Blender 3D and similar programs. Here nVidia has ruled supreme for a long time now, due to AMD's less than stellar RT figures. I would suspect that having an actually competitive AMD card with 16GB of VRAM, reasonable power efficiency and _not_ using that blasted 12V connector would be very interesting to a lot of people.
If they execute things well with the 8800 XT like they did with the 5700 XT, RDNA4 may be the real successor to RDNA2 that we didn't get with 3.
That's a big if, though. The price is similar to what the 6800 XT was, and that one was pretty much neck and neck with the 3080 of the same gen, not the previous gen.
@@Tiasung If this gets to the same level or slightly better performance compared to the RTX 5070, like the 5700 XT did with the 2070, it's possible for the 8800 XT to sell really well. I don't expect the 5080 to be less than $900, so the 8800 XT trading blows in RT and being better in raster than the 5070, while costing the same or less ($500-600), is kinda the same story as the 5700 XT. Let's see if the Radeon executives don't miss the opportunity to miss an opportunity, as always.
Hope they execute a bit better than the 5700 XT, the drivers at launch had quite a few problems
RDNA3 was a success; you live under a rock, or you're a bot with the typical obscure repetitive claims.
Sounds like a good incremental increase but nothing spectacular. Hoping the RT is properly improved
If they give us a generational leap as good as RDNA2 and, more importantly, decent pricing, I'm very interested in what RDNA4 will bring to the table...
There are no leaps. It's still slower than a 7900 XTX, only beating it in RT.
@@NadeemAhmed-nv2br Yes, slower than the last-gen flagship overall while being released as two models below it, for $350 less. You are delusional if you think this isn't a decent leap for the midrange market. Essentially 4080 performance for just over half the price sounds fantastic right now.
@@Mitchello457 While you are correct for the most part, x800 XT cards are not midrange.
@@rocket2739 Isn't midrange defined by price? At $550, how is it not a midrange card?
@@Mitchello457 No, of course not. By your logic, the 7800 XT is an enthusiast card, because the GTX 580 was also $499.
Well I was preparing to buy a 7900gre or 7900xt in December, but it looks like I might be considering a new option. Even if it does come out next year, I think it would easily be worth the wait for a newer generation with better overall performance.
Around 4070Ti Super performance with comparable RT performance and 16GB memory sounds pretty nice, but this still needs to be priced right. At $600 this would've been competitive around the same time the 4070Ti Super launched, but with Blackwell coming soon I don't think $600 will be as impressive as it sounds. Nvidia themselves will likely bring out something that performs similarly at $600-650, at which point I'm getting that instead even if I end up with 12GB memory. If AMD thinks they can get away with comparable raster price/perf as Nvidia just because they have parity in RT applications, they're just being delusional, as usual.
64CUs on 256-bit sounds like a pretty small die, and on a relatively cheaper node too. This needs to be $500 at max, maybe $480 if they wish to be competitive. And if they want to be aggressive, $450. We mustn't let Nvidia's outlandish pricing structure get to our heads, people.
Yea comparing performance and price to the outgoing gen is really only a half battle won. And thus far Nvidia has been very silent about their overall lineup and performance estimates so we really won't know until much closer to those launch dates, but in any case we can expect them to offer better price/performance than Lovelace did at minimum so already the gap is being closed to RDNA4 in terms of pure price/performance at the baseline without considering added benefits like DLSS 4.0, Ray Reconstruction, lower TDP for equal performance level, etc.
I think even if Nvidia has a comparable card that costs more, they'll sell more by default. That has been the case forever, unless they do a major blunder like the 4080 launch which was just way over the line for everyone and only served to either drive people up to the 4090, or over to the 7900 XTX.
@@Real_MisterSir the 4070 Super has an msrp of $600. An RTX 5000 $600 card has to be at least 15% faster to match the 4070Ti Super, 20% to match the 7900XT. There isn't much room for doubt, tbh.
Even with RT parity AMD will still have the much worse overall package. Previously I'd have allowed a 70% price differential, but even if I allow 80% this time the 8800XT cannot be more than $480. You see the issue?
Back when rumors were that we're getting a 7900XTX for $500, it was much more acceptable. This isn't.
Even if, and it's a big if, Nvidia launches something close to the 8800 XT's MSRP that performs similarly, at best it will be MANY months till that releases. The 8800 XT is coming out before the 5080/5090, which will both be drastically more expensive, so one would need to wait for, say, a 5070 to launch to even get close to the price/performance the 8800 XT is rumored to bring. I doubt that will happen within 6 months of the 8800 XT's launch, and that means it will have all that time to run effectively unopposed as the indisputable upper-midrange mainstream king.
@@slickrounder6045 We don't know the launch dates for anything. The only thing we're mostly sure of is that the 5090 is coming after the 5080, which is coming out some time within the next 4 months. So far there has been no indication the 8800XT is coming out before the 5080. Therefore I highly doubt there will be a significant gap between the 8800XT and 5070's launch, and even if there was I'm sure Nvidia will reduce 4070Ti Super to $600-650 to remain competitive. No matter how you look at it, this is Nvidia's game to lose. More importantly, we as the customer lose either way because we'd be paying overinflated prices regardless of the GPU vendor we go for.
Now if AMD prices it at $500 or below, Nvidia won't be able to price match them unless they bring a $800 card down to $550, which ain't happening. AMD will be able to enjoy a market that Nvidia simply cannot compete in, possibly even with the 5070. $500 or below is what we should be paying for this class of GPU at this point, whether it's Nvidia or AMD.
For me this isn't a 3070Ti vs 6800 situation anymore, the only reason why I went with the latter is because I knew 8GB would bite me in the arse for the type of games I play. If I'm buying a new card in the $550-600 range I'm never getting the 8800XT when a similar option exists on Nvidia, even if the value is bad - that's the reality.
If AMD can pull this off I'll be very impressed! And definitely want to purchase one, that said, if they're trying to clear out stock of the 7900 series, the prices are still way too high. Also, I hope one of the AIBs makes an 8800 series in white :)
Ray tracing is honestly the most trash tech ever invented. Give me 3D display technology back! I wish stereoscopic 3D gaming was pushed on the same level as RT is.
Most 120Hz displays are capable of 3D. All Nvidia and AMD had to do was sell some cheap glasses, build the tech into their drivers, and get game developers to implement it. 50 bucks for some quality glasses is nothing when gamers already spend 300+ on their super-light "gaming" mice.
3D gaming is so good. A good implementation of it is unreal: it's like looking through a window and seeing little plastic figures move around, it's mind-blowing. RT can never match this level of immersion.
I'd rather see more VR than RT (I guess they could do both if we had enough oomph), but VR is near dead despite being the only real tangible change in the visual presentation of games
@@defeqel6537
VR hasn't succeeded because of its price, and most people don't care.
3D displays are the same as conventional displays, but you get the 3D experience of a VR headset. There's nothing inherently bad about it other than needing software to support 3D games or convert 2D games into 3D ones.
It's only because the performance isn't there to back it up. We've seen this before, pre-tessellation and with other graphics features we take for granted today.
That's why I'm glad Nvidia is pushing the technology. RT is a far more correct way of rendering 3D space, adds a lot more realism, and makes games more grounded. AMD just has to stop playing catch-up if they want to be relevant in the GPU market. It also helps everyone, because like it or not, AMD hardware is what drives most of the games industry, since it's what's in the consoles.
It's optimised for AI but there are rumors that it can game in a pinch.
USD599 is not impressive tbf.
Yep.. When you add in that the 4080 is still superior due to ray tracing, DLSS 2 and 3 being better than what AMD has to offer, and arguably better drivers, it's not the same thing..
Yeah, raster performance is already in a good space in the midrange. Current-generation GPUs have dropped in price a lot in many places, and this kind of RT performance can be had at a similar price point right now, not in a year's time. Nvidia also offers better image quality through DLSS.
At $499 this would take the market by storm. Higher, and people would gravitate towards a 5070.
@@michaelangst6078 No one uses ray tracing. $600 is half the price of your precious card; you will feel useless the day it releases.
This is good news for me, I want to hold off on a full system rebuild for another couple of years. I have an AM4 board with a 3600X chugging along with a 3070; I just want to swap my GPU and CPU, then get an OLED monitor.
One thing I like a lot about your channel is getting bleeding-edge news; it's fun to get something early, you know? You're mostly right, but you speculate and talk technology, and yours is always fun. Just like you, I feel like all the news the last couple of months has been dark, and that gets a touch tiring; a lot of us really love this stuff and just want everything to go well. Anyway, I know what you mean, I certainly feel like I want to hear some good news. I'd honestly like to have Intel around too; it would be nice to hear anything good from over there as well. Lately I've noticed people getting angry more easily; maybe that's a sign it's having an impact on the community.
well said, I agree, I am one of the EXTREMELY angered people of the community. In my opinion AMD needs to cancel RDNA4 and clear house in the graphics division. They are making products with unacceptable price/performance ratios.
AMD marketing never fails to disappoint. Especially the Radeon division.
If it's close to a 4080 and is $499, that would be tempting, but they're still lacking a good AI upscaler, which is the main thing that puts me off using their cards. Also, the next-gen Nvidia cards probably won't be that far behind, so they really need to price it as aggressively as they can, and I don't think AMD will. They'll go for $599 lol
4070ti is $799… so this will be $750…
@@haukionkannel Yeah, maybe if Nvidia's next gen wasn't around the corner. AMD makes bizarre choices, but I don't think they'll completely nuke themselves; the whole point of this gen for AMD is to be competitive in the midrange. A 5070 would likely beat this card, and I'd imagine that being around $599. So if AMD prices $50 lower like you say, it would more likely be $549, which won't be appealing next to the 5070. Another fail for AMD lol. It needs to be under $500, but they're not aggressive enough.
AMD might have a winner on their hands if they can ship this for $499
More importantly, they should make this a true successor to the 5700XT and call it the 8700XT instead. We all know what the true x800 and x900 class would be like: the chiplet-based cards coming with RDNA 5 and the RX 9000 series.
If AMD actually releases an 8800 XT in the coming months in the $500-$600 range that equals the 4080, with equal or better ray tracing than the RTX 4070 Ti Super, it's hard to imagine that won't be one of the most successful mainstream upper-midrange graphics cards in recent memory. I highly doubt Nvidia would have anything even remotely competitive at that price range any time in the near future. By the time a 5070 even comes out, it could be many months after the 8800 XT, and I'd be skeptical that it could match the 8800 XT in performance, price, or VRAM.
We can pretty much bank on the 5080 being exactly the performance of the 4090D. Nvidia is not going to let sanctions prohibit them from marketing the 5080 in China. The 4090D is only ~10% stronger than the 4080 Super. So for there to be any meaningful segmentation between the 5080 and the 5070, the 5070 can't be much stronger than a 4070 Ti Super, or there will be no room for a 5070 Ti, and even then it's a bit cramped.
@@andersjjensen Yeah, I doubt the 5070 will wind up stronger than the 4070 Ti Super at all, and thus it may not even beat the 8800 XT (we could have a repeat of the 7800 XT vs 4070 situation, except this time with the 8800 XT having a many-month lead in release date, giving it a huge advantage). It's not even certain the 5070 will have 16GB of VRAM, from what the rumors have stated so far. The 5070 Ti should reach the 16GB threshold, but it will likely just be a replacement for the 4070 Ti Super, and it will likely be closer to that price (no reason to expect it to be less than $800, like that of the 4070 Ti).
I think you'd REALLY be reaching for the price to be that low with that performance. I'd expect 600 to 650 minimum at that performance level.
@@pdmerritt Well, if you watched this video and the many rumors/leaks from the past months, you would know that the rumored MSRP range of the "8800 XT" is $499-$599. Over $600 has never been mentioned as a possibility in any leak. Maybe AMD will get greedy like they did with Zen 5 and overprice things, though. I personally think that if it ends up beating the 7900 XT but not quite equaling the 7900 XTX in raster (in ray tracing it should), then $600 is really the most AMD can charge if it wants it to succeed and actually provide a performance-per-dollar improvement gen-on-gen. At, say, $650 it would end up more expensive than the 7900 XT with its 20GB of VRAM, which would be a bad look. It would already not be particularly impressive going into 2025, would age even worse by mid/late 2025, and would inevitably have to fall in price to stay competitive. So no, it's unlikely even AMD will bungle things by charging more than $599 MSRP for the top RDNA 4 model.
If the 8800 XT can pull off the RT/rasterization rumors, I'll totally buy it over the 5080, regardless of whatever tech it may have. 4070 Ti Super-level RT for $600? I'm cool with that.
64 CUs, on a 256 bit bus, on the same memory, and it will beat a 7900xt? I'm doubtful on the raster. I buy the RT. Have to go back and check the performance difference between the 6800xt and 7800xt.
Then you might want to check the specs of the 7800 XT vs the 6800 XT. I've been seeing comments like yours, and it seems like no one even bothers to check the specs of each card.
@@noobgamer4709 I meant it as a general reference point for investigation. Just watched Ancient Gameplays' video comparing the 6800 to the 7800 XT and the 6950 XT.
The 7800 XT was 25% faster than the 6800 on average, and 5-7 percent slower than the 6950 XT. The 6800 ran at much slower core clocks and VRAM frequency than the 7800 XT (2400/2100 vs. the 7800 XT's 2900/2600), which is likely where most of the difference comes from.
The 7900 XT can run at 3000/2700 and was 16% faster than the 7800 XT at 1080p, 22% at 1440p, and 30% at 4K, according to Ancient Gameplays.
I don't see an improvement in memory speeds on the same memory, so that likely won't add performance. I can see them pushing the clocks above 3000 more consistently than the 7900 XT, but not by much. It would have a latency advantage over the 7900 XT, so I could see it coming close or better at 1080p, but I doubt it would close the 30 percent gap at 4K.
It should do much better in RT with that many more cores, and the power efficiency should be much better.
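To put rough numbers on the clock-scaling argument above, here's a back-of-the-envelope sketch. The 0.7 core weighting is my own assumption, and the clock figures are the rumored/quoted values from the comment, not measured data:

```python
# Back-of-the-envelope estimate of uplift from clock bumps alone,
# assuming performance is a weighted blend of core and memory scaling.

def clock_scaling_gain(core_old, core_new, mem_old, mem_new, core_weight=0.7):
    """Return the estimated performance multiplier from clock increases."""
    core_gain = core_new / core_old   # ratio of core clocks
    mem_gain = mem_new / mem_old      # ratio of memory clocks
    return core_weight * core_gain + (1 - core_weight) * mem_gain

# 6800 (~2400/2100 MHz) vs 7800 XT (~2900/2600 MHz), per the comment:
gain = clock_scaling_gain(2400, 2900, 2100, 2600)
print(f"Estimated uplift from clocks alone: {(gain - 1) * 100:.0f}%")  # ~22%
```

That lands close to the ~25% gap observed in the benchmarks cited above, which is consistent with the claim that clocks explain most of the 6800 vs 7800 XT difference.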
The overclocked models of the 7800 XT caught the 6900 XT, an 80 CU card, so why not? Besides, this is a new architecture, supposedly improved as well compared to the buggy RDNA 3.
@nicane-9966 The 7800 XT ran both the memory and core 500 MHz higher than the 6950 XT. Even if they ran a hundred MHz higher on both than the 7900 XT, I doubt they shrink the performance gap by the amount being implied: 30% at 4K.
@@robertmyers6488 RDNA3 significantly underperformed AMD's initial performance claims, and there were leaks of a stability bug which (or whose mitigations) resulted in a loss of performance. If this is true (the alternative is that AMD lied during RDNA3's announcement, which is not impossible, but I doubt it, since it hurt AMD's reputation and opens them up to being sued by shareholders), then fixing that bug, the ability to clock slightly higher on N4P compared to N5, the cache/memory latency reduction from the monolithic design, and an extra 5-10% on top from architectural improvements give you your 30% improvement.
This is admittedly a lot of assumptions, but it's not unrealistic. Previous GPU generations have given similar or greater improvements: with Maxwell, for example, the GTX 980 is 30% smaller than the GTX 780 Ti and built on the same 28nm node, yet about 10% faster while using much less power.
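A quick sketch of how those individual gains could compound multiplicatively to roughly 30%. All four percentages below are illustrative assumptions consistent with the reasoning above, not confirmed figures:

```python
# Multiplicative compounding of several small, independent improvements.
# Each factor is an assumed (not confirmed) per-source uplift.

factors = {
    "bug fix / hitting original RDNA3 targets": 1.10,
    "higher clocks on N4P vs N5":               1.05,
    "monolithic cache/memory latency":          1.04,
    "architectural improvements":               1.07,
}

total = 1.0
for name, f in factors.items():
    total *= f  # gains multiply, they don't simply add

print(f"Compounded uplift: {(total - 1) * 100:.1f}%")  # ~28.5%
```

The point is just that four modest single-digit improvements multiply out to a number in the neighborhood of the claimed 30%, so the estimate doesn't require any single miraculous gain.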
Good timing, I just built a new AMD based PC, going to wait for this beast to release
See this leak might be real..
But we all know the saying
"AMD never misses an opportunity to miss and opportunity"
After the last 2+ years of launches, I'm not holding my breath.
The 6000 series was highly competitive, no? I opted for a 3090 at the time, but looking back, the 6900 XT, 6800 XT, and 6800 were some really nice cards.
For what it's worth, I'm hearing this week that marketing is getting shaken up and AMD hired new people to position RDNA 4. We'll see...
@@MooresLawIsDead Marketing isn't really their only failure. Price/performance of the 9600X wasn't an improvement over the 7600. The best marketing in the world wouldn't change that.
@@martineyles AMD's marketing should have been honest about what tasks Zen 5's performance gains apply to, and emphasised the efficiency, rather than presenting misleading or inaccurate performance claims which reviewers couldn't replicate. If AMD hadn't overhyped Zen 5, the reviews wouldn't have been so negative.
The engineers and firmware engineers obviously deserve most of the blame for the underwhelming and inconsistent performance gains, but it would be much better for AMD's reputation if their marketing didn't lie about it, even if that means admitting that Zen 5 isn't a lot faster than Zen 4. Zen+ wasn't a lot faster than Zen 1, but was still a good generation which got mostly positive reviews, because AMD didn't overhype it.
@@nathangamble125 The 'efficiency' is a moot point; Zen 5 isn't more efficient compared to Zen 4 non-X/X3D (the 7900 destroys the 9700X in MT including efficiency, and the 7800X3D in gaming including efficiency).
MLID had a video where he actually said Zen 5 was not very impressive, before the sandbagging video. I hope whoever that source is, is the source for this video.
I didn't say it wasn't impressive, I just said the 40% people were insane...and they were.
Look, Zen 5 got the IPC they claimed, it just isn't translating into gaming on Windows right now (but it IS on Linux). Still, that's AMD's fault for over-hyping gaming and fumbling the launch.
@@MooresLawIsDead I bashed you a bit on some of your recent takes, but I swear you had that one video where you actually put a percentage on how fast it would be, and you correctly pointed out that Zen 5 won't be faster than the 7800X3D and you'd need the X3D version of Zen 5 to match Arrow Lake.
Although you didn't put a figure on how it would perform against Zen 4, that one actually looked a bit spot-on, ngl.
I commented on the video though I can't remember which.
More AMD hype that it will never live up to. It's the same story every time.
Finally! Let's just hope they don't mess it up like they did with Zen 5.
Stop drinking and eating memes; they milk you easily with those hate vids for your likes.
One thing AMD taught me about their GPUs: don't have any hope that they will be better.
RDNA2 was a smash hit technologically. Obviously the 'rona and the crypto crap made the situation suck. If I understand things correctly, RDNA1 and RDNA3 were from the same team, and RDNA2 and RDNA4 are from the same team... So there is a sliver of hope, provided Lisa has had a stern talking-to with the marketing team.
The 5700 XT was super good! The 6800 was good… So I have no doubt that they can make good GPUs. But if this is near 4070 Ti level… it will also be close in price… most likely $50 cheaper. So maybe $750 for the top model, $650 for the cut-down?
@@haukionkannel 'Super good' until you compare it with the RX 480, relative to which you got double the performance for double the price.
Well, that's good news. I like my 7800 XT a lot, but the RT isn't quite where I want it. BTW, is it PCIe 4 or 5?
Why did you buy a 7800xt if you wanted RT?
64 CUs is kind of disappointing. I mean, the 6800 XT had 72 CUs as a monolithic 800-tier card. I know the performance per core might be better, but it almost wasn't with the 7000 series. I'll just keep my 3080 for now...
Not just CUs: ROPs, TMUs, stream processors, and Infinity Cache were also higher compared to the 7800 XT, while only using 40W more. All the more underwhelming when the 7800 XT would lose 1440p/4K benchmarks to, or draw even with, the 6800 XT in games like Dying Light 2, FFXIV, RE4, and the Tomb Raider games.
What's important is the performance. If it manages to tie with the 4080 or come close, it could be a very good jump, plus 16GB...
Yeah, I have a 6800xt and I feel the same way.
A mid-range GPU for $500 from AMD with decent ray tracing performance ... wow. WOW! I've been hoping for something like this for years.
This could actually get me to switch to AMD again. Add a proper AI accelerated upscaler and they'd be golden.
Interesting. However, I will be quite sceptical of AMD performance claims after the RDNA3 and Zen 5 debacles.
Zen 5 has the same upgrade % as other gens (you eat too many memes), and RDNA3 is way better and $200 cheaper (again, don't eat memes, kid).
My current thinking: Nvidia hasn't started, or has barely started, testing the GB203 and lower-tier dies, as they are still focused on GB202 and its list of video cards. This lets AMD fully launch an RX 8000 series product before Nvidia can launch a single RTX 50 series card. That can look like weakness on Nvidia's part. But even if AMD comes out with a decent competitor in the RX 8800 XT, will people make the switch?
Nvidia doesn't really give a damn about AMD Radeon now. They make so much money from selling AI accelerators that it's basically irrelevant to them whether AMD Radeon releases sooner or later than the new RTX cards.
@@damara2268 This. The entire RTX product line is a side project to Nvidia at this point; all their money is in AI and encode. If AMD somehow became too difficult to compete with, they would just exit the segment, which is a very funny position for the industry leader to be in.
Why are people so impatient? Wait for both to release, and then 3-4 months, to assess which is better for you.
The 7900 XTX has 64 shaders per compute unit, similar to the 2000 series cards, which had 64 shaders per core. 3000/4000 series Nvidia cards have 128 shaders per core. The kick in the pants is AMD getting midrange 3000-series ray tracing performance with 64 shaders per core. That's insane. So if/when they double shaders to 128, they will perform extremely competitively in ray tracing. But that also means higher gaming performance, as modern "shaders" are just generic compute shaders; hell, even physics runs inside shaders. Dedicated cores don't exist on GPUs anymore. They stopped that when generic compute shaders took over...
Ray tracing cores don't exist. Read the Microsoft DirectX 12 ray tracing white paper. If you are gaming and using ray tracing, you are using DirectX 12... and they literally tell you that ALL ray tracing functions are shader functions. How are you a leaker when you don't even know the basics of how GPUs work?
The 7900 XTX has 96 compute units and also 96 "RT cores". Why? Because when a GPU's generic shader runs an RT workload, technically it's then an RT core. Same on Nvidia: the 4090 has 128 SMs (streaming multiprocessors, their name for graphics cores as a whole, like how AMD calls them compute units) and magically has 128 RT cores. RT cores, in the sense of dedicated cores, do not exist. No, you don't know someone from Nvidia who said they exist; you read marketing which told you they exist.
Then people will "reeee" and say "what about Nvidia's new BVH traversal engine?" Oh, you mean the new engine DirectX 12 implemented, the one specifically described in their white papers? That? Because yeah, they are running a new algorithm in the shaders, via DX12 code. It's all there, in black and white, in English, for anyone to read...
Relax goblin, not everyone reads DirectX specs for breakfast.
Hey Tom!
Did you get confirmation that the WGP did not change? 64 CUs would be just +4 CUs compared to N32.
Sure, if it clocks to ~3 GHz (almost 50% higher than the 7800 XT), that'll do. But it seems a bit unlikely that the 4nm node will make that possible.
What if RDNA4 has 3 CUs per WGP?
The only thing I personally care about with RDNA4 is: will there be an _at least_ 6GB card that either doesn't need an additional power connector, OR needs at most a six-pin connector.
*_That's It_* . . . . something to pair with an 8600G/8700G in a tiny build that won't generate unnecessary amounts of heat. It's not possible for me to express how _Little I Care_ about ray tracing performance.
At 6GB you're expecting a 96-bit card. RDNA3 completely ignored that market. There's no guarantee RDNA4 won't.
Why TF would anyone buy an 8600G or 8700G just to pair it with an entry-level graphics card???
Get a 7500F or 8700F instead, or wait for MiniPCs with Strix Halo.
It might make sense to get a low-end graphics card to upgrade an ITX PC with an 8600G or 8700G in 2+ years, but pairing these APUs with a graphics card is a waste of money right now.
@@nathangamble125 . . nobody asked you for your sh!tty opinion.
Great news!! My 3070 just started failing and I was close to buying a 6800, but now I'll use a 5700 XT I have and get the 8800 XT when I can. 😊
While I expect N48 to be the only interesting gaming GPU in 2024/2025, I find it hard to believe that a roughly 240mm² die is going to trade blows (sometimes in RT) with Nvidia's ~379mm² on the same node.
Yeah, it wouldn't surprise me if it's very selective, as not all RT is equal. On one hand you have a game like CP77 with RT enabled to the max and even path tracing on top; on the other, a game like Elden Ring where it's minimally implemented but still advertised as "having RT enabled". You can already find plenty of games where AMD cards perform "better" than Nvidia cards, simply because the RT is so minimally implemented that raster still plays the biggest role and the RT gains don't outweigh it.
Nothing in this regard should be taken at face value until we have actual independent benchmarks from third-party testers. If AMD had leapfrogged Nvidia in RT to that degree, they wouldn't be able to shut up about it; we all know how AMD's marketing department works lol.
The 240mm^2 figure was a claim on Twitter by All The Watts, who didn't provide a source, but it was apparently an estimate based on assuming that all parts of the die shrink at the same ratio as logic (which is untrue: PHYs and cache can't be made as dense as logic).
My own calculation, which accounts for the relative area of cache, PHYs, and logic (based on die shots of RDNA3), gives an absolute minimum size for N48 of 263mm^2 if it's built on N4P. Realistically it would be slightly larger than this, considering the additional RT cores and added features, probably about 280-300mm^2. Hypothetically it could be smaller if there are major improvements to circuit routing, but I don't think that's realistic unless RDNA3 is just a fundamentally terrible architecture, which I don't see any evidence for (it underperforms, but this seems to be due to a minor bug rather than a conceptual flaw in the architecture's layout, and RDNA3 is already much denser than RDNA2, so it's not likely that there are many more architecture-level density improvements that can be made).
A 300mm^2 die competing with a 379mm^2 die on a similar node is still a significant discrepancy, but reasonable. Remember that Nvidia GPUs use a much larger proportion of their area for AI and RT cores than AMD GPUs, and also that Nvidia architectures since Ampere are fairly bloated due to having twice as many CUDA cores per SM (the extra compute helps a lot with AI and workstation tasks, but doesn't do much for gaming). N4P also seems to be slightly denser than 4N, though by less than 6%, so it isn't a major factor.
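A minimal sketch of that component-aware area estimate. The area fractions and shrink ratios below are my own illustrative assumptions, not actual die-shot measurements, but they show why a naive uniform-shrink estimate undershoots:

```python
# Die-size estimation where each component scales by its own shrink
# ratio: logic shrinks with the node, SRAM and PHYs barely do.

def scaled_area(total_mm2, fractions, shrinks):
    """Sum each die component's area after applying its own shrink."""
    return sum(total_mm2 * fractions[k] * shrinks[k] for k in fractions)

# Hypothetical RDNA3-like die moving from N5 to N4P:
fractions = {"logic": 0.60, "sram": 0.25, "phy": 0.15}  # assumed area split
shrinks   = {"logic": 0.94, "sram": 1.00, "phy": 1.00}  # ~6% logic-only gain

base = 300  # mm^2, assumed starting area
naive = base * 0.94                             # uniform-shrink estimate
aware = scaled_area(base, fractions, shrinks)   # component-aware estimate
print(f"Naive uniform shrink: {naive:.0f} mm^2")
print(f"Component-aware:      {aware:.0f} mm^2")
```

With these numbers the component-aware result is about 289 mm^2 versus 282 mm^2 for the naive shrink; with a larger SRAM/PHY share (as on a real GPU die with Infinity Cache and a 256-bit bus), the gap between the two estimates grows further, which is the core of the argument against the 240mm^2 claim.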
Dayum, I love your enthusiasm about AMD potentially surprising again.
Diverting attention from the disaster that is Zen 5 by hyping up RDNA4, which will miss perf targets, as per AMD tradition.
I'm officially waiting until Zen 6. I refuse to reward bad behavior such as the chaotic launch and the lack of desire to strike at top Nvidia cards. (I'm still on a 3600X and a 2070, btw.)
You have a midrange system; you were never going to buy top-range.
@@declangallagher1448 I would, but only AMD. If a 5090 competitor isn't on the horizon by the end of '25, I'll reluctantly go team green.
Trade blows with a 4080, but for only $499-599?! Heck yeah, that would mean prices go back to where they actually should be.
They were ALREADY supposed to be at that price point and performance THIS generation. RDNA 4 is considered a worst-case scenario for AMD right now.
@@marktackman2886 It doesn't really matter what they're supposed to be. Reality matters.
And when they are able to basically launch a 4080 at $600 that sells like hot cakes, Nvidia can't charge over $1000 even for their 5080, even if that thing is 30% faster (600 * 1.3 + random Nvidia tax < $900, AT MOST).
Anyway, I have hope that this brings GPU prices back to where they belong over the course of the next 2.5 years.
DLSS 2 and 3 are still superior to AMD's offerings. It's not the same thing as a $1000 4080.
@@michaelangst6078 But some people go "So you're telling me I'm paying $500 for a software license?"
All it means is the 5070 will be $599, just like the 4070. This won't cause pricing tiers to shift at all.