Check out the FlexiSpot Recliner XL6 by clicking bit.ly/4g2sWlc. Don't miss out! Use my exclusive code 'YTBXL650' for an extra $50 off (buy one unit and get $50 off, two units and get $100 off). Let's make 2024 your year of ultimate comfort!
Pat as a CEO was fine. It takes close to 5 years to turn a company around, and it's only been 4. And his predecessors spent over 10 years messing up the company; everything after Sandy Bridge up to 11th Gen was a shameless cash grab by bean counters.
That RDNA4 leak is old news... MLID told us this info in SUMMER 2023! BUT, we all know how things can change THAT FAR OUT. I mean, back then they were still gonna do the high-end cards, and they were PLANNING on releasing in Sept 2024 too. So it's nice to hear that the info was good, as it ALWAYS is (okay, okay, I would say it's correct 98% of the time, NO JOKE!). Also, the high-end was supposed to be an MCM, but as you may know that's been scrapped for RDNA4 and they're still gonna be monolithic, and only two dies for this gen. BUT they're supposed to be GOOD and CHEAP, so I'm stoked!!!!
Oh, I forgot to mention that AMD redesigned their ray tracing: a completely new system. They filed the patents for it right after RDNA3 released, so YES, 45% IS CORRECT!!!! NO JOKE, bro!!!
@@fs5866 Let's be real... the RTX 5080 will be 15% faster than the RTX 4080 in the BEST case. Therefore, if AMD can reach the RTX 4080 for 600€ instead of the leaked 1400€ for the RTX 5080, that would be a big win in my opinion.
@@ziggs123 that's the MSRP of the 7800XT; it will probably be $600+, but who knows. If AMD really wants mass adoption in the lower to midrange, they need to do what Zen 1 did to Intel: a huge price difference so that you can't ignore the value.
But it has no performance where it matters. If you play raster-only games, it's already no competition and you would go with AMD, unless you want the better upscaling and software ecosystem.
If it actually _does_ match a 4080 super there's no f'ing way AMD will sell it for a penny under $700. Not unless Nvidia themselves murder the 4080 for less. The only thing AMD is consistent with is ripping you off ever so slightly less than Nvidia while giving you an inferior product.
The 8800XT is only gonna offer somewhat better RT over its ancestors. The raw specs are pretty crap. I think it could at least have an upgrade in raw specs like the 7900XT had over the 6900XT. Now I get the feeling the 8900XT will be a 7900XT in raw specs, just with somewhat better RT.
RE: Translation issues on the forum post. Chinese doesn't have tenses. Listeners are expected to know that if they're talking about a future event, it's going to happen in the future! Machine translation struggles with that a lot. "Biohazard" is the name the RE4 remake was released under in China; your guess is correct. EDIT: 光栅光追 = raster AND ray tracing performance. Chinese frequently drops "and" and just puts the two subjects together. EDIT2: 跑分得赢 = "it can beat it [the 4060 Ti] in benchmark scores." Google Translate got entirely thrown off here; it doesn't say 2 points at all.
Biohazard is not the China release name; it's the Japanese name of the game, which is where the developer is from. It's basically the original name of the game; Resident Evil is basically the name used to sell the game to western culture. (Actually, Biohazard makes a lot more sense than the Resident Evil name.) There are plenty of other movies and games similar to this case. I thought this was common knowledge; I guess everyone is video game history illiterate these days. And "光栅光追" looks to me more like "rasterized ray tracing," which basically means light ray tracing, or what you see in RE4 (the AMD cards already weren't as bad in RE4 as in other games), which means on heavy RT it will fall behind the 4080, and likely not by a little.
I know the leaker; he is an AMD fan from China and manages an AMD fan forum called "AMD ba," which is like a Chinese version of Reddit. His accuracy on AMD CPU leaks is far better than his accuracy on AMD GPU leaks. He told people that the 7900XTX could compete with the 4090 one month before it was released. So don't put too much trust in this leaker's information about AMD GPUs.
@@n.erdbeer Well, no, because if it could beat a 7900 XTX even in a light ray tracing workload by such a wide margin, that would actually be even more impressive than beating the 7900 XTX in moderate or heavy ray tracing workloads. That would mean that the new top AMD mid-range graphics card has an absolutely massive improvement in standard rasterized performance over previous generation graphics cards, and would likely have better standard rasterized performance than an RTX 4090, and would be MORE THAN TWICE AS FAST as an RTX 4070 or 7800 XT in standard rasterized performance. So, clearly you have absolutely no idea what you're talking about.
@@syncmonism you fail to recognize that current-gen AMD already competes decently with Nvidia in raster performance. The next gen being faster in that department is no shocker and not even needed for the most part. New UE5 games struggle with CPU performance, not GPU performance. Hell, you don't need a faster GPU for raster performance when the only huge performance drain at the moment is RT. Showing performance graphs from light-RT games like RE4 or Spider-Man 2 is misleading, since most of the gains come from faster raster performance. For that reason AMD has similar performance to Nvidia there. A serious comparison would have been heavy-RT games, to see the generational progress.
@@sinephase literally, because if AMD were within any sort of striking distance in upscaling and RT, I would've swapped to AMD ages ago. But since I use DLSS and RT, I literally have no choice but to upgrade to another NVIDIA GPU if I don't want to compromise on RT performance and upscaling.
If the rumors about the Radeon 8800XT are true, I hope it comes out at a super competitive price, because only a very low percentage of buyers are interested in RT; the majority want raster price/fps.
All that matters is the pricing. If AMD prices it right, they may have a banger on their hands, but then again, it's AMD; I'll be surprised if they don't fumble it this time around.
@@apricotmadness4850 They should just price everything at a dollar so they can finally get some market share and people actually get used to buying their stuff later. The Nvidia slopfest is annoying.
That 45% number with the PS5 Pro was in relation to overall render performance, not RT. In RT it can be 2-3x the performance when calculating the rays. I do wonder how much of that is RDNA4 and how much is just Sony's custom optimization.
As much as Intel's stock has plummeted under Pat Gelsinger, I do think his plan to get Intel's fabs up to snuff again was a good one, and it's something the company should continue. It was under the leadership of Brian Krzanich that they really deprioritized their fab business and, over time, lost their competitiveness, with TSMC's nodes moving much further ahead. It was basically under him that Intel stayed on the same node from 6th gen all the way up to 11th gen, because they deprioritized funding for their fab division. When Gelsinger came on board, he made it a clear point that almost all of Intel's resources would go toward regaining a competitive position in the foundry business and getting their CPUs onto a modern node.
Yes, the plan was good, but trying to do it all at once was the biggest mistake. Committing to over 100 billion dollars' worth of fab construction in a short time does not help a company that doesn't make that much in revenue in a year, and a downturn in the market or in market share will hurt a lot, just like it did.
@@blackknight2169also they paid a lot of money to show up late to the party with GPUs and AI. If Alchemist had launched a year sooner (hell, even 6 months), they would have sold way better. Same story with Battlemage: too little, too late. It's DOA if they can't price it under $250. Arrow Lake was supposed to be on their own node, but they couldn't deliver that in time either. Even on TSMC's advanced node it landed like a wet bag of farts. They've strung together so many consecutive boondoggles that they're going to have to merge with another company or start selling parts of their business to survive.
If they don't have an x900-class card, they likely won't call it that. We've had the RX 400 series, which stopped at the 480; we had the RX 5000 series, which stopped at the 5700. It goes by core configurations and what that generation of the core has the potential to look like. The RX 5700XT was 2560:160:64, and so was the RX 6700XT; the RX 6800 was already 3840:240:96. They knew the feasible maximum of the RDNA architecture when they brought the RX 5000 series to market, just like they did with VLIW and GCN, and the 8000 series is still RDNA, so they will name it according to the core configuration. That's my opinion, at least, following the AMD naming logic.
Doubtful. Pat needed more than 4 years to right that ship. Anyone else coming in will face the same challenges and will probably take 5-10 years to get Intel back on track.
AMD REALLY NEEDS to deliver on FSR 4. Even if they catch up with Ada RT-wise, Blackwell will probably be a lot faster in RT. What AMD needs is for FSR 4 to be available on RDNA3/4, to be supported in plenty of games, and to look at least as good as DLSS 3.0.
lmfao, if they want FSR 4 delivered, they CAN'T do shit on RDNA 3, because RDNA 3 does not have any powerful dedicated AI cores, so what you said is nonsense. DLSS FG was designed only for the GeForce 40 series because they can boost DLAA to the next level without rendering at a low resolution, using the AI cores to compensate for the performance cost.
Daniel @ 9:10: even though RE4 Remake shows very minimal change in fidelity with RT enabled on max, it still takes a huge chunk of performance with it, as if it were a full implementation.
Make a good raster implementation that doesn't differ much from ray tracing; then your RT implementation gets trashed for not looking different. Duality of perspective, heh.
@@damianabregba7476 raster will never look like ray tracing, because RT uses real-life physical calculations for how light refracts and bounces relative to the perspective of the viewer. Baked-in lighting is not interactive; it's static lighting and shading.
@@rotm4447 I can name more than 10 games with good RT that make things look 180° different. But the reality is only 19% of all GPU owners use RT, even as devs are lately making it a non-negotiable option in games.
Yeah, I somewhat agree. This potential 8800XT would need a whopping +51% in raster performance to match the 7900XTX, which seems a bit far off. I imagine it will be more in 7900XT territory, which would still require a 30% performance gain. But let's see; maybe this highest-end RDNA 4 card will be the same as the 7900XTX but more power efficient and with better RT performance. That would be the best-case scenario.
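For what it's worth, the uplift claims above are simple ratio arithmetic. A minimal sketch in Python, treating the commenter's implied index numbers as assumptions rather than benchmark data:

```python
# Back-of-the-envelope check of the uplift claims in the comment above.
# The index numbers are the commenter's implied figures, not measurements.
baseline = 100   # assumed raster index for the rumored 8800XT's starting point
xtx = 151        # 7900 XTX ("would need +51%")
xt = 130         # 7900 XT ("would require 30%")

def required_uplift(target, base=baseline):
    """Percent gain needed for `base` to reach `target`."""
    return (target / base - 1) * 100

print(f"To match the 7900 XTX: +{required_uplift(xtx):.0f}%")  # +51%
print(f"To match the 7900 XT:  +{required_uplift(xt):.0f}%")   # +30%
```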
Pretty sure Gelsinger was "asked" to fall on the sword for a lot of the screw-ups and bad press at Intel lately. He leaves the day before a big announcement so that people don't see him leaving AFTER the announcement and think, "Oh, he's leaving because it's already DOA." I mean, the day before isn't exactly much better, but it's not like there's a good time to sack a poorly performing CEO (though immediately following the Intel Ultra launch may have been the "best" time?). I'm not an Intel fangirl by any means, but I really hope they can stick the landing with their GPUs. God knows we need something decent. (And yes, I've bought Arc cards. Four of them. Because I'm an idiot.)
Resident Evil's original name is Biohazard, but, as you might have guessed, the name was already taken in the US, so they renamed it there and kept the Biohazard name for most of Asia, where it was firmly established.
For the 8800XT: if it's a 4080S at $600, I'd buy it, and that's the max I would go; any higher is a no. If it releases at $500, which was the 7800XT's MSRP, then it's a definite BUY BUY BUY.
I won't mind it at $600 IF these rumors of matching the 7900 XTX/4080 are true (and I'm skeptical of that). If it only matches the 7900 XT (with improved RT), though, I hope that it isn't over $500.
I had a feeling that Intel Arc was going to be worthless, but kinda 'had to be done' to figure things out, and then Intel Battlemage would catch up to Radeon 6000 and GeForce 3000 - which would be absolutely 'fine'. I'm not expecting a grand slam hit, but 'CEO quits' was not the driver crash I was expecting.
Nvidia is going to sell a 70-series card for $800. I'm done spending that kind of money on a mid-series card. I still use a 6950XT at 4K and the performance is just fine. I feel like having an Nvidia card is just herd mentality, meant for bragging rights. Same with Intel CPUs.
If the RX 8800 XT really is RTX 4080 levels of performance, they will probably sell it for 50 USD less and still have worse upscaling, etc. AMD never learns.
@@nameless.0190 Doesn't matter, that is what we'll get: either the 5070 12GB at $700 or the 8700XT 16GB at $600-$650... Take your midrange pick among those two...
Just for once, AMD, use your brain. Call it the 8700XT and have it come in at $500 if FSR4 is not ready to rock in at least 3 games at launch. I've been having a blast with my XTX, but if the 8700XT is on par with the XTX in raster and goes +/- 5% of a 4080 when RT is cranked to max, then I will happily "side-grade" for $500 and sell my XTX to someone who needs the 24GB, just to get the lower TDP. I went from a 5950X + 6700XT to a 7950X3D + XTX and I can most certainly tell the difference in heat output. And remember that there's a lot to a name. If it's called the 8700XT and swings at the 5070 in performance, but with 16GB, a hundred bucks cheaper, and the promise of FSR4, then that is going to be attractive to a lot of not-stupidly-brand-loyal people. And you WERE going for market share this round, right?
@@travisjacobson682 Other sources reported, about six months ago, that they were targeting 4070 Ti Super levels of performance with full path tracing turned on. Since the 4080 is only 15% faster than the 4070 Ti Super, it is not impossible. RT is not some magic technology that is super hard to do right. I mean, Intel did it pretty well on their first attempt. AMD just didn't figure it would be such a big deal for people. And honestly it isn't. Everyone I know turns it off because 60FPS AVG with 37FPS 1% lows suck ass once you've tried 90-120FPS on an adaptive sync monitor.
@andersjjensen Targeting levels of performance is not the same as achieving them. People don't care about RT because performance in RT sucks, not because RT sucks. I agree it's not impossible dark magic, but it *is* incredibly hard to do. We know this because Nvidia has been struggling to do it well since Turing! I would argue that only the 4090 can deliver an enjoyable 4K RT experience, and even that often requires concessions in some titles (for example, Black Myth: Wukong recommends 4K DLSS Performance mode when RT is set to Very High). The 4070 Ti Super and higher are in rarefied air, so I'm skeptical AMD will be able to flip a switch in their architecture labeled "RT" and join the party. AMD's performance in full path-tracing games is woefully behind Nvidia's. I know they redesigned their RT cores in RDNA 4, but I just won't be convinced they'll match the 2nd/3rd-best GPU in RT with a sub-300 mm², 250-ish W mid-tier offering until I see it with my own eyeballs. I want to believe! Lol
@@travisjacobson682 Nvidia has not been struggling since Turing. They simply don't want to allocate that much die space to something that is useless to the vast majority of their customers (professionals). If RDNA4 has chosen to stagnate in AI performance relative to RDNA3 (which is already quite good, actually) in exchange for significantly beefed-up RT, I can see that happening. It is, after all, getting a half-node jump in lithography, whereas Blackwell is on the same node as Lovelace. So going toe to toe with the 5070 does not seem out of the ordinary to me.
The market doesn't work like that. Nvidia is a richer company, meaning they can afford to lower their prices if AMD lowers theirs to raise their market share. Let's say the 8800XT releases at $300. Nvidia could then easily afford to price their 5070, or whatever, at the same $300 and keep their market share. Because of this, Nvidia and AMD effectively settle on the higher prices they'll sell their GPUs at, like $600, which customers would pay anyway. Most people foolishly ask, "oh man, why doesn't AMD just lower their prices and outcompete Nvidia?" The thing is, Nvidia would lower theirs too and would be able to maintain lower prices, even at a loss, for much longer. What I'm trying to say is that Nvidia will still be better. Just like the 4070 Super is better than the 7800 XT, the 5070 will be better than the 8800 XT no matter what.
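The price-matching argument above can be made concrete with a toy calculation. A minimal sketch where every number is invented for illustration (the 12% share figure echoes the AIB share quoted elsewhere in this thread; nothing here is market data):

```python
# Toy sketch of the price-matching argument above. All numbers are invented.
amd_share = 0.12          # assumed AIB share that stays put if Nvidia matches
market_units = 1_000_000  # hypothetical market size

for amd_price in (600, 450, 300):
    # Assumption: Nvidia immediately matches any AMD cut, so share is unchanged.
    amd_revenue = amd_share * market_units * amd_price
    print(f"AMD at ${amd_price}, Nvidia matches -> AMD revenue ${amd_revenue:,.0f}")

# Every cut lowers AMD's revenue while share stays at 12%, which is the
# comment's point about why aggressive undercutting gets neutralized.
```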
@@РаЫо bro is high af writing all of that. No shit we know the Nvidia card is better; people just don't want to pay for that if they want to game. Tariffs threaten the cost of cards next year.
If it's on par with the 4080 but costs more than $650, then it's pretty much pointless. We need decent performance in the $600 range, and good-performance GPUs in the $300 range.
So "8800XT" will be roughly a low cost, low power, RTX 4080 ... Nothing to complain about especially if that price / perf ratio comes out strong. I hope they do, then crash the price of the remaining 7000 series stock - clear out the old inventory, increase brand uptake, etc. If they want to focus on selling more, they need to destroy NV in price/perf for mid and mid high tiers.
Even when AMD had a better low-to-mid GPU at a lower price than Nvidia, Nvidia still sold more. It's not price or performance that has Nvidia at 88% AIB market share and AMD at 12%.
It's gonna be entry-level high end once the RTX 50 series launches. Also, don't forget that the 8800 XT is gonna be competing with NVIDIA's RTX 50 series, not the 40 series.
People seem to forget Pat Gelsinger came OUT of retirement to run the company in the first place. CPUs have such a long lead time that we're just _starting_ to see what would have been built under his leadership. Look how long after Lisa Su took over before AMD finally surpassed Intel, and even _that_ took Intel stumbling pretty hard. Laying all the blame at Pat's feet is exactly the brainless shit I'd expect from an investor that demands profits TODAY, but DGAF about a company's health 5yrs down the road.
I don't think so. RDNA4 probably just adds hardware BVH traversal to the ray tracing cores. Basically, RDNA2 and 3 did not have the same fully hardware ray tracing that Nvidia had; some things were still shared with the shaders, which is why Nvidia was just much faster. RDNA4 probably just does the same as Nvidia and does everything in hardware.
More L2 is my guess. Ada has a butt-ton more L2 cache, whereas RDNA3 leaned heavily on Infinity Cache (L3). The large amount of L3 was good on a macro level, which made it competitive and even scored wins in raster. The L2 cache on Ada is an advantage on a micro level, so RT, ML, AI, and derivative stuff like DLSS.
The 4080 Super having an 83% lead over the 7900XTX in Cyberpunk 2077 actually understates Nvidia's lead. Without RT, Nvidia is like 10% behind AMD in that game, because its raster loves RDNA3 for some reason. When you turn RT on, most of the raster work isn't disabled, so for Nvidia to pull 83% ahead, it first has to regain the 10% raster deficit and then pull 83% ahead on top of that. I don't think it'd be wrong to say that in a pure artificial RT scenario, Nvidia would be 100% ahead of AMD, or about 2x AMD's RT perf.
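The "about 2x" conclusion checks out as ratio arithmetic. A quick sketch, treating the comment's two percentages as assumptions:

```python
# Quick check of the ~2x claim, using the comment's numbers as assumptions.
amd_raster_lead = 0.10   # 7900 XTX ~10% ahead of the 4080 Super in pure raster
nvidia_rt_lead = 0.83    # 4080 Super ~83% ahead with RT enabled

nvidia_raster = 1 / (1 + amd_raster_lead)  # ~0.91x AMD's raster performance
nvidia_rt_on = 1 + nvidia_rt_lead          # 1.83x AMD with RT turned on

swing = nvidia_rt_on / nvidia_raster       # relative shift once RT is enabled
print(f"Relative swing with RT on: {swing:.2f}x")  # ~2.01x, i.e. "about 2x"
```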
Are we in some kind of GPU hell where all cards other than nvidia's flagships approach some kind of asymptote at the level of the 4080 performance, while the Nvidia flagships alone go on above and do so with increasingly outrageous pricing?
You know what I learned from buying a flagship 10 years ago? Support is greater on cheaper cards with 5% market share, not a top card with 0.1%. If indie games used traditional DX11 features only, 90% of people would be happy with 100+ fps and no damn DLSS, TAA, or FSR frame gen. UE5 truly is the culprit. A badly optimized engine + devs with little time to optimize = pumped-up mega textures and a blurry game at 60fps on a 4090. Yet the most-played games run on iGPUs at this point.
Nvidia is pushing more and more wattage through their GPUs to push more performance. I just hope what happened to Intel chips won't happen to Nvidia chips in the future.
People complaining about the GPU market is so overblown; you can play everything with low-end hardware now. Why would I even want a 4080 to do literally 200 fps at 2K?
@rotm4447 that low-end hardware costs what high-end hardware did 10 years ago. Also, the fact that you made that statement proves you either don't care about money or don't play newer games. A 4060 Ti is immensely overpriced for a 60-class card. That 4080 will be minimum spec in a short time with how incompetently optimized games are getting. The only people "having fun" are the loaded gamers Nvidia knows will pay anything.
Alchemist did in fact "overperform" in synthetic benchmarks, sure, but that's actually just because drivers and game optimizations couldn't use all the hardware it had. If they could, Alchemist would've performed a tier above where it was placed.
I hope Battlemage is good and I hope the 8800XT is as good as the leaks say. Let Nvidia have the $2,000 market; we need competition in the low end and especially the midrange.
Nvidia has 88% AIB market share and AMD has 12%. Even when AMD's low-to-mid tier is cheaper than Nvidia's, people still buy Nvidia. Nvidia is the equivalent of Apple: people will buy a $1,400 iPhone because it's Apple, and people will buy Nvidia because it's Nvidia. Btw, the 4090 sold out on day 1 of release; it didn't even take one day for the GPU to sell out. On the Steam survey, there are more 4090 users than users of any AMD GPU. We're talking about the 4090. The reason for the lack of competition in every tier is not the price.
@@SapiaNt0mata probably because AMD is non-existent in the laptop GPU market, and on desktop, Nvidia GPUs are available all over the world. Stop blaming consumers for Nvidia being the only choice.
@SapiaNt0mata, I buy Nvidia's GPUs because they perform a lot better in RT, consume less power, and cost the same as (sometimes even less than) AMD's GPUs in my country.
@@jesusbarrera6916 there are an estimated 1.9 billion PC gamers (not laptop), so I don't think laptops are enough to justify the difference in market share. I don't know about laptops, but AMD sells even in Lithuania, so I don't know how you can say that Nvidia sells all over the world and imply AMD doesn't. You tell me not to blame the consumers while you make false claims, like AMD not selling worldwide, in order to blame AMD.
@@TheSEWEGI the best-selling GPUs are the RTX 4060 followed by the 3060, so the majority of PC gamers can't enable RT. And if we include laptops, most laptops sold have a 4060 or lower GPU, so the majority also can't enable RT. As for power consumption, the average consumer has no idea about this, so they don't buy Nvidia because it consumes less power. And what is this utter nonsense about "consumes less power"? It's not like the electricity bill increases by $200 per month; the increase is negligible and not worth making such a fuss about. Btw, do you have any idea how much money you waste without needing to? I don't see you talking about that waste of money, but you draw the line at paying a few dollars per year for power consumption. Absolutely laughable. Peak comedy. As I said, most people buy Nvidia the same way people buy iPhones: it's brand recognition.
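The electricity-bill point is easy to sanity-check. A back-of-the-envelope sketch where the wattage gap, gaming hours, and price per kWh are all assumptions picked for illustration, not measured figures:

```python
# Rough yearly cost of a GPU power-draw gap. Every input is an assumption.
watt_delta = 60        # extra watts the hungrier card draws under load
hours_per_day = 3      # assumed daily gaming time
price_per_kwh = 0.30   # assumed electricity price, $/kWh

kwh_per_year = watt_delta * hours_per_day * 365 / 1000
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> about ${cost_per_year:.0f}/year")
# ~66 kWh and ~$20/year under these assumptions: real money, but nowhere
# near "$200 per month".
```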
@@nameless.0190 Could be. I'm thinking of selling my 7900 XT for 500-550 and then buying the 8800 XT. If the lower consumption is true, it would be great (I live in Europe ☠️).
Okay, I know we Gamers get older, with my taste in classic games and especially remembering the 8800XT(X) that came before this one, I'm already feeling quite old sometimes, but that chair ad still scared me. Yes, I know I was already alive when Tomb Raider 1 came out, but I wasn't when Buddy Holly died - could you youngsters please at least pretend I'm still fit enough for a gaming chair? :)
This might be wishful thinking. If AMD does release a 4080-tier GPU and prices it aggressively, it would be a step in the right direction. But given RTG's mismanagement when it comes to PR, I'm only cautiously optimistic.
It's not just a firing, lil bro; he couldn't take any more of people shitting on their gaming performance: 0% Arc market share; AMD dominating in CPU sales (the top 10 CPUs sold on Amazon are all Ryzen); 13th/14th-gen desktop CPU degradation and denying customers RMAs; Arrow Lake having tile teething problems; getting murdered in the stock market, down 50% from last year, dropping all the way to the bottom at $18 a share when they were dominant at $60 in 2020-2021 🗑🗑🗑
@@BlackJesus8463 idk man... you don't become a CEO just to get out asap. Also, the guy looked enthusiastic and excited. I actually kinda liked the guy; it's gonna be sad when he's replaced by some soulless marketing corpo. Engineers are a different breed.
Intel was valued on the stock market at $250 billion 4 years ago; look at how much the company is worth now: $100 billion. The company is ready to be sold off in pieces; only the government could save them from that.
I think that if they want to communicate budget/value, they would call them the 8800XT/8700XT, reminiscent of the 480 and 470, if that's what they are going for.
According to information obtained from manufacturers and employees of certain companies, the 5000 series from Nvidia looks terrible in terms of specifications and price. The Intel Arc A570 and lower models look good: $160 for the 8GB model is a perfect match next to an RTX 5050 4GB for $220.
As long as the new flagship is not worse performance-wise than the 7900 XTX and efficiency improves by 25%, it's still good. Slap the 7900 GRE price on it and it'll sell like hot cakes (like the 9800X3D). The 80/90 gap has been steadily widening: price-wise, the 4090 is more than 2x the cost of the 4080 Super in the EU, 223% of the price for like a 30% performance uplift in my country. It's ridiculous. Not that the 4080 Super is any good value; the 7900 XTX beats it fair and square. You pay the 25% Nvidia tax for DLSS and a bit of ray tracing, if you dig that. I was skeptical of their new strategy at first, as a regular Radeon flagship buyer and enthusiast, but it could be a good temporary stepping stone if they pull this off right. They wrecked Intel, which was deemed unsinkable and dominant at first. Hell, I used to pair my Radeons with Intel i7s back in the day (like 10 years ago). The problem with Polaris was that it only competed with the 60 series at most. A good lineup of cards up to the 80-series level that beats their respective competitors on price/performance is better than trying to release a 5090 competitor and neglecting everything else to do it. The future looks good for Radeon; please don't mess this up, AMD.
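The 223%-for-30% complaint is a perf-per-euro claim, which is easy to make explicit. A minimal sketch using the comment's EU figures as assumptions (normalized to the 4080 Super; these are not quoted prices):

```python
# Perf-per-euro comparison using the comment's EU numbers as assumptions.
cards = {
    "4080 Super": {"price": 1.00, "perf": 1.00},  # baseline
    "4090":       {"price": 2.23, "perf": 1.30},  # "223% the price, ~30% faster"
}

base = cards["4080 Super"]["perf"] / cards["4080 Super"]["price"]
for name, c in cards.items():
    value = (c["perf"] / c["price"]) / base
    print(f"{name}: {value:.0%} of baseline perf-per-euro")
# The 4090 lands around 58%: nearly twice the cost per frame at the top.
```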
IF those leaks are accurate (I'm taking them with a big 50 lb bag of salt), I'm excited to see what people can do with these cards using the Elmor Labs EVC2. You can push the 7900XT/XTX up to at least 600W for some performance improvement (albeit with pretty big diminishing returns past like 400, maybe 450 watts). So if they just turned the power down from the previous gen by choice, and not because the silicon REQUIRES lower power, you may be able to give it that 25% extra power or more and see some benefits.
There's a rumor that Seasonic listed the RX 8800 XT at 220W, so if this GPU is going to be ~275W, AMD might call it the RX 8900 XT. If it's called the RX 8800 XT, it should cost $500-600, which would be amazing for RTX 4080-like performance. I think it'll be more like $650-750, which is still not that bad. The actual RX 8800 XT should deliver something between RTX 4070 Ti and Ti Super performance for $500-600.
Ray tracing performance is a bigger deal than people make it out to be. A lot of new games have some ray tracing on by default, which will make Nvidia GPUs outperform AMD GPUs in spots with heavy ray loads.
When the lighting system is completely run by ray tracing, it will matter and look amazing. But that is many years away from being practical. The most popular GPU is the 3060, so ray tracing will be an optional gimmick for a long, long time.
Ref: gamer chair. I have a knackered spine, and sitting at a desk causes unbearable pain. I had an MRI of the top half of my spine ~2 years ago; the basic comment was that "not a single vertebra or disc was correctly aligned." This is why my PC's monitor is a 65" LG TV with VRR (AMD FreeSync) from 48 to 120Hz; I use my PC sitting in a comfy recliner armchair about 8-9 feet away. I just have two long USB extension cables so my keyboard and mouse can be used on a plywood board that sits across the arms of the chair. I've suffered back pain most of my life, which is common for people like me who are over 6' tall. I think the rapid decline was caused by close to 6 years of commuting to London by train. I remember taking my dad for a check-up before he had a heart bypass. On the train he said to me: "I'm certain that somewhere there must be a hunched back midget who finds these seats comfortable."
Unfortunately, the chance of being that lucky is 0.000000000001%. Common-sense statistics: there are 7-8 billion people in existence, and how many of them would have that type of luck? Idk, but the card could have had damaged cooling or internal damage; if he tested it and it ran perfectly, then he's lucky af.
So then the RX 8800 XT (if that's what it's gonna be called) is gonna be a 4080 SUPER in raster performance, which means it will be competing with the RTX 5070 Ti. That's good, since the 5070 Ti is gonna be an $800 GPU, so the 8800 XT at around $500 would be a good alternative to NVIDIA's next-gen entry-level high-end class.
Nobody said there won't be an Intel Arc dGPU. They are just not chasing the high end any more. The A770 wasn't high-end, but they were ***aiming for*** the 3070; it had delay after delay and didn't hit the target. With BMG they are not even **aiming** for the 4070 anymore.
Basically, yeah: if they make the current 4080/7900XT/XTX-class cards affordable and also boost the RT to be competitive, they'd fly off the shelves. So far AMD has lagged a generation behind with each iteration, but as we all know, the 20-series cards were beta products that couldn't actually ray trace at any significant speed. The 3080/90 were the first to actually be usable at sub-60 FPS, but I'd argue the 4090 is the only full RT card, because only it can ray trace at 60fps in most games.
Boeing conducted a buyout of a smaller, troubled company by the name of McDonnell Douglas, which had quality issues. What happened is that, post-merger, people from that company were eventually promoted to run things at Boeing. They ran it into the ground for profits, no pun intended, continuously cutting the R&D budget to buy back stock. Boeing essentially got "Trojan horsed" by the smaller, garbage company.
Is it possible that the RE4 RT result is actually the low point? Because if it's a game where RDNA3 theoretically didn't hit the RT core bottleneck, it means it was possibly running at full speed. If it's 45% faster in a non-bottlenecked scenario, does that mean it will be even better in other scenarios, like Cyberpunk, which overloaded RDNA3's cache setup along with the ALUs, which is where the bottlenecking came from?
Just go with AMD: more VRAM, comparable RT, and AI FSR4 (PSSR-like AI upscaling). I am sure the unreleased 5070, 5060 Ti, and 5060 are already obsolete and will make good e-waste.
@@РаЫо they'll probably be pretty similar, with the 5070 pulling ahead in heavy RT titles (AW2, CP2077, BMW, etc.), but also falling behind at 4K due to its 12GB of VRAM.
@@inkara3771 no, they are lying. I went with the 7900 GRE 2 months ago: 0 issues, and I'm happy to use Adrenalin on top. It also runs quiet and cool, and with some smart options in Adrenalin you can lower energy consumption, if you care, without losing much performance, or you can OC super simply.
@@rasluffi when blud talks about AAA games and says 120 fps at 4K max settings with a 7900XTX, you know he means 2019 or maybe some 2020 games. He would never include Alan Wake 2, Cyberpunk 2077, Wukong, Stalker 2, and the list goes on... Star Wars Outlaws, Horizon Forbidden West, etc., and would never mean 2022-2024 games. It's like the well-known scam label on the PS5 or Xbox box: "4K@120 fps".
We absolutely should stop associating number schemes (xx60, xx70, xx80 or x6, x7, x8) with expected performance or performance tiers. The formerly sort-of valid "70-level performance" and similar phrases have become completely unreliable. Pay attention to specs, performance, and price, not numbering schemes. If this new RDNA4 GPU (the 8800XT?) is as good as this so-called leak apparently indicates (note all those backhanded disclaimers), and if it retails for
I personally wouldn't buy a graphics card with anything less than 16GB of VRAM; once you're at 4K, it becomes necessary. I constantly had issues with 12GB. OLED monitors need to fix their subpixel layout so text is legible; people who can afford them have to work to pay for their gaming habit, and you can't work on OLEDs comfortably at the moment.
@@nameless.0190 It's because too many games are "meant to be played with Nvidia," because the devs are lazy fkers who want to auto-tune their games instead of doing it manually. So let's force other manufacturers into the same gimmicky software/driver crap, because the games demand it.
Well hey, as always, AMD never misses an opportunity to miss an opportunity, so they'll probably price this at the $650-$700 mark, making it not very competitive at all.
I hope Pat isn't having health issues. It's hard to imagine Intel's doing so badly that he'd pull a "resigned effective immediately" move over that (or he got forced out) ... but maybe. The timing and sudden departure are bizarre.
I mean, they're at one of their worst moments in history. Pat has made some big mistakes recently that cost the company a lot in lost deals; I imagine he was encouraged to retire.
Nothing bizarre; the company lost well over $150 billion in value in 3 years. There are 3 companies interested in buying Intel, in pieces or as a whole. Intel sank.
The bugs with the Intel 200-series CPUs weren't great, but otherwise I'd say Pat Gelsinger has done fairly well in a difficult situation that he didn't create. I certainly wouldn't replace him with marketing folks unless he had done something really bad. Why not let him stay until a replacement is found? This just looks bad for Intel as a whole. If I were Intel, I would have focused more on enterprise NPUs, because afaik tensor chips are relatively straightforward to design, TSMC is the great equalizer, and Nvidia products are 40x overpriced, which leaves an incredible margin to disrupt. I know there is a bit of a software moat, but if you can get a neural warehouse for 1/10 the price, you can adapt some software. Meta alone has spent ~$25bn with Nvidia; cutting that down to $2.5bn would put Intel on the map and send Nvidia's stock price tumbling. AMD should have done the same long ago; I don't know what they are doing. Those cards cost less than $1,000 to make, so how are they not absolutely destroying Nvidia's $35k price point? Something strange is going on. Speaking of AMD's neural fumble: if RDNA4 doesn't have strong tensor performance, they have seriously failed. They also have a big opportunity to beat Nvidia by putting much more RAM on the cards. RAM has dropped to a fraction of its former price, and Nvidia is trying to give us the same RAM for the 3rd generation in a row, which is especially grievous given how memory-hungry AI is. AI is coming to games; it's going to be big that NPCs aren't just dumb puppets flailing about but interact like they're real. The language model to make them speak alone could swallow 12GB.
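The disruption claim above is margin arithmetic. A minimal sketch spelling it out, where both the street price and the build cost are the commenter's unverified claims:

```python
# The comment's margin arithmetic, spelled out. The cost and price figures
# are the commenter's claims, not verified numbers.
claimed_street_price = 35_000  # $ per accelerator, as claimed above
claimed_build_cost = 1_000     # $ per card, as claimed above

undercut_price = claimed_street_price / 10  # "1/10 the price"
gross_margin = (undercut_price - claimed_build_cost) / undercut_price
print(f"Selling at ${undercut_price:,.0f} leaves a {gross_margin:.0%} gross margin")
# ~71%: the room the commenter thinks a competitor would have to disrupt with.
```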
In the beginning, I think I saw it said that the new card is 45% faster than the 7900XTX in rasterization but equals the 4080. I was confused, because the 7900XTX is usually faster than the 4080 in raster, but I assume it's a slight improvement in raster to equal the 4080, while its RT equals a 4080? If so, and it's priced around $600 or cheaper, then that will be a game changer for PC gaming in terms of price and availability.
I'll keep an eye on this release. If I can get good performance at a reasonable price, I will forget about Nvidia GPUs. I can't get an Nvidia GPU anyway... it seems like they purposely constrain production in order to create artificial scarcity.
@@laszlozsurka8991 This GPU is not gonna be at the level of a 4080... this leak is bogus! The RT part makes no sense at all. A game where the 7900 is as fast as a 4090 in ray tracing... is not a ray tracing game.
Bro, if the fastest AMD RDNA 4 is only as fast as a 4080, then they haven't only given up the halo segment; they've given up competing at the 5080 level too, meaning the fastest RDNA4 will probably be at 5070 level. I'm talking about gaming with ray tracing on, because that is the only claimed data point we have. The real question is how AMD will stack up in heavier titles, with path tracing, and under any AI load, because that is where they have been massively behind.
@@Jasontvnd9 yes, that's exactly my point. You realize that RDNA 4 is not competing with the 4080 and 4090, right? But with the 5090 and 5080. My bet is they will only offer 5070-level performance.
Unlikely that the 5070 will catch the 4080 anyway, plus 12GB again. The 8800 XT should be faster with more VRAM; the problem is price. The most-sold GPU is the 3060, and AMD doesn't care about higher-end products this time; besides, when they build them, nobody buys them, so why bother.
@@cajampa AMD has said exactly this; you're not onto anything new lol. Yeah, the 8800XT will be between the 5070 and 5080 in raster and will beat them on price. Just like right now: the 4070 vs the 7800XT is a blowout in terms of raster value; the question is whether enough people care about value. 4080/90 buyers don't gaf.
IF those numbers are true and there are no other issues, then this card would go insane if it were sold for $499. Now, granted, that's very unlikely; it'll probably be $549 or maybe even $599, which tbh for 4080/4080S performance is still really good. But damn, if they price it at $499, then holy crap, that should take market share. If those numbers turn out to be true, I'm sold.
I can totally believe this one. AMD knows they need to sort out ray tracing and bump up RDNA with RDNA 4 this time round to stay at all relevant this generation, considering they aren't competing at the high end this time around after botching their high-end cards.
Haruki Murakami plug 😂
This is exactly the kind of chair Homer wanted to have in this one episode.
Quitting a day before a giant conference is totally a good sign
Totally
Yuuuup…
Hope no one invested in them before this
@@stunytsuR looks like investors are happy. stock went up 5 percent in the last 6 hours
@@Athrein oh nice
Re 8800XT pricing: don't forget that AMD never misses an opportunity to miss an opportunity.
I mean, an 8800XT being on par with a 4080 is really underwhelming; AMD is barely catching up while Nvidia is about to release their 5000 series.
@@fs5866 that's just in RT tho lol
@@fs5866 If it costs $500-$600, we really don't need more
@@fs5866 It's way faster other than RT
And yes, Resident Evil is called Biohazard in Asia. So that's what is meant by that.
Yep, even the Japanese versions are specifically called Biohazard.
Biohazard is the original name. It was changed to "Resident Evil" for English markets because there was a copyright conflict on the name.
@@flamingspinach I must say Biohazard is a way better name for the game series. Resident Evil isn't really a description of what the game is about.
@@b.s.7693 it is for the first game, but that's about it tbh
Not in the whole of Asia. In South Asia (India) it's called RE4 Remake.
4080S perf for $500 is the pipe dreamiest of pipe dreams.
I'll have what he's smoking.
The 4080S is like $1,200; there is no way, dude 😂
I’ll believe performance when I see it, and not until. People always talk up GPUs before launch.
They make videos on them daily just to farm views
To build up hype, ofc, and people could still get disappointed even when the hype was high.
it's their career and money is involved, so obviously
Why did you guys click on this video then?
That's quite informative, thank you
So as usual media are adjusting information for clickbait and fake hype
@@berkertaskiran This is a really weirdly hostile response where you appear to have imagined I said something I didn't.
To be fair, AMD thought they'd compete with the 4090 at that time.
It does in raster performance if you actually put the same power through it. RT was never an option, though.
It does in terms of raster performance, and in value (price per frame) it beats it.
@@sargonsblackgrandfather2072 Not at all.
So the Chinese Moore's Law is Dead
Resident Evil 4 RT performance basically means nothing.
Finally someone who gets it. It's not even Spiderman 2 RT, which is quite light on its RT load as well...
The fact they even picked this title to begin with shows really heavy bias by whoever posted it, or by whoever leaked it to the person making the post.
@@syncmonism Which is why this leak is likely completely BS.
If they call it the 8900XT they'll probably price it accordingly as well, and then drop the price by $100+ a bit later.
price themselves out of the market because the DLSS and RT are just much better than AMD's solutions
FSR2 and 3 are more than comparable to DLSS
@@vyor8837 no
@@sinephase doubt they have Nvidia prices
RT is becoming built into most modern and future titles. For example, in Stalker 2 and BMW, when you click on Ultra, RT turns itself on.
Don't forget ML upscaling. The reason why I am with Nvidia is because of DLSS and DLDSR.
I want it all.
No such thing as a competitive price. Learn economics.
@@РаЫо jajajajajaja
In East Asia: Resident Evil = Bio Hazard
They probably will screw it up.
Doesn’t matter what price they have it at. People will still choose Nvidia.
@@apricotmadness4850 at the lower end it does matter. The problem is the difference is never big enough; AMD only slightly undercuts NVIDIA every time.
@@apricotmadness4850 bs
That’s a retirement gaming chair!!! I’d never leave it 😂
lol
Still not close to where RT performance needs to be.
45% faster RT? Where have I seen this before recently? Yeah...with the PS5 Pro. Curious. Very curious.
Don't be such a downer man, any hope is good for the current gpu market.
@@shk0014 I just connect the dots. Nothing more and nothing less.
They said PS5 Pro performance was 45% faster in raster, not a 45% boost to RT.
So faster with frame generation!
That makes sense. If they make fake frames we get more speed!
Intel's CEO quitting is definitely not a good sign for the company, especially considering all that's currently happening with it.
I mean, they are said to have twice as many RT-boosting cores.
Fun fact: Resident Evil 7: Biohazard is called Biohazard 7: Resident Evil in the Asian market.
Seems legit as here we got small brains and big Johnson, and they have big brains and small Johnson
Lol
Good thing Intel didn't name any engineers to replace Pat, otherwise they'd actually fix the place.
The Resident Evil series is called Biohazard in Japan (and presumably other parts of Asia)
I want them to drop RT and stop this BS. And give us actual hardware upgrades.
This is essentially what MLID has said before for top-end RDNA 4: 7900XTX/XT raster and up to 4080 ray tracing performance.
If AMD can deliver that for $500 or less it will be an absolute hit.
There are like 4 games with good ray tracing that no one is even playing, and yet it consumes 90% of the air in the room when discussing performance.
@@sufian-ben I doubt 19% of GPU owners have a card that's even close to capable, considering the Steam stats.
I have my doubts about those performance figures for RDNA 4.
they lied before, and will again
@@rulewski33 it's been over 2 years, bro. Ain't that hard.
@@Ericozzz who lied? Wasn't this supposed to be a leak :D
Yeah. Fanboy-style "info"…
How many PCs do you have? 🙂
Us: $500 👈
AMD: $600 👀
I mean, XTs are still over $600, and this is their replacement
@@Osprey850 Agree with that
I hope it will be priced at $500.
With tech hardware, nothing can ever be a “mild leak” or an “interesting leak”, it has to be a MASSIVE LEAK
Do you know what else is massive ?
I had a feeling that Intel Arc was going to be worthless, but kinda 'had to be done' to figure things out, and then Intel Battlemage would catch up to Radeon 6000 and GeForce 3000 - which would be absolutely 'fine'.
I'm not expecting a grand slam hit, but 'CEO quits' was not the driver crash I was expecting.
No way this thing is as fast as a 7900 XTX, with its 24GB and nearly 1TB/s of bandwidth. I'll be BLOWN AWAY if that's the case, or even near it.
Wouldn’t be that crazy. The 4070 Ti is slightly faster than the 3090 with about half the memory bandwidth
The Radeon VII had 1TB/s of memory bandwidth before the 7900 XTX did, and a 128-bit 7600 XT is in the same ballpark of performance (likely faster, even).
Nvidia is going to sell a 70-series card for $800. I'm done spending that kind of money on a mid-series card. I still use a 6950 XT at 4K and the performance is just fine. I feel like having an Nvidia card is just herd mentality, meant for bragging rights. Same with Intel CPUs.
If the RX 8000 XT really is RTX 4080 levels of performance, they will probably sell it for 50 USD cheaper while still having worse upscaling, etc. AMD never learns.
$50 cheaper is just fine!
That's what I'm expecting too.
@@haukionkannel It never has been, though, lol. AMD's market share stinks.
@@nameless.0190 Does not matter; that is what we get…
Either a 5070 12GB at $700 or an 8700XT 16GB at $600-$650… take your midrange pick between those two…
@@haukionkannel I would pick the 5070 Ti over the 8800xt just because FSR is terrible.
If we get a 4080 Super for 500-600 USD, that would be the biggest grand slam since… the inception of the Grand Slam.
Just for once, AMD, use your brain. Call it the 8700XT and have it come in at $500 if FSR4 is not ready to rock in at least 3 games at launch. I've been having a blast with my XTX, but if the 8700XT is on par with the XTX in raster and goes +/- 5% of a 4080 when RT is cranked to max, then I will happily "side grade" for $500 and sell my XTX to someone who needs the 24GB, just to get the lower TDP. I went from a 5950X + 6700XT to a 7950X3D + XTX and I can most certainly tell the difference in heat output… And remember that there's a lot to a name. If it's called the 8700XT and swings at the 5070 in performance, but with 16GB, a hundred bucks cheaper, and the promise of FSR4, then that is going to be attractive to a lot of not-stupidly-brand-loyal people… And you WERE going for market share this round, right?
What you laid out really is the most sensible move for them (though I doubt 4080 perf in titles with path tracing).
@@travisjacobson682 Other sources reported, about six months ago, that they were targeting 4070 Ti Super levels of performance with full path tracing turned on. Since the 4080 is only 15% faster than the 4070 Ti Super, it is not impossible. RT is not some magic technology that is super hard to do right. I mean, Intel did it pretty well on their first attempt. AMD just didn't figure it would be such a big deal for people. And honestly it isn't. Everyone I know turns it off because 60FPS AVG with 37FPS 1% lows suck ass once you've tried 90-120FPS on an adaptive sync monitor.
@andersjjensen Targeting levels of performance is not the same as achieving them.
People don't care about RT because performance in RT sucks, not because RT sucks.
I agree it's not impossible dark magic, but it *is* incredibly hard to do. We know this because Nvidia has been struggling to do it well since Turing!
I would argue that only the 4090 is able to deliver an enjoyable 4K RT experience, and even that often requires concessions in some titles (for example, Black Myth: Wukong recommends 4K DLSS Performance mode when RT is set to Very High). The 4070 Ti Super and higher are in rarefied air, so I'm skeptical AMD will be able to flip a switch in their architecture labeled "RT" and join the party. AMD's performance in full path tracing games is woefully behind Nvidia's. I know they redesigned their RT cores in RDNA 4, but I just won't be convinced they'll match the 2nd/3rd best GPU in RT with a sub-300 mm², 250-ish W mid-tier offering until I see it with my own eyeballs.
I want to believe! Lol
@@travisjacobson682 Nvidia has not been struggling since Turing. They simply don't want to allocate that much die space to something that is useless to the vast majority of their customers (professionals). If RDNA4 has chosen to stagnate in AI performance compared to RDNA3 (which is already quite good, actually) in exchange for significantly beefed-up RT, I can see that happening. It is, after all, getting a half-node jump in lithography, whereas Blackwell is on the same node as Lovelace. So going toe to toe with the 5070 at node parity does not seem out of the ordinary to me.
My thoughts exactly! What we need is a card slightly better than Intel B580 from AMD, nothing too expensive. A card like that would sell like crazy.
Bye, Pat. If the 8800XT is really that powerful and AMD prices it right, it will be a massive hit. Biohazard is the Asian name for Resident Evil.
Markets don't work like that. Nvidia is a richer company, meaning they can afford to lower their prices if AMD lowers theirs to raise market share. Let's say the 8800xt releases at $300; Nvidia could easily price their 5070 or whatever at the same $300 and keep their market share. Because of this, Nvidia and AMD effectively settle on the higher prices they will sell their GPUs at, like $600, which customers would pay anyway. Most people foolishly ask, "Oh man, why doesn't AMD just lower their prices and outcompete Nvidia?" The thing is, Nvidia would lower theirs too and would be able to maintain lower prices, even at a loss, for much longer. What I'm trying to say is that Nvidia will still be better. Just like the 4070 Super is better than the 7800 XT, the 5070 will be better than the 8800 xt no matter what.
@@РаЫо Bro is high af writing all of that. No shit we know the Nvidia card is better; people just don't want to pay for that if they want to game. Tariffs threaten the cost of cards next year.
@@РаЫо I stopped reading when you mentioned Nvidia lowering their prices. That has never been the case.
@@dakbassett Jesus Christ. I said in response to AMD lowering their prices. How are people this dumb?
@@talontedperson1305 You will buy a 5060 and be happy.
If it's on par with the 4080 but costs more than $650, then it's pretty much pointless.
We need decent performance in the $600 range.
And good-performance GPUs in the $300 range.
Why is nobody talking about the $450/€450 range for GPUs? I think that's the sweet spot.
@@tontsar91 Agreed, we need a good GPU at that price range too.
A true to form ad for that recliner would be you falling asleep to a football game after a Thanksgiving meal
250W 7900XTX is kinda epic. It of course depends on pricing, but since I am stuck with AMD GPUs (linux gaming) I would definitely get one.
RE4 Remake as an RT title? And only 45%? If it had been Cyberpunk with path tracing, or Alan Wake 2, I would have been hyped!
imagine if nvidia had that with geforce 50 series, it would have been an absolute mind blast
So "8800XT" will be roughly a low cost, low power, RTX 4080 ... Nothing to complain about especially if that price / perf ratio comes out strong. I hope they do, then crash the price of the remaining 7000 series stock - clear out the old inventory, increase brand uptake, etc. If they want to focus on selling more, they need to destroy NV in price/perf for mid and mid high tiers.
Even when AMD had a better low-to-mid GPU at a lower price than Nvidia, Nvidia still sold more. It's not price or performance that gives Nvidia 88% AIB market share and AMD 12%.
45% faster RT than the 7900xtx and 4080-level raster performance with a lower TDP is not midrange; it's a top-tier GPU.
It's gonna be entry level high end once the RTX 50 series launch. Also don't forget that the 8800 XT is gonna be competing with NVIDIA's RTX 50 series not the 40 series.
@@laszlozsurka8991 Nvidia is gonna gatekeep any extra performance, and whatever they give you will be more expensive.
Doubt it's going to be 7900xtx/4080 performance. I think people are going to be very disappointed if they truly believe these rumors.
@@BigAndTattooed Yep, last time AMD lied.
It's gonna have to compete with the 5070 when that comes out. Hopefully it's priced appropriately.
Biohazard is the original name for the RE games in Japan, seems like Asia in general adopted the title with the west seeing them called Resident Evil.
@5:20-ish… remember, the 5700 XT was the top-of-the-line for the 5000 series, and they didn't call it the 5900 XTX, now did they?
That is why 8700XT would be a good name for this!
People seem to forget Pat Gelsinger came OUT of retirement to run the company in the first place. CPUs have such a long lead time that we're just _starting_ to see what would have been built under his leadership. Look how long after Lisa Su took over before AMD finally surpassed Intel, and even _that_ took Intel stumbling pretty hard.
Laying all the blame at Pat's feet is exactly the brainless shit I'd expect from an investor that demands profits TODAY, but DGAF about a company's health 5yrs down the road.
I bet for RDNA 4 they doubled the RT cores per WGP vs the 7000 series, and that's where the +45% RT comes from.
FSR4 had better still run well on RDNA3.
The ray tracing info does not make sense at all. Everything before this said that ray tracing is at 7900XT level and pure rasterization weaker than that.
I don't think so. RDNA4 probably just adds hardware BVH traversal to the ray tracing cores. Basically, RDNA2 and 3 did not have the same full hardware ray tracing as Nvidia; some things were still shared with the shaders, which is why Nvidia was just much faster. RDNA4 probably does the same as Nvidia and does everything in hardware.
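To make the "traversal in shaders" point concrete, here's a minimal, hypothetical sketch of the loop that reportedly ran as shader code on RDNA 2/3 (node layout and names are illustrative, not AMD's actual implementation; only the box/triangle tests were hardware-accelerated there):

```python
from dataclasses import dataclass

@dataclass
class Node:
    bbox_min: tuple   # AABB corner (x, y, z)
    bbox_max: tuple
    children: list    # inner node: child Nodes; leaf: empty list
    triangles: list   # leaf node: triangle indices; inner: empty list

def ray_hits_aabb(origin, inv_dir, bmin, bmax):
    """Slab test; this box check is the part RDNA's RT accelerator does in hardware."""
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, inv_dir, bmin, bmax):
        t0, t1 = (lo - o) * d, (hi - o) * d
        tmin, tmax = max(tmin, min(t0, t1)), min(tmax, max(t0, t1))
    return tmin <= tmax

def traverse(root, origin, inv_dir):
    """Stack-based BVH walk: as shader code, every push/pop here costs
    regular ALU cycles and register space that raster work also wants."""
    hits, stack = [], [root]
    while stack:
        node = stack.pop()
        if not ray_hits_aabb(origin, inv_dir, node.bbox_min, node.bbox_max):
            continue
        if node.triangles:            # leaf: hand triangles to intersection test
            hits.extend(node.triangles)
        else:                         # inner node: keep walking the tree
            stack.extend(node.children)
    return hits
```

Moving this whole loop, stack management included, into fixed-function hardware is roughly what the comment above suggests RDNA 4 does.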
More L2 is my guess. Ada has a butt-ton more L2 cache, whereas RDNA3 leaned heavily on Infinity Cache (L3).
The large amount of L3 was good on a macro level, which made it competitive and even scored wins in raster. The L2 cache on Ada is an advantage on a micro level, so RT, ML, AI, and derivative stuff like DLSS.
XTX comparable performance at GRE power sounds decent, so long as it's priced not-dissimilarly to the GRE.
Still a heinous amount of money for a GPU.
Battlemage will do well if they really have sorted out their drivers
And power consumption, and DX11-and-older support, and the general problems with video encoding, and I bet there's more…
The 4080 Super having an 83% lead over a 7900xtx in Cyberpunk 2077 is actually understating Nvidia's lead. Without RT, Nvidia is like 10% behind AMD in that game, because it loves RDNA3 in raster for some reason. When you turn RT on, most of the raster work isn't disabled, so for Nvidia to pull 83% ahead it first has to regain the 10% raster deficit and then pull 83% ahead of that. I don't think it would be wrong to say that in a pure artificial RT scenario, Nvidia would be 100% ahead of AMD, or about 2x AMD's RT perf.
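A back-of-envelope check of that "about 2x" claim, using the commenter's own figures (assumed here, not re-measured):

```python
# CP2077 figures quoted above (assumptions, not fresh benchmarks).
raster_deficit = 0.10  # Nvidia ~10% behind the 7900 XTX in pure raster
rt_on_lead     = 0.83  # 4080 Super ~83% ahead with RT enabled

# To end up 83% ahead overall despite starting 10% behind in raster,
# the RT-specific advantage has to compound with the raster deficit:
pure_rt_lead = (1 + rt_on_lead) * (1 + raster_deficit) - 1
print(f"implied RT-only advantage: ~{pure_rt_lead:.0%}")  # ~101%, i.e. ~2x
```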
Let's play Nvidia's gimmick game to test an AMD card and explain how Nvidia is better…
Are we in some kind of GPU hell where all cards other than Nvidia's flagships approach an asymptote at 4080-level performance, while the Nvidia flagships alone go beyond it, with increasingly outrageous pricing?
You know what I learned from buying a flagship 10 years ago? Support is greater on cheaper cards with 5% market share, not the top card at 0.1%. If indie games used traditional DX11 features only, 90% of people would be happy with 100+ FPS and no damn DLSS, TAA, or FSR frame gen. UE5 truly is the culprit: a badly optimized engine plus devs with little time to optimize equals pumped-up mega textures and a blurry game at 60 FPS on a 4090. Yet the most-played games run on iGPUs at this point.
Nvidia is pushing more and more wattage through their GPUs to push more performance.
I just hope what happened to intel chips wont happen to nvidia chips in the future.
@@sc9433 I hope it does; it's the only thing that would break their pseudo-monopoly on the GPU market.
People complaining about the GPU market is so overblown; you can play everything with low-end hardware now. Why do I even want a 4080 to do literally 200 FPS at 2K?
@rotm4447 That low-end hardware costs what high-end hardware did 10 years ago. Also, the fact that you made that statement proves you either don't care about money or don't play newer games. A 4060 Ti is immensely overpriced for a 60-class card. That 4080 will be minimum spec in a short time with how incompetently optimized games are getting. The only people "having fun" are the loaded gamers Nvidia knows will pay anything.
Alchemist did "overperform" in synthetic benchmarks, sure, but that's actually just because drivers and game optimizations couldn't use all the hardware it had; if they could, Alchemist would've performed a tier above where it was placed.
I hope battlemage is good and I hope the 8800xt is as good as the leaks say. Let nvidia have the $2000 market, we need the competition in low and especially midrange.
Nvidia has 88% AIB market share and AMD has 12%. Even when AMD's low-to-mid tier is cheaper than Nvidia's, people still buy Nvidia. Nvidia is the equivalent of Apple: people will buy a $1,400 iPhone cause it's Apple, and people will buy Nvidia cause it's Nvidia. Btw, the 4090 sold out on day 1 of release; it didn't even take one day. On the Steam survey, there are more 4090 users than users of any single AMD GPU. We're talking about the 4090. The reason for the lack of competition in every tier is not the price.
@@SapiaNt0mata Probably because AMD is non-existent in the laptop GPU market, and on desktop Nvidia is available all over the world.
Stop blaming consumers for nvidia being the only choice
@SapiaNt0mata I buy Nvidia's GPUs because they perform a lot better in RT, consume less power, and cost the same as (sometimes even less than) AMD's GPUs in my country.
@@jesusbarrera6916 There are an estimated 1.9 billion PC gamers (not laptop), so I don't think laptops are enough to justify the difference in market share. I don't know about laptops, but AMD sells even in Lithuania, so I don't know how you can say Nvidia sells all over the world and imply AMD doesn't.
You tell me not to blame the consumers, while you make false claims, like AMD not selling worldwide, to blame AMD.
@@TheSEWEGI The best-selling GPUs are the RTX 4060 followed by the 3060, so the majority of PC gamers can't really enable RT. And if we include laptops, most laptops sold have a 4060 or below, so the majority also can't enable RT.
As for power consumption, the average consumer has no idea about this, so they don't buy Nvidia because it consumes less power. And what is this utter nonsense about "consumes less power"? It's not like the electricity bill increases by $200 per month; the increase is negligible and not worth making such a fuss about. Btw, do you have any idea how much money you waste without needing to? I don't see you talking about that waste, but you draw the line at paying a few dollars per year for power consumption. Absolutely laughable. Peak comedy.
As I said, most people buy Nvidia the same way people buy iPhones. It's brand recognition.
I feel like ray tracing gets such disproportionate focus. I almost never turn it on, it's such a performance hit for what it gives.
The 8800xt mustn't cost more than $500, and tbh I wouldn't expect more than 7900xt levels of perf.
If it's really 4080S performance, it'll be ~15% faster than the 7900xt. $500 is a bit unrealistic though, imo.
@@l1nthalo196 The 7900xt rn costs around $600. So they will release the 8800xt for $600, and lower it to $580 a week later.
@@nameless.0190 Could be. I'm thinking of selling my 7900 xt for $500-550 and then buying the 8800 xt. If the lower consumption is true, it would be great (I live in Europe ☠️).
I can't wait for the 5090 third-party benchmarks.
Okay, I know we Gamers get older, with my taste in classic games and especially remembering the 8800XT(X) that came before this one, I'm already feeling quite old sometimes, but that chair ad still scared me. Yes, I know I was already alive when Tomb Raider 1 came out, but I wasn't when Buddy Holly died - could you youngsters please at least pretend I'm still fit enough for a gaming chair? :)
That chair has your name on it old man😂😂
This might be wishful thinking. If AMD does release a 4080 Tier GPU and price it aggressively, it would be a step in the right direction. But given RTG's mismanagement when it comes to PR, I'm cautiously optimistic.
Pat getting fired is bad news… Intel is doing even worse than I expected. Firing an engineer CEO is very, very bad.
He wanted out.
It's not just firing, lil bro.
He couldn't take any more of people shitting on their gaming performance, 0% Arc market share, and AMD dominating CPU sales (the top 10 CPUs sold on Amazon are all Ryzen),
13th/14th-gen desktop CPU degradation and denying customers RMAs, Arrow Lake having tile teething problems,
and getting murdered on the stock market: the share price is down 50% from last year, dropping all the way to $18 when they were dominant at $60 in 2020-2021.
🗑🗑🗑
@@BlackJesus8463 Idk man… you don't become a CEO just to get out asap. Also, the guy looked enthusiastic and excited. I actually kinda liked him; it's gonna be sad when he's replaced by some soulless marketing corpo. Engineers are a different breed.
@@despairdx That's just Intel. The guy was there for a hot minute; things like this don't turn on a dime. Intel is worse off without him.
Intel was valued on the stock market at $250 billion 4 years ago; look at how much the company is worth now: $100 billion.
The company is ready to be sold off in pieces.
Only the government could save them from being sold off in pieces.
I think if they want to communicate budget/value, they should call them 8800xt/8700xt, reminiscent of the 480 and 470, if that's what they are going for.
also reminds me of the old 8800gt from way back
If only there were humans who knew both Chinese and English. Wccftech would pay their weight in gold, I tell you!
According to information obtained from manufacturers and employees of certain companies, the 5000 series from Nvidia looks terrible in terms of specifications and price.
The Intel Arc A570 and lower models look good: $160 for the 8GB model is a perfect match up against the RTX 5050 4GB at $220.
9:00 yes Biohazard 4 is the Japanese and Chinese name of RE4
As long as the new flagship is not worse performance-wise than the 7900 XTX and efficiency improves by 25%, it's still good. Slap the 7900 GRE price on it and it'll sell like hot cakes (like the 9800X3D). The 80/90 gap has been steadily widening; price-wise, the 4090 is more than 2x the 4080 Super in the EU, 223% of the price for like a 30% performance uplift in my country. It's ridiculous. Not that the 4080 Super is any good value; the 7900 XTX beats it fair and square. You pay the 25% Nvidia tax for DLSS and a bit of ray tracing, if you dig that.
I was skeptical of their new strategy at first, as a regular Radeon flagship buyer and enthusiast, but it could be a good temporary stepping stone if they pull this off right. They wrecked Intel, which was deemed unsinkable and dominant at first. Hell, I used to pair my Radeons with Intel i7s back in the day (like 10 years ago). The problem with Polaris was that it only competed with the 60-series at most. A good lineup of cards up to the 80-series level that beat their respective competitors on price/performance is better than trying to release a 5090 competitor and neglecting everything else to do it. The future looks good for Radeon; please don't mess this up, AMD.
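For concreteness, the perf-per-euro math behind that 223%/30% complaint (prices are the commenter's local EU figures, taken at face value, not verified):

```python
# 4090 vs 4080 Super, using the numbers quoted above (unverified).
price_ratio = 2.23  # 4090 at 223% of the 4080 Super's price
perf_ratio  = 1.30  # ~30% faster

value = perf_ratio / price_ratio
print(f"4090 perf per euro: ~{value:.0%} of the 4080 Super's")  # ~58%
```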
IF those leaks are accurate (I'm taking them with a big 50 lb bag of salt), I'm excited to see what people can do with these cards and the Elmor Labs EVC2. You can push the 7900 XT/XTX up to at least 600W for some performance improvement (albeit with pretty big diminishing returns past like 400, maybe 450 watts). So if they turned the power down from the previous gen just to turn it down, and not because the silicon requires lower power, you may be able to give it that 25% extra power or more and see some benefit.
The 7900XTX runs slightly under 400W at *stock* speeds… this leak is suggesting the 8800XT will be a 300W card.
@@glenndoiron9317 that is correct... that doesn't change any of what I said...
There's a rumor that Seasonic listed the RX 8800 XT at 220W, so if this GPU is going to be ~275W, AMD might call it the RX 8900 XT. If it were called the RX 8800 XT, it should cost $500-600, which would be amazing for RTX 4080-like performance. I think it'll be more like $650-750, still not that bad. The actual RX 8800 XT should deliver something between RTX 4070 Ti and Ti Super performance for $500-600.
Ray tracing performance is a bigger deal than people make it out to be. A lot of new games have some ray tracing on by default, which will make Nvidia GPUs outperform AMD GPUs in spots with heavy rays.
When the lighting system is completely run by ray tracing, it will matter and look amazing. But that is many years away from being practical. The most popular GPU is the 3060, so ray tracing will be an optional gimmick for a long, long time.
Ref: Gamer chair.
I have a knackered spine, and sitting at a desk causes unbearable pain. I had an MRI of the top half of my spine ~2 years ago; the basic comment was that "not a single vertebra or disc was correctly aligned".
This is why my PC’s monitor is a 65" LG TV that has VRR (AMD Freesync) from 48 to 120Hz, I use my PC sitting in a comfy recliner armchair, about 8-9 feet away. I just have two long USB extension cables so my keyboard and mouse are used with a plywood board that sits across the arms of the chair.
I’ve suffered back pain most of my life, which is common for people like me who are over 6’ tall. I think the rapid decline was caused by close to 6 years commuting to London by train. I remember taking my dad for a check-up before having a heart bypass. On the train he said to me: “I’m certain that somewhere there must be a hunched back midget who finds these seats comfortable”
Oh boy! This guy who found a working 4090 in a trash can is blessed! I would like to have this kind of luck.
Unfortunately, the chance of being that lucky is 0.000000000001%.
Common-sense statistics: there are 7-8 billion people in existence; how many of them would have that type of luck?
Idk, the card could have had damaged cooling or internal damage, but if he tested it and it ran perfectly, then he's lucky af.
Better RT is fine, but I'm here for FSR4.
So the RX 8800 XT (if that's what it's gonna be called) is gonna match a 4080 SUPER in raster performance, so it's gonna be competing with the RTX 5070 Ti.
That's good, since the 5070 Ti is gonna be an $800 GPU, so the 8800 XT at around $500 is gonna be a good alternative to NVIDIA's next-gen entry-level high-end class.
Highly doubt they'll price it that low; $600 at the very minimum.
Nobody said there won't be Intel Arc dGPUs. They are just not chasing the high end any more. The A770 wasn't high-end, but they were *aiming for* the 3070; it had delay after delay and didn't hit the target. With BMG they are not even *aiming* for the 4070 anymore.
If the 8800xt is $700 max, it's gonna be an insane deal.
For $700 plus 100 bucks you can buy the 7900xtx, imho.
Basically, yeah: if they make the current 4080/7900xt/xtx-class cards affordable but also boost the RT to be competitive, then they'd fly off the shelves.
So far, AMD has lagged a generation behind with each iteration, but as we all know, the 20-series cards were beta products that couldn't actually ray trace at any significant speed.
The 3080/90 were the first to actually be usable at sub-60 FPS, but I'd argue the 4090 is the only full RT card, cause only it can ray trace at 60 FPS in most games.
Boeing made a move similar to Intel's, replacing engineers with finance and marketing bods; that went well.
Boeing fired almost all of its quality control. I mean, it doesn't take an engineer to bolt the doors in.
@@BlackJesus8463 Boeing also stopped using washers under bolts in the fuselage. Flying that is a death wish.
Boeing conducted a buyout of a smaller, troubled company by the name of McDonnell Douglas, which had quality issues. Post-merger, people from that company were eventually promoted to run things at Boeing. They ran it into the ground for profits, no pun intended, continuously cutting the R&D budget to buy back stock. Boeing essentially got "Trojan horsed" by the smaller, garbage company.
Is it possible that the RE4 RT result is actually the low point? Because if it's theoretically a game where RDNA3 didn't hit the RT-core bottleneck, it means the card was possibly running at full speed.
If it's 45% faster in a non-bottlenecked scenario, does that mean it will do even better in scenarios like Cyberpunk, which overloaded RDNA3's cache setup with the ALUs, which is where the bottlenecking came from?
Just go with AMD: more VRAM, comparable RT, and AI-based FSR4 (PSSR-like AI upscaling). I am sure the unreleased 5070, 5060 Ti, and 5060 are already obsolete, and they will make good e-waste.
PiSSeR is a mess of an upscaler.
The 5070 will destroy the 8800 xt. We can come back to this comment to confirm.
The 5070 is going to be faster and probably only $100 more than the rx8800xt lol.
@@РаЫо Ah, I see why you were high in the last comment. You are an Nvidia fanboy.
@@РаЫо They'll probably be pretty similar, with the 5070 pulling ahead in heavy RT titles (AW2, CP2077, BMW, etc.), but also falling behind at 4K due to 12GB of VRAM.
I love new GPU releases, as always, because older gens go cheaper.
I purchased the Sapphire Nitro 7900xtx 4 months ago and I couldn't be happier with the performance. 120fps @ 4K epic settings in most AAA games!!
Nice cards but too expensive.
Which games? Do you use upscaling?
Are drivers still an issue? I was considering that myself, but I’m worried about all the people I heard had crashing issues.
@@inkara3771 No, they are lying. I went with a 7900 GRE 2 months ago: 0 issues, and I'm happy to use Adrenalin on top.
It also runs quiet and cool, and with some smart options in Adrenalin you can lower energy consumption without losing much performance if you care, or you can OC super simply.
@@rasluffi When blud's talking about AAA games and says 120 fps at 4K max settings with a 7900xtx, you know he means 2019 or maybe some 2020 games.
He would never include Alan Wake 2, Cyberpunk 2077, Wukong, Stalker 2 (and the list goes on: Star Wars Outlaws, Horizon Forbidden West, etc.), and would never mean 2022-2024 games.
It's like the well-known scam icon on PS5 or Xbox boxes: "4K@120 fps".
We absolutely should stop associating number schemes (xx60, xx70, xx80 or x6, x7, x8) with expected performance or performance tiers. The formerly sort-of valid "70-level performance" and similar phrases have become completely unreliable. Pay attention to specs, performance, and price, not numbering schemes.
If this new RDNA4 GPU (the 8800XT?) is as good as this so-called leak apparently indicates (note all those backhanded disclaimers), and if it retails for
I personally wouldn't buy a graphics card with anything less than 16GB of VRAM. Once you're at 4K, it becomes necessary; I constantly had issues with 12. OLED monitors need to fix their subpixel layout so text is legible. People who can afford them have to work to pay for their gaming habit, and you can't work on OLEDs comfortably at the moment.
Even at 1440p I wouldn't want anything less, with how some of these modern titles are absolute pigs on VRAM usage.
I am a 3D animator and I have been working on my 32-inch 4K QD-OLED display for over 8 months now without any problem.
If you can afford a gaming OLED, you can afford 2 cheap monitors for work and can save the OLED for games and movies.
AMD has a history of using a lower number when they have a smaller GPU; see the 5700 series, for example.
Ray tracing is the most overrated and useless tech feature since 3D TVs.
Coping hard, let's be real.
You said the same about DLSS, and the same about frame gen until AMD added it. We know how you AMD fanboys operate lol.
@@nameless.0190 Guess you haven't seen the HUB video yet.
amd fanboys getting mad
Pure truth. No one wants to look at puddles of mud to see reflections; you want to play a fun game.
@@nameless.0190 It's because too many games are "meant to be played with Nvidia", cause they're lazy fkers and want to auto-tune their games instead of doing it manually. So let's force other manufacturers into the same gimmicky software/driver crap because the games demand it.
Well hey, like always, AMD never misses an opportunity to miss an opportunity, so they'll probably price this at the $650-$700 mark, making it not very competitive at all.
I hope Pat isn't having health issues. It's hard to imagine Intel's doing so badly that he'd pull a "resigned effective immediately" move over that (or he got forced out) ... but maybe. The timing and sudden departure are bizarre.
I mean, they're at one of their worst moments in history. Pat has made some big mistakes recently that cost the company a lot in lost deals, I imagine he was encouraged to retire
@@jonah11111 Fair, and probably the most likely explanation(s).
What do you mean "imagine"? Intel is literally a dumpster fire
Nothing bizarre; the company lost well over $150 billion in value in 3 years, and there are 3 companies interested in buying Intel, in pieces or as a whole.
intel sank
The bugs with the Intel 200-series CPUs weren't great, but otherwise I would say Pat Gelsinger did fairly well in a difficult situation that he didn't create. I certainly wouldn't replace him with marketing folks unless he had done something really bad. Why not let him stay until a replacement is found? This just looks bad for Intel as a whole.
If I were Intel, I would have focused more on enterprise NPUs, because afaik tensor chips are relatively straightforward to design, TSMC is the great equalizer, and Nvidia products are 40x overpriced, which leaves an incredible margin to disrupt. I know there is a bit of a software moat, but if you can get a neural warehouse for 1/10 the price, you can adapt some software. Meta alone has spent ~$25bn with Nvidia; cutting that down to $2.5bn would put Intel on the map and send Nvidia's stock price tumbling. AMD should have done the same long ago; I don't know what they are doing. Those cards cost less than $1,000 to make, so how are they not absolutely destroying Nvidia's $35k price point? Something strange is going on.
Speaking of AMD's neural fumble: if RDNA4 doesn't have strong tensor performance, they have seriously failed. They also have a big opportunity to beat Nvidia by putting much more RAM on the cards. RAM has dropped to a fraction of its old price, and Nvidia is trying to give us the same RAM for the 3rd generation in a row, which is especially grievous given how memory-hungry AI is. AI is coming to games, and it's going to be big when NPCs aren't just dumb puppets flailing about but interact like they're real. The language model to make them speak could alone swallow 12GB.
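A rough sanity check on that 12GB figure; the model size and precisions below are hypothetical, just to show the arithmetic:

```python
# Weights-only VRAM estimate; KV cache and activations come on top.
def model_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

print(f"{model_vram_gb(6, 2):.1f} GB")    # 6B params at FP16 -> ~11.2 GB
print(f"{model_vram_gb(6, 0.5):.1f} GB")  # same model at 4-bit -> ~2.8 GB
```

So a mid-size FP16 model really would eat most of a 12GB card, while quantization brings it down to something a game could plausibly budget for.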
He quit, to stand in line at the nearest Microcenter for an Arc!
The 8800xt would need 2x the RT of the 7900xtx to match the 4080 in RT.
"quit"
I love how you went to the source and broke down your interpretation for us step by step
In the beginning, I think I saw it said that the new card is 45% faster than the 7900XTX in rasterization but only equals the 4080. I was confused, cause the 7900XTX is usually faster than the 4080 in raster; I assume it's a slight improvement in raster to equal the 4080, while its RT also equals a 4080? If so, and it's priced around $600 or cheaper, then that will be a game changer for PC gaming in price and availability.
I'll keep an eye on this release. If I can get good performance at a reasonable price, I will forget about Nvidia GPUs. I can't get an Nvidia GPU anyway… it seems like they purposely control production to create artificial scarcity.
So now AMD is competing with rivals' last-gen GPUs.
AMD is competing with the 5070…
@haukionkannel Let's see how that goes.
@@haukionkannel Nah, the 5070 Ti, because the regular 5070 is gonna be around a 4070 Ti SUPER. The RTX 5070 Ti is what's gonna be at around 4080 SUPER level.
@@laszlozsurka8991 This GPU is not gonna be at 4080 level… this leak is bogus! The RT part makes no sense at all. A game where the 7900 is as fast as a 4090 in ray tracing… is not a ray tracing game.
5:18 - By that logic, the 5700XT-Platinum Edition would've been called the 5900XTX.
Bro, if the fastest AMD RDNA 4 card is only as fast as a 4080, then they haven't only given up the halo segment; they have given up competing at the 5080 level too.
Meaning the fastest RDNA4 will probably be at 5070 level, talking about gaming with ray tracing on, because that is the only claimed data point we have.
The real question is how AMD will stack up in heavier titles with path tracing and any AI load, because that is where they have been massively behind.
You realise that the 4080 is a high-end GPU, right… the 4090 is enthusiast level.
@@Jasontvnd9 Yes, that's exactly my point.
You realize that RDNA 4 is not competing with the 4080 and 4090, right?
But with the 5090 and the 5080.
They will only offer 5070-level performance, is my bet.
RTX 5080 = better performance, $1200+.
RX 8800 = maybe 25-30% slower, $700 or less.
Yeah, I think I know which GPU I'm picking next gen.
It's unlikely that the 5070 will catch the 4080 anyway, plus it's 12GB again. The 8800 xt should be faster with more VRAM; the problem is price. The most-sold GPU is the 3060, and AMD doesn't care about higher-end products this time. Besides, when they build them, nobody buys them, so why bother.
@@cajampa AMD has said exactly this; you're not onto anything new lol. Yeah, the 88xt will be between the 5070 and 5080 in raster and will beat them on price. Just like right now, 4070 vs 78xt is a blowout in terms of raster value; the question is whether enough people care about value. 4080/90 buyers don't gaf.
IF those numbers are true and there are no other issues, then this card would go insane if it were sold for $499. Granted, that's very unlikely; probably it'll be $549 or maybe even $599, which tbh for 4080/4080S performance is still really good. But damn, if they price it at $499, then holy crap, that should take market share. If those numbers turn out to be true, I'm sold.
I can get frame gen with LSFG, so I'm looking forward to this, as I have no use for Nvidia frame gen.
I can totally believe this one. AMD knows they need to sort out ray tracing and bump up RDNA with RDNA 4 this time round to stay at all relevant this generation, considering they aren't competing at the high end this time around after botching their high-end cards.