The AI band guy might be okay. A district court ruled that for an offense to rise to the level of fraud, you had to specifically intend to make your victim poorer, not merely hope to make yourself richer. Now all he needs is for the Supreme Court to sign off on this bold new interpretation.
While it'll, purely theoretically, suck for the wealthiest elite gamers who already didn't care what the price was, if AMD could release cards in the 4060/4070 range at two-thirds of the Nvidia price or less... they could easily build market share and developer support very fast and render Nvidia a niche brand... I'm currently forced to use Nvidia due to CUDA support, though... but finally AMD is getting smart about that too
AMD can do that and they're still not going to gain market share, because all Nvidia needs to do is adjust their pricing and they still win on sales. To beat CUDA, AMD needs to spend more money and effort to increase ROCm adoption. Simply going open source here and there is not going to cut it.
Awesomesauce video Paul and Joe! Now I feel even worse, not picking up that open box Sapphire 7900 XTX while I was at Microcenter yesterday. So much for buying a higher end AMD GPU. 😢 I guess I will go back to dreaming of Captain Crunch that always goes so well with PHTN. (Too much sugar for my system.) Caffeinator out.
RDNA 4 is said to be highly competitive in the low-end and midrange. Battlemage is also expected to be competitive in the low-end and midrange segments. Intel has made massive progress with their drivers for Alchemist. I don't think drivers would be nearly as big of a problem for Battlemage at launch. If RDNA 4 and Battlemage can deliver powerful midrange GPUs at reasonable prices, there is a good chance AMD and Intel can gain quite a bit of market share, and that should hopefully pressurise Nvidia to do better. Competition can make that happen. It's the key to a healthy industry, and it benefits us consumers.
@@sultanofsick Hard agree. I use a 6750 XT, which is faster, has more VRAM, and was cheaper than a 3070. But DA RAYTRACING!! idgaf. When ray tracing is in 99% of games I'll care.. but it's in 0.1% of games. So to me it's a non-feature.
Highly competitive with what? The current Nvidia cards or the 5000 series? Because RDNA is already barely able to compete with the 3000 series, and once again I fear AMD will deliver an underperforming and overpriced product compared to Nvidia.
I hate that saying, because it's not even true most of the time. Normies don't give a fuck about racing. They buy a 1.0L Peugeot 207 because it's cheap, not because Red Bull won F1 again or whatever...
@@careem3463 It absolutely works like that here. People buy Ryzen 5 5600 because later they can upgrade to a 5700X3D or 5800X3D and have one of the best (formerly the best) CPUs for gaming.
@@careem3463 It works like that in a different way. If Nvidia's GPU architecture is better at the high end, then it is also better at the low end. The only thing that happens by AMD not competing at the high end is that Nvidia completely dominates the high end and we get the status quo for the midrange and budget GPUs.
In case you haven’t noticed, many gaming companies are closing, and even developers at Microsoft are being laid off. The gaming era might be reaching a plateau, if not a decline.
So are Spotify and Apple at all going to reimburse the ad costs to the advertisers who were charged for all the bot views? I can wait for the answer hahaha.
Gamers need to wake up and realize they're no longer the primary concern of these GPU makers. Nvidia's last quarterly earnings showed that very clearly: AI/DC revenue $26.3 billion, gaming revenue $2.9 billion. AMD shareholders see the same thing, and the message is clear: why focus on gaming when the big money is in AI/DC? A lot of gamers with more dollars than sense and an incurable case of FOMO buy into Nvidia's overpriced marketing, which simply keeps justifying Nvidia's price gouging of the consumer. Gamers need to get off this fixation that GPU companies really care about them, when the price/performance of the various SKUs has been dropping continuously since the crypto boom.
Modern midrange should be called high end. The 4090 level of performance might as well be called extreme. So AMD is not competing in the extreme line of GPU products.
Most PC parts with a 9 in the name are meant for extreme/enthusiast use: Intel Core i9, Ryzen 9, 4090, 7900 XTX. In the past, the 9-level parts never really existed; Nvidia GPUs ended at the 80 series, and Intel only had i7s.
Intel doesn't deserve subsidies. The CHIPS Act was always bad legislation. The government should not be putting our tax dollars towards giving the richest corporations on the planet some tax breaks and subsidies. No more socialism for corporations while actual people are struggling in so many ways. I hope the government never cuts those checks for Intel. Put it towards something that can benefit people, like clean air in schools and buildings so we aren't spreading as much disease, school lunches for kids like we had during the height of the pandemic and it worked really well for everyone, or start giving everyone universal basic income and health care. 'Socialism' for the people, not for the corporations.
From what I can tell it's a matter of national security. China has its eyes on Taiwan, where TSMC is located. China doesn't necessarily have to take over the island by physical force to access the technology and designs; with its close proximity, they may already have spies in the fabs or be working towards getting in. The US building its own means the technology and designs are less likely to get into China's hands. With the AI era just heating up, any country that wants to retain or increase its dominance/influence in the world would do well to have secure access to chip manufacturing technology and designs. I'd feel better if it was mostly a military project (similar to the way the internet came about), with Intel acting in a consulting/collaborating role, since Intel is a publicly traded company with a "fiduciary responsibility" to its shareholders. If this act does go through, I'm sure there's something in the contract about not using the funds for share buybacks, but all it takes is some tricky accounting to skirt that stipulation.
I believe this in my heart to be true. It's maybe 10% the fact that NVIDIA has destroyed them and 90% that they don't give af about spending $ on gaming GPUs.
Moore's Law Is Dead reported a long-ass time ago that top-end RDNA4 was cancelled because it was a way too ambitious chiplet design that didn't pan out. AMD wanted to use the same bonding technology as 3D V-Cache to make top-mounted silicon "interposers" that straddled the GCDs, but for one reason or another it didn't pan out. So we are just going to get the monolithic variants this time around.
Man I hate American pricing on everything. Lol. $1,600 USD ($2,175 CAD) for a 4090? The cheapest you'll find a 4090 near me brand new is $2,400 Canadian Pesos ($1,765 USD). Every time I watch a video saying "Build this awesome PC for $700!" or read a comment like "For $700, you could build a PC that would destroy the PS5 Pro", I call bullshit. I could use the exact same parts they use in the video and still come out over $1,000 USD. My PC was around $1,600 CAD (obviously just the tower) when I built it back in 2019 (literally right before anyone had ever heard of COVID): Ryzen 3600X (now a 5800X3D) and RX 5700 XT, B450 Tomahawk mobo, 16GB of 3200 MHz RAM, 500GB M.2 SSD (boot drive), 2TB SSD (later added another 2TB SSD). By no means a "high end" gaming rig. But I couldn't even consider building a whole new rig today with prices the way they are.
Radeons should always have been muuuch cheaper than they are, to be a viable option. Now that they have realized it themselves, this should be good for AMD. Stop wasting money on the high-end stuff, as they can't match a 90-series card anyway; the nVidia product package is just overwhelmingly greater, which is also bad for consumers, due to there being no real competition. Also a reminder: AMD really has shot itself in the foot by not taking advantage of all the chances it was handed; such incompetence. So AMD needs to make lower-tier cards that they can actually sell, like they are going to do. The high-end "battle" wasn't really a thing; it definitely wasn't a balanced rivalry. Intel and AMD now just have to fight over the leftovers. I still sense hard times for Radeon, since Intel's GPUs are already pretty damn good propositions.
Hey Paul, nice MATS! But I'm more of a GN desk mat guy. I like 'em big and thick, so it looks like they're coming out from under the monitors and reach the edge of the desk.
2:10 didn't they do a similar thing with the 400 series? That and Nvidia's 10 series were considered some of the BEST GPUs to get, pre-DLSS and A.I.
It is sad how they raise the cost and make their boards as cheap as possible. I was doing research on the Gigabyte Ice boards to find one that has an 8-layer PCB, and only the Master comes in 8-layer; even the second board, the X870E AORUS PRO ICE, comes with the cheaper 6-layer PCB. They state that the 8-layer PCB design effectively lowers component temperatures through its high thermal conductivity and low impedance, while the 6-layer only rates Mid-Loss. It is a shame you can't even buy a quality motherboard in ATX.
i mean, on the top end it needs more than just competitive rasterization performance, while up to mid tier, providing better raster performance is somewhat compelling and makes people who aren't fanboys at least consider a Radeon over an Nvidia card; similar to how people might consider an Arc in that market segment now. AMD currently only has the usually better driver support for games going for them in that segment compared to Intel, and Intel is still constantly improving Arc drivers, putting pressure mostly on AMD...
Adored tried to tell us where Radeon was headed a few years back, and now it's official. AMD is allowing Nvidia to set high prices, then slotting its high-margin, low-effort (cost-effective to develop) GPUs within the pricing stack. They stopped chasing the perf crown with Fury; though really, they stopped being competitive when the GTX 600 series launched, and have been behind since. The GPU war is (has been) over.
[Question] It's very interesting, and a question comes to me because I'm a novice. So we are talking about AI with the next FSR 4, maybe compatible with the 7000 series, which would make sense. Is it about "ROPs"? I'm trying to understand, because if I compare these render units, the manufacturers' cards have: 192 on the 7900 XTX vs 112 on the 4080 Super vs 176 on the 4090. So could we expect something even better from AMD on the 7000 series than what NVIDIA currently offers? And more on RDNA4, of course?
AMD really needs to get into prebuilts if they want to grab market share, where a lot of people just buy a "gaming" computer with no thought for the internals.
I think Strix Halo is their attempt to do exactly that, by targeting laptops first. Being the first gen, it will likely be limited to higher cost devices, but should those do well, next gen will see wider availability
High-end GPUs are currently just high risk/low reward for AMD. People buying those cards are very set on Nvidia, and AMD needs around 40% better performance/price in the high end to get any sales. Most people crying about AMD skipping the high end are just people who want Nvidia GPUs cheaper...
Raw greed. But perhaps it's a good thing. Perhaps developers and gamers will realise that chasing more performance is a needless game, and we could have a GPU release cycle more like the Elder Scrolls, forcing devs to optimise for the hardware we have, not what they want us to have. And perhaps this makes it easier for new players to enter the consumer market, and hopefully nvidia and AMD pay a hefty price for abandoning their core base.
No company is magnanimous, but you have to understand that this strategy IS the one that got us Ryzen as good as it is. The framing "turning their back on the gaming market" is, I think, just the wrong way to look at it. The eventual UDNA merge means we'll get all those datacenter benefits down the line. They're going to be designed to scale up to mega-chips that take entire racks with chiplet-based designs. Once that's sorted it becomes (relatively speaking) trivial for them to scale future generations from low end to high end just like Ryzen, pretty much only constrained by power for how many chips they can cram onto the highest-end one. RDNA4 is midrange only because it's a stopgap, a write-off that isn't worth the effort of making a fundamentally different set of cards like the 7900 XT and XTX were. I don't know if it'll be RDNA5 or UDNA6, but one of them is going to come back to the high end, assuming they can pull off chiplet GPUs as well as they managed Ryzen. And the 7900 XT and XTX aren't truly chiplet GPU designs; the IO and memory controllers are split off, but all COMPUTE DIES are monolithic. When they can split off and combine compute dies is when they'll stroll back into a full product range, because it'll cost them very little to do; the development will have been funded by the datacenter side, which needs them to figure out that problem at many times the scale.
Hey Paul, how are life, family, BitWit and YouTube these days? Matt here, long-time viewer. Love your skits and scripts. Was just wondering if you use AI to write your scripts or if you just think of them yourself. I think they're really good. Oh well, keep up the great work!
When comparing the rumored motherboard prices (9:55), it seems this compiled list is not considering tax. The compiler seems not to have simply converted GBP or EUR to USD; rather, the USD prices seem to be based on the European prices including tax, while you are used to seeing $ prices without tax. So still expensive, but not by that much.
Silicon will not reach 10 GHz; it would require sub-ambient cooling just to inhibit resonance in the silicon at those frequencies. Silicon starts "ringing" at ultra-high frequencies.
Unfortunately, AMD already smokes Nvidia in the midrange, and people still aren't going to buy it. I mean, I will, but I already took the chance with my 6800 XT and it was an absolute win. Most people will just believe the hype and overpay for Nvidia.
I would not be surprised if the AAA gaming implosion of recent months was another reason why AMD is walking away from high-end GPUs. Future demand for graphics cards is likely to slump. Time will tell.
Before I watch: so, trouble making GDDR7 stable? Or is it so much better they don't want to wreck their own gravy train? Haven't we been waiting on GDDR7 for a couple of years?
There are a number of parameters that go into a business decision, and we don't know the majority of them. Personal opinion, but there is no point for AMD to focus on a high-end consumer product. It's a high-cost, low-volume product, which doesn't really fund things further. There is a lot more money to be made using their wafer allocation for DC products (AI in particular) for the next 2-3 years. Focusing on fewer SKUs (i.e. fewer tapeouts, etc.) and a better-value product (hopefully) to grow market share is a better goal for the short-to-mid term. UDNA is also fixing a mistake. Every Nvidia GPU for 10+ years (IIRC) has been CUDA capable. That means every student was able to write code, test it locally on their gaming laptop/desktop, and then scale it up on their university supercomputer. You can do that on a low-end Nvidia gaming GPU. This helped a lot in making Nvidia the number one choice for anything GPGPU compute: HPC/scientific, AI, etc. Checking, and it seems the GeForce GT 430 was the first GPU to support CUDA (released in 2010).
Yep, I've been watching the growth of GPGPU since NVidia announced and deployed their CUDA SDK for their GPUs. And note that the first real breakthrough research in AI (deep learning) was in 2012, in computer vision (AlexNet), using two 3GB GTX 580s. That tells you that NVidia's investment in CUDA wasn't a joke or a distraction, as it was characterized back in 2010 by some gaming publications and journalists. According to the wiki, the first GPUs supported were even earlier: the GeForce 8800 Ultra, GeForce 8800 GTX, and GeForce 8800 GTS (G80) in 2007.
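To make the "write locally, scale up later" point from these two comments concrete, here is a minimal sketch of the kind of CUDA program a student could prototype on any CUDA-capable GeForce with nvcc and later rebuild unchanged for cluster hardware. This is an illustration only, not code from the video or these comments:

```cuda
// SAXPY (y = a*x + y): the classic first CUDA exercise.
// Builds with: nvcc saxpy.cu -o saxpy
#include <cstdio>
#include <cuda_runtime.h>

__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one thread per element
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory keeps the demo short
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);  // grid of 256-thread blocks
    cudaDeviceSynchronize();

    printf("y[0] = %f (expect 4.0)\n", y[0]);
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

The same source compiles for a GT-class laptop part or a datacenter accelerator; only the compute-capability flag changes, which is exactly the on-ramp the comments describe.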
Nothing I do really requires a 4090. In fact, when I bought my last PC, I went with a 4080 at a very good discount price, and I have never regretted the decision.
Price to performance at the midrange matters more than having some flagship GPU that the competition's flagship eats. If you can't make up the cost, why make the product? The *VAST* majority of people buy in the $200-500 MSRP range. That is where the market is, not in flagships.
1. AMD already announced at the very start of the year that they would no longer be focusing on high-end GPUs. Somehow the mainstream either ignored that, didn't believe it, or never saw it. 2. Why would people "rejoice" at this news, assuming AMD would now focus *more* on entry-level and mid-range GPUs? That's not what this means. It just means they are no longer going to bother with high-end GPUs. Their entry to mid-range offerings will be the same as they ever were. 3. Those that think this is "bad" because it means "less competition" haven't been paying attention. AMD never was competition for Nvidia in the high end, and especially not since the 30 series. Nvidia already knows that AMD was irrelevant in the decision making of those who wanted enthusiast or flagship cards, because their cards would be the ones purchased 95% of the time anyway. Nvidia's pricing is not dictated by having AMD in the race. They've already proven how out of touch they are with reality, yet their cards still sell. The 4080 didn't, but that was primarily because its "value proposition" was so low relative to how much it cost that people just decided to be upsold to the 4090.
@@lucasrem "They've already proven how out of touch they are with reality, yet their cards still sell." Because their customer base nowadays is the likes of AWS, Microsoft, Google, Oracle, IBM, OpenAI, Meta, HP Enterprise, and so on. They bring BILLIONS and BILLIONS of dollars into NVIDIA's revenue, and generally that's what the shareholders WANT, since they're the kind of people who last played a video game back when the Atari 2600 was the hot product.
@@RockstarRomaniaThe main institutional shareholders of NVidia and AMD: Vanguard, Blackrock, State Street. Along with the Big Tech customers you’ve named tells me that gamers desires and opinions are meaningless. Look at the last quarterly earnings for NVidia and AI/DC vs gaming is 9:1. Meta, Google and MS have announced $100+ billion DC projects and are buying AI GPUs as fast as they can make them. What’s kind of pathetic are those gamers whining that their favourite GPU is now $800 when it used to be $600, when a single 80GB NV H100 now goes for $45,000. That same piece of silicon makes them 50x the money as an AI chip rather than a gaming GPU. What’s amazing is that NVidia or AMD are devoting ANY wafer allocations at all to gaming GPUs considering the ROI in the AI market.
My 1080ti still does a great job. I don't want to pay $900AUD for a "mid level" gaming GPU. I want value for money. Most people I know who game, do not have 3080s or 4090s. Most of us rock low-mid end cards. If AMD simplify their consumer GPU offerings to 2-3 value for money bangers, that's a win. Less R&D developing multiple SKUs, which simplifies binning, yields, etc etc. I think it's a good strategy.
It's all well and good for AMD to focus on low- to mid-range GPUs; let's hope they don't pull some kind of snake-oil move on consumers and put high-end GPU prices on mid-range GPUs.
I think the GPU strategy is primarily a decision to sidestep a crazily inflated high-speed GDDR market strained by demand for AI datacenter cards. If they were to use GDDR6X or GDDR7 themselves in any measurable volume, that would just bid up an already high-demand market further, and everyone loses. They have seen this as a long-term issue, long enough to design a whole generation of cards around it, which would have required that they began such plans at least a year ago and expect the strategy to work for a couple of years going forward in that segment.
If we can have a good value battle in the midrange, the consumers win, but it sucks that the high end (presumably 5080 onward) is just gonna be iNvidia's ballhouse. At this point, I just want there to be good options in the 60 class.
The way UK-US exchange rates work with tech is you just change the £ sign for a $ sign. So the motherboards won't cost as much as you think on the States' side of the pond. Case in point, the PS5 Pro: £699 or $699. If it were done via real conversion rates, it would be equivalent to $1050 USD.
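Rough arithmetic behind that heuristic (my own sketch; the ~1.3 USD/GBP rate is an assumption, not a figure from the video): UK prices include 20% VAT while US list prices exclude sales tax, so a like-for-like comparison strips the VAT first.

```latex
\pounds 699 / 1.20 \approx \pounds 582.50 \quad \text{(UK price excluding VAT)}
\pounds 582.50 \times 1.3\ \tfrac{\text{USD}}{\text{GBP}} \approx \$757 \quad \text{vs. the \$699 US list price}
```

In other words, most of the apparent "swap the sign" gap is VAT, with the remainder being exchange-rate margin.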
Until new X3Ds come out, they can keep their new motherboards. I would never have believed that I'm actually considering going with 285k this time. Everyone seem to be dragging their feet with new releases. Nvidia too.
0:28 "I'd hesitate to dunk on console gamers too boldly; a misfired meme could easily cause damage to our own tempered glass side panels." This is a great statement, as a lot of PC channels have proved that they: A - do not understand either PC or console hardware; B - do not know how to build a proper PC, as almost all of their builds guarantee stutter and poor performance, and those that did not were almost all by accident. I have had to write off a number of channels as serious information sources because of their responses, which is a pity.
Why not Core Ultra 9 290K? Or Core Ultra 7 270K? Or Core Ultra 5 260K? The fact that they only have 285K/265K/245K suggests, at least to me, that Intel might be holding something back for when AMD releases its X3D. But who knows!
So you're saying that Intel isn't providing the US government enough... intel?
Alert! There's a spy in our Intel!
@kcmsterpce Why do you need the federal government? Just do as in California: ignore them!
So, you're saying that Intel is a US state-funded company now? A lot of governments (and people) around the world are gonna ditch Intel.
At this point I think a 1980s road atlas would be a better source of Intel than Intel 😂
I can 100% understand AMD.
Let's be real: they are not competing with NVIDIA in high-end gaming, so focusing on the areas that actually make margins makes sense. Maybe in a few years we will see a comeback like with Ryzen? One can only hope…
They are not competing with NVIDIA at AI either, at least not according to sales revenue. NOT YET, at least. AMD is still tiny in comparison. And we are at that turning point where people understand that AI is not all that it claimed to be. At least not in the civilian market; the defense industry is an entirely different matter, though.
They are doing wonders for the PS5 though.
While I can also understand it, it's bad for us consumers. Watch Nvidia take advantage and raise prices yet again.
@@dihartnell Well, the majority of the market voted for Nvidia, regardless of their business practices. If people had actually bought AMD cards, things might've been different. I personally went to AMD because Nvidia's so anti-consumer and anti-gamer at this point.
@@JamieR Agreed, most people are buying NVIDIA in the discrete GPU market. My point was that the last time AMD ceded the high end and stopped trying, the NVIDIA 10 series and 20 series both saw big jumps in price. No or little competition is bad for the market as a whole.
Who the fuck predicted that AI would kill gaming? Damn you Skynet!
You youngins have no patience. AI and devs will get back together sooner or later and gaming will rock on. 😁
@@tomthebadasscat, I also think so. I'm hoping AI actually gives us "next-gen" experiences regarding NPC interactions. Graphical leaps are becoming less impressive and aren't the selling point they used to be.
not really. gaming doesn't need any more hardware than it already has. just stop bloating games with stupid graphical features and have reasonable render targets. to me, strix halo in a handheld pc thing (with the possibility of docking) is the future of pc gaming, not a billion dollar pc with an rtx 5090
@@GraveUypo You are right, people don't get it. Wanting AI to make games is bullshit. I find AI interesting for helping reduce workloads in development, but not for throwing the human factor in the trash, or you get a soulless game with AI content as filler.
I'm fine with AMD not chasing the 4090/5090, but they should at least match whatever the 4080/5080 is. I think this is smart for AMD, as long as it's still cheaper than the counterparts.
The 80 series is still high end. Supposedly they attempted to develop another chiplet based GPU that would (try to) compete at the high end but their prototypes would be too expensive to bring into mass production. And so they scrapped the idea for this generation.
@@TheSwayzeTrain I know that. I was saying they should've gone up to the 4080 series. I know they aren't, but they should've.
They need to come out with an RTX 5080 killer at $700, an RTX 5070 killer at $400, and an iGPU that can beat the 5060M on a desktop chip that costs $200. That is the only path to victory.
If AMD is targeting the mainstream, that's more 50 to 60 class these days, not the $800 70 class or higher
@@RobloxianX That is never going to happen, ever. At best they'd make a 5080 competitor for $899 or $750. I think a 5080 killer for $700 would be a steal; they would never do that. Minimum of $800.
AI being used to "rip off" Spotify tastes like a most delicious irony.
Fube Tuck
AI is where you can make money!
What will gamers pay for extreme GPU cards?
Spotify????
Shame he got too cocky. Should've kept a low profile.
Too bad people misunderstand irony… or AI in this case.
You don't know what irony means. Irony simply means unexpected. What's ironic about using AI to rip off Spotify?
@@troletrain Why don't you look up a definition before you comment?
0:43 image of broken glass panel then shows Paul Shardware. 👍👍
That's a bloody sharp one mate! 😂
Definitely intentional, what with the shameless glass-like gleam of the URL as it's displayed.
You could say he... Sharded.
"Intel is frustrated with the government's slow pace"
Isn't that funny? I wonder how people felt about their 13th and 14th gen RMA pace? I wonder.............. 🤔
Tbf Intel never promised you it would replace them haha
Agreed. I’m pretty sure Nvidia is laughing in money at this announcement. The 5090/5080 will go uncontested so they can price it however they please.
At least I can give NVIDIA credit for not going the Skylake+++++++++++ route like Intel did when AMD was absent from the market.
@@RockstarRomania Introducing the Intel® Core™ i17 Quantum Xtreme SuperDuper HyperThreaded TurboMax UltraFusion RGB+ AI-Powered Gen Z 9000Xtreme++ SkyLake+++++ Platinum Elite Supreme Edition KF
And they will sell as well as the Zen 5 CPUs do now. For gamers, there is no reason to upgrade, so there is no reason to buy even more overpriced similarly performing stuff. And the current derivative AI bubble hype is going to end in a few dozen months as well.
At some level, I have to ask, am I supposed to shed a tear for people who have enough money to buy a 5090 and only wanted AMD to sell expensive cards so their Nvidia card can be cheaper? It wouldn't hurt them to skip a gen. And it would absolutely not hurt Nvidia if people didn't buy their expensive gaming GPU. Meanwhile, people are still holding on to their 10 series cards for dear life because prices are so bad in most of the world outside of the US.
They can price however they please, but people will have the same amount of expendable money. They'll have to be careful or they might shoot themselves in the foot. People are very ready to keep their old GPUs.
Even then, based on this news, AMD is aiming to make money somewhere else, and I don't think they are wrong. Data center and AI computing make even the highest-end GPU sales look laughable.
11:10 We've seen an AI lawyer before; it got laughed out of court so hard the provider had to take 'lawyer' out of their list of services.
That's the one that got caught; hundreds are still in service.
@@Serketry88 Using AI to do mundane legal tasks is actually a great use. Probably can cut costs 90%
@@harryniedecken5321 😂 You mean lay off paralegals who need the experience to finish school? Or just make them question engineers. 😂
@@trowawayacc There is nothing special about any field or line of work. It is just a more sophisticated internet search. It gives access to both $50 / hr lawyers and $1k / hr lawyers.
@@trowawayacc Exactly, that's how you cut costs.
High-end graphics cards used to cost around $800. Now that's a mid-range card. Seems to me like we just introduced a tier above high-end, one that nobody can afford.
@@unvergebeneid It's a cold war mentality. Nvidia wants all the nukes; AMD is like, "you go ahead and bankrupt yourself, we'll make close enough for half the price."
It's $2000; that's nothing to a ton of people. I'm so lost on why people whine about it being unaffordable. A new car used to cost $3500; things change.
Are you trying to compare a car with a GPU? And yes, for basic customers that is indeed expensive, even more so in countries outside the US. You have to take into account that the majority of the population isn't going to be able to spend that much, compared to the top 5 percent. @@jeffreylyons1531
Blame government inflation/waste/radical policies in the long run and competition from the AI bubble in the short term.
Devaluing your currency means you earn less every year while paying more.
@@jeffreylyons1531 Mate...that is outrageous. If $2000 (USD) is nothing to you, try to imagine, just for a moment, that is not the case for most of the world's population. Then try not to be dense enough to write something like that in the middle of the most difficult economic times in recent history.
Then try to imagine that it's possibly not wise to overprice things when people are struggling more and more to afford things like computer parts.
Lol. Only big corporations are allowed to have fake songs listened to by fake customers...
AMD never got offered a penny from the CHIPS act.
I'd hate to think of where desktop computing would be if Ryzen never existed.
They're not an American company. Intel is.
@@zeerevolutionwillbetelevised AMD is not an American company?
Bro, what are you smoking???
AMD is a great company, but they are a chip designer now, not a chip manufacturing company. They were at one time, but spun the wafer fabs off as GlobalFoundries.
The Biden admin is taking so long to disburse the CHIPS Act money, it might never happen. That is what happens when no one in government understands running a business.
@@zeerevolutionwillbetelevised Advanced Micro Devices, Inc. (AMD) is an American multinational corporation and fabless semiconductor company based in Santa Clara, California, that designs ... Wikipedia.....
Isn't the CHIPS Act focused on producing chips? AMD is a chip designer, but they don't produce them.
Unless they already started work years ago, we shouldn't expect any of the UDNA changes to result in an actual product for another 2 to 4 years.
They've most likely already been working on it for a few years. Companies don't normally announce roadmaps for things they haven't even started yet.
UDNA was said to be a new strategy moving forward; they sure made it sound like this isn't something they've been working on. If they have, I'd be happy. And I hope my 7900 XTX can utilize it.
@@kriminallogic2283 That is how it goes. If you are hearing it publicly then it has been talked about and worked on internally for a while now.
The reason AMD is dipping out of the high end is that someone like me, having a 7900 XTX, is in probably 0.1% of gamers.
It's not a brag; it's a sad story. The 7900 XTX is a fantastic card.
Exactly, the high end market is so small. 99% of us can't afford a 4090. If you can, congratulations, but that ain't most of us.
0.1% represent. Definitely will be keeping my 7900XTX for at least 2-3 gens before looking to change it out. It's a banger of a card.
sometimes i wonder if i'd be happier now with that card rather than my rtx 4080.
probably not, but i don't think i'd be much worse off either.
I got the 7900 GRE because it was the best I could afford at the time, and it was at the right balance of price/performance in that moment. I don't have any complaints.
@@TheRealSkeletor The 7900 GRE is one of the best price-to-performance cards to come out in the past ~5 years for "excellent" 1440p and "reasonable" 4K. It does fine at 4K if you are willing to tune game settings for the best quality-to-performance (i.e. reduce shadows/reflections, because no one cares, but keep textures and view distance high). I have yet to find a game at 1440p that you can't crank to Max/Ultra and stay above 60 fps. In my opinion, 1440p with a 144 Hz, 3 ms or less IPS monitor is the best gaming experience, unless you need a monitor larger than ~27"/30". I'd rather have the smoother experience than hitching and dips at 4K at or under 60 fps.
Also, my recommendation above is simply for native rasterization. If you include FSR/DLSS the story may change a bit, but that's really only needed for 4K on the 7900 GRE.
Not chasing the 90 series is fine, but AMD should be competing with the 80 series.
AMD is complicit in GPU prices inflating. They are willing to fall on the sword for Nvidia by fortifying its position in the midrange. AMD's strategy is too dependent on Intel's Battlemage being a failure. If they get it wrong, there will be a price war, and gamers will win. If Battlemage has superior RT performance, it's lit!
@@aberkae One can hope, but I think Intel has other issues it needs to solve first.
@@OggaDugga it's true but they can't be on an infinite losing streak forever unless it's internal sabotage. Gross incompetence has a different pattern of failure.
@@aberkae It's going to take time to recover from a 15k+ job layoff. Those laid-off people will undoubtedly go to your competitors, which is a tough battle to overcome.
the 80 series is a super small segment. your feelings are just that. feelings. not fact or based on any rational thought
How can the government give a company like Intel 8.5 billion tax dollars and we still don't have public health care?
Because our government only protects the most wealthy and powerful.
@zackw4941 I was going to respond with a lengthy comment on your ridiculous comment, but I decided to delete it all and type: leave politics out of this. Let's all enjoy the commentary on the tech sector stupidity (and other actions) without the [dumb] political commentary.
@@mhouse1712 Behave. Be better than this. Shameful!
@@mhouse1712 I appreciate your restraint. It sucks that most people can't seem to communicate over the internet without lashing out like toddlers.
To your point, government assistance to large corporations is not an apolitical subject. Those are our tax dollars.
I know health care in America is a touchy subject and I don't want to start a heated debate. I've been around in circles with my conservative buddy/coworker who wants nothing to do with government and thinks healthcare should be left entirely to the private sector.
My stance is simply that the private health care sector, like most other corporate sectors, has seen prices soar out of control with no oversight. I'm a 40-year-old blue collar guy who works hard, full time. My body is wearing out a lot faster than I'm approaching retirement. Despite multiple sources of income, I can't afford the ludicrous price of health insurance out of pocket, let alone the fees that still won't be covered if I need anything. Which I'm sure I do.. Make of this what you will. I'd rather have an incompetent organization actually trying to help me than a competent one that is only interested in money and has priced me out of the market.
@@mhouse1712 I appreciate your modicum of restraint. People should be able to communicate their ideas to each other without acting like toddlers.
The headline is already political. Government gives $8.5 billion tax dollars to Intel. As a tax payer, I find this outrageous. Intel is a big company with a lot of resources and this isn't where I want my money spent. It's not like I'm going to get an Arrow Lake processor in the mail for this..
I bring up health care as an alternative that I would prefer to see attended to first. I know it's a touchy subject. I know plenty of conservatives who want nothing to do with government and I get it. But to me, it's a moot point. Private health care, just like every other private industry operating with no oversight, has seen prices soar out of control.
I'm a 40-year-old blue collar guy. I work hard every day and have multiple sources of income. But I'm not wealthy, and I'm not in a situation where any health insurance assistance is provided. And I straight up can't afford to pay for it out of pocket, let alone the medical fees on top of that, which still wouldn't be covered if I went in for something.
My body is wearing out a lot faster than I'm approaching retirement and I don't really know what to do about that. I feel like the government I pay taxes to should try to help me out, before giving my money to the wealthy for free. If that makes me a dumb socialist, then I'm a dumb socialist. Maybe you think I should have my head examined for thinking this way? I'll add it to the list, if I ever get access to health care.
The thing with UK pricing is that you're not only paying the early adopter's fee but also VAT, which is 20 or 21%. Like most hardware, prices should drop 6-12 months after the initial launch.
AMD builds a GPU that beats the 4080 and prices it two hundred dollars cheaper. Everyone craps all over the better price-performance ratio because "it doesn't have DLSS", refuses to buy it, and purchases inferior 60 and 70 class cards because muh ray traced upscaling.
AMD looks at bottom line, "This is stupid and a misallocation of resources. We're going to skip the high end and focus on data center and budget to mid range GPU development."
Drooling consumers, "Why would AMD do this? Now nVidia can charge whatever they want, and I'll be stuck paying $1200 for the 5070!"
100%. Kind of sad.
It's been this way for a decade. People were refusing to buy better-performing, cheaper AMD parts for the last 10 years, even before DLSS and RT were a thing. You can't fix stupid, and Nvidia are masters at marketing to the tech illiterate. Everyone is convinced their computer, house, and cat will catch on fire if they dare buy an AMD GPU, and here we are with $2000 Nvidia garbage.
Well I did my part with a 7900XTX in my system, but yes you are 100% correct.
Or people just spent the extra money and got the Nvidia version. Instead of spending $1000 for an XTX, they bought a 4080 for $1200. Instead of spending $900 on a 7900 XT, they bought a 4070 Ti. Instead of buying a 7800 XT, they got a 4070 for $100 more. They got similar raster, but better ray tracing and Frame Generation. AMD can only undercut Nvidia if they have the same feature set. Nvidia does a good job of making sure AMD doesn't catch up.
@@Gofr5 I am gonna wait for RDNA4. If they don't deliver a decent 1440p card at a good price, then screw it and I will buy RDNA2. RDNA3 is still very expensive outside of the US ($150 more than what the news says for the US market).
Imagine a guy out there listening to those bot songs unironically. This news just rocked his world.
I wonder if AMD's decided to not compete on the ray tracing front and just release good cards with reasonably high VRAM; therefore not competitive with high-end Nvidia, but still better value at their prices.
That "better value for the price" has been the thorn for AMD for years now. Not only does it mean less profit for them, but having the better price did not really increase their sales either. In fact, their market share has been eroding for the past 15 years.
@@arenzricodexd4409 true but my r9 290 still works great and I bought it in 2013.
I wonder why they haven't taken a stab at the workstation market; Nvidia's Quadro cards need some competition.
@@taylorhickman84 AMD has done that since forever with their FireGL (now called Radeon Pro) line. Thing is, Nvidia has dominated that market by over 80% for more than 15 years now.
I've been saying it for years, AMD does not want to be in the dGPU market. Their desktop parts only exist for R&D and marketing purposes.
Their presence in the higher-end market was the only thing tethering Nvidia's prices to a shred of sanity. You can bet that Nvidia's next two generations of GPUs will skyrocket in price so high that COVID prices will look like a speed bump. Now if you want a high-end GPU, your only option is going to be Nvidia. Companies exiting smaller markets to pursue more profitable ones, leaving the end user underserved and overcharged - not a new phenomenon, and it pisses me off every time.
@@gctypo2838 You're not alone there. GPUs being priced so high that regular folks can't afford them, it's not right at all.
People go to computer stores and ask, "What's the best GPU?" They're told the latest high end; then, because it's too expensive, they buy the one that's in their price range.
Basically, people buy the cheaper models of whoever makes the fastest flagship. That's why AMD not competing at the high end is a mistake.
Well, it makes sense. They acquired ATI back in 2006 to kill the discrete GPU with the APU.
AMD: Releases the 7900XTX $200 cheaper than the 4080 at launch; it trades blows with and sometimes outperforms the 4080
Gamers: NO DLSS muh Ray Tracing
AMD: Releases 7900GRE cheaper than 4070 Super while trading blows
Gamers: 4070 Super has DLSS3.0 and Frame Gen
AMD: Releases their own Frame Gen and offers both FG and FSR free along with Freesync
Gamers: DLSS is better and AMD bad because muh Ray Tracing Gsync go brrrrr
AMD: We are concentrating on other markets from now on and will only release midrange and lower-end cards
Gamers: Why would AMD do this? They need to compete with at least the 4080..........................
Why should they compete? 80% of you wont buy their cards anyways.
People would buy AMD cards if they were the best available option, but Radeon is always a step or two behind, costing less or not. In computer building it's always been "pay more, get more"; it's that simple.
@@conorgillespie7832 That doesn't change what I said, though. 99% of gamers do not buy the best option; you might, but you do not represent the majority. The 4090 is the best option, and how many people own one? Less than 1%, according to the Steam Hardware Survey.
It's quite indicative of the mindshare Nvidia has. For example:
The only card that can even semi-handle native path tracing is the 4090, and even that $3,500 GPU (here) can't do it without rendering the game below your panel's native resolution. The majority made a decision based on the top 1% of consumers, because Nvidia's marketing is powerful and so are the influencers they use to market their products. Most people cannot afford to use RT, and most people who bought RTX cards didn't buy them for RT so much as for DLSS.
Now with that said, what does Nvidia offer over Radeon? Ray tracing and better hardware-accelerated downscaling. Pretty cool trick Nvidia pulled on gamers, convincing them downscaling is upscaling. It's funny, Sony recently tried the exact same thing Nvidia pulled off, except people lost the plot, then turned around and accepted Nvidia doing the exact same thing.
Now I repeat my question:
Why should they compete? 80% of you wont buy their cards anyways.
The AI band guy might be okay, a district court ruled that for an offense to rise to the level of fraud you had to specifically intend to make your victim poorer, not merely hope to make yourself richer. Now all he needs is for the Supreme Court to sign off on this bold new interpretation.
While it'll suck, purely theoretically, for the wealthiest elite gamers who already didn't care what the price was, if AMD could release cards in the 4060/70 range at two-thirds of the Nvidia price or less, they could easily build market share and developer support very fast and render Nvidia a niche brand... I'm currently forced to use Nvidia due to CUDA support, though, but finally AMD is getting smart about that too.
AMD can do that and still not gain market share, because all Nvidia needs to do is adjust their pricing and they still win on sales. To beat CUDA, AMD needs to spend more money and effort increasing ROCm adoption. Simply going open source here and there isn't going to cut it.
@@arenzricodexd4409 Nvidia won't.
Awesomesauce video Paul and Joe! Now I feel even worse, not picking up that open box Sapphire 7900 XTX while I was at Microcenter yesterday. So much for buying a higher end AMD GPU. 😢
I guess I will go back to dreaming of Captain Crunch that always goes so well with PHTN. (Too much sugar for my system.)
Caffeinator out.
A win for Nvidia is not a win for YOU, the customer.
Customers never actually win.
Yes it literally is. Nvidia sells to customers. Nvidia wins because customers choose them, because Nvidia gives them wins.
RDNA 4 is said to be highly competitive in the low-end and midrange. Battlemage is also expected to be competitive in the low-end and midrange segments. Intel has made massive progress with their drivers for Alchemist. I don't think drivers would be nearly as big of a problem for Battlemage at launch. If RDNA 4 and Battlemage can deliver powerful midrange GPUs at reasonable prices, there is a good chance AMD and Intel can gain quite a bit of market share, and that should hopefully pressurise Nvidia to do better. Competition can make that happen. It's the key to a healthy industry, and it benefits us consumers.
The key to making that happen is gamers actually BUYING amd/intel, not just saying they want them to exist to keep their nvidia purchase prices down.
@@sultanofsick I agree.
@@sultanofsick Hard agree. I use a 6750 XT, which is faster, has more VRAM, and was cheaper than a 3070. But DA RAYTRACING!! idgaf. When ray tracing is in 99% of games I'll care... but it's in 0.1% of games, so it's a non-feature to me.
Highly competitive with what? The current Nvidia cards or the 5000 series? Because RDNA is already barely able to compete with the 3000 series, and once again I fear AMD will deliver an underperforming and overpriced product compared to Nvidia.
@@dragonmares59110 Yeah, keep telling yourself that... the 7900XTX beats the 4080 Super handily.
Well, as they say in racing: win on Sunday, sell on Monday. So just by having the fastest GPU, Nvidia is selling 4060s...
I hate that saying, because it's not even true most of the time.
Normies don't give a fuck about racing. They buy a 1.0L Peugeot 207 because it's cheap, not because Red Bull won F1 again or whatever...
It doesn't work like that here. Nobody buys the Ryzen 5 5600 because AMD sells Threadripper; people buy it because of its price/performance ratio.
@@careem3463 It absolutely works like that here. People buy Ryzen 5 5600 because later they can upgrade to a 5700X3D or 5800X3D and have one of the best (formerly the best) CPUs for gaming.
@@careem3463 It works like that in a different way. If Nvidia's GPU architecture is better at the high end, then it is also better at the low end. The only thing that happens when AMD doesn't compete at the high end is that Nvidia completely dominates the high end and we get the status quo for midrange and budget GPUs.
With X870/X870E coming, I expect X670/X670E supply to slow down and be discontinued.
As always excellent work Paul n Joe 🤘💯
They gave up because Nvidia mindshare is so intense that people refuse to buy AMD...
I went back to an AMD GPU lately and it's very good.
In case you haven’t noticed, many gaming companies are closing, and even developers at Microsoft are being laid off. The gaming era might be reaching a plateau, if not a decline.
AMD realized that they don't want to fall into the same trap of having to choose between selling a $1500 consumer GPU or a $5000 datacenter-level GPU.
So are Spotify and Apple at all going to reimburse the ad costs to the advertisers that were charged for all the bot streams? I can't wait for the answer, hahaha.
Please mention the 25-30 TB hard disk drives.
Gamers need to wake up and realize they’re no longer the primary concern of these GPU makers.
Nvidia's last quarterly earnings showed that very clearly. AI/DC revenue: $26.3 billion; gaming revenue: $2.9 billion.
AMD shareholders see the same thing, and the message is clear: why are we focusing on gaming when the big money is in AI/DC?
A lot of gamers with more dollars than sense and an incurable case of FOMO buy into NVidia’s overpriced marketing which simply keeps supporting their continued justification for price gouging the consumer.
Gamers need to get off this fixation that GPU companies really care about them, when product pricing has kept climbing and the price/performance of the various SKUs has kept getting worse since the crypto boom.
They are using their production capacity and wafers to make cards for the server side; with UDNA they will be able to make both.
Modern midrange should be called high end. The 4090 level of performance might as well be called extreme. So AMD is not competing in the extreme line of GPU products.
Most PC parts with a 9 in them are meant for extreme/enthusiast use: Core i9, Ryzen 9, 4090, 7900XTX. In the past, the 9-level parts never really existed; Nvidia GPUs ended at the 80 series, and Intel only had i7s.
Good point. Like with cars, a new category had to be named for ones leagues faster than previous supercars, so now we have the "hypercar" category.
You have poor people buying Mercedes and BMWs with $1500/mo payments. Consumers are too ignorant to chase value en masse.
Intel doesn't deserve subsidies. The CHIPS Act was always bad legislation. The government should not be putting our tax dollars towards giving the richest corporations on the planet some tax breaks and subsidies. No more socialism for corporations while actual people are struggling in so many ways. I hope the government never cuts those checks for Intel. Put it towards something that can benefit people, like clean air in schools and buildings so we aren't spreading as much disease, school lunches for kids like we had during the height of the pandemic and it worked really well for everyone, or start giving everyone universal basic income and health care. 'Socialism' for the people, not for the corporations.
From what I can tell, it's a matter of national security. China has its eyes on Taiwan, where TSMC is located. China doesn't necessarily have to take over the island by physical force to access the technology and designs; with its close proximity, they may already have spies in the fabs or working towards getting in. The US building its own fabs means the technology and designs are less likely to get into China's hands. With the AI era just heating up, any country that wants to retain or increase its dominance and influence in the world would do well to have secure access to chip manufacturing technology and designs.
I'd feel better if it was mostly a military project, (similar to the way internet came about) with Intel acting in a consulting/collaborating role due to Intel being a publicly traded company with a "fiduciary responsibility" to its shareholders. If this act does go through I'm sure there's something in the contract about not using the funds for share buybacks, but all it takes is some tricky accounting to skirt that stipulation.
This is purely military and national security related. The US needs its military chips made in the US, by an American company.
I'll never buy an Intel product again after how they screwed me on my 14900k.
I don't understand these newfangled cases. How do fans, pointed at the tempered glass, keep the components cool ?😳
air... flow.
Translation; they dont.
I believe this in my heart to be true. It's maybe 10% the fact that Nvidia has destroyed them and 90% that they don't give af about spending money on gaming GPUs.
Moore's Law Is Dead reported a long-ass time ago that top-end RDNA4 was canceled because it was a way too ambitious chiplet design that didn't pan out. AMD wanted to use the same bonding technology as 3D V-Cache to make top-mounted silicon "interposers" that straddled the GCDs, but for some reason or other it didn't work. So we are just going to get the monolithic variants this time around.
All these companies see the billions of dollars being spent on AI. The gaming market just can’t compete with that.
All these comments about the "AI bubble" popping, from people who can't distinguish the companies building the roads from the companies making the cars.
If I hear the line about selling shovels again I might vomit.
The only thing AMD's statement will do is increase prices for used Nvidia GPUs, and that's good for me, as I have quite a few 30-series cards.
Man, I hate American pricing on everything. Lol. $1,600 USD ($2,175 CAD) for a 4090? The cheapest you'll find a 4090 near me, brand new, is $2,400 Canadian Pesos ($1,765 USD). Every time I watch a video saying "Build this awesome PC for $700!" or read a comment like "For $700, you could build a PC that would destroy the PS5 Pro," I call bullshit. I could use the exact same parts they use in the video and still come out over $1,000 USD. My PC was around $1,600 CAD (obviously just the tower) when I built it back in 2019 (literally right before anyone had ever heard of COVID): Ryzen 3600X (now a 5800X3D) and RX 5700 XT, B450 Tomahawk mobo, 16GB of 3200 MHz RAM, 500GB M.2 SSD (boot drive), 2TB SSD (later added another 2TB SSD). By no means a "high end" gaming rig. But I couldn't even consider building a whole new rig today with prices the way they are.
Radeons should always have been muuuch cheaper than they are to be a viable option. Now that AMD has realized that themselves, this should be good for them. Stop wasting money on the high-end stuff, as they can't match a 90-series card anyway; the Nvidia product package is just overwhelmingly greater - which is also bad for consumers, due to there being no real competition. Also a reminder: AMD really has shot itself in the foot by not taking advantage of all the chances it was handed; such incompetence.
So AMD needs to make lower-tier cards that they can actually sell, like they are going to do. The high-end "battle" wasn't really a thing; it definitely wasn't a balanced rivalry. Intel and AMD now just have to fight over the leftovers. I still sense hard times for Radeon, since Intel's GPUs are already pretty damn good propositions.
Hey Paul, nice mats! But I'm more of a GN desk mat guy. I like 'em big and thick, so it looks like they're coming out from under the monitors and come right to the edge of the desk.
2:10 Didn't they do a similar thing with the 400 series? That and Nvidia's 10 series were considered some of the BEST GPUs to get, pre-DLSS and AI.
AMD did the same with the RX 580 and RX 5700 XT, and both cards were awesome and are still relevant today, because they aged like fine wine.
It is sad how they raise the cost while making their boards as cheap as possible. I was doing research on the Gigabyte Ice boards to find one with an 8-layer PCB, and only the Master comes in 8-layer; even the second-tier board, the X870E AORUS PRO ICE, comes with the cheaper 6-layer PCB. They state that the 8-layer PCB design effectively lowers component temperatures through high thermal conductivity and low impedance, where the 6-layer only rates "Mid-Loss." It is a shame you can't even buy a quality motherboard in ATX.
3:41
TBH, how much pressure did the Radeon 7900s actually put on Nvidia's 4080-and-up segment?
I mean, at the top end it takes more than just competitive rasterization performance.
Up to the mid tier, providing better raster performance is somewhat compelling and makes people who aren't fanboys at least consider a Radeon over an Nvidia card - similar to how people might now consider an Arc in that market segment.
AMD currently only has the (usually) better driver support for games going for them in that segment compared to Intel, and Intel is still constantly improving Arc drivers, putting pressure mostly on AMD...
My first visit on this channel. Nice format. Sub.
Adored tried to tell us where Radeon was headed a few years back, and now it's official.
AMD is allowing Nvidia to set high prices, then slotting its high-margin, low-effort (cost-effective to develop) GPUs within that pricing stack. They stopped chasing the performance crown with Fury, though really they stopped being competitive when the GTX 600 series launched, and have been behind since.
the GPU war is (has been) over.
[Question] It's very interesting, and a question comes to me because I'm a novice. So we are talking about AI with the next FSR 4, maybe compatible with the 7000 series, which would make sense. Is it about "ROPs"? I'm trying to understand, because if I compare these render output units on the manufacturers' spec sheets:
192 on the 7900XTX vs 112 on the 4080 Super vs 176 on the 4090
So could we expect something even better from AMD on the 7000 series than what NVIDIA currently offers?
And more on RDNA4, of course?
AMD really needs to get into prebuilts if they want to grab market share; a lot of people just buy a "gaming" computer with no thought for the internals.
I think Strix Halo is their attempt to do exactly that, by targeting laptops first. Being the first gen, it will likely be limited to higher cost devices, but should those do well, next gen will see wider availability
Thanks Paul and Joe! Keep up the good stuff!
High-end GPUs are currently just high risk/low reward for AMD. People buying those cards are very set on Nvidia, and AMD needs around 40% better performance per dollar at the high end to get any sales. Most people crying about AMD skipping the high end are just people who want Nvidia GPUs to be cheaper...
Raw greed. But perhaps it's a good thing. Perhaps developers and gamers will realise that chasing more performance is a needless game, and we could have a GPU release cycle more like the Elder Scrolls, forcing devs to optimise for the hardware we have, not what they want us to have.
And perhaps this makes it easier for new players to enter the consumer market, and hopefully nvidia and AMD pay a hefty price for abandoning their core base.
No company is magnanimous, but you have to understand that this strategy IS the one that got us Ryzen as good as it is.
The framing of "turning their back on the gaming market" is, I think, just the wrong way to look at it. The eventual UDNA merge means we'll get all those datacenter benefits down the line. They're going to be designed to scale up to mega-chips that take entire racks with chiplet-based designs. Once that's sorted, it becomes (relatively speaking) trivial for them to scale future generations from low end to high end, just like Ryzen, constrained pretty much only by power for how many chips they can cram onto the highest-end one.
RDNA4 is midrange-only because it's a stopgap, a write-off that wasn't worth the effort of making a fundamentally different set of cards like the 7900XT and XTX were. I don't know if it's RDNA5 or UDNA6, but one of them is going to come back to the high end, assuming they can pull off chiplet GPUs as well as they managed Ryzen. And the 7900XT and XTX aren't truly chiplet GPU designs; the I/O and memory controllers are split off, but the compute die is monolithic. When they can split off and combine compute dies is when they'll stroll back into a full product range, because it'll cost them very little to do; the development will have been funded by the datacenter side, which needs them to solve that problem at many times the scale.
These are so great, Paul. You do a really nice job researching all this hullabaloo and presenting it.
Our NC dude - who knew someone local could figure all that out. Great brief as always, man!
I might be mistaken, but I don’t think you used the word “dubious” correctly.
Hey Paul, how are life, family, BitWit, and YouTube these days? Matt here, long-time viewer. Love your skits and scripts. I was just wondering if you use AI to write your scripts or think them up yourself. I think they're really good. Oh well, keep up the great work!
When comparing the rumored motherboard prices (9:55), it seems this compiled list is not considering tax. The compiler seems not to have simply converted GBP or EUR to USD; rather, the USD prices appear to be based on the European prices including tax, while you are used to seeing $ prices without tax.
So still expensive, but not that much more.
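To make that concrete (a rough sketch, assuming a 20% VAT rate and a hypothetical exchange rate of £1 ≈ $1.30): a board listed at £499 including VAT is £499 / 1.20 ≈ £416 before tax, which converts to roughly $540 ex-tax. Converting the VAT-inclusive sticker price directly would instead suggest about $650, overstating the US-equivalent price by around 20%.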
When do we get 10 GHz CPUs? We've been lingering between 5-6 GHz for years now. 🤔
Silicon will not reach 10 GHz; it would require sub-ambient cooling just to inhibit resonance in the silicon at those frequencies. Silicon starts "ringing" at ultra-high frequencies.
Unfortunately, AMD already smokes Nvidia in the midrange, and people still aren't going to buy it. I mean, I will, but I already took the chance with my 6800 XT and it was an absolute win. Most people will just believe the hype and overpay for Nvidia.
ditching or they just can't keep up with nvidia?
I would not be surprised if the AAA gaming implosion of recent months was another reason why AMD is walking away from high-end GPU production. Future demand for graphics cards is likely to slump. Time will tell.
It is already in a slump. Gamers, regardless of PC or console, are buying less. We already saw that effect for two straight quarters in AMD's revenue.
Before I watch: so, is it trouble making GDDR7 stable, or is it so much better that they don't want to wreck their own gravy train? Haven't we been waiting on GDDR7 for a couple of years?
NVIDIA GPUs costing $1K and over is crazy... AMD at least is being reasonable...
The 7900XTX is $1000. The 6950 XT was $1100.
Pay more, get more - it's the way the world works. You don't buy a Vauxhall if you can have a Ferrari.
The biggest segment in discrete graphics cards is the midrange, so yeah, they're going all-in on best value in class to win the greatest market share.
There are a number of parameters that go into a business decision, and we don't know the majority of them. Personal opinion, but there is no point for AMD to focus on a high-end consumer product. It's a high-cost, low-volume product which doesn't really fund things further. There is a lot more money to be made using their wafer allocation for DC products (AI in particular) over the next 2-3 years. Focusing on fewer SKUs (i.e. fewer tapeouts, etc.) and a better-value product (hopefully) to grow market share is a better goal for the short-to-mid term. UDNA is also fixing a mistake. Every Nvidia GPU for 10+ years (IIRC) has been CUDA capable. That means every student was able to write code, test it locally on their gaming laptop/desktop, and then scale it up on their university supercomputer. You can do that on a low-end Nvidia gaming GPU. This helped a lot in making Nvidia the number-one choice for anything GPGPU compute, HPC/scientific, AI, etc. Checking, it seems that the GeForce GT 430 was the first GPU to support CUDA (released in 2010).
Yep, I've been watching the growth of GPGPU since Nvidia announced and deployed their CUDA SDK for their GPUs. And note that the first real breakthrough research in AI (deep learning) was in 2012 in computer vision (AlexNet), using two 3GB GTX 580s. That tells you that Nvidia's investment in CUDA wasn't the joke or distraction it was characterized as back in 2010 by some gaming publications and journalists.
According to the wiki, the first supported GPUs were even earlier: the GeForce 8800 Ultra, GeForce 8800 GTX, and GeForce 8800 GTS (G80) series in 2007.
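To illustrate the accessibility point being made above: here is a minimal CUDA sketch of the sort a student could compile with nvcc and run locally on nearly any GeForce of the last decade before scaling the same code up on a cluster. It's a generic vector-add illustration, not anything from the video; the unified-memory shortcut assumes a roughly Kepler-era (2012) or newer GPU.

// vecadd.cu - build with: nvcc vecadd.cu -o vecadd
#include <cstdio>

// Each GPU thread adds one pair of elements.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // ~1M floats
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the sketch short (needs a Kepler-class or newer GPU).
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);  // 256 threads per block
    cudaDeviceSynchronize();                       // wait for the GPU to finish

    printf("c[0] = %.1f (expect 3.0)\n", c[0]);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The same few dozen lines run on a $100 used card or a datacenter part, which is exactly the on-ramp the comment above describes.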
Is it me, or does Paul's monitor backlight bleed like hell?
Nothing I do really requires a 4090. In fact, when I bought my last PC, I went with a 4080 at a very good discount price, and I have never regretted the decision.
Funny that my phone's feed actually dropped your vids. I'm back now.
Price-to-performance in the midrange matters more than having some flagship GPU that the competition's flagship eats.
If you can't make back the cost, why make the product? The *VAST* majority of people buy in the $200-500 MSRP range. That is where the market is, not in flagships.
I do hope AMD at least still makes the x800 (6800/7800) series. They're nicely priced and a good performance boost from their x700 variants.
11:27 Darn I was wanting to see Zygotic Washdance play their big hit "AI Yeah"
That does make a bit more sense and is sad, hopefully they get back to top end soon.
Where do you draw the line between mid-range and high-end GAMING GPUs besides just what's more expensive vs. less expensive?
Seems a really bad idea to have glass that's prone to a temper.
Thank you, Paul. I'm also not positive about them dropping out now that Nvidia is uncontested.
1. AMD already announced at the very start of the year that they would no longer be focusing on high end GPUs. Somehow the mainstream either ignored that, didn't believe it, or never saw it.
2. Why would people "rejoice" at this news, assuming AMD would now focus *more* on entry level and mid-ranged GPUs? That's not what this means. It just means they are no longer going to bother with high end GPUs. Their entry to mid-range offerings will be the same as they ever were.
3. Those who think this is "bad" because it means "less competition" haven't been paying attention. AMD never was competition for Nvidia at the high end, and especially not since the 30 series. Nvidia already knows that AMD was irrelevant to the decision-making of those who wanted enthusiast or flagship cards, because Nvidia's cards would be the ones purchased 95% of the time anyway.
Nvidia's pricing is not dictated by having AMD in the race. They've already proven how out of touch they are with reality, yet their cards still sell. The 4080 didn't, but that was primarily because the "value proposition" was low due to how much it already cost that people just decided to be upsold on the 4090.
Nobody needed AMD GPU compute for AI...
@@lucasrem "They've already proven how out of touch they are with reality, yet their cards still sell."
Because their customer base nowadays is the likes of AWS, Microsoft, Google, Oracle, IBM, OpenAI, Meta, HP Enterprise and so on.
They bring BILLIONS and BILLIONS of dollars into NVIDIA's revenue, and generally that's what the Shareholders WANT, since they're the kind of people who last time they played any video games, it was back when the Atari 2600 was the hot product.
Your statement on Nvidia being out of touch with reality contradicts itself. If they were out of touch with reality, they would not be selling well.
@@I.C.Weiner YES. If they were out of touch, Jensen would've been fired as CEO by the Board.
@@RockstarRomania The main institutional shareholders of Nvidia and AMD: Vanguard, BlackRock, State Street.
Along with the Big Tech customers you've named, that tells me that gamers' desires and opinions are meaningless.
Look at the last quarterly earnings for NVidia and AI/DC vs gaming is 9:1.
Meta, Google and MS have announced $100+ billion DC projects and are buying AI GPUs as fast as they can make them.
What’s kind of pathetic are those gamers whining that their favourite GPU is now $800 when it used to be $600, when a single 80GB NV H100 now goes for $45,000. That same piece of silicon makes them 50x the money as an AI chip rather than a gaming GPU.
What’s amazing is that NVidia or AMD are devoting ANY wafer allocations at all to gaming GPUs considering the ROI in the AI market.
If they embed DirectX 12/Vulkan/Proton support on the GPU itself - and it can be reprogrammed/reflashed - that could be very interesting!
My 1080 Ti still does a great job. I don't want to pay $900 AUD for a "mid-level" gaming GPU. I want value for money. Most people I know who game do not have 3080s or 4090s; most of us rock low-to-mid-end cards. If AMD simplifies their consumer GPU offerings to 2-3 value-for-money bangers, that's a win. Less R&D developing multiple SKUs, which simplifies binning, yields, etc. I think it's a good strategy.
So why can't they do both?
It's all well and good for AMD to focus on low-to-midrange GPUs; let's hope they don't pull some kind of snake-oil move on consumers and put high-end GPU prices on midrange GPUs.
Sunday morning here, since it is Paul’s tech news I’ll skip the coffee ;)
Although, where is your beer Paul?
I think the GPU strategy is primarily a decision to sidestep a crazily inflated high-speed GDDR market strained by demand for AI datacenter cards. If they were to use GDDR6X or GDDR7 themselves in any measurable volume, that would just bid up an already high-demand market further, and everyone loses. They have seen this as a long-term issue, long enough to design a whole generation of cards around it, which would have required that they began such plans at least a year ago and expect the strategy to work for a couple of years going forward in that segment.
Not a chance the X670E Crosshair will be in the £400 region. I paid £600 for the X670E Crosshair.
If we can have a good value battle in the midrange, the consumers win, but it sucks that the high end (presumably the 5080 onward) is just gonna be Nvidia's ballhouse.
At this point, I just want there to be good options in the 60 class.
The way UK-US exchange rates work with tech is you just change the £ sign to a $ sign.
So the motherboards won't cost as much as you think on the States' side of the pond.
Case in point, the PS5 Pro: £699 or $699. Done via real conversion rates, the UK price would come out well north of $900 USD.
Until the new X3Ds come out, they can keep their new motherboards. I would never have believed that I'd actually be considering going with the 285K this time. Everyone seems to be dragging their feet with new releases. Nvidia too.
AMD should be positioned to start offering datacenter-grade APUs to the prosumer and SMB market at some point - we can hope.
0:28 "I'd hesitate to dunk on Console Gamers too boldly a misfired meme could easily cause damage to our own tempered glass side panels"
This is a great statement, as a lot of PC channels have proved that they:
A - do not understand either PC or console hardware
B - do not know how to build a proper PC, as almost all of their builds guarantee stutter and poor performance, and those that didn't were almost all by accident
I have had to write off a number of channels as serious information sources because of their responses, which is a pity.
Why not Core Ultra 9 290K? Or Core Ultra 7 270K? Or Core Ultra 5 260K? The fact that they only have 285K/265K/245K suggests, at least to me, that Intel might be holding something back for when AMD releases its X3D. But who knows!