2017 : We show benchmarks live.
2023 : We say "Up To".
2024 : AI.
ayo could you not scream in the title please, my 10 year old gpu trynna sleep rn, thank you
!!!!!!!!!!!!!!!!!1!1!1!!11!!11!1!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
My RTX 3080 screams along unfortunately.
😆
My months-old GPU has a right old whine when it gets woken up
This is why fanboying any brand is wrong...
Fanboying only makes the company you like worse
Fanboying for any company is akin to simping for any woman. You lose respect and get taken for a ride in either case.
Don't Glorify Brands, Rob Them.
@@Satabell_Art except Ubisoft, bankrupt them
@@Mike80528 Damn, women aren't safe even under a damn GPU video
Another big corporation lying about their products??
"What a shocker" 😂
At this point, I think a small kid would do better in AMD's marketing branch.
They are like not even trying or something 💀
They've been doing this since forever. Remember 1st gen Ryzen was piss poor at gaming. What did AMD do at their presentation? They tested the 1800x at 4k vs the 6800K. Hilarious shit
Your comment is an hour old and the AMD copium army hasn't showed up yet? Wild..
@@mesicek7and it sold like hotcakes
@@GewelReal Because people didn't care as long as it was cheap. Just look how much the price dropped in 5 months 8core were going from $400+ and then all of a sudden were under $300. 1800x was $500 6months later $400
It's a scam, and it should be illegal to fool people like that.
It is!
@@ijaygee1 The FTC will never seriously investigate a claim of false advertising
But they'll get away with it with no consequences. Isn't capitalism so great?
No, because that would also stop anyone from doing things that aren't really bad. It's your fault if you just believe everything, as most people do.
I don't got anything to say really... But cow go moo, sheep go baah, and chicken go cluck.
You said enough, believe me.
Very good points, Sir. I have no counter arguments.
and pig go oink
Did Qualcomm get all those games recompiled for ARM64? Apps have to be compiled against the CPU arch, so how did they get those game numbers? Is it running through Windows' emulation layer?
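For context: x86/x64 games on these laptops do go through Microsoft's emulation layer unless the developer ships a native ARM64 build, and a process can even ask Windows whether it's being emulated. A minimal sketch in C, assuming Windows 10 1709+ where IsWow64Process2 is available:

#include <windows.h>
#include <stdio.h>

int main(void) {
    USHORT processMachine = 0, nativeMachine = 0;
    /* Ask the OS what machine it actually runs on. An x86/x64 build
       that sees a native ARM64 machine must be going through emulation. */
    if (IsWow64Process2(GetCurrentProcess(), &processMachine, &nativeMachine)) {
#if defined(_M_IX86) || defined(_M_X64)
        if (nativeMachine == IMAGE_FILE_MACHINE_ARM64)
            printf("x86/x64 binary running under ARM64 emulation\n");
        else
            printf("running natively on x86/x64\n");
#else
        printf("native ARM64 build\n");
#endif
    }
    return 0;
}

So any game numbers Qualcomm showed for titles without native ARM64 ports would include whatever overhead that translation layer adds.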
Magnetic fan swapping ... as a budget buyer, I think it's overkill if it's just to be able to (rarely) replace a fan without disassembling the card. Sapphire and ASRock use 1 to 3 screws.
btw: Sapphire already does this easy swap out thing with fans :)
The more you buy the more you save 👍
The more you save the more you buy!
@@MrBalrogos I have a 4080 Super... and the 50 series is now leaking... hope my card won't be worth half of what I paid within a year 😅
The more you buy, the more you save
@@georgemateescu1416 it's not correct, but it's accurate
@@georgemateescu1416 you are definitely not getting DLSS 4 frame gen 💀
Any news about FSR? Looks like AMD forgot about it
GoW:R is coming in September with FSR 3.1, which means FSR 3.1 will be available before September
@@SmolderingFire it will still be the same old pixelated mess
@@SmolderingFire too late to the party
You're right. It was supposed to be Q2, but Q2 is 17 days from being over. So either it's right around the corner or it's delayed.
@@GewelReal
Doubtful. Every update has brought pretty substantial changes and new implementations of FSR 3 frame gen are looking great.
Also, the fact you said it will be "the same old pixelated mess" shows you have no idea what we're talking about. In FSR 3.1 they are decoupling frame gen from the upscaling technology, so you can use whatever upscaler you want.
Show me a 6500 XT hitting 8GB of VRAM usage that isn't bottlenecked by the entire rest of the card... That said, it's just good to see companies not being stingy with VRAM.
If you check the website Qualcomm has with the list of compatible games, I can already tell this ARM laptop is gonna be useless for me. None of the games I play are compatible; some are classified as "it runs" (0 fps but it opens, according to the website), or "playable" but at 30 fps or lower. When I searched for the most popular games I know, only a very few of them run fine, which makes this laptop seem like extreme trash for me.
I'm incredibly excited about ARM GPUs in the future. In about 5 years it's going to completely transform the laptop space.
Hello There, Daniel.
From now on I will just buy MEOW parts - I need that in my life.
Isn't only Intel Lunar Lake on TSMC 3nm? AMD is on 4nm.
Zen 5 is on 4nm and Zen 5c is on 3nm. But all EUV nodes (N6, N5, N4 and N3 in their various variants) are produced at the same fabs. It's not until we reach High Numerical Aperture EUV that TSMC will start doing ground-up fabs again.
New Turok??? Hell yeah!
We need AMD, Nvidia, and Intel to get their fucking shit together, bro. Their performance-to-price every generation is fucking ridiculous.
Remember that Windows-on-ARM systems do not support AVX2 apps and games.
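That's also why well-behaved x86 software checks for AVX2 at runtime rather than assuming it: under an emulation layer that doesn't report AVX2 in CPUID, the check fails and a fallback path runs; code that skips the check and executes AVX2 instructions anyway is what breaks. A minimal sketch of that dispatch pattern in C (GCC/Clang, x86 targets only; the function names are my own illustration):

#include <stdio.h>

/* Same routine twice: one the compiler may vectorize with AVX2,
   one plain scalar fallback. */
__attribute__((target("avx2")))
static long sum_avx2(const int *v, long n) {
    long s = 0;
    for (long i = 0; i < n; i++) s += v[i];
    return s;
}

static long sum_scalar(const int *v, long n) {
    long s = 0;
    for (long i = 0; i < n; i++) s += v[i];
    return s;
}

int main(void) {
    int v[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    /* __builtin_cpu_supports reads CPUID at runtime; if the emulator
       doesn't expose AVX2, this returns 0 and the scalar path runs. */
    long s = __builtin_cpu_supports("avx2") ? sum_avx2(v, 8)
                                            : sum_scalar(v, 8);
    printf("sum = %ld\n", s);
    return 0;
}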
The silly part about making fudged graphs is that, as long as the refreshed refreshed refreshed AM4 chips are a good price, they don't need to lie to present good value. Say they're 20% slower than the Raptor Lake chips: if you're upgrading an old mobo you're not having to buy a new one, and you save more than that!
If I wanted to build a working time machine: raise the speed limit from 55mph.
Someone at AMD should be fired, and we don't know their name yet...
It was probably done on directions from the top 😂
@@Tugela60 I don’t think Lisa Su is directly responsible for this.
@alb.1911 The buck stops at the top. She gives the presentations, or at the very least directly oversees them; she knows what is on the slides, and she approves them.
I hope X Elite will show great CPU performance, low heat/noise, and long battery life. I hope it will be my new working machine.
A lot of people are talking about Snapdragon. That's a sign not to buy into the hype lol, it's not gonna be a breakthrough like last time
It's all going to come down to the x86 translation layer. Apple took a few tries to get there, but now I don't hear any complaints from anyone. Most people don't know (and hence don't care) about CPU architectures. Most only care about battery life on their laptops and that it will run everything they need.
Change is coming. AMD is working on ARM SoCs as well. Nvidia has their Tegra offerings, if they were bothered they could make it a market leader but AI is making them a tonne more money.
This ARM revolution is the most exciting thing in tech for the past 10 years or more to me. You can already technically run MacOS on an iPhone because it uses the same architecture as Macs, but of course Apple doesn't want that. Microsoft on the other hand... I think it's only a matter of time until we can plug our phones into a docking station and get a full fledged PC.
and then wirelessly
Samsung S24 Ultra DeX
It's going to take a lot from Microsoft to make people forget the absolutely MASSIVE corporate disaster it was when they dropped Windows Phone after everyone had rolled it out due to Active Directory, Exchange and OneDrive integration.
You can already do that wirelessly with Samsung phones.
Dex =/= Windows guys
Adding a "T" at the end of the CPU name with a 0.1GHz clock boost should not make it 32% faster.
How come God of War runs slower than The Witcher 3 on the SD X Elite? It's one of the best optimised games out there
I've been agonizing over upgrading my 1080 or waiting, but I might just spring for a 4080 now. It looks like the 5080 will be gimped for China reasons, and now you're saying there's a shortage. Plus they're talking about tariffs and sanctions in Congress, so a 5090 might legit cost $3000 or more, if you can get it at all.
Daniel you are epic! 😎Hope you are having a good day 😻
Everything is getting expensive. Might as well invest in tech stocks to get a slice of the pie.
Worst part is that people believe any BS they see and then complain.
daniel at this point you're my news for everything pc related so please keep it up lol
Those MEOW cards are absolutely epic. I love those silly designs
I saw the title and came as soon as I could. What's the news?
You are too late, it's not news anymore
Give the people a 12- to 16-core X3D CPU to upgrade their AM4-based 5800X/5900X builds, or don't even bother -- plenty won't be investing in anything newer until it's completely obsolete.
Question though: why did the RX 6600 perform that much better with the 5800XT in Cyberpunk 2077 compared to the 13600KF?
Because it wasn't actually GPU limited. There is an absolutely STUPID amount of assumption going on from Hardware Unboxed, now being parroted by others, resting on the apparent assumption that the settings on the games they tested were high enough to force the GPU limit to be the deciding factor. WE DO NOT KNOW what settings those benchmarks were performed at, or at least I've not been able to find any slides showing that. Literally the only footnote slide I've been able to find is the one HUB showed, which is a partial list and seems to have specifically cropped some info out. The amount of ragebaiting they are doing with no real basis is absurd, and people are being STUPID enough to make blind assumptions about it.
@@jtnachos16 ? "everyone else is stupid" ... "we don't know" ... pick one
It's a 1P benchmark being used to make marketing claims about their product relative to the competition. The onus is on the marketer to be honest in their claims. They get no benefit of the doubt, when they shouldn't be creating the doubt.
Could be that AMD GPUs have some integration with AMD CPUs through software, but the reason is unclear. Regardless, if you check the Hardware Unboxed video for 13700K, you can see that the 13700K is ~30% faster than the 5950X (which is what the 5900XT is essentially) in Cyberpunk.
pure luck basically.
@@greebj I did not say 'everyone else is stupid', I said that people are making STUPID ASSUMPTIONS without any real basis for that assumption other than 'HUB said this' and 'prior AM4 cpus didn't do this well'. Everybody always ignores the impact of microcode optimization, boost algo tweaking, better refined manufacturing on silicon quality, and better binning allowing more consistently maintainable boost clocks. ALL of these can add up to significant improvements in performance.
The reality is, without even TESTING things first, HUB jumped right in with really strong statements that have no real testing backing them. Now others are starting to blindly parrot that, doing NO critical thinking in the process. God forbid we talk about them IGNORING the context around who these CPUs are intended for. You aren't supposed to be buying these to shove into a new build paired with a top end GPU. These are intended as options for budget systems, or as potential upgrades for existing systems. There's also more to computer usage than just gaming purposes.
You literally need an 80-class card or better to make the GPU NOT be the bottleneck on even the weaker end of modern CPUs. If the new AM4 CPUs outperform 13th-gen K SKUs in a typical use scenario, which is a low-to-mid-end GPU paired with the CPU, then it is not false marketing. That's just people outright refusing to consider the context of the testing, much as both HUB and Daniel here have done, where they don't even TRY to think of a reason AMD might have paired that hardware beyond 'AMD is trying to mislead customers'. They put no deeper thought in, just roll with the immediate kneejerk. This is really obvious when they bring up both systems being on DDR4. OF COURSE THEY ARE. It's supposed to be an apples-to-apples comparison, which means the only difference in parts should be the CPU. AM4 literally doesn't support DDR5; it would be a very unfair comparison to pit a rig running DDR4 against one running DDR5.
As soon as I saw those charts from AMD on the livestream, I knew they'd be shunned. I upgraded from a 5800X to a 13600K, so I know the difference is huge and that AMD is misleading.
5800X to a 13600K? That upgrade made no sense. You went to an all-new platform by going with Intel, so you should have gotten a 7700X, which is faster in games than a 5800X and ties the 5800X3D.
TL;DR: you just bought a dead end platform and wasted money.
Congrats, you just wasted money. A 5800X delivers ~5.2 fps per watt, a 13600k does ~5.7 fps per watt. The X3D delivers ~6.4 fps per watt.
I guess all this ass kissing on this channel didn't do you any good; otherwise the smartest choice would've been a 5800X3D if you don't want to buy into AM5 or the new Intel socket.
Eh, you do you, but a 5800X3D would have gotten you 90-95% of the performance without spending on a motherboard and RAM on top.
@@andersjjensen exactly what I've said: 5800X3D delivers the same performance (if not better) without the extra cost of a new motherboard, ram and cpu.
Yeah, AMD is still the top dog in iGPUs, although they've had the same iGPU for 2 years. And if the Snapdragon X Elite could have Windows x86 games ported to Windows-on-ARM instead of emulating x86, it would get more performance.
18 to 24 months has been the typical release cadence for years from both Intel and AMD... And AMD's Strix Point is already announced. And yes, it obviously has an iGPU upgrade.
Teachers are the best
The arrogance of these tech companies towards gamers is ridiculous! Just wow
Finally, Daniel quit using clickbait in the title. Much better. Thanks a lot!
I hope a day won't come where you would be willing to pay 2 grand and still be unable to purchase a GPU. So long as TSMC is the only real option, this stuff is gonna get crazy. Intel has gotta be quick with that fab. Wonder if Intel will remain in business by that time or if it will be able to actually produce anything in its fab. Some crazy stuff is happening.
Bloody hell AMD... heads are going to roll over this debacle. It's another reason everyone should wait for independent reviews before buying GPUs and CPUs. They could have avoided this by including the test info in the slides with the graphs (RAM speed, GPU, resolution and quality settings) and by using both a 6600 and a 7900 XTX. Unfortunately the lie of omission is prevalent in the industry; case in point, Qualcomm's slides not comparing against AMD's APUs, as it's well known Intel's iGPUs are weaker than AMD's APUs.
But can ELITE run android?
As long as corporations like Adobe and Microsoft push anti-privacy products, I am not buying in. Recall, AI, etc. are an affront and a danger to everyone.
The Qualcomm stuff isn't for your market; it's for laptops. They use technical wording, but it's basically a fraction of a 4090: 1/20th of a 4090 if you want to make it look good, and much worse if you want the truth.
Snapdragon X Elite 🚀 🚀 🚀
Hardware unboxed is very good
Seems like the 5800XT and 5900XT are completely pointless CPUs. Why in the world would you ever buy one of those instead of a 5700X3D if you still want to use your old motherboard/RAM?
They would have to be really cheap to make sense. Like $250 for the 5900XT and $199 for the 5800XT.
For productivity use if you're still on AM4?
I know this sounds completely bonkers, but stay with me here: "Computers can run other types of programs than games". I know, WILD concept!
From what I'm gathering AM4 is still going strong in South America. As in, people buy new AM4 systems still. Both prebuilt and DIY.
Screw me. I bought a 3060 Ti when there were massive shortages due to the crypto-mining rush; don't tell me the 5070 will be overpriced again due to the BS AI rush.
The ability of AMD to one up itself in the pursuit of dropping the ball is hilarious at this point. I sometimes feel that there is no way this company thinks it is a good move but instead is being paid off by Nvidia to shoot themselves in the foot. It just seems more likely at this point. Everyone watching this video could do a better job with public perception and delivering performance estimates and choosing prices etc. AMD.. what are you doing dawg?
ARM comparison to Laptop Tickler CPU
Right now it's kind of hilarious to expect anything groundbreaking
Intel won't shoot themselves in the foot by making anything amazing for ARM, to destroy their own market share.
Intel should mock them for that
8:54 AMD's older 7940HS, released in Q1 last year, is much faster than this. 🤭
Ahh another AI keynote where Jensen will say "the more you buy - the more you save" or "I love Giga-Rays" for 2 hours.
AI reality:
if (bla bla) {
} else if (bla bla) {
} else {
}
Starfield burned my M.2 slot with the last update. The SSD is fine but the slot is dead. I saw on the internet that I am not the only one.
Thanks Daniel, love the videos you post. 👍
Now that I've said my thanks. 😂
Question for commenters, 7900xt..... worth the upgrade from a 1070 or should I get the GRE and upgrade to a 1440p monitor? (Newer series for the AV1 encoder if you have suggestions) 🫶
Get the GRE and overclock it so it performs like a 7900 XT. Should more than handle 1440P with just about any game.
Adreno didn't want to be compared to its dad Radeon. ;)
AMD did us dirty on this one. I do believe the Snapdragon X is going to be great compared to APUs, but it probably won't stack up well against modern discrete graphics. The performance will probably be in line with Apple's M series chips. Here's a video about the latest Windows games on them: ruclips.net/video/2KX5SzqrfUI/видео.htmlsi=mMIAkGmUUgsZVHKK
AMD holding gun 🔫
Looks at foot 🦶
Bang 💥
That's why the Ryzen 7800X3D is a better choice than the new 9000 family: it has great gaming performance and great value at the moment (maybe if they make an X3D version it will be otherwise)
It's why these fanboys need to go touch grass for a while. Why keep saying now is the time to buy a new CPU when the one in your PC right now is perfectly fine? The economy isn't in a good place, and these tech tubers should remind folks not to panic-buy products.
Yes, for gaming, but we need to wait for their 9000-series X3D chips, possibly end of this year or early 2025. It's confirmed they are doing the 3D ones. But me, with a 5800X3D, I want to go to AM5 soon so I'm not CPU limited.
The graphs were rage bait. Prove me wrong
AMD's true face?
We can all agree that AI is ruining things for gaming PC builders
I am a huge AMD fan, but this is disappointing... They have all my support, but this is not right.
The only thing the Snapdragon slides tell me is that we could have had double the GPU power if they didn't add the useless AI bullshit.
AMD's comparison: it's cherry-picked, it's misleading, it's disingenuous, and it deserves to be lambasted. But it's technically not a lie. The GPU bottleneck is not ideal, but you can technically run a system with that combo... a lie would simply be false reporting of the numbers.
Outside of taking issue with calling it a lie, it is a move intended to sucker the unaware.
6500XT 8GB vs 3050 6GB
I am mad
So, what lesson can we learn from this? If you present positive but obscure CPU performance comparisons without revealing how they were created, a tech youtuber will simply state that he doesn't understand what the comparison means, which more or less amounts to a tentative pass, and you certainly won't be screamed at. However, if you reveal exactly how your positive CPU performance comparison was made, using a lower tier GPU to match your lower tier CPU (a comparison that some gamers actually want to see, even if it is not the technically correct way to compare CPUs in general), you do get screamed at and (incorrectly) called a liar.
A proper response would have been to say that AMD's CPU performance comparison does demonstrate that the new CPU gives comparable performance to a higher tier CPU when bottlenecked by a popular lower tier GPU. It would also be proper to point out that when paired with a more powerful GPU, the situation would be very different. Starting with outrage and screaming about the comparison is not proper, even when AMD can (properly) be faulted for not clearly pointing out the nature of the comparison they present.
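To make the bottleneck argument concrete: to a first approximation, delivered frame rate is the minimum of what the CPU and the GPU can each sustain, which is why two very different CPUs tie behind a slow GPU. A toy model in C with made-up illustrative numbers (not real benchmark data):

#include <stdio.h>

/* Toy model: delivered fps is capped by the slower component. */
static double delivered_fps(double cpu_fps, double gpu_fps) {
    return cpu_fps < gpu_fps ? cpu_fps : gpu_fps;
}

int main(void) {
    /* Hypothetical CPU-only limits (what each CPU could push with an
       infinitely fast GPU) -- invented numbers for illustration. */
    double slow_cpu = 140.0, fast_cpu = 190.0;

    double slow_gpu = 90.0;   /* e.g. an entry-level card at high settings */
    double fast_gpu = 260.0;  /* e.g. a flagship card */

    printf("entry GPU: %.0f vs %.0f fps\n",
           delivered_fps(slow_cpu, slow_gpu),
           delivered_fps(fast_cpu, slow_gpu));  /* 90 vs 90: a tie */
    printf("flagship:  %.0f vs %.0f fps\n",
           delivered_fps(slow_cpu, fast_gpu),
           delivered_fps(fast_cpu, fast_gpu));  /* 140 vs 190: gap appears */
    return 0;
}

With the entry GPU both configurations print the same number, which is exactly the effect AMD's slide leans on; swap in the flagship GPU and the CPU gap reappears.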
TBH the Snapdragon chart is much more fubared than AMD's.
They are taking their top-of-the-line chip and comparing it to a 155H instead of Intel's top-of-the-line 285K.
They all do this, this is nothing new.
Also, it isn't just the 'up to' that is the weasel words.
It is the "at similar power levels"
Yes, an Intel part at 10 watts isn't as efficient as an ARM CPU. Which is why we've always used ARM in our phones.
This is not a breakthrough, this is all marketing gimmick.
Snapdragon is positioning a part against Apple's M3, while simultaneously comparing itself to Intel chips.
That alone should raise alarms.
Yes yes, at 10 watts snapdragon blows a 10 watt intel part away.
At 150 watts intel still absolutely creams every possible snapdragon part.
Gamers need the performance, not the mobility.
This is great for handhelds, but isn't much of an improvement over last-gen Snapdragon elite chips.
And for desktop gamers, it is weird that anyone is even giving this attention.
And they wouldn't imo if they were being more honest about this all being clickbait.
Snapdragon X ❤❤❤
I don't know what I hate the most, "AMD performance" or "PS5 performance". The 8K claims kill me; don't lie like that.
Ain’t nobody even knows what an intel 155H is 😂
18:02 3D Mark's Wild Life Extreme Graphics Whore.
Cool more inability for normal people to get computers at reasonable prices. Why would we need those things anyways?
Don't you guys have Trillions? (Spoken as Blizzard to Consumers)
damn daniel
U make good sense lol
A 6600 is entirely average cost and ability. That's what makes it a good card to use - because that's what most people will be using.
But think about some poor schmuck who doesn't know about this stuff, who "upgrades" their AM4-based system with one of these chips now (paired with something like an RX 6600) and then plans to upgrade to a better GPU some time down the road. They are gonna be sorely disappointed when their new GPU doesn't perform anywhere near as well as they were expecting, if they see any tangible performance uplift at all. Edit: almost forgot. AMD used DDR4 on both systems for these tests. Not using DDR5 on the Intel platform apparently really holds the Intel CPUs back from their full performance potential.
The RX 6600 sucks in 2024. Only 8GB of VRAM, and it struggles to run games at 1080p high and get 60+ fps.
@@jtenorj I'd say anyone who's into checking processor ability is halfway to having a clue already. It might catch a few people though, so you're right about that. But these days it's so easy to check on youtube etc.
@@ZackSNetwork The most popular card on Steam's latest survey is the 3060. That's almost the same as the 6600. Most people are using average systems to play games.
I finally understand what AMD stands for - Atrocious Marketing Department.
4:03 AMD testing like hardware channels in 2016: testing a CPU with a GPU bottleneck. Thanks, AMD, for the RX 6600 testing.
Yeah, you rely on your best stable-platform CPU and subsystem configuration, because your staff is inflated by newbies, know-nothings, or competitor moles setting you up. My bad.
inflated . . . INFILTRATED . . . my bad
The 5800XT could be a good CPU if the price is right. I wonder how it overclocks, maybe 5GHz... With decent DDR4 it's no slouch.
Pretty sure a standard 5800X can hit 5GHz + on its own with PBO turned on and really good cooling.
@@jtenorj maybe this one doesn't need really good cooling for 5GHz..
@@roki977 If these releases are anything like the 3600XT, 3800XT and 3900XT, all of which shipped without coolers, you're probably gonna need to buy a decent one if you don't already have it. In many cases, the XT processors actually performed worse than their lower-clocked X counterparts. And you don't need an expensive AIO for great cooling. You can get a $35ish (give or take a couple bucks) Thermalright dual-fan dual-tower cooler and it would be plenty to let something like a 5800X stretch its legs.
9900k beats 5950x
Stinky!
I hope Colorful go viral and make a tonne of money lol. We need alot more innovation and differentiation in the GPU shroud area, even if it's as daft as what Colorful are doing lol. In regards to the replaceable fans, XFX did this before; for example, the XFX RX480 had replaceable fans with push-down clips. Sapphire also do replaceable fans held down by a single screw.
*a lot
"alot" isn't a word
I hate how they're calling it XT, because I keep thinking it's a graphics card. I mean the 5700XT was a graphics card and the 5800XT is a processor? What are they smoking over there?
XT is a shitty OC-refresh CPU with a 2% difference. Same shit with the Ryzen 3000XT.
The RX 5700XT was a graphics card. The 5800XT is a processor. Now wait until the 7700X gets a facelift, and the confusion will be epic :P
It's misleading, not a lie. Dumb to compare the CPUs with a midrange video card.
This is as useful as the 4K FPS comparisons for CPUs I've seen sometimes.
I mean, basically they didn't lie... if you put in an RX 6600 and use those CPUs, they'll get nearly identical results... so, corporately speaking, "where's the lie?".
But why did they do this? Marketing is the answer I can come up with. If it had been with a 7900 XTX or 4090, showing the true capabilities of these 2 XT CPUs... they would have been forgotten as soon as the presentation moved on to showcase other products...
Like this, they caused a huge stir. People are now rambling about how outrageous both CPUs are. Major channels like yours, Gamers Nexus, Hardware Unboxed, and many others made videos specifically about this.
That literally exposes these 2 CPUs to thousands if not millions of viewers, of whom maybe below 1% across all channels would be interested.
That is still like ten thousand customers, just from doing this controversial thing.
And if they hadn't done that, would even a thousand buy this CPU, given how quickly it would've been forgotten?
AMD, dedicated to taking their marketing cues from Nvidia and Intel, even when they are emerging as a leader....
Why show up the competition by being truthful in your marketing? If you can't beat them, join them.
What GPU news? That there is no news?
AMD taking notes from Intel...
Hilarious how wrapped-up in nuanced benchmarking people get when pretty much all modern hardware is plenty fast enough to deliver a quality gaming/productivity experience for home PC users.
Daniel, to stupid questions, stupid answers... WHY would AMD do that? Because that is the valid comparison for someone on that budget. Someone with more resources will be looking at other CPUs and platforms, but for someone with a mid-range card from past generations, those CPUs are valid options and valid comparisons for that budget and use case...
No they are not... the 5700X3D and 5800X3D exist, and they are better by a long shot than those "new" modified 5800X and 5900X.
It's literally pointless to buy a non X3D CPU if you build a last gen AM4 gaming system.
@@laszlozsurka8991 Or if you are upgrading an existing AM4 based system with the express purpose of playing games.
First
Someone at AMD is being paid secretly by Nvidia to mess them up from the inside; otherwise I cannot understand it.
People buying a 5800XT will probably pair it with a 6600 XT-class GPU, not an RTX 4090. So I think they're showing the most probable use case.
🤦🏻♂️
You completely missed the point Daniel was explaining... 🤦♂
@@laszlozsurka8991 according to you, is this a cut and dry presentation slide or a marketing presentation slide?
@@bartdierickx4630 It's a fake slide to boost sales and fool customers. Think... there's no way that a slightly modified 5900X would be faster than a 13700K. Especially since the 13600K can already match or beat a 5800X3D
@@laszlozsurka8991 you're dodging the question. When coupled with a 6600 XT, or another low-end GPU that usually gets paired with this class of CPU, will you get about the same performance (100-104%)?
Criticizing AMD for being misleading is OK, but it is not a lie (assuming test runs were accurate as reported). Once again, independent reviewers can give a more complete picture. Manufacturers have a self-interest that will frequently mislead unsophisticated observers.
If you rig results so they look good, that IS a lie, lol. Anything that is deliberately misleading is lying.
But calling them out on it has actually helped a lot. AMD was extremely accurate with their claims up until RDNA3. After that there have been more and more of these BS things. We actually used to tell Intel and Nvidia to "Behave like AMD!" when they did their shenanigans.
@@andersjjensen When you have a good product you can post the numbers, but when it is not, then you have to resort to smoke and mirrors. Pretty much every company does this, AMD included. Most people posting on youtube about what they think companies should or should not do are naive about how business works.
You don't generate revenue by telling consumers that your product is junk or unexciting compared to a competitor's.
@@andersjjensen I am OK calling out AMD's misleading info and calling it BS.