Why AMD’s Bad Benchmarks Are BAD! Investigating The Lie
- Published: 26 Jun 2024
- Thermal Grizzly: www.thermal-grizzly.com/en/kr...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4070 Super - geni.us/wSqSO07
GeForce RTX 4070 Ti Super - geni.us/GxWGmYQ
GeForce RTX 4080 Super - geni.us/80D6BBA
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 XT - geni.us/eW2iWo
Radeon RX 7600 - geni.us/j2BgwXv
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6750 XT - geni.us/53sUN7
Radeon RX 6650 XT - geni.us/8Awx3
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6600 - geni.us/cCrY
Video Index
00:00 - Welcome to Hardware Unboxed
01:28 - Ad Spot
02:09 - Something about horses?
04:25 - Test System Specs
05:07 - Party Animals [RX 6600]
06:11 - Party Animals [RX 7900XT]
06:37 - Naraka: Bladepoint [RX 6600]
07:12 - Naraka: Bladepoint [RX 7900XT]
07:44 - Tiny Tina’s Wonderlands [RX 6600]
08:11 - Tiny Tina’s Wonderlands [RX 7900XT]
08:36 - Cyberpunk 2077 [RX 6600]
09:26 - Cyberpunk 2077 [RX 7900XT]
10:10 - F1 22 [RX 6600]
10:38 - F1 22 [RX 7900XT]
10:59 - Shadow of the Tomb Raider [RX 6600]
11:35 - Shadow of the Tomb Raider [RX 7900XT]
11:49 - Average [13600K vs 5800X]
12:35 - Average [13700K vs 5950X]
12:59 - Final Thoughts
Read this article on TechSpot: www.techspot.com/article/2853...
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo
Petition for Steve to stream Party Animals
Ohh you! :)
Yes...... I think my kids would love it
My key take-away from this video is “buy party animals”
The game of Kings, followed by Roblox.
W comment XD XD
You opened a can of worms, Steve... now all your benchmarks must include Party Animals :D
OMG! You're right 😅
Along with the 5800X3D... any CPU chart without both for the next 3-5 years will be considered a fail :P
@@antgib Unsure at this point whether there's any point to buying a 5800x3d rather than a 5700x3d, as far as bang for buck goes.
@@auturgicflosculator2183 Yeah, likely not, but it was more of a dig/joke in relation to previous videos and how Steve has said that in the past, if he leaves off the 5800X3D from the charts, he gets a bunch of comments asking why it wasn't listed/tested as well.
@@auturgicflosculator2183 The 5800X3D is worth it only if you already have an AM4 board and DDR4 RAM. The 7800X3D's performance is nice, but the gap isn't big enough to warrant a whole new platform. Maybe that will change with the 9800X3D. Remember that when you compare CPUs on different sockets for bang for buck, it isn't just the price-to-performance of the CPU, it's the price-to-performance of CPU + motherboard + RAM.
Alright Steve, you almost convinced us that you don't play Party Animals. Try again mate.
Steve having to lie to us and say he didn’t own party animals prior to this testing.
Sure steve, sure 😂
I'm pretty cracked at Party Animals though!
@@Hardwareunboxed I'm sure that Steve and Steve are competing to who can be more brutal.
I'm not complaining. It's very good that there are people pointing out when companies do bad things to consumers.
😂lol @@Hardwareunboxed
At least Steve is sharing it with his daughter, though. ;)
Good job including the 5800X3D Steve. Wouldn't have wanted you to have to do the whole video over again.
Imagine all the engineers at AMD, doing their best to make great CPUs, and then comes the marketing team and just screws up.
This kind of thing just gives AMD a bad reputation.
Yep. Same thing for Intel. And Qualcomm. And a ton of other tech companies...
Honestly marketing is such a useless component of society
These "new" cpus did not deserve their own SKUs in the first place.
Marketing and Sales are the bane of all development teams everywhere. Jesus, these people just say shit and then come to the back of the house saying, "We need you to make sure this happens." Fucking Marketing. DIAF.
@@blackraen I will certainly say that marketing has its moments. As someone who has needed to work with marketing teams a decent bit, their understanding of how people work is really helpful to actually deliver the developed product.
... That said, when what marketing is spouting is basically a completely different product than what you're working on, you get what seems to be the majority of what the industry does. I imagine the disconnect only gets worse and worse the larger the company is and the larger the stakes are.
Me holding 1 kg, vs Eddie Hall holding 1 kg. I must be 100% as strong as him right?
Iron or feathers? 😁
You are even stronger than Eddie Hall - I mean, you lift the same mass with way less strength!
literally this. good metaphor!
Or Brian Shaw......
@@johnscaramis2515 Iron, everybody knows metal is heavier than feathers.....
Remember fellas, billionaires and corporations are NOT your friends.
No one believes AMD or Intel benchmarks... We just wait for your honest reviews.
No, but they do usually represent something; this is just untrue.
I’m involved in sales and marketing and I don’t trust my boss to come up with something new that tests the limits of integrity. They just can’t help themselves.
I don't believe any of the big tech companies including Nvidia.
What does Intel have to do with this? I can't recall any time I've seen Intel, Nvidia, or even AMD's marketing lie _THIS_ bad. They all cherry pick titles, settings, and synthetics that show them most favorable, but this is just a straight up lie, possibly even bordering on illegal due to false advertising.
@@zodwraith5745 snake oil anyone!
AMD staff involved with this should be fired. This kinda thing ends up hurting your whole company's credibility.
big time.
Never hurt Intel too much the last 14 times they did it, did it?
@@Alwayslifted
They ended up so loved because they didn’t do that, and early Ryzen generations were tremendously successful with their only lies being underselling the performance. Lisa Su came out later stating Gen1 was a worst case scenario launch. Brutal honesty and the people loved it.
They’re sabotaging themselves going the Intel route.
Steve: "My 11-year-old daughter really enjoys Party Animals."
Steve's Daughter: "Daaad! Let ME play!!"
Steve's daughter probably has her own gaming PC. :D
@@samiraperi467 she does, they did a build on video together a while ago
@@samiraperi467 He has made two videos, making a gaming PC each for his older and younger daughter.
@@samiraperi467 prob has her own benchnarking rig!
Can we get an in-depth analysis of how RAM timings affects 99.9% frame times for Party Animals? I want to go pro and need the very best.
They were playing Party Animals while high as a kite.
Shit, that may actually be loads of fun!
I thought I heard him say that 😄
AMD, the best marketing team, Intel could ever ask for, way better than Intel's marketing team, it's astounding.
Honestly, I kinda agree, but with all the instability issues Intel is having right now I'm probably still going to go for AMD (if it's the best in performance and doesn't overheat easily). As for GPUs, I don't know; if the 5080 isn't $1200 again and is just as good as or better than a 4090, I'll go for that. (I've got a 6800 XT and I want to try ray tracing, so that's why I want to upgrade.)
It's only fair, after all the advertising Intel has done for AMD over the years. 😇
Honestly, it astounds me how consistently AMD, both the CPU and GPU divisions, shoot themselves in the foot with really stupid things like this when PC people's perception of them is really good.
AMD has had relatively honest marketing on the CPU side since Zen 1, usually erring on the side of underselling. It doesn't excuse the shady stuff they do, but how delusional do you have to be to compare both companies and conclude that AMD's marketing in any way makes Intel look good?
@@firescream2528 It's only the i9 with the issues. Also, you don't need more than an i7 or Ryzen 7 for gaming, because even if you get a 4090 you won't play at 1080p low settings; you'll go at least 1440p, and most likely 4K with a GPU that powerful, so the CPU becomes less relevant. Just get the best CPU for the money that gives your setup the most performance: with a 4090 at 4K there's no point getting more than an i5 or i7 / R5 or R7. But if you play esports games, then yeah, go for the best CPU available, since there it will help.
Ah yes, my favourite pro-AMD shill channel back again with a totally pro-AMD video, as the internet has told me for the last few years.
I used to be one of those people. Now I just realized most of them were Intel Fanboys defending Intel for their shortcomings.
Er....
What baloney. It isn't an AMD channel. Maybe you're an Intel fanboy who isn't capable of realising that neither Intel nor AMD is perfect.
Amdunboxed makes video about amd good, Intel bad !!!!111!1!111 grrr
@@randomtoxxyou really didn't catch that the comment was sarcasm huh?
That Snake Oil must've been on sale, seems like everyone has some.
From AMD to Nvidia, and of course, the inventor of snake oil, Intel. PC gamers definitely have a lot of snake oil in the market.
Intel: AMD is selling snake oil!!!
Everyone: Intel, STFU
AMD: No, Intel is right.
@@auritro3903 When you rub it on, you get 10 fps faster.
@@auritro3903"Snake oil? I think we still have some of that around someplace. Let me check in the back."
Can you submerge your PC in it like mineral oil?
This is like AutoMotorSport testing the newest Lambo and Ferrari in a parking garage.
No it's like testing them but using cruise control at 30 mph
@@hyperturbotechnomike The Chinese cars should be left in China. I cannot wait for everyone in the EU to find out all the information they're allowing to be sent to China with these all-new spyware devices.
And yes, that is typical of any car to have some spyware. The difference being that the CCP is not a particularly human-friendly part of the world. Their definition of human rights is to see how many of them they can take away.
@@mryellow6918 "At 30mph, both the Ferrari and the Lamborghini completes the 30 mile trip in 1 hour."
@@justsomepandawithinternetand For that matter, you could compare a Ford Model T with a Bugatti Veyron and hey, they're the same 😂
"Fuck that dead horse.. we're gonna flog it some more.... and its gonna be educational" if thats not a tshirt or bumper sticker lol
Did AMD really think that no one would notice the BS? How embarrassing.
The people responsible for this type of fraudulent marketing should be shamed and immediately fired
That would be Lisa and I don't think she is going anywhere.
@@rooster1012 I think Lisa is working with AI right now that she doesn't even care about gaming anymore, but that's just probably me, I don't know
@@shadowlemon69 no ceo can afford to not care about a market that that's worth billions in revenue every year.
@@matteo964 They most certainly can if they can make just a little more somewhere else. When production is limited (and it always is) then turning a higher margin product pays. Right now there's only so much high-end production time available. These 5900 revamps use older 7nm production lines that don't compete with their other products.
@@DamnedSilly the gaming or cpu market is a safe bet for AMD, they can't just ditch it overnight.
Nvidia's revenue in AI is something like $60+ billion every year, yet they still won't leave the gaming market despite the possibility of using the same silicon for products worth 10x as much, if not more.
Not only is the gaming market worth billions it's also a very safe investment. AMD ditching the gaming market altogether would be a very bold and unexpected move.
And yet Redditors will still call Steve an AMD fanboy. Sigh. When will people learn that Gamers Nexus and Hardware Unboxed are consumer focused and don't attach themselves to brand favoritism?
Well people know, shills dont 😂
Then why do they act like fanboys then? 😂
but.. but.. muh abUser benchmark and bapco Xprt results
I spent all this money on an EOL platform with DDR4-3600 CL18.
It'll perform like DDR5-7200 CL34, right?... Right?
@Tugela60 what have they "fan boyed" about?
@@tomstech4390 No. That CL18 kills the show. You should've gone for CL16 instead (or even CL15)
Goddamn Steve, chill on the savagery in the intro 😂😂😂😂😂😂😂
Haha can't help it :)
Maybe you don't realize how corrupt and manipulative the behavior from some of these companies is. He was too kind, IMHO.
They should be put under scrutiny, pinned down. That goes for all these companies and politicians/governments too. If only it worked like that everywhere!
@@WholesomePotato1 imo you should try to understand sarcasm lol
@@davidepannone6021 oh ok 😅 sorry
@@WholesomePotato1 no harm done DW lol. I thought the laughing emojis would give that away pretty easily.
AMD really let Intel get to them while they were ahead.
Thanks for the investigative journalism only two Steves in the world can provide.
when...? The 7800X3D is still the king
@@MallocFree90 Exactly. Now that AMD has the best CPU on the market(for gaming at least), they have now let the power get to their head, and are now resorting to what Intel used to do when they were at the top. With this attitude, Intel is going to again overtake them, and the cycle shall continue.
Intel's most likely going to dismantle Zen 5 if the rumours are true.
Let's hope Nvidia gets to AMD too. We all now know RDNA4 lacking a high end will hurt their ego, even though they say multi-chiplet needs more work, which is understandable, but!
Hell, why don't they smash two monolithic Navi 48 mid-range dies together, add 32GB of RAM, and compete with the RTX 5090?
Nope, AMD will pussy out sadly. There is no fire in their belly.
@@kalidesu Intel is now the one with a fire in its belly.
4:40
Party Animals catching strays for no reason 😥
Not sure why Steve is demeaning it when he plays Fortnite.... It is indeed not a good benchmark for CPUs though.
Best Party Animals review I've seen
Intel: fucks up with stupid ass Snake Oil promo.
AMD: hold my beer.
Hold my barrel of snake oil*
Amazing of AMD to bury the lede; if you have an R5 3600, you never need to upgrade because it's only a few percent slower than a 14900K.
You should have read the data correctly - even the 2600 is perfectly fine. Hell, why not throw in an overclocked FX-8350? Then you can match the 14900K in power consumption as well :D
@@suit1337 Good point. Staying on AM3+ is now really looking like the performance parity strat here. I'm starting to really think AMD isn't looking out for the consumers in this situation for some reason... Hmm...
"I had to buy party animals" checks steam profile (12860 hours ingame) Mr. Hardware?
It's sad to see AMD going down this slope.
Tbh, developer benchmarks being dishonest is not surprising to me; I would never use them as guidance. Still good it's called out tho!
First party benchmarks have always been stupid. Motherboard overclock presets have ALWAYS been stupid. god, it's like the last year has been just youtubers "discovering" this stuff and making a big deal of it like it's new.
@@PineyJustice Disagree, AMD was much more fair in their benchmark numbers a few years ago.
@@Logical They have put out some pretty stupid ones even a few years back. I have zero idea why they even bothered with these new XT chips... A 5950x3d would have at least been something worth launching, or bring ryzen 6000 to AM4 with some new APUs or idk, lots of stuff they could have done, but releasing a slightly downclocked 5950x and a slightly overclocked 5800x is just silly in general.
@@PineyJustice It's because they have additional silicon they need to get rid of. And a 5950x3d wouldn't have sold when you can just get a 7900x3d for the same or better performance.
Educational Flogging......
kinky!
Your safeword must be at least 32 characters long and contain at least one upper case letter, a number and a special character.
@@andersjjensengiven the video is about tech company blunders, the safe word cannot be longer than 16 characters and may not contain German Umlaut (looking at you Hotmail/MS 😁)
AMD Release 5800X 3 times
5800x
5700x
5800xt
5800xtx will be after that... then 5800xtxx, then 5800xtxX0X-X0 Don't laugh - there is a chance someone in AMD marketing just read this and got an idea....
But can it beat Bethesda Game Studio's SKYRIM re re re re re re... releases?
@@wezleyjackson9918 xfx rx 7900 xtx... dropped to £800 last night.
The 5700X was useful, as the performance difference was/is minor but the price difference made it worth it.
The 5800XT is useless these days because people don't care about 2% more performance. They care about better value, and if it's more expensive, nobody will buy it.
@@wezleyjackson9918 reminds me of 14nm++++++
It should be noted that AMD has apparently removed all of these slides from their Computex 2024 footnotes available for download. All slides and notes mentioning these processors miraculously vanished.
From the original endnotes we know they tested everything with DDR4-3200 C16 memory. Since you tested the Intel processors using DDR5, that could explain some minor deviations but there's no way titles like Cyberpunk would ever see double digit differences unless they're abusing the hell out of that "up to" qualifier.
The best part is there was no reason to do any of this to begin with. All this drama to announce products most people glossed over and the average consumer wouldn't even see. It feels like something the Radeon division would spit out since they seem incapable of finishing a single keynote without punching themselves in the face at some point.
I don't trust synthetic benchmarks. They are often manipulated or sponsored to favour a manufacturer, same with games. This is why diversification of testing methods is so important.
b-b-but i thought you are an AMD shill and never say bad thing about AMD?
-Random people on the interweb
Yeah the people on reddit hate this and many other channels.
If you look closer and turn off your brain, this is still an AMD ad. Why else would an AMD RX 7900 XT be used? We're safe, gentlemen. We can keep -witch- AMD-shill hunting Steve without remorse.
...
/s, just in case
@@leonro Steve has been very critical of AMD Radeon cards I dunno where you guys have been.
@@tmsphere You have obviously not actually read the words in the comment.
@@tmsphere Not the point, there are people, many of which frequent r/Hardware, who continue to accuse HUB of being anti-Nvidia and AMD biased. Possibly trolls with multiple accounts, because it’s the same out of context arguments over and over.
did AMD really think they would get away with this? It is laughable
It baffles me why AMD would market such an absurd claim, no one with a bit of technical knowledge would fall for that.
AMD has earned a lot of good will with the community, and the refreshes of AM4 CPUs worked wonders for that.
To this day I see people recommending AM4 because “CPUs are still releasing for AM4”.
Yet they shoot themselves in the foot for seemingly no reason.
It is a completely valid claim - with an RX 6600 you don't need to buy a 14900K to play at 1080p.
Now, is it stupid for AMD to market it like this? Completely.
Agreed. AMD is doing good things for their users by introducing faster AM4 chips, which they don't need to do. Perhaps they will even come out with a 5900X3D or 5950X3D, which again, would be good for users. But, lying to customers with blatantly false benchmarks undoes most of that goodwill, and even creates badwill. Someone at AMD should lose their job, or at least be censured, over these lies.
Going AM4 with a new build is a bad idea though.
@@gaiahunter3863then market it as "you don't need more"
@@zoopa9988 Intel only makes sense with the i3-12100, and AM5 is still way too expensive.
AM4 is absolutely valid for a new build.
The real story in these benchmarks is the 5800X3D still kicking ass up and down the benchmark charts. Comparable to the 14900K, lol. As someone who bought one of these things on day one and is still rocking it, I really don't feel a need to upgrade anytime soon.
Yep bought one myself a year ago or so and for the reduced price it was and is an absolute no-brainer for AM4.
To the point making a new AM4 build even today for budget gaming with a 5800X3D (or 5700X3D) isn't entirely a bad idea.
Yeah you cannot upgrade it later but by the time it becomes truly obsolete you are looking at a full overhaul anyway, probably.
I know, right? The X3D really is the Winamp of its day in the CPU market. It really whips the llama's a$$.
come on .. Steve will by now be known as "Party Steve"
He can hang out with Party Pete at the Falador Party Room.
If that's not on his next T-shirt, the world has gone to hell.
Is there any chance you might add some productivity comparisons to your test suite? Maybe a chromium compile or some such thing? I can handle a slide show for gaming performance but do care about how fast I can do some various productivity tests.
Thanks for showing that there is no need to update your pc at all if you game in 1080p
Apparently, AMD thinks their customers are idiots.
By the way their fanboys excuse their actions, maybe they are right.
Well those who think their actions in this case are funny certainly are idiots.
I've got a full AMD PC, but yeah, this is just inexcusable from AMD. The fanboys really are idiots; they were idiots back in 2015 and they STILL are idiots in 2024.
Because they are. There are people who genuinely believe AMD saved gamers from Intel with Ryzen.
To be fair there are people that comment on this channel that always cry when Steve pairs all CPUs with a 4090 when testing to prevent bottlenecking because they don't understand basic testing methods.
Like a 5700X is $160, a 5800X is $180 and a 5700X3d is like $200.
We really need to get what they are smoking.
They just have a lot of defective chips that don't perform to spec, so it does not hurt to release a bunch of different variations instead of throwing them all to the landfill.
i dont smoke anymore but i might take a hit if i can got some of that
@@Marleyismydog420 look at your username bro. nobody quits smoking. they just take a break.
userbenchmark is gonna have a field day with this one
abUser benchmark has never cared about truth.
They'll completely overlook the insane DDR5 RAM speeds being used.
@@tomstech4390 Yep, they even change the rules whenever they need to in order to keep Intel ahead (like when Ryzen launched and beat Intel in multicore performance). And if someone reading this doesn't believe that, just read their blurb on the 7800X3D and ask yourself whether a serious reviewer would EVER write something like that.
1st party claims are why we come to Hardware Unboxed... Some things will never change
Can't wait for user benchmarks to discover this
Yeah, they GPU handicapped the test machine to death so a celeron would perform the same.
Also dont lie, you did play party animals too and loved it.
Over exaggeration
I BURST out laughing when Steve said "f*ck that dead horse!"
The only thing you can get from a test with a "more reasonable GPU" is "what is the minimum CPU performance required for said GPU".
"You were the chosen one AMDanakin. It was said that you would destroy the Sith, not join them."
AMD: It's over Intel-kin, I have the lie ground!
Intel: You underestimate my power consumption!
Underrated comment. 🤣
On the plus side, I'm looking forward to more Party Animals benchmarks in the future.
Plot twist: they did the marketing thing to sell only the 2 CPUs to Steve, and now that he has to test them they succeeded
Damn what a mess up. Great PR move for a relatively harmless AM4 launch. Someone is getting fired 😂
🔥 intro and 🔥editing by Balin
This type of benchmark would actually be great, because it would let you see how low of a CPU you could get away with and how high of a GPU. For example, a 5600 would be the play at 7:33, and if I had added a 7900 XT to my PC at 8:18, it wouldn't be the world's biggest bottleneck.
That's the value of having independent, honest third-party reviewers. Consumers can be informed with real benchmark data instead of faked and biased benchmarks.
man, if AMD gave us Ryzen 9 5950X3D, I would buy one...
Yeah, from what I see there’s probably 10x the interest in a 5950X3D than there is in any higher binned non-X3D parts. And if you have these higher-binned CCDs why not use them to make X3D chips out of? Then have 5800XT3D and 5950XT3D…
AMD Could've ended AM4 with:
Ryzen 5 5500x3D - $159.99 - (Basically a 5700x3D, but with 2-less cores, or think a 5600x3D)
Ryzen 7 5850x3D - $279.99 - (A 100mhz speed-bump on the 5800x3D)
Ryzen 9 5950x3D - $399.99 - (BOTH CCDs have 3D-VCache, unlike the 7000-series)
---------------This would have gone over better than the XT chips---------------------
Idk why tf they had to name them XT; now people are gonna think the 5800XT is a graphics card better than the 5700XT.
5:18 My old R5 3600 is as fast as a 5950X and R7 5800X3D... but wait, it's not.
Correcting the misleading. Thanks guys.
But I read on RUclips comments that AMD is our friend. They would never do anything wrong. I refuse to believe any of this. 😅
*Never believe any stats with no context!*
You said 5800XT when pointing at the 5800X. Is it an error, or am I missing something?
I understand that testing CPUs while GPU bottlenecked is a bad idea, but I do like the idea of having a 'reality check' section in a CPU review video where you show the kind of performance you can expect when not using the newest and shiniest 'this costs 5x what a flagship in 2016 cost' GPU, just to keep some perspective on things
That sort of thing isn't nearly as useful as you think. Essentially you're asking for the impossible. The testing you're suggesting will lead to data that varies massively depending on the game used and the quality settings. The RX 6600 can render 200 fps in Fortnite, or 20 fps, just depends on the settings. Therefore the best way is to test how we already are, as that tells you everything you need to know.
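The GPU-bottleneck point this thread keeps circling can be sketched as a simple min() model. This is a hypothetical illustration with made-up numbers, not HUB's actual methodology: the delivered frame rate is capped by whichever component is slower, which is why a weak GPU makes very different CPUs look identical.

```python
# Hypothetical bottleneck model: the system delivers the lower of what
# the CPU can feed and what the GPU can render. All numbers are made up.

def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Delivered frame rate is limited by the slower component."""
    return min(cpu_fps_cap, gpu_fps_cap)

fast_cpu, slow_cpu = 250.0, 140.0

# With a heavily limited GPU the CPU difference vanishes:
weak_gpu = 90.0
print(effective_fps(fast_cpu, weak_gpu))  # 90.0
print(effective_fps(slow_cpu, weak_gpu))  # 90.0 -- they look "equal"

# With a top-end GPU the real CPU gap reappears:
strong_gpu = 300.0
print(effective_fps(fast_cpu, strong_gpu))  # 250.0
print(effective_fps(slow_cpu, strong_gpu))  # 140.0
```

This is why reviewers test CPUs with the fastest GPU available: it removes the min() clamp so the CPU numbers actually mean something.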
Can we have a video on the 2 GOATed parts, the 1080ti and 5800x3d? See how they still do?
5800X3D? Too soon to tell it's literally just 3 years old
We have videos dedicated for both of them.
@@Anankin12 2 years old*
By that logic, let's mix the 5800X3D and the 8800GT, both GOATed.
@@WamblyHades I'd be happy to see that for the memes.
AMD thought they were sneaky lol
9:13 My guess would be that since they wrote at the bottom of the slides that they were considering best-case scenarios, they justified it in their heads that comparing Intel's 1% lows to their averages would be fine.
Will you review the new X Elite processors?
Nice review of the 6600! 😅 My i3-12100 is well represented here; turns out it's more than powerful enough for the 6600. 😅 I've been eyeing the 6600 since Diablo 4, but the game is still a turd. I'm likely to buy a new GPU once PoE2 comes out, something that can run it at 4K 60 to 120 fps.
I'm no Intel fan (and was an AMD fanboy more than a decade ago), but I think AMD is doing this because Intel is likely catching up and might dominate starting next year (ignoring X3D chips).
That has to be it. Why pull this stunt if you're decisively ahead and will remain so? Doesn't make sense.
@@MrHav1k It honestly has little to do with that. AMD knows there is a high number of people still on the AM4 platform who most likely will not upgrade to AM5, and is trying to leverage that by releasing CPUs that "on paper" can compete with discounted Intel parts. You now have discounted Z690 and B660 LGA1700 boards going for $90 to $120, and CPUs like the 12600, 12700, 13400, and 13600 at discounted prices - a path to a "current gen" system that will easily last 5 years for those on a tight budget, without falling into the pit that is the "H" series of motherboards. I was able to build a system for a friend for slightly less than $650 (CPU, GPU, 1TB Gen 4 SSD, 16GB RAM, and motherboard) that is quite a decent upgrade on the CPU and motherboard side; if his GPU and RAM weren't so bad, we would have spent slightly less than $400.
F*** that dead horse. I died. 🤣🤣🤣. I don't know what AMD were thinking with this mess. Zen5 looks impressive so there was zero need to do this crap. It could tarnish the roll out of the next gen cpus.
Steve, there is a typo in charts "Ryzen 7 7600". I'm confused is it Ryzen 7 7700 or Ryzen 5 7600?
7600
That 12100 is trying really hard. I like it.
No shame in some casual Party animals fun 😆🤣
Companies need to be fined for misleading graphics
Just never trust first party slides/graphics
It's accurate if you use DDR4 like everyone does, instead of DDR5-7200 CL34.
(AM5 is expensive because it requires DDR5, so we bought LGA1700 and DDR4, remember?)
Thank you for calling AMD out in such an elaborate way!
Steve, are you using Intel's settings for the CPU, or the board makers' settings which allow unlimited overclocking? I'm still not sure on this one, as even you would appear to use overclocked Intel against AMD's enforced per-spec limits. Can you confirm, or otherwise redo the same games based on Intel's CPU spec settings?
253w profile was used.
"F^%# that dead horse" -HUB Steve
I don't wanna _know_ what hardware needs unboxing for that!
It's crazy how much momentum AMD had a couple years back! Seemingly it has come to a screeching halt!
That's not what their financial posting says. They've made substantial inroads in all CPU related market segments and their profits are decent. This is just a complete foot-in-mouth moment that didn't need to happen. Getting rid of some more AM4 silicon isn't going to move the needle by an appreciable amount, so losing credibility over it is just extra dumb.
Wouldn't say that it has halted, but I would agree that it has substantially reduced. Now Intel is the one that has to rapidly improve to take back the throne.
The key thing I got from this video is the potential of the R5 2600 to be a great budget CPU for an RX 6600 XT or RTX 3070. It can often be bought for only $38 in my country.
This reminds me of Nvidia's outrageous claim that the 4070 Ti was 2.8x faster than the 3090 Ti. We get it, first-party benchmarks are trash; even Apple is guilty of that. We got it, time to move on.
Graphs have always been a way to lie with maths.
How to buy PC parts:
Look at reviews with CPUs paired with high-end GPUs.
Find a GPU you can afford and see how many frames it can push.
Pair the GPU with an adequate CPU that has about 10-15% headroom over your GPU.
Done.
That's if you're building a whole system at once. If you're clever you then keep an eye open for sales on cpu/mb combos that can offer a significant step up while waiting for the next GPU upgrade. Keep bouncing back and forth and only buy once the market has stabilized post generational launch. That way when the next GPU comes along you'll already have a cpu that can keep up and vice versa.
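The headroom rule of thumb from the steps above can be sketched numerically. This is a hypothetical example with invented review numbers, just to show how the 10-15% margin filters CPU candidates:

```python
# Hypothetical build-planning sketch: take the GPU's expected frame rate
# from reviews, then require a CPU that can feed roughly 10-15% more
# frames than the GPU will render, so the GPU stays the limiting factor.

def required_cpu_fps(gpu_fps: float, headroom: float = 0.15) -> float:
    """Minimum CPU frame-rate cap for the chosen GPU plus headroom."""
    return gpu_fps * (1.0 + headroom)

# Invented numbers for illustration (fps in your target games):
gpu_fps = 120.0
cpu_candidates = {"budget CPU": 110.0, "mid-range CPU": 150.0, "high-end CPU": 220.0}

target = required_cpu_fps(gpu_fps)  # roughly 138 with 15% headroom
suitable = [name for name, fps in cpu_candidates.items() if fps >= target]
print(suitable)  # ['mid-range CPU', 'high-end CPU']
```

The CPU fps caps here are exactly what CPU reviews measure when they pair every chip with a flagship GPU, which is why that testing method feeds directly into this buying process.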
AMD, and any other brand for that matter, need to be called out on this misleading marketing. Newer builders and people that don't understand how bottlenecks work need this kind of information. Any company engaged in such shady advertising should be put on blast for doing so. Taking advantage of customers lack of knowledge or failing to notice the state of the test parameters is about as shady as it gets.
Did AMD mention in the footnotes whether they enabled SAM in their benchmarks? I'm trying to figure out how they got better results in a few games when both were clearly GPU-limited, and I wonder if they had SAM turned off on the Intel tests but turned on for the AMD platforms. Perhaps that's how they magically created small performance leads in a few titles for their graphs. Just a thought.
A few possibilities: matching DDR4 instead of letting the Intel chips benefit from DDR5, selecting some of Intel's newer reduced-power profiles, and maybe some contribution from using stock coolers on both chips, which could hurt the Intel chips more due to their generally higher power consumption (although that isn't anything I've specifically checked).
11:50 AMD's own testing: R7 5800X = 5800X3D
-> so the X3D is useless...
How do you shoot yourself in the foot harder than this?
Intel faking benchmarks : Liars!!! Deception! Scam! 😡
AMD faking benchmarks: whoopsie, chill, they are just trolling 🤭
Nvidia with 2X performance over last gen (FG enabled + DLSS P lol), everybody's faking benchmarks, including Qualcomm
Shhh... AMD shills are now gonna come and start a war in the replies about why using the RX 6600 is accurate, since most people use an RX 6600 with a 5900XT (which isn't even true by itself lmfao)
@@shadowlemon69 Thank you for this comment, I will buy a 3060 Ti FE. I thought the 4060 was faster but....
@@chesslive2714 I know this is a bad time to talk about AMD, but you can check out the RX 6700 XT, it performs really well and has 12GB of VRAM, though go with Nvidia if you're using RT, DLSS and other features that are Nvidia exclusive.
@@shadowlemon69 Precisely. Faking benchmarks isn't anything new. Intel, AMD, NVIDIA, Qualcomm, Apple, literally everyone does it. Unfortunate that it's become a standard now.
We NEED footage of Steve playing party animals.
Were the Intel CPUs set to the default Intel spec in the BIOS?
Lol good one, wonder how AMD did theirs, and whether they used the default or the Intel Baseline settings.
There won't be that big of a difference if you power-limit the Intel CPUs (which is essentially what the baseline profiles do), even in the most demanding games: 10% at most. Steve has proved that the games tested here aren't CPU-demanding at all, so performance will be practically identical. Even the i3-12100 performs close to the 14900K here, so no difference at all.
The reason AMD was getting worse numbers with the 13th gen processors is they were using DDR4 instead of DDR5. Unfortunately your standard test setup is too honest about the actual performance of 13th gen with reasonable RAM selection.
It baffles me how AMD thought, "Yeah, let's take the generation that lost to Alder Lake and pretend it's better than Raptor Lake" XDDDD
It also reminds me of the Bulldozer era, when AMD likewise made shady benchmarks where they barely beat Intel. Ah, the good old days 😅
Eh, that gen didn't lose; the long-lived AM4 socket and the 5800X3D are some of the best things to happen to CPUs in a very long time. The rest was pretty good too, after AMD lowered the silly prices and released actual mid-range CPUs.
And on Intel's side, Alder Lake was awesome, but Raptor Lake is basically just overclocked Alder Lake.
Marketing lies are of course BS either way.
Zen 3 didn't lose to Alder Lake in gaming; the 5800X3D was neck and neck with the i9-12900K while being significantly cheaper in its release year. That's why the latter got aggressively price-dropped from its initial $650 all the way down to $300 now. Intel would still be committing highway robbery if it weren't for AMD.
@@termitreter6545 Yeah, but I'm not talking about the X3D parts, just pure Zen 3. Also, where did I mention chipset support?
Good effort, but something I noticed that's always missing from benchmarks is how the tested CPU performs relative to other samples of its own model number, and it's a bit surprising that this info never appears in videos. It can be retrieved fairly accurately with Passmark: test the CPU, report its percentile, and show how it compares with its siblings. That would shine a better light on the results we're seeing. For example, if the 5800X sample sits in the 50th percentile among other 5800X CPUs while the 13600K sample sits in the 20th percentile, the comparison isn't really valid, because a middling sample of one model is being pitted against an outlier sample of the other.
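The silicon-lottery check that comment describes can be sketched in a few lines. The scores below are invented placeholders standing in for per-sample results (this does not use any real Passmark data or API):

```python
from bisect import bisect_left

def percentile_rank(score, model_scores):
    """Percentage of sampled chips of the same model that score below `score`."""
    ranked = sorted(model_scores)
    return 100 * bisect_left(ranked, score) / len(ranked)

# Hypothetical per-sample scores for two CPU models:
scores_5800x  = [24500, 25100, 25600, 26000, 26400]
scores_13600k = [32000, 32800, 33500, 34100, 34700]

# If the tested 5800X is a mid-pack sample but the tested 13600K is a
# bottom-of-the-barrel one, the head-to-head comparison is skewed:
print(percentile_rank(25600, scores_5800x))   # mid-pack sample
print(percentile_rank(32000, scores_13600k))  # slowest sample in the set
```

Reporting both percentiles alongside the fps numbers would show whether a gap comes from the architecture or just from the particular chips on the test bench.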
Embarrassing... This just made me decide to wait until Q4 to see what Arrow Lake has to offer, despite the fact that I now have every part other than the mobo/chip/RAM sitting around unboxed in my house.
AMD has just become the new Intel.
This should be a clear violation of marketing laws and should be punished directly with fines. It's absurd that this is allowed.
Nobody has done anything about ludicrously deceptive claims by others in the past. There's nothing new about this; that's why we wait for third-party reviews. What Intel did, gaining ~10% performance by letting motherboards set out-of-spec power defaults, has had a far more harmful effect.
Where in the text did you see "5800XT"?
Maybe they did the old trick of using super-fast RAM on their side while giving Intel the "recommended" RAM speed?
And 5 seconds after I typed it you say that's not it, darn. :P
still better than intel chips nuking themselves.
I can see the "but AMD changed Intel" comments already coming...
"Bu..bu..but... AMD is the underdog... and Intel use to lie before... so AMD is excused from lying too 🥺"
Cognitive dissonance at its finest.
If this doesn't make people understand why big GPUs should be used for CPU testing, I don't know what will.
damn my old 3600 is as fast as 14900 thx amd love it
they must have really bought that snake oil i guess ...
Userbenchmark was right all along 😱😱😱
Userbenchmark hosting a party right now with this video looped on repeat in the background
Advanced Marketing Device
@@stevanko1 .. while playing Party Animals...