If you missed it, check out our mini-documentary covering the final days of EVGA's GPU division -- it's really worth the watch: ruclips.net/video/Gc0YlQS3Rx4/видео.html WE WILL RUN OUT OF DISAPPOINTMENT T-SHIRT ALLOCATION BY NEXT WEEK! Grab a 100% cotton shirt here: store.gamersnexus.net/products/disappointment-pc-t-shirt-2022-100-cotton-black And lightweight tri-blend here: store.gamersnexus.net/products/disappointment-pc-t-shirt-2022-triblend-black
psst, the 5600 is 120 bucks with free delivery from China. P.S. AM4 boards are generally a bit cheaper than comparable LGA1700 boards, or a bit better for the same money. Plus AM4 CPUs eat less power, so they go easier on VRMs, and you get a better upgrade path if you choose from the really cheap boards: practically anything (e.g. a B450M S2H) can run a 5800X3D, but to run a 13600K you need a way pricier board than a B450M S2H. The 5800X3D and 13600K are comparable in gaming.
@@blacksama_ The 5500 sucks as a gaming CPU, but the 5600 exists and costs only 120 bucks, so the 5500 is irrelevant anyway and should be ignored, just like the 12100(F) and 13100(F) should be ignored. Just get a 5600 from China with an MSI B550M PRO or ASUS TUF GAMING B550M-PLUS depending on your budget and be happy. Or even a GIGABYTE B450M S2H, or an MSI A520M PRO if it's noticeably cheaper for you.
Great video as always. I have a suggestion for an fps/dollar-like metric for production tests. You could, for example, use performance expressed as a percentage for your dividend. So in this case the 12100F is 100%/100 dollars, and the 5600 might be 130% of 12100F performance/140 dollars. You've already done it plenty of times verbally; might be worth putting that in the chart.
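A minimal sketch of the math that commenter is proposing. All the percentages and prices below are made up purely for illustration; the 12100F-as-100%-baseline and the dollar figures are hypothetical, not GN's test data:

```python
# Value metric: percent-of-baseline performance per dollar.
# Every percentage and price here is invented for illustration.
parts = {
    "12100F": {"relative_perf": 100.0, "price_usd": 100.0},  # the baseline part
    "5600":   {"relative_perf": 130.0, "price_usd": 140.0},  # e.g. 130% of baseline
}

def value_index(part):
    """Percentage points of baseline performance per dollar spent."""
    return part["relative_perf"] / part["price_usd"]

for name, spec in parts.items():
    print(f"{name}: {value_index(spec):.3f} %-points per dollar")
```

With these invented numbers the baseline chip scores 1.000 and the faster chip scores slightly less, which is exactly the kind of at-a-glance comparison the suggestion is after -- though the hidden assumption is still one day's prices in one market.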
Just got done going through comments and scrubbing a bunch of the spambots... remember that they're getting tricky! They copy/paste comments from actual users, somehow insta bot-upvote themselves, and then hope you'll click on the profile picture so they can phish you. The profiles are all of a very obvious type. Downvote or report them if you feel like contributing, but we'll keep nuking them for now!
I generally look through the comments and report them as spam, they are ridiculously obvious and it doesn't take long to do. Have to wonder if it achieves anything with youtube though...
@@rickysargulesh1053 They always have links in their profile that are either obviously malicious or are a redirect from a legit-looking site to a malicious site in an attempt to get your personal information. Not 100% sure how it works, but pretty sure it goes something like that.
Yeah. The last good value GPU, the 5700XT, launched in 2019. Since then it's been nothing but turd after turd. Edit: due to the number of replies I have gotten, I wish to clarify things. I am talking about what AMD/Nvidia have been trying to sell their cards for. Excluding companies trying to sell off the last of their stock before new cards come in, or fake MSRPs, everything after the 5700XT has been crap. The 5700XT was good on day one and remained good until crypto went nuts, long after launch. Cards after the 5700XT have been worse value or, in one or two best-case scenarios, were able to at least match the 5700XT. But matching your old generation isn't an improvement; it's pointless and a waste of time. So yes, you can find a better card than the 5700XT as the last of the 6600s are being sold off before they're replaced by the 7600 or 7500, but that's a blip at the end of the product's life and not what it was for about a year after launch.
class act to take community feedback, add more CPUs to the chart for comparison, and redo the video with new testing. can't imagine the work that required. this is why you guys are awesome.
A large YouTube channel taking suggestions from the comment section and incorporating them into an updated video? Preposterous! Seriously, this is one of the reasons I love you guys. The content here isn't backed by pride or bias. It's cold, hard facts.
We love these videos, but aren't you supposed to be taking a break, Steve? *uses concerned parent voice* Come on, man, take a rest. You've been killing it, and we want you around for many years.
@@GamersNexus please take the break! Maybe it's the overwork, but the cringy Eminem "Forgot about Cezanne" joke was a definite departure from the absolutely fantastic joke writing of the last few months of videos.
@@ArtisChronicles Hate when that happens... But then, when you buy something, it had better be worth the monetary investment to you, so that you can't complain. Still, if you're on a tighter budget (as I often am), it must sting, so I really have to jump on sales or price drops, which I did with the 5600 and the promotional "Uncharted 4" key. That's a good way of throwing in an incentive and adding some value, besides the price being good for the CPU alone anyway.
I agree with the fps/$ reservations because there are just so many ways people buy CPUs:
- drop-in upgrades
- platform upgrades
- new builds
For all of these, and especially for the huge range of price tiers in the latter two, the actual cost and fps increase from changing CPUs is hugely different, and it's difficult to show this in one or even 2 or 3 charts
Definitely good points, especially since it is - effectively - theoretical. What GPU you get & what resolution you game at can have a MASSIVE impact on your personal, real-life FPS per dollar rating. When, like in this video, you're looking at budget options, you aren't going to see real-life FPS increase per dollar on a matching lower end GPU compared to something like the 3090Ti they use in their chart. You can get all kinds of differing bottlenecks between different hardware combinations & resolutions. For example, I upgraded from a 6700k to a 5600x the same time I upgraded my GPU to an RTX 3060Ti. The 5600X would have a meaningful FPS impact on that GPU when gaming at 1080p to help justify its value, whereas at 1440p and up the bottleneck shifts pretty heavily to GPU so would be comparably poorer. And even RAM rank, speed, etc can impact the outcome. Feels like FPS/$ charts are simply too broad & theoretical to have much applicable value.
Exactly. The graph basically compares how these CPUs, today (as in on this very day) and on the US market, do as a drop-in upgrade for people who play those 7 games. If you need to upgrade the whole platform, watch the video on the weekend, don't live in the US, or play any other game, the graph is rather worthless for price comparisons.
I bought an i7-12700K for 330; it's a 2021 batch with AVX-512. The MSI Z690 Pro A DDR5 was cheap too, 150 bucks. Then I added two 1TB M.2 NVMe PCIe Gen 4 SSDs. It's a music production and console emulation PC, no graphics card needed. It's air cooled (Deepcool Assassin 3, and Corsair 5000D Airflow). I bought an overkill power supply (wasn't cheap). Waiting for a good GPU: the 3080 Ti is a good card, but it uses way too much power and makes a lot of noise, and Radeon is even worse. All of the 3D cache nonsense and the overheating CPU problem could be solved if both AMD and Intel simply added quad-channel memory support and focused on increasing IPC. My alternative was an overpriced gaming laptop with overheating chips, less RAM and storage, and slow processors. At least a 3060 mobile is an amazing 1080p beast at a low wattage, and the i5-13600K is a no-no.
I almost chose AM5 because I read somewhere that it would have quad-channel memory support; then I remembered that AMD wants to sell gimmicky 3D cache chips, so that never happened. Quad-channel memory support is so overlooked. It's crazy what a performance upgrade it gave Intel HEDT in games. With DDR5, imagine the absurd performance increase vs dual-channel DDR4.
I feel sorry that you're faced with an impossible task. There seems to be a lot of viewers who can't/are unwilling to look at a graph and analyze the results but simply look for the highest number. Your disclaimers about percentage charts were great, but many won't see the difference. Still, keep doing what you're doing because it's vital info for thousands.
Thanks man. 🙏 Amen to that statement. I don't get why people are no longer able to use their own freaking CPU, aka brain, to understand reviews. Feels like most of those guys are still on a budget CPU from the mid 2000s.
For budget/value CPU reviews, I feel like the included cooler should be part of the assessment as well. For example, does it even come with a cooler, and does that cooler let it run as intended? I know the coolers are generally the same ones they've had for years, but they're part of what lets budget builders save a bit, especially as the cooler is a higher percentage of the build cost at the low end, as you said. I also know your charts are done with an AIO, which is sometimes unattainable for budget builders, but I understand it makes the charts more consistent. If there was an i3-12100/13100 vs 5500/5600 air-cooled numbers chart, that would make a good additional data point to evaluate. Thanks for the review.
Honestly you've continually pushed your quality higher and higher, and I hope we all agree on that. Thank you for your dedication to clear and informational communication with an emphasis on contextual data.
you remain the hardest working tech tuber imo and the effort, care, and thoroughness you and your team put into GN content is amazing and appreciated. you are the standard for me when it comes to tech information and news.
Yup, love his videos. I wonder if he'd ever consider a website, similar to what LTT is trying to do, where GN can post lengthier charts with more components. Like I was saying on the previous video, I'd love to see older Intel CPUs included in these charts, because he has stuff like the 1700, 2600, 3600, etc., but no competing Intel CPUs from those eras, such as the almighty 8700K, and others like the 9700K, 10600K, etc.
Steve and everyone involved, I greatly appreciate your effort to not only show performance for whatever new things come into existence from our corporate overlords, but to really dig into these new products and find new/better ways to show how they compare to the competition. I know you won't stop, but NEVER STOP FIGHTING FOR THE CONSUMER! As a Raleigh native who is just as enthusiastic about new tech as you, nothing will ever compare.
Which is ironic, given that AMD has abandoned the lower-end CPUs for years now. The 3300X was the last one, and it had really low availability before disappearing pretty quickly. E2A: on second thought, I might be a bit harsh. They did have the G series, which was kinda down there, but only "kinda".
Talking about feedback, I'd love to see some sort of "emulator benchmark", as in taking a set of stable games on RPCS3 and Xenia and using them as a way to compare performance between CPUs. All the cores and threads in the newer CPUs really shine when it comes to emulation of 7th gen consoles.
I'd like to see that. I know there's an RPCS3 CPU tier list written up on Reddit. The i3 12100 is rated decently on it, and rated high if you manage to get AVX-512 enabled. I haven't found much for Xenia yet but it honestly runs even better for me for the 2 games I tested - when it works. Just that its compatibility and stability is not nearly at the stages of RPCS3.
@@Woodzta Xenia is less CPU-intensive because the Xbox 360's CPU is more straightforward: it's a 3-core PowerPC CPU with SMT, that's it. The main difficulty there is the L4 cache(!), stronger GPU, and unified RAM+VRAM. The PS3's CPU has 1 PowerPC core and 8 128-bit co-processors with their own cache and insane-for-the-time throughput. Look up the schematic on Wikipedia ("Cell broadband engine"); it's infamous for being very hard to develop for, let alone emulate.
I love how you listened to the comments and revisited this review. That is one of many reasons why this channel is AWESOME! Thanks for what you do Steve! 👊
I've had the 12100F for almost a year now, just upgraded to 1440p. Still does the business. Using G-sync at 165hz I'm perfectly happy... I'd have happily picked an AMD cpu but for the NZD$175 i paid there was simply no equivalent in terms of value/performance. It's great to see for those that are due an upgrade, there is another new budget option.
Good choice. I've done a lot of builds with the 12100F. If you get the Mortar Max, you can hit 5.1GHz very easily. Worthy upgrade if you get a deal on it 😁. The 12400F hits 5.3GHz with a 240mm AIO with ease too.
Re: removal of near-identical CPUs I agree with the space+time savings by removing one, but I think you should default to the cheaper part, to represent the choice you actually recommend. That way discussions will be clearer and easier to understand. If you're going to say "maybe consider the 5600", then show the 5600, not the X. Props for taking feedback and reuploading.
@@GamersNexus 🙃 of course, I should have expected as much. I still think going with the part you (at least have a better chance of) recommend(ing) is the way to go, though. May as well get some extra utility if there's gonna be haters either way
I personally think the 5600X is the better one to show, since the 5600 with PBO ties the 5600X with PBO. But I wouldn't mind too much either way since it's so close; imo showing the 5600X is better.
@@XX-_-XX420 why not just show the 5600 with pbo, then? Since that's what they're actually recommending people get and use, they should show that instead.
@@tarfeef_4268 That's also fair, but I think just running the 5600X probably does save time while showing about max performance. (PBO probably gains the 5600X like 5-10% in gaming or something, but that doesn't matter too much.)
I got my 5600 for $119, and after increasing the boost frequency by 200MHz it has identical performance to a 5600X! It's a powerful and inexpensive CPU that I've paired with my RTX 3080.
Yeah, but if you take a 5600X and do the same overclock, it's again going to be much faster than the 5500. The 5500 literally has half the L3 cache, 16MB versus 32MB on the 5600(X).
I got my 5600 at the same price point. Instead of adding the 200MHz boost, I fine-tuned curve optimizer per core. Cinebench increased several percent, but not as much as the 200MHz boost would have. The temps and power usage are much, much lower.
@@tilapiadave3234 But the performance of the 5800X3D outweighs the value of your little budget chip. Besides, you can get a 5800X3D for less than $300.
@@jamesm568 Yes, the WAY overpriced, EXTREMELY overrated 5800X3D can beat a 12100F. But is that an i5, yes a mere i5, I see way ahead of it in many of the charts? WOW. So embarrassing for the 5800X3D shills.
It's kinda important to consider the price of the platform in fps/$, it's not really a big problem for the 12100 which you can run on the lowest of low end LGA1700 boards, but especially with the likes of AM5, motherboard costs inflate the overall price more than just the CPU.
This was great! I never shop in budget class, but I still enjoy watching the charts...It's amazing how good a $100 CPU is these days! I appreciate all the hard work you and your team put into these videos, and I'm excited for more fan and PSU content also!
OMG I see you did the Blue = Intel; Orange = AMD thing for the FPS per dollar chart. Not sure if that was on purpose but I recommended you do this a few times and I'm SOOO happy to see it!! Wow, that looks so good :)
@@mtunayucer The companies themselves do. People do. Have for decades. That's why Intel, with only 8p+16e, so 32 threads total, calls itself the i9 13900K and competes with the R9 7950X which is guess what genius, 16 cores and 32 threads! But "wHo cOmPaReS tHrEaD cOuNtS?!" right? 🙄🤦 Quit while you're behind already dimwit. 👎
Thank you for the commentary on fps per dollar. A lot of reviewers have been using that metric lately but without the disclaimer about their questionable utility. (Plus a lot of them call it "dollars per frame" which is just factually an incorrect term for the metric since it removes the time component of fps).
In chart 1 the 4090 is paired with a 12100f, a massive cpu bottleneck, for highest fps/dollar. I think this aptly demonstrates Steve's point about the number not being the whole, or even significant portion of, the picture. And perhaps why he made sure to make that point so thoroughly up front.
I have exactly the same CPU + GPU combo. It's not even remotely as massive a bottleneck as you would think. There is no lack of graphical settings in games that can tame the 4090 down to performance that is reachable by this tiny beast.
Thank you for teaching me so much about computers. I have not built one yet but I think I could do it. Maybe I will with my next computer. In the mean time I will keep watching your videos and learning from you.
Speaking of being around here for a bit, I think the first video I watched of yours was the GTX 960 4GB vs 2 GB way back when it came out. To see not just the set change so much but the quality of content come so far is a testament to you and your team's hard work. Keep it up.
Have you considered price-vs-performance scatter plots (possibly with iso-price/perf lines)? With the entire value discussion being comparative, the difficulty of reading accurate values in scatter charts does not matter.
@@GamersNexus if you do add new charts I would suggest to explain how the chart works. Even if it seems simple, it’s always good to take 1-2 minutes explaining the charts and allowing the content to be more accessible to more people.
@@ArdgalAlkeides Obviously, as cards stop being sold, they leave the graph. Nobody would be interested in seeing how well an 8800 GT or an HD 4870 or a GTX 570 does nowadays. And as OP said, it's more about having a broad overview of everything available. That the plot would be obsolete a week later is also obvious, but the same can be said for any price/performance graph.
I feel like the distinction @8:23 is gray with the Intel stuff, as the 13100F and other "Raptor Lake" CPUs using the Golden Cove P-cores really are just remixed/rebadged Alder Lake CPUs. Odds are really good the rebadged Alder Lake 13xxx procs are going to be paired with an Intel B660-series board with DDR4 from last gen, which makes it a complete "last gen" solution in line with an AM4 processor and 5xx-chipset mainboard. I wonder how much this skews the relative uptake numbers for new-generation products from both Intel and AMD... AMD has more clearly distinct platform and processor divisions for AM4 vs AM5, while Intel is much more mix-and-match between Alder and Raptor Lake generation CPUs and chipsets/boards.
Great video as always. Perhaps surprisingly, it reminds me most of when the CLC mounting topic was raised: the second video with that awesome thumbnail *looks it up to confirm language* "AAAAAH! IT'LL EXPLODE ANY SECOND!" where you addressed a vocal part of the community as a PSA to say 'Bro, chill a bit', which was really good.
Nice job as always. This must have been a lot of extra work, but it served as a good reminder of your testing philosophies and methods. Well done, Steve and team.
Hey Steve, if you're having trouble incorporating all the relevant information into the Cost-per-Frame graphs then I'd recommend checking out a channel called "Hardware Unboxed", they have some really good graphs you could use as inspiration. They manage to include maximum performance per CPU and a cost breakdown of all cost of CPU/Motherboard/RAM per test, making it much easier for the viewer to check Cost-per-Frame for themselves later when the prices change. It's important to remember the point of a Cost-per-Frame graph is to illustrate value, while a maximum performance graph is better suited to showing maximum performance. 👌
Sorry Guys, the internet has spoken, every graph has to include every processor from the last 5 years. We look forward to your new once-yearly video upload schedule. 😂 But seriously, thanks as always to Gamers Nexus for all the hard work and dedication, you guys rock.
I mean, every once in a while that would be super cool to see, like first-gen i7 parts overclocked, all the way up to modern parts. Or like a 7GHz FX-8350 beating stuff like the 5500; that would be hilarious, and cool. (I think it was the 8350 @ 7GHz that was as fast as an R5 3600.)
Steve, great video (THANKS STEVE!) LOL. When you guys are reviewing really low-end CPUs in the future, maybe it would be useful or helpful to pair them with midrange or budget GPUs. I understand testing methodology and wanting to remove bottlenecks, hence using a top-tier GPU, but in the real world nobody is pairing a $1,500 GPU with a $150 CPU. What's really relevant for anyone considering buying a CPU this far down the stack is whether the bottleneck is the CPU or the GPU it will be paired with in real life. Testing with something like an RTX 3060 is highly relevant at this price point, and it would be really interesting to see if all of these CPUs are functionally equivalent with that big of a drop in GPU performance.
12:15 - That's why HUB includes value and fps in the same chart, where they sum CPU+RAM+mobo, and I think it works very well (except of course they use $ per fps rather than fps per $; same idea really, one can debate which is easier to comprehend). It allows one to see the value difference, but also where a more expensive option provides much greater performance. 15:05 - HUB's version of this kind of chart does not leave out which combination has the performance edge, and it really does help that various combinations of mobo and RAM are included too, to show for example DDR4 vs. DDR5 value/speed variance. Overall conclusion still correct I'd say, but of course price volatility (and how it varies between regions) can shove all this around like bumper cars. Strangely, where I am (UK), the 5600 is now largely unavailable, or the places that do have it are charging far more than makes sense vs. the Intel competition (for gaming, that is). By contrast, the 5500 is easy to find but is exactly the same price as the 12100F; quite unusual for UK pricing of this kind. What would be nice of course is a proper 7500 or 7500X, but AMD isn't going to do that, not for some time; they learned from the 3300X that too strong a value part just negates most of the good-margin SKUs. The Zen 4 stack will only receive entry parts when Intel once again kicks AMD in the shins. AMD waited far too long to release the 5600 and meanwhile released a mess of other older-tech parts with irritating caveats & compromises (PCIe, cache, etc.); it wouldn't surprise me if they do the same thing again and leave competitive entry Zen 4 so late that by the time it comes out Intel has already moved on. Also interesting from the value perspective: the 10100/F is still available to buy and it's 20% cheaper than the 12100F. Sure it's slower, but with realistic GPUs and display tech in mind this likely doesn't much matter; most people in this price class are still fine with 60 to 90Hz displays.
Btw, RandomGamingInHD recently uploaded a video showing the 13100 with the Arc A750 at 1440p, some interesting numbers.
The R5 5500 overclocks to 4.8GHz very easily. I have mine at 4.422GHz just because I'm using a stock cooler and I want the temps under 80 degrees. Intel chips don't overclock as well. I got a free game I wanted with my 4.4GHz 5500 too, so all in all I'm happy with my new UFO RIG. I love you guys, but you're wrong to hate the R5 5500... it's a beast in hiding!
I feel bad for the people who dropped an opinion on the pulled video. It's the single issue I find with releasing an improved video. I wish there were an archive of previous versions, so their contributions don't get lost. Nothing to add. Another superb review, as always.
Something that I think would also benefit in budget CPUs reviews like this is a chart or two where the CPUs are paired with one or two budget class / mid tier GPUs as well. Maybe something like the 6600 as kind of a reality check against what an actual budget build with the CPU would perform like. The point being that gaming performance between these CPUs may well flatten out with a lower end GPU which could change the value equation. If a CPU could technically run a game 30% faster, but only when paired w/ a 4090, then you'd be paying a premium for a performance difference you're never going to see. Just a thought.
That is not the purpose of these videos. The whole premise is to remove the GPU as a variable and push the CPU to its absolute limits. There are far, far too many GPUs available to do testing like this; it's 10 hours for a single addition of one CPU. Adding an arbitrary "realistic budget GPU" to the stack means doubling the testing time on the gaming side. You might just have to use the excellent and extensive data GN has provided and draw your own conclusions.
@@jeffjungers2034 I know full well how benchmarking works and why GN does what they do; however, this is the first time they've done a "value" metric in terms of FPS per dollar because it's important to a significant portion of the target market of these CPU (someone building a budget gaming rig). It's that audience that I had in mind when making the suggestion (and IIRC, GN did ask for feedback about this addition). I disagree with your understanding of "the purpose of these videos". GN's really the only one w/ authority to speak on that, but my take is they want to give their audience enough information to make informed purchasing decisions. IMO an important aspect of budget builds, probably more so than higher end rigs, is balancing out component choices. If processors in this range are functionally equivalent when paired with a mid/low tier GPU, that's useful information as the price difference between CPUs and their platforms can be put towards a different component that will impact the user's experience. Put another way, one data point that's largely missing for potential buyers of these CPUs is what class of GPU will be bottlenecked by them. I'd argue that's far more important than how fast they can push a 4090.
i think that you guys are showing what a class act you are here. you have strong opinions, but you do keep yourselves grounded. the first video i saw of gn, i was very critical about in the comments (and i may have been drinking), but since i have become a big fan of the channel, and really appreciate your content. keep up the awesome work!
Well, to be honest, GN has not been this humble forever. I have given some criticism too in times gone by, for a reason I think. But they've improved all the time.
Ryzen 5 5500 has a very specific use case, cheaply upgrading older boards that only have PCI-E 3.0 anyway. It was useful to give my son's circa 2018 Gigabyte AB350 and Ryzen 3 2200G a cheap boost. Gigabyte's been very good about keeping the microcode updated for new BIOS/UEFI releases.
You do realise that usually GPU PCIE lanes are provided by the CPU, not the motherboard? So a PCIE 4.0 GPU with 5600 on a PCIE 3.0 board will still run in 4.0 mode
Basically a competitor to the 3600, or for people who got a cheap 1600AF back in the day and want a decent upgrade for a decent price. Would be interesting to see how the 5500 compares to the 3600. Maybe throw a 5600 on an older board into the mix as well, to see how much one really saves.
Nah, I think a used R5 3600 makes much more sense than an R5 5500. Generally speaking, CPUs last much longer than motherboards: if your B350 died, you could upgrade straight to B550, but if you are stuck with the PCIe 3.0 of the R5 5500, you can't fully make use of a newer B550 motherboard that supports PCIe 4.0. The only good thing about the R5 5500 is the monolithic die (because it originates from an APU), so you can overclock the Infinity Fabric much higher compared to chiplet-based AMD CPUs. Those APUs also have a better memory controller than the chiplet-based CPUs.
FPS/$ is much better suited for graphics cards. For a CPU you have to include motherboard prices and RAM because it doesn't do anything without these components. Choosing good options for these parts is impossible considering the diverse audience, but even the cheapest options will noticeably change the value for these budget CPUs.
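A quick sketch of that argument with entirely invented numbers (the fps figures, CPU names, and prices below are hypothetical, not from GN's charts): a fixed platform cost dilutes the CPU price gap, so the fps-per-dollar spread between two budget chips shrinks dramatically once board and RAM are counted.

```python
# How including platform cost (board + RAM) changes fps-per-dollar.
# Every fps figure and price here is made up for illustration.
builds = {
    "budget chip A": {"fps": 144.0, "cpu": 100.0, "board": 110.0, "ram": 45.0},
    "budget chip B": {"fps": 160.0, "cpu": 140.0, "board": 100.0, "ram": 45.0},
}

def fps_per_dollar(b, include_platform=False):
    """fps divided by either CPU price alone or the whole platform price."""
    cost = b["cpu"]
    if include_platform:
        cost += b["board"] + b["ram"]
    return b["fps"] / cost

for name, b in builds.items():
    print(name,
          f"CPU-only: {fps_per_dollar(b):.3f} fps/$,",
          f"with platform: {fps_per_dollar(b, include_platform=True):.3f} fps/$")
```

With these made-up numbers the cheaper chip still "wins" either way, but the gap between the two nearly vanishes once the platform is included, which is exactly why the choice of comparison basis changes the story the chart tells.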
Good vid! I just wanna say, I think the primary concern (at least for me) around the whole 5600x/5600 thing is that the 5600 is almost always better value (cheaper for nearly the same performance), so it makes sense to use the better value chip as the baseline instead of the worse value chip. And just for an example why, I was helping a friend upgrade from a 3600 to a 5600 and I sent him one of the gamersnexus charts and he was confused about the 5600x being there and asked me questions about it, are they the same, should he spend more for the x, and I had to walk him through it. It's minor issue but I'd prefer the better value chips being on the chart if we get more these x and non-x chip situations. Like, I'd rather the 7600 be on these charts over the 7600x, for example.
One of the options is to just add the non-X counterpart to the chart, even if it wasn't actually tested, shown with 2-3% less performance. Of course, that only works as long as the X version really is better by only 2-3%.
Reminds me of the difference between 5700X and 5800X, the latter clocks higher but also has a higher power target. How fast would a 5800X be under 95W and how fast would a 5700X be under 125W....
amazing vid. I was considering the 5500 because of sites comparing the price to benchmark scores, so the price seemed very, very good. But I don't do editing and stuff, and now I know what I should go for instead, ty
What's the point of comparing games that already run at more than 120fps? Compare the games that struggle to push 60fps on high settings; not everyone has a 144Hz monitor, but each and every guy has a 60Hz monitor. Test these CPUs on games like Cyberpunk, Valhalla, and Dying Light 2. Those games may be poorly optimised or more CPU demanding, but if you can run them at high settings at 60+ fps, your build is ready for any title.
Steve, you should not have pulled the first video; there is more than enough information out there for viewers of your channel to form an informed opinion for themselves. This channel is one of the best, if not the best, for honest, no-shill hardware reviews, and I can't see a good reason you had to waste resources making this video when they could have been used for something else. Thanks for the excellent content.
@@jeremyleemartens801 You should consider what has caused your stupidity. You're pathetic if you cannot see that spending $15 more INITIALLY, in order to have at least one extra GPU upgrade free from bottlenecking, is a fantastic investment. The 10100 WAS, PAST TENSE, a good value option; it NO LONGER is.
I ABSOLUTELY look for and buy based on FPS per dollar. You're right, that's not the only thing that matters and can mask other factors, but give your audience some credit. I believe anyone who has their own personal "minimum" metric, or needs production performance also, is smart enough to also watch that part of the review. You can also make a two-level chart like you do for avg, 1% and .1% fps, and show frames per dollar along with avg. I use HW unboxed charts like this a lot.
Yeah, I don't know why they made such a big deal of this "problem" when it's been solved perfectly by Hardware Unboxed. The only real criticism against performance/$ charts (in my opinion) is that costs can change. But even then, when you display the raw performance, users can easily update for current costs by just dividing.
I think fps-per-dollar charts are stupid, as prices change constantly, they greatly depend on the rest of the system, resolution, etc., and they don't show performance at all. CPU fps per dollar is more dependent on the GPU than the CPU in most cases; a 13900K might get exactly the same fps with an RX 6500 XT as a Ryzen 5500 in CP2077. And if you add the whole system cost, two systems with a 100% price and fps difference will have exactly the same fps/$. Like I said, those charts are too subjective and too easy to manipulate to add any real value to the review.
Remember that it also depends on the games you want to play with it. For example I play none of the games in the test and don't live in the US, so the numbers in the FPS/$ chart are completely meaningless to me. Neither the FPS part nor the $ part. But what I know is that in my most demanding game the 5800X3D beats the 12900K by about 40%
@@HappyBeezerStudios So I'm sure you go around saying that all FPS charts are worthless as well because you don't play those games, right? And you do realize that the relative % differences are literally just calculated based off the FPS numbers?
@@azethegreat958 All I'm saying is that the chart displays one situation that is limited in time, location and software used. A rough first start, but to know what is the optimal choice for a person, they have to do the math themselves.
Mad props for pulling and retesting. Much appreciated. Agree to disagree on the 20 to 30 diff on 5600. But at least there's charts to back everything up. So kudos and thank you.
No stress, just remember to include good coverage of CPUs that are available in and around the same price bracket; the low end is as important as the high end, so it's correct to include CPUs in that price range. It's a shocker you didn't include the 4500, which could be picked up for £65, lol. Also, don't always take the new-build approach; remember the number of AM4 boards out there. The same will happen with AM5 going forward.
I've always appreciated the data-first approach of this channel. If you stop listening actively and just zone out, you really start to understand how the more basic consumer has trouble understanding the details when choosing PC components, CPUs especially: "something-hundred-F" vs. "something-hundred-X", different numbers, different suffixes, different values, different comparisons. Thanks to the team again for great content, and thanks to Steve for having the patience for the narration.
Hey, your transparency and community engagement is great! Just some feedback from a semi-regular viewer: - FPS/dollar in the CPU section isn't important to me personally; I really appreciate it in GPU reviews though. I still appreciate you taking feedback from the community. - The disadvantage of not being able to show absolute performance and FPS/dollar simultaneously was solved by other reviewers by showing both on the same graph.
I feel like this video is even more relevant today with everything going on in the YT tech reviewer space. THIS is how you handle criticism. Objective responses and explanations for why things were done or are done, also with acceptance of criticism and no blame being placed.
20:15 I know it's a slippery slope, but if we're really concerned about cost, then we must also consider motherboard pricing. You'll easily make up any value lost on the extra ~$20 for the AMD CPU with a B450 motherboard. There are new Asus Prime B450 boards for $80 on Amazon right now. I doubt you'll find a new B660/760 mobo for less than $115 without a sale or promotion.
You can get a B660 for your 12100F on Amazon for as cheap as $94, so you're really not saving a whole lot; beyond that, there are several others for $109 or less. For people looking to game on a budget, the 12100F is the best value.
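As a rough sketch of the platform-cost math in this thread, using the board prices quoted in the two comments above ($80 B450 Prime, $94 B660) and assumed CPU street prices, not figures from the video:

```python
# Budget builds buy CPU + board together, so compare the sum, not the
# CPU price alone. All prices are snapshots and will drift.
def platform_cost(cpu_usd: float, board_usd: float) -> float:
    """Total cost of the CPU + motherboard pair."""
    return cpu_usd + board_usd

amd_total   = platform_cost(cpu_usd=140.0, board_usd=80.0)  # R5 5600 + B450 (assumed)
intel_total = platform_cost(cpu_usd=100.0, board_usd=94.0)  # i3-12100F + B660 (assumed)
print(amd_total - intel_total)  # the CPU price gap shifts once boards are counted
```

Under these assumed numbers, the AMD platform ends up $26 dearer rather than the ~$40 the CPU prices alone would suggest, which is the point both commenters are circling.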
Thank you guys for all that you do! Great comparison and information. Y'all are a class above when it comes to reliable, unbiased, data-driven information, and I truly appreciate that you are here to provide it. Ignore the jerks/fanboys; it is a testament to the state of our society that you have to make disclaimers when common sense should suffice. Cheers.
Removing CPUs from the chart: - 100% agree, saves time, looks better, zero complaints. Removing the 5600, keeping the 5600X: - Don't quite understand - In a conversation about value, it would make sense to show the better value choice of two similar-performing products. Industry-leading work in terms of dedication to clarity though, love everything you do.
The second look and extra work you put into this review was greatly appreciated. This is one of many reasons that sets GN far apart from other tech channels, in my opinion.
@14:00 maybe keep frames per dollar, but rank CPUs on the chart by frame rate. Example: 4th one down is a Cyrix at 60fps and scores 9 f/$; 5th place is a Pi 3 at 30fps with a score of 7 f/$; 6th is a Pentium 4 at 25fps with a score of 8 f/$.
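The ordering suggested above can be sketched like this, using the joke entries from the comment as placeholder data:

```python
# Sort chart rows by average FPS (descending) while still printing the
# FPS-per-dollar score alongside, so a cheap slow part can't float to
# the top of the chart on value alone.
rows = [
    ("Pentium 4", 25, 8.0),  # (name, avg_fps, fps_per_dollar)
    ("Cyrix",     60, 9.0),
    ("Pi 3",      30, 7.0),
]
by_fps = sorted(rows, key=lambda r: r[1], reverse=True)
for name, fps, value in by_fps:
    print(f"{name:10s} {fps:3d} FPS   {value:.1f} f/$")
```

Note how the Pi 3 outranks the Pentium 4 on FPS despite the worse f/$ score; that mismatch is exactly the information a single-metric sort would hide.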
Great suggestion! (Everything below is my own comment, LOL.) I would also put the (assumed) price used for the FPS/$ in the legend; that way it's clear :) I would also suggest adding a subjective part at the end, where GN could say things like "under $xxx it's a steal/good value/etc." or "over $xxx it would be too much, and we would recommend ...". Maybe it could be done by putting the "inflection price" in the FPS/$ chart; by "inflection price" I mean the price where the opinion changes. I understand that GN wants to be as objective as possible, but IMO such input from reviewers who see many products is a data point in itself. Anyway, I'm trying to open the discussion :) Feel free to comment on and respond to my ideas :) Love your hard work, and looking forward to your fan testing :P
I am glad you pointed out that the i3-12100F also goes on discount. Some big tech tubers only point out that the Ryzen 5600 goes on sale. It's so easy to be fair, but some just choose not to be. I am glad you are not shaping your narrative to favor any brand.
You did so much work to fix a problem you considered worth fixing... respect. I'd love to see a value battle royale, but with CPUs that have integrated graphics, using the CPUs announced at CES (whenever you get them).
Thanks Steve, I think it is good to do these round-up comparisons each generation: one for absolute budget build CPU (like this video), one for best value mainstream CPU, and one for the absolute best CPU. Thanks for all the work you guys do - now take a break for a week before the Ryzen 7000 X3D review samples come in!
I really wish the review had been done with reasonable GPUs for the segment; on a budget gaming build, no one is going to have an RTX 3090 Ti. I understand it's to show which CPU is technically better, but in reality, people building budget systems will see the (in some cases large) increase in performance here, while with a GPU that would actually be paired with those CPUs, the difference is going to be almost nothing.
Thanks for the updated version. Any chance DaVinci Resolve will show up on the charts soon?
It has a massive market share
Add a few quick game tests with an RX 5600 XT or RX 6600 as a real-world check, since those are the budget GPUs that actually get paired with these.
Just got done going through comments and scrubbing a bunch of the spambots... remember that they're getting tricky! They copy/paste comments from actual users, somehow insta bot-upvote themselves, and then hope you'll click on the profile picture so they can phish you. The profiles are all of a very obvious type. Downvote or report them if you feel like contributing, but we'll keep nuking them for now!
Thanks Steve, now back to you Steve.
I generally look through the comments and report them as spam; they are ridiculously obvious and it doesn't take long to do. I have to wonder if it achieves anything with YouTube, though...
How can it phish us by just clicking on a YT profile?
@@rickysargulesh1053 They always have links in their profile that are either obviously malicious or are a redirect from a legit-looking site to a malicious site in an attempt to get your personal information. Not 100% sure how it works, but pretty sure it goes something like that.
Thank you for being one of the very few YouTubers I watch who actually helps get rid of these.
Man, I wish the fight for budget GPUs (remember when those existed?) was as interesting as it is for CPUs nowadays.
100% agreed! GPUs need to get a similar shakeup to CPUs!
RTX 4050: 5% more performance than the 3050 but for 20% more in cost.
@@GamersNexus (pokes Intel) Come on, do something!
Yeah. The last good value GPU, the 5700XT, launched in 2019. Since then it's been nothing but turd after turd.
Edit:
due to the number of replies I have gotten, I wish to clarify things. I am talking about what AMD/nvidia have been trying to sell their cards for. Excluding companies trying to sell off the last of their stock before new cards come in, or fake MSRPs, everything after the 5700XT has been crap. The 5700XT was good on day one and remained good until crypto went nuts long after launch. Cards after the 5700XT have been worse value or in one or two best case scenarios were able to at least match the 5700XT. But matching your old generation isn't an improvement, it's pointless and a waste of time.
So yes, you can find a better card than the 5700XT as the last of the 6600s are trying to be sold off before they're replaced by the 7600 or 7500, but it's the blip at the end of the product's life and not what it was for about a year after launch.
What is a budget GPU?
Class act to take community feedback, add more CPUs to the chart for comparison, and redo the video with new testing. I can't imagine the work that required. This is why you guys are awesome.
Thanks, uh, "pooplord." It means a lot that a Lord of your demeanor says something so kind!
The pumper truck challenges your claim of lord.
ALL HAIL POOPLORD
@@BigDonkMongo Hear hear 🍻
@@GamersNexus Feels like the internet of old
A large YouTube channel taking suggestions from the comment section and incorporating them into an updated video? Preposterous!
Seriously, this is one of the reasons I love you guys. The content here isn't backed up by pride or bias. It's cold, hard facts.
yeah we should always listen to amd fangirls
We love these videos but aren't you supposed to be taking a break Steve? *uses concerned parent voice* Come on man take a rest you've been killing it and we want you around for many years.
Just one more video. I can stop anytime I want! I swear!
@@GamersNexus meanwhile I'm watching the video at 00:30
I woke at 04:00
@@GamersNexus please take the break! Maybe it's the overwork, but the cringy Eminem "Forgot about Cezanne" joke was a definite departure from the absolutely fantastic joke writing of the last few months of videos.
@@GamersNexus YOU BETTER❤
@@GamersNexus no, stop it. You and the staff need rest. I want full energy for 7000x3d chips
One major problem with price per FPS is that prices are always changing.
True. The price of the 5700x went down about a month after I got it lol
It would be ideal to see the price they are using for the FPS per dollar comparison.
Yeah, this is why Hardware Unboxed has to do a monthly price/performance investigation and video...
@@ArtisChronicles - Hate when that happens... But then, when you buy something, it better be worth the monetary investment to you, so that you can't complain. - Still, if you might be on a tighter budget (as I often am), it must sting. So I really have to jump on like sales or price-drops, which I did with the 5600 and the promotional "Uncharted 4" key. - That's a good way of throwing in an incentive and adding some value, besides that the price was good for the CPU alone anyway.
It still provides valuable context: you know that if things got cheaper, then so did your cost per frame. It's not rocket science.
I agree with the fps/$ reservations because there are just so many ways people buy CPUs:
- drop in upgrades
- platform upgrades
- new builds
For all of these, and especially for a huge amount of varied price tiers in the latter two, the actual cost and fps increase by changing CPUs is hugely different, and it's difficult to show this in one or even 2 or 3 charts
Great input & agreed on these points! We'll keep tuning/tweaking it! Thanks for commenting. Good thoughts.
Definitely good points, especially since it is - effectively - theoretical. What GPU you get & what resolution you game at can have a MASSIVE impact on your personal, real-life FPS per dollar rating. When, like in this video, you're looking at budget options, you aren't going to see real-life FPS increase per dollar on a matching lower end GPU compared to something like the 3090Ti they use in their chart.
You can get all kinds of differing bottlenecks between different hardware combinations & resolutions. For example, I upgraded from a 6700k to a 5600x the same time I upgraded my GPU to an RTX 3060Ti. The 5600X would have a meaningful FPS impact on that GPU when gaming at 1080p to help justify its value, whereas at 1440p and up the bottleneck shifts pretty heavily to GPU so would be comparably poorer. And even RAM rank, speed, etc can impact the outcome. Feels like FPS/$ charts are simply too broad & theoretical to have much applicable value.
Exactly. The graph basically compares how these CPUs today (as in on this very one day) on the US market do as a drop-in upgrade for people who play those 7 games.
If you need to upgrade the whole platform, watch the video on the weekend, don't live in the US or play any other game, the graph is rather worthless for price comparisons.
I bought an i7-12700K for 330, it's a 2021 batch with AVX512. The MSI-Z690 Pro A DDR5 was cheap too, 150 bucks. Then i added two 1TB M.2 NVMe PCIe Gen 4 SSDs.
It's a music production and console emulation PC, no graphics card needed. It's air cooled (Deepcool Assassin 3, and Corsair 5000D Airfow). I bought an overkill power supply (wasn't cheap).
Waiting for a good GPU. 3080Ti is a good card but it uses way too much power and has a lot of noise, Radeon is even worse.
All of the 3D cache nonsense and overheating CPU problem can be solved if both AMD and Intel simply added quad channel memory support and focused on increasing IPC.
My alternative was an overpriced gaming laptop with overheating chips, less RAM and storage, slow processors. At least a 3060 mobile is an amazing 1080p beast at a low wattage, and the i5-13600K is a no no.
I almost chose AM5 because I read somewhere that it would have quad-channel memory support; then I remembered that AMD wants to sell gimmicky 3D cache chips, so that never happened.
Quad-channel memory support is so overlooked. It's crazy what a performance upgrade it was for Intel HEDT in games.
With DDR5, imagine the absurd performance increase vs. dual-channel DDR4.
I feel sorry that you're faced with an impossible task. There seems to be a lot of viewers who can't/are unwilling to look at a graph and analyze the results but simply look for the highest number. Your disclaimers about percentage charts were great, but many won't see the difference. Still, keep doing what you're doing because it's vital info for thousands.
You mean I have to READ and THINK!!!
Get out of here!!!
Thanks man. 🙏 Amen for that statement. I don't get why people are no longer able to use their own freaking CPU, aka brain, to understand reviews.
Feels like most of those guys are still on a budget cpu from mid 2000‘s
@@aqulex84
They need to overclock with adderall
ANNDDD... THERE'S THE RE-ROLL
There's a deep well of jokes somewhere
For budget/value CPU reviews, I feel like the included cooler should be part of the assessment as well: for example, does it even come with a cooler, and does that cooler let it run as intended? I know the coolers are generally the same ones they've had for years, but they are part of what lets budget builders save a bit, especially as the cooler is a higher percentage of the build cost at the low end, as you said. Also, I know your charts are done with an AIO, which is sometimes unattainable for budget builders, but I understand it makes the chart more consistent. If there were an i3-12100/13100 vs. 5500/5600 air-cooled chart, that would make a good additional data point. Thanks for the review.
Gamers Nexus killing it with the hard work yet again
Glad to see you guys back and with new charts
Respect for the integrity to pull the review and include the 5500.
Honestly you've continually pushed your quality higher and higher, and I hope we all agree on that. Thank you for your dedication to clear and informational communication with an emphasis on contextual data.
Thank you Gamers Nexus for the revised video! Excellent work, great job.
you remain the hardest working tech tuber imo and the effort, care, and thoroughness you and your team put into GN content is amazing and appreciated. you are the standard for me when it comes to tech information and news.
Thank you for your kind words!
@@GamersNexus No Steve, thank you to you and the team for providing unbiased information that is both useful and relevant.
Yup, love his videos. I wonder if he'd ever consider a website, similar to what LTT is trying to do, where GN could post lengthier charts with more components. Like I was saying on the previous video, I'd love to see older Intel CPUs included in these charts.
They have stuff like the 1700, 2600, 3600, etc., but no competing Intel CPUs from those eras, such as the almighty 8700K, or the 9700K, 10600K, etc.
That 'Forgot About Dre' music at 5:10 when you explained you forgot about the 5500 was a nice touch.
All in all another quality GN vid. Keep it up!
Steve and everyone involved,
I greatly appreciate your effort to not only show performance for whatever new things come into existence from our corporate overlords, but to really dig into these new products and find new and better ways to show how they compare to the competition. I know you won't, but NEVER STOP FIGHTING FOR THE CONSUMER! As a Raleigh native who is just as enthusiastic about new tech as you, nothing will ever compare.
Thank you for the support! Keep that tech enthusiasm up! There's a lot to learn at all levels.
The "How Things Work Here" was the best bit! Keep up the AMAZING work 😉
The CPU market has never been this exciting since Ryzen was introduced. Great video as usual GN!
Now if only this could happen to the GPU market ...
@@shadow7037932 IF intel stays in the GPU market, and IF they can leech some nvidia engineers, they'll be amazing in 5-10 years
Yes, after AMD finally caught up. How many DECADES did that take?
Which is ironic, given that AMD has abandoned the lower-end CPUs for years now. The 3300X was the last one, and that had really low availability before disappearing pretty quickly.
E2A: I might be a bit harsh, on second thought. They did have the G series, which was kinda down there, but only "kinda".
@@ChrispyNut The G series is still just rebranded mobile processors. 5600G is a rebranded 5600H, 5700G is a rebranded 5800H
I came for the unbiased review and in depth testing results, I stayed for the unexpected Dr Dre reference.
I had to mute the video for a second to be sure the music was indeed coming from it XDD
@@Blafard666 hahahaha
watching this again, commenting for the algo and upvoting because you guys deserve it. The amount of work you put into this is insane
Talking about feedback, I'd love to see some sort of "emulator benchmark", as in taking a set of stable games on RPCS3 and Xenia and using them as a way to compare performance between CPUs. All the cores and threads in the newer CPUs really shine when it comes to emulation of 7th gen consoles.
I'd like to see that. I know there's an RPCS3 CPU tier list written up on Reddit. The i3 12100 is rated decently on it, and rated high if you manage to get AVX-512 enabled. I haven't found much for Xenia yet but it honestly runs even better for me for the 2 games I tested - when it works. Just that its compatibility and stability is not nearly at the stages of RPCS3.
They could add the Dolphin 5.0 benchmark; AnandTech has included it for years. Having several emulators would be pretty nice, though.
@@Woodzta Xenia is less CPU-intensive because the Xbox 360's CPU is more straightforward.
It's a 3-core PowerPC CPU with SMT. That's it. The main difficulties are the L4 cache(!), stronger GPU, and unified RAM+VRAM.
The PS3's CPU has 1 PowerPC core and 8 128-bit co-processors, each with its own cache and insane-for-the-time throughput.
Look up the schematic on Wikipedia ("Cell Broadband Engine"); it's infamous for being very hard to develop for, let alone emulate.
I love how you listened to the comments and revisited this review. That is one of many reasons why this channel is AWESOME! Thanks for what you do Steve! 👊
Properly belly-laughing at the sudden "Forgot About" homage, well played GN.
Your knowledge, detailed testing, and integrity to the game makes your videos my favorite on YT.
OMG you guys even addressed my "percentage is tricky without a baseline" comment! THANK YOU!
I've had the 12100F for almost a year now, just upgraded to 1440p. Still does the business. Using G-sync at 165hz I'm perfectly happy... I'd have happily picked an AMD cpu but for the NZD$175 i paid there was simply no equivalent in terms of value/performance. It's great to see for those that are due an upgrade, there is another new budget option.
Good choice. I've done a lot of builds with the 12100F. If you get the Mortar Max, you can hit 5.1GHz very easily; worthy upgrade if you get a deal on it 😁. The 12400F hits 5.3GHz with a 240mm AIO with ease too.
@@PDXCustomPCS the asrock b660m pg riptide is cheaper and can do the job too..
@@kubotite9168 No it can't. I've had all of them.
@@kubotite9168 It doesn't have a clock generator. FYI.
GN and Dre, didn't see that one coming.
Great, now I've got 'so you forgot about dre' going round and round in my head.
Great job, Steve!
All that extra work to get the same answer: that is what a good scientific method is about! Great job!
Re: removal of near-identical CPUs
I agree with the space+time savings by removing one, but I think you should default to the cheaper part, to represent the choice you actually recommend. That way discussions will be clearer and easier to understand. If you're going to say "maybe consider the 5600", then show the 5600, not the X.
Props for taking feedback and reuploading.
We normally do, but then people get mad that we aren't showing the 'best' part. There's no winning.
@@GamersNexus 🙃 of course, I should have expected as much. I still think going with the part you (at least have a better chance of) recommend(ing) is the way to go, though.
May as well get some extra utility if there's gonna be haters either way
I personally think the 5600X is better to show, since the 5600 with PBO ties the 5600X with PBO. I wouldn't mind too much either way since it's so close, but IMO showing the 5600X is better.
@@XX-_-XX420 why not just show the 5600 with pbo, then? Since that's what they're actually recommending people get and use, they should show that instead.
@@tarfeef_4268 That's also fair, but I think just running the 5600X probably saves time while showing roughly max performance. (PBO probably gains the 5600X like 5-10% in gaming or something, but that doesn't matter too much.)
Loved the Dre reference! Thanks so much for adding the 12100F, I really wanted to see this!
The 12100F was already there! haha
@@GamersNexus That's the joke :)
I got my 5600 for $119 and after increasing the boost frequency by 200mhz it has identical performance to a 5600X! It’s a powerful and inexpensive cpu I’ve paired with my RTX 3080
Yeah, but if you take a 5600X and do the same overclock, it is again going to be much faster than the 5500, which literally has half the L3 cache: 16MB versus 32MB on the 5600(X).
I got my 5600 at the same price point. Instead of adding the 200MHz boost, I fine-tuned Curve Optimizer per core. Cinebench increased several percent, not as much as the 200MHz boost would, but the temps and power usage are much, much lower.
The R5 5600 (non-X) is an EXCELLENT choice at the $120-$125 mark, FAR better value than the OVERpriced 5800X3D.
@@tilapiadave3234 But the performance of the 5800X3D supersedes the value of your little budget chip. Besides, you can get a 5800X3D for less than $300.
@@jamesm568 Yes, the WAY over-priced, EXTREMELY over-rated 5800X3D can beat a 12100F. But is that an i5, YES, a mere i5, I see way ahead of it in many of the charts? WOW, so embarrassing for the 5800X3D shills.
It's refreshing to see a candid explanation of what happened in the other video and take community comments into serious consideration.
It's kinda important to consider the price of the platform in FPS/$. It's not really a big problem for the 12100, which you can run on the lowest of low-end LGA1700 boards, but especially with the likes of AM5, motherboard costs inflate the overall price well beyond just the CPU.
Stop explaining, thorough and clear video! ;-)
This was great! I never shop in budget class, but I still enjoy watching the charts...It's amazing how good a $100 CPU is these days! I appreciate all the hard work you and your team put into these videos, and I'm excited for more fan and PSU content also!
OMG I see you did the Blue = Intel; Orange = AMD thing for the FPS per dollar chart. Not sure if that was on purpose but I recommended you do this a few times and I'm SOOO happy to see it!! Wow, that looks so good :)
I really hope they make a significant update to the i3 this year with core counts like 4P cores + 2 E cores or 4+4 cores.
4p+8e would be legit ryzen 5 killer. and lets leave the quad core stuff to pentiums and celerons.
@@mtunayucer Except that's 16 threads which makes it a R7 competitor and the R7 will slap it silly. 4p+4e is 12 thread. 👍
@@MafiaboysWorld bruhhhhhh who compares cpus by thread count?? You compare cpus by PRICE. It doesnt make R7 competitor. Thats the whole point!
@@mtunayucer The companies themselves do. People do. Have for decades. That's why Intel, with only 8p+16e, so 32 threads total, calls itself the i9 13900K and competes with the R9 7950X which is guess what genius, 16 cores and 32 threads!
But "wHo cOmPaReS tHrEaD cOuNtS?!" right? 🙄🤦 Quit while you're behind already dimwit. 👎
Hoping for 4p+8e as well
Maybe for the i3 1x300?
Thank you for the commentary on fps per dollar. A lot of reviewers have been using that metric lately but without the disclaimer about their questionable utility. (Plus a lot of them call it "dollars per frame" which is just factually an incorrect term for the metric since it removes the time component of fps).
In chart 1 the 4090 is paired with a 12100f, a massive cpu bottleneck, for highest fps/dollar. I think this aptly demonstrates Steve's point about the number not being the whole, or even significant portion of, the picture. And perhaps why he made sure to make that point so thoroughly up front.
I have exactly the same CPU + GPU combo. It's not even remotely the massive bottleneck you would think. There is no lack of graphical settings in games that can tame a 4090 down to performance reachable by this tiny beast.
The whole idea of bottlenecks is so overrated it's hilarious at this point.
@@filippetrovic845 I like the way you think. Also at 4k gaming the combo is probably about perfect.
Thank you for teaching me so much about computers. I have not built one yet but I think I could do it. Maybe I will with my next computer. In the mean time I will keep watching your videos and learning from you.
Meanwhile at NVIDIA:
what does budget mean
It means increasing the available budget for leather jackets, to make "all of the leather jackets" viable.
A $499 4060, which is technically under $500 lmao
Really impressed with the Dre line. Looking forward to Steve's mixtape.
TWO Steve’s in the thumbnail????? You spoil us GN
I lost it when I heard the Dre reference, keep up the great videos GN!
Love watching your content, man; you're on another level. We promise we will keep supporting this amazing effort. Keep it up 🌹
Speaking of being around here for a bit, I think the first video I watched of yours was the GTX 960 4GB vs 2 GB way back when it came out. To see not just the set change so much but the quality of content come so far is a testament to you and your team's hard work. Keep it up.
Have you considered price-vs-performance scatter plots (possibly with iso-price/perf lines)? With the entire value discussion being comparative, the difficulty of reading accurate values in scatter charts does not matter.
We can try it. Sounds maybe difficult to understand, but I'll play around with it!
@@GamersNexus if you do add new charts I would suggest to explain how the chart works. Even if it seems simple, it’s always good to take 1-2 minutes explaining the charts and allowing the content to be more accessible to more people.
I've seen a couple scatter plots for that kind of data. Really useful as a visual tool.
@@ArdgalAlkeides Obviously as cards stop being sold, they leave the graph. Nobody would be interested to see how well a 8800 GT or a HD 4870 or a GTX 570 are nowadays.
And as OP said, it's more about having a broad overview about everything available.
That the plot would be obsolete a week later is also obvious, but the same can be said for any price/performance graph.
I feel like the distinction @8:23 is gray with the Intel stuff, as the 13100F and other "Raptor Lake" CPUs using Golden Cove P-cores really are just remixed/rebadged Alder Lake CPUs. Odds are really good the rebadged Alder Lake 13xxx parts are going to be paired with an Intel B660-series board with DDR4 from last gen, which makes the whole thing a "last-gen" solution in line with an AM4 processor and 5xx-chipset board.
I wonder how much this skews the relative uptake numbers for new-generation products from both Intel and AMD; AMD has clearly distinct platform and processor divisions for AM4 vs. AM5, while Intel is much more mix-and-match between Alder and Raptor Lake CPUs and chipsets/boards.
GamersNexus again setting the bar even higher for tech industry reviews! Thank you for your dedication.
Great video, as always. Perhaps surprisingly, it reminds me most of when the CLC mounting topic was raised: the second video with that awesome thumbnail (*looks it up to confirm wording*) "AAAAAH! IT'LL EXPLODE ANY SECOND!", where you addressed a vocal part of the community as a PSA to say "Bro, chill a bit", which was really good.
Nice job as always. This must have been a lot of extra work, but it served as a good reminder of your testing philosophies and methods. Well done, Steve and team.
Hey Steve, if you're having trouble incorporating all the relevant information into the Cost-per-Frame graphs then I'd recommend checking out a channel called "Hardware Unboxed", they have some really good graphs you could use as inspiration.
They manage to include maximum performance per CPU and a cost breakdown of all cost of CPU/Motherboard/RAM per test, making it much easier for the viewer to check Cost-per-Frame for themselves later when the prices change.
It's important to remember the point of a Cost-per-Frame graph is to illustrate value, while a maximum performance graph is better suited to showing maximum performance. 👌
Sorry Guys, the internet has spoken, every graph has to include every processor from the last 5 years. We look forward to your new once-yearly video upload schedule. 😂 But seriously, thanks as always to Gamers Nexus for all the hard work and dedication, you guys rock.
That's a scary amount of work. A monstrous undertaking...
5 years? We have to go back at least to the Q6600 to include all the possibly relevant computers of everyone interested in getting a new CPU :D
I mean, every once in a while that would be super cool to see, like first-gen i7 parts overclocked, all the way up to modern parts.
Or like a 7GHz FX-8350 beating stuff like the 5500, that would be hilarious, and cool. (Or I think it was the 8350 @ 7GHz that was as fast as an R5 3600.)
Steve, great video (THANKS STEVE!) LOL.
When you guys are reviewing really low-end CPUs in the future, maybe it would be useful or helpful to pair them with midrange or budget GPUs. I understand the testing methodology and wanting to remove bottlenecks, hence using a top-tier GPU, but in the real world nobody is pairing a $1,500 GPU with a $150 CPU. What's really relevant for anyone considering a CPU this far down the stack is whether the bottleneck is the CPU or the GPU it will be paired with in real life. Testing with something like an RTX 3060 is highly relevant at this price point, and it would be really interesting to see if all of these CPUs are functionally equivalent with that big of a drop in GPU performance.
12:15 - That's why HUB includes value and fps in the same chart, where they sum CPU+RAM+mbd, and I think it works very well (except of course they use $ per fps rather than fps per $; same idea really, one can debate which is easier to comprehend). It allows one to see the value difference, but also see where a more expensive option provides much greater performance.
15:05 - HUB's version of this kind of chart does not leave out which combination has the performance edge, and it really does help that various combinations of mbd and RAM are included too, to show for example DDR4 vs. DDR5 value/speed variance.
Overall conclusion still correct, I'd say, but of course price volatility (and how it varies between regions) can shove all this around like bumper cars. Strangely, where I am (UK), the 5600 is now largely unavailable, or the places that do have it are charging far more than makes sense vs. the Intel competition (for gaming, that is). By contrast, the 5500 is easy to find but is exactly the same price as the 12100F, quite unusual for UK pricing of this kind. What would be nice, of course, is a proper 7500 or 7500X, but AMD isn't going to do that, not for some time; they learned from the 3300X that too strong a value part just negates most of the good-margin SKUs.
The Zen4 stack will only receive entry parts when Intel once again kicks AMD in the shins. AMD waited far too long to release the 5600 and meanwhile released a mess of other older-tech parts with irritating caveats & compromises (PCIe, cache, etc.); wouldn't surprise me if they do the same thing again, leave competitive entry Zen4 so late that by the time it comes out Intel has already moved on.
Also interesting from the value perspective, the 10100/F is still available to buy and it's 20% cheaper than the 12100F; sure it's slower, but with realistic GPUs and display tech in mind this likely doesn't much matter, most people in this price class are still fine with 60 to 90Hz displays.
Btw, RandomGamingInHD recently uploaded a video showing the 13100 with the Arc A750 at 1440p, some interesting numbers.
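As a hedged sketch of the combined-cost idea described above (summing CPU + motherboard + RAM before dividing, then showing both $-per-FPS and FPS-per-$): all prices and FPS figures below are made-up placeholders for illustration, not numbers from the video or from Hardware Unboxed.

```python
# Illustrative sketch only: prices and average FPS are hypothetical
# placeholders. Swap in current local prices to update the value metric.
platforms = {
    "i3-12100F": {"cpu": 110, "mobo": 95, "ram": 55, "avg_fps": 144},
    "R5 5600":   {"cpu": 140, "mobo": 90, "ram": 55, "avg_fps": 152},
}

for name, p in platforms.items():
    total = p["cpu"] + p["mobo"] + p["ram"]      # full platform cost
    fps_per_dollar = p["avg_fps"] / total        # higher is better
    dollars_per_frame = total / p["avg_fps"]     # lower is better
    print(f"{name}: ${total} platform, {fps_per_dollar:.3f} FPS/$, "
          f"${dollars_per_frame:.2f}/frame")
```

Either ratio carries the same information; which reads more intuitively on a chart is the debatable part.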
Charts!! Thanks Steve. Really thanks, looks great!
The R5 5500 overclocks to 4.8GHz very easily. I have mine at 4.422GHz just because I'm using a stock cooler and I want the temps under 80 degrees. Intel chips don't overclock as well, and I got my 4.4GHz 5500 with a free game I wanted too, so all in all I'm happy with my new UFO RIG. I love you guys, but you're wrong to hate the R5 5500... it's a beast in hiding!
Please do a chart on overclocked Intel vs. AMD budget CPUs; that would be great to see.
I feel bad for the people who left a comment on the pulled video. That's the single issue I find with releasing an improved video. I wish there were an archive of previous versions, so their contributions don't get lost.
Nothing to add. Another superb review, as always.
Something that I think would also benefit budget CPU reviews like this is a chart or two where the CPUs are paired with one or two budget-class/mid-tier GPUs as well. Maybe something like the 6600, as kind of a reality check against how an actual budget build with the CPU would perform. The point being that gaming performance between these CPUs may well flatten out with a lower-end GPU, which could change the value equation. If a CPU could technically run a game 30% faster, but only when paired with a 4090, then you'd be paying a premium for a performance difference you're never going to see. Just a thought.
That is not the purpose of these videos. The whole premise is to remove the GPU as a variable and push the CPU to its absolute limits. There are far, far too many GPUs available to do testing like this; it's 10 hours for the addition of a single CPU. Adding an arbitrary "realistic budget GPU" to the stack means doubling the testing time on the gaming side. You might just have to use the excellent and extensive data GN has provided and draw your own conclusions.
@@jeffjungers2034 I know full well how benchmarking works and why GN does what they do; however, this is the first time they've done a "value" metric in terms of FPS per dollar, because it's important to a significant portion of the target market of these CPUs (someone building a budget gaming rig). It's that audience I had in mind when making the suggestion (and IIRC, GN did ask for feedback about this addition).
I disagree with your understanding of "the purpose of these videos". GN's really the only one w/ authority to speak on that, but my take is they want to give their audience enough information to make informed purchasing decisions.
IMO an important aspect of budget builds, probably more so than higher end rigs, is balancing out component choices. If processors in this range are functionally equivalent when paired with a mid/low tier GPU, that's useful information as the price difference between CPUs and their platforms can be put towards a different component that will impact the user's experience. Put another way, one data point that's largely missing for potential buyers of these CPUs is what class of GPU will be bottlenecked by them. I'd argue that's far more important than how fast they can push a 4090.
i think that you guys are showing what a class act you are here. you have strong opinions, but you do keep yourselves grounded. the first video i saw of gn, i was very critical about in the comments (and i may have been drinking), but since i have become a big fan of the channel, and really appreciate your content. keep up the awesome work!
Well, to be honest, GN has not been this humble forever. I have given some criticism too in times gone by, for a reason, I think. But they've improved all the time.
Ryzen 5 5500 has a very specific use case, cheaply upgrading older boards that only have PCI-E 3.0 anyway. It was useful to give my son's circa 2018 Gigabyte AB350 and Ryzen 3 2200G a cheap boost. Gigabyte's been very good about keeping the microcode updated for new BIOS/UEFI releases.
You do realise that the GPU's PCIe lanes are usually provided by the CPU, not the motherboard? So a PCIe 4.0 GPU with a 5600 on a PCIe 3.0 board will still run in 4.0 mode.
Basically a competitor to the 3600. Or for people who got a cheap 1600AF back in the day and want a decent upgrade for a decent price.
Would be interesting to see how the 5500 compares to the 3600. Maybe throw a 5600 on an older board into the mix as well to see how much one really saves.
Nah... I think a used R5 3600 makes much more sense than the R5 5500. Generally speaking, CPUs last much longer than motherboards; if your B350 died, you could upgrade straight to B550, but if you are stuck with the PCIe 3.0 of the R5 5500, you can't fully make use of a newer B550 motherboard that supports PCIe 4.0. The only good thing about the R5 5500 is the monolithic die, because it originates from an APU, so you can overclock the Infinity Fabric much higher compared to chiplet-based AMD CPUs. Those APUs have a better memory controller than chiplet-based CPUs.
Thank you for continuing to improve your processes. 😃
FPS/$ is much better suited for graphics cards. For a CPU you have to include motherboard prices and RAM because it doesn't do anything without these components. Choosing good options for these parts is impossible considering the diverse audience, but even the cheapest options will noticeably change the value for these budget CPUs.
Never did I think that I'd be watching Tech Jesus making a Forgot about Dre reference
Good vid! I just wanna say, I think the primary concern (at least for me) around the whole 5600X/5600 thing is that the 5600 is almost always better value (cheaper for nearly the same performance), so it makes sense to use the better-value chip as the baseline instead of the worse-value chip. Just as an example of why: I was helping a friend upgrade from a 3600 to a 5600 and sent him one of the Gamers Nexus charts, and he was confused about the 5600X being there and asked me questions about it. Are they the same? Should he spend more for the X? I had to walk him through it. It's a minor issue, but I'd prefer the better-value chips being on the chart if we get more of these X and non-X chip situations. Like, I'd rather the 7600 be on these charts than the 7600X, for example.
One of the options is to just add the non-X counterpart, even if it wasn't actually tested, listed with 2-3% less performance. Of course, only as long as the X version is better by just 2-3%.
Reminds me of the difference between 5700X and 5800X, the latter clocks higher but also has a higher power target. How fast would a 5800X be under 95W and how fast would a 5700X be under 125W....
Amazing vid. I was considering the 5500 because of sites that compare price to benchmark scores, so the price seemed very, very good, but I don't do editing and stuff. Now I know what I should go after, ty.
What's the point of comparing games that are already running at more than 120 FPS? Compare the games that struggle to push 60 FPS on high settings; not everyone has a 144Hz monitor, but every guy has a 60Hz one. Test these CPUs on games like Cyberpunk, Valhalla, and Dying Light 2. These games may not be well optimized, or may be more CPU-demanding, but if you can run them at high settings at 60+ FPS, they can keep your build ready for any title.
Steve, you should not have pulled the first video; there is more than enough information out there for the viewers of your channel to make an informed opinion for themselves. This channel is one of the best, if not the best, for honest, no-shill reviews of hardware, and I can't see a good reason you unfortunately had to spend resources making this video when they could have been used for something else. Thanks for the excellent content.
Thanks for that. The response was strong enough at the outset that we decided it was worth doing just to be sure.
Thanks Steve
thanks papa
Been waiting three weeks for this. Going to order my parts now!
Tbh the 10100f is also insane value. I think you save 50-100 bucks depending on your config but lose barely any performance
10100f gets STOMPED on by the 12100 etc.
@@tilapiadave3234 yes, if you buy a 3080. not with a low end card like a 3050 or 60
@@jeremyleemartens801 LAUGHING ,,, I always plan at least one ,, often 2 GPU upgrades ahead.
@@tilapiadave3234 well, then you shouldn't consider budget CPUs anyways
@@jeremyleemartens801 You should consider what has caused your stupidity. You're pathetic if you cannot see that spending $15 more initially in order to get at least one extra GPU upgrade free from bottlenecking is a fantastic investment. The 10100 WAS, PAST TENSE, a good value option; it NO LONGER is.
Wow thanks for listening to us. This is why i stay subscribed to you.
I ABSOLUTELY look for and buy based on FPS per dollar. You're right, that's not the only thing that matters and can mask other factors, but give your audience some credit. I believe anyone who has their own personal "minimum" metric, or needs production performance also, is smart enough to also watch that part of the review. You can also make a two-level chart like you do for avg, 1% and .1% fps, and show frames per dollar along with avg. I use HW unboxed charts like this a lot.
Yeah, I don't know why they made such a big deal of this "problem" when it's been solved perfectly by Hardware Unboxed. The only real criticism against performance/$ charts (in my opinion) is that costs can change. But even then, when you display the raw performance, users can easily update for current costs by just dividing.
I think fps-per-dollar charts are stupid, as prices change constantly, they greatly depend on the rest of the system, resolution, etc., and they don't show performance at all. CPU fps per dollar is more dependent on the GPU than the CPU in most cases; a 13900K might get exactly the same fps with an RX 6500 XT as a Ryzen 5500 in CP2077. If you add the whole system cost, two systems with a 100% price and fps difference will have exactly the same fps/$. Like I said, those charts are too subjective and too easy to manipulate to add any real value to the review.
Remember that it also depends on the games you want to play with it. For example I play none of the games in the test and don't live in the US, so the numbers in the FPS/$ chart are completely meaningless to me. Neither the FPS part nor the $ part. But what I know is that in my most demanding game the 5800X3D beats the 12900K by about 40%
@@HappyBeezerStudios So I'm sure you go around saying that all FPS charts are worthless as well because you don't play those games, right? And you do realize that the relative % differences are literally just calculated based off the FPS numbers?
@@azethegreat958 All I'm saying is that the chart displays one situation that is limited in time, location and software used.
It's a rough first start, but to know the optimal choice for themselves, people have to do the math on their own.
thanks steve
5:08 no snowmobiles, no ski’s, can finally afford to feed my family with groceries.
The human eye can't even see the first comment.
Mad props for pulling and retesting. Much appreciated. Agree to disagree on the 20 to 30 diff on 5600. But at least there's charts to back everything up. So kudos and thank you.
No stress, just remember to include good coverage of the CPUs that are available in and around the same price bracket. Low end is as important as high end; it's correct to include CPUs in that price range. It's a shocker you didn't include the 4500, which could be picked up for £65. lol. Also, don't always take the new-build approach; remember the number of AM4 boards out there. The same will happen with AM5 going forward.
big no to the 4500, sorry. We already said what we think about that in the original review. We're not going back for it.
@@GamersNexus 'twas a joke, hence the lol, Steve!
I like the Dr.Dre - forgot about Dre tune insertion with the rap from Steve
hahaha
I've always appreciated the data-first approach of this channel. If you stop listening actively and just zone out, you really start to understand how much trouble the more basic consumer has with the details when choosing PC components, CPUs especially: "something-hundred-F" vs. "something-hundred-X" again, different numbers, different suffixes, different values, different comparisons. Thanks to the team again for great content, and thanks to Steve for having the patience for the narration.
Hey, your transparency and community engagement is great!
Just some feedback from a semi-regular viewer:
- FPS/dollar in the CPU section isn't important to me personally; I really appreciate it in GPU reviews though. I still appreciate you taking feedback from the community.
- The disadvantage of not being able to show absolute performance and fps/dollar simultaneously was solved by other reviewers by showing both on the same graph
I feel like this video is even more relevant today with everything going on in the YT tech reviewer space. THIS is how you handle criticism. Objective responses and explanations for why things were done or are done, also with acceptance of criticism and no blame being placed.
20:15 I know it's a slippery slope, but if we're really concerned about cost, then we must also consider motherboard pricing. You'll easily make up any value lost on the extra ~$20 for the AMD CPU with a B450 motherboard. There are new Asus Prime B450 boards for $80 on Amazon right now. I doubt you'll find a new B660/760 mobo for less than $115 without a sale or promotion.
You can get a B660 board for your 12100F on Amazon for as cheap as $94, so you're really not saving a whole lot; beyond that, there are several others for $109 or less. For people looking to game on a budget, the 12100F is the best value.
Thank you guys for all that you do! Great comparison and information. Y'all are a class above when it comes to reliable, unbiased, data-driven information. I truly appreciate that you are here to provide that. Ignore the jerks/fanboys. It's a testament to the state of our society that you have to make disclaimers about things when common sense should suffice. Cheers.
Removing CPUs from the chart:
- 100% agree, saves time, looks better, zero complaints.
Removing the 5600, keeping the 5600X:
- Don't quite understand
- In a conversation about value, it would make sense to show the better value choice of two similar-performing products.
Industry-leading work in terms of dedication to clarity though, love everything you do.
The second look and extra work you put into this review is greatly appreciated. This is one of many reasons that set GN far apart from other tech channels, in my opinion.
@14:00 Maybe keep frames per dollar, but rank CPUs on the chart by frame rate...
Example: 4th one down is a Cyrix at 60 FPS and scores 9 f/$; 5th place is a Pi 3 at 30 FPS with a score of 7 f/$; 6th is a Pentium 4 at 25 FPS with a score of 8 f/$.
Great suggestion! (From here on it's my comment, LOL)
I would also put the (assumed) price used for the FPS/$ in the legend, that way it's clear :)
I would also suggest adding a subjective part at the end, where GN could say: "under $xxx it's a steal/good value/etc.", "over $xxx it would be too much, and we would recommend...". Maybe it could be done by putting the "inflection price" in the FPS/$ chart; by "inflection price" I mean the price where the opinion changes.
I understand that GN wants to be as objective as possible, but IMO such input from reviewers who see so many products is also a data point in itself.
Anyway, I'm trying to open the discussion :) Feel free to comment and respond to my ideas :)
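The chart layout suggested a few comments up (rank rows by raw FPS, but keep the frames-per-dollar score visible next to each entry) can be sketched roughly like this, reusing the commenter's own hypothetical Cyrix/Pi 3/Pentium 4 numbers:

```python
# Sketch of the suggested ordering: sort by average FPS (descending),
# but annotate each row with its frames-per-dollar score.
# Entries are the hypothetical examples from the comment above.
rows = [
    ("Cyrix",     60, 9),   # (name, avg FPS, FPS per dollar)
    ("Pi 3",      30, 7),
    ("Pentium 4", 25, 8),
]

ranked = sorted(rows, key=lambda r: r[1], reverse=True)
for place, (name, fps, fpd) in enumerate(ranked, start=1):
    print(f"{place}. {name}: {fps} FPS ({fpd} f/$)")
```

Note how the Pentium 4 ranks below the Pi 3 on FPS despite its better f/$ score, which is exactly the kind of nuance this layout preserves.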
Love your hard work and looking forward to your fan testing :P
I am glad you pointed out that the i3-12100F also goes on discount sale. Some big tech YouTubers only point out that the Ryzen 5600 goes on sale. It's so easy to be fair, but some just choose not to be. I am glad you are not shaping your narrative to favor any brand.
you did so much work to fix a problem you considered worth fixing... respect.
I'd love to see a value battle royale, but with CPUs that have integrated graphics, using the CPUs released at CES (whenever you get them).
5:07 "...and mothu-truckas act like they forgot about Steve!"
Awesome work. Inclusion of a test done on budget/mainstream CPU&GPU combo would be epic.
Thx for adding more CPUs in. I am a fan of that. And yes, I know it makes the chart busier, but I am OK with that.
Thanks Steve, I think it is good to do these round-up comparisons each generation: one for absolute budget build CPU (like this video), one for best value mainstream CPU, and one for the absolute best CPU. Thanks for all the work you guys do - now take a break for a week before the Ryzen 7000 X3D review samples come in!
Back at it again bringing the most useful review:) Thanks Steve
I really wish the review had been done with reasonable GPUs for the segment; on a budget gaming build, no one is going to have an RTX 3090 Ti. I understand it's done to show which CPU is technically better, but in reality it may lead people building budget systems to expect the in-some-cases large increase in performance, when paired with a GPU that would actually be matched with those CPUs the difference is going to be almost nothing.
Great job. I didn't see the first one, but I was pretty confident that the end result wouldn't be different. :)