DDR4 CL14 running 14-14-14-14-34. Cut the time to engage the connection and run it on quad channel. People want faster and faster RAM speeds, but AMD offered us QUAD channel on the TR three years ago. EPYC and Threadripper Pro run 8-channel RAM.
@@MJ-uk6lu At the Threadripper launch, no YouTube channel talked about quad channel doubling RAM transfer speeds. Again at the TR Pro launch, no channel talked about 8-channel RAM being 4x the speed of dual channel. I had to go to the Asus and AMD websites and then look up the tech terms they used. For that matter, no one talked about AMD's Gen4 CPU lanes vs Intel's Gen3 lanes. Faster dual channel may help, but it kills the low latency. WRX80 is the new 'WINNER'.
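The channel-count arithmetic behind this comment can be sketched quickly. This is a rough model of theoretical peak bandwidth only (real sustained numbers are lower), and `peak_bandwidth_gbs` is just an illustrative helper, not from any benchmark tool:

```python
# Back-of-the-envelope DDR4 bandwidth: MT/s x 8 bytes per 64-bit channel.
# These are theoretical peaks, not what benchmarks actually sustain.

def peak_bandwidth_gbs(mt_per_s: int, channels: int, bus_bytes: int = 8) -> float:
    """Theoretical peak in GB/s for a given transfer rate and channel count."""
    return mt_per_s * bus_bytes * channels / 1000

for channels, platform in [(2, "mainstream desktop"),
                           (4, "Threadripper / X99"),
                           (8, "EPYC / TR Pro")]:
    # e.g. 2 channels -> 51.2 GB/s, 4 -> 102.4 GB/s, 8 -> 204.8 GB/s
    print(f"DDR4-3200 x{channels} ({platform}): "
          f"{peak_bandwidth_gbs(3200, channels):.1f} GB/s")
```

By this arithmetic, quad channel at DDR4-3200 already matches what dual channel would need DDR4-6400 to reach, which is one reason servers can get away with slower, more stable DIMMs.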
I've been a professional tester of CPUs, motherboards, memory and GPUs for 20 years, working for a leading PC magazine, beginning in the early '90s. I was always told by vendors that the new generation of RAM would be the big game changer. It never was. There have been slight improvements between generations (DRAM to SDRAM to DDR to DDR2 and so on), but never in a way that would let us recommend our readers upgrade because of memory speed. The differences within the same generation of RAM (e.g. DDR4) are even more insignificant. All this memory hype reminds me of HiFi enthusiasts who spend a fortune on voodoo cables. It only improves your system if you firmly believe in it.
there is some truth to the quality of cables for HiFi: pure copper over aluminum-clad copper or plain cheap aluminum wire, and thicker, lower-gauge wire over thinner (12 gauge vs 14 gauge)
I'm just getting back into building computers since the switch from AGP to PCIe (just been getting stock laptops through work...bleh...) and really did not have the time to parse out 12+ years of technological progress. Your videos have been an amazing way to catch up on the times and figure out what is worth spending money on these days. Thank you!
Having your RAM in dual rank nets a much bigger difference in games on my 5600X, as covered by so many videos. I got 20+ more average FPS in the Shadow of the Tomb Raider benchmark going from a single-rank kit to a dual-rank one (tested with an RTX 3070, 1080p highest preset + TAA)
Just running 4 sticks of single rank gives the same results as 2 sticks of dual rank. I tested 4 single-rank 8GB DDR4-3200 CL16 sticks (32GB) vs a 2x16GB dual-rank DDR4-3200 CL16 kit, and all memory and game tests were the same. ZERO difference.
11:52 About your final thoughts on server hardware being okay with "slower RAM": don't forget servers are usually quad channel (or more), which largely offsets the lower clocks. The consumer desktop space, being only dual channel, has to rely on faster clocks / lower latencies. Server builds prioritise slower, more stable, lower-power configurations.
I had heard that AMD 5000 chips relied heavily on RAM speed for optimal performance, but I guess that was only applicable to the older generations, when upgrading from speeds below 3000 MHz. Thank you for this video; it cleared things up for me. I suppose I mostly wasted my money on these 3600 MHz CL16 RAM sticks, but it doesn't hurt to have them. Now I'll probably just run them at 3200 MHz instead of 3600 for the sake of stability when I finish my build.
About the stability bit, it's true. I can't run my "expensive" XLR8 3600 MHz kit at 3600; Windows kept bugging out and I had to drop it to 3200 MHz. Don't know if this overpriced RAM will ever stabilize.
@@luckydepressedguy8981 Even though 3600 MHz is the usual recommendation for AMD, the practical sweet spot is 3200 MHz. I didn't bother wasting the extra 100 bucks on the 3600 MHz version of my RAM sticks.
This was an excellent comparison that will help save money. I especially liked hearing that buying more RAM is the more effective way to improve performance. I'll use this information on my next build. Thanks
Nice. Gonna upgrade from a 2600X to a 5600X sometime next year and was thinking about a RAM upgrade from my 16GB 3200/CL16 kit to a 16GB 3600 kit to maybe push the new Ryzen. Well, seeing these "dramatic" differences, I'm better off saving my money and keeping the existing RAM, which works like a charm anyway. :)
This guy's got no idea. Higher-speed RAM kits can be clocked down with tightened timings. I bought 4 sticks of 8GB DDR4-4000, then ran them at CL14-14-14-14-28 at 3600 or CL16-16-16-16-32 at 3800, depending on the games I want to play or work on. Running on a 5950X, I got around a 15-20% boost in most games.
I'd love a timing-tuning video for older systems like DDR3. Often on locked-down or OEM boards it's the only thing you can tune, as even BCLK is locked (looking at you, Dell and HP). And maybe RAM speed/timings' effect on APUs, to cater to the budget and international crowd. :)
Good advice. Just ordered a 2x8 DDR4-3600 CL16 kit to replace the 2400 sticks in my rig. The cost finally came down enough that I thought I'd try it; 3600 wasn't in the budget when I built it. Would have liked to see some comparisons against 2400 MHz RAM, though.
This shows there's no need to overclock my RAM for gaming: higher voltage, barely any difference in performance. Instead I reduced the clocks on my stock 3200 MHz kit without changing the voltage ^^ Thanks for this video, it helped me pick a good approach ♥
Wish I'd seen this 2 weeks ago. I had 32GB (2x16) of 3600 CL18 in my PC. Decided I'd need at least 64 for my video work after seeing the 32 fill up pretty quickly on small projects. Figured I may as well go all out and bought 64GB (2x32) of 3600 CL16 instead. Zero difference between CL18 and CL16, aside from the price.
Great stuff! I have 64GB of 3600 MHz RAM and I've been running it at 2133 for ages as a 3D artist. Thought I'd try running it at 3200 and 3600 and saw no difference in my workflow, except that sometimes I'd BSoD during meetings or while working 👌 Back to 2133 for me
Yeah, got myself a 64GB 3200 MHz RAM kit for productivity. I'll probably check whether DOCP on/off makes any difference for my kind of workloads, but in any case it's pretty much academic: my PC is already anywhere from 2x-5x faster (depending on the task) than my colleagues', so I doubt adding maybe 5% is going to make a difference even if it does materialise. Productivity is not that different from gaming in that respect: if it's not a whole tier faster, it doesn't matter. The question isn't whether something runs for 2 or 3 seconds; the question is usually whether it runs for 30-150 seconds or long enough for me to get up and get a coffee in the meantime.
RAM speed, primary, secondary, and tertiary timings don't amount to a hill of beans for most users in most applications. Almost all the time, something else is affecting performance to the point where RAM speed is inconsequential. Using mildly slower RAM will go unnoticed in real-world use but will have a noticeable effect on your real-world wallet.
I have 16 GB of 3200 MHz CL16 RAM and am debating whether I should upgrade to 32GB (2x16) of 3600 MHz CL16. I know my motherboard is dual channel, so I'd only want 2 sticks. I just upgraded from a 3600X to a 5700X and have an X570 motherboard. I'm a multi-tasker: sometimes training in RuneScape (3) and playing COD at the same time, while also having Opera GX and Discord open and voice chatting in there. I like to make sure I'm getting good use of my money before upgrading, and it seems like a good idea to me, but I'd love your thoughts! The same kit you have in the video is $96 right now on Amazon, which is pretty tempting! Great video as well, as always, and happy new year!
The CPU's caching algorithms, branch prediction, speculation and so on are what produce this kind of result. A CPU core is largely unaware of the RAM's existence. All it needs is for a given data block to be available in its local cache memory (L1 data and instruction caches); fetching from there takes around ~1 nanosecond. If a core can't find data in its L1 cache, it waits for the L2 cache to send the data over to L1, and the total time to arrive and be accessed is around ~5 nanoseconds. If that data block isn't in L2 either, the core has to wait for it to come from L3 to L2 to L1 before accessing it (around 10-15 nanoseconds). And if the block still isn't found even in L3... oh boy, this is the longest wait of all, because the data now has to be fetched from what might as well be thousands of miles away: the RAM, which takes about 40-80 nanoseconds. This is when all that "RAM latency" stuff starts to matter. Fun fact: for most modern CPUs the "cache hit rate" (the rate at which a core finds the data block it needs in its L1 cache) is said to be near ~95%. That means for 95% of your application's running time, a core finds all the data it needs in its local cache (L1), and only 5% of the time does it have to wait for data to arrive from the other memories (L2, L3, L4 and RAM). Now consider a theoretical CPU that always finds data in its cache, i.e. a 100% hit rate: RAM latency wouldn't matter at all anymore. But reality isn't like that.
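The hit-rate arithmetic in this comment can be turned into a toy average-memory-access-time (AMAT) model. The latencies and hit rates below are the comment's rough illustrative numbers, not measurements from any real CPU:

```python
# Toy AMAT model using the latency tiers and ~95% L1 hit rate from the
# comment above. The L2/L3 hit rates are made-up illustrative values.

def amat_ns(l1=1.0, l2=5.0, l3=14.0, ram=70.0,
            l1_hit=0.95, l2_hit=0.80, l3_hit=0.80):
    """Expected per-access time in ns for a simple 3-level cache hierarchy."""
    miss1, miss2, miss3 = 1 - l1_hit, 1 - l2_hit, 1 - l3_hit
    # Each deeper level is only paid for on a miss at the level above it.
    return l1 + miss1 * (l2 + miss2 * (l3 + miss3 * ram))

fast = amat_ns(ram=60.0)   # e.g. a tuned low-latency kit
slow = amat_ns(ram=80.0)   # e.g. loose JEDEC timings
print(f"AMAT with fast RAM: {fast:.2f} ns, with slow RAM: {slow:.2f} ns")
# -> ~1.51 ns vs ~1.55 ns with these toy numbers
```

With a ~95% L1 hit rate, even a 20 ns swing in RAM latency moves the expected access time by only a few hundredths of a nanosecond, which is one way to see why RAM tuning often barely registers in benchmarks.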
I think the reason you don't see AMD Ryzen CPUs suffer real-world performance loss, despite having the highest memory latency, is not that their architecture is especially efficient in its caching algorithms; it's that these CPUs have a much larger cache than any Intel CPU, which probably keeps the cache hit rate higher most of the time. Intel, on the other hand, doesn't have the advantage of a large cache (for now), so it has to rely on superior caching algorithms and branch prediction. That's why Intel stays competitive despite having less cache. But in the end, as data blocks keep getting bigger (future, even heavier AAA games), superior caching mechanisms alone won't be enough; more cache memory will eventually be needed too, and that's where AMD has the better chances in the coming competition. At least that's what my speculation says.
Why do people pay more for faster RAM? Because it makes a (small) difference in a properly working system. Server farms are an entirely different matter; the analogy is inadequate.
Beautiful test bench. One of my projects is to turn an old case into an open air test bench. I'll send a link to my Twitter feed when I can show something.
So what I don't understand is: why do many tech channels or people online say that for Ryzen you should get 3600 instead of 3200? I always see comparisons where the difference is like 2% or so, but the price difference is 10-20%.
Here are the differences between your regular PC gamer and the RAM cultists/gurus/junkies. PC gamer: buys almost the cheapest RAM he can get, because it doesn't improve his FPS enough to warrant the more expensive kits. RAM guru: spends more money on his RAM than his CPU, hoping to get that one golden-bin Samsung B-die that can do CL15-15-15 at 4400 MHz. He would skip sex if he could shave just one more nanosecond off his RAM latency. He'll spend countless hours, weeks and months living inside his BIOS, tweaking every setting there is and leaving no stone unturned to find that one setting that shaves another 0.5 ns off. He remembers all his RTL/IOL numbers by heart, and his most-used words are "please, please POST", begging and praying to his motherboard after punching in a new set of RTL/IOL values. RAM isn't just about FPS, and a bargain-bin kit, even at a really high speed, has timings too loose to tell the difference. The super kits like Samsung B-die aren't just about speed but about tight, very low latency: something like CL16-16-16 at 4400 MHz, super-tuned to 35 ns latency or lower. I promise you, it will blow your mind!
I am not shocked. I was expecting maybe a bit more difference, but even the most convincing videos on RAM speed showed fairly minimal differences after jumping through quite a number of hoops with the most favourable benchmarks and tests. That being said, I just got 32GB of 3600 CL16 for my next build (was running 16GB of 3200 CL16), mainly because it was only about 10 or 15 USD more than the 3200 RAM. Maybe it'll be utilized better in the future? Dunno, but I got it anyway.
I'm really curious whether I bought bad RAM for my RTX 3080 / Ryzen 5900X. I've got 3200 MHz CL18 4x8GB; I didn't have a lot of options with the vendor. The other config was 3200 MHz CL16 2x16GB. I really tried to go for a balanced build, but this is honestly bothering me. Should I sell the memory and go for something else, like 4x8GB 3600 MHz CL16 instead? Will I notice any difference? Maybe just go straight for 64GB?
The server comparison doesn't make sense: servers are all about efficiency, while gamers often just care about raw speed, which doesn't matter in servers, where heat matters much more.
Yeah, I've done this before, but people need to see it again. I may do a follow-up using Zen 2 on a B550 just because. Or not... maybe Intel? It doesn't really matter, however.
Any chance you'd be willing to do this test again, only with a 5900X? Not so much for the memory speed itself; I'm curious whether there'd be any greater gains for a multi-chiplet CPU, with the Infinity Fabric also going up in speed.
I did a little research myself before my second-to-last build and found the same thing. Even with some newer RAM you sacrifice performance. Glad I knew this; a lot of people don't. I chose 32 GB of Trident Z 3200 and it works great for everything I do without a skip!!! Great video!!!
It always depends on the game. In my case, higher RAM speed brings me higher minimum FPS in World of Warcraft, and in raids that's very important. In terms of price there isn't that big a difference. If you then make the effort to adjust the timings and sub-timings, you can get good performance out of it, as long as you're within the CPU limit.
Really glad you made this video! This is what we need to shut up those who think they know and really don't. I'm one of those who's surprised, because someone is always posting that you need FASTER RAM, yet personally I never saw why. So I'm really glad some testing has been done to show that more RAM speed is, for the most part, hype and mental comfort.
I recently upgraded my old rig to a Ryzen 7 5800X and picked up a G.Skill Ripjaws dual-channel 16GB 3600 MHz CL16 kit, which was only $17 more than the 3200 MHz version, thinking the 3600 would perform better, plus an MSI B550 MPG Gaming Carbon WiFi board and a WD SN850 M.2 (7000 MB/s read). And for now, an XFX RX 580 8GB.
Why did you choose to compare two relatively fast memory modules? Wouldn't you see a more realistic difference comparing some slow, cheap RAM to 3600 MHz?
So I have a B450I board and a Ryzen 2700. I got DDR4-3600, the BIOS says it's running at 3600, and the system seems stable... Is there a possibility that it's really not running at that speed?
Thank you for this; I thought I was about to be told I'd bought the wrong RAM. I'm only interested in running stock, no overclocking. I kept seeing people say that Ryzen likes faster RAM, but figured that since the motherboard and CPU both say 3200, that's what I should buy. I think I'd also read that faster RAM would be throttled, but I wasn't sure. Glad to see it wouldn't have made much of a difference either way.
Could you test a 4x8GB (4-rank) kit vs a 4x16GB (8-rank) kit at 3600 MHz and CL16? It would be interesting to know whether, for any weird reason, running an 8-rank configuration hurts or maybe even improves FPS numbers, if only by a little.
Essentially, you might gain 2-5 FPS on a good day going from 3200 to 3600 xD. But on AMD, if you want the memory to run 1:1 with the Infinity Fabric, 3600 MHz will do that, while anything faster gives no further gains and anything slower might actually hurt you ever so slightly. With Intel, you can just get the slower RAM regardless lol
Would the difference be apparent on Ryzen chips with more than one chiplet? I heard the bandwidth limit was with the infinity fabric, and that fast ram would help alleviate this. The CPU tested has a single chiplet, and therefore doesn't rely on infinity fabric like the multi chiplet processors. Right?
Totally unrelated, but can you test some old CPUs vs last-gen CPUs? For example an i5 2500K/8GB/old GPU vs an i5 10600K or Ryzen 5 5600X/16GB/newer GPU. Of course the new ones will be N times faster, but many people keep their PCs for years, and for general tasks like office work, movie watching, browsing, photo editing in Lightroom/Photoshop, even some old games, they still work. With that in mind, how would an upgrade translate into a new PC? Times exporting photos, general smoothness, browsing, and so on. It could be helpful and interesting to see how everything has evolved over 8-10 years, compared side by side.
Silly question, but am I able to use two DDR4-4000 sticks alongside two DDR4-3200 sticks? I would put the 3200 sticks in the DIMM A1 slots 😳 Or does my RAM all need to match?
Since the FCLK in Ryzen 3000-series CPUs stays 1:1 until 1900 MHz, wouldn't it make sense to get, say, a 4000 MHz RAM kit and clock it down to 3800 MHz with tighter timings, rather than getting a 3600 MHz kit and trying to overclock it to 3800 MHz (and loosening timings)?
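The arithmetic behind that question can be sketched quickly. The ~1900 MHz 1:1 ceiling is the figure from the comment; real FCLK limits vary from chip to chip:

```python
# DDR is double data rate: a "DDR4-3800" kit transfers at 3800 MT/s but its
# memory clock (MCLK) is 1900 MHz. On Zen 2, FCLK can run 1:1 with MCLK up
# to roughly 1800-1900 MHz; beyond that the ratio drops and latency suffers.

def mclk_mhz(mt_per_s: int) -> float:
    """Memory clock in MHz for a given DDR transfer rate in MT/s."""
    return mt_per_s / 2

for kit in (3200, 3600, 3800, 4000):
    mclk = mclk_mhz(kit)
    mode = "1:1 possible" if mclk <= 1900 else "likely 2:1 (latency penalty)"
    print(f"DDR4-{kit}: MCLK {mclk:.0f} MHz -> {mode}")
```

This is why DDR4-3800 (MCLK 1900 MHz) is the highest kit that can still run 1:1 on such a chip, and why downclocking a 4000 kit to 3800 with tighter timings is a plausible strategy.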
I was able to find a Trident Z Neo 3600 MHz 32 GB kit with CL14... problem is, it costs twice as much as CL16 at the same frequency. If I want the RAM for video rendering and such, will the CL14 be a better deal for a processor like the 5900X?
So adding an extra RAM stick to my laptop won't speed it up unless the RAM is currently hitting 100% utilization? What about dual channel, then? When I have many Chrome tabs open, like 20, my PC does become slower though; which part is bottlenecking then?
@@madmax2069 Hi Max. Go back to the launch of the TR 1900X. An 8-core Ryzen vs an 8-core TR with quad-channel memory was a perfect opportunity to show 4 vs 2 channels. YouTube never did this; only the fact that a 16-core 1950X could be installed got attention. That X370 boards had one x16 slot while TR had x16/x8/x16/x8 was ignored too. Within two years AMD dropped the 8/12/16-core TR CPUs. 2020: oh hell, a quad-M.2 PCIe card won't work on an X570 board. Oh hell, one M.2 is on CPU lanes and one is on shared chipset lanes. Oh hell, new Gen4 M.2 drives run at 3500/3500 on Intel and 7000/6600 on AMD. And when did you ever see 4x8GB quad-channel 3200C14 vs dual-channel 3200C14 on YouTube? It's just crazy that servers run 8-channel memory.
So you're saying newer motherboards can handle RAM speed better. But what about laptops? I bought an MSI laptop last year and I want to upgrade the RAM, but how do I find out which RAM speeds my motherboard supports?
I recently picked up a Ryzen 2700X, a Gigabyte B450M Aorus Elite and a 16GB (2x8) dual-channel DDR4-3600 kit from Teamgroup for the same price as a DDR4-3200 kit from Kingston, so I thought, why not? Gigabyte's website says the board supports up to 3600 (OC). I went straight to the BIOS (where else?), loaded the XMP profile, and it ran at 3600 right away. I don't even know if I can run at 3200; default with no XMP is 2400. Just two choices, I think: 2400 and 3600. The RAM kit isn't even on the QVL, although it says it's optimized for AMD. What do you think?
Byte Size Techs:
Dual vs Quad Rank RAM - 2 DIMMs vs 4 DIMMs - Which Is Better? - ruclips.net/video/GThdz0kPBms/видео.html
Is It OK to Mix RAM Kits? - ruclips.net/video/qxxrnsUXcko/видео.html
Does 32GB of Ram REALLY Make a Difference in Your PC ??? - ruclips.net/video/JupBvOfoHjA/видео.html
Is 16 GB of RAM Enough for Gaming in 2020? - ruclips.net/video/WimRwGv0XNA/видео.html
I have not run into a game yet that uses more than 16GB of RAM, but if you're doing things like video editing, then 32GB might make a difference. The price difference between 2x8GB and 2x16GB isn't too much, so you might as well get 32GB even if you're not going to use it. Maybe a future game will use it?
AMD/DRC Gamer R9 ram and Samsung B-die on a quad channel Threadripper run mixed kits without a hitch.
I wonder if that changes at the higher end of the cpus. Like 5900x and 5950x
@@VadimZverev XMP 2 would probably be better; plain XMP runs all kinds of voltages you don't want. Motherboard vendors love to crank the voltage.
Any techs out there? Will my newer-revision (5.23 or 4.23 REV, if it matters) 3000 MHz quad-channel Corsair Dominator Platinums (all from the same package) work with my X99 platform and its Xeon 2679 v4, on my ASUS X99-E WS/USB 3.1 board? They're rated at 1.35 V, and I know 1.2 V is said to be better, so will they just underclock to the Xeon's highest supported speed, around 2400 or 2666 MHz, give or take? Also, if so, can I lower the voltage to the standard X99 Broadwell 1.2 V for RAM, or even just below the 1.35 V rating to 1.25 V or anything lower, if it's going to run slower anyway? Or might it not run at all? That would be terrible, as it was over $200 used after tax... barely used, but used... eBay. :)
"RAM speed is only going to improve performance when RAM speed is the limit to performance." Well said. Thanks for this video. 👍
Yes, but where do the worse(!!!) performances come from? Something is obviously not right with these tests.
@@coolcat23 The test is garbage. It puts comparable speeds (3200 vs 3600) against each other in heavily GPU-bound titles. It should compare 3600 to regular 2666 and put them against each other in CPU-heavy AAA titles, like Cyberpunk or RDR2.
@@steelin666 Not sure about those titles. More like late game Stellaris, Civilization etc.
@@DFDark2 Those games are hard on CPU, but not so much on GPU. What is ideally needed for establishing RAM utilization is a game that puts GPU into full work while loading up a lot of graphic data. Huge 3D open-world games are perfect for that.
Unless you want to measure something like turn time in Civ6 but that's a different, very game-specific test, while most gamers are mostly interested in FPS gains.
Either way, RAM frequency can give significant performance gains in many games. This test just happened to mostly benchmark games that aren't as responsive to RAM speed as some others.
@@steelin666 Oh god, another "DDR speed matters" enthusiast, despite overflowing proof on the internet. smh.
Test like this is all I need from a tech channel.
The test is inherently flawed.
1% lows being greatly better because of better RAM is not a strange result, and it's far more prevalent on weaker CPUs, where you have the slightest amount of bottleneck. I'm running a 2600 with a 1080, and since I run games with optimised, minimum settings, the difference between 3200 B-die at stock and 3200 OCed is not only big: the minimums are sometimes 2x.
My FPS in heavy titles is incredibly consistent.
RAM speed by itself does not limit performance. The CPU might limit performance, and you alleviate that with faster RAM; and by faster RAM I don't mean frequency only, but tight timings as well.
I'll take full system stability over a 5%-10% performance increase that I'll never notice when actually using my computer. Run 32GB (4x8GB) at 3200 CL16 and have zero complaints.
That's the full honest truth right there!
That's what I did. I bought 16GB of 3600 CL18 and there was no real-world difference. In fact, in benchmarks the 3600 ran slower.
We're not even talking about 5%: in some tests on Ryzen you get slightly higher FPS on the 3200, in others it's the 3600, but it's never more than 1.5%. It's supposedly a little more on Intel, but I doubt it's more than 2%. The next AM5-socket motherboards may show a more dramatic difference, but at present the sweet spot is somewhere in between the two (3466, apparently). Since there's no real difference, the cheaper, more stable 3200 is the better option.
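One paper metric behind these kit comparisons is first-word CAS latency in nanoseconds, which is easy to compute. This is illustrative only; real-world performance also depends on sub-timings and the rest of the platform:

```python
# First-word latency in ns: CAS cycles divided by the memory clock.
# MCLK in MHz is MT/s / 2, so latency_ns = CL / (MT/s / 2) * 1000.

def cas_latency_ns(mt_per_s: int, cl: int) -> float:
    """Approximate first-word CAS latency in nanoseconds."""
    return 2000 * cl / mt_per_s

for mt, cl in [(3200, 16), (3600, 16), (3600, 18)]:
    print(f"DDR4-{mt} CL{cl}: {cas_latency_ns(mt, cl):.2f} ns")
# DDR4-3200 CL16: 10.00 ns
# DDR4-3600 CL16:  8.89 ns
# DDR4-3600 CL18: 10.00 ns
```

By this measure, DDR4-3600 CL18 lands at the same ~10 ns as DDR4-3200 CL16, which lines up with people in this thread seeing no difference between those kits.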
Does that explain my games crashing and the errors in memtest when XMP is enabled at 3600 MHz? Would be nice if the RAM worked as advertised lol
@@paulm2467 I'm really glad you said this. I'm building my first PC and went with 3200 because it seemed like it would be the most stable instead of trying to push for 3600. Makes me feel justified XD
Me the whole time: What's that peanut butter doing there?😂
The strangest thing that someone has bought with his referral link
gives extra 30fps at 4k with ray tracing at ultra
@@no.no.4680 time for a challenge
What are you all talking about! That’s the new CPU Thermal Paste he got there... it’s just peanut butter coloured! That’s why tech’s benchmarks are so crispy and tasty!
He's going to overclock it in the next video 🤣
To be clear: RAM speed does make a difference, just not big enough to be worth it.
Actually, tuning the subtimings gets you the biggest improvement in perf/FPS in gaming, way more than just setting the frequency higher. It's a little time-consuming, but worth it in the end.
@@sorinpopa1442 I remember seeing a video showing the FPS difference, and going from the lowest DDR4 RAM to the highest was only a 5-10 FPS increase 🤷🏻♂️
@@mikeramos91 Depends on the game too: in some you only get a 2-3 FPS difference at max, and in a best-case scenario up to 15 or more, especially if you play at lower resolutions like 1080p. Not huge, yeah, but tuning your RAM subtimings is basically free performance.
@@sorinpopa1442 I increased the MHz one step at a time until the computer wouldn't boot. Clearing the CMOS via the jumper didn't fix it. Removing and putting back the battery didn't either. I had to remove a stick of RAM, and I assume the board reworked its way through a process to recognize it again. Later I put the other stick back, set XMP, and everything works the same as before I tried that stunt. It's a 3200 MHz kit; I stepped it up to 3333, which seems stable. Now I'm squeamish about trying the timings. Default is CL16.
@@jasonlisonbee Very painful process. Glad I have a Gigabyte motherboard, which goes back to default settings whenever the RAM OC is invalid. It took me time to OC 3000MHz RAM to 3600MHz on a first-gen Ryzen. Fun times.
short answer, save your money and buy the slower RAM, but spend more money to buy more RAM.
you can always download ram
@@tsuna111 dude do you think everyone can just download ram wifi is expensive
@@kyleforgeard4148 I downloaded RAM from a friend to my USB Flash drive stick, and then transferred it to my PC, it works.
Okay, buy lots of fast RAM. Check!
or just buy ram that has a cas latency lower or equal to 16
I love this channel since it's so straightforward and blunt.
More channels need to talk about the fact that you'll only see some differences on a chart, not buying what you don't need, etc. Great stuff!
I appreciate that he takes a more pro-consumer approach where he talks about more common use cases instead of just talking over benchmarks.
Absolutely. It's such a grounded channel all in all.
The testing was "blunt" indeed. Completely unreliable. Getting worse(!!!) performance from faster RAM is not explained by stating that often RAM is not the bottleneck in a system. Rely on these results at your own peril.
Used to be so informative until the talk show with his wife who doesn't contribute anything. I unsubbed. Let's see if it's back to being informative and not gibberish for an hour.
You unsubbed so... why are you still here lol, you said it yourself you don't want to watch this and yet you're still here. Just to complain?.
When I first built my system with DDR4 3200 I forgot to make the changes in the bios. When I realized that and did it the most noticeable difference was in system operations in general not in game speed.
One setup where I noticed an overall difference was going from 2666 to 3200MHz on an i5 10400 on a Z490 board. The Z board was only $20 more than a decent B460. Everything just feels snappier, and certain games like Overwatch netted me another 8-10 FPS average. That's a big difference.
CAS latency/CL is probably a bigger factor than frequency on fast modern CPUs nowadays, but even then, don't expect much for most things.
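For anyone weighing CL against frequency, the absolute latency in nanoseconds is straightforward to compute. A quick sketch (the helper name is mine, not from the video):

```python
def cas_latency_ns(transfer_rate_mts: int, cas_cycles: int) -> float:
    """True CAS latency in nanoseconds.

    DDR transfers twice per I/O clock, so one clock period is
    2000 / transfer_rate_mts nanoseconds.
    """
    return cas_cycles * 2000 / transfer_rate_mts

# DDR4-3200 CL16 and DDR4-3600 CL18 have identical true latency (~10 ns),
# which is one reason the "faster" kit often benchmarks the same.
print(cas_latency_ns(3200, 16))  # 10.0
print(cas_latency_ns(3600, 18))  # 10.0
print(cas_latency_ns(3600, 16))  # ~8.9, this kit really is quicker
```

This is why a 3600 CL18 kit is not automatically an upgrade over 3200 CL16: the cycle count went up in step with the clock.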
DDR4 CL14 running 14 14 14 14 34. Cut the time to engage the connection and run it on quad channel. People want faster and faster RAM speeds. AMD offered us QUAD Channel on the TR three years ago. EPYC and Threadripper Pro run 8 channel RAM.
@@maxhughes5687 I'd happily take quad channel over two faster modules any day.
2nds and 3rds make almost as much of a difference, if not more, than just the primary four timings.
@@maxhughes5687 People don't know what they want
@@MJ-uk6lu At the Threadripper launch, no YouTube channel talked about quad channel doubling RAM transfer speeds. Again at the TR PRO launch, no YouTube channel talked about 8-channel RAM being 4X the speed of dual channel. I had to go to the Asus and AMD websites, then search the tech terms they used. For that matter, no one talked about AMD's Gen4 CPU lanes vs Intel's Gen3 lanes. Faster dual channel may help, but it kills the low latency. WRX80 is the new 'WINNER'.
Got 4400 CL19 trimmed to 4200CL16 since 2017. Keeps my 8700K competitive in games against 2020 chips.
please share aida64 memory bandwidth and latency score if possible. I do want to see how that looks like :D
I add the link to my channel description next to ram specs cause youtube won't let me share in the comments.
@@club4ghz okay thanks!
@@club4ghz saw it. No doubt that system will be relevant for a long time lol
I've got an 8700k, but my RAM is only DDR4 3000 lol
It just occurred to me that Tech Deals is probably going to write off that peanut butter as a business expense and I respect him for that.
I've been a professional tester of CPUs, motherboards, memory and GPUs for 20 years, working for a leading PC magazine, beginning in the early '90s. I was always told by vendors that the new generation of RAM would be the big game changer. It never was. There have been slight improvements between the generations (DRAM to SDRAM to DDR to DDR2 and so on), but never in a way that would let us recommend our readers upgrade because of memory speed. The differences within the same generation of RAM (e.g. DDR4) are even more insignificant. All this memory hype reminds me of HiFi enthusiasts who spend a fortune on voodoo cables. It only improves your system if you firmly believe in it.
There is some truth to the quality of cables for HiFi - pure copper over copper-clad aluminum or plain cheap aluminum wire, and thicker wire over thinner (12 gauge vs 14 gauge).
I'm just getting back into building computers since the switch from AGP to PCIe (just been getting stock laptops through work...bleh...) and really did not have the time to parse out 12+ years of technological progress. Your videos have been an amazing way to catch up on the times and figure out what is worth spending money on these days. Thank you!
Having your RAM in dual rank nets a much bigger difference in games on my 5600X as covered by so many videos. I got 20+ more average FPS in Shadow of the Tomb Raider bench going from a single rank kit to a dual one (tested with an RTX 3070, 1080p highest preset + TAA)
You can also use 4 regular single-rank sticks for the same effect.
Just running 4 sticks of single rank gives the same results as 2 sticks of dual rank. I tested 4 single-rank 8GB sticks of DDR4 3200 CL16 vs a 32GB kit of 2 dual-rank sticks of DDR4 3200 CL16, and all memory and game tests were the same. ZERO difference.
Do you think peanut butter can be a good replacement for thermal paste?
11:52 About your final thoughts on server hardware being okay with "slower RAM": don't forget they are usually quad channel (or more), which vastly offsets the slower-clocked RAM. The consumer desktop space, being only dual channel, has to rely on faster clocks / lower latencies. Server builds prioritise slower, more stable, lower-power configurations.
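To put rough numbers on that, peak theoretical bandwidth scales linearly with channel count. A back-of-envelope sketch (64-bit channels assumed, decimal gigabytes; the function name is mine):

```python
def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int,
                       bus_width_bits: int = 64) -> float:
    """Peak theoretical DDR bandwidth in GB/s: rate * channels * bytes per transfer."""
    return transfer_rate_mts * channels * bus_width_bits / 8 / 1000

# Quad-channel "slow" server RAM out-muscles dual-channel "fast" desktop RAM:
print(peak_bandwidth_gbs(2666, 4))  # 85.312 GB/s
print(peak_bandwidth_gbs(3600, 2))  # 57.6 GB/s
```

Real-world throughput is lower, of course, but the ratio between configurations holds.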
I had heard that AMD 5000 chips relied heavily on RAM speed to achieve optimal performance, but I guess that was only applicable to the older generations when upgrading from speeds below 3000 MHz. Thank you for this video - it cleared things up for me. I suppose I mostly wasted my money on these 3600 MHz CL16 RAM sticks I bought, but I guess it doesn't hurt to have them, though. Now, I'll probably just run them at 3200 MHz instead of 3600 for the sake of stability when I finish my build.
Yeah that’s good news
About the stability bit, it's true. I can't run my "expensive" xlrb 3600MHz kit at 3600; Windows keeps bugging out, and I had to drop it to 3200MHz. Don't know if I'll ever stabilize this overpriced RAM.
@@luckydepressedguy8981 Could you stabilize them at 3600MHz?
@@luckydepressedguy8981 The sweet spot for AMD, even though 3600MHz is recommended, is actually 3200MHz. I didn't even bother wasting the extra 100 bucks on the 3600MHz version of my RAM sticks.
Nah, run them at 4133.
This was an excellent comparison that will provide cost benefits. I especially liked hearing that it would be more effective to buy more ram to improve performance. When I do my next build I will use this information. Thanks
nice. Gonna upgrade from a 2600X to 5600X sometime next year and was thinking about a RAM upgrade from my 16GB 3200/CL16 RAM to 16GB 3600 RAM to maybe push the new Ryzen. Well, when I see these "dramatic" differences I'm better off saving my money and just keep on using the existing ram, which works like a charm anyway. :)
This guy's got no idea. Higher-speed RAM kits can be clocked down with tightened timings. I bought 4 sticks of 8GB DDR4 4000, then lowered them to CL 14-14-14-28 at 3600 or CL 16-16-16-32 at 3800, depending on the games I want to play or work on. Running on a 5950X, I got around a 15%-20% boost in most games.
Thank you sir
I have a B450 Aorus mobo with a Ryzen 7 5800X, an RTX 3070, and 16GB of 3200-speed RAM. Should I upgrade to 3600 speed?
I'd love a timing-tuning video for older systems like DDR3. Often on locked-down or OEM boards it's the only thing you can tune, as even BCLK is locked (looking at you, Dell and HP).
And maybe RAM speed/timings' effect on APUs, to cater to the budget and international crowd. :)
Dell systems with ECC are going to be locked down for stability, not for kids tweaking. They have to be built like tanks and never go down.
I've got my memory sticks running @ 3733 and low CL 16s. THAT actually makes a bit of a difference.
What about tRCD, tRP and tRAS?
@@sabishiihito : I've got those cranked down quite low too, TYVM. 🙄
Great video as always.
Did you change the memory fabric multiplier for each RAM speed? 1600MHz vs 1800MHz?
Good advice. Just ordered a kit of 2x8 DDR4 3600 CL16 to replace the 2400 sticks in my rig. Cost finally came down enough that I thought I'd try it; 3600 wasn't in the budget when I built it. Would have liked to see some comparisons against 2400 MHz RAM.
What was your cpu?
What about timings, CL14 vs CL16, or maybe 3600 CL16 vs 3200 CL14?
2 vs 4 sticks, single rank vs dual rank sticks
@@miniweeddeerz1820 Yep, subtimings are important. Both of my systems, Intel and Ryzen, run B-die 3200 14-14-14 memory kits.
This shows there's no need to overclock my RAM for gaming: higher voltage, barely any difference in performance. Instead I reduced the clocks on my stock 3200MHz kit without changing the voltage ^^ Thanks for this video, it helped me pick a good way ♥
was the infinity fabric changed to match the different ram speeds?
Wish I saw this 2 weeks ago. Had 32GB(2x16) 3600CL18 in my PC. Decided I'd need at least 64 for my video work after seeing the 32 fill up pretty quick on small projects. Figured I may as well go all out and bought 64GB(2x32) 3600 CL16 instead. Zero difference between CL 18 and 16. Aside from the price.
Great stuff! I have 64GB of 3600MHz and I've been running it at 2133 for ages as a 3D artist. Thought I'd try running it at 3200 and 3600 and saw no difference in my workflow except sometimes I would BSoD during meetings or while working 👌 Back to 2133 for me
The peanut butter is secret advertising for all Tech Deals fans, just like how food product placement works in movies.
So, what about infinity fabric clock? What's the point of buying a racecar if you're driving city speed limits?
I’m so glad I watched this video before spending a crap ton of money on high speed ram for my Ryzen 7 5800X3D.
Yeah, I got myself a 64GB 3200MHz RAM kit for productivity. I'll probably check whether DOCP on/off makes any difference for my kind of workloads, but in any case it's pretty much academic: my PC is already anywhere from 2x-5x faster (depending on the task) than my colleagues', so I doubt adding maybe 5% is going to matter even if it does materialise. Productivity is not that different from gaming in that respect; if it's not a whole tier faster, it doesn't matter. The question isn't whether it runs in 2 or 3 seconds; it's whether it runs in 30-150 seconds, or long enough for me to get up and get a coffee in the meantime.
RAM speed, primary, secondary, and tertiary timings don't amount to a hill of beans for most users in most applications. Almost all the time, something else is affecting performance to the point where RAM speed is inconsequential. Using mildly slower RAM will go unnoticed in real-world use but will have a noticeable effect on your real-world wallet.
That's true, but if you tell everyone the answer, they won't watch the video! :)
@@TechDeals They won't read my comment, so no worries.
the real question is, is that peanut butter a better thermal paste than mx-4?
I have 16GB of 3200MHz C16 RAM and am debating whether I should upgrade to 32GB (2x16) 3600MHz C16. I know my motherboard is dual channel, so I would only want 2 sticks. I just upgraded from a 3600X to a 5700X, and have an X570 motherboard. I'm a multi-tasker: sometimes training in RuneScape (3) and playing COD at the same time, while also having Opera GX and Discord open and voice chatting in there. I like to make sure I'm getting good use of my money before upgrading, and it seems like a good idea to me, but I would love your thoughts! The same kit you have in the video is $96 right now on Amazon, which is pretty tempting! Great video as well, as always, and happy new year!
Look at that: AMD has improved Ryzen so much that the 'speed' of RAM doesn't matter much any more. Wonderful!
The CPU's caching algorithms, branch prediction, speculation and such are what produce this kind of result. A CPU core is unaware of the RAM's existence; all it needs is for the specific data block to be available in its local cache (L1 data and instruction caches), where a fetch takes around ~1 nanosecond. If a core can't find data in its L1 cache, it waits for the L2 cache to send the data over to L1; getting it from L2 and accessing it takes around ~5 nanoseconds in total. If that data block is not in L2 either, the core has to wait until it comes from L3 to L2 to L1 (around 10-15 nanoseconds). And if the block still isn't found, even in L3... oh boy, this is going to be the longest wait of all, since the data now has to be fetched from thousands of miles away, the RAM, which takes about 40-80 nanoseconds. This is when all that "RAM latency" stuff starts to matter.
Fun fact: for most modern CPUs the "cache hit rate" (the rate at which a core finds the data block it needs in its L1 cache) is said to be near ~95%. That means for 95% of your application's total running time, a CPU core will find all the data it needs in its local cache (L1), and only 5% of the time will it have to wait for data to arrive from the other memories (L2, L3, L4 and RAM).
Now consider a theoretical CPU that always finds data in its cache, i.e. a 100% cache hit rate. Then RAM latency wouldn't matter at all. But reality isn't like that.
I think the reason you don't see AMD Ryzen CPUs suffer real-world performance loss, despite having the highest memory latency, is not that their architecture has more efficient caching algorithms; it's that these CPUs have a much larger cache than any Intel CPU, which probably keeps the cache hit rate higher most of the time. Intel, on the other hand, doesn't (for now) have the advantage of a large cache, so Intel has to rely on superior caching algorithms and branch prediction; that's why, despite being low on cache, they are still competitive. But in the end, as data blocks keep getting bigger (future, even heavier AAA games), superior caching mechanisms alone won't be enough; you will eventually need more cache memory as well, and this is where AMD has a better chance in the future competition than Intel. At least, that's my speculation.
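The point of that ~95% hit rate can be made concrete with a toy average-memory-access-time model. This is deliberately simplified (L2/L3 collapsed into one flat miss penalty; the numbers are the rough figures from the comment above, not measurements):

```python
def amat_ns(l1_hit_rate: float, l1_time_ns: float, miss_penalty_ns: float) -> float:
    """Toy average memory access time: hits pay the L1 time, misses pay the penalty."""
    return l1_hit_rate * l1_time_ns + (1 - l1_hit_rate) * miss_penalty_ns

# A 20 ns swing in RAM latency (80 ns vs 60 ns) shrinks to a ~1 ns
# difference in the average, because only ~5% of accesses ever reach RAM:
print(round(amat_ns(0.95, 1.0, 80.0), 2))  # 4.95
print(round(amat_ns(0.95, 1.0, 60.0), 2))  # 3.95
```

That dilution is why big changes in RAM timings produce small changes in overall performance.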
@@deletevil Sure makes that multi 8 core dies on a Threadripper look good.
Did you use the same sticks to compare? Are you sure the timings didn't change when you underclocked them? Because usually they do.
Very eye-opening, and it begs the question: why would you ever pay an extra $30 for 400MHz? Good stuff.
People like bigger numbers :)
Why do people pay more for faster RAM? Because it makes a (small) difference in a proper system that does not have some issues. Server farms are an entirely different matter; inadequate analogy.
Kingston Fury Beast 2x16GB 3600MHz on a Ryzen 5600X has been a nightmare, freezing all the time with or without OC.
Beautiful test bench.
One of my projects is to turn an old case into an open air test bench.
I'll send a link to my Twitter feed when I can show something.
So what I don't understand is: why do many tech channels and people online say that for Ryzen you should get 3600 instead of 3200? I always see comparisons where the difference is like 2% or so, but the price difference is about 10-20% more.
Finally a straightforward no BS explanation, thanks man.
Glad it helped!
I very much appreciated the splits you made so I could go to final thoughts. That deserves a thumbs up.
Here are the differences between your regular PC gamer and the RAM cultists/gurus/junkies.
PC Gamer - buys almost the cheapest RAM he can get, because it doesn't improve his FPS enough to warrant the more expensive kit.
RAM Guru - spends more money on his RAM than his CPU, hoping to get that one golden-bin Samsung B-die that can do CL 15-15-15 at 4400MHz.
He would skip sex if he could shave just one more nanosecond off his RAM latency.
He would spend countless hours, weeks, and months living inside his PC's BIOS, tweaking every setting there is and leaving no stone unturned, if he could just find that one setting that would shave another 0.5 nanoseconds off his RAM latency.
He remembers all his RAM's RTL/IO-L numbers by heart, and his most-used words are "please, please post", begging and praying to his motherboard after putting in a new set of RTL/IO-L numbers.
RAM is not just about FPS; the kit you buy from the bargain-bin rack, even at really high speed, has timings way too loose to tell the difference. Super RAM like
Samsung B-die is not just about speed but tight, very low latency: something like CL 16-16-16 at 4400MHz, super-tuned to 35 nanoseconds of latency or lower. I promise you, it will blow your mind!
@@forthosewhodare7325 you sound like the latter
What is that racing game called at 11:05 ?
I am not shocked, I was expecting maybe a bit more difference, but even the most convincing videos on RAM speed showed fairly minimal differences after jumping through quite a number of hoops and using the most convincing benchmarks and tests.
That being said, I just obtained 32GB of 3600 CL16 for my next build (was running 16GB of 3200 CL16), mainly cuz it was only about 10 or 15usd more expensive than the 3200 RAM, maybe it'll be utilized better in the future? dunno, but I got it anyway
I'm really curious to know if I bought bad RAM for my RTX 3080 / Ryzen 5900X build. I've got 3200MHz CL18 4x8GB; I didn't have a lot of options with the vendor. The other config was 3200MHz CL16 2x16GB. I really tried going for a balanced build, but this is honestly bothering me. Should I sell the memory and go for something else, like 4x8GB 3600MHz CL16 instead? Will I notice any difference? Maybe just go straight for 64GB?
So DDR5 is probably not even going to be recommended for gaming at launch, like PCIe gen 4, then, huh?
Not until it's optimized.
Faster memory only has a real effect when used with APUs... and there the difference is quite noticeable!
Funny I just bought some 3600 memory for my B550 board. Guess I'm about to find out if my deal was a deal!
did it work?
The server comparison doesn't make sense: servers are all about efficiency, while gamers often just care about raw speed. Speed doesn't matter as much in servers, where heat matters much more.
Might be a dumb question, but would the results be about the same at higher resolutions, 1440p and 4K?
The higher resolution, the more GPU-bound the task will be. So it basically makes even less difference when increasing the res.
Can you use 3600MHz RAM with a Ryzen 5 5500?
Your findings are consistent: I recall another vid of yours covering the same thing in the past, with the same results.😊
Yea, I've done this before, but people need to see it again. I may do a follow up using Zen 2 on a B550 just because. Or not... maybe Intel? It doesn't really matter however.
@@TechDeals agree that it does not. Thank you for doing one of these again.
Any chance you'd be willing to do this test again only with a 5900x? Not so much for the memory speed itself, but curious if there will be any potential greater gains for a multiple chiplet CPU with the Infinity Fabric also going up in speed.
I would have to have a 5900X first...
I did a little research myself before my second-to-last build and found the same thing. Even with some newer RAM you sacrifice performance. Glad I knew this; a lot of people don't. I chose 32GB of Trident Z 3200 and it works great for everything I do without a skip!!! Great video!!!
Nice video! Can you mix a set of two 3200 sticks with another two 3600 sticks for a total of four? thx
Always depends on the game. In my case the higher ram speed brings me more minimum fps in World of Warcraft and in raid this is very important.
In terms of price, there isn't that big a difference.
If you then make the effort and adjust the timings and sub-timings, you can get good performance out as long as you are within the CPU limit.
Really glad you made this video! This is what we need to shut up those that think they know and really don't. I am one of those surprised because it's always someone posting you need FASTER RAM. Yet, personally I never saw why. So, really glad that at least some testing is done to show that more RAM speed, for the most part, is mostly hype and mental comforting.
Hell, I had no issue running 3600MHz ram on an a520 board with a 3200G.
Spreads peanut butter all over ram during first 15 seconds of the video, proceeds to watch the rest of the video 🫠
I recently upgraded my old rig a few weeks ago to a Ryzen 7 5800X and picked up G.Skill Ripjaws dual-channel 16GB 3600MHz CL16, which was only $17 more than the 3200MHz, thinking the 3600 would perform better; plus an MSI B550 MPG Gaming Carbon WiFi board and a WD SN850 M.2 (7000MB/s read). And for now, an XFX RX 580 8GB.
You're so great for providing this video so fast after the launch!!! thanks!!!!
Why did you choose to compare two relatively fast memory modules? Wouldn't you see a more realistic difference if you compared some slow, cheap RAM to some 3600MHz?
Bought a 3600MHz 8GBx4 kit yesterday, upgrading from a 2400MHz 8GBx2 kit. Funny how you upload the answers to whatever's on my mind on any given day, lol
So I have a B450-I and a Ryzen 5 2700, and I got DDR4 3600. BIOS says it's running @3600... system seems stable...
Is there a possibility that it's really not running at that speed?
Thank you for this, I thought I was about to be told I bought the wrong RAM. I'm only interested in running stock, no overclocking. I kept seeing people say that Ryzen likes faster ram, but figured since the motherboard and CPU both say 3200, that's what I should buy. I think I've also read that faster RAM would be throttled, but I wasn't sure. Glad to see it wouldn't have made much of a difference either way.
Please can you say how much I can overclock my Corsair Vengeance LPX 16GB 3200MHz RAM?
This dude's comedic timing is EFFORTLESS! 🤣 I'm subscribing right now!
Could you test a 4x8GB (4-rank) kit vs a 4x16GB (8-rank) kit at 3600MHz and CL16? It would be interesting to know if, for any weird reason, running an 8-rank configuration would hurt or maybe even improve FPS numbers, if only by a little.
Essentially, you might gain 2-5 FPS on a good day going from 3200 to 3600 xD. But with AMD, if you want your memory to run 1:1 with your Infinity Fabric, 3600MHz will do that, while anything faster gives no further gains and anything slower might actually hurt you ever so slightly. With Intel, you can just get the slower RAM regardless lol
What about at 1440p and 4K? There doesn't seem to be much of a difference there.
I suspect that your secondary timings are loosened automatically on the 3600MHz kit.
Rather, tightened automatically when 3200MHz is set. He should have used a lower-binned, cheaper 3200MHz kit; other tech channels show bigger differences.
My gold star count from Tech is over 9000!
I see what you did there.
Would the difference be apparent on Ryzen chips with more than one chiplet? I heard the bandwidth limit was with the infinity fabric, and that fast ram would help alleviate this. The CPU tested has a single chiplet, and therefore doesn't rely on infinity fabric like the multi chiplet processors. Right?
great vid Tech! i'm gonna keep my DDR4-3200mhz ram for a long while!
No one says whether it's dual or quad rank RAM. No one says at what latency.
@@maxhughes5687 agreed, testing only higher ram frequency is not a good test on Ryzen. The differences appear when you go dual rank and lower latency.
Good video. You just added some clarity to the big question "does RAM speed matter for the new 3rd-gen Ryzen", and I saved $300-$400.
Thanks for reassuring me, I bought a 3200 kit with the intent of getting a 3600 kit. Woosh.
You regret it?
@@lost_places_global9008 only for a day :P
@@wiltak_ I bought 3600 CL16.
Only €6 more, and definitely better than 3200MHz CL16.
@@lost_places_global9008 oh yeah for sure, no point in me regretting it now though. I doubt I'd notice much difference.
@@wiltak_ Yeah; some say I shouldn't buy 3600MHz CL16 for my Intel i5-9600K.
I don't get their point, though.
Totally unrelated, but can you test some old CPUs vs last gen CPUs? For example i5 2500K/8GB/old GPU vs i5 10600K/Ryzen 5 5600X/16GB/newer GPU. Of course the new ones will be N times faster, but many will keep their PCs for many years and for general tasks like office work, movie watching, browsing, photo editing in Lightroom/Photoshop, even some old games will still work. With these in mind, an upgrade should be made so how will this translate for a new PC? Times in exporting photos, smoothness in general, browsing, and so on. Might help and could be interesting to see how everything evolved over 8-9-10 years compared side by side.
Extremely helpful and very surprising results! Thank you!
Silly question, but am I able to use two DDR4 4000 sticks alongside two DDR4 3200 sticks? I would put the 3200 sticks in the DIMM A1 slots 😳 Or does my RAM all need to match up?
Is the ram compatible with the peanut butter? 😂
Since the FCLK on Ryzen 3000 series CPUs stays 1:1 until 1900MHz, wouldn't it make sense to get, say, a 4000MHz RAM kit and clock it down to 3800MHz with tighter timings, rather than getting a 3600MHz kit and trying to overclock it to 3800MHz (and loosening timings)?
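The 1:1 rule in the question above reduces to simple arithmetic: the actual memory clock is half the DDR transfer rate, and FCLK has to match it. A quick sketch (the 1900MHz ceiling is the figure from the comment, a rough Ryzen 3000 limit; function name is mine):

```python
def fclk_for_1to1(transfer_rate_mts: int) -> int:
    """Memory clock (MHz) needed for a 1:1 FCLK:MCLK ratio.

    DDR is double data rate, so the real clock is half the MT/s figure.
    """
    return transfer_rate_mts // 2

FCLK_CEILING_MHZ = 1900  # rough Ryzen 3000 limit quoted in the comment above

for kit in (3600, 3800, 4000):
    fclk = fclk_for_1to1(kit)
    ok = fclk <= FCLK_CEILING_MHZ
    print(f"DDR4-{kit}: needs FCLK {fclk} MHz, 1:1 {'possible' if ok else 'unlikely'}")
```

So DDR4-3800 sits right at the ceiling, which is why it's a popular tuning target, while DDR4-4000 usually forces the board into a 2:1 ratio and adds latency.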
RAM speed does make a small difference... so why use a Ryzen 5 5600X for these tests?
Thanks!
Thanks for this! Perfect timing; I've been looking for RAM for my first build 👍🏽
I was able to find Trident Z Neo 3600MHz RAM with CL14, a 32GB kit... problem is, it costs twice as much as CL16 at the same RAM frequency. If I want the RAM for video rendering and such, will the CL14 be a better deal for a processor like the 5900X?
You just wasted a lot of money.
Very good video. 3200 is the sweet spot. I'd like to see a video for 2666 Vs 3200 on intel.
So adding an extra RAM stick to my laptop won't speed it up unless the RAM is currently hitting 100% utilization? What about dual channel then? When I have many Chrome tabs open, like 20, my PC becomes slower though; which part is bottlenecking then?
Going from single to dual is like night and day. Increasing RAM speeds gets you diminishing returns.
And YouTube never covered the advantage of going from dual channel to quad channel on a Threadripper CPU.
@@maxhughes5687 Are you sure about that? Seems like you didn't search too well.
@@madmax2069 Hi max. Go back to the launch of the TR 1900X. An 8-core Ryzen vs an 8-core TR with quad-channel memory was a perfect opportunity to show 4 vs 2 channels. YouTube never did this; only the fact that a 16-core 1950X could be installed got coverage. That X370 boards had one x16 slot while TR had x16/x8/x16/x8 was ignored too. Within two years AMD dropped the 8/12/16-core TR CPUs. 2020: oh hell, a quad M.2 PCIe card won't work on an X570 board. Oh hell, one M.2 is on CPU lanes and one M.2 is on shared chipset lanes. Oh hell, new Gen4 M.2 drives run at 3500/3500 on Intel and 7000/6600 on AMD. And when did you ever see 4x8GB quad-channel 3200C14 vs dual-channel 3200C14 on YouTube? It's just crazy that servers run 8-channel memory.
The case for dropping a 5600X onto X370 and buying a 7700 XT or a 4060 keeps getting stronger. The YouTube algorithm doing its magic.
This guy will speed up your computer just by it being in the same room as him.
So it takes time to get the best out of DDR5?
We should compare speeds of 2400 and 3200 as well. I think at higher speeds you will see diminishing returns until CPUs get faster.
So you're saying newer motherboards can deal with RAM speed better. But what about laptops? I bought an MSI laptop last year and I want to upgrade the RAM, but how do I find out if my motherboard supports a given RAM speed?
I am not skipping ads for you, good sir.
I appreciate that
I recently picked up a Ryzen 2700X, a Gigabyte B450M Aorus Elite, and a 16GB (2x8) dual-channel DDR4-3600 kit from Teamgroup for the same price as a DDR4-3200 kit from Kingston, so I thought, why not? Gigabyte's website says the board will support up to 3600 (OC). I went to the BIOS straight away (where else?), loaded the XMP profile, and it's 3600 right away. I don't even know if I can run at 3200; the default with no XMP is 2400. Just two choices, I think: 2400 and 3600. The RAM kit isn't even on the QVL, although it says it's optimized for AMD. What do you think?